Add support for PEP 525-style garbage collection hooks #15


Merged: 5 commits into python-trio:master, Apr 17, 2018

Conversation

@oremanj (Member) commented Apr 5, 2018

Add async_generator.get_asyncgen_hooks() and async_generator.set_asyncgen_hooks(), and use them under the same circumstances the equivalents in sys would be used for native generators. On 3.6+, the async_generator names refer to the same functions as the sys names, so there's only one set of hooks, and frameworks that use sys.set_asyncgen_hooks() should work seamlessly with async_generator functions as well.

There's an issue with GC ordering in PyPy which causes the new logic to make the interpreter segfault under some circumstances, so we keep the old logic (which complains if any async_generator is GC'ed before its execution has completed) when we detect that we're running under PyPy. The bug has been reported upstream: https://bitbucket.org/pypy/pypy/issues/2786/
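For reference, a minimal sketch of how these hooks behave with native async generators on 3.6+, driving the generator by hand rather than through an event loop (event loops install their own hooks, which would mask ours). This is an illustration, not code from the PR:

```python
import sys

calls = []

def firstiter(agen):
    # called the first time an async generator is iterated
    calls.append("firstiter")

def finalizer(agen):
    # would be called if the generator were GC'ed while suspended
    calls.append("finalizer")

old_hooks = sys.get_asyncgen_hooks()
sys.set_asyncgen_hooks(firstiter=firstiter, finalizer=finalizer)

async def agen():
    yield 1

ag = agen()
try:
    ag.asend(None).send(None)      # first iteration -> firstiter fires
except StopIteration as exc:
    assert exc.value == 1          # the yielded value arrives via StopIteration
try:
    ag.asend(None).send(None)      # generator finishes -> no finalizer needed
except StopAsyncIteration:
    pass

sys.set_asyncgen_hooks(*old_hooks)
print(calls)
```

Since the generator runs to completion, only the `firstiter` hook fires; the `finalizer` hook is reserved for generators that get garbage-collected while still suspended.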

@codecov bot commented Apr 5, 2018

Codecov Report

Merging #15 into master will not change coverage.
The diff coverage is 100%.


@@          Coverage Diff           @@
##           master    #15    +/-   ##
======================================
  Coverage     100%   100%            
======================================
  Files           7      7            
  Lines         776    972   +196     
  Branches       59     77    +18     
======================================
+ Hits          776    972   +196
Impacted Files                                   Coverage Δ
async_generator/__init__.py                      100% <ø> (ø) ⬆️
async_generator/_tests/test_async_generator.py   100% <100%> (ø) ⬆️
async_generator/_impl.py                         100% <100%> (ø) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 17521f8...c9168c1.

@oremanj oremanj requested a review from njsmith April 5, 2018 20:53
if coro_state is CORO_CREATED:
    (firstiter, self._finalizer) = get_asyncgen_hooks()
    if firstiter is not None:
        firstiter(self)
njsmith (Member):
This could call the firstiter hook twice:

agen = agen_function()
agen.asend(None).close()
agen.asend(None).close()

Pretty obscure, but native agens don't seem to call it twice in this case. We should add a test case, and I guess an explicit have-we-done-hook-setup variable.

oremanj (Member, Author):
Thanks for spotting! Fixed and added a test.
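A minimal, self-contained sketch of the one-shot guard being discussed. All names here (`_hooks_inited`, `AsyncGeneratorShim`, the stand-in hook functions) are illustrative, not the PR's actual identifiers:

```python
# Simulated hook storage standing in for async_generator's real
# get_asyncgen_hooks()/set_asyncgen_hooks().
_hooks = (None, None)

def set_asyncgen_hooks(firstiter=None, finalizer=None):
    global _hooks
    _hooks = (firstiter, finalizer)

def get_asyncgen_hooks():
    return _hooks

class AsyncGeneratorShim:
    def __init__(self):
        self._hooks_inited = False   # the "have-we-done-hook-setup" flag
        self._finalizer = None

    def _init_hooks(self):
        # Runs at the start of every asend(); the flag guarantees the
        # firstiter hook fires at most once, even if the first asend()
        # coroutine was closed before it ever started executing.
        if not self._hooks_inited:
            self._hooks_inited = True
            firstiter, self._finalizer = get_asyncgen_hooks()
            if firstiter is not None:
                firstiter(self)

seen = []
set_asyncgen_hooks(firstiter=seen.append)
ag = AsyncGeneratorShim()
ag._init_hooks()
ag._init_hooks()          # second call: no duplicate firstiter
assert seen == [ag]       # hook ran exactly once
```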


async def step():
    if self.ag_running:
        raise ValueError("async generator already executing")
njsmith (Member):
What do you think of moving this check, and the self.ag_running = True, up out of the async def?

There would be one tricky thing to worry about: if a coroutine object gets closed or GCed before it even starts, then python doesn't enforce its normal guarantee that finally blocks will be executed. So if we did do this I guess we'd need to add another yield at the top of step, and immediately iterate it that far when we created the coroutine object, in order to guarantee that eventually ag_running = False would happen. Maybe this is more work than it's worth.

oremanj (Member, Author):
I guess the intent would be to guard against two asend() coroutines existing simultaneously, even if one is exhausted before the second is started? Given the implementation difficulties, I don't think it's worth it, since it seems impossible to actually interleave the underlying send() calls in a way that would create confusion. (I originally put the ag_running test outside of step() with the setting-to-True still inside, and that did create potential problems, which is why I added test_reentrance_forbidden_simultaneous_asends.)
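For illustration, a self-contained sketch of the check-inside-the-coroutine arrangement the PR settled on. The `Suspend` helper and `AgenShim` class are hypothetical scaffolding, just a way to park a coroutine without an event loop:

```python
class Suspend:
    # awaiting this yields control once, leaving the coroutine suspended
    def __await__(self):
        yield

class AgenShim:
    def __init__(self):
        self.ag_running = False

    async def _step(self):
        # The check and the set both live inside the coroutine body, so
        # they only run once the asend() awaitable actually starts.
        if self.ag_running:
            raise ValueError("async generator already executing")
        self.ag_running = True
        try:
            await Suspend()
        finally:
            self.ag_running = False

ag = AgenShim()
first = ag._step()
first.send(None)            # starts and suspends; ag_running is now True

second = ag._step()
try:
    second.send(None)       # overlapping step: rejected
except ValueError as exc:
    error = str(exc)

try:
    first.send(None)        # resume the first step to completion
except StopIteration:
    pass

assert error == "async generator already executing"
assert not ag.ag_running    # the finally block reset the flag
```

Note that `second` raises before it ever sets the flag, so the first step's `finally` is the only place `ag_running` gets cleared.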


################################################################
# Cleanup
################################################################

async def aclose(self):
    state = getcoroutinestate(self._coroutine)
    if state is CORO_CLOSED or self._closed:
        return
    self._closed = True
njsmith (Member):
What's this code for?

oremanj (Member, Author):
It ensures that if aclose() fails due to a yield during stack unwinding, we still consider the generator "as closed as it's going to get", and don't try again to unwind it during __del__. I added a comment to clarify.

njsmith (Member):
ah, nice catch!
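For context on the failure mode being discussed: here is the analogous case with a native async generator, driven by hand. Yielding while unwinding from `aclose()` makes the close fail partway, which is why the generator then needs to be remembered as "as closed as it's going to get". This is an illustrative reproduction, not code from the PR:

```python
async def yields_when_unwinding():
    try:
        yield 1
    finally:
        yield 2   # yielding during GeneratorExit unwinding is illegal

ag = yields_when_unwinding()
try:
    ag.asend(None).send(None)   # advance to the first yield
except StopIteration:
    pass

close_failed = False
try:
    ag.aclose().send(None)      # throws GeneratorExit in at `yield 1`
except RuntimeError:
    # CPython reports that the async generator ignored GeneratorExit
    close_failed = True

assert close_failed
```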

gen.__del__()
for turns in (1, 2, 3):
    gen = awaits_when_unwinding()
    for turn in range(1, turns + 1):
njsmith (Member):
If you make it yield 0, 1, 2 then this will be a bit simpler :-)

oremanj (Member, Author):
done

await gen.aclose()
# Closed, so no problem
gen.__del__()
for turns in (1, 2, 3):
njsmith (Member):
Can you rename turns to stop_after_turn or similar, to make it more immediately obvious what this is doing?

oremanj (Member, Author):
done

with pytest.raises(TypeError):
    set_asyncgen_hooks(finalizer=False)

def in_thread(results=[]):
njsmith (Member):
nitpick: we always pass results to in_thread, so it should be a mandatory argument (no default)

oremanj (Member, Author):
done (and using a list as a default argument, too - what was I even thinking?)
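The classic pitfall behind that aside, for reference: a mutable default is evaluated once, at function definition time, and then shared across every call:

```python
def collect(item, results=[]):   # bug: one shared list for every call
    results.append(item)
    return results

assert collect(1) is collect(2)  # same list object both times
assert collect(3) == [1, 2, 3]   # earlier items are still there
```

Passing the list explicitly on every call (as the fixed test does) sidesteps this entirely.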

like you would use ``sys.set_asyncgen_hooks()`` with native
generators. On Python 3.6+, the former is an alias for the latter,
so libraries that use the native mechanism should work seamlessly
with ``@async_generator`` functions.
njsmith (Member):
We should probably include a warning here that not all async libraries will use these hooks. For example presumably trio will use these if it uses any, but if you're using asyncio on 3.5 then it won't do the same async generator cleanup that it does on 3.6+.

oremanj (Member, Author):
good thought - I added a sentence about this. (When we teach trio about these, we can go back and say "but if you're using trio, it will just work!" if we want. :P)

@oremanj (Member, Author) commented Apr 17, 2018

ping on this?

@njsmith njsmith merged commit 1272b9c into python-trio:master Apr 17, 2018
@njsmith (Member) commented Apr 17, 2018

Thanks!
