Add support for PEP 525-style garbage collection hooks #15
Conversation
Codecov Report
@@            Coverage Diff            @@
##           master     #15     +/-   ##
========================================
  Coverage     100%    100%
========================================
  Files           7       7
  Lines         776     972    +196
  Branches       59      77     +18
========================================
+ Hits          776     972    +196
Continue to review full report at Codecov.
async_generator/_impl.py
Outdated
    if coro_state is CORO_CREATED:
        (firstiter, self._finalizer) = get_asyncgen_hooks()
        if firstiter is not None:
            firstiter(self)
This could call the `firstiter` hook twice:

    agen = agen_function()
    agen.asend(None).close()
    agen.asend(None).close()

Pretty obscure, but native agens don't seem to call it twice in this case. We should add a test case, and I guess an explicit have-we-done-hook-setup variable.
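For illustration, a minimal sketch of the suggested guard flag. The names `FakeAgen`, `_hooks_inited`, and `_ensure_first_iter` are hypothetical stand-ins, not the actual implementation in this PR:

```python
calls = []

def get_asyncgen_hooks():
    # stand-in for the real hook getter; firstiter just records its argument
    return (calls.append, None)

class FakeAgen:
    def __init__(self):
        self._hooks_inited = False  # hypothetical have-we-done-hook-setup flag
        self._finalizer = None

    def _ensure_first_iter(self):
        # run the firstiter hook at most once, even if the first asend()
        # coroutine is closed before it starts and asend() is called again
        if not self._hooks_inited:
            self._hooks_inited = True
            firstiter, self._finalizer = get_asyncgen_hooks()
            if firstiter is not None:
                firstiter(self)

agen = FakeAgen()
agen._ensure_first_iter()
agen._ensure_first_iter()  # second call is a no-op
assert len(calls) == 1
```

The key point is that the flag is set unconditionally on the first call, so re-entering the CORO_CREATED state (first `asend()` closed before starting) can no longer re-trigger the hook.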
Thanks for spotting! Fixed and added a test.
    async def step():
        if self.ag_running:
            raise ValueError("async generator already executing")
What do you think of moving this check, and the `self.ag_running = True`, up out of the `async def`?

There would be one tricky thing to worry about: if a coroutine object gets `close`d or GCed before it even starts, then Python doesn't enforce its normal guarantee that `finally` blocks will be executed. So if we did do this I guess we'd need to add another `yield` at the top of `step`, and immediately iterate it that far when we created the coroutine object, in order to guarantee that eventually `ag_running = False` would happen. Maybe this is more work than it's worth.
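The tricky point about `close()`-before-start can be seen with a plain coroutine (a quick illustration, not code from this PR):

```python
ran_finally = []

async def coro():
    try:
        pass
    finally:
        ran_finally.append(True)

c = coro()
c.close()  # closed before the first send(): the body never runs at all,
           # so the finally block is skipped entirely
assert ran_finally == []
```

So any cleanup hoisted out of the coroutine body would need some other mechanism to guarantee it runs.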
I guess the intent would be to guard against two `asend()` coroutines existing simultaneously, even if one is exhausted before the second is started? Given the implementation difficulties, I don't think it's worth it, since it seems impossible to actually interleave the underlying `send()` calls in a way that would create confusion. (I originally put the `ag_running` test outside of `step()` with the setting-to-True still inside, and that did create potential problems, which is why I added `test_reentrance_forbidden_simultaneous_asends`.)
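A reduced sketch of why keeping both the check and the flag-set inside the coroutine body works: neither takes effect until a given `asend()` coroutine actually starts, and the `finally` resets the flag when it finishes. (`_Suspend`, `State`, and `step` here are illustrative stand-ins, not this PR's code.)

```python
class _Suspend:
    def __await__(self):
        yield  # suspend once, like awaiting inside the generator body

class State:
    ag_running = False

async def step(state):
    # check-and-set both live inside the coroutine body, so they only
    # take effect once this particular asend() coroutine is started
    if state.ag_running:
        raise ValueError("async generator already executing")
    state.ag_running = True
    try:
        await _Suspend()
    finally:
        state.ag_running = False

state = State()
c1 = step(state)
c2 = step(state)
c1.send(None)          # start c1; it suspends mid-step with the flag set
overlapped = False
try:
    c2.send(None)      # overlapping step is rejected
except ValueError:
    overlapped = True
try:
    c1.send(None)      # let c1 finish; the finally resets the flag
except StopIteration:
    pass
assert overlapped and not state.ag_running
```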
async_generator/_impl.py
Outdated
    ################################################################
    # Cleanup
    ################################################################

    async def aclose(self):
        state = getcoroutinestate(self._coroutine)
        if state is CORO_CLOSED or self._closed:
            return
        self._closed = True
What's this code for?
It ensures that if `aclose()` fails due to a yield during stack unwinding, we still consider the generator "as closed as it's going to get", and don't try again to unwind it during `__del__`. I added a comment to clarify.
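The shape of that guarantee, as a simplified synchronous sketch (the class and method here are stand-ins for illustration, not the PR's implementation):

```python
unwind_attempts = []

class FakeAgen:
    # _closed means "as closed as it's going to get", even if the
    # unwinding itself fails partway through
    def __init__(self):
        self._closed = False

    def aclose(self):            # synchronous stand-in for the async original
        if self._closed:
            return
        self._closed = True      # mark closed *before* unwinding, so that...
        unwind_attempts.append("aclose")
        raise RuntimeError("yield during stack unwinding")

    def __del__(self):
        if not self._closed:     # ...__del__ won't try a second unwind
            unwind_attempts.append("__del__")

agen = FakeAgen()
try:
    agen.aclose()
except RuntimeError:
    pass
agen.__del__()                   # skipped: already marked closed
assert unwind_attempts == ["aclose"]
```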
ah, nice catch!
    gen.__del__()
    for turns in (1, 2, 3):
        gen = awaits_when_unwinding()
        for turn in range(1, turns + 1):
If you make it `yield 0, 1, 2` then this will be a bit simpler :-)
done
    await gen.aclose()
    # Closed, so no problem
    gen.__del__()
    for turns in (1, 2, 3):
Can you rename `turns` → `stop_after_turn` or similar, to make it more immediately obvious what this is doing?
done
    with pytest.raises(TypeError):
        set_asyncgen_hooks(finalizer=False)

    def in_thread(results=[]):
nitpick: we always pass `results` to `in_thread`, so it should be a mandatory argument (no default)
done (and using a list as a default argument, too - what was I even thinking?)
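(For anyone puzzled by the aside: a mutable default argument is created once at function-definition time and shared across every call, which is a classic Python pitfall. `append_to` below is a made-up example, not code from this PR.)

```python
def append_to(item, bucket=[]):   # the default list is built once, at def time
    bucket.append(item)
    return bucket

first = append_to(1)
second = append_to(2)             # mutates the very same shared list
assert first is second
assert second == [1, 2]
```

Making `results` a mandatory parameter sidesteps the problem entirely.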
docs/source/reference.rst
Outdated
    like you would use ``sys.set_asyncgen_hooks()`` with native
    generators. On Python 3.6+, the former is an alias for the latter,
    so libraries that use the native mechanism should work seamlessly
    with ``@async_generator`` functions.
We should probably include a warning here that not all async libraries will use these hooks. For example presumably trio will use these if it uses any, but if you're using asyncio on 3.5 then it won't do the same async generator cleanup that it does on 3.6+.
good thought - I added a sentence about this. (When we teach trio about these, we can go back and say "but if you're using trio, it will just work!" if we want. :P)
ping on this?
Thanks!
Add `async_generator.get_asyncgen_hooks()` and `async_generator.set_asyncgen_hooks()`, and use them under the same circumstances the equivalents in `sys` would be used for native generators. On 3.6+, the `async_generator` names refer to the same functions as the `sys` names, so there's only one set of hooks, and frameworks that use `sys.set_asyncgen_hooks()` should work seamlessly with `async_generator` functions as well.

There's an issue with GC ordering or something in PyPy, which causes the new logic to make the interpreter segfault under some circumstances, so keep the old logic (which complains if any async generator is GC'ed before its execution has completed) if we detect that we're running under PyPy. The bug has been reported upstream: https://bitbucket.org/pypy/pypy/issues/2786/
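On CPython 3.6+, the "one set of hooks" behaviour can be observed directly through the `sys` functions (a quick sketch, independent of this PR's code; `my_firstiter` is a made-up name):

```python
import sys

def my_firstiter(agen):
    pass  # would normally register agen with the event loop's finalization

old_hooks = sys.get_asyncgen_hooks()  # named tuple (firstiter, finalizer)
try:
    sys.set_asyncgen_hooks(firstiter=my_firstiter)
    # the getter reflects exactly what the setter installed
    assert sys.get_asyncgen_hooks().firstiter is my_firstiter
finally:
    sys.set_asyncgen_hooks(*old_hooks)  # restore the previous hooks
```

Since the `async_generator` names alias these same functions on 3.6+, a framework only ever has to install its hooks once.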