bpo-36907: fix refcount bug in _PyStack_UnpackDict() #13381
encukou merged 2 commits into python:master from jdemeyer:_PyStack_UnpackDict_refcount
Conversation
I decided to increase the refcount in _PyStack_UnpackDict(). This should probably be backported to 3.7 too.
Objects/call.c
Outdated
Is there a reason to not keep memcpy for the bulk copy, and only add an INCREF loop?
Not really, it's mostly a matter of style. I like doing the INCREF at the same time as the assignment. I also find the for loop variant simpler to understand.
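For readers following along, here is a minimal sketch of the two styles being contrasted. The helper names and signatures are illustrative only; this is not the actual CPython patch.

    #include "Python.h"
    #include <string.h>

    /* Style 1: bulk copy with memcpy, then a separate INCREF pass
       (hypothetical helper, not the real _PyStack_UnpackDict code). */
    static void
    copy_args_memcpy(PyObject **stack, PyObject *const *args, Py_ssize_t nargs)
    {
        memcpy(stack, args, nargs * sizeof(PyObject *));
        for (Py_ssize_t i = 0; i < nargs; i++) {
            Py_INCREF(stack[i]);
        }
    }

    /* Style 2: take the new reference at the same time as the assignment. */
    static void
    copy_args_loop(PyObject **stack, PyObject *const *args, Py_ssize_t nargs)
    {
        for (Py_ssize_t i = 0; i < nargs; i++) {
            Py_INCREF(args[i]);
            stack[i] = args[i];
        }
    }

Both variants end up with the same array contents and the same reference counts; the difference is purely stylistic.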
Lib/test/test_call.py
Outdated
This is supposed to trigger garbage collection, right? Why not make it explicit?
    L = [2**i for i in range(10000)]
    gc.collect()
(with import gc at the top)
Not really, it is meant to clobber memory. The bug is a use-after-free, but a crash is only guaranteed if the freed memory is actually corrupted afterwards. But I'll add a comment explaining that.
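As background, here is a generic, deliberately buggy C sketch of why a use-after-free can go unnoticed until the freed memory is reused, which is what the large list comprehension in the test tries to force. It is unrelated to the actual CPython code.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *p = malloc(16);
        strcpy(p, "hello");
        free(p);

        /* Undefined behaviour, but with many allocators this still prints
           "hello" because the freed block has not been reused yet. */
        printf("%s\n", p);

        /* Allocating lots of fresh blocks makes it likely that the freed
           block is handed out again and overwritten ("clobbered").
           The blocks are intentionally kept alive (leaked) here. */
        for (int i = 0; i < 10000; i++) {
            char *q = malloc(16);
            memcpy(q, "clobbered!", 11);
        }

        /* ...so a later use of the dangling pointer now sees garbage or
           crashes, turning the silent bug into a visible one. */
        printf("%s\n", p);
        return 0;
    }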
IMO we just leave that to the debug allocator, or valgrind runs.
Fine for me, but I have to mention that a similar thing was accepted by you(!) in #9084.
I hope the things I learned since then are useful.
@vstinner, what's your view on this? Is it useful to clobber memory with L = [2**i for i in range(10000)]?
I'm sorry, but I don't have the bandwidth right now to follow the discussion on PEP 590 and its implementation :-( Ping me again in 1 or 2 weeks if you want ;-)
@vstinner Ah, sorry. This isn't really about PEP 590, though. It's a refcounting bug: an object is deallocated while it's still in use.
In normal builds, the bug will manifest only if you clobber the memory after it is deallocated. The debug allocator (or valgrind) should catch it even if the L = [2**i for i in range(10000)] is not done.
Is it OK to rely on the debug allocator (or valgrind) to prevent regressions?
(If you don't have bandwidth for that either, let me know)
In the meantime, I removed that line.
force-pushed from 7f45962 to bdea0de
force-pushed from bdea0de to c047904
Can this be merged please? That would be good for the PEP 590 implementation, which uses this function.
encukou left a comment:
I took the liberty of adding a gc.collect() call.
#13493 is a backport to 3.7.
A minimal patch fixing bpo-36907
CC @encukou
https://bugs.python.org/issue36907