Conversation
Hello, and thanks for your contribution! I'm a bot set up to make sure that the project can legally accept your contribution by verifying you have signed the PSF contributor agreement (CLA). Unfortunately we couldn't find an account corresponding to your GitHub username on bugs.python.org (b.p.o) to verify you have signed the CLA (this might be simply due to a missing "GitHub Name" entry in your b.p.o account settings). This is necessary for legal reasons before we can look at your contribution. Please follow the steps outlined in the CPython devguide to rectify this issue. Thanks again for your contribution and we look forward to looking at it!
I will add documentation, NEWS.d, and update Misc/ACKS in a separate commit.
   collections, and are awaiting to undergo a full collection for
   the first time. */
Py_ssize_t long_lived_pending;
struct gc_generation permanent_generation;
I would put this field after generation0.
}
PyDoc_STRVAR(gc_get_freeze_stats__doc__,
"get_freeze_stats() -> n\n"
Since this is not equivalent to what get_stats() is returning, can you rename to get_frozen_count()?
);
static PyObject *
gc_freeze(PyObject *module)
We'd prefer if you used Argument Clinic for those functions. It's very easy to use: https://docs.python.org/3/howto/clinic.html
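For context, a hedged sketch of what an Argument Clinic declaration for this function could look like (the `[clinic input]` block is processed by `Tools/clinic/clinic.py`, which generates the boilerplate between the markers; the exact text here is illustrative, not taken from the PR):

```c
/*[clinic input]
gc.freeze

Freeze all current tracked objects and ignore them for future collections.
[clinic start generated code]*/

static PyObject *
gc_freeze_impl(PyObject *module)
/*[clinic end generated code: output=... input=...]*/
{
    /* implementation goes here */
}
```

The generated `gc_freeze` wrapper and docstring then replace the hand-written `PyDoc_STRVAR` and argument-parsing code.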
for (i = 0; i < NUM_GENERATIONS; i++)
    PySys_FormatStderr(" %zd",
                       gc_list_size(GEN_HEAD(i)));
PySys_WriteStderr("\ngc: objects in permanent generation: %d",
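The stderr lines above are only emitted during a collection when the `DEBUG_STATS` flag is set; from Python the same output can be triggered like this (a small usage sketch, not part of the PR):

```python
import gc

# With DEBUG_STATS set, each collection prints per-generation sizes to
# stderr, including (with this PR) the "objects in permanent generation"
# line added above.
gc.set_debug(gc.DEBUG_STATS)
gc.collect()
gc.set_debug(0)  # restore the default, silent behaviour
```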
self.assertEqual(new[2]["collections"], old[2]["collections"] + 1)
def test_freeze(self):
    gc.freeze()
You should really undo the freezing after this test. We don't want tests to have such large side effects.

Good point! This will require adding a "thaw" or "unfreeze" function that moves objects back from the permanent generation to a properly collected one. Do you think it would be enough for that call to move all of them to gen0? This won't be exactly "undoing the freeze", but it doesn't require keeping track of which object belonged to which generation before.

Anything that empties the permanent generation is good enough IMHO. People can call gc.collect explicitly afterwards if they want to collect those objects.
PS: as a non-native English speaker, I much prefer "unfreeze" to "thaw" :-)

Oh, I didn't notice your "general comment" until now. So it seems we agree that moving everything to a single generation makes sense. But you're suggesting the oldest generation (gen2) instead. Makes sense.

OK, I was just too lazy to add it, but fine, will update.
Freeze all current tracked objects and ignore them for future collections.

This can be used before a fork to make the gc copy-on-write friendly.
Rather than "a fork", perhaps use the wording "a POSIX fork() call", as that may not be obvious to all readers.
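A hedged sketch of the pre-fork pattern this docstring describes (POSIX only; `shared_data` is a made-up name for illustration). Freezing before `fork()` keeps the collector from rewriting `PyGC_Head` fields in the child, so memory pages shared with the parent stay copy-on-write clean:

```python
import gc
import os

# Long-lived objects the children will only read.
shared_data = [str(i) for i in range(100_000)]

gc.disable()   # avoid a collection sneaking in between freeze() and fork()
gc.freeze()    # move every tracked object into the permanent generation

if hasattr(os, "fork"):           # fork() is POSIX-only
    pid = os.fork()
    if pid == 0:
        # Child: re-enable the GC; frozen objects are never examined.
        gc.enable()
        os._exit(0)
    os.waitpid(pid, 0)
```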
Two general comments:
gc_unfreeze_impl(PyObject *module)
/*[clinic end generated code: output=1c15f2043b25e169 input=2dd52b170f4cef6c]*/
{
    gc_list_merge(&_PyRuntime.gc.permanent_generation.head, GEN_HEAD(NUM_GENERATIONS-1));
Do we need to update _PyRuntime.gc.permanent_generation.count after this?

No, it's always 0 since we don't use it; gc_list_size() is used for the count instead.
`count` is not always the number of objects in a generation (it is for generation 0, but not for 1 and 2).
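An illustration of that last point from the Python side: `gc.get_count()` exposes those per-generation counters, and only the first one is an object count.

```python
import gc

# For generation 0 the counter tracks (roughly) tracked allocations since
# the last collection; for generations 1 and 2 it counts how many times the
# next younger generation has been collected, not how many objects they hold.
gen0, gen1, gen2 = gc.get_count()
print(gen0, gen1, gen2)

# gc.get_freeze_count(), by contrast, reports a real object count (computed
# by walking the list, as gc_list_size() does in C), which is why the
# permanent generation's own `count` field can stay at zero.
print(gc.get_freeze_count())
```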
pitrou left a comment:

The current version of the PR looks basically good to me. Thank you!
Is this fix already in the 3.7.0 release?

@Somewater I think you misunderstood what

Thanks for the answer. But can I manage without fully disabling the GC? I have a big structure in the parent process's memory (Python objects) and have to read it from forked children.

If you're actually using those objects in the children, they will be duplicated because of reference counting.
Introduces a new API that allows for moving all objects currently tracked by the garbage collector to a permanent generation, effectively removing them from future collection events. This can be used to protect those objects from having their PyGC_Head mutated. In effect, this enables great copy-on-write stability at fork(). More details on the issue: https://bugs.python.org/issue31558
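As noted in the conversation above, merely *reading* shared objects in a forked child still dirties their memory pages, because taking any new reference writes the object's refcount field in place. A minimal demonstration:

```python
import sys

obj = object()
before = sys.getrefcount(obj)  # getrefcount's own argument adds one temporarily
alias = obj                    # a plain read access that creates a reference
after = sys.getrefcount(obj)
assert after == before + 1     # the object's header was mutated in place
```

gc.freeze() protects the `PyGC_Head` fields from the collector, but refcount updates like this one still trigger copy-on-write for any page holding objects the child touches.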