Details
- Type: Bug
- Status: Open
- Priority: P3
- Resolution: Unresolved
- 2.14.0
- None
- None
- OS: Debian rodete
- Tested using:
  - Beam versions: 2.13.0, 2.15.0.dev
  - Python versions: 2.7, 3.7
  - Runners: DirectRunner, DataflowRunner
Description
It seems that Apache Beam's Pipeline instances are not garbage collected, even after the pipelines have finished or been cancelled and no references to them remain in the Python interpreter.
For pipelines executed in a script, this is not a problem. However, for interactive pipelines executed inside a Jupyter notebook, this limits how well we can track and remove the dependencies of those pipelines. For example, if a pipeline reads from some cache, it would be nice to be able to delete that cache once there are no references to it from pipelines or the global namespace.
The issue can be reproduced using the following script: https://gist.github.com/ostrokach/a16556dc77c96b87fe23c2fbd8fb6346.
On further examination, it turns out that this is caused by the _PubSubReadEvaluator._subscription_cache class attribute, which keeps references to all ReadFromPubSub transforms.
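The mechanism can be illustrated in pure Python without Beam. This is a minimal sketch, not Beam's actual code: the Transform class and its cache are hypothetical stand-ins for ReadFromPubSub and _PubSubReadEvaluator._subscription_cache. A class-level dict holds a strong reference to every cached instance, so deleting the last user-visible reference does not free the object:

```python
import gc
import weakref


class Transform:
    """Hypothetical stand-in for a Beam transform such as ReadFromPubSub."""

    # Class-level cache, analogous in spirit to
    # _PubSubReadEvaluator._subscription_cache: it keeps a strong
    # reference to every instance ever cached.
    _subscription_cache = {}

    def __init__(self, topic):
        self.topic = topic
        Transform._subscription_cache[id(self)] = self


t = Transform("projects/p/topics/t")
ref = weakref.ref(t)
del t
gc.collect()
# The instance survives: the class attribute still references it.
assert ref() is not None

# Clearing the cache (or storing entries in a weakref.WeakValueDictionary
# instead of a plain dict) lets the garbage collector reclaim it.
Transform._subscription_cache.clear()
gc.collect()
assert ref() is None
```

One possible fix along these lines would be to make the cache hold weak references, so cached transforms do not outlive their pipelines.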