Details
Type: Improvement
Status: Resolved
Priority: P2
Resolution: Fixed
Component: Python SDK
Apache Beam version: 2.13.0
Worker type: n1-standard-4
Description
I'm developing a streaming pipeline with high memory consumption in one of the PTransforms.
Some time after starting, the pipeline fails without any specific logs (see the attached file).
It looks like this happens because of an OutOfMemory error.
It would be great to be able to limit the number of threads used in a single worker, to control the memory load.
I found such an option in the Java SDK (--numberOfWorkerHarnessThreads), but it is absent from the Python SDK.