Details
- Type: Bug
- Priority: Trivial
- Status: Resolved
- Resolution: Fixed
Description
As Giacomo originally reported:
In ml_ops.sh there are both:
    --num-executors ${SPK_EXEC} \
and:
    --conf spark.dynamicAllocation.enabled=true \
which together trigger the warning:
    WARN spark.SparkContext: Dynamic Allocation and num executors both set, thus dynamic allocation disabled.
Shouldn't we remove "--num-executors" and instead add:
    --conf spark.dynamicAllocation.maxExecutors=${SPK_EXEC} \
?
We still need to decide whether to keep allocation dynamic or use a fixed number of executors.
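A minimal sketch of the proposed fix, assuming SPK_EXEC holds the desired executor cap (the script name ml_ops.sh and the variable come from the report; the rest of the invocation is illustrative). Spark treats spark.dynamicAllocation.maxExecutors as an upper bound, so passing it instead of --num-executors keeps dynamic allocation active while still limiting cluster usage:

```shell
#!/bin/sh
# Illustrative value; in ml_ops.sh this is set elsewhere.
SPK_EXEC=8

# Build the spark-submit options: drop --num-executors (which would
# disable dynamic allocation) and cap executors via maxExecutors.
SPARK_ARGS="--conf spark.dynamicAllocation.enabled=true \
--conf spark.dynamicAllocation.maxExecutors=${SPK_EXEC}"

echo "$SPARK_ARGS"
```

The resulting arguments would then be spliced into the existing spark-submit call in place of the two conflicting options.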