Details
- Type: Sub-task
- Status: Open
- Priority: Major
- Resolution: Unresolved
Description
Spark has an executor blacklist feature; when a task cannot be scheduled because the relevant nodes and executors are blacklisted, Spark aborts the task set with an error similar to the following:
Aborting TaskSet 52.0 because task 0 (partition 0) cannot run anywhere due to node and executor blacklist. Blacklisting behavior can be configured via spark.blacklist.*.
I think the message changed in Spark 2.4.0, but it's similar to the one above.
It would be good to have some custom parsing logic and a custom ErrorMsg for Spark blacklist errors.
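As a minimal sketch of what that parsing logic could look like: the snippet below matches the blacklist abort message with a regex and extracts its structured fields. All names here (`BLACKLIST_ERROR_RE`, `parse_blacklist_error`) are illustrative, not from any existing codebase, and the regex assumes the message format quoted above.

```python
import re

# Hypothetical regex for the Spark blacklist abort message quoted above;
# the exact wording may differ slightly across Spark versions (e.g. 2.4.0).
BLACKLIST_ERROR_RE = re.compile(
    r"Aborting TaskSet (?P<taskset>\S+) because task (?P<task>\d+) "
    r"\(partition (?P<partition>\d+)\) cannot run anywhere due to "
    r"node and executor blacklist"
)


def parse_blacklist_error(message: str):
    """Return structured fields if the message is a Spark blacklist
    abort, or None if it is some other error."""
    match = BLACKLIST_ERROR_RE.search(message)
    if match is None:
        return None
    return {
        "taskset": match.group("taskset"),
        "task": int(match.group("task")),
        "partition": int(match.group("partition")),
        # A custom, user-facing explanation could be attached here.
        "hint": "Blacklisting behavior can be configured via spark.blacklist.*",
    }
```

A custom ErrorMsg could then be built from the returned fields (task set, task, partition) rather than surfacing the raw log line.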