Details
- Type: Bug
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 3.4.0
- Fix Version/s: None
- Component/s: None
- Environment:
```text
scala> println(spark.version)
3.4.0

scala> println(sc.master)
local[*]
```
Description
While running in `local[*]` mode, the following `SparkException` is thrown:
```text
org.apache.spark.SparkException: TaskResourceProfiles are only supported for Standalone cluster for now when dynamic allocation is disabled.
at org.apache.spark.resource.ResourceProfileManager.isSupported(ResourceProfileManager.scala:71)
at org.apache.spark.resource.ResourceProfileManager.addResourceProfile(ResourceProfileManager.scala:126)
at org.apache.spark.rdd.RDD.withResources(RDD.scala:1802)
... 42 elided
```
This happens for the following snippet:
```scala
import org.apache.spark.resource.ResourceProfileBuilder

val rdd = sc.range(0, 9)
val rpb = new ResourceProfileBuilder
// No executor resources are requested, so build() yields a TaskResourceProfile
val rp1 = rpb.build()
rdd.withResources(rp1) // throws the SparkException above in local[*] mode
```
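The rejection comes from `ResourceProfileManager.isSupported` (see the stack trace). As a rough, dependency-free sketch of the gate the error message describes (simplified and illustrative only: `Profile`, `TaskOnlyProfile`, `DefaultProfile`, and the `spark://` prefix test are stand-ins, not Spark's actual implementation):

```scala
// Illustrative sketch of the gating behavior described by the error message;
// this is NOT Spark's real ResourceProfileManager code.
sealed trait Profile
case object DefaultProfile extends Profile
// Stand-in for a TaskResourceProfile, which an empty ResourceProfileBuilder
// produces when no executor resources were requested.
case object TaskOnlyProfile extends Profile

def isSupported(p: Profile, master: String, dynamicAllocationEnabled: Boolean): Boolean =
  p match {
    case DefaultProfile => true
    case TaskOnlyProfile =>
      // Per the error message: task-level profiles are only accepted on a
      // Standalone cluster (master "spark://...") with dynamic allocation off.
      if (master.startsWith("spark://") && !dynamicAllocationEnabled) true
      else throw new IllegalArgumentException(
        "TaskResourceProfiles are only supported for Standalone cluster for now " +
          "when dynamic allocation is disabled.")
  }
```

Under this reading, the snippet fails simply because the empty builder yields a task-level profile and the master is `local[*]` rather than a Standalone cluster.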