[SPARK-30560] Allow driver to consume a fractional core


Details

    • Type: Improvement
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: Scheduler, Spark Core
    • Labels: None

    Description

      See https://stackoverflow.com/questions/56781927/apache-spark-standalone-scheduler-why-does-driver-need-a-whole-core-in-cluste

      The goal is to let a driver request a fractional core (e.g. 0.2 cores) rather than a whole core, so the standalone scheduler does not reserve a full CPU for a mostly idle driver.

      This concerns standard CPUs only; no GPUs are involved.
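
      In standalone cluster mode today, the driver's CPU request is set with `spark.driver.cores` (or `--driver-cores` on `spark-submit`), which accepts only whole integers and defaults to 1. A minimal sketch of the current setting versus what this issue proposes (the fractional value is hypothetical syntax, not something Spark accepts today):

      ```properties
      # Current behavior: spark.driver.cores must be a whole number (default 1),
      # so the standalone master reserves one full core per driver.
      spark.driver.cores  1

      # Proposed (hypothetical): allow a fractional request such as 0.2,
      # letting the master pack several lightweight drivers onto one core.
      spark.driver.cores  0.2
      ```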

          People

            Assignee: Unassigned
            Reporter: toopt4