Spark / SPARK-24889

dataset.unpersist() doesn't update storage memory stats


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.2, 2.4.0
    • Component/s: Spark Core
    • Labels: None

    Description

      Steps to reproduce:

      1) Start a Spark cluster, and check the storage memory value on the Spark Web UI "Executors" tab (it should be zero on a freshly started cluster)

      2) Run:

      val df = spark.sqlContext.range(1, 1000000000)
      df.cache()
      df.count()
      df.unpersist(true)

      3) Check the storage memory value again: it now shows about 1 GB, even though the dataset was unpersisted

      It looks like the memory is actually released, but the stats aren't updated. This makes cluster management more complicated.
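
      The "memory is actually released" claim can be checked from the driver rather than the UI. A minimal sketch (Scala, run in spark-shell against an existing `spark` session; the printed values are illustrative, not guaranteed):

      // Reproduce the cache/unpersist cycle:
      val df = spark.range(1, 1000000000L)
      df.cache()
      df.count()

      // Storage level the dataset reports while cached
      // (MEMORY_AND_DISK by default for Dataset.cache()):
      println(df.storageLevel)

      df.unpersist(blocking = true)

      // After unpersist the dataset reports StorageLevel.NONE:
      println(df.storageLevel)

      // Per-executor (max memory, remaining memory) as tracked by the
      // driver; remaining memory should return to its pre-cache value,
      // confirming the blocks were freed despite the stale UI stat:
      spark.sparkContext.getExecutorMemoryStatus.foreach {
        case (host, (max, free)) => println(s"$host: max=$max free=$free")
      }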

      Attachments

        1. image-2018-07-23-10-53-58-474.png
          20 kB
          Yuri Bogomolov

        Issue Links

          Activity

            People

              Assignee: L. C. Hsieh (viirya)
              Reporter: Yuri Bogomolov (bogomolov)
              Votes: 0
              Watchers: 7

              Dates

                Created:
                Updated:
                Resolved: