Spark / SPARK-24889

dataset.unpersist() doesn't update storage memory stats


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.2, 2.4.0
    • Component/s: Spark Core
    • Labels: None

    Description

      Steps to reproduce:

      1) Start a Spark cluster and check the storage memory value on the Spark Web UI "Executors" tab (it should be zero if the cluster has just started)

      2) Run:

      val df = spark.sqlContext.range(1, 1000000000)
      df.cache()
      df.count()
      df.unpersist(true)

      3) Check the storage memory value again: it now shows 1 GB, even though the data has been unpersisted

      It looks like the memory is actually released, but the stats are not updated. This makes cluster management and monitoring more complicated.
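      A minimal sketch (not part of the original report) of one way to confirm from the spark-shell that the memory really is freed, independently of the Executors tab: `SparkContext.getExecutorMemoryStatus` reports, per executor, the maximum memory available for storage and the memory currently remaining, so used storage can be compared before and after the unpersist. The `storageUsed` helper below is hypothetical; the sleep interval is an assumption to let block-removal events propagate.

      ```scala
      // Paste into spark-shell. getExecutorMemoryStatus returns
      // Map[executor -> (maxMemoryForStorage, remainingMemory)] in bytes;
      // if unpersist(true) really frees the cached blocks, "remaining"
      // should climb back toward "max".
      def storageUsed(): Long =
        spark.sparkContext.getExecutorMemoryStatus.values
          .map { case (max, remaining) => max - remaining }
          .sum

      val df = spark.range(1, 1000000000L)
      df.cache()
      df.count()
      println(s"storage used after cache:     ${storageUsed()} bytes")

      df.unpersist(true)
      Thread.sleep(5000) // assumed grace period for block-removal events
      println(s"storage used after unpersist: ${storageUsed()} bytes")
      ```

      If the second figure drops back near zero while the Executors tab still shows 1 GB, that points at the UI stats rather than the block manager, consistent with the behavior described above.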

      Attachments

        Issue Links

        Activity


          People

            Assignee: L. C. Hsieh (viirya)
            Reporter: Yuri Bogomolov (bogomolov)
            Votes: 0
            Watchers: 7

            Dates

              Created:
              Updated:
              Resolved:
