SPARK-8309: OpenHashMap doesn't work with more than 12M items


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.3.1, 1.4.0
    • Fix Version/s: 1.3.2, 1.4.1, 1.5.0
    • Component/s: Spark Core
    • Labels: None

    Description

      The problem can be demonstrated with the following test case (OpenHashMap is org.apache.spark.util.collection.OpenHashMap, exercised from a ScalaTest suite):

        test("support for more than 12M items") {
          val cnt = 12000000 // 12M
          val map = new OpenHashMap[Int, Int](cnt)
          for (i <- 0 until cnt) {
            map(i) = 1
          }
          // Every key was inserted with value 1, so any value of 0 indicates a lost entry.
          val numInvalidValues = map.iterator.count(_._2 == 0)
          assertResult(0)(numInvalidValues)
        }
      
      
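      A minimal standalone reproduction of the same scenario, outside a test suite, might look like the sketch below. It is not part of the original report: the package and object names are made up for illustration, it assumes spark-core is on the classpath, and because OpenHashMap is package-private to Spark the file would have to be compiled somewhere under the org.apache.spark package.

        // Hypothetical reproduction sketch for SPARK-8309 (package/object names are illustrative).
        // OpenHashMap is private[spark], so this must live under the org.apache.spark package.
        package org.apache.spark.repro

        import org.apache.spark.util.collection.OpenHashMap

        object ReproSpark8309 {
          def main(args: Array[String]): Unit = {
            val cnt = 12000000 // same 12M size as in the test case above
            val map = new OpenHashMap[Int, Int](cnt)
            for (i <- 0 until cnt) {
              map(i) = 1
            }
            // The report's test asserts this count is 0; on affected versions
            // (1.3.1, 1.4.0) some entries are reported with value 0 instead of 1.
            val numInvalidValues = map.iterator.count(_._2 == 0)
            println(s"entries with invalid value: $numInvalidValues")
          }
        }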


          People

            Assignee: Vyacheslav Baranov (slavik.baranov)
            Reporter: Vyacheslav Baranov (wildfire)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated:
              Resolved: