SPARK-17521: Error when I use sparkContext.makeRDD(Seq())


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 2.0.0
    • Fix Version/s: 2.0.1, 2.1.0
    • Component/s: Spark Core

    Description

      When I use sc.makeRDD as below:
      ```
      val data3 = sc.makeRDD(Seq())
      println(data3.partitions.length)
      ```
      I got an error:
      Exception in thread "main" java.lang.IllegalArgumentException: Positive number of slices required
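      The empty Seq() apparently resolves to the makeRDD overload that takes location preferences (shown further below), which passes seq.size, i.e. 0, as the number of slices. As a workaround sketch (my own note, not part of this report), an empty RDD can still be built through calls that do not derive the slice count from the sequence length:
      ```
      // Workaround sketch (assumption, not part of the reported fix):
      // an explicit element type selects the makeRDD(seq: Seq[T], numSlices) overload,
      // whose slice count defaults to defaultParallelism rather than seq.size.
      val data1 = sc.makeRDD(Seq.empty[Int])
      println(data1.partitions.length)  // defaultParallelism, no exception

      // sc.emptyRDD gives an empty RDD with zero partitions.
      val data2 = sc.emptyRDD[Int]
      println(data2.partitions.length)  // 0
      ```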
      We can fix this bug by modifying the last line of this makeRDD overload to check seq.size:
      ```
      def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
        assertNotStopped()
        val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
        new ParallelCollectionRDD[T](this, seq.map(_._1), seq.size, indexToPrefs)
      }
      ```
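
      One way to read "do a check of seq.size" is to clamp the slice count to at least 1, so an empty Seq yields a valid (empty) single-partition RDD instead of tripping the assertion. A minimal sketch of that idea, assuming a math.max guard (this is my reading of the suggestion, not necessarily the committed patch):
      ```
      def makeRDD[T: ClassTag](seq: Seq[(T, Seq[String])]): RDD[T] = withScope {
        assertNotStopped()
        val indexToPrefs = seq.zipWithIndex.map(t => (t._2, t._1._2)).toMap
        // Never request fewer than one slice, even for an empty seq.
        new ParallelCollectionRDD[T](this, seq.map(_._1), math.max(seq.size, 1), indexToPrefs)
      }
      ```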

    People

      Assignee: codlife Jianfei Wang
      Reporter: codlife Jianfei Wang
      Votes: 0
      Watchers: 2
