SPARK-30158: Resolve Array + reference type compile problems in 2.13, with sc.parallelize

Parent: SPARK-25075 Build and test Spark against Scala 2.13


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.0
    • Component/s: Spark Core, SQL
    • Labels: None

    Description

      Scala 2.13 resolves an Array as a Seq differently when the array's element type is a reference type. This primarily affects calls to sc.parallelize(Array(...)) where the elements aren't primitives:

      [ERROR] [Error] /Users/seanowen/Documents/spark_2.13/mllib/src/main/scala/org/apache/spark/mllib/pmml/PMMLExportable.scala:61: overloaded method value apply with alternatives:
        (x: Unit,xs: Unit*)Array[Unit] <and>
        (x: Double,xs: Double*)Array[Double] <and>
        ...
      

      This is easy to resolve by using Seq instead.
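
      For illustration, a minimal before/after sketch (assuming a SparkContext named sc, as in spark-shell; this is not the exact code from PMMLExportable):

        // Fails to compile on 2.13 when the element type is a reference type
        // (String here): the overloaded Array.apply call no longer resolves
        // against parallelize's Seq[T] parameter.
        // val bad = sc.parallelize(Array("a", "b", "c"))

        // Passing a Seq compiles on both 2.12 and 2.13:
        val ok = sc.parallelize(Seq("a", "b", "c"))
        ok.count()  // 3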

      Closely related: WrappedArray is a type alias (for ArraySeq) in 2.13, which makes it unusable from Java. One set of tests needs to adapt.
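
      A rough sketch of the WrappedArray change on 2.13 (the alias shown matches the 2.13 standard library; the assignment below is only illustrative, not the affected test code):

        import scala.collection.mutable

        // In 2.13 the standard library defines, roughly,
        //   type WrappedArray[X] = ArraySeq[X]
        // in the scala.collection.mutable package object. A type alias has no
        // class of its own on the JVM, so Java sources can no longer name
        // WrappedArray and must use ArraySeq (or Seq) instead. From Scala the
        // alias still compiles, with a deprecation warning:
        val xs: mutable.WrappedArray[Int] = mutable.ArraySeq.make(Array(1, 2, 3))
        xs.sum  // 6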

      Attachments

        Activity


          People

            Assignee: Sean R. Owen (srowen)
            Reporter: Sean R. Owen (srowen)
            Votes: 0
            Watchers: 0

            Dates

              Created:
              Updated:
              Resolved:
