SPARK-35838: Ensure all modules can be Maven-tested independently in Scala 2.13
Parent: SPARK-25075 Build and test Spark against Scala 2.13


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.2.0
    • Fix Version/s: 3.2.0
    • Component/s: Build
    • Labels: None

    Description

       

      Execute:

      mvn clean install -Phadoop-3.2 -Phive-2.3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive -Pscala-2.13 -pl external/kafka-0-10-sql

      One Scala test run aborted; the error message is:

      Discovery starting.
      Discovery completed in 857 milliseconds.
      Run starting. Expected test count is: 464
      ...
      KafkaRelationSuiteV2:
      - explicit earliest to latest offsets
      - default starting and ending offsets
      - explicit offsets
      - default starting and ending offsets with headers
      - timestamp provided for starting and ending
      - timestamp provided for starting, offset provided for ending
      - timestamp provided for ending, offset provided for starting
      - timestamp provided for starting, ending not provided
      - timestamp provided for ending, starting not provided
      - global timestamp provided for starting and ending
      - no matched offset for timestamp - startingOffsets
      - preferences on offset related options
      - no matched offset for timestamp - endingOffsets
      *** RUN ABORTED ***
        java.lang.NoClassDefFoundError: scala/collection/parallel/TaskSupport
        at org.apache.spark.SparkContext.$anonfun$union$1(SparkContext.scala:1411)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.SparkContext.withScope(SparkContext.scala:788)
        at org.apache.spark.SparkContext.union(SparkContext.scala:1405)
        at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:697)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:182)
        at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:220)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:217)
        ...
        Cause: java.lang.ClassNotFoundException: scala.collection.parallel.TaskSupport
        at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
        at org.apache.spark.SparkContext.$anonfun$union$1(SparkContext.scala:1411)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.SparkContext.withScope(SparkContext.scala:788)
        at org.apache.spark.SparkContext.union(SparkContext.scala:1405)
        at org.apache.spark.sql.execution.UnionExec.doExecute(basicPhysicalOperators.scala:697)
        ...
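
      The NoClassDefFoundError points at Scala 2.13 itself: the parallel collections (including scala.collection.parallel.TaskSupport, which the RDD union code path in the trace references) were moved out of the standard library into the separate org.scala-lang.modules:scala-parallel-collections module, and that jar is apparently not on the kafka-0-10-sql test classpath when the module is built on its own. A minimal sketch of the pattern that breaks without it (hypothetical repro, not Spark code):

      import java.util.concurrent.ForkJoinPool
      import scala.collection.parallel.CollectionConverters._
      import scala.collection.parallel.ForkJoinTaskSupport

      object ParallelCollectionsCheck {
        def main(args: Array[String]): Unit = {
          // On 2.13, .par, TaskSupport and ForkJoinTaskSupport come from the
          // scala-parallel-collections module; code compiled against them fails at
          // runtime with NoClassDefFoundError: scala/collection/parallel/TaskSupport
          // when that jar is absent, matching the trace above.
          val parRange = (1 to 100).par
          parRange.tasksupport = new ForkJoinTaskSupport(new ForkJoinPool(4))
          println(parRange.map(_ * 2).sum)
        }
      }

      The fix is presumably to declare scala-parallel-collections explicitly (for example as a test-scoped dependency under the scala-2.13 profile) in the POMs of the modules that currently only pick it up transitively.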
      
      

       

          People

            Assignee: Yang Jie (LuciferYang)
            Reporter: Yang Jie (LuciferYang)
