[SPARK-29925] Maven Build fails with Hadoop Version 3.2.0


Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Invalid
    • Affects Version/s: 3.0.0
    • Fix Version/s: None
    • Component/s: Build
    • Labels: None

    Description

      The build fails at the Spark Core stage when Maven is run with Hadoop version 3.2.0 specified. The build command used is:

      ./build/mvn -DskipTests -Dhadoop.version=3.2.0 package
      

      The build error output is:

      [INFO] 
      [INFO] --- scala-maven-plugin:4.2.0:testCompile (scala-test-compile-first) @ spark-core_2.12 ---
      [INFO] Using incremental compilation using Mixed compile order
      [INFO] Compiling 262 Scala sources and 27 Java sources to /usr/local/src/spark/core/target/scala-2.12/test-classes ...
      [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:23: object lang is not a member of package org.apache.commons
      [ERROR] [Error] /usr/local/src/spark/core/src/test/scala/org/apache/spark/util/PropertiesCloneBenchmark.scala:49: not found: value SerializationUtils
      [ERROR] two errors found
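
      Both errors point at commons-lang 2.x: in the default build it reaches the test classpath only transitively, through the Hadoop 2.x dependency tree, so overriding hadoop.version to 3.2.0 drops it. Below is a minimal sketch of the pattern that stops compiling, assuming the benchmark deep-copies a java.util.Properties via SerializationUtils (the object and property names here are illustrative, not the actual benchmark source):

      import java.util.Properties
      // commons-lang 2.x: present transitively in the default (Hadoop 2.x) build,
      // absent once hadoop.version is overridden to 3.2.0, so this import fails
      import org.apache.commons.lang.SerializationUtils

      object PropertiesCloneExample {
        def main(args: Array[String]): Unit = {
          val props = new Properties()
          props.setProperty("spark.app.name", "demo")
          // clone() deep-copies any Serializable via Java serialization;
          // without commons-lang on the classpath this reference cannot resolve
          val copy = SerializationUtils.clone(props).asInstanceOf[Properties]
          println(copy.getProperty("spark.app.name"))
        }
      }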

      The problem does not occur when building without specifying a Hadoop version, i.e. when running:

      ./build/mvn -DskipTests package
      
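
      The "Invalid" resolution is consistent with the supported way to target Hadoop 3.2 on the 3.0 line being a dedicated Maven profile (per Spark's build documentation) rather than a bare hadoop.version override, which also keeps the transitive dependencies consistent. The profile-based invocation would look like:

      ./build/mvn -Phadoop-3.2 -DskipTests package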


People

    • Assignee: Unassigned
    • Reporter: Douglas Colkitt (dcolkitt)
    • Votes: 0
    • Watchers: 2

Dates

    • Created:
    • Updated:
    • Resolved: