
Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.2.0
    • Fix Version/s: 3.1.0
    • Component/s: MLlib
    • Labels: None

    Description

      This may be the first failed build:
      https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7-scala-2.13/52/

      Possible workaround / fix

      Move

      case class Data(word: String, vector: Array[Float])

      out of the class Word2VecModel
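
      A minimal sketch of that change follows, with hypothetical names (this is not the actual Word2Vec.scala source). The codegen error further down ("No applicable constructor/method found for zero actual parameters" on Word2VecModel$Data) suggests the generated encoder code cannot construct the nested case class, whereas a top-level case class has an ordinary constructor it can call.

      // Hedged sketch only; class names and structure are illustrative.
      // Before: Data is nested inside another class, so every instance carries an
      // outer reference, which is the pattern that failed in the suite below.
      class NestedExample(spark: org.apache.spark.sql.SparkSession) {
        case class Data(word: String, vector: Array[Float]) // inner case class

        def save(path: String): Unit =
          spark.createDataFrame(Seq(Data("a", Array(1.0f)))).write.parquet(path)
      }

      // After: Data is defined at the top level (or in a companion object), so it
      // has a plain constructor that the generated code can invoke directly.
      case class Data(word: String, vector: Array[Float])

      class TopLevelExample(spark: org.apache.spark.sql.SparkSession) {
        def save(path: String): Unit =
          spark.createDataFrame(Seq(Data("a", Array(1.0f)))).write.parquet(path)
      }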

      Attempts to git bisect

      git bisect on the master branch:
      cc23581e2645c91fa8d6e6c81dc87b4221718bb1 fail
      3d0323401f7a3e4369a3d3f4ff98f15d19e8a643 fail
      9d9d4a8e122cf1137edeca857e925f7e76c1ace2 fail
      f5d2165c95fe83f24be9841807613950c1d5d6d0 fail 2020-12-01

      Attached Stack Trace

      To reproduce it in master:

      ./dev/change-scala-version.sh 2.13

      sbt -Pscala-2.13
      > project mllib
      > testOnly org.apache.spark.ml.feature.Word2VecSuite

      [info] Word2VecSuite:
      [info] - params (45 milliseconds)
      [info] - Word2Vec (5 seconds, 768 milliseconds)
      [info] - getVectors (549 milliseconds)
      [info] - findSynonyms (222 milliseconds)
      [info] - window size (382 milliseconds)
      [info] - Word2Vec read/write numPartitions calculation (1 millisecond)
      [info] - Word2Vec read/write (669 milliseconds)
      [info] - Word2VecModel read/write *** FAILED *** (423 milliseconds)
      [info] org.apache.spark.SparkException: Job aborted.
      [info] at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:231)
      [info] at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:188)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:108)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:106)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:131)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
      [info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [info] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
      [info] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
      [info] at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:132)
      [info] at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:131)
      [info] at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
      [info] at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
      [info] at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
      [info] at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
      [info] at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
      [info] at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
      [info] at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
      [info] at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
      [info] at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
      [info] at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
      [info] at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:874)
      [info] at org.apache.spark.ml.feature.Word2VecModel$Word2VecModelWriter.saveImpl(Word2Vec.scala:368)
      [info] at org.apache.spark.ml.util.MLWriter.save(ReadWrite.scala:168)
      [info] at org.apache.spark.ml.util.MLWritable.save(ReadWrite.scala:287)
      [info] at org.apache.spark.ml.util.MLWritable.save$(ReadWrite.scala:287)
      [info] at org.apache.spark.ml.feature.Word2VecModel.save(Word2Vec.scala:207)
      [info] at org.apache.spark.ml.util.DefaultReadWriteTest.testDefaultReadWrite(DefaultReadWriteTest.scala:51)
      [info] at org.apache.spark.ml.util.DefaultReadWriteTest.testDefaultReadWrite$(DefaultReadWriteTest.scala:42)
      [info] at org.apache.spark.ml.feature.Word2VecSuite.testDefaultReadWrite(Word2VecSuite.scala:28)
      [info] at org.apache.spark.ml.feature.Word2VecSuite.$anonfun$new$25(Word2VecSuite.scala:205)
      [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
      [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
      [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
      [info] at org.scalatest.Transformer.apply(Transformer.scala:22)
      [info] at org.scalatest.Transformer.apply(Transformer.scala:20)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
      [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:176)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
      [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
      [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:61)
      [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
      [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
      [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:61)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:233)
      [info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
      [info] at scala.collection.immutable.List.foreach(List.scala:333)
      [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
      [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
      [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:233)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:232)
      [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
      [info] at org.scalatest.Suite.run(Suite.scala:1112)
      [info] at org.scalatest.Suite.run$(Suite.scala:1094)
      [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:237)
      [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:237)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:236)
      [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:61)
      [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
      [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
      [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
      [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:61)
      [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
      [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
      [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
      [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      [info] at java.lang.Thread.run(Thread.java:748)
      [info] Cause: java.util.concurrent.ExecutionException: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 73, Column 65: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 73, Column 65: No applicable constructor/method found for zero actual parameters; candidates are: "public java.lang.String org.apache.spark.ml.feature.Word2VecModel$Data.word()"
      [info] at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:306)
      [info] at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:293)
      [info] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
      [info] at com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:135)
      [info] at com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2410)
      [info] at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2380)
      [info] at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
      [info] at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257)
      [info] at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
      [info] at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004)
      [info] at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
      [info] at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1351)
      [info] at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:721)
      [info] at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:720)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
      [info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [info] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
      [info] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
      [info] at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:295)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
      [info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [info] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
      [info] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
      [info] at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:177)
      [info] at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:188)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:108)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:106)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:131)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
      [info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [info] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
      [info] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
      [info] at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:132)
      [info] at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:131)
      [info] at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
      [info] at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
      [info] at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
      [info] at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
      [info] at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
      [info] at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
      [info] at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
      [info] at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
      [info] at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
      [info] at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
      [info] at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:874)
      [info] at org.apache.spark.ml.feature.Word2VecModel$Word2VecModelWriter.saveImpl(Word2Vec.scala:368)
      [info] at org.apache.spark.ml.util.MLWriter.save(ReadWrite.scala:168)
      [info] at org.apache.spark.ml.util.MLWritable.save(ReadWrite.scala:287)
      [info] at org.apache.spark.ml.util.MLWritable.save$(ReadWrite.scala:287)
      [info] at org.apache.spark.ml.feature.Word2VecModel.save(Word2Vec.scala:207)
      [info] at org.apache.spark.ml.util.DefaultReadWriteTest.testDefaultReadWrite(DefaultReadWriteTest.scala:51)
      [info] at org.apache.spark.ml.util.DefaultReadWriteTest.testDefaultReadWrite$(DefaultReadWriteTest.scala:42)
      [info] at org.apache.spark.ml.feature.Word2VecSuite.testDefaultReadWrite(Word2VecSuite.scala:28)
      [info] at org.apache.spark.ml.feature.Word2VecSuite.$anonfun$new$25(Word2VecSuite.scala:205)
      [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
      [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
      [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
      [info] at org.scalatest.Transformer.apply(Transformer.scala:22)
      [info] at org.scalatest.Transformer.apply(Transformer.scala:20)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
      [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:176)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
      [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
      [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:61)
      [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
      [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
      [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:61)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:233)
      [info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
      [info] at scala.collection.immutable.List.foreach(List.scala:333)
      [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
      [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
      [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:233)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:232)
      [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
      [info] at org.scalatest.Suite.run(Suite.scala:1112)
      [info] at org.scalatest.Suite.run$(Suite.scala:1094)
      [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:237)
      [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:237)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:236)
      [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:61)
      [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
      [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
      [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
      [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:61)
      [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
      [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
      [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
      [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      [info] at java.lang.Thread.run(Thread.java:748)
      [info] Cause: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 73, Column 65: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 73, Column 65: No applicable constructor/method found for zero actual parameters; candidates are: "public java.lang.String org.apache.spark.ml.feature.Word2VecModel$Data.word()"
      [info] at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:1415)
      [info] at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1500)
      [info] at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1497)
      [info] at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
      [info] at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
      [info] at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
      [info] at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2257)
      [info] at com.google.common.cache.LocalCache.get(LocalCache.java:4000)
      [info] at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4004)
      [info] at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
      [info] at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:1351)
      [info] at org.apache.spark.sql.execution.WholeStageCodegenExec.liftedTree1$1(WholeStageCodegenExec.scala:721)
      [info] at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:720)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
      [info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [info] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
      [info] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
      [info] at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.doExecute(AdaptiveSparkPlanExec.scala:295)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
      [info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [info] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
      [info] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
      [info] at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:177)
      [info] at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:188)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:108)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:106)
      [info] at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:131)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
      [info] at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
      [info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
      [info] at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
      [info] at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
      [info] at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:132)
      [info] at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:131)
      [info] at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
      [info] at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
      [info] at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
      [info] at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
      [info] at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
      [info] at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
      [info] at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
      [info] at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
      [info] at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
      [info] at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
      [info] at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:874)
      [info] at org.apache.spark.ml.feature.Word2VecModel$Word2VecModelWriter.saveImpl(Word2Vec.scala:368)
      [info] at org.apache.spark.ml.util.MLWriter.save(ReadWrite.scala:168)
      [info] at org.apache.spark.ml.util.MLWritable.save(ReadWrite.scala:287)
      [info] at org.apache.spark.ml.util.MLWritable.save$(ReadWrite.scala:287)
      [info] at org.apache.spark.ml.feature.Word2VecModel.save(Word2Vec.scala:207)
      [info] at org.apache.spark.ml.util.DefaultReadWriteTest.testDefaultReadWrite(DefaultReadWriteTest.scala:51)
      [info] at org.apache.spark.ml.util.DefaultReadWriteTest.testDefaultReadWrite$(DefaultReadWriteTest.scala:42)
      [info] at org.apache.spark.ml.feature.Word2VecSuite.testDefaultReadWrite(Word2VecSuite.scala:28)
      [info] at org.apache.spark.ml.feature.Word2VecSuite.$anonfun$new$25(Word2VecSuite.scala:205)
      [info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
      [info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
      [info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
      [info] at org.scalatest.Transformer.apply(Transformer.scala:22)
      [info] at org.scalatest.Transformer.apply(Transformer.scala:20)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:190)
      [info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:176)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:188)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:200)
      [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:200)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:182)
      [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:61)
      [info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)
      [info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)
      [info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:61)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:233)
      [info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
      [info] at scala.collection.immutable.List.foreach(List.scala:333)
      [info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
      [info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
      [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:233)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:232)
      [info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1563)
      [info] at org.scalatest.Suite.run(Suite.scala:1112)
      [info] at org.scalatest.Suite.run$(Suite.scala:1094)
      [info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1563)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:237)
      [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:237)
      [info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:236)
      [info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:61)
      [info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
      [info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
      [info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
      [info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:61)
      [info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:318)
      [info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:513)
      [info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:413)
      [info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      [info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      [info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      [info] at java.lang.Thread.run(Thread.java:748)
      [info] - Word2Vec works with input that is non-nullable (NGram) (776 milliseconds)
      [info] ScalaTest
      [info] Run completed in 11 seconds, 705 milliseconds.
      [info] Total number of tests run: 9
      [info] Suites: completed 1, aborted 0
      [info] Tests: succeeded 8, failed 1, canceled 0, ignored 0, pending 0
      [info] *** 1 TEST FAILED ***

    People

      Assignee: koert kuipers
      Reporter: Darcy Shen
      Votes: 0
      Watchers: 3
