Parquet / PARQUET-2311

Incompatible with latest Spark version


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.14.0
    • Fix Version/s: None
    • Component/s: parquet-mr
    • Labels: None

    Description

      I'm getting the following errors when I try to use version 1.14.0 of Parquet with version 3.5.0 of Spark.
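      For context, one common way to build Spark against a different Parquet release is to override the `parquet.version` property defined in Spark's root pom.xml; the exact invocation below is illustrative, not taken from this report:

      ```shell
      # Build Spark 3.5.0 sources with Parquet pinned to 1.14.0
      # (overrides the parquet.version property in Spark's root pom.xml)
      ./build/mvn -Dparquet.version=1.14.0 -DskipTests clean package
      ```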

       

      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:661: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameWriter.scala:665: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:163: value options is not a member of org.apache.spark.sql.catalyst.plans.logical.TableSpecBase
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:165: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:166: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:167: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:167: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:175: value options is not a member of org.apache.spark.sql.catalyst.plans.logical.TableSpecBase
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:180: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:181: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:182: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala:182: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:258: value multipartIdentifier is not a member of org.apache.spark.sql.catalyst.parser.SqlBaseParser.UseNamespaceContext
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:333: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:334: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:495: value multipartIdentifier is not a member of org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateViewContext
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:510: value multipartIdentifier is not a member of org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateViewContext
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:555: value multipartIdentifier is not a member of org.apache.spark.sql.catalyst.parser.SqlBaseParser.CreateFunctionContext
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:594: value multipartIdentifier is not a member of org.apache.spark.sql.catalyst.parser.SqlBaseParser.DropFunctionContext
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala:818: not enough arguments for method unsupportedLocalFileSchemeError: (ctx: org.apache.spark.sql.catalyst.parser.SqlBaseParser.InsertOverwriteDirContext, actualSchema: String)Throwable.
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/command/ddl.scala:1040: not enough arguments for method cannotOverwritePathBeingReadFromError: (path: String)Throwable.
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/command/views.scala:546: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala:154: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala:166: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala:276: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala:280: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/FallBackFileSourceV2.scala:37: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala:379: value mismatchedInsertedDataColumnNumberError is not a member of object org.apache.spark.sql.errors.QueryCompilationErrors
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala:412: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala:493: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/rules.scala:515: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.InsertIntoStatement(table: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,partitionSpec: Map[String,Option[String]],userSpecifiedCols: Seq[String],query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,overwrite: Boolean,ifPartitionNotExists: Boolean,byName: Boolean)
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:185: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:192: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:195: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:213: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:216: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:228: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:238: type mismatch;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2Strategy.scala:313: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.ReplaceData(table: org.apache.spark.sql.catalyst.analysis.NamedRelation,condition: org.apache.spark.sql.catalyst.expressions.Expression,query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,originalTable: org.apache.spark.sql.catalyst.analysis.NamedRelation,groupFilterCondition: Option[org.apache.spark.sql.catalyst.expressions.Expression],write: Option[org.apache.spark.sql.connector.write.Write])
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/GroupBasedRowLevelOperationScanPlanning.scala:42: not enough patterns for object GroupBasedRowLevelOperation offering (org.apache.spark.sql.catalyst.plans.logical.ReplaceData, org.apache.spark.sql.catalyst.expressions.Expression, Option[org.apache.spark.sql.catalyst.expressions.Expression], org.apache.spark.sql.catalyst.plans.logical.LogicalPlan): expected 4, found 3
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/OptimizeMetadataOnlyDeleteFromTable.scala:76: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.ReplaceData(table: org.apache.spark.sql.catalyst.analysis.NamedRelation,condition: org.apache.spark.sql.catalyst.expressions.Expression,query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,originalTable: org.apache.spark.sql.catalyst.analysis.NamedRelation,groupFilterCondition: Option[org.apache.spark.sql.catalyst.expressions.Expression],write: Option[org.apache.spark.sql.connector.write.Write])
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2SessionCatalog.scala:90: value tableNotSupportTimeTravelError is not a member of object org.apache.spark.sql.errors.QueryCompilationErrors
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2SessionCatalog.scala:93: value tableNotSupportTimeTravelError is not a member of object org.apache.spark.sql.errors.QueryCompilationErrors
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2Writes.scala:98: wrong number of arguments for pattern org.apache.spark.sql.catalyst.plans.logical.ReplaceData(table: org.apache.spark.sql.catalyst.analysis.NamedRelation,condition: org.apache.spark.sql.catalyst.expressions.Expression,query: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan,originalTable: org.apache.spark.sql.catalyst.analysis.NamedRelation,groupFilterCondition: Option[org.apache.spark.sql.catalyst.expressions.Expression],write: Option[org.apache.spark.sql.connector.write.Write])
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/dynamicpruning/RowLevelOperationRuntimeGroupFiltering.scala:47: not enough patterns for object GroupBasedRowLevelOperation offering (org.apache.spark.sql.catalyst.plans.logical.ReplaceData, org.apache.spark.sql.catalyst.expressions.Expression, Option[org.apache.spark.sql.catalyst.expressions.Expression], org.apache.spark.sql.catalyst.plans.logical.LogicalPlan): expected 4, found 3
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/execution/dynamicpruning/RowLevelOperationRuntimeGroupFiltering.scala:48: constructor cannot be instantiated to expected type;
      [ERROR] [Error] spark/sql/core/src/main/scala/org/apache/spark/sql/streaming/progress.scala:176: value jsonValue is not a member of org.apache.spark.sql.Row
      [ERROR] 47 errors found
      [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:4.8.0:compile (scala-compile-first) on project spark-sql_2.12: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:4.8.0:compile failed: org.apache.commons.exec.ExecuteException: Process exited with an error: 255 (Exit value: 255) -> [Help 1]
      [ERROR] 
      [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
      [ERROR] Re-run Maven using the -X switch to enable full debug logging.
      [ERROR] 
      [ERROR] For more information about the errors and possible solutions, please read the following articles:
      [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException

       

      Any help would be appreciated.


          People

            Assignee: Unassigned
            Reporter: ronan doolan (rdoolan)
            Votes: 0
            Watchers: 2
