SPARK-34260: UnresolvedException when creating temp view twice


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.7, 3.0.2, 3.1.1, 3.2.0
    • Fix Version/s: 2.4.8, 3.0.2, 3.1.1
    • Component/s: SQL
    • Labels: None

    Description

      When creating a temp view twice with CREATE OR REPLACE TEMP VIEW, an UnresolvedException is thrown. Queries to reproduce:

      sql("create or replace temp view v as select * from (select * from range(10))")
      sql("create or replace temp view v as select * from (select * from range(10))")
      
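      For completeness, the same reproduction as a self-contained Scala application (a minimal sketch; the object name and local-master configuration are illustrative, only the two SQL statements come from this report):

      import org.apache.spark.sql.SparkSession

      object Spark34260Repro {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .master("local[*]")
            .appName("SPARK-34260 repro")
            .getOrCreate()

          // The first CREATE OR REPLACE succeeds; the second one fails while the
          // new definition is compared against the existing temp view.
          spark.sql("create or replace temp view v as select * from (select * from range(10))")
          spark.sql("create or replace temp view v as select * from (select * from range(10))")

          spark.stop()
        }
      }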

      Error message and stack trace:

      org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to toAttribute on unresolved object, tree: *
              at org.apache.spark.sql.catalyst.analysis.Star.toAttribute(unresolved.scala:295)
              at org.apache.spark.sql.catalyst.plans.logical.Project.$anonfun$output$1(basicLogicalOperators.scala:62)
              at scala.collection.immutable.List.map(List.scala:293)
              at org.apache.spark.sql.catalyst.plans.logical.Project.output(basicLogicalOperators.scala:62)
              at org.apache.spark.sql.catalyst.plans.logical.SubqueryAlias.output(basicLogicalOperators.scala:945)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$allAttributes$1(QueryPlan.scala:431)
              at scala.collection.immutable.List.flatMap(List.scala:366)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.allAttributes$lzycompute(QueryPlan.scala:431)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.allAttributes(QueryPlan.scala:431)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$doCanonicalize$2(QueryPlan.scala:404)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:116)
              at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:73)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:116)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:127)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$3(QueryPlan.scala:132)
              at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
              at scala.collection.immutable.List.foreach(List.scala:431)
              at scala.collection.TraversableLike.map(TraversableLike.scala:286)
              at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
              at scala.collection.immutable.List.map(List.scala:305)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.recursiveTransform$1(QueryPlan.scala:132)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$4(QueryPlan.scala:137)
              at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:243)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:137)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.doCanonicalize(QueryPlan.scala:389)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized$lzycompute(QueryPlan.scala:373)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.canonicalized(QueryPlan.scala:372)
              at org.apache.spark.sql.catalyst.plans.QueryPlan.sameResult(QueryPlan.scala:420)
              at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:118)
              at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
              at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
              at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
              at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
              at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3699)
              at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
              at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
              at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
              at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
              at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
              at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3697)
              at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
              at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
              at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
              at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
              at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:615)
              at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
              at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:610)
      
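      The top frames show CreateViewCommand.run comparing plans via sameResult, which canonicalizes them; one of the plans being compared still contains an unresolved star (the "tree: *" in the message), so computing Project.output calls Star.toAttribute and throws. A minimal sketch of that failure mode against Catalyst internals (class names taken from the stack trace; constructors assumed to match the affected versions):

      import org.apache.spark.sql.catalyst.analysis.UnresolvedStar
      import org.apache.spark.sql.catalyst.plans.logical.{OneRowRelation, Project}

      // A Project whose project list still contains an unresolved `*`,
      // like a temp-view definition before analysis.
      val unresolvedPlan = Project(Seq(UnresolvedStar(None)), OneRowRelation())

      // Computing the output attributes maps over the project list and calls
      // Star.toAttribute, which throws:
      //   UnresolvedException: Invalid call to toAttribute on unresolved object, tree: *
      unresolvedPlan.output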

          People

            Assignee: Linhong Liu (linhongliu-db)
            Reporter: Linhong Liu (linhongliu-db)
            Votes: 0
            Watchers: 3
