Apache Hudi / HUDI-4400

Fix missing bloom filters in the metadata table for non-partitioned tables


    Description

      When doing upserts with the Bloom Index backed by the metadata table on a non-partitioned table, the writer throws an exception complaining that the bloom filter of a file cannot be found:

      org.apache.hudi.exception.HoodieUpsertException: Failed to upsert for commit time 20220714055920837
        at org.apache.hudi.table.action.commit.BaseWriteHelper.write(BaseWriteHelper.java:64)
        at org.apache.hudi.table.action.commit.SparkUpsertCommitActionExecutor.execute(SparkUpsertCommitActionExecutor.java:45)
        at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.upsert(HoodieSparkCopyOnWriteTable.java:113)
        at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.upsert(HoodieSparkCopyOnWriteTable.java:97)
        at org.apache.hudi.client.SparkRDDWriteClient.upsert(SparkRDDWriteClient.java:155)
        at org.apache.hudi.DataSourceUtils.doWriteOperation(DataSourceUtils.java:207)
        at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:320)
        at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:171)
        at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:110)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110)
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
        at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
        at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:128)
        at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:848)
        at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:382)
        at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:355)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239)
        at Benchmark.writeToHudi(Benchmark.scala:557)
        at Benchmark.doWriteRound(Benchmark.scala:398)
        at Benchmark.$anonfun$doWrites$1(Benchmark.scala:368)
        at Benchmark.$anonfun$doWrites$1$adapted(Benchmark.scala:330)
        at scala.collection.immutable.Range.foreach(Range.scala:158)
        at Benchmark.doWrites(Benchmark.scala:330)
        at executeScenario(<console>:213)
        at $anonfun$res1$2(<console>:31)
        at $anonfun$res1$2$adapted(<console>:30)
        at scala.collection.immutable.List.foreach(List.scala:431)
        at $anonfun$res1$1(<console>:30)
        at $anonfun$res1$1$adapted(<console>:29)
        at scala.collection.immutable.List.foreach(List.scala:431)
        ... 51 elided
      Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 99 in stage 43.0 failed 4 times, most recent failure: Lost task 99.3 in stage 43.0 (TID 2129) (ip-172-31-14-36.us-east-2.compute.internal executor 10): java.lang.RuntimeException: org.apache.hudi.exception.HoodieIndexException: Failed to get the bloom filter for (,fda48b5f-bab1-472a-a24c-b28892b7b968-0_80-241-0_20220714055806329.parquet)
          at org.apache.hudi.client.utils.LazyIterableIterator.next(LazyIterableIterator.java:121)
          at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:46)
          at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
          at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
          at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:513)
          at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
          at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:140)
          at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
          at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
          at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
          at org.apache.spark.scheduler.Task.run(Task.scala:131)
          at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:506)
          at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1462)
          at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:509)
          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
          at java.lang.Thread.run(Thread.java:750)
      Caused by: org.apache.hudi.exception.HoodieIndexException: Failed to get the bloom filter for (,fda48b5f-bab1-472a-a24c-b28892b7b968-0_80-241-0_20220714055806329.parquet)
          at org.apache.hudi.index.bloom.HoodieMetadataBloomIndexCheckFunction$BloomIndexLazyKeyCheckIterator.lambda$computeNext$2(HoodieMetadataBloomIndexCheckFunction.java:124)
          at java.util.HashMap.forEach(HashMap.java:1290)
          at org.apache.hudi.index.bloom.HoodieMetadataBloomIndexCheckFunction$BloomIndexLazyKeyCheckIterator.computeNext(HoodieMetadataBloomIndexCheckFunction.java:117)
          at org.apache.hudi.index.bloom.HoodieMetadataBloomIndexCheckFunction$BloomIndexLazyKeyCheckIterator.computeNext(HoodieMetadataBloomIndexCheckFunction.java:74)
          at org.apache.hudi.client.utils.LazyIterableIterator.next(LazyIterableIterator.java:119)
          ... 16 more 
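
      Note that the lookup key in the error, (,fda48b5f-...parquet), contains an empty partition path, as expected for a non-partitioned table. Below is a minimal spark-shell sketch of a write that exercises this scenario; it is not the reporter's Benchmark code, and the table name, base path, and schema are placeholders, with the Hudi options reflecting roughly the configuration implied by the description:

        import org.apache.spark.sql.SaveMode
        import org.apache.spark.sql.functions._

        val basePath = "file:///tmp/hudi_nonpartitioned_bloom"  // placeholder path

        val hudiOpts = Map(
          "hoodie.table.name" -> "nonpartitioned_bloom_tbl",
          "hoodie.datasource.write.recordkey.field" -> "key",
          "hoodie.datasource.write.precombine.field" -> "ts",
          // Non-partitioned table: NonpartitionedKeyGenerator yields an empty partition path.
          "hoodie.datasource.write.keygenerator.class" ->
            "org.apache.hudi.keygen.NonpartitionedKeyGenerator",
          "hoodie.datasource.write.operation" -> "upsert",
          // Bloom Index with bloom filter lookups served from the metadata table.
          "hoodie.index.type" -> "BLOOM",
          "hoodie.metadata.enable" -> "true",
          "hoodie.metadata.index.bloom.filter.enable" -> "true",
          "hoodie.bloom.index.use.metadata" -> "true"
        )

        val df = spark.range(0, 1000).toDF("key")
          .withColumn("ts", current_timestamp())
          .withColumn("value", rand())

        // The first write creates the base parquet files; upserting the same keys again
        // triggers the metadata-table bloom filter lookup that fails with HoodieIndexException.
        df.write.format("hudi").options(hudiOpts).mode(SaveMode.Overwrite).save(basePath)
        df.write.format("hudi").options(hudiOpts).mode(SaveMode.Append).save(basePath)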

            People

              Assignee: Ethan Guo (guoyihua)
              Reporter: Ethan Guo (guoyihua)
