IMPALA-11738

Data loading failed at load-functional-query-exhaustive-hive-generated-orc-def-block.sql


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: Impala 4.1.1
    • Fix Version/s: Impala 4.3.0
    • Component/s: None
    • Labels: None

    Description

      Ran "./bin/bootstrap_development.sh" to build the system from scratch.

      HiveServer2 appears to crash when it executes the query

      select count(*) as mv_count from functional_orc_def.mv1_alltypes_jointbl

      while loading load-functional-query-exhaustive-hive-generated-orc-def-block.sql.

      Found errors in load-functional-query-exhaustive-hive-generated-orc-def-block.sql.log:

      Unknown HS2 problem when communicating with Thrift server.
      Error: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed) (state=08S01,code=0)
      java.sql.SQLException: org.apache.thrift.transport.TTransportException: java.net.SocketException: Broken pipe (Write failed)
              at org.apache.hive.jdbc.HiveStatement.closeStatementIfNeeded(HiveStatement.java:225)
              at org.apache.hive.jdbc.HiveStatement.closeClientOperation(HiveStatement.java:266)
              at org.apache.hive.jdbc.HiveStatement.close(HiveStatement.java:289)
              at org.apache.hive.beeline.Commands.executeInternal(Commands.java:1067)
              at org.apache.hive.beeline.Commands.execute(Commands.java:1217)
              at org.apache.hive.beeline.Commands.sql(Commands.java:1146)
              at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1504)
              at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:1362)
              at org.apache.hive.beeline.BeeLine.executeFile(BeeLine.java:1336)
              at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1134)
              at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1089)
              at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:547)
              at org.apache.hive.beeline.BeeLine.main(BeeLine.java:529)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:498)
              at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
              at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
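
      For what it is worth, the same statement can be replayed over plain Hive JDBC, which helps confirm that HiveServer2 itself is dying rather than beeline misbehaving. The sketch below is not part of the report; the host, port, and class name are assumptions for a local dev minicluster and will likely need adjusting:

      // Hypothetical reproduction sketch (not from the report): replay the failing
      // statement over the Hive JDBC driver. The endpoint is an assumed local dev
      // minicluster HiveServer2; adjust host/port as needed.
      import java.sql.Connection;
      import java.sql.DriverManager;
      import java.sql.ResultSet;
      import java.sql.Statement;

      public class ReplayMvCount {
          public static void main(String[] args) throws Exception {
              // Assumed HiveServer2 endpoint; not taken from the report.
              String url = "jdbc:hive2://localhost:11050/default";
              try (Connection conn = DriverManager.getConnection(url);
                   Statement stmt = conn.createStatement();
                   ResultSet rs = stmt.executeQuery(
                       "select count(*) as mv_count from functional_orc_def.mv1_alltypes_jointbl")) {
                  while (rs.next()) {
                      System.out.println("mv_count = " + rs.getLong(1));
                  }
              }
          }
      }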

      Also found a jstack from the crash:

      Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
      j org.apache.hadoop.io.compress.zlib.ZlibCompressor.initIDs()V+0
      j org.apache.hadoop.io.compress.zlib.ZlibCompressor.<clinit>()V+18
      v ~StubRoutines::call_stub
      j org.apache.hadoop.io.compress.zlib.ZlibFactory.loadNativeZLib()V+6
      j org.apache.hadoop.io.compress.zlib.ZlibFactory.<clinit>()V+12
      v ~StubRoutines::call_stub
      j org.apache.hadoop.io.compress.DefaultCodec.getDecompressorType()Ljava/lang/Class;+4
      j org.apache.hadoop.io.compress.CodecPool.getDecompressor(Lorg/apache/hadoop/io/compress/CompressionCodec;)Lorg/apache/hadoop/io/compress/Decompressor;+4
      j org.apache.hadoop.io.SequenceFile$Reader.init(Z)V+486
      j org.apache.hadoop.io.SequenceFile$Reader.initialize(Lorg/apache/hadoop/fs/Path;Lorg/apache/hadoop/fs/FSDataInputStream;JJLorg/apache/hadoop/conf/Configuration;Z)V+84
      j org.apache.hadoop.io.SequenceFile$Reader.<init>(Lorg/apache/hadoop/conf/Configuration;[Lorg/apache/hadoop/io/SequenceFile$Reader$Option;)V+407
      j org.apache.hadoop.io.SequenceFile$Reader.<init>(Lorg/apache/hadoop/fs/FileSystem;Lorg/apache/hadoop/fs/Path;Lorg/apache/hadoop/conf/Configuration;)V+17
      j org.apache.hadoop.mapred.SequenceFileRecordReader.<init>(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/mapred/FileSplit;)V+30
      j org.apache.hadoop.mapred.SequenceFileInputFormat.getRecordReader(Lorg/apache/hadoop/mapred/InputSplit;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/mapred/Reporter;)Lorg/apache/hadoop/mapred/RecordReader;+19
      j org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(Lorg/apache/hadoop/mapred/JobConf;)Lorg/apache/hadoop/mapred/RecordReader;+12
      j org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader()Lorg/apache/hadoop/mapred/RecordReader;+266
      j org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow()Lorg/apache/hadoop/hive/serde2/objectinspector/InspectableObject;+25
      j org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow()Z+70
      j org.apache.hadoop.hive.ql.exec.FetchTask.executeInner(Ljava/util/List;)Z+170
      j org.apache.hadoop.hive.ql.exec.FetchTask.execute()I+12
      J 22489 C1 org.apache.hadoop.hive.ql.Driver.runInternal(Ljava/lang/String;Z)V (1199 bytes) @ 0x00007f928563b904 [0x00007f9285638600+0x3304]
      J 22488 C1 org.apache.hadoop.hive.ql.Driver.run(Ljava/lang/String;Z)Lorg/apache/hadoop/hive/ql/processors/CommandProcessorResponse; (269 bytes) @ 0x00007f928561fb44 [0x00007f928561faa0+0xa4]
      J 19121 C1 org.apache.hadoop.hive.ql.reexec.ReExecDriver.run()Lorg/apache/hadoop/hive/ql/processors/CommandProcessorResponse; (300 bytes) @ 0x00007f9283c0c034 [0x00007f9283c0b4c0+0xb74]
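
      The top frames show the JVM crashing while it loads the native zlib bindings (ZlibCompressor.initIDs), which points at the native Hadoop codec libraries rather than the query itself. As a quick isolation step, the same static initialization can be forced outside HiveServer2; the snippet below is only an illustrative sketch and assumes the dev environment's Hadoop jars and native libraries are on the classpath and java.library.path. Running "hadoop checknative -a" with the same Hadoop build is another way to sanity-check the native codec setup.

      // Illustrative isolation sketch (not from the report): trigger the same
      // ZlibFactory/ZlibCompressor static initialization the jstack shows crashing,
      // outside of HiveServer2. Assumes Hadoop jars and libhadoop native libraries
      // are on the classpath / java.library.path.
      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.io.compress.zlib.ZlibFactory;

      public class CheckNativeZlib {
          public static void main(String[] args) {
              Configuration conf = new Configuration();
              // isNativeZlibLoaded() runs ZlibFactory.<clinit>, which in turn calls
              // ZlibCompressor.initIDs() -- the frame where the crash was observed.
              System.out.println("native zlib loaded: " + ZlibFactory.isNativeZlibLoaded(conf));
          }
      }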

          People

            Assignee: Joe McDonnell (joemcdonnell)
            Reporter: Yida Wu (baggio000)