Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Resolved
- Affects Version/s: 1.1.0
- Fix Version/s: None
- Component/s: None
- Environment: hadoop-2.6.0-cdh5.5.0, hive-1.1.0-cdh5.5.0
- Flags: Important
Description
NullPointerException when void convert to varchar using Tez
As in the test SQL below: the statements run fine with MR, but throw a NullPointerException with Tez. The failure happens during SQL compilation, in the constant propagation optimizer.
Test SQL:
create table testTezVarchar(a varchar(10),b varchar(10));
create table testTezVoid as select null as a1,null as b1 from events5;
set hive.execution.engine=mr;
insert into testTezVarchar select a1 as a,b1 as b from testTezVoid;
set hive.execution.engine=tez;
insert into testTezVarchar select a1 as a,b1 as b from testTezVoid;
Exception:
Caused by: java.lang.NullPointerException
at org.apache.hadoop.hive.common.type.HiveBaseChar.enforceMaxLength(HiveBaseChar.java:44)
at org.apache.hadoop.hive.serde2.io.HiveVarcharWritable.set(HiveVarcharWritable.java:65)
at org.apache.hadoop.hive.serde2.io.HiveVarcharWritable.set(HiveVarcharWritable.java:61)
at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveVarcharObjectInspector.set(WritableHiveVarcharObjectInspector.java:130)
at org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorConverter$HiveVarcharConverter.convert(PrimitiveObjectInspectorConverter.java:490)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFToVarchar.evaluate(GenericUDFToVarchar.java:84)
at org.apache.hadoop.hive.ql.optimizer.ConstantPropagateProcFactory.evaluateFunction(ConstantPropagateProcFactory.java:516)
at org.apache.hadoop.hive.ql.optimizer.ConstantPropagateProcFactory.foldExpr(ConstantPropagateProcFactory.java:228)
at org.apache.hadoop.hive.ql.optimizer.ConstantPropagateProcFactory.access$000(ConstantPropagateProcFactory.java:91)
at org.apache.hadoop.hive.ql.optimizer.ConstantPropagateProcFactory$ConstantPropagateSelectProc.process(ConstantPropagateProcFactory.java:718)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:94)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:78)
at org.apache.hadoop.hive.ql.optimizer.ConstantPropagate$ConstantPropagateWalker.walk(ConstantPropagate.java:142)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:109)
at org.apache.hadoop.hive.ql.optimizer.ConstantPropagate.transform(ConstantPropagate.java:112)
at org.apache.hadoop.hive.ql.parse.TezCompiler.runDynamicPartitionPruning(TezCompiler.java:311)
at org.apache.hadoop.hive.ql.parse.TezCompiler.optimizeOperatorPlan(TezCompiler.java:117)
at org.apache.hadoop.hive.ql.parse.TaskCompiler.compile(TaskCompiler.java:101)
For now we work around the issue by setting "hive.optimize.constant.propagation=false". Does anybody know how to solve it?
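The stack trace shows the constant folder evaluating CAST(NULL AS VARCHAR(10)) at compile time and passing the resulting null string into the varchar length-enforcement routine, which dereferences it. The sketch below is a hypothetical simplification of that pattern (not the actual HiveBaseChar source): `enforceMaxLength` reproduces the NPE on null input, and `enforceMaxLengthSafe` shows the kind of null guard that would let a NULL constant stay NULL.

```java
public class VarcharNullSketch {

    // Buggy shape: dereferences val without a null check, so folding a
    // void/NULL constant into a varchar triggers a NullPointerException,
    // matching the HiveBaseChar.enforceMaxLength frame in the trace above.
    static String enforceMaxLength(String val, int maxLength) {
        if (val.length() > maxLength) {       // NPE here when val is null
            return val.substring(0, maxLength);
        }
        return val;
    }

    // Guarded shape: propagate null instead of dereferencing it.
    static String enforceMaxLengthSafe(String val, int maxLength) {
        if (val == null) {
            return null;                      // NULL constant remains NULL
        }
        return val.length() > maxLength ? val.substring(0, maxLength) : val;
    }

    public static void main(String[] args) {
        // Null input: safe variant returns null; buggy variant throws.
        if (enforceMaxLengthSafe(null, 10) != null) {
            throw new AssertionError("expected null to propagate");
        }
        boolean threw = false;
        try {
            enforceMaxLength(null, 10);
        } catch (NullPointerException e) {
            threw = true;
        }
        if (!threw) {
            throw new AssertionError("expected NullPointerException");
        }
        // Non-null input: both variants truncate to maxLength.
        if (!"abcdefghij".equals(enforceMaxLengthSafe("abcdefghijk", 10))) {
            throw new AssertionError("expected truncation to 10 chars");
        }
        System.out.println("ok");
    }
}
```

With a guard like this (or by skipping folding for void-typed constants), the Tez compile path would no longer crash and hive.optimize.constant.propagation could stay enabled.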
Attachments
Issue Links
- is superseded by
-
HIVE-11217 CTAS statements throws error, when the table is stored as ORC File format and select clause has NULL/VOID type column
- Closed