Hive Runtime Error while processing row in Hive
I've got an issue while querying a table stored in ORC file format.

I was trying the query below:

INSERT INTO TABLE <db_name>.<table_name> SELECT <columns> FROM <db_name>.<table_name> WHERE <conditions>;

which results in:

TaskAttempt 2 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
      at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:186)
      at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:138)
      at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:324)
      at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:176)
      at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:168)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
      at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.call(TezTaskRunner.java:168)
      at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.call(TezTaskRunner.java:163)
      at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
      at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:91)
      at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:68)
      at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:294)
      at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:163)
      ... 13 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
      at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:52)
      at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:83)
      ... 16 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating 'Hotels4U'
      at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.processOp(VectorSelectOperator.java:126)
      at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
      at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
      at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
      at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:45)
      ... 17 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 48
      at org.apache.hadoop.hive.ql.exec.vector.expressions.ConstantVectorExpression.evaluateBytes(ConstantVectorExpression.java:124)
      at org.apache.hadoop.hive.ql.exec.vector.expressions.ConstantVectorExpression.evaluate(ConstantVectorExpression.java:156)
      at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.processOp(VectorSelectOperator.java:124)
Irritate asked 23/2, 2015 at 13:19

To solve this issue, set the parameters below from the Hive shell. They disable Hive's vectorized query execution, which is the code path failing in the stack trace above (VectorSelectOperator / ConstantVectorExpression).

hive> set hive.vectorized.execution.enabled=false;
hive> set hive.vectorized.execution.reduce.enabled=false;

Then run the insert overwrite command again; a sketch of the full session is below.
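
A minimal sketch of the whole workaround in one Hive session, reusing the placeholder names from the question (the OVERWRITE variant follows this answer's wording; adjust to INSERT INTO if that is what you need):

-- disable vectorized execution for this session only
set hive.vectorized.execution.enabled=false;
set hive.vectorized.execution.reduce.enabled=false;

-- re-run the failing statement, with the same placeholders as in the question
INSERT OVERWRITE TABLE <db_name>.<table_name>
SELECT <columns> FROM <db_name>.<table_name>
WHERE <conditions>;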

Irritate answered 23/2, 2015 at 13:19 Comment(2)
Can you explain what these parameters do? – Preoccupation
The quality of this question and answer pair is really subpar. But it works. – Cyclorama

I had a similar problem. It turned out to be because I didn't have enough space. After I deleted some old tables in Hive and freed up some space, it worked OK.
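
If you suspect the same cause, here is a rough sketch of checking and reclaiming HDFS space from the Hive CLI; the warehouse path and the table name are illustrative assumptions, not taken from this answer:

-- show free vs. used space on HDFS
dfs -df -h /;

-- size of the Hive warehouse directory (default path; adjust for your cluster)
dfs -du -s -h /user/hive/warehouse;

-- dropping an unneeded managed table with PURGE bypasses the trash and frees space immediately
DROP TABLE IF EXISTS <db_name>.<old_table> PURGE;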

Transduction answered 7/7, 2016 at 13:54 Comment(1)
One upvote for this. Deleting old files from HDFS helped me to resolve the issue. Thanks a lot. – Adapter
