Hive multi-table join error java.lang.ArrayIndexOutOfBoundsException: 140, an official bug, fixed in version 3.0.0

Published: 2021-11-29 11:05:01

Follow-up official fix: https://issues.apache.org/jira/browse/HIVE-14564



Exception details

2019-02-28 16:33:44,429 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Got allocated containers 1
2019-02-28 16:33:44,429 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigning container Container: [ContainerId: container_e25_1551269222015_0034_01_000005, NodeId: bigdata001:45454, NodeHttpAddress: bigdata001:8042, Resource: , Priority: 5, Token: Token { kind: ContainerToken, service: 192.168.30.230:45454 }, ] to fast fail map
2019-02-28 16:33:44,431 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned from earlierFailedMaps
2019-02-28 16:33:44,432 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Assigned container container_e25_1551269222015_0034_01_000005 to attempt_1551269222015_0034_m_000000_3
2019-02-28 16:33:44,432 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=
2019-02-28 16:33:44,432 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce slow start threshold not met. completedMapsForReduceSlowstart 1
2019-02-28 16:33:44,432 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: After Scheduling: PendingReds:4 ScheduledMaps:0 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:4 ContRel:0 HostLocal:0 RackLocal:1
2019-02-28 16:33:44,432 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved bigdata001 to /default-rack
2019-02-28 16:33:44,433 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1551269222015_0034_m_000000_3 TaskAttempt Transitioned from UNASSIGNED to ASSIGNED
2019-02-28 16:33:44,433 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_LAUNCH for container container_e25_1551269222015_0034_01_000005 taskAttempt attempt_1551269222015_0034_m_000000_3
2019-02-28 16:33:44,433 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Launching attempt_1551269222015_0034_m_000000_3
2019-02-28 16:33:44,434 INFO [ContainerLauncher #6] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : bigdata001:45454
2019-02-28 16:33:44,441 INFO [ContainerLauncher #6] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Shuffle port returned by ContainerManager for attempt_1551269222015_0034_m_000000_3 : 13562
2019-02-28 16:33:44,441 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: TaskAttempt: [attempt_1551269222015_0034_m_000000_3] using containerId: [container_e25_1551269222015_0034_01_000005 on NM: [bigdata001:45454]
2019-02-28 16:33:44,442 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1551269222015_0034_m_000000_3 TaskAttempt Transitioned from ASSIGNED to RUNNING
2019-02-28 16:33:45,434 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1551269222015_0034: ask=1 release= 0 newContainers=0 finishedContainers=0 resourcelimit= knownNMs=3
2019-02-28 16:33:45,785 INFO [Socket Reader #1 for port 35318] SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for job_1551269222015_0034 (auth:SIMPLE)
2019-02-28 16:33:45,796 INFO [IPC Server handler 4 on 35318] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID : jvm_1551269222015_0034_m_27487790694405 asked for a task
2019-02-28 16:33:45,796 INFO [IPC Server handler 4 on 35318] org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID: jvm_1551269222015_0034_m_27487790694405 given task: attempt_1551269222015_0034_m_000000_3
2019-02-28 16:33:47,740 INFO [IPC Server handler 1 on 35318] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of TaskAttempt attempt_1551269222015_0034_m_000000_3 is : 0.0
2019-02-28 16:33:47,743 FATAL [IPC Server handler 2 on 35318] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1551269222015_0034_m_000000_3 - exited : java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ArrayIndexOutOfBoundsException: -1746617499
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:314)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:183)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:202)
	at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:354)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:198)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:184)
	at org.apache.hadoop.hive.ql.exec.MapOperator.toErrorMessage(MapOperator.java:588)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:557)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
]
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:172)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ArrayIndexOutOfBoundsException: -1746617499
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:314)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:183)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:202)
	at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:354)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:198)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:184)
	at org.apache.hadoop.hive.ql.exec.MapOperator.toErrorMessage(MapOperator.java:588)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:557)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
]
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:562)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
	... 8 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ArrayIndexOutOfBoundsException: -1746617499
	at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.process(ReduceSinkOperator.java:403)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:167)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:552)
	... 9 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: -1746617499
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.readVInt(LazyBinaryUtils.java:314)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryUtils.checkObjectByteInfo(LazyBinaryUtils.java:183)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.parse(LazyBinaryStruct.java:142)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:202)
	at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
	at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator._evaluate(ExprNodeColumnEvaluator.java:94)
	at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
	at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
	at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.computeHashCode(ReduceSinkOperator.java:480)
	at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.process(ReduceSinkOperator.java:368)
	... 13 more


The log above shows an ArrayIndexOutOfBoundsException (array index out of bounds) raised while reading the row data.


Explanation:


This is caused by a mismatch between serialization and deserialization.
The LazyBinarySerDe serialization performed by the previous MapReduce job used a different column order. When the current MapReduce job deserializes the intermediate sequence files produced by the previous job, LazyBinaryStruct reads them with the wrong column order and returns corrupted data. This column mismatch between serialization and deserialization is caused by the SelectOperator's column pruning (ColumnPrunerSelectProc).
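The mechanism is easy to reproduce outside Hive. The toy sketch below is plain Python, not Hive's actual LazyBinary format, and the field names and layout are invented for illustration: it serializes a row in the column order [name, score] but deserializes it assuming [score, name]. The misaligned read turns payload bytes into a bogus field length, which is exactly the kind of corrupt index that surfaces as ArrayIndexOutOfBoundsException inside LazyBinaryStruct.

```python
import struct

def serialize_row(name: bytes, score: int) -> bytes:
    """Writer's column order: [name, score].
    name is length-prefixed (1 byte), score is a big-endian int32."""
    return bytes([len(name)]) + name + struct.pack(">i", score)

def deserialize_swapped(buf: bytes):
    """Reader wrongly assumes column order [score, name]:
    the first 4 bytes are taken as the int, the 5th as the name length."""
    score = struct.unpack(">i", buf[:4])[0]
    name_len = buf[4]
    if name_len > len(buf) - 5:
        # Hive's deserializer hits an analogous condition and throws
        # ArrayIndexOutOfBoundsException with a garbage index.
        raise IndexError(f"corrupt length {name_len} exceeds buffer")
    return score, buf[5:5 + name_len]

row = serialize_row(b"alice", 42)

# The swapped reader misinterprets b"\x05ali" as the int and
# b"c" (0x63 = 99) as the name length, overrunning the buffer.
try:
    deserialize_swapped(row)
except IndexError as e:
    print(e)  # corrupt length 99 exceeds buffer
```

Nothing in the byte stream is self-describing, so the reader has no way to detect the wrong schema; it simply reads garbage, just as Hive did here.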

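If upgrading to Hive 3.0.0 (where HIVE-14564 is fixed) is not an option, one possible workaround is to disable the column pruner that triggers the mismatch, at the cost of carrying unpruned columns through the shuffle. This is a sketch, not an official fix: verify that the property exists and has effect in your Hive version, and set it per problematic query rather than globally.

```sql
-- Tentative workaround: turn off the column-pruning optimization
-- for the affected multi-table join query only.
set hive.optimize.cp=false;
```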

Querying a single row to examine the failure in detail:


2019-02-28 18:57:46,328 FATAL [IPC Server handler 2 on 43230] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1551269222015_0068_m_000000_0 - exited : java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
{"_col0":"ab1065d1ca224499a2d78c0cd18eafe3","_col1":"伏自河","_col2":"10100080022","_col3":"10100070009","_col4":"10100030005","_col5":"1975-12-14 15:09:37","_col6":null,"_col7":"0040007\u001coVcvgshLl-C35lN4UAuB6OqQKrAU?@E?\u0000\u0000\u0000\u0000\u0000\u000b10","_col8":"0020002\u00011\u001coftlquLmek909kESJYmVT6BVqAJs\u0006东城\u0006?","_col9":null,"_col10":null,"_col11":null,"_col12":"","_col13":"\u000b10100040007\u001coVcvgshLl-C35lN4UAuB6OqQKrAU?@E?\u0000\u0000\u0000\u0000\u0000\u000b10","_col14":null,"_col21":null,"_col22":"0020002\u00011\u001coftlquLmek909kESJYmVT6BVqAJs\u0006东城\u0006?","_col24":null,"_col30":null,"_col32":null}
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:172)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"_col0":"ab1065d1ca224499a2d78c0cd18eafe3","_col1":"伏自河","_col2":"10100080022","_col3":"10100070009","_col4":"10100030005","_col5":"1975-12-14 15:09:37","_col6":null,"_col7":"0040007\u001coVcvgshLl-C35lN4UAuB6OqQKrAU?@E?\u0000\u0000\u0000\u0000\u0000\u000b10","_col8":"0020002\u00011\u001coftlquLmek909kESJYmVT6BVqAJs\u0006东城\u0006?","_col9":null,"_col10":null,"_col11":null,"_col12":"","_col13":"\u000b10100040007\u001coVcvgshLl-C35lN4UAuB6OqQKrAU?@E?\u0000\u0000\u0000\u0000\u0000\u000b10","_col14":null,"_col21":null,"_col22":"0020002\u00011\u001coftlquLmek909kESJYmVT6BVqAJs\u0006东城\u0006?","_col24":null,"_col30":null,"_col32":null}
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:562)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
... 8 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ArrayIndexOutOfBoundsException
at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.process(ReduceSinkOperator.java:403)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:167)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:552)
... 9 more
Caused by: java.lang.ArrayIndexOutOfBoundsException
at java.lang.System.arraycopy(Native Method)
at org.apache.hadoop.io.Text.set(Text.java:225)
at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryString.init(LazyBinaryString.java:48)
at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.uncheckedGetField(LazyBinaryStruct.java:267)
at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:204)
at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator._evaluate(ExprNodeColumnEvaluator.java:94)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.makeValueWritable(ReduceSinkOperator.java:558)
at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.process(ReduceSinkOperator.java:383)
... 13 more

at java.lang.System.arraycopy(Native Method)
at org.apache.hadoop.io.Text.set(Text.java:225)
at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryString.init(LazyBinaryString.java:48)
at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.uncheckedGetField(LazyBinaryStruct.java:267)
at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:204)
at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator._evaluate(ExprNodeColumnEvaluator.java:94)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:77)
at org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator.evaluate(ExprNodeEvaluator.java:65)
at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.makeValueWritable(ReduceSinkOperator.java:558)
at org.apache.hadoop.hive.ql.exec.ReduceSinkOperator.process(ReduceSinkOperator.java:383)
... 13 more

2019-02-28 18:57:46,331 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1551269222015_0068_m_000000_0 TaskAttempt Transitioned from RUNNING to FAIL_CONTAINER_CLEANUP
2019-02-28 18:57:46,332 INFO [ContainerLauncher #1] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container container_e25_1551269222015_0068_01_000002 taskAttempt attempt_1551269222015_0068_m_000000_0
2019-02-28 18:57:46,332 INFO [ContainerLauncher #1] org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl: KILLING attempt_1551269222015_0068_m_000000_0
2019-02-28 18:57:46,332 INFO [ContainerLauncher #1] org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: Opening proxy : bigdata002:45454
2019-02-28 18:57:46,346 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1551269222015_0068_m_000000_0 TaskAttempt Transitioned from FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
2019-02-28 18:57:46,347 INFO [CommitterEvent Processor #1] org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler: Processing the event EventType: TASK_ABORT
2019-02-28 18:57:46,348 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1551269222015_0068_m_000000_0 TaskAttempt Transitioned from FAIL_TASK_CLEANUP to FAILED
2019-02-28 18:57:46,353 INFO [AsyncDispatcher event handler] org.apache.hadoop.yarn.util.RackResolver: Resolved bigdata002 to /default-rack
2019-02-28 18:57:46,353 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 1 failures on node bigdata002
2019-02-28 18:57:46,354 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: attempt_1551269222015_0068_m_000000_1 TaskAttempt Transitioned from NEW to UNASSIGNED
2019-02-28 18:57:46,355 INFO [Thread-53] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added attempt_1551269222015_0068_m_000000_1 to list of failed maps
2019-02-28 18:57:46,757 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before Scheduling: PendingReds:4 ScheduledMaps:1 ScheduledReds:0 AssignedMaps:1 AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0 HostLocal:1 RackLocal:0
2019-02-28 18:57:46,762 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: getResources() for application_1551269222015_0068: ask=1 release= 0 newContainers=0 finishedContainers=1 resourcelimit= knownNMs=3
2019-02-28 18:57:46,762 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received completed container container_e25_1551269222015_0068_01_000002
2019-02-28 18:57:46,763 INFO [RMCommunicator Allocator] org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Recalculating schedule, headroom=
2019-02-28 18:57:46,763 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1551269222015_0068_m_000000_0: Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

Querying the ETL output referenced in the second failing row shows that, per the original schema, the record should contain 32 fields:


{
"_col0":"ab1065d1ca224499a2d78c0cd18eafe3",
"_col1":"伏自河",
"_col2":"10100080022",
"_col3":"10100070009",
"_col4":"10100030005",
"_col5":"1975-12-14 15:09:37",
"_col6":null,
"_col7":"0040007u001coVcvgshLl-C35lN4UAuB6OqQKrAU?@E?u0000u0000u0000u0000u0000u000b10",
"_col8":"0020002u00011u001coftlquLmek909kESJYmVT6BVqAJsu0006东城u0006?",
"_col9":null,
"_col10":null,
"_col11":null,
"_col12":"",
"_col13":"u000b10100040007u001coVcvgshLlC35lN4UAuB6OqQKrAU?@E?u0000u0000u0000u0000u0000u000b10",
"_col14":null,
"_col21":null,
"_col22":"0020002u00011u001coftlquLmek909kESJYmVT6BVqAJsu0006东城u0006?",
"_col24":null,
"_col30":null,
"_col32":null}

As the serialized row above shows, column-pruned fields are dropped during serialization: the fields from _col15 through _col20 are missing entirely, and their data is lost along with them.





Root cause: running a multi-table join in Hive (five tables in my case) together with LIMIT triggers this problem. An indirect workaround is to shrink the join: materialize intermediate tables so that each query joins only two or three tables. This works around the issue, but it does not actually fix the bug.
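The intermediate-table workaround can be sketched in HiveQL as follows. All table and column names here are hypothetical placeholders, and the exact split (two or three tables per query) depends on your data:

```sql
-- Hypothetical sketch: instead of joining five tables (t1..t5) in one query
-- (which, combined with LIMIT, triggers HIVE-14564), materialize an
-- intermediate table first.

-- Step 1: join the first three tables into an intermediate table.
CREATE TABLE tmp_join_a AS
SELECT t1.id, t1.name, t2.region, t3.channel
FROM t1
JOIN t2 ON t1.id = t2.id
JOIN t3 ON t1.id = t3.id;

-- Step 2: join the intermediate result with the remaining two tables,
-- so LIMIT only ever runs against a 3-way join.
SELECT a.id, a.name, a.region, a.channel, t4.score, t5.tag
FROM tmp_join_a a
JOIN t4 ON a.id = t4.id
JOIN t5 ON a.id = t5.id
LIMIT 100;
```

The intermediate table costs an extra write, but it keeps each individual query below the join width at which the column-pruning bug manifests.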


See the official issue for details: https://issues.apache.org/jira/browse/HIVE-14564

