Details
- Type: Bug
- Status: Resolved
- Priority: P2
- Resolution: Duplicate
Description
Failing build: https://builds.apache.org/job/beam_PostCommit_Python_Verify/6538

The :beam-sdks-python:hdfsIntegrationTest task is failing on Jenkins with the following error:
datanode_1_1f917c3c0d2e | 18/11/13 00:57:28 INFO datanode.DataNode: Block pool BP-1790693572-172.18.0.2-1542070629622 (Datanode Uuid 06470cf0-ac11-4c97-80fe-d5463ee38b47) service to namenode/172.18.0.2:8020 beginning handshake with NN
namenode_1_78f1ba71281a | 18/11/13 00:57:38 WARN blockmanagement.DatanodeManager: Unresolved datanode registration: hostname cannot be resolved (ip=172.18.0.3, hostname=172.18.0.3)
namenode_1_78f1ba71281a | 18/11/13 00:57:38 INFO namenode.FSNamesystem: FSNamesystem write lock held for 10010 ms via
    java.lang.Thread.getStackTrace(Thread.java:1559)
    org.apache.hadoop.util.StringUtils.getStackTrace(StringUtils.java:1032)
    org.apache.hadoop.hdfs.server.namenode.FSNamesystemLock.writeUnlock(FSNamesystemLock.java:233)
    org.apache.hadoop.hdfs.server.namenode.FSNamesystem.writeUnlock(FSNamesystem.java:1537)
    org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3652)
    org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:1386)
    org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:101)
    org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:28419)
    org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
    org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
    org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
    org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
    java.security.AccessController.doPrivileged(Native Method)
    javax.security.auth.Subject.doAs(Subject.java:422)
    org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
    org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
namenode_1_78f1ba71281a | Number of suppressed write-lock reports: 0
namenode_1_78f1ba71281a | Longest write-lock held interval: 10010
namenode_1_78f1ba71281a | 18/11/13 00:57:38 INFO ipc.Server: IPC Server handler 2 on 8020, call Call#3 Retry#0 org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.registerDatanode from 172.18.0.3:35480
namenode_1_78f1ba71281a | org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException: Datanode denied communication with namenode because hostname cannot be resolved (ip=172.18.0.3, hostname=172.18.0.3): DatanodeRegistration(0.0.0.0:50010, datanodeUuid=06470cf0-ac11-4c97-80fe-d5463ee38b47, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-4983ef53-0780-42e1-bdd3-d01ccaadf21c;nsid=282386608;c=1542070629622)
    at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:867)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3649)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:1386)
    at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:101)
    at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:28419)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
    at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
datanode_1_1f917c3c0d2e | 18/11/13 00:57:38 ERROR datanode.DataNode: Initialization failed for Block pool BP-1790693572-172.18.0.2-1542070629622 (Datanode Uuid 06470cf0-ac11-4c97-80fe-d5463ee38b47) service to namenode/172.18.0.2:8020 Datanode denied communication with namenode because hostname cannot be resolved (ip=172.18.0.3, hostname=172.18.0.3): DatanodeRegistration(0.0.0.0:50010, datanodeUuid=06470cf0-ac11-4c97-80fe-d5463ee38b47, infoPort=50075, infoSecurePort=0, ipcPort=50020, storageInfo=lv=-57;cid=CID-4983ef53-0780-42e1-bdd3-d01ccaadf21c;nsid=282386608;c=1542070629622)
    [same stack trace as above]
datanode_1_1f917c3c0d2e | 18/11/13 00:57:43 INFO datanode.DataNode: Block pool BP-1790693572-172.18.0.2-1542070629622 (Datanode Uuid 06470cf0-ac11-4c97-80fe-d5463ee38b47) service to namenode/172.18.0.2:8020 beginning handshake with NN

[The retry fails identically at 00:57:53: the same Unresolved-datanode-registration WARN, an FSNamesystem write lock held for 10012 ms with the same stack trace, IPC Server handler 8 on 8020 call Call#5 Retry#0 from 172.18.0.3:35496, and the same DisallowedDatanodeException on both the namenode and the datanode.]

test_1_b589c004a4e9 | INFO Instantiated configuration from '/app/sdks/python/apache_beam/io/hdfs_integration_test/hdfscli.cfg'.
test_1_b589c004a4e9 | INFO Instantiated <InsecureClient(url='http://namenode:50070')>.
test_1_b589c004a4e9 | INFO Uploading 'kinglear.txt' to '/'.
test_1_b589c004a4e9 | DEBUG Resolved path '/' to '/'.
test_1_b589c004a4e9 | INFO Listing '/'.
test_1_b589c004a4e9 | DEBUG Resolved path '/' to '/'.
test_1_b589c004a4e9 | DEBUG Resolved path '/' to '/'.
test_1_b589c004a4e9 | DEBUG Starting new HTTP connection (1): namenode:50070

[namenode Jersey 1.9 WebHDFS initialization output elided: scanning of org.apache.hadoop.hdfs.server.namenode.web.resources and org.apache.hadoop.hdfs.web.resources, plus warnings that the sub-resource methods NamenodeWebHdfsMethods.putRoot, postRoot, deleteRoot, and getRoot with URI template "/" are treated as resource methods.]

test_1_b589c004a4e9 | DEBUG http://namenode:50070 "GET /webhdfs/v1/?user.name=root&op=LISTSTATUS HTTP/1.1" 200 None
test_1_b589c004a4e9 | DEBUG Uploading 1 files using 1 thread(s).
test_1_b589c004a4e9 | DEBUG Uploading 'kinglear.txt' to '/kinglear.txt'.
test_1_b589c004a4e9 | INFO Writing to '/kinglear.txt'.
test_1_b589c004a4e9 | DEBUG Resolved path '/kinglear.txt' to '/kinglear.txt'.
test_1_b589c004a4e9 | DEBUG http://namenode:50070 "PUT /webhdfs/v1/kinglear.txt?user.name=root&overwrite=False&op=CREATE HTTP/1.1" 403 None
test_1_b589c004a4e9 | ERROR Error while uploading. Attempting cleanup.
test_1_b589c004a4e9 | Traceback (most recent call last):
test_1_b589c004a4e9 |   File "/usr/local/lib/python2.7/site-packages/hdfs/client.py", line 594, in upload
test_1_b589c004a4e9 |     _upload(path_tuple)
test_1_b589c004a4e9 |   File "/usr/local/lib/python2.7/site-packages/hdfs/client.py", line 524, in _upload
test_1_b589c004a4e9 |     self.write(_temp_path, wrap(reader, chunk_size, progress), **kwargs)
test_1_b589c004a4e9 |   File "/usr/local/lib/python2.7/site-packages/hdfs/client.py", line 456, in write
test_1_b589c004a4e9 |     buffersize=buffersize,
test_1_b589c004a4e9 |   File "/usr/local/lib/python2.7/site-packages/hdfs/client.py", line 112, in api_handler
test_1_b589c004a4e9 |     raise err
test_1_b589c004a4e9 | HdfsError: Failed to find datanode, suggest to check cluster health. excludeDatanodes=null
test_1_b589c004a4e9 | INFO Deleting '/kinglear.txt' recursively.
test_1_b589c004a4e9 | DEBUG Resolved path '/kinglear.txt' to '/kinglear.txt'.
namenode_1_78f1ba71281a | 18/11/13 00:57:58 INFO namenode.EditLogFileOutputStream: Nothing to flush
test_1_b589c004a4e9 | DEBUG http://namenode:50070 "DELETE /webhdfs/v1/kinglear.txt?user.name=root&recursive=True&op=DELETE HTTP/1.1" 200 None
test_1_b589c004a4e9 | ERROR Failed to find datanode, suggest to check cluster health. excludeDatanodes=null
datanode_1_1f917c3c0d2e | 18/11/13 00:57:58 INFO datanode.DataNode: Block pool BP-1790693572-172.18.0.2-1542070629622 (Datanode Uuid 06470cf0-ac11-4c97-80fe-d5463ee38b47) service to namenode/172.18.0.2:8020 beginning handshake with NN
hdfs_it-jenkins-beam_postcommit_python_verify-6538_test_1_b589c004a4e9 exited with code 1
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6538_datanode_1_1f917c3c0d2e ... done
Stopping hdfs_it-jenkins-beam_postcommit_python_verify-6538_namenode_1_78f1ba71281a ... done
Aborting on container exit...

> Task :beam-sdks-python:hdfsIntegrationTest FAILED
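
For reference, the DisallowedDatanodeException above is the NameNode refusing registration because it cannot reverse-resolve the DataNode's IP (172.18.0.3) to a hostname, which is common on Docker networks without reverse-DNS entries. A minimal sketch of the usual workaround, assuming the test cluster's hdfs-site.xml can be edited (the file layout is an assumption, not taken from this ticket), is to disable the registration hostname check:

```xml
<!-- hdfs-site.xml (sketch, not from this ticket): skip the reverse-DNS
     lookup the NameNode performs when a DataNode registers. Reasonable
     for throwaway Docker test clusters; not recommended in production. -->
<property>
  <name>dfs.namenode.datanode.registration.ip-hostname-check</name>
  <value>false</value>
</property>
```

Alternatively, giving the datanode container a hostname the namenode can resolve (for example via Docker network aliases) satisfies the same check without changing HDFS configuration.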
Issue Links
- duplicates BEAM-6047: hdfsIntegrationTest is failing due to DisallowedDatanodeException (Resolved)