@fivesmallq
Created May 30, 2014 15:29
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
2014-05-30 23:29:10 [main:MutableMetricsFactory:1 ] - [ DEBUG ] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of successful kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
2014-05-30 23:29:10 [main:MutableMetricsFactory:9 ] - [ DEBUG ] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of failed kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
2014-05-30 23:29:10 [main:MutableMetricsFactory:10 ] - [ DEBUG ] field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[GetGroups], always=false, type=DEFAULT, sampleName=Ops)
2014-05-30 23:29:10 [main:MetricsSystemImpl:11 ] - [ DEBUG ] UgiMetrics, User and group related metrics
2014-05-30 23:29:10.637 java[10682:1003] Unable to load realm info from SCDynamicStore
2014-05-30 23:29:10 [main:KerberosName:28 ] - [ DEBUG ] Kerberos krb5 configuration not found, setting default realm to empty
2014-05-30 23:29:10.639 java[10682:1003] Unable to load realm info from SCDynamicStore
2014-05-30 23:29:10 [main:Groups:55 ] - [ DEBUG ] Creating new Groups object
2014-05-30 23:29:10 [main:NativeCodeLoader:57 ] - [ DEBUG ] Trying to load the custom-built native-hadoop library...
2014-05-30 23:29:10 [main:NativeCodeLoader:58 ] - [ DEBUG ] Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2014-05-30 23:29:10 [main:NativeCodeLoader:58 ] - [ DEBUG ] java.library.path=.:/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
2014-05-30 23:29:10 [main:NativeCodeLoader:58 ] - [ WARN ] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-05-30 23:29:10 [main:JniBasedUnixGroupsMappingWithFallback:59 ] - [ DEBUG ] Falling back to shell based
2014-05-30 23:29:10 [main:JniBasedUnixGroupsMappingWithFallback:59 ] - [ DEBUG ] Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
2014-05-30 23:29:10 [main:Groups:126 ] - [ DEBUG ] Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
2014-05-30 23:29:10 [main:UserGroupInformation:130 ] - [ DEBUG ] PrivilegedAction as:cloudera (auth:SIMPLE) from:WordCountRemoteNew.run(WordCountRemoteNew.java:68)
2014-05-30 23:29:10 [main:Configuration:187 ] - [ WARN ] fs.default.name is deprecated. Instead, use fs.defaultFS
2014-05-30 23:29:11 [main:NameNodeProxies:503 ] - [ DEBUG ] multipleLinearRandomRetry = null
2014-05-30 23:29:11 [main:Server:524 ] - [ DEBUG ] rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWritable, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@3677eaf8
2014-05-30 23:29:11 [main:UserGroupInformation:751 ] - [ DEBUG ] PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.mapreduce.Job.connect(Job.java:593)
2014-05-30 23:29:11 [main:Server:815 ] - [ DEBUG ] rpcKind=RPC_WRITABLE, rpcRequestWrapperClass=class org.apache.hadoop.ipc.WritableRpcEngine$Invocation, rpcInvoker=org.apache.hadoop.ipc.WritableRpcEngine$Server$WritableRpcInvoker@64df83e5
2014-05-30 23:29:11 [main:UserGroupInformation:838 ] - [ DEBUG ] PrivilegedAction as:cloudera (auth:SIMPLE) from:org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:974)
2014-05-30 23:29:11 [main:Client$Connection:845 ] - [ DEBUG ] The ping interval is 60000 ms.
2014-05-30 23:29:11 [main:Client$Connection:855 ] - [ DEBUG ] Use SIMPLE authentication for protocol JobSubmissionProtocol
2014-05-30 23:29:11 [main:Client$Connection:856 ] - [ DEBUG ] Connecting to /192.168.161.139:8021
2014-05-30 23:29:11 [IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera:Client$Connection:892 ] - [ DEBUG ] IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera: starting, having connections 1
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:900 ] - [ DEBUG ] IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera sending #0
2014-05-30 23:29:11 [IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera:Client$Connection:905 ] - [ DEBUG ] IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera got value #0
2014-05-30 23:29:11 [main:WritableRpcEngine$Invoker:906 ] - [ DEBUG ] Call: getStagingAreaDir 67
2014-05-30 23:29:11 [main:NameNodeProxies:910 ] - [ DEBUG ] multipleLinearRandomRetry = null
2014-05-30 23:29:11 [main:UserGroupInformation$HadoopLoginModule:915 ] - [ DEBUG ] hadoop login
2014-05-30 23:29:11 [main:UserGroupInformation$HadoopLoginModule:915 ] - [ DEBUG ] hadoop login commit
2014-05-30 23:29:11 [main:UserGroupInformation$HadoopLoginModule:916 ] - [ DEBUG ] using local user:UnixPrincipal: fivesmallq
2014-05-30 23:29:11 [main:UserGroupInformation:917 ] - [ DEBUG ] UGI loginUser:fivesmallq (auth:SIMPLE)
2014-05-30 23:29:11 [main:Client$Connection:921 ] - [ DEBUG ] The ping interval is 60000 ms.
2014-05-30 23:29:11 [main:Client$Connection:921 ] - [ DEBUG ] Use SIMPLE authentication for protocol ClientNamenodeProtocolPB
2014-05-30 23:29:11 [main:Client$Connection:921 ] - [ DEBUG ] Connecting to localhost.localdomain/192.168.161.139:8020
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:922 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera: starting, having connections 1
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:923 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #0
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:926 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #0
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:926 ] - [ DEBUG ] Call: getFileInfo took 8ms
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:947 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #1
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:949 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #1
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:949 ] - [ DEBUG ] Call: getFileInfo took 2ms
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:950 ] - [ DEBUG ] IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera sending #1
2014-05-30 23:29:11 [IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera:Client$Connection:951 ] - [ DEBUG ] IPC Client (2058236772) connection to /192.168.161.139:8021 from cloudera got value #1
2014-05-30 23:29:11 [main:WritableRpcEngine$Invoker:954 ] - [ DEBUG ] Call: getNewJobId 5
2014-05-30 23:29:11 [main:JobClient:956 ] - [ DEBUG ] adding the following namenodes' delegation tokens:null
2014-05-30 23:29:11 [main:JobClient:956 ] - [ WARN ] Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
2014-05-30 23:29:11 [main:JobClient:957 ] - [ DEBUG ] default FileSystem: hdfs://localhost.localdomain:8020
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:957 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #2
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:959 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #2
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:959 ] - [ DEBUG ] Call: getFileInfo took 2ms
2014-05-30 23:29:11 [main:DFSClient:960 ] - [ DEBUG ] /user/cloudera/.staging/job_201405300255_0006: masked=rwxr-xr-x
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:961 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #3
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:963 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #3
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:963 ] - [ DEBUG ] Call: mkdirs took 3ms
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:965 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #4
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:968 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #4
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:968 ] - [ DEBUG ] Call: setPermission took 3ms
2014-05-30 23:29:11 [main:JobClient:974 ] - [ WARN ] No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:978 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #5
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:979 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #5
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:980 ] - [ DEBUG ] Call: getFileInfo took 2ms
2014-05-30 23:29:11 [main:JobClient$2:980 ] - [ DEBUG ] Creating splits at hdfs://localhost.localdomain:8020/user/cloudera/.staging/job_201405300255_0006
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:987 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #6
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:989 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #6
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:989 ] - [ DEBUG ] Call: getFileInfo took 4ms
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:1039 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #7
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1040 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #7
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:1041 ] - [ DEBUG ] Call: getListing took 2ms
2014-05-30 23:29:11 [main:FileInputFormat:1047 ] - [ INFO ] Total input paths to process : 0
2014-05-30 23:29:11 [main:FileInputFormat:1047 ] - [ DEBUG ] Total # of splits: 0
2014-05-30 23:29:11 [main:DFSClient:1050 ] - [ DEBUG ] /user/cloudera/.staging/job_201405300255_0006/job.split: masked=rwxr-xr-x
2014-05-30 23:29:11 [main:DFSOutputStream:1055 ] - [ DEBUG ] computePacketChunkSize: src=/user/cloudera/.staging/job_201405300255_0006/job.split, chunkSize=516, chunksPerPacket=127, packetSize=65532
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:1058 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #8
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1060 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #8
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:1060 ] - [ DEBUG ] Call: create took 2ms
2014-05-30 23:29:11 [LeaseRenewer:[email protected]:8020:LeaseRenewer$1:1070 ] - [ DEBUG ] Lease renewer daemon for [DFSClient_NONMAPREDUCE_-323925144_1] with renew id 1 started
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:1071 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #9
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1074 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #9
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:1075 ] - [ DEBUG ] Call: setPermission took 4ms
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:1076 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #10
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1079 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #10
2014-05-30 23:29:11 [main:ProtobufRpcEngine$Invoker:1079 ] - [ DEBUG ] Call: setReplication took 4ms
2014-05-30 23:29:11 [main:DFSOutputStream:1087 ] - [ DEBUG ] DFSClient writeChunk allocating new packet seqno=0, src=/user/cloudera/.staging/job_201405300255_0006/job.split, packetSize=65532, chunksPerPacket=127, bytesCurBlock=0
2014-05-30 23:29:11 [main:DFSOutputStream:1087 ] - [ DEBUG ] Queued packet 0
2014-05-30 23:29:11 [main:DFSOutputStream:1087 ] - [ DEBUG ] Queued packet 1
2014-05-30 23:29:11 [main:DFSOutputStream:1088 ] - [ DEBUG ] Waiting for ack for: 1
2014-05-30 23:29:11 [Thread-5:DFSOutputStream$DataStreamer:1088 ] - [ DEBUG ] Allocating new block
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:1099 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #11
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1101 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #11
2014-05-30 23:29:11 [Thread-5:ProtobufRpcEngine$Invoker:1101 ] - [ DEBUG ] Call: addBlock took 2ms
2014-05-30 23:29:11 [Thread-5:DFSOutputStream$DataStreamer:1126 ] - [ DEBUG ] pipeline = 127.0.0.1:50010
2014-05-30 23:29:11 [Thread-5:DFSOutputStream:1126 ] - [ DEBUG ] Connecting to datanode 127.0.0.1:50010
2014-05-30 23:29:11 [Thread-5:DFSOutputStream$DataStreamer:1129 ] - [ INFO ] Exception in createBlockOutputStream
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:599)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:207)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:528)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1254)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1080)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1040)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:488)
2014-05-30 23:29:11 [Thread-5:DFSOutputStream$DataStreamer:1131 ] - [ INFO ] Abandoning BP-1628426775-127.0.0.1-1394440676844:blk_-7943045951540795284_1313
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:1131 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #12
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1132 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #12
2014-05-30 23:29:11 [Thread-5:ProtobufRpcEngine$Invoker:1133 ] - [ DEBUG ] Call: abandonBlock took 2ms
2014-05-30 23:29:11 [Thread-5:DFSOutputStream$DataStreamer:1134 ] - [ INFO ] Excluding datanode 127.0.0.1:50010
2014-05-30 23:29:11 [IPC Parameter Sending Thread #0:Client$Connection$3:1142 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera sending #13
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1146 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera got value #13
2014-05-30 23:29:11 [Thread-5:DFSOutputStream$DataStreamer:1156 ] - [ WARN ] DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/cloudera/.staging/job_201405300255_0006/job.split could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1340)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2296)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:501)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:299)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44954)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1752)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1748)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1746)
at org.apache.hadoop.ipc.Client.call(Client.java:1238)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:291)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1177)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1030)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:488)
2014-05-30 23:29:11 [main:JobClient$2:1157 ] - [ INFO ] Cleaning up the staging area hdfs://localhost.localdomain:8020/user/cloudera/.staging/job_201405300255_0006
2014-05-30 23:29:11 [main:UserGroupInformation:1158 ] - [ ERROR ] PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/cloudera/.staging/job_201405300255_0006/job.split could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1340)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2296)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:501)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:299)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44954)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1752)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1748)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1746)
2014-05-30 23:29:11 [main:UserGroupInformation:1159 ] - [ ERROR ] PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/cloudera/.staging/job_201405300255_0006/job.split could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1340)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2296)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:501)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:299)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44954)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1752)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1748)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1746)
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/cloudera/.staging/job_201405300255_0006/job.split could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1340)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2296)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:501)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:299)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44954)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1752)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1748)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1746)
at org.apache.hadoop.ipc.Client.call(Client.java:1238)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:291)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1177)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1030)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:488)
2014-05-30 23:29:11 [Thread-2:DFSClient:1161 ] - [ ERROR ] Failed to close file /user/cloudera/.staging/job_201405300255_0006/job.split
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/cloudera/.staging/job_201405300255_0006/job.split could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1340)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2296)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:501)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:299)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44954)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1752)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1748)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1746)
at org.apache.hadoop.ipc.Client.call(Client.java:1238)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at com.sun.proxy.$Proxy9.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:291)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1177)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1030)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:488)
2014-05-30 23:29:11 [Thread-2:Client:1162 ] - [ DEBUG ] Stopping client
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1163 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera: closed
2014-05-30 23:29:11 [IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera:Client$Connection:1163 ] - [ DEBUG ] IPC Client (2058236772) connection to localhost.localdomain/192.168.161.139:8020 from cloudera: stopped, remaining connections 0
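Note: the failure pattern in this log is a remote-client-vs-single-node-VM mismatch. The NameNode hands the client a write pipeline containing `127.0.0.1:50010` (the address the lone datanode registered with), the remote client gets `Connection refused` dialing its own loopback, abandons the block, excludes that datanode, and `addBlock` then fails with "could only be replicated to 0 nodes instead of minReplication (=1)" because the only datanode is excluded. One common fix, assuming the datanode did register with a loopback address, is to make the DFS client connect by hostname instead of the reported IP; this is a hypothetical client-side `hdfs-site.xml` sketch, not taken from this gist:

```xml
<!-- Hypothetical client-side hdfs-site.xml sketch.
     dfs.client.use.datanode.hostname (HDFS-3068, available in Hadoop 2.x / CDH4)
     tells the DFS client to reach datanodes by hostname rather than the IP
     the datanode registered with the NameNode (often 127.0.0.1 on a
     single-node VM). The hostname must then resolve to the VM's reachable
     address on the client machine, e.g. via an /etc/hosts entry. -->
<configuration>
  <property>
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
  </property>
</configuration>
```

Alternatively, the datanode's reachable address can be fixed on the server side so it registers with a routable IP in the first place. The earlier `WARN No job jar file set` is a separate issue (the job was submitted without `JobConf#setJar` or a `Tool`-based driver) and would surface only after the HDFS write succeeded.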