@rmetzger
Created December 19, 2014 14:20
If you see this error, read the comment below.
14:16:45,207 DEBUG org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl entered state INITED
14:16:45,215 WARN org.apache.hadoop.security.token.Token - Cannot find class for token kind YARN_AM_RM_TOKEN
14:16:45,215 DEBUG org.apache.hadoop.security.SecurityUtil - Acquired token Kind: YARN_AM_RM_TOKEN, Service: 127.0.0.1:8030, Ident: 00 00 01 4a 62 b1 45 25 00 00 00 01 00 00 00 01
14:16:45,216 DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:robert (auth:SIMPLE) from:org.apache.hadoop.yarn.client.RMProxy.getProxy(RMProxy.java:63)
14:16:45,217 DEBUG org.apache.hadoop.yarn.ipc.YarnRPC - Creating YarnRPC for org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC
14:16:45,217 DEBUG org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC - Creating a HadoopYarnProtoRpc proxy for protocol interface org.apache.hadoop.yarn.api.ApplicationMasterProtocol
14:16:45,225 DEBUG org.apache.hadoop.ipc.Server - rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@66f043ea
14:16:45,238 INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at robert-da/127.0.0.1:8030
14:16:45,241 DEBUG org.apache.hadoop.service.AbstractService - Service org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl is started
14:16:45,242 DEBUG org.apache.hadoop.service.AbstractService - Service: org.apache.hadoop.yarn.client.api.impl.NMClientImpl entered state INITED
14:16:45,242 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy - yarn.client.max-nodemanagers-proxies : 500
14:16:45,242 DEBUG org.apache.hadoop.yarn.ipc.YarnRPC - Creating YarnRPC for org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC
14:16:45,242 DEBUG org.apache.hadoop.service.AbstractService - Service org.apache.hadoop.yarn.client.api.impl.NMClientImpl is started
14:16:45,242 INFO org.apache.flink.yarn.ApplicationMaster$$anonfun$startJobManager$2$$anon$1 - Sec enabled = false cp = /home/robert/incubator-flink/target/org.apache.flink.yarn.YarnClientIT/org.apache.flink.yarn.YarnClientIT-localDir-nm-1_0/usercache/robert/appcache/application_1418994992421_0001/container_1418994992421_0001_01_000001/flink.jar::/share/hadoop/common/*:/share/hadoop/common/lib/*:/share/hadoop/hdfs/*:/share/hadoop/hdfs/lib/*:/share/hadoop/yarn/*:/share/hadoop/yarn/lib/*
14:16:45,243 INFO org.apache.flink.yarn.ApplicationMaster$$anonfun$startJobManager$2$$anon$1 - Registering ApplicationMaster with tracking url http://localhost.localdomain:8081.
14:16:45,272 DEBUG org.apache.hadoop.ipc.Client - The ping interval is 60000 ms.
14:16:45,272 DEBUG org.apache.hadoop.ipc.Client - Connecting to robert-da/127.0.0.1:8030
14:16:45,276 DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:robert (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
14:16:45,312 DEBUG org.apache.hadoop.security.SaslRpcClient - Sending sasl message state: NEGOTIATE
14:16:45,318 DEBUG org.apache.hadoop.security.SaslRpcClient - Received SASL message state: NEGOTIATE
auths {
method: "TOKEN"
mechanism: "DIGEST-MD5"
protocol: ""
serverId: "default"
challenge: "realm=\"default\",nonce=\"TlxctAx3HDtrYbV48fF1mCdjbz9qQ+/N+UUw2fxa\",qop=\"auth\",charset=utf-8,algorithm=md5-sess"
}
14:16:45,319 DEBUG org.apache.hadoop.security.SaslRpcClient - Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB info:null
14:16:45,319 ERROR org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:robert (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN]
14:16:45,320 DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:robert (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
14:16:45,320 WARN org.apache.hadoop.ipc.Client - Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN]
14:16:45,320 ERROR org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:robert (auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN]
14:16:45,320 DEBUG org.apache.hadoop.ipc.Client - closing ipc connection to robert-da/127.0.0.1:8030: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN]
java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN]
at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:620)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:583)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:667)
at org.apache.hadoop.ipc.Client$Connection.access$2600(Client.java:314)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1399)
at org.apache.hadoop.ipc.Client.call(Client.java:1318)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy5.registerApplicationMaster(Unknown Source)
at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationMasterProtocolPBClientImpl.registerApplicationMaster(ApplicationMasterProtocolPBClientImpl.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy6.registerApplicationMaster(Unknown Source)
at org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl.registerApplicationMaster(AMRMClientImpl.java:197)
at org.apache.flink.yarn.YarnJobManager$$anonfun$receiveYarnMessages$1.applyOrElse(YarnJobManager.scala:142)
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:162)
at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:37)
at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:27)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
at org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:27)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:53)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN]
at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:170)
at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:387)
at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:494)
at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:314)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:659)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:655)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:654)
... 32 more
@rmetzger (Author) commented:

The issue is that the class for the YARN_AM_RM_TOKEN could not be loaded from the service loader.

  • Check your classpath: all of the YARN dependencies need to be there.
  • Also check that the META-INF/services/ .. TokenI.. file is correct and contains the AM RM token entry (see the sketch below).

Ask me if you need more information 😄
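If it helps, here is a minimal diagnostic sketch (the class name TokenKindCheck is made up for illustration, it is not part of Hadoop or Flink). Hadoop resolves token kinds through the ServiceLoader, so this simply prints every TokenIdentifier that is discoverable on the current classpath; if YARN_AM_RM_TOKEN is not in the output, the YARN jars (or their META-INF/services entry) are most likely missing from the classpath.

```java
import java.util.ServiceLoader;

import org.apache.hadoop.security.token.TokenIdentifier;

// Hypothetical helper class for debugging, not part of Hadoop or Flink.
public class TokenKindCheck {
    public static void main(String[] args) {
        // Print every token kind whose TokenIdentifier implementation the
        // ServiceLoader can find on the current classpath.
        for (TokenIdentifier id : ServiceLoader.load(TokenIdentifier.class)) {
            System.out.println(id.getKind());
        }
        // Assumption: with the YARN jars on the classpath, YARN_AM_RM_TOKEN shows
        // up here; if it does not, you get the "Cannot find class for token kind
        // YARN_AM_RM_TOKEN" warning from the log above.
    }
}
```

To be meaningful, run it with the same classpath the ApplicationMaster container uses (the one logged above as cp = ...).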

@jhalaria commented:

@rmetzger I get the following exception when Flink tries to connect to HDFS:
org.apache.flink.client.program.ProgramInvocationException: The program execution failed: org.apache.flink.runtime.JobException: Creating the input splits caused an error: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]

I am using Flink 0.8.
