
MapReduce AppMaster does not start with SASL connection failure

Question asked by oae on Mar 5, 2015
Latest reply on Mar 6, 2015 by oae
Hi there,

I have a MapR 4.0.1 cluster installed and a MapR client on a different machine.
I was able to use the maprcli to submit an example map-reduce job successfully.
However, when I use my custom application to submit a map-reduce job, the MapReduce Application Master always gets stuck like this:

    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.yarn.api.ApplicationMasterProtocolPB info:org.apache.hadoop.yarn.security.SchedulerSecurityInfo$1@7fbf6eeb
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.yarn.security.AMRMTokenSelector: Looking for a token with service 10.244.137.136:8032
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.yarn.security.AMRMTokenSelector: Token kind is YARN_AM_RM_TOKEN and the token's service name is 10.244.137.136:8032
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.security.SaslRpcClient: Creating SASL DIGEST-MD5(TOKEN)  client to authenticate to service at
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.security.rpcauth.DigestAuthMethod: Creating SASL DIGEST-MD5 client to authenticate to service at 10.244.137.136:8032
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.security.SaslRpcClient: Use TOKEN authentication for protocol ApplicationMasterProtocolPB
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.security.rpcauth.DigestAuthMethod: SASL client callback: setting username: AAABS+XvnHIAAAAUAAAAAQ==
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.security.rpcauth.DigestAuthMethod: SASL client callback: setting userPassword
    2015-03-05 08:03:17,480 DEBUG [main] org.apache.hadoop.security.rpcauth.DigestAuthMethod: SASL client callback: setting realm: default
    2015-03-05 08:03:17,481 DEBUG [main] org.apache.hadoop.security.SaslRpcClient: Sending sasl message state: INITIATE
    token: "charset=utf-8,username=\"AAABS+XvnHIAAAAUAAAAAQ==\",realm=\"default\",nonce=\"3PHdmq21tpQ1mvDASAS0wilFmjPgeQICLoHk0h+f\",nc=00000001,cnonce=\"hpbSKkSmQrm0P7GtyfM62OOtSa733CPCCZMka2qA\",digest-uri=\"null/default\",maxbuf=65536,response=f84df1928c83677c3c660d3b54ff52fb,qop=auth"
    auths {
      method: "TOKEN"
      mechanism: "DIGEST-MD5"
      protocol: "default"
      serverId: ""
    }
    
    2015-03-05 08:03:17,483 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: PrivilegedActionException as:bamboo (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): tried to deserialize -27 bytes of data!  newLength must be non-negative.
    2015-03-05 08:03:17,483 WARN [main] org.apache.hadoop.security.authentication.util.KerberosUtil: JCE Unlimited Strength Jurisdiction Policy Files are not installed. This could cause authentication failures.
    2015-03-05 08:03:17,483 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: PrivilegedAction as:bamboo (auth:SIMPLE) from:org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:645)
    2015-03-05 08:03:17,483 WARN [main] org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(java.io.IOException): tried to deserialize -27 bytes of data!  newLength must be non-negative.
    2015-03-05 08:03:17,483 DEBUG [main] org.apache.hadoop.security.UserGroupInformation: PrivilegedActionException as:bamboo (auth:SIMPLE) cause:org.apache.hadoop.ipc.RemoteException(java.io.IOException): tried to deserialize -27 bytes of data!  newLength must be non-negative.
    2015-03-05 08:03:17,483 DEBUG [main] org.apache.hadoop.ipc.Client: closing ipc connection to ec2-54-224-245-45.compute-1.amazonaws.com/10.244.137.136:8032: tried to deserialize -27 bytes of data!  newLength must be non-negative.
    org.apache.hadoop.ipc.RemoteException(java.io.IOException): tried to deserialize -27 bytes of data!  newLength must be non-negative.
            at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:362)
            at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:555)
            at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:371)
            at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:717)
            at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:713)
            at java.security.AccessController.doPrivileged(Native Method)
            at javax.security.auth.Subject.doAs(Subject.java:415)
            at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1469)
            at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
            at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:371)
            at org.apache.hadoop.ipc.Client.getConnection(Client.java:1464)
            at org.apache.hadoop.ipc.Client.call(Client.java:1383)
            at org.apache.hadoop.ipc.Client.call(Client.java:1365)
            at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
            at com.sun.proxy.$Proxy30.registerApplicationMaster(Unknown Source)
            at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationMasterProtocolPBClientImpl.registerApplicationMaster(ApplicationMasterProtocolPBClientImpl.java:106)
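
For context, the custom application submits the job through the standard `org.apache.hadoop.mapreduce.Job` API, roughly along these lines (a simplified sketch, not the actual code; the class, mapper/reducer, and path names are illustrative):

```java
// Simplified sketch of the submission path (assumption: the real application
// uses the standard Job API; names here are illustrative only).
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class SubmitSketch {

    // Trivial mapper/reducer pair, just to make the sketch complete.
    public static class MyMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            ctx.write(value, new IntWritable(1));
        }
    }

    public static class MyReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // Picks up yarn-site.xml / core-site.xml from the client's classpath.
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "example-job");
        job.setJarByClass(SubmitSketch.class);
        job.setMapperClass(MyMapper.class);
        job.setReducerClass(MyReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // With the failure above, the AM dies while registering with the
        // ResourceManager, so this call never completes successfully.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

The submission itself is accepted by the ResourceManager; the failure only shows up once the Application Master tries `registerApplicationMaster`, as in the stack trace above.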

Any ideas?

