
Unable to create the spark session from Jupyterhub

Question asked by jalendhar on Jun 27, 2018
Latest reply on Jun 29, 2018 by MichaelSegel

Hi Team,

I am unable to create a Spark session object from a JupyterHub notebook. The notebook is launched via the Docker spawner. When I try to connect to the Spark cluster, I get the exception below.
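For reference, the session is created roughly like this (a minimal sketch; the app name is a placeholder, not the exact value in my notebook — the traceback below shows YarnClientSchedulerBackend, so the job is submitted in YARN client mode):

```python
from pyspark.sql import SparkSession

# Minimal sketch of the session creation inside the notebook.
# "jupyterhub-test" is a placeholder app name.
spark = (
    SparkSession.builder
    .appName("jupyterhub-test")
    .master("yarn")  # YARN client mode, matching the traceback below
    .getOrCreate()
)
```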

 

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled.  Available:[TOKEN, MAPRSASL]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.yarn.ipc.RPCUtil.instantiateException(RPCUtil.java:53)
    at org.apache.hadoop.yarn.ipc.RPCUtil.unwrapAndThrowException(RPCUtil.java:104)
    at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:224)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy16.getNewApplication(Unknown Source)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:219)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:227)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:159)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:236)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, MAPRSASL]
    at org.apache.hadoop.ipc.Client.call(Client.java:1475)
    at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy15.getNewApplication(Unknown Source)
    at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:221)
    ... 25 more

 

 

Can anyone help me out?

TIA.
