I have two nodes, and Spark is installed on both of them. When I run a Spark job from the shell, it seems that only one node is doing the work: on port 18080 (the Spark History Server UI), only one node appears under Executors. How can I run a Spark job using both nodes?
Spark shell supports only client mode, not cluster mode. You can refer to this link for Spark deployment modes: Deployment Modes.
Also, request more executors with the property shown below.
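For example, a minimal launch sketch, assuming a YARN setup and that the property meant here is spark.executor.instances (the --num-executors flag); the executor counts and sizes are illustrative assumptions, not recommendations:

```bash
# Launch the Spark shell on YARN in client mode (the only mode spark-shell
# supports) and explicitly ask for executors on the cluster.
# --num-executors is the flag form of spark.executor.instances;
# the counts and sizes here are illustrative -- tune them to your nodes.
spark-shell \
  --master yarn \
  --deploy-mode client \
  --num-executors 2 \
  --executor-cores 2 \
  --executor-memory 2g
```

On a Standalone master, the equivalent would be --master spark://&lt;master-host&gt;:7077 together with --total-executor-cores to spread work across the workers.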
Please make sure that you are launching the Spark shell in YARN (or Spark Standalone) mode, and that the YARN ResourceManager (or Standalone Master) can see the NodeManager (Worker) on both nodes. A quick check is sketched below.
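A minimal sketch of that check, assuming a YARN setup (the Standalone check is just the master's web UI; &lt;master-host&gt; is a placeholder for your host):

```bash
# YARN: both NodeManagers should show up with state RUNNING
yarn node -list

# Standalone: open the Master web UI (default port 8080) and confirm
# that a Worker is registered for each node:
#   http://<master-host>:8080
```

If only one node is listed, fix its NodeManager/Worker registration first; no Spark setting will schedule executors on a node the cluster manager cannot see.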
Both Tarun and Rostyslav provided great information and suggestions. Has your issue been resolved? If so, please mark the answer "Correct" or "Helpful" to share your learning with other community members.