Can the Apache Spark cluster manager allocate tasks to executors on another server?

  • Thread starter: nidhin13 (Guest)
Hi,

I am a newbie to Apache Spark. My application server runs on Windows and has Apache Spark 2.4.1 installed. I also have 3 Linux servers with Spark installed on them.

I want my application server to scale so that any request that reaches my application (IIS) server has its processing load offloaded to the 3 Linux servers.

In other words, the Spark driver (master) and cluster manager would start a Spark session and launch executors on another server or virtual machine. Is this possible? If yes, please guide me through the steps and process to achieve it. If not, what other options are there to achieve this objective, perhaps something similar to Spark?
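To make the question concrete, here is a minimal sketch of what I imagine the driver on the Windows application server doing, assuming a standalone Spark master is already running on one of the Linux servers (the host name, port, and core count below are placeholders, not my real setup):

```python
from pyspark.sql import SparkSession

# Sketch only: the driver runs on the Windows app server and connects to a
# standalone Spark master on one of the Linux boxes (placeholder host name),
# which would then schedule executors on the Linux workers.
spark = (
    SparkSession.builder
    .appName("iis-offload-test")
    .master("spark://linux-master-host:7077")  # standalone master on a Linux server (assumed)
    .config("spark.cores.max", "6")            # cap on total cores the job may use across workers
    .getOrCreate()
)

# Trivial job just to check whether tasks actually run on the remote executors.
print(spark.sparkContext.parallelize(range(1000)).sum())

spark.stop()
```

Is this roughly how it is supposed to work, or does the driver itself also need to run on one of the Linux machines?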

Continue reading...
 