Multiple Instances of Spark

Hello,

I would like to run multiple instances of Spark.

In the forums I found differing opinions on whether this should work at all.

The following suggestions were made:

  1. In the install directory, edit spark.ini and change

Single Instance=yes

to

Single Instance=no

  2. You can run a second instance of Spark only via the Run as… option, specifying different user credentials.

  3. Make a copy of spark.exe named spark2.exe and launch the copy.
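
For reference, the Run as… and copy-the-executable suggestions could be tried from a Windows command prompt roughly like this (a sketch only; the install path and the `SecondUser` account name are assumptions, so adjust them for your machine):

```shell
:: Suggestion: launch a second instance under different user credentials
:: (equivalent to the Run as... context-menu option).
runas /user:SecondUser "C:\Program Files\Spark\Spark.exe"

:: Suggestion: copy the executable and launch the copy, so the
:: single-instance check (which keys off the exe) does not trigger.
copy "C:\Program Files\Spark\Spark.exe" "C:\Program Files\Spark\Spark2.exe"
start "" "C:\Program Files\Spark\Spark2.exe"
```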

So, can two instances of Spark be run?

Thanks,

Vance

Try it yourself and find out. I can only confirm that copying the executable works. As for the spark.ini suggestion, I wasn't able to reproduce it; I don't have a spark.ini file.