Specify JVM

I am creating a Spark MSI to deploy, and since Spark can't run without Java I have two options: 1) make my MSI have a Java dependency and install Java when needed, or 2) specify the JVM to use at run time.

Option 1 won't work because we have other Java apps in our environment, and I will break those if I upgrade Java.

For option 2, I don't want the user to have to specify the JRE. Right now, if you don't have Java installed and you start Spark, you can browse to a JRE and Spark will use it. Where does that setting get saved, so I can just deploy a JRE folder inside the Spark dir and have Spark already know about it? This would also have zero effect on any other programs that use Java.

TIA for the help.

Wait. Isn't Spark finding the JVM on its own? If I install it with the JRE bundled and then delete the jre folder in the Spark dir, it will still find an external JRE and work. Installing Spark with the JRE bundled doesn't break other JRE installations. Spark's JRE is a kind of internal Java, for Spark only.

If I don't have Java installed on the computer, or if it's just sitting in a folder with no environment variables pointing to it, then when I run Spark it prompts me to browse for the JRE. It only does that once, so I would think that means it saves it somewhere. But where?

I want to deploy my MSI with its own JRE in the Spark folder and tell Spark about it before Spark is ever run. I also don't want this JRE registered in the computer's environment variables.

I wonder if Spark bundled with the JRE registers its own JRE location anywhere. Maybe it just looks for a jre folder in the main Spark folder. Anyway, also look in C:\Documents and Settings\user\Spark\spark.properties. Maybe that is where it holds the JRE location if it's not in the environment variables.

I don’t know if this is what you are looking for, but this is what I have done in our environment.

If you look at the Spark.ini file in C:\Program Files\Spark\ you will see two lines:

Preferred versions=1.6;1.5;

Minimum Version=1.6

By default Spark wants to use JRE 1.6, and this is the JRE version included with the .msi file. If you are using JRE 1.5 in your environment, you can edit the Spark.ini file to Minimum Version=1.5 to allow Spark to use the 1.5 JRE. I don't know if you can go any lower…I am guessing probably not. I don't know if this is officially supported, but I have been able to successfully use Spark with JRE 1.5. This is with Spark 2.5.8.
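For example (assuming your Spark.ini otherwise matches the stock file), after the edit those two lines would read:

Preferred versions=1.6;1.5;

Minimum Version=1.5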

I just use a computer logon script to replace the Spark.ini file at startup with an updated version.
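A minimal sketch of that logon script, assuming the edited Spark.ini is kept on a network share (the \\server\share path below is just a placeholder):

copy /Y "\\server\share\Spark.ini" "C:\Program Files\Spark\Spark.ini"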

This is also nice because if you are deploying Spark using group policy or some other method it cuts down on the file size because the JRE isn’t in the .msi file.

We have vendor apps that run on JRE 1.5, and I don't want to move to 1.6 unless I absolutely have to, because 1.6 is difficult to deal with when it comes to Web Start apps in a multiuser environment.

Hope that helps.

It's recommended to always have the latest version of the JVM; downgrading to a lower JVM doesn't sound very logical. Spark is ideally built to use JRE 1.6. Keep that in mind the next time you find a bug in Spark and you're using JRE 1.5 or lower.

That's what I am trying to do: have Spark use the latest Java. But Spark doesn't make money (for us), and the other programs that do make money use a lower version of Java, so if I install Spark and break those other apps my name wouldn't be very good here. That's why I am looking at just copying whatever Java I want into the Spark install folder and having Spark use that. As said above, Spark does seem to find it if it's inside its own dir.

I will answer my own question. To deploy Spark with a JRE that is not installed in Windows, add the following line under the [Java Runtime Environment] section in the file C:\Program Files\Spark\spark.ini:

JRE Path=C:\path\to\your\jre\
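
So, putting it together, if you copy a JRE into a jre folder inside the Spark install dir (the jre subfolder name and exact path here are just an example), the section ends up looking something like:

[Java Runtime Environment]
Preferred versions=1.6;1.5;
Minimum Version=1.6
JRE Path=C:\Program Files\Spark\jre\

Spark then picks up that JRE without anything needing to be registered in the computer's environment variables.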