
Easy way to make Spark work with Java 7

Setting up some new PCs, I didn't want to put the old JRE on them, so I set out to find a way around it. The easiest way seems to be to just lie to the Spark installer about the Java version. I don't know if someone else has figured this out in the past, but I never saw any threads about it, just a bunch of posts about running the beta (which still wanted 1.5 or 1.6 when I tried build 579) or moving the JRE folder around. I never got any of that to work. Anyway, here's my method.

  1. Open regedit (if you are using x64, open it with %systemroot%\syswow64\regedit -m to get the 32-bit registry editor)
  2. Go to **HKEY_CURRENT_USER\Software\ej-technologies\exe4j\jvms\c:/program files (x86)/java/jre7/bin/java.exe**
  3. The c:/program files (x86)/… portion of the subkey name will be different if you are using x86 (no "(x86)" in the path)
  4. In the right-hand pane, edit the Version value and change 1.7.whatever to 1.6.whatever

You can now use the smaller online installer and don't have to install an old JRE. You may get an error that it can't find a JRE when you run it; if so, click Locate, browse to C:\Program Files (x86)\Java\jre7\bin, and double-click java.exe. The installer should then run and finish as usual.
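If you prefer not to click through regedit, the same edit can be done by importing a .reg file. This is only a sketch under the assumptions above: the subkey name and the version string are examples and must match what regedit actually shows on your machine, and on x64 it should be imported with the 32-bit regedit as in step 1.

```reg
Windows Registry Editor Version 5.00

; Hypothetical example -- adjust the subkey name (the java.exe path) and the
; version string to match your own system before importing.
[HKEY_CURRENT_USER\Software\ej-technologies\exe4j\jvms\c:/program files (x86)/java/jre7/bin/java.exe]
"Version"="1.6.0"
```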

EDIT: I confirmed that what wraithe said is true, so I removed the extra keys that didn't have to be edited. This makes it even easier.


Excellent tip, thanks! In my test, I only changed the first key listed to 1.6 and had no issues getting Spark installed after that. Thanks again for the tip.

I just tried it here: it worked for the installation, but after changing it back to 1.7, Spark did not work anymore (the EXE4J_JAVA_HOME message). I had to install a JRE 1.6, and then everything was fine (1.7 is still shown when doing a java -version).

W7 64 Bit German

In order to run Spark after installation, you have to leave the Version value set to a 1.6 string until Spark has started. You can change it back afterwards, but you will need to set it to 1.6 again the next time you run Spark.

You can leave the 1.6 setting in the registry and Java still continues to run and update normally. I have just been leaving mine on 1.6.
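If you do want to keep the real version in the registry, the flip-launch-restore dance could be scripted. This is only a sketch under the assumptions in the thread: the subkey name, the Spark install path, and both version strings are examples, and if reg.exe trips over the forward slashes in the subkey name, just edit the value in regedit instead.

```bat
@echo off
rem Hypothetical launcher: spoof the exe4j Version value, start Spark,
rem then restore the real version. Adjust all paths and version strings
rem to match what regedit shows on your machine.
set "KEY=HKCU\Software\ej-technologies\exe4j\jvms\c:/program files (x86)/java/jre7/bin/java.exe"

rem /reg:32 targets the 32-bit registry view on x64 (the same view the
rem syswow64 regedit shows); drop it on 32-bit Windows.
reg add "%KEY%" /v Version /t REG_SZ /d "1.6.0" /f /reg:32

start "" "C:\Program Files (x86)\Spark\Spark.exe"

rem Give Spark time to launch before putting the real version back.
timeout /t 20 /nobreak >nul
reg add "%KEY%" /v Version /t REG_SZ /d "1.7.0" /f /reg:32
```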