Java 7 error in the online installer?

There are scenarios where a client is allowed to update on their own, such as outside a corporate environment with “regular” users. But I think that plugin may be examining the entire version string, and in that case we’d still be OK, since 2.7.0 is greater than 2.6.3. At least I think that’s how the plugin behaves.

I used the online version, but when Java updated, Spark quit working with the “Environment variable not set” error message.

I have just done a clean install of http://bamboo.igniterealtime.org/artifact/SPARK-INSTALL4J/JOB1/build-647/Install4j/spark_2_7_0_647_online.exe

I have Java 1.7.0_51 32-bit on the system (Windows 7 x64). Works.

I will try that. I am doing a clean install as well, and the install fails with the version mismatch error again. I am deleting ALL remnants of Spark, including the install4j registry entries, and it is just not working.

This is really messed up. What I finally did to get it to work was to delete everything Spark-related file-wise, delete the ej-xxxxx entry from the registry, uninstall Java 32/64 from the workstation, and re-install Java; then the Spark online installer would run. Can’t be having to do this every time Java updates; it’s not going to fly.

I would just use the Spark installer that includes a bundled Java.

http://www.igniterealtime.org/builds/spark/dailybuilds/spark_2_7_0_648.exe

Hmm, what said “version mismatch”, Spark or Java?

Could you paste the output of the command:

path

(if on Windows) or…

echo $PATH

(if on *nix)

Also, if on Windows, try running:

set JAVA_HOME

and paste the output here

I don’t know how Spark finds a JRE it’s happy with, whether it’s searching the PATH, looking for a JAVA_HOME variable, a combination of the above, or something else…

Perhaps you have JAVA_HOME set up as an environment variable, but it points to your old (and now replaced) Java version?
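
To see which Java the system would actually pick up, you could also run these from a command prompt (both commands ship with Windows):

where java

java -version

If “where java” finds nothing, or “java -version” reports an unexpected version, that points to a stale PATH or JAVA_HOME.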

Path:

C:\Program Files (x86)\AMD APP\bin\x86_64;C:\Program Files (x86)\AMD APP\bin\x86;C:\Perl64\site\bin;C:\Perl64\bin;C:\Program Files (x86)\Seagate Software\NOTES\;C:\Program Files (x86)\Seagate Software\NOTES\DATA;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0;C:\Program Files (x86)\GNU\GnuPG\pub;C:\Program Files (x86)\Common Files\Acronis\SnapAPI;C:\Program Files\Dell\SysMgt\rac5;C:\Program Files (x86)\Dell\SysMgt\shared\bin;C:\Program Files (x86)\ATI Technologies\ATI.ACE\Core-Static;c:\Program Files (x86)\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE;c:\Program Files (x86)\Microsoft SQL Server\100\Tools\Binn;c:\Program Files\Microsoft SQL Server\100\Tools\Binn;c:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn;C:\Program Files (x86)\QuickTime\QTSystem;C:\Program Files\Microsoft SQL Server\100\DTS\Binn;C:\Program Files (x86)\Windows Kits\8.1\Windows Performance Toolkit;C:\Program Files (x86)\Acronis\BackupAndRecovery\

Set command:

C:\Users\supervisor.SHELLCU>set JAVA_HOME

Environment variable JAVA_HOME not defined

I see no Java in your PATH, which probably means Spark cannot find your Java install, hence the error you see.

If you are really set on using the system Java, you will need to either set up your PATH to include the Java install directory and/or set up JAVA_HOME to point to the Java install directory. Again, I’m not sure which one Spark uses to find Java, but my guess is this may fix it for you.
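
For example, to point JAVA_HOME at the system JRE persistently, something like this should work (a sketch from a Windows command prompt; the path below is illustrative and depends on where your Java actually lives):

setx JAVA_HOME "C:\Program Files (x86)\Java\jre7"

Note that setx only takes effect for newly started programs, and if Spark searches the PATH instead, you would also need to add %JAVA_HOME%\bin to the PATH (e.g. via Control Panel > System > Advanced system settings > Environment Variables).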

Otherwise, you can just use the Spark installer that includes a bundled Java, and this problem simply will not happen.

What version of Java do we have here? :wink: Not that it matters, since we should be able to copy an existing Java into the jre folder. But you might be right, it would be nice not to have multiple versions of Java floating around. The security risks should be minimal; however, I have found that having different versions of Java running with apps causes issues with browser-based apps.

The Java bundled in Spark will not pose any additional security threat, because your OS is not loading it for the Java browser plugin. The Java browser plugin (which runs applets) is where the security risk is; otherwise the regular JVM is pretty safe, no more or less safe than a Python library or C library sitting on your computer.

Basically, with the Java that is bundled with Spark… your computer does not even know it’s there; only Spark does, so only Spark uses it.

The only thing that will change when using the Spark bundled with Java is that you will not get the shared footprint of using the system Java, so memory usage may be slightly higher… but generally, on modern computers, this is really not a problem.

The current nightly builds of Spark are using Java 7 update 51 (the latest). The Spark 2.6.3 official release comes with Java 6…

Thanks Jason!

Something is wrong with the recent builds (after the switch to the new install4j and Java 7). Plugins like Roar, Transfer Guard, and Taskbar Flashing do not load at startup.

And we’ve found out this is because all plugins now require a minimum Spark version of 2.7.0, but Spark is still presenting itself as 2.6.3 in the default.properties file. Should be fixed with SPARK-33.

Please try build 653.

http://www.igniterealtime.org/builds/spark/dailybuilds/spark_2_7_0_653.exe

653 works fine. Thanks

Read this with interest. Am running Java 8 update 5 (build 1.8.0_05-b13) according to my Java info. Will Spark build 653 work with Java 8? Am getting the same install errors as many posters before me. Admin privileges don’t make a difference. Install4j params do show Java Min and Max as 1.5 and 1.6, respectively, which I would change if I could, but I’ve done no PC programming, so I have no compilers, etc. Is Spark 653 the answer?

Actually 653 is not available anymore. Only the last 10 builds are available. You can try the latest one (665) http://bamboo.igniterealtime.org/browse/SPARK-INSTALL4J-665/artifact/shared/Install4j/

It shouldn’t be much different from 653. It should work fine with Java 8. If you still get errors when using the online installer, try the full installer with Java bundled. It will come with Java 7, but you can try deleting the jre folder in C:\Program Files\Spark, and it will then use the system Java. You can check which Java Spark is using by going into the Help > About menu.
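
Before deleting the bundled jre folder, it may be worth confirming from a command prompt that a system Java can be found at all, for example:

java -version

If that comes back with “not recognized”, Spark would have nothing to fall back on once the bundled copy is gone.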

Worked perfectly! Checked to see, and Java Min and Max are now 1.7 and 1.8, which covers both Java 7 and 8. Thank you SO much. Will pass this on to my IT people, who had given up. BTW, I love the new colors, etc. available in 665, and the new icons. I can even be invisible, which I liked about Pandion (the only thing I did like about it).