Feature Request: Java version and module detection


We have noticed that running different versions of Java, with different Java modules installed on the system, can cause Spark to not work correctly. Upon launching, could Spark perform a check to verify that the installed version of Java will work correctly with Spark?
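A launch-time check like this could be sketched as follows. This is only an illustrative sketch, not part of Spark: the class name, the parsing logic, and the minimum-version threshold are all assumptions; it simply reads the standard java.version system property and extracts the major version.

```java
// Hypothetical startup check: read java.version and warn if the
// runtime looks too old. Not actual Spark code.
public class JavaVersionCheck {

    // Returns the major Java version:
    // e.g. 8 for "1.8.0_281" (old scheme), 11 for "11.0.2" (new scheme).
    static int majorVersion(String version) {
        String[] parts = version.split("\\.");
        // Strip any non-numeric suffix such as "-ea" before parsing.
        int first = Integer.parseInt(parts[0].replaceAll("[^0-9].*", ""));
        // Pre-Java-9 versions report as "1.x", so the major version is x.
        if (first == 1 && parts.length > 1) {
            return Integer.parseInt(parts[1]);
        }
        return first;
    }

    public static void main(String[] args) {
        int major = majorVersion(System.getProperty("java.version"));
        // The minimum version here is an arbitrary example threshold.
        if (major < 8) {
            System.err.println("Warning: this Java version may not work with Spark: " + major);
        } else {
            System.out.println("Detected Java major version: " + major);
        }
    }
}
```

The two parsing branches are needed because the version string format changed in Java 9 (from "1.8.0_xx" to "9", "11.0.2", etc.), which is exactly the kind of variation such a check would have to handle.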

The safest way is probably to use Spark with a bundled JRE. A bundled JRE won't be affected by other Java versions installed on the system.

Hello Wroot,

Could you advise how to do it? How do I bundle a JRE into the Spark Mac app DMG?

I am pretty new here and failed to run "ant installer.mac" on Mac OS X 10.9 (Mavericks).

Thanks,

DooDoo

My reply was about using the installer with the JRE already bundled. I don't know how to bundle a JRE into the installer; I'm not a developer.