Spark with OpenJDK

I can't make Spark work with OpenJDK 11. It works fine with the standard JRE.

The error message says:

Please define EXE4J_JAVA_HOME to point to an installed 32-bit JDK or JRE or download a JRE from www.java.com

Is there any way to make it work?
Could this be happening because OpenJDK is 64-bit/mixed mode?

Here are some details:

openjdk version "11.0.1" 2018-10-16
OpenJDK Runtime Environment 18.9 (build 11.0.1+13)
OpenJDK 64-Bit Server VM 18.9 (build 11.0.1+13, mixed mode)
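
For reference, a quick way to confirm which JVM a launcher would pick up, and whether it is 32- or 64-bit, is to print the relevant system properties. A minimal sketch (the class name is arbitrary, and sun.arch.data.model is a HotSpot-specific property that may be unset on other JVMs):

// JvmInfo.java - print the properties that identify the JVM's version and bitness
public class JvmInfo {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
        System.out.println("os.arch      = " + System.getProperty("os.arch"));
        // HotSpot reports "32" or "64" here; other JVMs may return null
        System.out.println("data model   = " + System.getProperty("sun.arch.data.model"));
    }
}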

Spark doesn't support the newest versions of Java; it only works with Java 8. https://issues.igniterealtime.org/browse/SPARK-2017

Interestingly, Spark runs on Ubuntu with OpenJDK 10 without any adjustments, but it throws lots of errors in the console about plugins and an unsuitable Java version. At least it runs, though.
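
Those "not suitable Java" complaints presumably come from a runtime version check. A sketch of that general idea (not Spark's actual code); note that the java.version format changed between Java 8 and Java 9+:

// JavaVersionGuard.java - sketch of a Java-8-only check (not Spark's real code)
public class JavaVersionGuard {
    public static void main(String[] args) {
        String v = System.getProperty("java.version"); // e.g. "1.8.0_212" or "11.0.1"
        int major;
        if (v.startsWith("1.")) {
            major = Integer.parseInt(v.substring(2, 3)); // old scheme: "1.8.0_212" -> 8
        } else {
            int dot = v.indexOf('.');
            major = Integer.parseInt(dot == -1 ? v : v.substring(0, dot)); // new scheme: "11.0.1" -> 11
        }
        if (major > 8) {
            System.err.println("Java " + major + " is not supported; Spark targets Java 8.");
        } else {
            System.out.println("Java " + major + " detected - OK.");
        }
    }
}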

I have tested Spark 2.8.3 and the not-yet-released 2.9.0 build with AdoptOpenJDK 8 JRE 8.0.212 (the latest) and both seem to work OK. No errors about plugins like with Java 10+.

About the licensing change to Oracle's Java: I think it only affects distribution, and only starting with 8.0.212 (the April update). It was still OK to bundle the January 2019 update, and Spark 2.8.3 is currently bundled with an even older version. Again, I don't think this affects actually using Java with a desktop app. But if you want to be strict about it, you can remove Oracle's Java and install OpenJDK 8.

Nothing has changed regarding support for newer Java versions, as there are no developers working on Spark.

In the future we might stop bundling Spark and Openfire with Java (even with OpenJDK) to avoid any possible legal issues, so it would be up to users to install an appropriate Java themselves.