I can't get Spark working with OpenJDK 11. It works well with the standard JRE, but with OpenJDK the launcher reports:
Please define EXE4J_JAVA_HOME to point to an installed 32-bit JDK or JRE or download a JRE from www.java.com
Is there any way to make it work? Could this be happening because OpenJDK is 64-bit/mixed mode?
Here are some details:
openjdk version "11.0.1" 2018-10-16
OpenJDK Runtime Environment 18.9 (build 11.0.1+13)
OpenJDK 64-Bit Server VM 18.9 (build 11.0.1+13, mixed mode)
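To confirm the suspicion above, a quick way to check whether a given Java install is 32-bit or 64-bit is to print the JVM's data model and architecture properties. This is a minimal sketch; `sun.arch.data.model` is a HotSpot-specific property (it reads "32" or "64" on Oracle/OpenJDK HotSpot builds, and may be absent on other JVMs):

```java
// Prints whether the running JVM is 32-bit or 64-bit.
// Run it with the Java install you plan to point EXE4J_JAVA_HOME at.
public class JvmBits {
    public static void main(String[] args) {
        // "sun.arch.data.model" is "32" or "64" on HotSpot-based JVMs
        System.out.println("data model: " + System.getProperty("sun.arch.data.model"));
        // "os.arch" reports e.g. "x86" (32-bit) or "amd64" (64-bit)
        System.out.println("os.arch:    " + System.getProperty("os.arch"));
    }
}
```

If it reports 64-bit, that would match the launcher's complaint, since the error message explicitly asks for a 32-bit JDK or JRE.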