Hello,
I have run into an issue with Spark just as we are preparing to roll the client out to our employees.
When we first install Spark, it runs perfectly. However, after the user logs out of the client or their workstation, Spark will not start again. We see the following errors:
“The JVM found at C:\Program Files (x86)\Spark\jre is damaged. Please reinstall or define EXE4J_JAVA_HOME to point to an installed JDK or JRE.”
“The JVM could not be started. The main method may have thrown an exception.”
These errors keep popping up on all the versions I have downloaded:
Spark 2.5.8
Spark 2.6.0.12103
Spark 2.6.0.12222
I have Java installed on all machines, as we use Java-dependent software within our company. Is there a way to force Spark to use the already installed JDK instead of its own bundled JRE, which it seems to keep corrupting?
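Since the error message mentions EXE4J_JAVA_HOME, I assume setting that variable system-wide is one way to do this. Here is roughly what I had in mind (the Java path below is just an example; it would need to match the actual install location on our machines):

```shell
:: Point Spark's exe4j launcher at the system-wide Java install
:: instead of the bundled JRE. Run from an elevated prompt;
:: /M writes the variable at the machine level so it applies to all users.
setx EXE4J_JAVA_HOME "C:\Program Files\Java\jre1.8.0" /M
```

I have not confirmed yet whether Spark honors this on startup, so I would appreciate knowing if this is the supported approach or if there is a better fix for the corrupted bundled JRE.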
Thanks,
Travis