Spark on Windows 7 Ultimate/Enterprise x64


I have run into an issue with Spark just as we would like to begin rolling this client out to employees. Here’s what I’m seeing.

When we install Spark, it runs perfectly at first. However, once the user logs out of the client or their workstation, Spark will not start again, and we get the following errors:

“The JVM found at C:\Program Files (x86)\Spark\jre is damaged. Please reinstall or define EXE4J_JAVA_HOME to point to an installed JDK or JRE.”

“The JVM could not be started. The main method may have thrown an exception.”

These errors keep popping up on all the versions I have downloaded:

Spark 2.5.8



I have Java installed on all machines, as we use Java-dependent software within our company. Is there a way to force Spark to use the already-installed JDK instead of its own, which it seems to keep corrupting?
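For what it’s worth, the first error message itself hints at a workaround: launchers built with exe4j honour the EXE4J_JAVA_HOME environment variable. A sketch of setting it machine-wide from an elevated command prompt (the JRE path is an assumption; substitute wherever Java is actually installed on your machines):

```shell
:: Point exe4j-based launchers (such as Spark.exe) at the system JRE
:: instead of the bundled one. The path below is an assumption --
:: use your actual Java install location.
setx EXE4J_JAVA_HOME "C:\Program Files\Java\jre6" /M
```

The /M switch writes the variable at machine level rather than per user, so it applies to everyone who logs on.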



You can just delete the jre folder inside the Spark folder; it should then use the system’s Java. But it is not normal for Spark’s internal JRE to become corrupted like this. Maybe your antivirus software is interfering with it.
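If you want to script that removal across machines, a minimal sketch, assuming the default install path quoted in the error message above:

```shell
:: Remove Spark's bundled JRE so the launcher falls back to the
:: system-wide Java. Path assumed from the default x64 install.
rmdir /s /q "C:\Program Files (x86)\Spark\jre"
```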

There is also a Spark installer without the bundled JRE; you could use that.

Thanks for the suggestions; however, neither worked. While deleting the JRE folder got me somewhere, Spark then just throws the “The JVM could not be started. The main method may have thrown an exception.” error. The same thing happens with the offline version without the JRE. I even completely disabled my antivirus.

It’s boggling my mind because it works flawlessly - up until the point the client is closed. Once Spark.exe is terminated, I cannot get it to run again without reinstalling the client.

I was able to get some information out of the error log:

May 4, 2011 11:50:30 AM org.jivesoftware.spark.util.log.Log error
java.lang.NoSuchMethodError: org.jivesoftware.sparkimpl.settings.local.LocalPreferences.getStunFallbackHost()Ljava/lang/String;
	at org.jivesoftware.sparkplugin.JinglePlugin.initialize(
	at org.jivesoftware.spark.PluginManager$
	at java.awt.event.InvocationEvent.dispatch(Unknown Source)
	at java.awt.EventQueue.dispatchEvent(Unknown Source)
	at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
	at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
	at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
	at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
	at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
	at Source)


That report is a blast from the past. The STUN service used to have a hard-coded IP in Spark; this was removed for very good reasons. I’ll pass the error log entry to the developers. Is the error message reproducible?


Hi Walter,

It doesn’t seem to be. When I use Spark 2.5.8, I get the following in my error log:


Exception on commit = Can’t find registry file
Exception on commit = Can’t find registry file Cannot assign requested address: Cannot bind
	at Method)
	at Source)
	at Source)
	at Source)
	at Source)
	at org.jivesoftware.smackx.jingle.nat.ICEResolver.initialize(
	at org.jivesoftware.smackx.jingle.nat.TransportResolver.initializeAndWait(Transpor
	at org.jivesoftware.smackx.jingle.nat.ICETransportManager.(ICETransportManag
	at org.jivesoftware.sparkplugin.JinglePlugin$1.construct(
	at org.jivesoftware.spark.util.SwingWorker$
	at Source)
Exception on commit = Can’t find registry file
Exception on commit = Can’t find registry file

And when I use Spark, nothing shows in my error log, but I have the same issue.

Also, maybe this will shed some light. One time after I ran the installer, I unchecked “Start Spark” in the wizard. But when I then ran the exe directly, it threw the error. So however the installer wizard launches the exe, it works.

Try installing with admin rights and running with admin rights.

The install4j installer sets registry entries.

Other than that, I have no idea why it wouldn’t work.
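In case it helps with diagnosis: exe4j-based launchers cache the JVM they detect in the per-user registry, and a stale or unwritable entry there can produce exactly the “JVM … is damaged” message. The key path below is my assumption of where exe4j keeps it; inspect before deleting anything:

```shell
:: Inspect exe4j's cached JVM entries (key path is an assumption)
reg query "HKCU\Software\ej-technologies\exe4j" /s

:: If a stale JVM path shows up, remove the cached entries so the
:: launcher re-detects Java on the next start
reg delete "HKCU\Software\ej-technologies\exe4j" /f
```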

Windows UAC automatically prompts when the installer is run, so that is not the issue. However, running the program as administrator completely solved it. In case anyone else has the same or a similar issue, the problem is resolved by right-clicking Spark.exe > Properties > Compatibility tab > Change settings for all users > tick “Run this program as an administrator”.
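For anyone deploying this to many workstations, the same “run as administrator for all users” tick can be set through the application-compatibility registry layer instead of clicking through the Properties dialog on each PC. A sketch, assuming the default install path (run from an elevated prompt):

```shell
:: Mark Spark.exe "Run as administrator" for all users via the
:: AppCompatFlags layer -- equivalent to the Compatibility tab tick.
reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers" ^
  /v "C:\Program Files (x86)\Spark\Spark.exe" /t REG_SZ /d "RUNASADMIN" /f
```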


Thanks for the info.

However, having to actually RUN Spark with admin rights does not go down too well with my user base, as the majority of them don’t have admin rights on their PCs.

Try building Spark yourself using Launch4j and Inno Setup, as they don’t set registry entries.

Both are free software: