
Spark run errors in Windows

When running Spark (any version) in Windows, the process generates a large number of file-not-found errors. It cannot find random tmp files, among other files. See the images below.



I can confirm the error in the tmp directory for Spark 2.5.8. This looks like a problem with the installer (install4j) rather than with the Spark code, as my build (based on InstallShield instead of install4j) does not show this behaviour. Any fix would have to be made in the install4j routines. That tool is commercial and not available to the community. Sorry.

A workaround (not a good one, but also a way to test the error) is this: put a file named startup.bat in the bin folder of your Spark program folder (e.g. C:\Tools\Spark2.5.8\bin) and start the bat instead of Spark.exe. My Filemon shows no errors when Spark is started via the bat.


The content of the batch file:

if "%1" == "-debug" goto debug
if "%1" == "-noconsole" goto noconsole
..\jre\bin\java -Dappdir=.. -cp ../lib/windows/jmf.jar;../lib/startup.jar;../lib/windows/jdic.jar;../resources;../lib/windows -Djava.library.path="../lib/windows" org.jivesoftware.launcher.Startup
goto end

:noconsole
start ..\jre\bin\javaw -Dappdir=.. -cp ../lib/windows/jmf.jar;../lib/startup.jar;../lib/windows/jdic.jar;../resources;../lib/windows -Djava.library.path="../lib/windows" org.jivesoftware.launcher.Startup
goto end

:debug
start "Spark" "..\jre\bin\java" -Ddebugger=true -Ddebug.mode=true -XX:+HeapDumpOnOutOfMemoryError -Xdebug -Xint -server -Xnoagent -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=8000 -Dappdir=.. -cp ../lib/windows/jmf.jar;../lib/startup.jar;../lib/windows/jdic.jar;../resources;../lib/windows org.jivesoftware.launcher.Startup
goto end

:end
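For reference, the batch file is launched from a command prompt like this (a sketch assuming a default install at C:\Program Files\Spark; because the script uses relative paths, it must be started from the bin folder):

```
cd /d "C:\Program Files\Spark\bin"
startup.bat
startup.bat -debug
```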

That bat file does not run, as it cannot find the paths. My install is in the c:\program files\Spark folder. Do you have a 2.5.8 build as an exe or msi, not made with install4j, that you could share?

Hi Todd,

The batch file should be located in your c:\program files\Spark\bin folder. I can also send you a 2.5.8 folder with spark.exe and the startup.bat, but the zip file is 40 MB, so it might not be worth the effort.


After modifying the file to point to my JRE, I still cannot get Spark to run in debug mode. It works correctly without the -debug flag, but with it, a command window just pops up, then closes, and Spark never opens.

Any ideas? I can’t seem to find anyone else with the same problem.
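A generic cmd.exe technique (not something confirmed in this thread) that may show why the window closes: run the script from an already-open command prompt and redirect its output to a file so the error message survives after the window would normally disappear:

```
cd /d "C:\Program Files\Spark\bin"
startup.bat -debug > debug-log.txt 2>&1
type debug-log.txt
```

If the log mentions the JDWP socket, something else may already be listening on port 8000, which the -debug branch of the batch file binds with -Xrunjdwp:...,address=8000.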