I have tried Spark 2.5.8 and the Spark 2.6.0 beta, and neither runs. They install fine, but for a user with a roaming profile, launching the app does absolutely nothing. [Just when we’d standardized on Spark! Grrr…] Prior versions [2.5.2] work, and these versions do work when run as Administrator [local account]. Running spark.exe from CMD produced no output, and neither spark.exe /h nor spark.exe -h displays a usage message. Is there any way to debug Spark?
Does the spark.properties file get written when you start the application under a roaming profile?
Hi Adam,
you could try starting Spark with -Duser.home="C:\foobar" to make it use a different folder. See Spark JVM Settings (and, if you have an MSI install, also the Spark.cmd start script) for details on how to set it.
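For reference, a -D system property like this is typically passed either in the launcher's vmoptions file or on the java command line inside the start script. A minimal sketch, assuming a vmoptions file sits next to the Spark executable (check your install for the exact file name and location):

```text
:: Spark.vmoptions — one JVM argument per line
-Duser.home=C:\foobar
```

For an MSI install, the same argument would instead be appended to the java invocation inside Spark.cmd, before the main class name.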
Best regards