Spark Folder/File Creation - .jmf, Spark, userdic.tlx

I spoke with Derek briefly on this, but I’m going to post it here in case anyone else has the same question.

We noticed that Spark does some very strange things based on the HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders\Desktop key in the system registry. We are attempting to deploy Spark into our General Access labs. We have the “Desktop” key set to a read-only share which all users receive upon logging into the computers.
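For anyone who wants to check the same setting, a .reg export of that value looks roughly like this (the share path below is just a placeholder, not our actual server):

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders]
"Desktop"="\\\\server\\share\\Desktop"
```

Note that in .reg syntax each backslash in the UNC path is doubled.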

Now, for the weird part. For some reason, Spark will look at that key, ignore the last folder, and then save its application data there. For example, if the Desktop key was set to c:\classrooms\test, Spark will create the .jmf, Spark, and userdic.tlx entries there.

Our problem is that the share is read-only. What we need is a way to specify, via a config file, where Spark should create those folders (for example, c:\temp). Has anyone else had this problem and come up with an alternative solution?

Hi Aaron,

For me this key is set to “%userprofile%\Desktop” and I wonder why Spark would ever use it. During installation it may want to create an icon there, but afterwards it should use “%userprofile%\Spark” to store its data. I think that previous versions (and maybe also this one) query user.home, which is usually “%userprofile%”. You could create Spark.vmoptions as described in Spark JVM Settings and add “-Duser.home=c:/temp” there - this could work.
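To illustrate what that override does (this is just a generic Java sketch, not Spark’s actual code): a Java application reads its home directory from the user.home system property, and a -D flag on the JVM command line (or in a .vmoptions file) replaces that value before the application ever sees it.

```java
public class UserHomeDemo {
    public static void main(String[] args) {
        // By default, user.home resolves to the OS profile directory,
        // e.g. %userprofile% on Windows.
        System.out.println("user.home = " + System.getProperty("user.home"));

        // Launching with "java -Duser.home=c:/temp UserHomeDemo" (or putting
        // -Duser.home=c:/temp in Spark.vmoptions) overrides the value.
        // Simulated here by setting the property directly:
        System.setProperty("user.home", "c:/temp");
        System.out.println("overridden = " + System.getProperty("user.home"));
    }
}
```

If Spark builds its data path from user.home, this is why the override would redirect the .jmf, Spark, and userdic.tlx folders.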


I tried creating the Spark.vmoptions file as described, but I couldn’t even get the logging options to work. Is there something I’m missing? I placed the file in c:\program files\spark.

I was using the MSI install of Spark, which does not allow these options to be passed in via Spark.vmoptions. However, even with the exe install, -Duser.home still doesn’t get picked up. Spark is still creating its files in c:\documents and settings\username\spark.

Hi Aaron,

Make sure that the file type is VMOPTIONS-File, so that if you double-click it Windows does not know that “Notepad.exe” would be the right application to open it. If you entered only -Duser.home=c:/temp, then the file should have a size of 19 bytes (or probably 21 if you entered a line break), but never 22 or more, as that would indicate a UTF-8 file with a BOM header.
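You can verify the byte count and check for a BOM programmatically; a small sketch (the temp file here stands in for your Spark.vmoptions):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class VmOptionsCheck {
    public static void main(String[] args) throws IOException {
        // Stand-in for the real Spark.vmoptions next to Spark.exe.
        Path f = Files.createTempFile("Spark", ".vmoptions");

        // Plain ASCII, no BOM: exactly 19 bytes for this one option.
        Files.write(f, "-Duser.home=c:/temp".getBytes(StandardCharsets.US_ASCII));
        System.out.println("size = " + Files.size(f)); // 19 (21 with a CRLF line ending)

        // A UTF-8 BOM (EF BB BF) at the start would push the size to 22+
        // and make the JVM misread the first option.
        byte[] b = Files.readAllBytes(f);
        boolean hasBom = b.length >= 3
                && (b[0] & 0xFF) == 0xEF && (b[1] & 0xFF) == 0xBB && (b[2] & 0xFF) == 0xBF;
        System.out.println("BOM present: " + hasBom); // false
        Files.delete(f);
    }
}
```

Editors like Notepad on some Windows versions save UTF-8 with a BOM by default, which is the usual cause of the 22-byte file.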

As far as I know it does not matter how you installed Spark, since you create the file after installation in the directory where Spark.exe is located.


I spoke with Derek, who informed me that it was a no-go with the MSI install. I was able to get the logging options to work (as the post you directed me to described), but only with the exe install, not the MSI.

Are there any updates regarding this problem? The last time I chatted with Derek, he informed me someone else was working on another solution for this; is there any more information? We are in a situation where we could really use this fix, or else, sadly, we will have to investigate other clients.

I really need this fixed soon too, per the issue I describe in the discussion Re: Spark slowing down?? Major issues

Looks like there’s already a bug in the issue tracker for this problem. Hopefully attention can be given to getting it resolved; it doesn’t appear to be a very complex fix to me.

See Spark 743

I think this is related to this Java bug

We have the same issues with Maple in our labs. We redirect our desktop to a read-only location on the network and have to copy config files for Java programs up to this folder to get them to work.

Unfortunately, this won’t work for Spark, as we would like our users to be able to configure the program to their liking.

Any thoughts on a good way to get around this? Does this method with the spark.vmoptions work (