Windows 7

I just installed Windows 7 64-bit (RTM) and can't seem to launch Spark. The install goes fine, but when I try to launch the program nothing seems to happen. I've checked the Windows events and don't see anything being logged.

I understand that Windows 7 isn't even publicly available yet, but has anyone else run into this? I would be willing to be a guinea pig if you need me to test things.

Just let me know!

Have just tested almost every Spark install (2.5.8, 2.6.0 Beta 2 and experimental SVN version installers) on Windows 7 64-bit RTM. Everything works fine.

I'm running the latest beta on Windows 7 RC. The only problem I have is that Spark disappears when minimizing it (it doesn't go to the systray).

Have just tried both 32- and 64-bit versions of Windows 7 (RTM). 2.6.0 Beta and the latest SVN versions show the systray icon. Maybe it is just hidden in that systray popup window? It is hidden there by default; you have to change the settings for this icon to make it show all the time.

I forgot about the 2.6.0 Beta 2 release when I posted this question. I installed that [2.6.0 Beta 2] and it works as I would expect it to. I'm not sure why 2.5.8 didn't work for me, but I'm happy with this.


I've double-checked that the new systray isn't hiding the icon. When I minimize Spark, it just disappears visually (it's still running in the process list).

Maybe I'll just need to reset my Spark settings. I'll try more debugging on Monday.

Two notes on Spark 2.5.8 with Windows 7:

  1. Make sure to install Java on Windows 7; otherwise, Spark can't find the Java VM and won't run.

  2. If using Windows 7 64-bit, the program files go to the C:\Program Files (x86)\Spark folder. Spark's shortcuts need to be changed to point there, as they default to C:\Program Files\Spark.

Otherwise, Spark seems to be working OK on Windows 7.
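Note 2 above can be sketched as a simple path check. This is only an illustration, not part of the Spark installer; the `ARCH` variable is a stand-in for Windows' `%PROCESSOR_ARCHITECTURE%` value, and the paths are the defaults mentioned in this thread.

```shell
# Sketch of note 2: on 64-bit Windows, the 32-bit Spark build lands in
# "Program Files (x86)", so shortcuts must point there instead.
ARCH="AMD64"    # assumed %PROCESSOR_ARCHITECTURE% value on a 64-bit machine

if [ "$ARCH" = "AMD64" ]; then
    SPARK_HOME='C:\Program Files (x86)\Spark'
else
    SPARK_HOME='C:\Program Files\Spark'
fi

echo "Point the Spark shortcut at: $SPARK_HOME"
```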

I have Windows 7 x64 and installed Spark with no problem a couple of days ago. I used the MSI package though, not the exe. Maybe that matters?

We have several users running Windows 7 and have for a while, from Beta all the way up to RTM, both x86 and x64, and the only problem we have had is the 64-bit path problem mentioned above. We are using 2.5.8, albeit with the MSI customized quite a bit.

Wanted to share in case others are also running a similar configuration to mine.

I am running Windows 7 in a VMware instance (virtual machine).

For some reason, Spark 2.5.8 was trying to write to a log under *\vmware-host\Shared Folders\Spark\logs\error.log*.

Of course, this folder did not exist. BTW, I installed Spark under C:\Spark.

Once I created the Spark\logs folder under *\vmware-host\Shared Folders*, I was able to start Spark. All seems to be working, even though I see this error in the log file: "\vmware-host\Shared Folders\userdic.tlx (The system cannot find the file specified)"

Hi everyone,

I'm very interested in Spark and I used it for a couple of days, but now I have the same problem: I try to launch Spark but nothing happens. Does anybody have any idea why, or a solution for this problem? I think it could be something with a Java update; I will try to uninstall the Java updates and post what happens.

Thank you in advance.

I would recommend looking at the log file under the Spark folder and checking for Java errors. If you are not familiar with Java errors, remove or rename the log file, then try to start Spark and upload the new log file here.
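To illustrate the check above, here is a minimal sketch. The `./Spark/logs` directory and the fabricated log line are stand-ins for the real install folder and a real error (the log layout is an assumption based on the earlier posts in this thread).

```shell
# Stand-in for the real Spark install folder on your machine
LOG_DIR="./Spark/logs"
mkdir -p "$LOG_DIR"

# Fake a log entry like the one quoted earlier, just to show the search step
cat > "$LOG_DIR/error.log" <<'EOF'
java.io.FileNotFoundException: userdic.tlx (The system cannot find the file specified)
EOF

# Count Java exception lines; anything above 0 is worth posting here for help
grep -c 'Exception' "$LOG_DIR/error.log"
```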

Thanks Jimmy,
I think I got it. I found some files where I installed Spark (C:\Program Files (x86)\Spark) even though Spark was uninstalled, and some in C:\Users\$MyUser\Spark. I deleted all of them, reinstalled Spark, and everything is alright.

Maybe this can help.