I installed version 2.5.7 through Group Policy (MSI), which succeeded, and then tried to run Spark as a regular user. Spark opened briefly (I saw it in Task Manager) and then closed. I suspected Spark was trying to access something it didn’t have permission to, so I ran the Filemon utility and found that Spark was trying to create several files in this directory: \Program Files\Spark\lib
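For anyone who wants to confirm the same failure without installing Filemon, a small script can attempt the same kind of file creation and report the result. This is just a sketch; on a real machine you would point it at the Spark lib directory, and the path in the example is an assumption:

```shell
#!/bin/sh
# Probe whether the current user can create files in a directory, which is
# what Spark's first run attempts in its lib folder. Illustrative sketch only.
can_create_files() {
  probe="$1/.spark-write-probe.$$"
  if ( : > "$probe" ) 2>/dev/null; then
    rm -f "$probe"
    echo "writable: $1"
  else
    echo "not writable: $1"
  fi
}

# Example (path is an assumption):
# can_create_files "/c/Program Files/Spark/lib"
```

A limited user running this against \Program Files\Spark\lib should see the "not writable" case, matching the write failures Filemon reports.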
I granted the local Users group Modify permissions (though Write permissions would probably have been enough) and tried to run Spark again. This time it opened. Having to run Spark once with administrative privileges before it can be run as a regular user makes an automated deployment much more difficult to achieve.
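For an automated deployment, that permission grant can be scripted rather than clicked through. Here is a sketch using icacls (available on Vista/Server 2008 and later; XP-era machines would need cacls instead). The directory path is an assumption, and on a non-Windows shell the function just prints the command it would run:

```shell
#!/bin/sh
# Grant the local Users group Modify rights on the Spark lib directory so a
# limited user can complete Spark's first-run unpacking. Sketch only: the
# path and the choice of icacls (vs. cacls on older Windows) are assumptions.
grant_users_modify() {
  dir="$1"
  # (OI)(CI)M = object inherit + container inherit, Modify access
  if command -v icacls >/dev/null 2>&1; then
    icacls "$dir" /grant "Users:(OI)(CI)M"
  else
    echo "icacls \"$dir\" /grant \"Users:(OI)(CI)M\""   # dry run elsewhere
  fi
}

# Example: grant_users_modify "C:\\Program Files\\Spark\\lib"
```

Note that this only works around the symptom; it still leaves Program Files writable by regular users, which is why fixing the unpacking step itself would be the better solution.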
Some time ago I reported the same issue in version 2.5.0: http://www.igniterealtime.org/community/thread/25212?tstart=105 It seemed to be fixed back then. I stopped using the MSI and have only been upgrading my existing installation with the EXE version with the bundled JRE, and with the latest versions I haven’t seen this problem. It seems this problem gets reintroduced from time to time. Maybe Daniel will fix it in the new MSI builds.
Thanks for your response. What happens now? Will this issue get reported to someone who can fix it?
I don’t know if this helps anyone, but I think I see what’s happening. When the install finishes it places “.jar.pack” files in \Program Files\Spark\lib. When someone runs Spark for the first time, those files are unpacked using the permissions of the currently logged-on user. Using Filemon I see write failures to that directory when Spark is first run as a limited user; as it should be, since users don’t have write access to the Program Files directory by default. Spark is trying to write “.jar” files created from the “.jar.pack” files. In my mind this is a simple fix: unpack the files during the installation, or don’t pack the files at all.
Maybe those jars were packed to make the installer smaller. Unpacking them during setup sounds logical, but we don’t know the capabilities of Jive’s setup-building tool. By the way, Spark now has a new lead developer (well, Spark is not his primary task), so maybe he will find a way to solve this.
Does someone know if there is a parameter or method to perform this initialization and then quit?
That way the unpacking could be part of the installation process and be done directly by the installation script.
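Even without such a switch, the unpacking itself could be done by the installation script, since the “.jar.pack” files are Pack200 archives and the JRE ships an unpack200 tool in its bin directory. Below is a sketch in POSIX shell for readability; a real GPO deployment would use the batch/cmd equivalent, and the directory layout and tool location are assumptions. When unpack200 is not on the PATH, the function only prints what it would do:

```shell
#!/bin/sh
# Convert every foo.jar.pack in a lib directory into foo.jar at install time,
# so the first run as a limited user never needs write access there.
# Sketch only: paths and the unpack200 location are assumptions.
preunpack_spark_libs() {
  lib_dir="$1"
  tool="${UNPACK200:-unpack200}"        # unpack200 ships in the JRE's bin/
  for pack in "$lib_dir"/*.jar.pack; do
    [ -e "$pack" ] || continue          # glob matched nothing; skip
    jar="${pack%.pack}"                 # foo.jar.pack -> foo.jar
    if command -v "$tool" >/dev/null 2>&1; then
      "$tool" "$pack" "$jar" && rm -f "$pack"
    else
      echo "would unpack: $pack -> $jar"   # dry run without unpack200
    fi
  done
}

# Example: preunpack_spark_libs "C:\\Program Files\\Spark\\lib"
```

Running this as the final step of the install (while still elevated) would leave only ready-to-use .jar files in lib, so nothing would need to be written there on first launch.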