Bug: Spark unable to write logs under User account

Spark has a log directory in which it creates and writes logs. Unfortunately, its default install location is Program Files, and the installer does not change the permissions on the log directory to raise the Users group's access above the default Read & Execute. As a result, every time the program is launched under a standard user account it is unable to write these logs.

This should be solved via one of two methods:

  1. During installation, the Users group should be given Modify permission on the logs folder.

  2. Logs should be kept under %userprofile%\Local Settings\Application Data\Spark\Logs, ensuring that users with standard rights can write them.
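Option 2 is the conventional fix: resolve a per-user log directory instead of writing into the install folder. A minimal sketch of the idea, assuming illustrative names throughout (this is not Spark's actual code; on modern Windows, %LOCALAPPDATA% is the successor to `Local Settings\Application Data`):

```python
import os
from pathlib import Path

def user_log_dir(app="Spark", base=None):
    """Return a per-user, always-writable log directory (option 2 above).

    Prefers %LOCALAPPDATA% when set (Windows), falling back to the
    user's home directory elsewhere. Function and parameter names are
    illustrative, not Spark's API.
    """
    if base is None:
        base = os.environ.get("LOCALAPPDATA") or str(Path.home() / ".local" / "share")
    log_dir = os.path.join(base, app, "Logs")
    os.makedirs(log_dir, exist_ok=True)  # a standard user can always create this
    return log_dir

# Writing here succeeds regardless of the ACLs on Program Files:
with open(os.path.join(user_log_dir(), "output.log"), "a") as f:
    f.write("startup ok\n")
```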

Users’ logs are written in C:\Documents and Settings\%username%\Spark\logs

User logs, sure, but any time a user loads Spark it tries to update the error and output logs which it keeps within its own directory; it does so under the user’s permissions, and so the operation fails.

My install of Spark can add to the logs at both locations with no changes at all. I am not sure why you are having this issue, but I would suggest that you try to uninstall Spark, delete all residual directories, and re-install with a different installer. If you used the msi file the first time, try installing with the exe file this time.

How did you find out that Spark is trying to write to this location at startup? I mean, do you have any problems caused by that? I have never checked that directory’s permissions; I was sure it shouldn’t be writable by users, and I never had problems with that.

Run Process Monitor from Microsoft Sysinternals and you will see Spark trying to write these logs and failing, because the program is launched under User permissions and Users don’t have permission to write to the Spark\logs directory. I ran across this while trying to find out why Spark sometimes appears in the process list but fails to launch its GUI.
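What Process Monitor shows can be reproduced directly: the startup write is just an open-for-append on a file under the install’s logs folder, which fails with ACCESS DENIED when the account only has Read & Execute. A minimal probe, with the file name assumed for illustration:

```python
import os

def can_append_log(log_dir, name="output.log"):
    """Probe whether the current account can append to a log file in
    log_dir -- the same operation Process Monitor shows failing at
    Spark startup under a restricted account. Names are illustrative."""
    try:
        with open(os.path.join(log_dir, name), "a"):
            return True
    except OSError:  # PermissionError, missing directory, etc.
        return False
```

Running this against `%programfiles%\Spark\logs` as a standard user is a quick way to settle the disagreement below without screenshots.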

There are only two scenarios here:

a) These logs need writing and thus users need permissions to write to these files.

b) These logs don’t need writing and shouldn’t exist.

@mistravel:

Might I suggest that the reason your Spark can write the logs under %programfiles%\Spark\logs is that you’re running under a Power User or Administrator account, or you have modified the permissions on the Spark folder. Under a default Spark install this directory has only Read & Execute permissions for users, and therefore it is impossible for Spark, launched under a user account, to create the logs or write to them should they exist.

I have remotely checked several machines, and they all have operation logs in the Program Files folder with modification dates as recent as today. Our install of Spark is several months old, and these people are not admins or users with elevated privileges.

  1. Spark does not install a service and therefore runs under the permissions of the user account running it.

  2. %programfiles% by default gives the Users group Read & Execute permissions to it and all subfolders.

  3. A default install of Spark 2.5.8 does not modify the permissions on any of its folders.

Therefore, for a user to be writing to the logs subfolder, someone has at some time changed the permissions on that folder, whether on a PC-by-PC basis or via a Group Policy.
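For completeness, option 1 at the top of the thread amounts to a single explicit grant on the logs folder, applied per PC or through a Group Policy startup script. A sketch only, not an official fix; the path assumes a default install:

```shell
# Grant the local Users group Modify (M) on the Spark logs folder,
# inherited by new files (OI) and subfolders (CI). Run elevated.
# On Windows XP, "cacls" with /E /G Users:C is the rough equivalent.
icacls "C:\Program Files\Spark\logs" /grant "Users:(OI)(CI)M"
```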

Attached are 2 screenshots showing the permissions of the logs folder. They have not been modified, and the user does not have elevated permissions. The user is an AD standard user. I am the domain administrator and have not published any changes to the folder’s permissions to any machine. The software was installed via group policy.


Then the users are unable to write to those files, because Spark works under the permissions of the locally logged-on account; and if that is a user in the Users group, and Users only have R&E permissions on those files, and Spark doesn’t have a service running at a higher permission level, then it is 100% impossible for those files to be written to by Spark running under that account.

If Spark is writing to those files, then it means the account you’re using doesn’t have the permissions level you think it does.

Incidentally, if the user account was ever at a higher level, say elevated to admin to run the install, then it would be able to write to the log files because it would own them, and the CREATOR OWNER permission would then give it Full Control of the files. Check the permissions on the log files themselves. However, given that Administrator appears in the second screenshot, I would guess that is the account the program was installed under.

I have checked the actual log files as you suggested. None of the users have direct write access to or ownership of the files. None of the users have been elevated since, or in order to install Spark, as it was pushed to each machine via a computer-level group policy. The logs have been modified yet again today. So somehow my computers have the ability to edit the log files and yours does not. It is obviously something different in the way my systems are configured, or in the way Spark was installed. Either way, my standard users with no rights to those files do not have the issue you have. Spark can and does write to those files every day.

So, I have checked this with Process Monitor, and Spark is indeed trying to write to those logs. Not right for a limited-rights environment. But as it’s not causing any noticeable problems for most users (e.g. for me) and happens only at startup, I have created a JIRA ticket as a Task to investigate and maybe fix that. SPARK-949

Bumping this topic up …

The same problem still happens with my distribution of Spark via GPO.

No limited user can run Spark the first time. I need to run each copy of Spark on each computer once AS ADMINISTRATOR, because only after that are my limited users able to run Spark.

Any solution on this ?

Thanks

Solved … found the answer here thanks to @methodadmin

Message was edited by: voidzzz