I’m running the latest Spark version (2.5.8, Linux edition), installed under /opt/Spark. When I start Spark from a ‘normal’ local system account (e.g. ‘tim’ -> /home/tim), everything works fine.
If I instead log in with a domain account (e.g. DOMAIN\tim2 -> /home/DOMAIN/tim2), Spark tries to create the directory /opt/Spark/? rather than /home/DOMAIN/tim2/.Spark.
I inserted an “echo $HOME/test” into the Spark startup script, and it printed /home/DOMAIN/tim2/test on my screen, so $HOME itself looks correct.
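One further check I tried (a diagnostic sketch, not Spark-specific): since the literal “?” directory name suggests an unprintable character somewhere in the path Spark computes, dumping $HOME byte by byte can confirm whether a hidden control character (e.g. a stray carriage return or backslash from the domain login) is sneaking in:

```shell
# Dump $HOME byte by byte; any \r, \\ or other control character
# would show up here and could explain the literal '?' directory.
printf '%s' "$HOME" | od -An -c
```

In my case this showed only the expected characters, so the value of $HOME appears clean and the problem seems to be in how Spark (or its startup script) derives the settings directory.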
Can anybody help me fix this?