I downloaded Spark yesterday, went to install it today, and am a little lost given the lack of any install documentation in the tarball. So far I have extracted the Spark directory to /opt and tried running both the “starter” and “Spark” scripts. Running “Spark” produced the error above. On investigation I discovered, to my horror, that there was no “windows” file or directory as the error reported, but there was a “linux” directory! Could it be that although I downloaded the “RHEL” version, the startup script is for MS Windows? Or am I doing something wrong? Any help here would be great; documentation would be fantastic!