Spark 2.5.8 on Ubuntu 7.04 (Feisty Fawn) AMD64

Hello,

I appear to have a problem installing/running Spark 2.5.8 on Ubuntu 7.04 for the AMD64 architecture.

I have installed the Sun Java Runtime Environment 6 for the x64 architecture.

~$ java -version
java version "1.6.0"
Java(TM) SE Runtime Environment (build 1.6.0-b105)
Java HotSpot(TM) 64-Bit Server VM (build 1.6.0-b105, mixed mode)

The location of the JVM is: /usr/lib/jvm/java-6-sun
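(On Ubuntu this can be confirmed through the alternatives system, e.g.:

~$ readlink -f $(which java)

which should print a path under /usr/lib/jvm/java-6-sun.)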

I have downloaded spark_2_5_8.tar.gz from your website and unpacked it into my home directory, then I went into the Spark directory and tried to run starter:

~/Spark$ ./starter
Preparing JRE ...
testing JVM in /home/llhull/Spark/jre ...
~/Spark$

As you can see, it just returns to the console; nothing else happens and no window opens. Then I tried to run Spark directly.

~/Spark$ ./Spark
ls: /home/llhull/Spark/lib/windows: No such file or directory
~/Spark$

I don't know why it looks for a /lib/windows directory, since this is the Linux version of the client and it only has /lib/linux. Either way, I tried to force detection of the JVM directory by modifying the following line:

INSTALL4J_JAVA_HOME_OVERRIDE=

with

INSTALL4J_JAVA_HOME_OVERRIDE=/usr/lib/jvm/java-6-sun

in both the starter and Spark files.
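(For the record, the same change can be applied to both files in one step; a sketch using sed, assuming the variable line in your copies looks exactly like the one above:

~/Spark$ sed -i.bak 's|^INSTALL4J_JAVA_HOME_OVERRIDE=.*|INSTALL4J_JAVA_HOME_OVERRIDE=/usr/lib/jvm/java-6-sun|' starter Spark

The -i.bak option keeps backup copies of the unmodified scripts.) I then tried to run starter and Spark again.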

~/Spark$ ./starter
testing JVM in /usr/lib/jvm/java-6-sun ...
~/Spark$

~/Spark$ ./Spark
ls: /home/llhull/Spark/lib/windows: No such file or directory
(note: this time the command does not return to the console)

Nothing else happens with either command. I have checked error.log in the logs directory, but it's empty.

Can someone give me some pointers on what to do or try next? Maybe /usr/lib/jvm/java-6-sun is not the path this script needs; perhaps it wants /usr/lib/jvm/java-6-sun/jre instead? I would appreciate any suggestions. Thank you.
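(One quick check along those lines: running /usr/lib/jvm/java-6-sun/jre/bin/java -version directly should print the same version banner as above if the jre subdirectory holds a usable JVM.)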

Best regards,

Lucian Constantin

Does anything get written to the log files in the logs directory?
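A quick way to check, assuming the log files carry a .log extension as error.log does:

~/Spark$ ls -l logs/
~/Spark$ cat logs/*.log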

Also, try starting it like this:

sh -x ./Spark

And post the output here. It will give more clues.
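If the trace is long, capture it to a file and attach that instead; the -x trace goes to stderr, so redirect both streams:

sh -x ./Spark > spark-trace.txt 2>&1

(spark-trace.txt is just an example filename.)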

Hi Lucian,

In my case Spark starts when I issue ./Spark. I am running SUSE 11.2 and GNOME.

I still got the error message you stumbled upon: "ls: /homes/cristi/Spark/lib/windows: No such file or directory".

However, this is not a big issue, just a mistake in the Spark shell script (see line 249: for i in `ls "$app_home/lib/windows" | egrep "\.(jar$|zip$)"`). Since this is the Linux distribution, it ships no windows directory under lib. So what you need to do is go into the lib directory (of the Spark app directory) and create a directory named windows.
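Assuming Spark is unpacked in your home directory, as in Lucian's case, that is just:

~$ mkdir ~/Spark/lib/windows

The ls in the script then sees an empty directory and the error message goes away.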

This will solve the problem.

Best regards,

Cristian Scrieciu