Spark on Ubuntu Linux

I have been using Spark on Windows for a while and would like to use it on Ubuntu Linux. I downloaded the Linux version, unpacked the archive, and tried to start Spark. Here is the output:

root@chrisc-ubuntu:/home/cchristensen/Desktop/Downloads/Spark# ./Spark

Preparing JRE …

testing JVM in /home/cchristensen/Desktop/Downloads/Spark/jre …

ls: cannot access /home/cchristensen/Desktop/Downloads/Spark/lib/windows: No such file or directory

Locking assertion failure. Backtrace:

#0 /usr/lib/libxcb-xlib.so.0

#1 /usr/lib/libxcb-xlib.so.0(xcb_xlib_unlock+0x31)

#2 /usr/lib/libX11.so.6(_XReply+0xfd)

#3 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/xawt/libmawt.so

#4 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/xawt/libmawt.so

#5 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/xawt/libmawt.so

#6 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/xawt/libmawt.so(Java_sun_awt_X11GraphicsEnvironment_initDisplay+0x2f)

#7

#8

#9

#10

#11 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/client/libjvm.so

#12 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/client/libjvm.so

#13 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/client/libjvm.so

#14 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/client/libjvm.so(JVM_DoPrivileged+0x34b)

#15 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/libjava.so(Java_java_security_AccessController_doPrivileged__Ljava_security_PrivilegedAction_2+0x3d)

#16

#17

#18

#19 /home/cchristensen/Desktop/Downloads/Spark/jre/lib/i386/client/libjvm.so

java: xcb_xlib.c:82: xcb_xlib_unlock: Assertion `c->xlib.lock' failed.

Aborted (core dumped)

Notice that the windows folder is referenced in the fourth line of the output (the ls error).

I have searched this forum for answers but have not seen any. If I am missing something please point it out to me.

Thanks in advance for the help.

I have this problem too. Did you find a solution? My install also tries to find /lib/windows; the GUI loads up, but it is just a blank grey screen.

I did find out that this is related to a Java bug. Here is the related post and solution on my blog:

http://geekdept.com/blog/?p=35

Thanks for the reply. I think I found a fix for my problem: I have to disable beryl before I load up Spark, and once Spark is loaded I can start beryl again and it all works fine. It is a little annoying, but I'm sure a fix will come soon.
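
In case it helps anyone else, the workaround can be wrapped in a couple of commands. This is just a sketch and assumes beryl is launched via beryl-manager; adjust the process names to however your setup starts it:

killall beryl beryl-manager    # stop the compositor before launching Spark
./Spark &                      # start Spark while compositing is off
sleep 10                       # give Spark time to bring up its window
beryl-manager &                # restart beryl once Spark is running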

Actually, the fix is easier than all of that.

You just need to change windows to linux in the Spark startup file.

There was this:

for i in `ls "$app_home/lib/windows" | egrep "\.(jar$|zip$)"`

to

for i in `ls "$app_home/lib/linux" | egrep "\.(jar$|zip$)"`

and it is running for me. Although, now that I look at the file, right under it, I see this:

add_class_path "$app_home/lib/windows/$i"

And I haven’t changed that… It’s running now, so I don’t know that it’s needed, but I think I’ll change that one too, and see what happens.

And, now that I look, I also see this:

"$app_java_home/bin/java" -client -Dinstall4j.jvmDir="$app_java_home" -Dexe4j.moduleName="$prg_dir/$progname" "-Dappdir=$prg_dir/" -Dsun.java2d.noddraw=true "-Djava.library.path=$prg_dir/\lib\windows" …

OK, it looks like those are it.
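
For anyone following along, with windows changed to linux in all three spots, those lines would look roughly like this (a sketch pieced together only from the lines quoted above, so the surrounding install4j-generated script may differ slightly):

for i in `ls "$app_home/lib/linux" | egrep "\.(jar$|zip$)"`   # pick up the Linux jars instead of the Windows ones
do
add_class_path "$app_home/lib/linux/$i"                       # add each one to the classpath from lib/linux
done

and, on the java command line, "-Djava.library.path=$prg_dir/\lib\linux" in place of the windows path (keeping the backslashes that are already in the original line).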

Anyway, it’s running for me.

desiv

OK, more testing.

What I did was comment out these lines:

#for i in `ls "$app_home/lib/windows" | egrep "\.(jar$|zip$)"`

#do

#add_class_path "$app_home/lib/windows/$i"

#done

I did that because the same lines appear lower down in the file with linux in them. I think there's a missing "if" statement there.
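
Just as a guess at what that missing check might look like (only a sketch, and platform_dir is a name I made up, not something from the real script), something like this with uname would let one copy of the loop cover both cases:

# pick the right lib subfolder based on the OS
if [ "`uname`" = "Linux" ]; then
platform_dir=linux
else
platform_dir=windows
fi

for i in `ls "$app_home/lib/$platform_dir" | egrep "\.(jar$|zip$)"`
do
add_class_path "$app_home/lib/$platform_dir/$i"
done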

But then I did change the line above that had "-Djava.library.path=$prg_dir/\lib\windows" to "-Djava.library.path=$prg_dir/\lib\linux".

It just feels like a cleaner fix. YMMV