Client Support for Fedora (rpm)

Hello Everyone,

Some colleagues and I want to use the Spark client on Linux (specifically Fedora 20, 64-bit, RPM-based), but we are running into trouble with both the distributed RPMs and the source variants. We tried nightly and stable builds with no success.

The problem with installing the RPMs is that the required libodbcinst.so and libodbc.so are not found, although they are installed and already present under /usr/lib64 on the system. I think there is a problem in the package building process.

The sources don’t work either. When ./Spark is started, it always reports that {SPARK_ROOT}/lib/windows is “No such file or directory”.

So we are looking for a way to get Spark working, and we would be happy to help solve these problems as far as we can. In any case, thanks to the developers for the nice piece of software they provide for free.

Sincerely,

Oliver

you need the 32-bit versions of those libraries (i.e. the .i686 packages). plain rpm doesn’t do automatic dependency resolution the way yum does it for you.

to install, usually it’s something like:

sudo yum install libodbc.i686 libodbcinst.i686

I don’t know if those are the actual package names (my Fedora laptop is off right now). But if not, you can always do:

sudo yum provides libodbc.i686

and see what packages it comes in. Then install that package.

Hi,

thank you for the reply.

The 32- and 64-bit libs are installed now. libodbc.so and libodbcinst.so are provided by the unixODBC and unixODBC-devel packages. Packages named libodbc* are not in the yum DB.

The yum provides command gives no result for libodbc on Fedora 20, so I think they renamed it to unixODBC, or am I completely wrong?
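(As an aside: yum provides also accepts file paths and globs, so even though no package is literally named libodbc*, something along these lines should point you at the owning unixODBC packages. This is just a generic yum tip, not Spark-specific.)

yum provides '*/libodbc.so'
yum provides '*/libodbcinst.so'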

So far, the problem with the dependencies still exists. Tried with the Spark-2.6.3 and Spark-2.7.0.651 RPMs.

Sincerely,

Oliver

just to make sure we’re together here… you installed the 32-bit versions of the unixODBC and unixODBC-devel packages?

ie: sudo yum install unixODBC.i686 unixODBC-devel.i686

it may also be worth installing the developer tools on your system. a lot of packages need some stuff out of there, so it may be helpful in more than one case.

would be something like:

sudo yum groupinstall "Development Tools"

Yes, I have installed both unixODBC.i686 and unixODBC.x86_64, as well as unixODBC-devel.i686 and unixODBC-devel.x86_64, just to be safe.

I also installed as you mentioned:

yum groupinstall "RPM Development Tools"

Here is my grouplist output:

yum grouplist
Loaded plugins: langpacks, refresh-packagekit
Available environment groups:
   GNOME Desktop
   KDE Plasma Workspaces
   Xfce Desktop
   LXDE Desktop
   Cinnamon Desktop
   MATE Desktop
   Sugar Desktop Environment
   Development and Creative Workstation
   Web Server
   Infrastructure Server
   Basic Desktop
   Minimal Install
Installed groups:
   Development Tools
   LibreOffice
   RPM Development Tools
Available Groups:
   3D Printing
   Administration Tools
   Authoring and Publishing
   Books and Guides
   C Development Tools and Libraries
   Cloud Infrastructure
   Design Suite
   Editors
   Educational Software
   Electronic Lab
   Engineering and Scientific
   Fedora Eclipse
   FreeIPA Server
   Games and Entertainment
   Medical Applications
   Milkymist
   Network Servers
   Office/Productivity
   Robotics
   Security Lab
   Sound and Video
   System Tools
   Text-based Internet
   Window Managers

Doesn’t help yet.

hmm… i’ll need to spin up a fedora 20 vm, or better yet, upgrade my laptop from 19 -> 20.

you could also try downloading the tar.gz version, extracting it someplace, and then running the startup.sh file found inside the bin/ directory. a lot of people on the forums appear to be doing this, and just writing a shell script to start it at login or whatever. it may be acceptable for some people to just do it this way.

http://bamboo.igniterealtime.org/artifact/SPARK-INSTALL4J/shared/build-653/Install4j/spark_2_7_0_653.tar.gz
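For reference, a rough sketch of that tarball approach (the /opt path and the name of the unpacked directory are only examples; adjust them to wherever the archive actually lands on your system):

tar xzf spark_2_7_0_653.tar.gz -C /opt
# start the client via the bundled script
/opt/Spark/bin/startup.sh

# optional wrapper, e.g. saved as /usr/local/bin/spark and marked executable:
#!/bin/sh
cd /opt/Spark/bin && exec ./startup.sh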

i’ll see if i can replicate your issue if I have time this evening.

Hi,

actually the spark_2_7_0_653.tar.gz works after I installed the missing i686 packages. So for the moment it is a solution we can live with.

It would be great if you could provide a working RPM in the future, mainly because of updates. On the Fedora side it would be even easier if there were a yum repository for the packages, if that is not too much work.
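For what it is worth, serving the RPMs from a yum repository mostly comes down to running createrepo over a directory of packages and pointing the clients at it with a .repo file; a minimal sketch, with made-up paths and URL:

createrepo /var/www/html/spark/

# /etc/yum.repos.d/spark.repo on each client (example values):
[spark]
name=Spark client packages
baseurl=http://repo.example.org/spark/
enabled=1
gpgcheck=0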

For further testing contact me if you want.

Thanks for your time.

I took a look at the 2.6.3 release rpm as well as the rpm build process last night… unfortunately, neither is in great shape.

The RPM build process appears to have been set up a while ago, and there are large chunks that need updating, such as the bundled Java JRE (it bundles 1.6.0_01, which is very old).

The problem with the libs you experienced originally is due to the RPM looking for the old version of libodbc, etc. Even though you installed them via the unixODBC package, the RPM database is not aware that they are the same thing, and therefore it fails the install.
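You can see that mismatch directly by comparing what the RPM declares it needs against what the installed unixODBC packages actually advertise; roughly (exact output will vary by version):

rpm -qp --requires spark-2.7.0.rpm | grep -i odbc
rpm -q --provides unixODBC unixODBC-devel | grep -i odbc

Presumably the Spark package asks for library names that current unixODBC builds no longer advertise as provides, so rpm treats the dependency as unmet.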

Obviously the RPM needs some love

Hi Jason,

I’m in the same boat as Oliver here, trying to install spark-2.7.0.rpm on Fedora 20. I have installed the i686 versions of unixODBC and the Development Tools group, then tried to use yum localinstall spark-2.7.0.rpm so that it would do the dependency resolution. It told me that it would also be installing compat-libstdc++-33.i686, but it fails the actual install with these two errors:

Error: Package: Spark-2.7.0.668-1.x86_64 (/spark-2.7.0)
           Requires: libodbcinst.so
Error: Package: Spark-2.7.0.668-1.x86_64 (/spark-2.7.0)
           Requires: libodbc.so

These files are installed on my filesystem, however:

[jt@jtsdesktop ~]$ locate libodbc.so
/usr/lib/libodbc.so
/usr/lib/libodbc.so.2
/usr/lib/libodbc.so.2.0.0
/usr/lib64/libodbc.so
/usr/lib64/libodbc.so.2
/usr/lib64/libodbc.so.2.0.0
[jt@jtsdesktop ~]$ locate libodbcinst.so
/usr/lib/libodbcinst.so
/usr/lib/libodbcinst.so.2
/usr/lib/libodbcinst.so.2.0.0
/usr/lib64/libodbcinst.so
/usr/lib64/libodbcinst.so.2
/usr/lib64/libodbcinst.so.2.0.0

Since the yum localinstall command failed, I went ahead and installed compat-libstdc++-33.i686 myself, and having satisfied myself that the dependencies were met, I ran rpm -i --nodeps spark-2.7.0.rpm, which installs fine, but then when I try to run the spark command, I get the following:

[jt@jtsdesktop ~]$ spark
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/jivesoftware/launcher/Startup : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(Unknown Source)
        at java.security.SecureClassLoader.defineClass(Unknown Source)
        at java.net.URLClassLoader.defineClass(Unknown Source)
        at java.net.URLClassLoader.access$000(Unknown Source)
        at java.net.URLClassLoader$1.run(Unknown Source)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClassInternal(Unknown Source)

(This would be because of the old bundled jre?)
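If you want to confirm that, the bundled runtime can be inspected directly. Class file version 51.0 corresponds to Java 7, which a Java 6 VM refuses to load; the path below is where the rpm drops its private JRE (assumed from the package layout described later in the thread):

rpm -ql Spark | grep 'jre/bin/java$'
/usr/share/spark/jre/bin/java -version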

I’m not sure how much this helps you, as it sounds like you may already have a plan of attack, but I figured it couldn’t hurt to detail exactly where the issues lie when we are trying to run the program. For now I will try the tar.gz, but I really would love to see an RPM that I can manage properly.

Thanks for your hard work!

JT

I’m afraid Jason is no longer participating here (last logged in a year ago).

“Unsupported major.minor version 51.0” means that you probably have Java 6, while Spark 2.7.0 needs at least Java 7 (class file version 51.0 is the Java 7 format).

Definitely a JRE mismatch, but the cause is surprising. Spark as packaged in the rpm actually uses its own internal JRE (/usr/share/spark/jre), which is a 32-bit 1.6.0_03-b05. A while back we had an internal project to make some enhancements to Spark for Windows and used the 2.7.0 source as its base. We also stayed with a bundled JRE, but bumped it up to 1.7.0_60-b19 when 1.6.0 failed.

When I installed the new 2.7.0 rpm on my Fedora 21 workstation I got the same results as you did.

Just as an experiment, I downloaded the last 32-bit 1.7.0 update from Oracle, jre-7u79-linux-i586, and swapped it in for the shipping JRE (I just renamed /usr/share/spark/jre to /usr/share/spark/jre-1.6.0 and then copied the uncompressed 1.7.0 JRE to /usr/share/spark/jre). Although it complained mightily in the background, Spark finally came up and let me log into my dev Openfire server. I can’t say how stable or usable this will be over time (windows take an inordinate time to paint and sometimes need a nudge by clicking into them). It would probably be better to recompile Spark from source using Java 1.8 at this point, but that’s an extra mile this lowly sysadmin won’t be traveling.
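For anyone wanting to reproduce that swap, it boils down to something like the following (the extracted directory name is what Oracle’s jre-7u79-linux-i586 tarball typically produces; adjust if yours differs):

cd /usr/share/spark
mv jre jre-1.6.0
tar xzf ~/jre-7u79-linux-i586.tar.gz
mv jre1.7.0_79 jre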

A couple of notes: (1) The odbc packages seem to be a hang-up on a lot of systems; I also wound up using “rpm -ivh --nodeps” to get past them (why does an XMPP client need database drivers?). (2) Even though there’s nothing checked off that should cause it to be invoked, the client seems to go out and try Kerberos authentication anyway, which drags out the startup process considerably.