Fresh Install of 3.8 on CentOS 6.3 Not Working

Well, you know, I’m a Linux noob, and it turned out that I had changed the wrong line in that file and put ‘permissive’ where it should have said ‘targeted’. My bad. I fixed it, though. The best way to learn something is to break it and have to fix it, right?!

I’ll look into that. My guess is that it’s not installed. We do run Nagios on a CentOS box, but I doubt it’s doing anything with Active Directory.

I’ve looked in the Windows logs and don’t see the LDAP queries hitting the server anywhere, and I don’t see any failures in the security log, so I’m leaning toward LDAP not working properly on the CentOS box. I haven’t figured out how to turn on the LDAP debug, though.
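A quick way to sanity-check LDAP from the CentOS side, independent of Openfire, is ldapsearch from the openldap-clients package; the host name, bind account, and base DN below are placeholders for your own values:

sudo yum install openldap-clients -y

ldapsearch -x -H ldap://dc1.example.local -D "openfire@example.local" -W -b "dc=example,dc=local" "(sAMAccountName=testuser)"

If that returns entries, basic connectivity and the bind are fine and the problem is likely in the Openfire settings; if it can’t connect or bind, the problem is below Openfire.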

welcome to linux lol!

there’s a “fun” distro of Linux called Suicide Linux… if you mistype or misspell a command, it runs:

rm -rf /

deletes everything on the hard drive, starting from the root directory! It’s like a game to see how long you can go without a typoed command or something lol…

anyways, your /etc/selinux/config file should look something like this now:

# This file controls the state of SELinux on the system.
# SELINUX= can take one of these three values:
#     enforcing - SELinux security policy is enforced.
#     permissive - SELinux prints warnings instead of enforcing.
#     disabled - No SELinux policy is loaded.
SELINUX=permissive
# SELINUXTYPE= can take one of these two values:
#     targeted - Targeted processes are protected,
#     mls - Multi Level Security protection.
SELINUXTYPE=targeted

make sure to reboot to have it take effect…
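by the way, you can check and flip the running mode without a reboot… getenforce prints the current mode, and setenforce 0 switches to permissive until the next boot (the config file edit is what makes it permanent):

getenforce

sudo setenforce 0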

Set the system property ldap.debugEnabled to true (or add it if it isn’t there). You may have to restart Openfire. You also need to enable debug logging under the “Logs” menu option - it’ll log out to debug.log.
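Once debug is on, a handy way to watch just the LDAP traffic is to tail the debug log from a shell; this assumes an RPM-style install under /opt/openfire, so your path may differ:

tail -f /opt/openfire/logs/debug.log | grep -i ldap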

After trying a few things, including installing Identity Management for UNIX, it worked! I had to add some static routes to the network config, and I can’t get Spark to install on it due to some well-documented RPM dependency errors, but I can deal with not having Spark installed on the server.

glad you got it working!

to get Spark running on your server… download the tar.gz file and unpack it… there is a Spark shell script inside; run this and it will launch Spark.

http://www.igniterealtime.org/downloadServlet?filename=spark/spark_2_6_3.tar.gz

from the terminal do this:

sudo yum install wget -y

cd

wget -O spark_2_6_3.tar.gz 'http://www.igniterealtime.org/downloadServlet?filename=spark/spark_2_6_3.tar.gz'

tar -zvxf spark_2_6_3.tar.gz

cd Spark

./Spark

if you have no GUI environment (X session) on your CentOS install, then i think it will fail with some error… otherwise it should start right up.
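quick sanity check for an X session, by the way… if this prints nothing, there’s no GUI for Spark to draw on:

echo $DISPLAY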


EDIT:

Oh, and as i'm sure you already know, MS Domain Services (Active Directory) is very heavily dependent on DNS and network time being synced.

do this to add your domain controller or internal DNS server as your CentOS DNS server:

vi /etc/resolv.conf

where it says nameserver xxx.xxx.xxx.xxx

change to:

nameserver your.internal.dns.server

or add it while keeping the existing nameserver entry -- i'd put your internal DNS server above whatever is in there, if anything... this makes CentOS check that local DNS server first before trying anything else...
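for example, with a placeholder internal DNS server at 192.168.1.10 and the old public resolver kept as a fallback, the finished file would look something like:

search example.local

nameserver 192.168.1.10

nameserver 8.8.8.8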

then do:

sudo service network restart

(will disconnect any SSH sessions open... just a heads up).

then for time sync – do this from terminal:

sudo yum install ntp -y

sudo vi /etc/ntp.conf

scroll down until you see the default lines of:

server 0.centos.pool.ntp.org

server 1.centos.pool.ntp.org

etc etc…

change to this if you are in the US – or look up your closest pool here: http://www.pool.ntp.org/en/ (look on the right-hand side of the page, click your continent, and then scroll down to find your country, etc…) – or set this to your internal NTP server if you have one (and that NTP server should get its time from the pool):

server 0.us.pool.ntp.org

server 1.us.pool.ntp.org

server 2.us.pool.ntp.org

then save the file…

on terminal again do:

sudo ntpdate 0.us.pool.ntp.org (or one of the servers you set in the config file)

then do:

sudo service ntpd start

sudo chkconfig ntpd on
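once ntpd has been running for a minute or two, you can confirm it is actually syncing with:

ntpq -p

an asterisk in front of one of the listed servers means you are locked on to it.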

all set!

When I unpack it, and then run the shell script, it gives me this error:

Preparing JRE…

testing JVM in /home/OpenFireAdmin/Downloads/Spark/jre…

ls: cannot access /home/OpenFireAdmin/Downloads/Spark/lib/windows: No such file or directory

There is a windows64 directory, but not a windows directory. So, I copied the contents (a .dll file) to a windows folder, and when I run the ./Spark script, nothing happens.

What did you download? There is a Linux tar.gz or an RPM for Spark. You probably want to use those.

Initially, I tried the RPM, but got a message about some missing dependencies (a couple of files that are super old, and I couldn’t find them), so I tried the tar.gz file.

hmm… i think i vaguely remember having this issue initially… try just creating an empty directory called “windows”; the startup scripts are simply looking for its existence, not for anything inside…

so at the terminal do:

cd ~/your_spark_directory

mkdir lib/windows

chmod -R 755 lib/windows


also, check that the file permissions on the entire Spark/ directory are correct and allow execution... but if you unpacked the .tar.gz correctly then you should already be set here...

... also, as a sanity check, you are running in a GUI environment right? Spark will fail if there is no gui to work with...

on my laptop, Spark complains about missing some GTK stuff, but still fires up anyway (it's stuff for a Spark theme/skin that i don't really care about anyway...)

EDIT:

It just occurred to me: the user you unpacked Spark with… is this the same user you’re trying to run it under? if you unpacked as, say, the root user, then the permissions might be off on the entire Spark/ directory, and that could cause your issue…

Yeah, I’m running the GUI.

I did the chmod thing, but to no avail. So, I logged in as root, moved the folder from my home Downloads directory to /opt, and tried to run the script while logged in as root. It said:

testing JVM in /opt/Spark/jre…

Then nothing. Is there a log somewhere I can look at to see what’s going on under the covers?

yes,

under whatever user you ran it as… do this:

cd ~/.Spark/logs

ls -la

cat error.log

… anything?

also… when you say it does “nothing”, is it still running with nothing noticeable happening (as in, the cursor on the terminal has not returned to a fresh blank line), or does it crash out and return you to the console with a fresh blank line?

for testing purposes, try verifying that the jre directory inside Spark is populated with stuff, and that the permissions are ok…

do:

cd Spark/jre

ls -la

------------- if there’s stuff, then see what permissions they are… i think most if not all files in there should be set as executable… likely in a 755 configuration… if you get really desperate, you can always chmod -R 777 the entire Spark directory and just see if it will run then (do not leave it running with these permissions; that is strictly for testing and is unsafe). but if Spark runs with those super-loose permissions, then you know the permissions are simply dialed in wrong. i’d delete the Spark directory by doing a rm -rf Spark and then grab a fresh copy with wget and try again…
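spelled out, grabbing that fresh copy would look something like this (the -O flag just keeps wget from saving the file under the servlet name):

cd ~

rm -rf Spark spark_2_6_3.tar.gz

wget -O spark_2_6_3.tar.gz 'http://www.igniterealtime.org/downloadServlet?filename=spark/spark_2_6_3.tar.gz'

tar -zxvf spark_2_6_3.tar.gz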

Here’s the log:

[root@AccOpenFire logs]# cat error.log

java.lang.reflect.InvocationTargetException

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)

at java.lang.reflect.Method.invoke(Unknown Source)

at org.jivesoftware.launcher.Startup.start(Startup.java:94)

at org.jivesoftware.launcher.Startup.main(Startup.java:44)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)

at java.lang.reflect.Method.invoke(Unknown Source)

at com.exe4j.runtime.LauncherEngine.launch(Unknown Source)

at com.install4j.runtime.Launcher.main(Unknown Source)

Caused by: java.lang.UnsatisfiedLinkError: /opt/Spark/jre/lib/i386/xawt/libmawt.so: libXext.so.6: cannot open shared object file: No such file or directory

at java.lang.ClassLoader$NativeLibrary.load(Native Method)

at java.lang.ClassLoader.loadLibrary0(Unknown Source)

at java.lang.ClassLoader.loadLibrary(Unknown Source)

at java.lang.Runtime.load0(Unknown Source)

at java.lang.System.load(Unknown Source)

at java.lang.ClassLoader$NativeLibrary.load(Native Method)

at java.lang.ClassLoader.loadLibrary0(Unknown Source)

at java.lang.ClassLoader.loadLibrary(Unknown Source)

at java.lang.Runtime.loadLibrary0(Unknown Source)

at java.lang.System.loadLibrary(Unknown Source)

at sun.security.action.LoadLibraryAction.run(Unknown Source)

at java.security.AccessController.doPrivileged(Native Method)

at java.awt.Toolkit.loadLibraries(Unknown Source)

at java.awt.Toolkit.<clinit>(Unknown Source)

at com.jtattoo.plaf.JTattooUtilities.<clinit>(Unknown Source)

at com.jtattoo.plaf.AbstractLookAndFeel.<clinit>(Unknown Source)

at java.lang.Class.forName0(Native Method)

at java.lang.Class.forName(Unknown Source)

at javax.swing.SwingUtilities.loadSystemClass(Unknown Source)

at javax.swing.UIManager.setLookAndFeel(Unknown Source)

at org.jivesoftware.Spark.loadLookAndFeel(Spark.java:271)

at org.jivesoftware.Spark.startup(Spark.java:186)

… 12 more

I did the ls -la and got this:

[root@AccOpenFire jre]# ls -la

total 236

drwxr-xr-x. 4 root root 4096 Mar 1 10:48 .

drwxr-xr-x. 11 root root 4096 Mar 1 10:48 ..

drwxr-xr-x. 2 root root 4096 Mar 1 10:48 bin

-r--r--r--. 1 root root 3767 Jul 1 2011 COPYRIGHT

drwxr-xr-x. 17 root root 4096 Mar 1 10:48 lib

-r--r--r--. 1 root root 12720 Jul 1 2011 LICENSE

-r--r--r--. 1 root root 15906 Jul 1 2011 README

-r--r--r--. 1 root root 183173 Jul 1 2011 THIRDPARTYLICENSEREADME.txt

-r--r--r--. 1 root root 968 Jul 1 2011 Welcome.html

[root@AccOpenFire jre]#

ok do this:

uname -a

post output here…

also, see if you can navigate down to /opt/Spark/jre/lib/i386/xawt/ and see if libmawt.so is in that directory.
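if it is there, ldd will list every shared library it needs and flag the ones it cannot find, which tells you exactly which 32-bit packages are missing:

ldd /opt/Spark/jre/lib/i386/xawt/libmawt.so | grep 'not found'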

yum install libXext.i686

There may be other libraries you are missing. You’ll have to install them as they are identified. You’ll want to install the 32-bit libraries, since it looks like you’re using the 32-bit JRE.
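When a new “cannot open shared object file” name shows up, yum can map it back to the package that provides it; for this one, for example:

yum provides '*/libXext.so.6'

Then install the .i686 variant of whatever package it names.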

ah… i just checked that on my laptop, and it was already installed!

I think it came with the “Development Tools” groupinstall i did when i initially set my laptop up… either that or Fedora now packages that in by default, which is doubtful…

you could try doing

sudo yum groupinstall "Development Tools"

but this will install a bunch of other things you may not want or need on your server… it installs a bunch of compilers, libraries, and other devel things…

uname -a gives me:

Linux AccOpenFire 2.6.32-279.22.1.el6.x86_64 #1 SMP Wed Feb 6 03:10:46 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux

I can go to that directory, and that file does exist.

I went ahead and installed the ‘Development Tools’. I’m not worried about having too many extra libraries installed, as long as I’m able to get it working. I also installed the libXext.i686. Those didn’t help, by the way.

Did installing libXext.i686 get you a libXext.so.6 file? You might want to run ldconfig.
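ldconfig refreshes the shared-library cache after new libraries are installed; you can then confirm the 32-bit libXext is visible with:

sudo ldconfig

ldconfig -p | grep libXext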

Maybe just use the JRE which ships with the OS rather than the one from Spark?

Running a search for libXext.so.6 found two files: libXext.so.6 and libXext.so.6.4.0

How would I switch to the JRE that ships with the OS? Is that a config file change somewhere?
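An untested sketch, for what it’s worth: the Spark launcher script is generated by install4j, and install4j launchers generally honor an INSTALL4J_JAVA_HOME_OVERRIDE environment variable, so something along these lines may work (the OpenJDK package name and the /usr/lib/jvm/jre path are placeholders for wherever your distro puts its JRE):

sudo yum install java-1.6.0-openjdk -y

export INSTALL4J_JAVA_HOME_OVERRIDE=/usr/lib/jvm/jre

cd /opt/Spark

./Spark

Failing that, some people just move the bundled jre directory aside (mv jre jre.bundled) so the launcher has to search for a system JVM instead.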