Spark crashes after login

I am experiencing a problem with the Spark client installed on a Toshiba laptop. The fingerprint scanner has been disabled, as suggested in a previous post I found, but we are still having the same issue. The user is able to log in, but as soon as the program begins to connect it immediately crashes. I have uninstalled and reinstalled using both the online and offline versions of 2.5.8. I have also updated Java and tried installing the 2.6 beta, but we are getting the same error. This is what I have from the error log:

Exception on commit = java.io.IOException: Can't find registry file
Exception in thread "Thread-43" java.lang.NoClassDefFoundError: mil/jfcom/cie/media/session/MediaSessionListener
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$000(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClassInternal(Unknown Source)
at org.jivesoftware.sparkplugin.JinglePlugin$1.construct(JinglePlugin.java:108)
at org.jivesoftware.spark.util.SwingWorker$2.run(SwingWorker.java:129)
at java.lang.Thread.run(Unknown Source)

Any help would be greatly appreciated, and if any additional information is needed please let me know and I’ll add it as soon as I can.
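
In case it helps clarify what I think the log is saying, here is a minimal, hypothetical Java sketch (this is not Spark's code; the class name comes from the trace above, and the profile path is only an assumption) of the two failures: a media class the Jingle plugin was compiled against that cannot be loaded, and a file Spark expects that it cannot find or read:

import java.io.File;

// Hypothetical sketch, not Spark's actual code. JinglePlugin references
// mil.jfcom.cie.media.session.MediaSessionListener; if the JAR providing that
// class is missing or unreadable at runtime, the JVM raises NoClassDefFoundError
// as soon as the plugin class is linked.
public class JingleClasspathCheck {
    public static void main(String[] args) {
        try {
            // Class.forName throws ClassNotFoundException when the class is absent;
            // the NoClassDefFoundError in the log is the link-time variant seen when
            // a class compiled against it (JinglePlugin) is loaded.
            Class.forName("mil.jfcom.cie.media.session.MediaSessionListener");
            System.out.println("Media session classes are on the classpath.");
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            System.err.println("Media session classes unavailable: " + e);
        }

        // The "Can't find registry file" IOException suggests Spark (or its media
        // library) could not read a file it expected, which is why limited user
        // rights were a suspect. The path below is an assumption for illustration.
        File profileDir = new File(System.getProperty("user.home"), "Spark");
        System.out.println("Spark profile directory writable: " + profileDir.canWrite());
    }
}
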

gairys wrote:

Exception on commit = java.io.IOException: Can't find registry file
Exception in thread "Thread-43" java.lang.NoClassDefFoundError: mil/jfcom/cie/media/session/MediaSessionListener

Can't say much. Are you running Spark with limited user rights? What happens if you try to run it as an admin and log in?

You can try another experimental installer from here: http://www.igniterealtime.org/community/thread/37565 (pick the latest spark-installer.jar). Maybe this will behave differently, as it uses the newest code with some new libraries.

I apologize for not adding that information earlier. The user is a member of the Administrative group on our OpenFire server. I'm not sure if this will help, but the server uses only local accounts with no AD integration, and it uses a MySQL database instead of the embedded database.

Thank you for the link. I will forward this to the user, since he is out of town until Friday. Hopefully I can get you an answer by the end of the day.

gairys wrote:

The user is a member of the Administrative group on our OpenFire server.

I didn't quite understand. I was asking about the user's rights on his computer (Windows, I presume). Does he have administrator rights in Windows? If he doesn't, maybe you have to run Spark as administrator once.

I apologize for the misunderstanding. The computer is on a domain, and the user's account is in the Administrators and Domain Users groups. He is not a member of the domain or forest admin groups, or any of the other admin groups within the domain.

wroot,

I wanted to thank you for your help. It has been hard to get the laptop away from the user long enough to have even a few minutes to find the problem. The problem did end up being admin rights. The user connects to the network wirelessly and was constantly receiving a Userenv 1054 error. When I connected to the network with a wired connection, I had no problems authenticating with the domain controller. I still have to find a workaround for the wireless adapter not making a connection before login, but I was able to add the user to the local Administrators group as a workaround for Spark.

Thank you for your help and I apologize for not getting back to you sooner.

That's OK. I'm posting too many messages here to keep watching them all. Usually I get an email notification and then I have to remember what it was all about.

It's good that you have found a workaround, but it's not a very secure one, so I think you should look for a better solution.