Spark issues: some users can't send/receive messages

I’m having an occasional (but fairly frequent) issue where some users cannot send or receive messages using Spark. The messages don’t display in their client, and any outgoing messages appear lost; however, all the missing messages appear in the archives.

We’re using Openfire 3.6.4 and Spark 2.5.8 in a business environment, pulling users and groups from Active Directory. I have to wonder if it’s an issue with Spark knowing who is and isn’t online, but I’m not sure where to go from here.

Thanks.

I neglected to mention this in my question, but the users in question can usually regain full functionality by exiting the client, waiting a minute or two, and starting it up again.

They probably don’t even have to wait and can just relaunch it. Usually wiping and recreating Spark’s profile fixes this for a while. You can also try installing the Beta or the latest SVN version on some of the clients with these problems and check whether that helps. If you are going to install one of those versions, you should wipe the user’s Spark profile after uninstalling the previous version.

The user’s Spark profile is in *C:\Documents and Settings\username\Spark* for 2.5.8 and 2.6.0 Beta,

and in *C:\Documents and Settings\username\Application Data\Spark* for the latest SVN version.
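If you end up wiping profiles regularly, the step can be scripted. A minimal sketch, written as a portable shell script for illustration only; the `PROFILE_DIR` default is an assumption, and on the Windows XP clients above the equivalent command would be `rmdir /s /q "%USERPROFILE%\Spark"`:

```shell
#!/bin/sh
# Sketch: wipe a user's Spark profile so the client recreates it on next launch.
# Make sure Spark is fully closed first, or it may rewrite the folder on exit.
# PROFILE_DIR is an assumed default; on XP with Spark 2.5.8 / 2.6.0 Beta the
# real path is C:\Documents and Settings\<username>\Spark.
PROFILE_DIR="${PROFILE_DIR:-$HOME/Spark}"
if [ -d "$PROFILE_DIR" ]; then
    rm -rf "$PROFILE_DIR"
    echo "removed $PROFILE_DIR"
else
    echo "no profile found at $PROFILE_DIR"
fi
```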

Thanks. I’ve cleared out the profiles for now. Any idea whether there is a particular file in those folders that can be removed independently instead?

At present I have two affected users. If this is something I need to watch out for, I’d like to know the most streamlined way of fixing it when it occurs. I’m setting up the latest SVN build on my machine as well, so I’ll see how that works out.

Thanks greatly for your help.

One of the things that seems to help with this is installing the version of Spark without the JRE included. This has fixed the issue on every machine I’ve tried so far.

Is this still a suitable option after the JRE-including version has been installed? I’ll give it a try, thanks.

Clearing the profile seemed to help for a very short time (about 4 hours).

I’ve just installed the beta (online version) for the users with this issue. If it works, I’ll roll it out.

Speaking of the JRE, you can always delete the jre folder in Spark’s program folder. Then it should use the system’s JRE.
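For rolling that out across clients, the deletion can be sketched as a script. This is a portable shell illustration, not a tested deployment step; the `SPARK_DIR` default is an assumption, and on Windows the install folder is typically `C:\Program Files\Spark`:

```shell
#!/bin/sh
# Sketch: delete the bundled jre folder inside the Spark install directory so
# Spark falls back to the system-wide JRE. Run with Spark closed.
SPARK_DIR="${SPARK_DIR:-/opt/Spark}"   # assumed install location
if [ -d "$SPARK_DIR/jre" ]; then
    rm -rf "$SPARK_DIR/jre"
    echo "bundled JRE removed; Spark should now use the system JRE"
else
    echo "no bundled jre folder at $SPARK_DIR/jre"
fi
```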

So here’s where we are…

I’ve installed the most recent JRE from Sun and the most recent beta of Spark. Afterwards, I removed the JRE folder for the affected users. They still seem to have this issue, although it’s much less frequent. Some users with the current release of Spark have no such issues.

I’m not sure what to try next here. There are only a few users still having this issue; otherwise we’re quite happy with the solution.

Ideally yes… but as with other builds Jive has put out, there seems to be some other subtle difference between the builds, because once I deployed that version the errors went away. But to deploy it I had to fully remove the old version, including a manual deletion of the directory.

There does not seem to be any consistency in code between the online, exe, and msi versions put out by Jive; they all have different quirky errors.

So, a bit of time has passed and we are still having this issue. It appears to be a little more widespread than I had thought, so if anyone has anything else I can try at this point, I would appreciate it.

The server is a Win2k3 box running little other than Openfire, and the desktops are mostly Windows XP machines, with a handful of Windows 7 machines now. As far as I know, none of the Windows 7 systems have the issue. I’ve tried everything mentioned in this thread (including a manual install of Java), so perhaps there is a specific version of Java I should be targeting. If so, that’s fine. I’m just looking for hints on how to move forward.
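If a specific Java version does turn out to matter, it may help to record which system JRE each affected client actually resolves once the bundled jre folder is gone. A minimal sketch, assuming `java` is on the PATH:

```shell
#!/bin/sh
# Print the system JRE that Spark would pick up once the bundled jre folder is
# removed (run on an affected client and compare across machines).
if command -v java >/dev/null 2>&1; then
    java -version 2>&1 | head -n 1
else
    echo "java not found on PATH"
fi
```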