Spark "Check for Updates" button

Hi All,

New guy here, trialling Openfire/Spark for our Faculty at University, hopefully supporting up to 50 users initially, perhaps a couple hundred with a commercial license if all goes well. Looks pretty super so far.

Just a question about the “Check for Updates” command in Spark, for which I cannot find an FAQ or forum answer. I have Spark 2.5.7 on WinXP SP2, installed with the Spark-bundled JRE. I click on the “Check for Updates” command, it tells me that a newer version is available (2.5.8), asks me if I want to install, I click OK - and nothing. The client remains open, no further disk or network activity…it looks like the command simply does nothing.

Now on Ubuntu 7.10 AMD64, Spark 2.5.7 - I hit the “Check for Updates” command and I consistently get a prompt saying that no updates are available. This installation uses Java 1.5.0 as bundled with Ubuntu.

Maybe I am missing something simple here, but what could be the problem?

I am running Openfire 3.4.1 on Ubuntu x86 32bit, if that is of any relevance.

Perhaps someone could also kindly let me know how to otherwise do an in-place upgrade on both systems such that I don’t have to re-add and authorise my buddies?

Any help given would be much appreciated!

Cheers,

Dave

Oh, also, both clients are sitting behind a corporate firewall, but have direct access to the internet. Cheers.

So. This is hard to test, because you have to have both an older and a newer version available to update, and I only have 2.5.8. Though I never use the Update option in Spark. Just overwriting the older version with the newer one will keep all settings and buddies in place. Btw, maybe you want to use Openfire’s shared groups, so you could have a default roster and won’t need to add/authorise those hundreds of users for every new user.

Hey wr00t, thanks heaps for your reply

That’s really helpful to know about the in-place install - presumably that works the same on Linux too? I’ll give the shared groups feature a look as well.

If anyone else out there uses the Check for Updates feature to update their clients (and has had any luck with it), hopefully you can let me know if it’s worth persisting with.

Cheers,

Dave

davefromnz wrote:

That’s really helpful to know about the install in place - presumably works the same for Linux too?

I think the Linux version saves its settings in the user’s home directory. The Windows version does this in C:\Documents and Settings\%Username%\Spark.

These settings are not deleted by an uninstall or by installing a new version on top of the old one. Since the Linux version doesn’t have an installer, you would have to manually overwrite its directory contents with the new ones, or just remove the old directory and launch Spark from the new one. It should pick up the old settings.
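The reason this works can be sketched with a few shell commands. This is only an illustration using made-up demo paths under /tmp (the directory names and file names here are assumptions, not Spark’s documented layout): the point is that the program directory and the per-user settings directory are separate, so replacing the program directory leaves the settings untouched.

```shell
# Demo only: simulate a Spark program dir and a separate per-user settings dir.
DEMO=/tmp/spark-upgrade-demo
mkdir -p "$DEMO/opt/Spark" "$DEMO/home/.Spark" "$DEMO/Spark-new"

echo "2.5.7" > "$DEMO/opt/Spark/VERSION"            # old install
echo "my-roster" > "$DEMO/home/.Spark/roster.xml"   # per-user settings
echo "2.5.8" > "$DEMO/Spark-new/VERSION"            # freshly unpacked release

# In-place upgrade: replace the program directory with the new one.
rm -rf "$DEMO/opt/Spark"
mv "$DEMO/Spark-new" "$DEMO/opt/Spark"

cat "$DEMO/opt/Spark/VERSION"        # prints 2.5.8
cat "$DEMO/home/.Spark/roster.xml"   # prints my-roster (settings survived)
```

The same separation is why an uninstall/reinstall on Windows keeps your buddy list: the settings live under the user profile, not under Program Files.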

Hi wr00t - thanks again. Did the in-place installs and they worked fine, just as you said.

It would be nice if the Check for Updates feature worked, but seeing as an in-place upgrade is easy enough to do, this will be fine for further evaluation of the product - so my question is answered!

Cheers,

Dave