
Session ID not updated for Spark 2.7.0

I’ve had a number of my users upgrade to the latest (as of this posting) nightly build of Spark 2.7.0.

I’ve noticed that their session ID still reports as Spark 2.6.3, which will complicate tracking the upgrade process throughout our company.

If this is an easy fix, I’d like to request that this be updated in the next release.

Thank you for your time and attention.

probably an oversight on my part. which nightly are you using? build 653?

Yes, build 653 downloaded from here: http://bamboo.igniterealtime.org/browse/SPARK-INSTALL4J-653/artifact/shared/Install4j/

ok, it is working fine. it looks like the Resource name is set in your local user profile (in the %APPDATA%/Spark/spark.properties file). This does not get overridden if you are keeping your old profile while upgrading. To get a new resource, either change that, or kill the %APPDATA%/Spark directory to have it set up a new, fresh one.
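A minimal sketch of the “kill the profile” approach in Python (the %APPDATA%/Spark path is taken from the post above; the function name is my own, and this deletes all local Spark settings, not just the resource):

```python
import os
import shutil

def remove_spark_profile(appdata_dir):
    """Delete the Spark profile directory under the given %APPDATA% path
    so the client rebuilds a fresh one (with a new default resource)
    on its next launch. Returns True if a directory was removed."""
    spark_dir = os.path.join(appdata_dir, "Spark")
    if os.path.isdir(spark_dir):
        shutil.rmtree(spark_dir)
        return True
    return False

# Example (Windows): remove_spark_profile(os.environ["APPDATA"])
```

On a live machine you would call it with `os.environ["APPDATA"]`; passing the directory in as a parameter just makes it easy to test against a scratch folder first.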

The image shows some of my users and their resource names. Everyone is on a custom version and all should read as Spark 2.7.0-20131016-jsc, but as you can see, we’ve got resources all over the place lol (we had a cavalier dev over here for a bit who liked pushing new versions for image changes lol). I have a script someplace that will auto-kill the %APPDATA%/Spark directory… I’ll look for it… may help…

Well, here it is. It’s an AutoIt script… it’s been a long while since I used it… so the normal disclaimer applies… if it nukes your computer, it’s on you.

Its purpose, back when it was (hastily) written, was to uninstall the current Spark, kill the %APPDATA%/Spark directory, then download and install a fresh copy from our local server (using the Client Management plugin to host a local Spark download for our custom version).

You will surely need to customize parts, such as your server address, etc. Or you can just modify it to only kill the %APPDATA%/Spark directory…

Or, of course, write your own script to do this… or just manually kill them.
Fresh_Spark.au3.zip (2088 Bytes)

Well, killing the profile just to get the resource changed is not an option for everyone. One can also create a script which will change the

resource=xxx

line in the spark.properties file to anything desirable.
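Such a script could look something like this in Python (the `resource=` key and spark.properties filename come from the posts above; the function name and the assumption that the file is plain key=value text are mine):

```python
def set_spark_resource(properties_path, new_resource):
    """Rewrite the resource=... line in a spark.properties file,
    preserving every other setting. Appends the line if it is missing."""
    with open(properties_path, "r", encoding="utf-8") as f:
        lines = f.readlines()
    for i, line in enumerate(lines):
        if line.startswith("resource="):
            lines[i] = "resource=" + new_resource + "\n"
            break
    else:
        # No resource line found: add one at the end.
        lines.append("resource=" + new_resource + "\n")
    with open(properties_path, "w", encoding="utf-8") as f:
        f.writelines(lines)

# Example:
# set_spark_resource(os.path.join(os.environ["APPDATA"], "Spark",
#                                 "spark.properties"), "Work-PC")
```

Pushed out via your usual management tooling, this changes only the resource and leaves the rest of the user’s profile intact, which avoids the “settings reset” complaint entirely.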

The thing is that the resource string was never intended to track the client version. I think the original Spark developer decided to put the Spark version in there by default if the user doesn’t specify anything while logging in. It can (and should) be changed by users, so it is not a reliable tool for tracking versions.

very true, and good point

I forget people become attached to their settings lol. I’ve trained my users that they may be reset at any time with no notice.