There have been several posts, and an announcement by wroot, regarding the failure to log in after updating to Spark 2.8.2. My question is: how can I make this change remotely for my 50+ workstations? I would rather not walk around to my local workstations, or have to VNC into my remote locations. However, if there is a setting I can edit (say, in the spark.properties file) to “check” that checkbox, that would be a better alternative. Anyone have any thoughts? I am slowly starting to get a trickle of calls from users who have accidentally updated. Any help would be appreciated, which reminds me: what do I need to change (remotely) to disable any further notices for new versions of Spark?
If you can browse each machine and know the username on each, you can go somewhere like:
Then you can edit the “spark.properties” file and make sure there is an “AcceptAllCertificates=true” setting in it.
Not sure if this will work while Spark is currently running, though, or whether the change will survive quitting and restarting Spark.
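Since spark.properties is a plain key=value file, this edit can be scripted rather than done by hand on each machine. Here is a minimal Python sketch; the file path is a placeholder, and (for the reasons discussed later in this thread) Spark should be closed when it runs:

```python
# Sketch: ensure AcceptAllCertificates=true in a user's spark.properties.
# The path below is a placeholder; on Windows the file normally lives
# under %AppData%\Spark for each user profile.
from pathlib import Path

def ensure_setting(path, key="AcceptAllCertificates", value="true"):
    """Add key=value if missing, or rewrite the line if it has another value."""
    p = Path(path)
    lines = p.read_text().splitlines() if p.exists() else []
    found = False
    for i, line in enumerate(lines):
        if line.split("=", 1)[0].strip() == key:
            lines[i] = f"{key}={value}"
            found = True
    if not found:
        lines.append(f"{key}={value}")
    p.write_text("\n".join(lines) + "\n")

ensure_setting("spark.properties")  # run while Spark is not running
```

The function is idempotent, so it is safe to run from a login script on every logon: an existing correct line is left with one copy, and a missing line is appended.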
If your network runs in a Windows domain environment, you can simply push a custom spark.properties file with the proper settings to each workstation using Group Policy, or have each workstation pull down the customized file during network login using a login script. The spark.properties file is stored in %AppData%\Spark.
Another option is to install a trusted SSL certificate from a vendor whose root CA is bundled with Java. This behavior was changed so that we could make the user aware that they are connecting to a server that uses a self-signed or invalid certificate. We are hoping to add a dialog box at some point, but we are low on devs!
As was already suggested, you can add that line to the user’s spark.properties, but it is better to do this when Spark is not running. It might still work in some cases while Spark is running, but most of the time it won’t stick, as Spark usually saves its settings on exit and will overwrite the change. There is no way to do this centrally from the Admin Console or the like.
To disable updates, you can untick File > Preferences > Notifications > Check for updates. Or you can add or change a line in spark.properties, which has an odd name; I think at some point it was checking for beta versions and the setting name hasn’t been changed since.
Your users will still be able to turn this setting back on, or use the Check for updates item in the Help menu. If you are running Openfire 4.0.0+ and want to prevent that, you can install the Client Control plugin and disable Updates in the Client Management menu.
Thank you very much for this recommendation. This is indeed the fix for configuring each client remotely. I have already tested it on 4 machines and it has worked brilliantly.
Just for everyone’s reference, I have been adding “AcceptAllCertificates=true” below the entry “timeout=10”, as it appears on the workstations I have manually upgraded.
Edit the “spark.properties” file with Notepad and scroll to the bottom of the file.
Look for the entry “timeout=10”.
Below that entry, add the following line: AcceptAllCertificates=true
Save the file, and exit.
Actually, it makes no difference where in the file you put it.
Looking at my own spark.properties file, I notice it has a username and apparently my encrypted password stored in it. Wouldn’t pushing this file out through Group Policy cause a problem? I suppose you could omit the username and password entries from the file being pushed out, though; the users would then have to set up their login.
Yes, it could be problematic pushing it to already existing users, unless you can use some script to just add one line to the existing file. I was using a VBS script to update the XML config file of a client we used before switching to Spark a while ago, but that was well-structured XML, and spark.properties is more of a free-form file.
For new users, though, yeah: I check whether a spark.properties file already exists and, if not, copy a prepared file with settings (without login info) to that user’s AppData when they log in. That usually happens on first login.
That’s correct, and sorry I didn’t mention that part. In my environment, it was more important to make sure that every user had exactly the same standardized Spark settings on their computer, and Group Policy made this easy. Each user then had to sign in with the correct credentials the first time, which wrote those specific settings back to their spark.properties file. A better way in your case might be to use a login script that creates the customized spark.properties file during login and adds the correct username to the file as well. Of course, the user would still need to fill in the password field when first signing into Spark.
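A login script along those lines could be sketched like this in Python. The directory, the property names, and the template contents are assumptions for illustration; they should be checked against a real spark.properties before deploying:

```python
# Sketch of a login-script step: if the user has no spark.properties yet,
# create one from standardized settings and pre-fill the username.
# The user still types the password on first sign-in.
from pathlib import Path

# Property names below are illustrative; verify them against an
# existing spark.properties file from a working installation.
TEMPLATE = """\
AcceptAllCertificates=true
timeout=10
username={username}
"""

def create_if_missing(appdata_spark, username):
    """Write a pre-filled spark.properties only if none exists yet."""
    path = Path(appdata_spark) / "spark.properties"
    if path.exists():  # never clobber an existing user profile
        return False
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(TEMPLATE.format(username=username))
    return True

# In a real login script, pass %AppData%\Spark and the logged-in user's name:
create_if_missing("Spark", "jdoe")
```

The existence check mirrors what was described above: existing users keep their file (including stored credentials), while first-time users get the standardized settings with their username already filled in.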
At first we left the “hostname” field preconfigured in the default spark.properties file when pushing it to new users, so they wouldn’t need to fill in the server part. But for some reason Spark took a long time to come up in that case, so we ended up leaving that field empty. Of course, that was two years ago with a pre-2.7.0 version.
Slightly off topic, but I notice recently it’s been called “Domain”. I believe it was called “Hostname” or “Server” in previous versions? “Server” would have been most accurate, no?
It was “Server”. We decided to change it to “Domain” in 2.8.1 after we got a flood of reports of Spark not being able to connect after the update to 2.8.0. We saw that many users had a misconception (or rather a lack of understanding) of how XMPP works. Similar to email, in XMPP your login takes the form user@domain (that’s called a bare JID; a full JID would be user@domain/resource). Many users had named their servers after the machine they installed them on (that’s what Openfire suggests by default, which adds to the confusion; I have filed a ticket to add a warning), but then they were logging in with some DNS domain name. That created a mismatch between the server’s (domain’s) name, the name they put into the Server field of Spark, and the name the certificates were generated for. That’s why we had to add the “Disable certificate hostname verification” option to work around this problem. Ideally Spark should just have one field, “Username (JID)”, as many XMPP clients do. One can have a machine named “server” running Openfire, but specify im.domain.com as the server’s name during setup and then create DNS entries pointing im.domain.com to the “server” machine. Users should then put im.domain.com as the Domain in Spark. In 2.7.7 it was possible to log in with an IP or hostname and it mostly worked, but some specific XMPP features (adding users, presence, server-to-server and others) rely on domain names, so it may create various problems.
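To make the addressing concrete, here is a tiny sketch that splits a JID into the parts described above (the helper name is mine, not part of Spark or Openfire):

```python
# Bare JID: user@domain; full JID: user@domain/resource.
def split_jid(jid):
    """Split a full or bare JID into (user, domain, resource)."""
    bare, _, resource = jid.partition("/")
    user, _, domain = bare.partition("@")
    return user, domain, resource or None

print(split_jid("alice@im.domain.com/Spark"))
# → ('alice', 'im.domain.com', 'Spark')
print(split_jid("bob@example.org"))
# → ('bob', 'example.org', None)
```

The domain part is what must match the server’s XMPP domain name and its certificate, which is exactly the mismatch described above when users typed a machine hostname or IP instead.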
Thanks for the clarification. It makes sense now, and it reminds me to finally correct the server names in Openfire and the Domain fields in Spark, and to set up DNS correctly so that those checkboxes don’t need to be used as overrides.
Curt, this is exactly what we did when we pushed Spark out to our users (I think it was ver. 2.7.1). I set everything up exactly how I wanted it (skin, configuration, etc.), copied my properties file to a folder, and removed my user/login. Once I dropped the folder into the (new) user’s AppData/Roaming, a fresh install of Spark would open and fail to log in, but it would have all of my other settings. All I had to do was add that user’s login/pass and they were set to go.
I meant to reply to this yesterday, but things got crazy with an emergency. Anywho, for anyone else out there, the Client Control plugin is beyond awesome; it is definitely a plugin to have installed on your server. Thanks for the suggestion, Wroot.
Michael is the one to thank here, as he recently added 23 new settings (it was just 3 before) to that plugin for the 2.1.0 version.
I’m glad to hear you think it’s “beyond awesome”. As a sysadmin, I figured it would be put to good use. Also wanted to give a shoutout to Wroot for all his help and support!
Please note that settings from that plugin are network-wide, so if you only want to control a small group of users, it’s best to modify their default.properties file with the specific setting such as “DISABLE_UPDATES=true”.
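For reference, that entry in a per-user default.properties file would look like the fragment below. Only DISABLE_UPDATES is taken from this thread; any other keys should be checked against an existing file or the plugin’s documentation:

```properties
# default.properties fragment (example): turn off update checks for this user
DISABLE_UPDATES=true
```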
Hey there, is there a parameter to set this in the default.properties file?
I tried AcceptAllCertificates=true as a parameter, but this doesn’t work.
There is no such setting in default.properties file.