Client Management Version Update not working

Hello,

I’ve been trying to use my Openfire server to remotely update everyone’s Spark to the newest version using the Client Control plugin, but it isn’t prompting anyone to update. I’ve made sure the plugin and server are fully up to date and that I’m using the right exe to upgrade to. I go into the Client Management tab, then Spark Version, upload the exe for 2.9.4 (just the plain exe, not the one bundled with the JRE), and then click Update Spark Version, but when I click “Check for Updates” in my own Spark client or any other user’s, it says “There are no updates”. Logging in and out didn’t fix it, nor did rebooting the computer. Any advice on how to fix this would be much appreciated.

This part of Client Control has been broken for a while: https://igniterealtime.atlassian.net/browse/SPARK-1851
Although I’m not sure whether this is Spark’s problem or the plugin’s. Because Spark doesn’t have a dedicated developer, I don’t think it will get fixed soon (if ever). So you may want to look at other options for updating Spark on endpoints (AD/GPO, SCCM, scripts, other tools). By the way, even if this option had worked, your users would need admin rights to install the update, which under normal circumstances users shouldn’t have.

Personally, I think this part, along with the client program control feature, should be removed from the plugin as obsolete and broken.

Our users do have admin rights on their own computers (part of our network setup), but I haven’t familiarized myself with other ways to automate client updates, since I assumed this feature would work without issue. Can you recommend which of the options you listed (e.g. AD, SCCM, scripts) is the most efficient, and possibly provide links to guides I can use to get started on this project?

Thank you for your response.

If you search these forums for “silent install” or “using gpo to install” you can find a few guides on how people do this. In the past I have used GPO with simple cmd/bat scripts. You can find a link to my script in the official documentation (Settings and Client Deployment section). I have seen more sophisticated scripts posted by various users.

But my script will only work on the internal network, when the PC can reach the AD domain before the user logs in, or can still reach it while the user is logging out. It might not work for remote users on VPN and the like.
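Stripped down, the idea is something like this (the share path, exe name, version string and marker file below are just placeholders; the real script linked in the documentation has more checks):

    @echo off
    REM Rough sketch only - share path, exe name, version string and marker file are placeholders.
    REM Runs as a GPO shutdown (computer) script, i.e. in the SYSTEM/machine context.

    set "INSTALLER=\\fileserver\deploy\spark_2_9_4.exe"
    set "MARKER=C:\ProgramData\spark_version.txt"
    set "NEWVER=2.9.4"

    REM If the marker already records the new version, there is nothing to do.
    if exist "%MARKER%" (
        findstr /c:"%NEWVER%" "%MARKER%" >nul
        if not errorlevel 1 exit /b 0
    )

    REM Make sure Spark is not running so the installer can replace its files.
    taskkill /f /im Spark.exe >nul 2>&1

    REM Run the installer unattended (-q is the quiet switch of Spark's install4j-based installer).
    "%INSTALLER%" -q

    REM Record the version so the install does not repeat on every shutdown.
    if not errorlevel 1 (echo %NEWVER%)>"%MARKER%"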

Alright, so I have learned a bit about how to do the GPO, but I’m having some issues getting the script to run. I remoted into my AD server and found the location where the script needs to go (Computer Configuration > Policies > Windows Settings > Scripts (Startup/Shutdown) > Shutdown) and made sure it has the updated directory paths to my shared server. I linked the policy to the ComputerUsers group I have, which should cover all of the computers in AD. I made sure Fast Boot was turned off on my computer while testing so the script would actually get run, but so far nothing. It won’t even copy the txt file I made. Any advice or pointers on where I could go from here?

Troubleshooting GPO is tough, especially when I don’t know much about your network and setup. I suggest running the script locally on the computer to see whether the bat file itself runs OK and does what it is supposed to do (copies the file, kills Spark, etc.).
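If nothing happens at all (not even the txt copy), a bare-bones test script like this (log path is just an example) will at least show whether the GPO runs the script in the first place:

    @echo off
    REM Bare-bones test - the only job of this script is to prove it was executed at all.
    REM Log path is just an example; anything writable by SYSTEM will do.
    echo %date% %time% shutdown script ran >> "C:\Windows\Temp\gpo-test.log"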

Also, a GPO is linked to OUs (folders). A group can be used to filter which members of the OU get the GPO applied, but the GPO is linked to an OU in the first place, and in this case it must be an OU containing the computers you want to push Spark to (I suggest first creating a separate test OU, putting one computer in it, and linking the new GPO only to that OU). The first script in my example, which updates Spark, runs in the machine/system context, so it has to be assigned to computers.

The second one, which copies the new Spark settings file, runs per user, so it is assigned to OUs holding users.
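That second one is basically just a copy, something along these lines (the share path is a placeholder, and %APPDATA%\Spark is where Spark keeps spark.properties on Windows, as far as I remember):

    @echo off
    REM Per-user logon script sketch - copies a prepared settings file into the user's profile.
    REM Share path is a placeholder; %APPDATA%\Spark is Spark's settings folder on Windows.

    set "SHARE=\\fileserver\deploy"

    if not exist "%APPDATA%\Spark" mkdir "%APPDATA%\Spark"
    copy /y "%SHARE%\spark.properties" "%APPDATA%\Spark\spark.properties"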

So I made a test OU and verified it works by testing a drive-mapping batch file, which runs fine, yet the batch file I made to check for and update Spark is mysteriously not working. I have run that batch file out of my Documents folder and it works exactly as intended, so I know the problem isn’t in my code. I tried running it both as a shutdown and as a startup script, but no luck. I thought about trying it at logon and logoff, but I wanted to ask whether there is a step I’m missing along the way. Also, does it need to run at shutdown, or can it run at the logon/logoff stage?

Well, the last time I used it was about 4 years ago, so maybe something has changed in Windows 10 again that prevents this from working. Last time it was Fast Boot messing with shutdown scripts. As your users have admin rights, you can try a logon or logoff script; maybe that will work. I was using a shutdown script because my users didn’t have admin rights to install software, so I couldn’t use logon or logoff, and I also used shutdown to be sure Spark was no longer needed and its process could be killed without interfering with the update. You can also try a simpler script that only runs the Spark installer, without even the silent switch or any version check. Install an older version first, then try such a GPO to see if Spark gets updated. Other than that I don’t have much to suggest. I have not used Spark in production for years.
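Such a bare test script would literally be just this (the path is a placeholder):

    @echo off
    REM Simplest possible test - no version check, no silent switch, just start the installer.
    REM If the installer window never shows up, the GPO/script is not being run at all.
    "\\fileserver\deploy\spark_2_9_4.exe"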

I do apologize for necro’ing an old post, but I’ve made some progress and hit some setbacks. I did manage to get some of my computers on the AD to install at login and some at shutdown. What confuses me is that many other computers aren’t getting it to work. I made sure it wasn’t an administration error and that all users had admin privileges set up, and I also checked whether they would need the 2.9.4 build with the JRE in case the JRE somehow wasn’t installed, but that didn’t solve the issue either. I’m not quite sure where to go from here, as I’ve followed your scripts as thoroughly as I could. I even tried putting a Runas parameter in when setting up the batch file to make sure it runs as admin.

As I’ve said, I don’t have much to add here. This would need to be troubleshot directly in your environment. You can check whether Spark installs manually on such machines, or run the installer directly on such a machine from cmd with just the silent switch, either locally or pointed at a share. There can be many possible issues. It could be the network; maybe these machines for some reason can’t access that share, or can’t access it only during login or shutdown. You can try another script that just copies the installer from the share to, say, C:\Temp at startup, login, or whatever, and then point the installation script at C:\Temp instead of the share. Maybe that will work more reliably, assuming they can copy the installer correctly. My script is not a 100% guarantee; I also had some failures. Out of 200 clients I usually had maybe 5 failures where it would not install correctly and I would have to go and reinstall Spark manually.
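Sketched out, that staged approach could look something like this (paths and exe name are placeholders again; -q is the unattended switch of Spark’s install4j installer, if I remember correctly):

    @echo off
    REM Staged variant - copy the installer locally first so the install does not depend on the share.
    REM Paths are placeholders.

    if not exist "C:\Temp" mkdir "C:\Temp"
    copy /y "\\fileserver\deploy\spark_2_9_4.exe" "C:\Temp\spark_2_9_4.exe"

    REM Then install from the local copy (kill Spark first, -q for an unattended install).
    if exist "C:\Temp\spark_2_9_4.exe" (
        taskkill /f /im Spark.exe >nul 2>&1
        "C:\Temp\spark_2_9_4.exe" -q
    )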