Spark Manager Issue

I am finding that my updates are not working because spark.client.downloadURL appears to be limited to 50 characters. Is this correct, and how do I fix it? I'm using the embedded-db, but plan on moving to MySQL after initial testing is complete. I am also using the latest version of Spark Manager, just downloaded today. The manual update from the client works OK. If it's just a timing issue, I would like to request that clients check in at each logon.

Ryan


Hi Ryan,

There should not be a character limit on spark.client.downloadURL; it can be as long as you wish. As for the Spark client, it checks for updates once a day, not at every startup. However, this behavior can be voted on as a feature request.

Cheers,

Derek

I would be inclined to vote for both functionalities: check for updates on program start/login, then a periodic check. That way, someone who leaves themselves logged in for a week would still get an update notice (the current functionality), but someone who happens to log in right after an update becomes available, before their daily check is due, would also receive the notice.
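For what it's worth, here is a rough sketch of what that combined behavior might look like on the client side. This is not Spark's actual internals: checkForUpdates() is just a stand-in for whatever call the client makes to Spark Manager, and the 24-hour interval mirrors the current once-a-day behavior.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class UpdatePoller {

    // Placeholder for whatever the client actually does to query Spark Manager.
    private static void checkForUpdates() {
        System.out.println("Checking Spark Manager for a newer client...");
    }

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // initialDelay = 0  -> check immediately at login/startup (the requested behavior)
        // period = 24 hours -> keep the existing once-a-day check for long-running sessions
        scheduler.scheduleAtFixedRate(UpdatePoller::checkForUpdates, 0, 24, TimeUnit.HOURS);
    }
}
```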

Cheers,

RioGD

I agree that checking for updates on login is the first priority.

Less important is an occasional poll by the client or a force push from the manager.

I agree with the login check. I would like everyone to get the update ASAP after I put it on Spark Manager.

I like the force-push idea, and I would also love a configuration or answer file for the install (update) to eliminate as much user interaction during the install as possible.
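In case it helps anyone scripting deployments in the meantime, here is a rough sketch of launching the installer unattended from a deployment tool. It assumes the Windows installer is install4j-based and accepts the standard install4j -q (quiet) and -varfile (response/answer file) switches; the filenames are placeholders, so check this against your installer version first.

```java
import java.io.IOException;

public class SilentSparkInstall {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Filenames below are placeholders; -q and -varfile are the standard
        // install4j switches for quiet mode and a response ("answer") file.
        ProcessBuilder pb = new ProcessBuilder(
                "spark_installer.exe",
                "-q",
                "-varfile", "spark-response.varfile");
        pb.inheritIO();                       // show installer output in our console
        int exitCode = pb.start().waitFor();  // block until the install finishes
        System.out.println("Installer exited with code " + exitCode);
    }
}
```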

It would also be great to be able to control some of the user settings (see the sketch after this list), like:

run at startup

run in tray

auto login
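Something along these lines could be pushed out by a login script: the client stores per-user preferences in a spark.properties file under the user's Spark profile directory (e.g. %APPDATA%\Spark on Windows). The key names below are guesses for illustration only and should be checked against a real spark.properties before deploying anything.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Properties;

public class SeedSparkDefaults {
    public static void main(String[] args) throws IOException {
        // Spark keeps per-user preferences in spark.properties inside the user's
        // Spark profile directory (e.g. %APPDATA%\Spark on Windows).
        File prefs = new File(System.getenv("APPDATA"), "Spark/spark.properties");

        Properties props = new Properties();
        // Key names are assumptions for illustration -- confirm them against an
        // existing spark.properties before rolling this out.
        props.setProperty("launchOnStartup", "true");   // run at startup
        props.setProperty("startedHidden", "true");     // run in tray
        props.setProperty("autoLoginEnabled", "true");  // auto login

        try (FileOutputStream out = new FileOutputStream(prefs)) {
            // Note: this overwrites any existing preferences in the file.
            props.store(out, "Corporate Spark defaults");
        }
    }
}
```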

Those of us using this in a corporate environment like as much administrative control as possible. We're sticklers that way… lol.

Add your suggestions and Vote Here: SPARK-252

UPDATE: The update worked (of course), so my issue (bug?) has changed to this: the Property Value field on the System Properties page only displays a certain number of characters, which makes it hard to read long values such as spark.client.downloadURL. Does that make more sense?
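Until the console display is improved, one workaround is to read the full value straight out of the database. Here is a rough JDBC sketch, assuming the standard Openfire schema where system properties live in a table called ofProperty (jiveProperty on older releases) with name and propValue columns; the MySQL URL and credentials are placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ShowDownloadUrl {
    public static void main(String[] args) throws SQLException {
        // Connection details are placeholders; the MySQL JDBC driver must be on the
        // classpath. With the embedded HSQLDB, stop the server first because it
        // holds an exclusive lock on the database files.
        String url = "jdbc:mysql://localhost:3306/openfire";
        try (Connection conn = DriverManager.getConnection(url, "openfire", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT propValue FROM ofProperty WHERE name = ?")) {
            ps.setString(1, "spark.client.downloadURL");
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    System.out.println("Full value: " + rs.getString(1));
                }
            }
        }
    }
}
```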

Also, I think it would be helpful to have a “deploy to group” function, where you could limit your new client deployment to a group. I could see this being beneficial for testing new client versions before releasing them to the masses.

Ryan