
Default.properties <-> spark.properties & host_name/server?

Hi all,

I have created an MSI package and changed some settings in the default.properties file.

Because we have 2 offices (and Openfire servers), I would like to create 2 packages with the server address preconfigured.

Installing it works perfectly, except for the server setting.

In my test, the server address was entered correctly and could not be changed.

I wanted to have some more settings preconfigured though, so I copied a spark.properties file to the user's profile, but then the server field became editable again.

How can I get as much as possible preconfigured and locked down, so the client becomes minimalistic and foolproof?

I’m using the latest stable versions of Openfire & Spark.

Spark is installed on Win7 64-bit.


If you had a previous installation of Spark, upgrading will preserve those files among other things, so as not to break your existing settings. If that is not desirable, you should remove those files and let fresh ones be created by a fresh Spark installation.

Hi Jason, thank you for stepping in!

No, there is no previous installation.

Since we just upgraded from XP to Win7, I went for a clean install and also started from scratch with the server.

Do you know how these 2 .properties files interact with each other, and how I can get a client with as much as possible preconfigured and locked down without recompiling?


EDIT: the most important reason for copying the spark.properties file after the installation is that I would like to have another language, but I don’t know if this is possible in another way.

You are probably mostly interested in default.properties, which will allow you to change how Spark behaves… spark.properties handles other things such as the image names, etc.

Note that there is another “spark.properties” file that gets created in your %APPDATA%/Spark directory when you first run Spark after it’s installed… that one holds various settings such as the server name, etc., and gets overwritten during Spark’s runtime (so your changes will get replaced!). That file is not the same one that’s in the Spark source code.
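For illustration, that runtime file ends up containing entries along these lines (the key names are what I’ve seen on a typical install; the values here are made up):

```properties
# %APPDATA%/Spark/spark.properties -- written by Spark at runtime (illustrative)
username=alice
server=chat.example.com
language=en
```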

The default.properties file gets built into the spark.jar file when it’s compiled. I do not know of a way to modify that file without re-compiling Spark from source. You are basically making your own custom Spark at this point.

It’s not hard; there are many guides on the forums on how to compile Spark, and even guides on how to create an exe or MSI installer file.

Start by grabbing Spark’s source code from GitHub: https://github.com/igniterealtime/Spark

There’s a “Download Zip” button in the bottom right corner if you don’t know how to use git to clone the repo.

From here, you will need to make your changes to the default.properties file to suit your needs. It’s located at: src/java/org/jivesoftware/resource/default.properties
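For example, the server-locking entries in that file look roughly like this. The key names below (HOST_NAME, HOST_NAME_CHANGE_DISABLED, ADVANCED_CONFIG_DISABLED) are taken from the Spark source tree; double-check them against the version you download, and the host value is of course made up:

```properties
# src/java/org/jivesoftware/resource/default.properties (excerpt, assumed keys)
HOST_NAME = chat.example.com
HOST_NAME_CHANGE_DISABLED = true
ADVANCED_CONFIG_DISABLED = true
```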

Then, follow one of the guides to compile Spark. There’s the one you posted earlier:

http://community.igniterealtime.org/docs/DOC-1521 (it’s a little old now, but should give you the general idea)

This guide is more about creating a custom Spark: http://community.igniterealtime.org/message/225739#225739

It’s also a little old now, but the largest difference is that Spark is now hosted in Git on GitHub, not in SVN anymore. If you downloaded the zip of the source above, just skip the downloading steps. Also skip the Openfire steps, since you are only after customizing Spark.

This opens the door to customize a bit more. You can add your own images to brand it for your company, etc. It may take a little trial-n-error, but the end result is definitely worth it.

Thank you for that extra info, I will look into it afterwards.

I did create a modified MSI file and made changes to the default.properties file, part of which are applied, so up to there everything is fine.

After the installation, I wanted to change the language and some other settings on the client, so I thought the easiest way was to copy a ‘neutral’ spark.properties file to the %APPDATA%/Spark directory.

We have no AD/GPO, so I created a script to install the modified MSI and copy that spark.properties file.
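Such an install-then-copy script could be sketched like this (a minimal Python sketch; the MSI filename, the silent-install flags, and the "neutral" properties path are illustrative assumptions, not something from this thread):

```python
import os
import shutil
import subprocess


def msiexec_command(msi_path):
    """Build a silent msiexec install command for the customized package."""
    return ["msiexec", "/i", msi_path, "/qn", "/norestart"]


def deploy(msi_path, neutral_properties, appdata=None):
    """Install the MSI, then drop the preconfigured spark.properties into
    %APPDATA%/Spark so Spark finds it on first launch."""
    subprocess.run(msiexec_command(msi_path), check=True)
    spark_dir = os.path.join(appdata or os.environ["APPDATA"], "Spark")
    os.makedirs(spark_dir, exist_ok=True)
    dest = os.path.join(spark_dir, "spark.properties")
    shutil.copy(neutral_properties, dest)
    return dest
```

Run it once per machine before the user first launches Spark, so the runtime file already carries your preconfigured settings.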

The problem is that the order of the entries in these files seems to be random, so I don’t know of a way to modify them with a script to achieve my goal…

Right, the contents of the %APPDATA%/Spark/spark.properties file are basically Spark saving its settings from memory… there is no guarantee of order in that file (due to how the Java Properties API works). That file may be overwritten at any time during Spark’s runtime (every time a setting changes, or Spark logs out, shuts down, etc.).

The language attribute does not appear to be set by default… which implies English for Spark. If you set another language, the language= property appears in the %APPDATA%/Spark/spark.properties file.

So, you could take your “neutral” spark.properties file, and add a line such as:

language=en

anywhere in that file; even at the bottom is OK. Then, the next time Spark launches, it will read from that file and set the language to the one specified. That is… until a user changes it, of course.

To figure out what the correct language property value is, it may take some trial and error. Just launch Spark, set your language, then log out and shut down Spark. Check the spark.properties file and see what is set.
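Since the key order is arbitrary, a script that edits spark.properties should treat the file as a set of key=value pairs rather than patching lines by position. A minimal Python sketch (it ignores the escaping rules of the full Java properties format, which is enough for Spark's simple keys; the neutralize helper is my own illustration):

```python
def load_props(path):
    """Read a simple key=value properties file into a dict (order-agnostic)."""
    props = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return props


def save_props(path, props):
    """Write the dict back; Spark does not care about the ordering either."""
    with open(path, "w", encoding="utf-8") as f:
        for key in sorted(props):  # sorted only to make diffs stable
            f.write(f"{key}={props[key]}\n")


def neutralize(path):
    """Force the language and strip any saved password."""
    props = load_props(path)
    props["language"] = "en"      # example value; see trial-and-error note above
    props.pop("password", None)   # same effect as deleting the line by hand
    save_props(path, props)
```

This way the script works no matter what order Spark happened to write the entries in.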

Thank you for that info, I tried it and it works - I have set it to: language=en

Now I’m just trying to figure out if/how I can use that neutral file and I’m testing now what I have to set and what I can leave out.

That way I hope to have a client in which only the username & password have to be entered, and the server address remains greyed out, with the right info of course.

EDIT (added info): after installing and opening Spark without logging in, the spark.properties file is created.

I closed Spark and copied the file from a working system with the correct settings, but changed the username and removed the ‘password’ line completely.

When I start spark again I get 2 problems:

  1. there is an error stating the username or password is invalid

  2. the server address becomes editable but the address is correct

Any idea if it’s possible to avoid that username/password issue?

Why does the server address become editable, and can I change it back to greyed out afterwards?

EDIT 2: by deleting the %appdata%\spark\spark.properties file, I’m back to the original situation.

By leaving ‘password=’ empty, the result is the same as deleting the line completely, as stated in my first edit.

I believe the server/host box being greyed out is set in the default.properties file since that’s usually an “enterprise” thing where sys admins want to hardcode the server field for users. I’m not sure if you can set that anywhere else, which means you will have to compile it that way.

There are some situations where that box will become “un-greyed”; I believe this usually happens if the server is not reachable (to prevent a broken login loop). There may be other ways it comes un-greyed.

Regarding the password field, Spark does some “magic” with the saved passwords to make them hard to figure out if you don’t know them already.

You’ll notice how it’s set to password=

Just delete the entire line. The next time you start Spark, the password field will be empty and it won’t let you click the Login button until the password field is populated.

Also, make sure you do not accidentally have “Save password” and/or “Auto login” selected when you make your “neutral” spark.properties file. Both of those settings could give you some trouble if set.

In the spark.properties file, these properties are (key names as Spark saves them; verify on your install):

passwordSaved=false
autoLoginEnabled=false

Both should be set to false.

You are right about the greyed out server/host field: it’s because the user/password was not correct and it makes sense now that I read your explanation.

I assumed that the settings were not applied and did not look further.

It worked well to create that general default.properties file, but I would like to go a little further; therefore I created a new thread so others can step in and/or learn from it.

I would like to summarize the things I have, and it seems I can also create a document.

Will be something for when all is running.

Thanks again for your input!