Spark having trouble with new SSL cert

We just switched from a self-signed SSL cert to a publicly signed one on our Openfire server. Every Spark client I tested now throws an error when I try to connect, while I was able to get around the problem in Pidgin by deleting the relevant account in the client and then re-creating it. My thought is that Spark (and Pidgin) is looking for the self-signed cert that was previously accepted. Although I can delete the account info for Pidgin easily, this is not the case for Spark. I did delete the account folder I found under C:\Users\[username]\AppData\Roaming\Spark\user\[user_jid], but this didn’t seem to make a difference. Question: where does Spark store the record that a particular self-signed cert was accepted, and how do I flush that record?
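Spark is a Java client, so the record of an accepted certificate is most likely a Java keystore (JKS) file somewhere on disk. JKS files begin with the magic bytes FE ED FE ED, so one way to hunt for a hidden truststore is to scan the profile and install folders for files with that signature. This is only a sketch; the search roots below are guesses, not documented Spark paths:

```python
import os

JKS_MAGIC = b"\xfe\xed\xfe\xed"  # magic number at the start of every JKS keystore


def find_jks_files(root):
    """Walk `root` and yield paths whose first four bytes match the JKS magic."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    if f.read(4) == JKS_MAGIC:
                        yield path
            except OSError:
                continue  # unreadable file; skip it


if __name__ == "__main__":
    # Candidate locations -- assumptions, not documented Spark paths.
    roots = [
        os.path.expandvars(r"%APPDATA%\Spark"),
        r"C:\Program Files (x86)\Spark",
    ]
    for root in roots:
        for hit in find_jks_files(root):
            print("possible truststore:", hit)
```

Any hit can then be inspected with the JDK’s keytool (`keytool -list -keystore <file>`) to see which certs it holds, and a stale entry flushed with `keytool -delete -alias <alias> -keystore <file>`.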

I should note that our change took place in development. We’re still running self-signed in prod, because it’s always best to try these kinds of things out first where they won’t impact anyone important (like my boss).

What happens if you delete the whole C:\Users\[username]\AppData\Roaming\Spark\ folder?

Sorry for the delay in responding.

Same result: “Invalid username or password”. The new cert is OK, at least as far as Pidgin and Trillian are concerned. I’m thinking that Spark is using a Java truststore/keystore located somewhere other than its program directory or the user profile directory, and that it won’t accept the “real” cert because it already has one stored for that host. But I’ve been unable to find it, even after running through the whole process with diskmon (Sysinternals’ Disk Monitor, which tracks disk access by process in real time).
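One quick way to confirm which cert a client is actually seeing is to compare SHA-256 fingerprints of the cert the server presents against whatever a suspected truststore holds. Here is a minimal sketch using only the Python standard library; the PEM in the demo is a stand-in, not a real certificate:

```python
import base64
import hashlib


def pem_fingerprint(pem: str) -> str:
    """Return the colon-separated SHA-256 fingerprint of a PEM-encoded cert."""
    # Strip the BEGIN/END armor lines and decode the base64 body to DER bytes.
    body = "".join(
        line for line in pem.splitlines()
        if line and not line.startswith("-----")
    )
    der = base64.b64decode(body)
    digest = hashlib.sha256(der).hexdigest().upper()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))


if __name__ == "__main__":
    # Stand-in PEM body (arbitrary bytes), purely for illustration.
    fake_pem = (
        "-----BEGIN CERTIFICATE-----\n"
        + base64.b64encode(b"not a real certificate").decode()
        + "\n-----END CERTIFICATE-----\n"
    )
    print(pem_fingerprint(fake_pem))
```

The live cert can be fetched with `ssl.get_server_certificate(("xmpp.example.com", 5223))` (hypothetical hostname; 5223 is the legacy SSL XMPP port) and its fingerprint compared against what `keytool -list` reports for any truststore you find.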

Sorry, the tool I used to log spark.exe’s disk accesses was procmon, not diskmon.

We are using self-signed certs. I recently deleted the old ones and generated new ones, and nothing had to be done on Spark’s end. It works fine. Not sure where the keystore is.

The official instructions on using publicly signed certs are dated and incomplete. For example, no one ever adequately explains why both RSA and DSA certs continue to be generated on a default install. There’s no answer to the basic question: do you need both for things to work? I thought I had the public cert installation process right, and then discovered an article that gave a completely different take on things (adding Alias entries for …).

As a sanity check I restored the old self-signed certs and everything worked fine. So did Spark on a virgin machine (one that had never had Spark installed before). This is clearly one of those Java desktop client + Windows = chaos things. I’d try a wildcard cert next, but I’m hesitant because of the practicalities involved (justifying the additional expense to management). Can someone please help me get out of this minefield?
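On the RSA-vs-DSA question: the duplication traces back to TLS cipher suites. The DHE_DSS suites require a DSA certificate, while the RSA and DHE_RSA suites require an RSA one, so a server that wants to negotiate with either family of clients keeps both certs. You can see which families your own TLS stack still offers with Python’s ssl module; this is a local illustration, not Spark’s actual negotiation logic, and the name-based filter is a rough heuristic:

```python
import ssl

# Build a default client context and list the cipher suites it offers.
ctx = ssl.create_default_context()
ciphers = ctx.get_ciphers()

# Rough name-based grouping: suites authenticated with an RSA cert vs. a DSA cert.
rsa_auth = [c["name"] for c in ciphers if "RSA" in c["name"]]
dss_auth = [c["name"] for c in ciphers if "DSS" in c["name"]]

print(f"{len(rsa_auth)} suites usable with an RSA cert, e.g. {rsa_auth[:3]}")
print(f"{len(dss_auth)} suites usable with a DSA cert: {dss_auth}")
# On modern OpenSSL builds the DSS list is typically empty, which is why
# a DSA cert buys you little or nothing with current clients.
```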