I am trying to get a certificate from our corporate CA to work for SSL connections with Spark. The certificate is installed on the Openfire server (version 4.1.6) with no problem, and it is working for the connection to the HTTPS admin site. I have read that a DSA certificate is not required.
The Spark clients are on 2.8.3 and they keep getting the "unable to verify certificate" login error. I understand that you can accept all certificates in the Advanced menu of Spark, but I would prefer to have Spark work without this change, since we are going to push this company-wide. Where is the break happening that prevents the Spark client from accepting the certificate?
Spark is a Java application and it uses Java’s truststore for certificates. Your local certificate and CA are not known to Java, so they are not known to Spark. You will have to import this certificate into every Spark client’s Java truststore, or import it into one and then distribute it to the clients somehow.
I can’t give detailed instructions, as I have never done this with Spark and only think it is possible (in theory). I have done it for another Java-based app, though; it involved using keytool to import my CA-generated certificate into the Java keystore. You can find guides on the Internet.
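For what it’s worth, the import for that other app looked roughly like this (the alias, certificate file name, and keystore path are placeholders here; changeit is the default password for a JRE’s cacerts file):

```
keytool -importcert -trustcacerts -alias corporate-root-ca ^
        -file corporate-root-ca.crt ^
        -keystore "path\to\jre\lib\security\cacerts" ^
        -storepass changeit
```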
And you will also have to use the not-yet-released 2.9.0 version, as prior to that Spark didn’t have any certificate management code; that’s why the "accept all certificates" option had to be added, so it would be possible to log in at all. Since 2.9.0, the truststore is created in C:\Users\user\AppData\Roaming\Spark\security
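If you want to verify what actually ended up in that truststore, keytool can list it. Note that the file name inside that folder and the password below are my assumptions, so adjust them to what you actually find there:

```
keytool -list -keystore "%APPDATA%\Spark\security\truststore" -storepass changeit
```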
Also, the current 2.9.0 build has an option to accept a single certificate on first login, so maybe you won’t have to import anything. But I can’t say when 2.9.0 might be released; it still has many rough edges.
The server certificate chain was shown and I accepted the root certificate.
I tracked down which file Spark uses to store the certificates:
C:\Program Files (x86)\Spark\jre\lib\security\cacerts
When deploying Spark 2.9.0 to other computers, I overwrote that cacerts file with the one created by the install above.
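A minimal sketch of that deployment step, assuming a made-up file share and an elevated prompt (it writes under Program Files):

```
copy /Y "\\fileserver\deploy\cacerts" "C:\Program Files (x86)\Spark\jre\lib\security\cacerts"
```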
Spark no longer gives the certificate error, but since 2.9.0 is still in development, the process may change going forward. Thanks for the help in figuring this out.
Did you run it as an admin? I thought it should save into the cacerts file in the user’s profile (AppData), not in Program Files. But of course either way works.
I don’t think this process will change before the 2.9.0 release, so this solution should work. Here’s the list of issues blocking 2.9.0 for now: Issues blocking 2.9.0 release
Although a Smack developer is now working on updating Spark to the latest Smack library version, and that might change something about the certificates, which could require changes to Spark. A minor possibility.
I understand that this is an old topic, but there is a current question: some root certificates are bundled with the Spark client. Nowadays some of them are no longer current, and some current ones are missing (for example GlobalSign Root CA - R6 and its intermediate GlobalSign GCC R6 AlphaSSL CA 2023). I think it’s very important to add new CA certificates to the installer, or there should be a feature to add them manually…
Hi Aleksej, You’re absolutely right. With new releases of Spark, we occasionally refresh the standard set of root certificates that are part of the distribution. We will need to do that again soon. In the meantime, it should be possible for you to manually modify the certificates.
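Until the bundled set is refreshed, a manual import along these lines should work. The alias, certificate file name, truststore file name, and password here are all assumptions; point -keystore at whichever truststore your Spark installation actually uses (the per-user one under AppData\Roaming\Spark\security, or the bundled jre\lib\security\cacerts):

```
keytool -importcert -trustcacerts -alias globalsign-root-r6 ^
        -file GlobalSign_Root_R6.crt ^
        -keystore "%APPDATA%\Spark\security\truststore" ^
        -storepass changeit
```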