
Cannot load EC or DSA based certificates in Spark


I have been using certificates generated with an RSA private key in Spark's identity store without issue, but when I try to load similar DSA or EC based certs, the upload in the "Mutual auth" tab fails with the error: "Cannot upload certificate file".

In the logs I have some more info: org.bouncycastle.openssl.PEMException: Pem file doesn't include: KEY_BEGIN kind of delimiter.

Some more context: my certs are exported with the private key embedded, for instance (DSA):


Are these types of private keys supported in Spark?

My suspicion is that this is a naive implementation in Spark that checks for a specific header. If so, that should be fixed. Can you try to import the certificate into the keystore directly (using Java's keytool command)?
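That suspicion is easy to check from the command line: the PEM delimiter line differs per key type (and per OpenSSL version), so a parser looking for one fixed "BEGIN ... KEY" header would reject the others. A quick sketch with openssl, just for illustration:

```shell
# Print only the first line (the PEM delimiter) of a freshly generated
# key of each type. Exact headers vary by OpenSSL version: older
# releases emit type-specific "BEGIN RSA/DSA/EC PRIVATE KEY" headers,
# while newer ones may emit the generic PKCS#8 "BEGIN PRIVATE KEY".
openssl genrsa 2048 2>/dev/null | head -n 1
openssl ecparam -genkey -name prime256v1 -noout | head -n 1
openssl dsaparam -genkey -noout 1024 2>/dev/null | head -n 1
```

If Spark's parser only accepts one of these delimiters, that would explain why RSA files load while DSA/EC files are rejected.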

Unfortunately it seems that Java's keytool does not support importing private keys, so I can't test that, since my file contains both the cert and the private key (which seems to be the only format that works from Spark's UI).

keytool is a bit clunky. There are other tools that can operate on a Java keystore that do work. Alternatively, you can add your private key / cert to a PKCS12 file, and then convert that into a java keystore.
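The PKCS12 route can be sketched like this (file names, alias, and the password are placeholders for this example, not anything Spark-specific; a throwaway self-signed EC cert stands in for the real one):

```shell
# For illustration only: create a throwaway self-signed EC cert + key.
# In practice you would use your existing DSA/EC cert and private key.
openssl req -x509 -newkey ec -pkeyopt ec_paramgen_curve:prime256v1 \
    -nodes -keyout mykey.pem -out mycert.pem \
    -subj "/CN=spark-test" -days 1 2>/dev/null

# Bundle the cert and private key into a single PKCS12 file.
openssl pkcs12 -export -in mycert.pem -inkey mykey.pem \
    -name spark-client -out bundle.p12 -passout pass:changeit

# Convert the PKCS12 into a Java keystore. keytool ships with the JDK;
# the step is guarded so the sketch degrades gracefully without one.
if command -v keytool >/dev/null 2>&1; then
    keytool -importkeystore \
        -srckeystore bundle.p12 -srcstoretype PKCS12 -srcstorepass changeit \
        -destkeystore keystore.jks -deststorepass changeit
fi
```

The resulting keystore.jks (or the PKCS12 file itself, which recent JDKs also accept as a keystore format) could then replace or be merged into Spark's identity store.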

Here’s a bit of a follow-up: I managed to open Spark’s identity keystore with some software (https://keystore-explorer.org/). From there I could import PKCS12 certs into this keystore.

Then when I check in Spark’s GUI, it does show the certs with all the information (DSA or EC public key etc.). I can also use the cert in my Spark plugin to do message signing.

I think it would be worth opening an issue on Spark’s Jira for this (now that I mention it, Spark’s UI when importing certs only allows choosing .pem files, even though it says ".cer, *.crt and *.der" extensions in the explorer window).