Upgraded to Spark 2.7.7 - does not work

I was using Spark 2.7.5 without any issues. Uninstalled 2.7.5 and then installed 2.7.7.

Install seems to work without any issues, but when I try to open Spark it just sits there. Looking in the error.log I see this:

javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: Certificates does not conform to algorithm constraints
    at sun.security.ssl.Alerts.getSSLException(Unknown Source)
    at sun.security.ssl.SSLSocketImpl.fatal(Unknown Source)
    at sun.security.ssl.Handshaker.fatalSE(Unknown Source)
    at sun.security.ssl.Handshaker.fatalSE(Unknown Source)
    at sun.security.ssl.ClientHandshaker.serverCertificate(Unknown Source)
    at sun.security.ssl.ClientHandshaker.processMessage(Unknown Source)
    at sun.security.ssl.Handshaker.processLoop(Unknown Source)
    at sun.security.ssl.Handshaker.process_record(Unknown Source)
    at sun.security.ssl.SSLSocketImpl.readRecord(Unknown Source)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(Unknown Source)
    at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
    at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
    at org.jivesoftware.smack.XMPPConnection.proceedTLSReceived(XMPPConnection.java:871)
    at org.jivesoftware.smack.PacketReader.parsePackets(PacketReader.java:258)
    at org.jivesoftware.smack.PacketReader.access$000(PacketReader.java:46)
    at org.jivesoftware.smack.PacketReader$1.run(PacketReader.java:72)
Caused by: java.security.cert.CertificateException: Certificates does not conform to algorithm constraints
    at sun.security.ssl.AbstractTrustManagerWrapper.checkAlgorithmConstraints(Unknown Source)
    at sun.security.ssl.AbstractTrustManagerWrapper.checkAdditionalTrust(Unknown Source)
    at sun.security.ssl.AbstractTrustManagerWrapper.checkServerTrusted(Unknown Source)
    ... 12 more

Can anyone help?

What certificate are you using in Openfire?

Care to try the latest build (not an official release)? It has some bugs, but I want to check whether it works better in this situation. It is at build 844 at the moment: Ignite Realtime: Spark Nightly Builds

I downloaded and installed the latest build. When I try to start it, I just get an error message indicating an invalid username/password.

How do I determine what certificate I am using?


I had the same issue, but in my case I had to change the server I was logging into, because it was incorrect. I’m not sure what it connected to prior to updating, but changing to the correct server worked for me.

Go to https://replace_with_your_servername:9091/security-keystore.jsp?connectionType=SOCKET_S2S

Are they self-signed or some other certs?
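If you can’t reach the Admin Console, one way to see what certificate the server presents is to read it with openssl. This is only a sketch: the live-server command is commented out because the host name and port are placeholders, and the executable part just creates a throwaway self-signed certificate to show where the relevant field appears.

```shell
# For a live Openfire server (host name is a placeholder; 5222 is the
# usual XMPP client port with STARTTLS):
#   openssl s_client -connect your.server:5222 -starttls xmpp </dev/null \
#     | openssl x509 -noout -text | grep 'Signature Algorithm'

# Self-contained demo: create a throwaway self-signed cert and inspect it.
openssl req -x509 -newkey rsa:2048 -sha256 -nodes \
  -keyout /tmp/spark-demo.key -out /tmp/spark-demo.crt -days 1 \
  -subj "/CN=demo.example"
openssl x509 -in /tmp/spark-demo.crt -noout -text | grep 'Signature Algorithm'
# prints "Signature Algorithm: sha256WithRSAEncryption" -- fine; something
# like md5WithRSAEncryption would be the kind of weak algorithm newer Java
# rejects with "Certificates does not conform to algorithm constraints".
```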

This may also be the problem that started for some users with the 2.7.6 version after the Java update: Connection issues after Spark 2.7.5 Build 770

You can try installing an older Java (and removing the jre folder from the Spark installation), or replacing the jre folder with the one from the Spark 2.7.5 version. We don’t know why newer Java causes problems; it is most probably SSL/certificate related. It works fine with self-signed certificates. Maybe some certificates or SSL ciphers are too weak for the new Java.

2.7.5 is the last version that works for us - anything after that fails. I tried the link that was provided in the previous post but it does not resolve.

How do I determine what certs are being used? Since we use Spark across our company, I don’t know that downgrading our Java is something we could consider. I’m hopeful that there is a better solution than that.

In that link you have to replace the placeholder with your server’s name. It is a link to the exact place in the Admin Console where the certificates screen can be reached.

I do not consider downgrading Java a solution either. It is just a temporary workaround until it is known what causes the issue. But we don’t know so far.

Oops - I didn’t see that. I tried putting our server name there, but that still doesn’t work.

Hm, I should probably have asked first which version of Openfire you are using.

How do I determine what version of Openfire I am using?

Log in to the Admin Console; it shows the version on the first page.

Sorry - I don’t have access to the admin console. We may just have to leave this for now until our internal group who supports our use of Spark has time to investigate and pursue a fix.