Spark without Openfire

I like Spark as a client; I prefer it over the other XMPP clients out there. I am trying to use it with IceWarp, which uses XMPP for its chat feature. When I try to log into the server I get a “Certificate path validation failed” error: “this cert must be the last cert in the certification path”.

I can log in with Gajim and Swift, so the server seems to be working. Anyone have any ideas on how to correct this?

org.jivesoftware.smack.SmackException: javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: java.security.cert.CertPathValidatorException: Certificate path validation failed
at org.jivesoftware.smack.tcp.XMPPTCPConnection$PacketReader.parsePackets(XMPPTCPConnection.java:1176)
at org.jivesoftware.smack.tcp.XMPPTCPConnection$PacketReader.access$1000(XMPPTCPConnection.java:1092)
at org.jivesoftware.smack.tcp.XMPPTCPConnection$PacketReader$1.run(XMPPTCPConnection.java:1112)
at java.lang.Thread.run(Unknown Source)
Caused by: javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: java.security.cert.CertPathValidatorException: Certificate path validation failed
at sun.security.ssl.Alerts.getSSLException(Unknown Source)
at sun.security.ssl.SSLSocketImpl.fatal(Unknown Source)
at sun.security.ssl.Handshaker.fatalSE(Unknown Source)
at sun.security.ssl.Handshaker.fatalSE(Unknown Source)
at sun.security.ssl.ClientHandshaker.serverCertificate(Unknown Source)
at sun.security.ssl.ClientHandshaker.processMessage(Unknown Source)
at sun.security.ssl.Handshaker.processLoop(Unknown Source)
at sun.security.ssl.Handshaker.process_record(Unknown Source)
at sun.security.ssl.SSLSocketImpl.readRecord(Unknown Source)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(Unknown Source)
at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
at sun.security.ssl.SSLSocketImpl.startHandshake(Unknown Source)
at org.jivesoftware.smack.tcp.XMPPTCPConnection.proceedTLSReceived(XMPPTCPConnection.java:856)
at org.jivesoftware.smack.tcp.XMPPTCPConnection.access$2000(XMPPTCPConnection.java:155)
at org.jivesoftware.smack.tcp.XMPPTCPConnection$PacketReader.parsePackets(XMPPTCPConnection.java:1171)
… 3 more
Caused by: java.security.cert.CertificateException: java.security.cert.CertPathValidatorException: Certificate path validation failed
at org.jivesoftware.sparkimpl.certificates.SparkTrustManager.checkServerTrusted(SparkTrustManager.java:97)
at sun.security.ssl.AbstractTrustManagerWrapper.checkServerTrusted(Unknown Source)
… 14 more
Caused by: java.security.cert.CertPathValidatorException: Certificate path validation failed
at org.jivesoftware.sparkimpl.certificates.SparkTrustManager.doTheChecks(SparkTrustManager.java:127)
at org.jivesoftware.sparkimpl.certificates.SparkTrustManager.checkServerTrusted(SparkTrustManager.java:93)
… 15 more
Caused by: java.security.cert.CertPathValidatorException: basic constraints check failed: pathLenConstraint violated - this cert must be the last cert in the certification path
at sun.security.provider.certpath.PKIXMasterCertPathValidator.validate(Unknown Source)
at sun.security.provider.certpath.PKIXCertPathValidator.validate(Unknown Source)
at sun.security.provider.certpath.PKIXCertPathValidator.validate(Unknown Source)
at sun.security.provider.certpath.PKIXCertPathValidator.engineValidate(Unknown Source)
at java.security.cert.CertPathValidator.validate(Unknown Source)
at org.jivesoftware.sparkimpl.certificates.SparkTrustManager.validatePath(SparkTrustManager.java:270)
at org.jivesoftware.sparkimpl.certificates.SparkTrustManager.doTheChecks(SparkTrustManager.java:123)
… 16 more
Caused by: java.security.cert.CertPathValidatorException: basic constraints check failed: pathLenConstraint violated - this cert must be the last cert in the certification path
at sun.security.provider.certpath.ConstraintsChecker.checkBasicConstraints(Unknown Source)
at sun.security.provider.certpath.ConstraintsChecker.check(Unknown Source)
… 23 more

Hi
Try turning off encryption or turning off certificate verification.

I recently experienced the same issue using Spark v2.9.4 and Openfire v4.6.3.

Using the following steps, I was able to resolve the issue:

  1. Install Spark v2.9.3 over Spark v2.9.4

  2. Log in using Spark v2.9.3

  3. When prompted, accept SSL certificates. Spark v2.9.3 logs in successfully.

  4. Install Spark v2.9.4 over Spark v2.9.3

To view the settings that Spark v2.9.3 created:

  1. Log out of Spark

  2. On the Login screen, click Advanced, and then click the Certificates tab.

The certificates that you accepted using Spark v2.9.3 (the ones required to validate the chain) appear in the trust store’s certificate list with the Exempted field selected. You can manually add the same certificates to another instance of Spark v2.9.4 and select Exempted, so that you do not have to reinstall Spark each time your SSL certificates are renewed.

The available documentation on using the Certificates tab seems very sparse. For me, the takeaway lesson is that the Spark client appears to validate the SSL certificates required for a secure connection entirely on the client side, independently of the chat server. My incorrect assumption was that Spark would ask the chat server to validate the certificates in the chain as required to authenticate a secure connection. If there is a Spark dev out there, maybe you can explain how the functionality related to establishing secure connections really works, or is designed to work.
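
For anyone else trying to make sense of the stack trace above: the validation appears to happen purely locally. A client-side trust manager takes the chain the server presents, builds a CertPath from it, and validates it against the client's own trust store; the server is never asked to do any of the checking. The sketch below is only an illustration of that pattern using the standard Java PKIX validator, not Spark's actual code, and the trust store path and password are placeholders.

import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.CertPath;
import java.security.cert.CertPathValidator;
import java.security.cert.CertificateFactory;
import java.security.cert.PKIXParameters;
import java.security.cert.X509Certificate;
import java.util.Arrays;
import java.util.List;

public class ClientSideChainCheck {

    // Validates the chain presented by the server against a local trust store,
    // roughly what a client-side trust manager does during checkServerTrusted.
    // "truststore.jks" and "changeit" are placeholders, not Spark's real settings.
    static void validate(X509Certificate[] serverChain) throws Exception {
        KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream("truststore.jks")) {
            trustStore.load(in, "changeit".toCharArray());
        }

        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        List<X509Certificate> chain = Arrays.asList(serverChain);
        CertPath path = cf.generateCertPath(chain);

        PKIXParameters params = new PKIXParameters(trustStore);
        params.setRevocationEnabled(false); // revocation checking omitted in this sketch

        // Throws CertPathValidatorException (for example the pathLenConstraint error above)
        // if the chain does not validate -- the server is never consulted.
        CertPathValidator.getInstance("PKIX").validate(path, params);
    }
}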

When encryption is enabled (which triggers validation of the server’s certificates), Spark tends to be rather strict about what it allows. You can work around this by disabling encryption, as Ilya illustrated.

The message “basic constraints check failed: pathLenConstraint violated - this cert must be the last cert in the certification path” suggests that the certificate chain as provided by the server is not quite correct (or that Spark has a bug in working with this particular chain).

Although not the only possible cause for this problem, a common one is where the certificate chain that’s presented by the server is not presented in the correct order. I don’t think that Spark attempts to detect and fix an unordered chain. Certificates should be ordered in a fashion where the next certificate is the issuer of its predecessor. Is the server that you’re trying to connect to publicly accessible? If so, then it’d be easy for me to verify if this is the case on that server.