Certificate has no basic constraints error using Spark 2.9.3

I am unable to use 2.9.3

On every PC I’ve tried (Windows 10 or Windows 8.1), after a successful installation I am unable to connect to my Openfire 4.6.0 server…

Spark gives an error:
An error has been detected.
javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: Certificate has no basic constraints.

[Details] [Close].

2.8.3 still works without error messages.

regards,

This indicates that your Openfire server uses an invalid certificate. Out of the box, Spark 2.9.3 is stricter when establishing secure connections. You can bypass this by configuring Spark either to not use TLS at all, or to ignore security exceptions - both of which obviously take away from the security offered. From a security perspective, the best way to solve this is to configure Openfire with a valid certificate (and remove all invalid ones).

Hello Guus, thank you for your reply. I’m going to paste an image; hopefully it shows up below.

The error that you get is not related to the date-based validity/expiry of the certificate. If that were the case, then, with the settings that you highlighted, things should work.

Instead, the certificate that you are using (or possibly a certificate in the chain) is missing something. If I recall correctly, Basic Constraints relates to a flag that should be present in a CA certificate, marking it as being allowed to issue other certificates.
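If you want to check whether your certificate (or one in its chain) carries that extension, a few lines of Java will tell you. This is only a sketch; server-cert.cer is a placeholder for a certificate that you exported from your server:

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.security.cert.CertificateFactory;
import java.security.cert.X509Certificate;

public class CheckBasicConstraints {
    public static void main(String[] args) throws Exception {
        // Placeholder path: export your server certificate first (DER or PEM).
        try (InputStream in = new FileInputStream("server-cert.cer")) {
            CertificateFactory cf = CertificateFactory.getInstance("X.509");
            X509Certificate cert = (X509Certificate) cf.generateCertificate(in);

            // OID 2.5.29.19 is the Basic Constraints extension.
            byte[] ext = cert.getExtensionValue("2.5.29.19");
            System.out.println("Basic Constraints present: " + (ext != null));

            // getBasicConstraints() returns -1 when the extension is absent
            // or the certificate is not a CA; otherwise the path length.
            System.out.println("CA path length: " + cert.getBasicConstraints());
        }
    }
}
```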


There are four solutions to the problem:

1) Turn off encryption.

2) If you still want to use encryption:
a) edit your spark-core-2.9.3.jar file
b) delete spark.properties from %appdata%/Spark
c) connect to the server

3) If you still want to use encryption (silent):
a) edit your spark-core-2.9.3.jar file
b) delete spark.properties from %appdata%/Spark
c) connect to the server
d) copy %appdata%/Spark/security (the folder where you accepted the certificate) to all users, and your users will be able to log in to Spark without accepting this certificate, because it will already be in their certificate store (lifehack)

4) If you use an external certificate that was issued by a certification authority (root), then you can add the root certificate to C:\Program Files (x86)\Spark\jre\lib\security\cacerts using the KeyStore Explorer program, and it should work (see the sketch below).
CA certificates usually have a long lifespan, about 10 years.
But I cannot check this option because I have a self-signed certificate that does not have a certification authority.
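If you would rather script the import than click through KeyStore Explorer, the same thing can be done in a few lines of Java. This is only a rough sketch: the cacerts path, the alias, the root-ca.cer file name, and the default changeit password are assumptions about a stock install.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.security.KeyStore;
import java.security.cert.CertificateFactory;

public class ImportRootCert {
    public static void main(String[] args) throws Exception {
        // Assumed location of the cacerts store bundled with Spark; the
        // bundled JRE uses a JKS store, and "changeit" is the usual default
        // password. Writing to Program Files requires admin rights.
        String cacerts = "C:\\Program Files (x86)\\Spark\\jre\\lib\\security\\cacerts";
        char[] password = "changeit".toCharArray();

        KeyStore ks = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream(cacerts)) {
            ks.load(in, password);
        }

        // root-ca.cer is a placeholder for your CA's root certificate.
        CertificateFactory cf = CertificateFactory.getInstance("X.509");
        try (FileInputStream certIn = new FileInputStream("root-ca.cer")) {
            ks.setCertificateEntry("my-root-ca", cf.generateCertificate(certIn));
        }

        try (FileOutputStream out = new FileOutputStream(cacerts)) {
            ks.store(out, password);
        }
    }
}
```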

I myself will soon move all users to SSL; option 3 is the easiest for me (300 computers), and I will do it with a script :D
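A rough sketch of that copy (step 3d) in Java, though any scripting language would do; all paths here are assumptions about the machines, and the "template" profile stands for whichever user already accepted the certificate:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CopySparkSecurity {
    public static void main(String[] args) throws IOException {
        // Assumption: the template profile has already accepted the
        // certificate once, so its security folder contains the store.
        Path source = Paths.get("C:\\Users\\template\\AppData\\Roaming\\Spark\\security");
        Path usersRoot = Paths.get("C:\\Users");

        try (DirectoryStream<Path> users = Files.newDirectoryStream(usersRoot)) {
            for (Path user : users) {
                Path target = user.resolve("AppData\\Roaming\\Spark\\security");
                // Skip non-profiles and users that already have the store.
                if (!Files.isDirectory(user) || Files.exists(target)) {
                    continue;
                }
                Files.createDirectories(target);
                try (DirectoryStream<Path> files = Files.newDirectoryStream(source)) {
                    for (Path file : files) {
                        Files.copy(file, target.resolve(file.getFileName()));
                    }
                }
            }
        }
    }
}
```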

P.S. I would advise you to issue a self-signed certificate through Openfire; we do it once every 5 years.

I didn’t read the message yesterday.
Besides enabling those 2 options, can you enable these as well and try again?

If the certificate hostname did not match the server’s domain, Spark would say that in the error. But @StanIT is getting an error about constraints, which means that something is wrong with the certificate itself, not the hostname. I wish this error could be more verbose, but it is how it is.

@StanIT, give this build a try; maybe it will work better with your cert.


I concur. The issue that @StanIT reported is caused by SPARK-2184, which should be fixed by the build that @wroot just linked to.

Yes, 2.9.4-Snapshot works as I would expect it to with our legitimate CA-issued SSL cert, with no knowledge needed by the end user to ‘configure’ it. Just install, input username/password, and you’re up and running without needing to check or uncheck any other options, as I had originally set it up to be.

