Spark not shutting down at Windows log off? Here is why!

Some users have reported issues with Spark 2.6.3 while logging off Windows.

Symptom:

When a Windows user shuts down the PC (or logs off), the Spark client does not log off and shut down normally. It has to be forced to close via a system dialog.

Root Cause:

Spark runs on Windows using the Java version installed on the system when it has been installed with Spark_2.6.3_online.exe or via a custom install. These installers do not include the JRE bundled with the standard Spark installer, so Spark falls back to the system JVM.

Java 1.6.0 updates 25 to 27 introduced a bug that prevents Java applications from reacting to the shutdown signal issued to them by the Windows OS. (https://forums.oracle.com/forums/thread.jspa?threadID=2260059)
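The shutdown signal normally reaches a Java application through the JVM's shutdown hooks (and, for a Swing application like Spark, through window-closing events). The following is a minimal sketch to reproduce the behaviour, not part of Spark itself: run it under an affected JRE, log off or shut down Windows, and check whether the hook message is ever printed.

```java
// ShutdownHookTest.java - minimal check of whether the JVM delivers
// the Windows log-off/shutdown signal to a Java application.
// (Illustrative test only; class name and messages are made up.)
public class ShutdownHookTest {

    public static void main(String[] args) throws InterruptedException {
        // The JVM is supposed to run registered shutdown hooks when the
        // OS asks the process to terminate (log off, shutdown, Ctrl+C).
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                System.out.println("Shutdown hook ran - signal was delivered.");
            }
        });

        System.out.println("Running. Log off or shut down Windows now...");
        // Keep the process alive so that termination must come from the OS.
        Thread.sleep(Long.MAX_VALUE);
    }
}
```

Under an unaffected JRE the hook runs and the process exits cleanly; under the broken builds the process just hangs around until Windows forces it to close.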

Resolution:

Downgrade to Java 1.6.0 update 24 for Spark. A Java 1.7.0 JRE also solves the issue, but may introduce other side effects.
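To verify which JRE is actually in use after the downgrade, checking the java.version system property is enough (running `java -version` on the command line gives the same information). A small sketch:

```java
// PrintJavaVersion.java - print the JRE version in use, so you can
// confirm it is not one of the affected builds (1.6.0_25 to 1.6.0_27).
public class PrintJavaVersion {
    public static void main(String[] args) {
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
    }
}
```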

Is there anything in the works to get this working besides the downgrade of Java? Solely because of this issue I will not deploy this software. I can't see introducing a new piece of software to all my users when that software won't allow people to log off cleanly. Not only is it a pain in the butt, it is also a security risk. I see this problem has existed since at least Sept. 13. Are there any plans to fix it?

It should be fixed when Spark is released with Java 7 bundled, probably 2.7.0. Can't say when exactly. Walter is trying to push it out in the next few weeks, though I think it may take longer, as a few weeks have already passed.

Btw,

I see this problem has existed since at least Sept. 13. Are there any plans to fix it?

You probably mean that it has been like this for a long time. Well, that is not unusual for Spark. It is a purely volunteer-driven project and it usually lacks developers, so sometimes you can wait for a fix or a new version for months or even years. It has been more active recently, but you have to be prepared for this if you plan to use Spark/Openfire.

Hi Rob

the issue is not caused by the Spark code, but by a change in Java. Complaints about this might be better addressed to Oracle. You can use Spark with the bundled Java (u18). This is not optimal from a security standpoint, but since all other Java apps and your browser will use the OS-installed Java (which should be the latest Java 6 version), it might be acceptable.

The other option is to install Java 7 as the default JVM for your OS. Shutdown works correctly with Java 7.

From a security point of view: what JVM would be considered secure? (That is a serious question, since we are working on a new bundle for Spark.)

Walter