
Connection issues after Spark 2.7.5 Build 770


I looked around on here and couldn’t find anyone else who has asked this question.

Has anyone else had issues with Spark not connecting after build 770? I am sure it has something to do with the server side at the company I work for, but I was wondering if there is a way to fix this issue. My company primarily supports Pidgin as its means of messaging, but Spark is a far more popular option. A few co-workers tried updating to the newest download, only for the connection to time out when installing. I was able to determine that the newest version that functions is 2.7.5 Build 770. I don’t know what would have changed at that point to create this issue, but we would like to continue receiving the benefits of the Spark updates. Any information about a potential fix, or any other help, would be great. Otherwise, we will just stick with the build that works.



Edit: Updated to correct version.

As you don’t have access to the server, it will be hard to find the root cause. I don’t see anything significant in 2.7.5 that would cause such problems, and as far as I remember, build 751 was the same as the 2.7.4 release. So there shouldn’t be any other builds between the 2.7.4 release and the start of 2.7.5 development. There were reports about connection problems with the 2.7.6 version, but we weren’t able to find out what was causing that for a few users in the forums.

I was wrong: build 751 is not the final build of the 2.7.4 release. But it is more puzzling now, as 752 was the final build of the 2.7.5 release, and a final build is just a version rename and changelog update, with no changes to the code. So it should be the same as 751, i.e. the same as 2.7.5, and should work the same.

My apologies, I was incorrect about the last working build. I just downloaded about 10 different versions and found that 2.7.5 Build 770 is actually the last one that works properly. With anything after that, I run into connection issues. Sorry about the confusion.

Mike, are you using the included JRE? What version of JRE is Spark reporting?

Here is the screen shot.

Can you try this:

remove all versions of Spark

manually check and delete C:\Program Files\Spark

manually check and delete %APPDATA%\Spark

reinstall the latest version, 2.7.7
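The manual cleanup steps above can be sketched as a small helper script. This is a hypothetical illustration, not an official tool: the paths are the usual Windows defaults for Spark and may differ on your machine, and the `dry_run` flag is my own addition so you can preview what would be deleted.

```python
import os
import shutil

# Usual Spark leftover locations on Windows (assumptions -- verify on your machine).
SPARK_DIRS = [
    r"C:\Program Files\Spark",
    os.path.join(os.environ.get("APPDATA", ""), "Spark"),
]

def remove_spark_leftovers(dirs, dry_run=True):
    """Delete leftover Spark folders; with dry_run=True, only report what was found."""
    found = []
    for d in dirs:
        if d and os.path.isdir(d):
            if not dry_run:
                shutil.rmtree(d)  # remove the folder and everything inside it
            found.append(d)
    return found

# Preview first, then run with dry_run=False before reinstalling 2.7.7:
print(remove_spark_leftovers(SPARK_DIRS))
```

Running it once with the default `dry_run=True` shows which folders are present; a second call with `dry_run=False` actually removes them before the fresh 2.7.7 install.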

So it is the same as “Spark v 2.7.6 won’t accept correct credentials” (which means the issue was introduced during the 2.7.6 development span — the 2.7.6 release was build 790).

Unfortunately, the history on Bamboo doesn’t go back far enough to see what changes went into build 771. So one has to take into account the whole changelog for 2.7.6, which is quite lengthy.

I have filed [SPARK-1734] Connection issues after build - IgniteRealtime JIRA to track this.

I gave that a shot and ended up with the invalid username or password error.

By simply counting commits, it looks like this one was 771 (Java update): SPARK-1643: Update bundled JRE to 1.8.0u74 · igniterealtime/Spark@aabd227 · GitHub

One could try installing an older Java (update 66), removing the bundled JRE from the Spark installation folder, and then running Spark again.
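That experiment could be sketched like this. It is a hedged illustration, not a supported procedure: it assumes Spark’s Windows launcher falls back to the system Java when the bundled `jre` folder is missing, and it renames the folder (rather than deleting it) so the change is easy to undo. The install path shown is the default and may differ on your machine.

```python
import os

def sideline_bundled_jre(spark_home):
    """Rename Spark's bundled jre folder aside so the launcher uses the system JRE.

    Returns the backup path, or None if no bundled jre folder was found.
    Renaming (instead of deleting) lets you restore the folder if it doesn't help.
    """
    jre = os.path.join(spark_home, "jre")
    backup = jre + ".disabled"
    if os.path.isdir(jre):
        os.rename(jre, backup)
        return backup
    return None

# Default install location (an assumption -- adjust for your machine):
# sideline_bundled_jre(r"C:\Program Files\Spark")
```

If Spark then connects using a system-installed Java 8u66, that would point at the bundled 8u74 update as the culprit.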

UPD: Actually, IT IS that change, as this is the only commit on February 11, and build 771 is dated February 11.

Awesome. Sorry about the mix-up earlier. Thanks for the help thus far.

I wish I could reproduce this… I’ve had zero problems. Although there was an update to Openfire that broke GSSAPI, that issue is in 4.0.2 (I think) and only shows up if sasl.mech is set to gssapi.

It must be some change in Java update 74. I’m trying to google it, and there are various reports of SSL problems, but nothing I can relate to this. I’m using Openfire 4.0.2 with the built-in Java (1.8.0_74 at the moment), with TLS (SSL disabled) and self-signed certificates produced by Openfire.

We will need info from your server to further debug this.

With that said, I have noticed various Java issues in the past. As I mentioned, I really can’t do anything server side. For the time being, I replaced the Spark\jre folder in 2.7.7 Build 880 with the old jre folder from 2.7.5 Build 770, and everything is working great. I appreciate the help.



Have you tried the 2.8.x versions of Spark? Maybe it works with the Java bundled with them.