Spark Reconnect

It appears that a bug that was previously investigated and fixed may have been reintroduced in the latest versions of Spark, the ones with the new-style reconnecting indicator in the main Spark window. This screenshot is from 2.7.6 build 790, running on Windows 7 64-bit.

When the connection to the Openfire server is lost for any reason, in this example because a VPN is temporarily down, Spark fails to reconnect more often than it succeeds. When the automatic reconnect fails and the VPN is back up, logging out and logging back in connects immediately.

Have others seen the same reconnection problem, and is there any patch to fix this?

Which fix are you referring to (previously investigated and fixed)? There have been no changes to the network/login/reconnect code for many years, so I don’t think something recently got broken. At least I don’t see it on my production site or on numerous testing machines.

I have seen a thread or two related to the newest Openfire version and an issue with Spark constantly reconnecting, so maybe this is related to Openfire. Again, I’m using the latest 4.0.2 everywhere and not seeing it yet. I’m using a newer build now, but there has only been a proxy configuration change since build 790 that I could somehow relate to networking. So you can try the latest build and see if it works better.

There is no patch in the works, as there is no reproducible scenario and, frankly, not enough active developers working on Spark…

Ignite Realtime: Spark Nightly Builds

Have the same problem on Linux Mint.

Spark IM works fine for about 10–15 minutes after connecting, then shows the same message.

The log is clean, no error messages.

[Attached screenshot: Снимок экрана от 2016-07-19 13:23:45.png]

I’m not seeing such an issue on my Ubuntu/Xubuntu 16.04 test machines (which are similar to Mint). What version? Which installer do you use (or the tar.gz)? What does your Help > About window show?

Linux Mint 17.3 Rosa

Using tar.gz

[Attached screenshot: Снимок экрана от 2016-07-20 10:52:09.png]

I see that you probably have Java 7 installed on the system. The tar.gz archive comes with a newer Java version inside (the jre folder). Have you maybe removed it? If not, then Spark is using your system’s Java automatically. Try removing Java 7 (if you can) and let it use its internal Java 8.
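As a quick way to check which Java Spark would actually pick up, you can look for the bundled jre folder first and fall back to the system Java otherwise. A minimal sketch, assuming `SPARK_HOME` points at wherever you unpacked the tar.gz (the path is a placeholder, not part of Spark itself):

```shell
# SPARK_HOME is a hypothetical path -- point it at your unpacked tar.gz.
SPARK_HOME="${SPARK_HOME:-$HOME/Spark}"

# Prefer the bundled JRE (the jre folder shipped in the archive); otherwise
# fall back to whatever java is on the PATH, which is what Spark ends up
# using when the bundled jre folder has been removed.
if [ -x "$SPARK_HOME/jre/bin/java" ]; then
  JAVA_BIN="$SPARK_HOME/jre/bin/java"
else
  JAVA_BIN="$(command -v java || echo 'no java on PATH')"
fi

echo "Spark would use: $JAVA_BIN"
# Run "$JAVA_BIN" -version to confirm it is Java 8, not the system Java 7.
```

If the first branch wins, you are on the bundled Java 8; if the second branch reports your system Java 7, that matches the behavior described above.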

I switched to the Java inside the Spark folder, but I have the same problem.

[Attached screenshot: Снимок экрана от 2016-07-20 12:24:48.png]

Also, I couldn’t connect to the server until I commented out these two lines in the JRE’s java.security file:

#jdk.tls.disabledAlgorithms=SSLv3, RC4, MD5withRSA, DH keySize < 768

#jdk.certpath.disabledAlgorithms=MD2, MD5, RSA keySize < 1024

Maybe the problem is here? I suppose my server uses an old and weak certificate.
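One way to check that theory is to look at the certificate’s key size and signature algorithm, since those are exactly what the `jdk.certpath.disabledAlgorithms` thresholds above reject. A sketch using openssl; it generates a deliberately weak local certificate as a stand-in, because the real server host and port are not known here (for a live server you would fetch the certificate with `openssl s_client -connect yourserver:5223` instead, where host and port are placeholders):

```shell
# Generate a deliberately weak 512-bit self-signed certificate as a stand-in
# for the server's certificate. 512 bits falls under the "RSA keySize < 1024"
# rule above, so default Java 8 settings would reject a cert like this.
openssl req -x509 -newkey rsa:512 -nodes -subj "/CN=weak-demo" \
  -keyout /tmp/weak.key -out /tmp/weak.crt -days 1 2>/dev/null

# Print the key size and signature algorithm -- the two things the
# disabledAlgorithms properties check.
openssl x509 -in /tmp/weak.crt -noout -text \
  | grep -E 'Public-Key|Signature Algorithm'
```

If the real server’s certificate shows a key under 1024 bits or an MD2/MD5 signature, that would explain why the connection only works once those restrictions are commented out. Note that commenting them out re-enables weak algorithms for every application using that JRE; the proper fix is a stronger server certificate.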

Don’t know. What are you using as your server? Openfire? Which version? And what Java version does it show on the first page of the Admin Console? Maybe it is also using an older Java. You can put the same jre folder from Spark into Openfire’s installation folder and restart it; it should then use that internal Java.
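The jre-copy idea, as a sketch. The paths are assumptions for a typical Linux tar.gz layout, adjust both to your machine; the premise (Openfire preferring a jre folder inside its own installation directory) is as described above:

```shell
# Hypothetical install locations -- adjust both to your machine.
SPARK_HOME="${SPARK_HOME:-/opt/Spark}"
OPENFIRE_HOME="${OPENFIRE_HOME:-/opt/openfire}"

if [ -d "$SPARK_HOME/jre" ] && [ -d "$OPENFIRE_HOME" ]; then
  # Copy Spark's bundled Java 8 into Openfire's install folder...
  cp -r "$SPARK_HOME/jre" "$OPENFIRE_HOME/jre"
  MSG="copied bundled JRE into $OPENFIRE_HOME"
  # ...then restart Openfire so its startup script picks up ./jre, e.g.:
  #   "$OPENFIRE_HOME/bin/openfire" restart
else
  MSG="nothing to copy: check SPARK_HOME and OPENFIRE_HOME"
fi

echo "$MSG"
```

This only helps the person who runs the server, of course; with no access to the corporate server it is a suggestion for the admin rather than something to try locally.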

I don’t know the server version; it is corporate and I have no access to it.

The Pidgin client works OK, but I don’t like it.

Well, that’s all I can think of for now. I’m testing against Openfire 4.0.2 with Java 8 and it works fine with the latest Spark (and the latest builds, which use a newer Smack library).