Openfire 4.8.1 statuses not updating

Upgraded from 4.7.0 to 4.8.1 over the weekend (all clients are Spark 3.0.2). We are seeing issues with people's presence not updating. Someone will show as away even though they are online. In most cases, when the person who sees someone as away restarts their client, the status updates. We never had this issue with 4.7.0.

I have the same problem with the Pidgin client and the inVerse plugin.

With version 4.7.5 everything works perfectly.

We have more than 500 accounts with many groups. If I save a group config, even if I don't make any change, the statuses of these accounts refresh, but after that they freeze and won't update any more.

If I take a client offline and back online again, at first all clients are updated, but they freeze after that. It seems like the server stops broadcasting updates.

Hello! Thanks for reporting this. That surely is an annoying issue to have.

A couple of us have been trying to reproduce the problem, but we are unable to. That makes it difficult for us to find the cause of the issue. Can you give some pointers on how to trigger the problem?

In our case we simply upgraded to 4.8.1 from 4.7.0, and we've been seeing the problems since. People show as away even if they are online. I'll see a co-worker as away, but if I restart my Spark then they show as online (and they actually were online), while others may have seen that person as online the entire time.

I’m stumped. Are there specific steps to make this problem occur? For example, does this happen every time the co-worker toggles their status from ‘away’ to ‘online’ again? Does it happen only on ‘auto-away’? Does it happen with other statuses? Does it happen with everyone? Anything specific that you can think of?

I haven't noticed a pattern. Most of us just use auto-away. In our case we also have 2 people that can't see each other in chat. They are not on each other's lists.

It’s happening with most of us at some point (but not all the time).

Just now I had a user who was showing as online, and when I logged out and back in she now shows as offline (which she actually is).

Have you had a look in the log files of Openfire? Is there anything that's obviously related to this? I don't really expect there to be, but I'm grasping at straws at the moment.

I did just notice that in Spark I see one of my co-workers as away (she showed as online before I restarted my chat), but in the Client Sessions on the server she shows as online. She is not online.

Does this have anything to do with it?

2024.04.03 16:38:57.942 ERROR [socket_c2s-thread-12]: org.jivesoftware.openfire.nio.NettyConnection - Problem during connection close or cleanup
java.io.IOException: Connection reset by peer
	at sun.nio.ch.SocketDispatcher.write0(Native Method) ~[?:?]
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:54) ~[?:?]
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:137) ~[?:?]
	at sun.nio.ch.IOUtil.write(IOUtil.java:81) ~[?:?]
	at sun.nio.ch.IOUtil.write(IOUtil.java:58) ~[?:?]
	at sun.nio.ch.SocketChannelImpl.implWrite(SocketChannelImpl.java:566) ~[?:?]
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:618) ~[?:?]
	at io.netty.channel.socket.nio.NioSocketChannel.doWrite(NioSocketChannel.java:415) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannel$AbstractUnsafe.flush0(AbstractChannel.java:931) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.flush0(AbstractNioChannel.java:354) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannel$AbstractUnsafe.flush(AbstractChannel.java:895) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.flush(DefaultChannelPipeline.java:1372) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:921) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:907) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:893) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.handler.ssl.SslHandler.forceFlush(SslHandler.java:2236) ~[netty-handler-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.handler.ssl.SslHandler.wrapAndFlush(SslHandler.java:825) ~[netty-handler-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.handler.ssl.SslHandler.flush(SslHandler.java:802) ~[netty-handler-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:925) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeFlush(AbstractChannelHandlerContext.java:907) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.flush(AbstractChannelHandlerContext.java:893) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.ChannelOutboundHandlerAdapter.flush(ChannelOutboundHandlerAdapter.java:125) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeFlush0(AbstractChannelHandlerContext.java:925) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeWriteAndFlush(AbstractChannelHandlerContext.java:941) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.write(AbstractChannelHandlerContext.java:966) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:934) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.writeAndFlush(AbstractChannelHandlerContext.java:984) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at org.jivesoftware.openfire.nio.NettyConnection.close(NettyConnection.java:215) ~[xmppserver-4.8.1.jar:4.8.1]
	at org.jivesoftware.openfire.nio.NettyXMPPDecoder.exceptionCaught(NettyXMPPDecoder.java:72) ~[xmppserver-4.8.1.jar:4.8.1]
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:346) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:325) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:317) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.handler.ssl.SslHandler.exceptionCaught(SslHandler.java:1203) ~[netty-handler-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:346) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:325) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:317) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.exceptionCaught(DefaultChannelPipeline.java:1377) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:346) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:325) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.DefaultChannelPipeline.fireExceptionCaught(DefaultChannelPipeline.java:907) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.handleReadException(AbstractNioByteChannel.java:125) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:177) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) ~[netty-transport-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.100.Final.jar:4.1.100.Final]
	at java.lang.Thread.run(Thread.java:1570) [?:?]

Also I see this repeating (probably not related):

Unable to create a socket connection to XMPP domain 'pubsub.chatsecure.org': Unable to connect to any of its remote hosts.

Not sure what that is. The Pubsub stuff is unchecked on the server.

Thanks for checking, but I don’t think either are related.

The first error seems to be a client abruptly disconnecting (without properly logging off first). That's not a big deal (the logs are a bit too verbose here).

The second line is a failed attempt by your XMPP server (Openfire) to reach another XMPP server's “pubsub” service. You've likely disabled server-to-server functionality, or the network has not been configured to allow these connections to be set up. At a guess, one of your users is using the ChatSecure client, which tries to use something on its own hardcoded domain. You can probably ignore this (although that particular user might be missing some of ChatSecure's functionality).
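Purely as a sanity check, if you ever want to see whether that remote domain even resolves from the Openfire host: server-to-server connections are located via the _xmpp-server._tcp SRV record of the remote domain. Something like the following rough Java/JNDI lookup (just my own sketch, not anything Openfire ships; the class name and target domain are only for illustration) shows what the server would find:

import javax.naming.Context;
import javax.naming.directory.Attribute;
import javax.naming.directory.Attributes;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;
import java.util.Hashtable;

public class SrvCheck {
    public static void main(String[] args) throws Exception {
        // Domain taken from the log line above; substitute any remote XMPP domain.
        String domain = "chatsecure.org";

        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.dns.DnsContextFactory");
        env.put(Context.PROVIDER_URL, "dns:");
        DirContext ctx = new InitialDirContext(env);

        // XMPP s2s peers are discovered through the _xmpp-server._tcp SRV record.
        Attributes attrs = ctx.getAttributes("_xmpp-server._tcp." + domain, new String[]{"SRV"});
        Attribute srv = attrs.get("SRV");
        if (srv == null) {
            System.out.println("No SRV records found for " + domain);
        } else {
            for (int i = 0; i < srv.size(); i++) {
                System.out.println("SRV: " + srv.get(i)); // priority weight port target
            }
        }
        ctx.close();
    }
}

If the record resolves but the connection still fails, it points at s2s being disabled or blocked rather than at DNS.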

Nobody uses the ChatSecure client. We're pretty much internal-only, using Spark.

Let me know if you need anything else from our setup to help troubleshoot this. At some point I'll have to roll back to 4.7.0 (I have a snapshot of the server).

What would help most is a way for us to reliably reproduce the issue, but you’ve already mentioned that you do not know how either. For now, I’m at a loss.

I've created a ticket for this issue: OF-2820 (Ignite Realtime Jira).

Pidgin normally receives status updates like below:

(18:30:39) jabber: Recv (ssl)(252): dnd0
(18:30:39) blist: Updating buddy status for ruisimoes@10.3.0.2 (XMPP)
(18:30:53) jabber: Recv (ssl)(236): 0
(18:30:53) blist: Updating buddy status for ruisimoes@10.3.0.2 (XMPP)
(18:31:12) jabber: Recv (ssl)(98):
(18:31:12) blist: Updating buddy status for ruisimoes@10.3.0.2 (XMPP)
(18:31:12) jabber: Recv (ssl)(98):
(18:31:12) blist: Updating buddy status for ruisimoes@10.3.0.2 (XMPP)
(18:31:12) jabber: Recv (ssl)(27):
(18:31:12) jabber: Sending (ssl) (teste@10.3.0.2/centrauto):
(18:31:17) util: Writing file blist.xml to directory D:\Pidgin.purple
(18:31:17) util: Writing file D:\Pidgin.purple\blist.xml
(18:31:23) jabber: Recv (ssl)(236): 0
(18:31:23) blist: Updating buddy status for ruisimoes@10.3.0.2 (XMPP)
(18:32:18) jabber: Sending (ssl) (teste@10.3.0.2/centrauto):
(18:32:18) jabber: Sending (ssl) (teste@10.3.0.2/centrauto):
(18:32:18) jabber: Recv (ssl)(91):
(18:32:18) jabber: Recv (ssl)(34):

After the update to 4.8.0, for some reason these updates stopped coming.
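If it helps narrow down whether the server has stopped broadcasting presence or the client has stopped applying it, a minimal stand-alone watcher along these lines would log every presence broadcast the server pushes, independent of Pidgin or Spark. This is only a sketch using the Smack library; the domain, host, account and password are placeholders for a test account on your server, and the relaxed security mode is just for a quick local test:

import org.jivesoftware.smack.AbstractXMPPConnection;
import org.jivesoftware.smack.ConnectionConfiguration;
import org.jivesoftware.smack.packet.Presence;
import org.jivesoftware.smack.roster.Roster;
import org.jivesoftware.smack.roster.RosterListener;
import org.jivesoftware.smack.tcp.XMPPTCPConnection;
import org.jivesoftware.smack.tcp.XMPPTCPConnectionConfiguration;
import org.jxmpp.jid.Jid;
import java.util.Collection;

public class PresenceWatcher {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details for a test account on the Openfire server.
        XMPPTCPConnectionConfiguration config = XMPPTCPConnectionConfiguration.builder()
                .setXmppDomain("10.3.0.2")
                .setHost("10.3.0.2")
                .setUsernameAndPassword("teste", "secret")
                .setSecurityMode(ConnectionConfiguration.SecurityMode.ifpossible) // local test only
                .build();

        AbstractXMPPConnection connection = new XMPPTCPConnection(config);
        connection.connect().login();

        Roster roster = Roster.getInstanceFor(connection);
        roster.addRosterListener(new RosterListener() {
            @Override
            public void presenceChanged(Presence presence) {
                // Every presence broadcast the server pushes for a roster contact lands here.
                System.out.println(presence.getFrom() + " -> " + presence.getMode());
            }
            @Override public void entriesAdded(Collection<Jid> addresses) { }
            @Override public void entriesUpdated(Collection<Jid> addresses) { }
            @Override public void entriesDeleted(Collection<Jid> addresses) { }
        });

        // Keep the connection alive so presence broadcasts keep arriving.
        Thread.sleep(Long.MAX_VALUE);
    }
}

Running that next to Pidgin while a contact toggles away/online should make it clear whether the server is still broadcasting the updates that Pidgin no longer displays.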

FWIW, we had to roll back to 4.7.5 (which we were on before upgrading to 4.8.1, so the original post is slightly off) and now everything is working normally. Definitely an issue with 4.8.1.

Has this situation been corrected in version 4.8.3?

Don't know. I haven't upgraded from 4.7.5, since we had the issue with 4.8.1.