Client issue - In and Out

Many of my users keep signing in and out continuously.
I checked the warn log and found the following:
2017.12.14 14:26:10 org.jivesoftware.openfire.nio.ConnectionHandler - Closing connection due to exception in session: (0x00076432: nio socket, server, /12.198.86.155:57354 => /10.254.250.11:5222)
java.lang.OutOfMemoryError: Java heap space
2017.12.14 14:26:11 org.jivesoftware.openfire.nio.ConnectionHandler - Closing connection due to exception in session: (0x00076435: nio socket, server, /12.198.86.155:1481 => /10.254.250.11:5222)
java.lang.OutOfMemoryError: Java heap space
2017.12.14 14:26:12 org.jivesoftware.openfire.nio.ConnectionHandler - Closing connection due to exception in session: (0x00076434: nio socket, server, /12.198.86.155:7164 => /10.254.250.11:5222)
java.lang.OutOfMemoryError: Java heap space

Any ideas?

How much memory have you allocated to Openfire? You may just need to allocate more, or do some debugging to see where the memory leak might be. Are you running Openfire 4.2.1?
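If you want to capture evidence of the leak, one option (my suggestion, not something from this thread) is to have the JVM write a heap dump at the moment the OutOfMemoryError occurs, by adding standard HotSpot flags to Openfire's JVM options:

```
# Hypothetical additions to Openfire's JVM options (e.g. a .vmoptions file);
# the exact file name and location depend on how Openfire was installed.
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=/tmp/openfire-heap.hprof
```

The resulting .hprof file can then be opened in a heap-analysis tool (Eclipse Memory Analyzer, for example) to see what is filling the heap.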

The memory size is 8192 MB.
Version: Openfire 4.1.6
I'm going to upgrade it to a newer version (4.2.1); I hope that will fix it!

Thank you for your help

Just to verify: do you see that 8192 MB of memory reported on the Openfire Admin Console homepage?

No, only the following:
Java memory 81.06 MB of 247.50 MB (32.8%) used

There is something weird. When I try the problematic accounts on other PCs, they work fine!
Do you think the problem is in the Spark client? (I'm using v2.8.3)

It might be a problem with that PC (software or hardware). But that doesn't change the fact that you are getting OutOfMemory errors in Openfire. Openfire is only using about 250 MB of your 8 GB of RAM. You should adjust Openfire's memory settings as described here (Custom Parameters): http://download.igniterealtime.org/openfire/docs/latest/documentation/install-guide.html
If you are running the 32-bit version, you can only go up to about 2000 MB. If you need more, switch to the 64-bit version, if you can.
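As a rough sketch of what that looks like (the file name here is an assumption; the install guide's "Custom Parameters" section gives the correct file for your platform and install type), you create a .vmoptions file next to the Openfire launcher with larger heap limits, one option per line:

```
# Example openfired.vmoptions (file name/location assumed; see the
# "Custom Parameters" section of the install guide for your platform)
-Xms512m
-Xmx2048m
```

-Xmx sets the maximum heap size; on a 32-bit JVM, values much above 2000 MB will prevent the JVM from starting, which is why the 64-bit version is needed for larger heaps.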

Thank you for your help.
Everything is back to working fine after I allocated 1 GB to the Openfire server.