Running out of Java memory on 3.6.4

Hello,

We are running OpenFire 3.6.4 and we keep running out of Java memory. We have around 1,000 users, and following other OpenFire support threads we have gradually increased our Java memory to 3835.00 MB. To get the memory to this level we’ve used a 64-bit Java VM (1.6.0_17, 64-bit).

The Java memory increase has meant that more users can now connect, but once we get to around 600 users OpenFire crashes. For example, we currently have 539 sessions open and the Java memory status is: 3570.47 MB of 3835.00 MB (93.1%) used.
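
For reference, the “used of total” figures quoted above can be reproduced from the standard JVM Runtime API. Here is a minimal illustrative sketch (not Openfire’s actual code; the class name is made up for the example):

```java
// Minimal sketch: compute an "X MB of Y MB (Z%) used" heap figure
// from the standard JVM Runtime API. Illustrative only.
public class HeapStatus {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        double mb = 1024.0 * 1024.0;

        double maxMb  = rt.maxMemory() / mb;                        // heap ceiling (-Xmx)
        double usedMb = (rt.totalMemory() - rt.freeMemory()) / mb;  // heap currently in use

        System.out.printf("%.2f MB of %.2f MB (%.1f%%) used%n",
                usedMb, maxMb, (usedMb / maxMb) * 100.0);
    }
}
```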

Most of our clients use Spark 2.5.8, but some users have installed these clients:

Pandion

Pidgin

Kopete

Miranda

Psi

Gajim

We don’t have any Empathy clients (Empathy came up in another OpenFire memory-leak thread).

Does anyone have any ideas as to what could be using all the Java memory?

Thanks in advance

Not sure if this will help, but I have 112 users connected to our 3.6.4 box and we’re sitting at 76.27 MB used.

This is a CentOS 5.3 box, running JRE 1.6.0 u11. Clients are all Miranda 0.7.19 over TLS. A few server-to-server connections, one conference room (unused), and synching with our Active Directory domain for user lists. About 200 total users registered.

Sounds like you have something funky going on with your Openfire build… have you tried reinstalling and/or migrating to a new/different server?

Hello,

Thanks for responding.

We fixed our problem by disabling PEP: we set the system property xmpp.pep.enabled to false.

We found some similar posts related to the “Empathy” IM client, and although we were not using Empathy, we did have some clients based on the “Telepathy” framework. We believe these make use of XEP-0163, the “Personal Eventing Protocol”.

Since disabling PEP, everything has been running well.
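
If anyone wants to set this from code rather than through the admin console’s System Properties page, here is a minimal sketch using Openfire’s JiveGlobals helper. It assumes the code runs inside Openfire’s JVM (for example from a custom plugin); the class and method names below are just for illustration:

```java
import org.jivesoftware.util.JiveGlobals;

// Minimal sketch: disable PEP by setting the same system property we
// changed via the admin console. Assumes this runs inside Openfire's
// JVM (e.g. from a custom plugin); "PepToggle" is an illustrative name only.
public class PepToggle {
    public static void disablePep() {
        JiveGlobals.setProperty("xmpp.pep.enabled", "false");
    }
}
```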

Hi Paulagf,

Thanks for reporting this bug, and the suspected cause. Something appears to be awfully wrong with the PEP module. We’re working on it as part of the other issue that you mentioned, OF-82.