
Java running out of memory, crashing server

I’ve got a fresh install of Openfire that’s erratically running out of memory and crashing. Four scenarios:

  1. Default install with 64 MB of Java heap

  2. 512 MB of Java heap

  3. Min 256 MB / max 512 MB of Java heap

  4. Min 512 MB / max 1024 MB

In all cases, the system will begin using memory until Java fails. I believe a client connection is causing this error, as this is the exact reason I just upgraded from jabberd2. However, I’m trying to find a way to track this. I would like to correlate a particular user authentication (LDAP in this case) against the time the memory began to be eaten, but as far as I can tell, logs for that do not exist.

Can anyone recommend anything that might be helpful?

Thanks

Tom Rodriguez

What client? Are you restricting simultaneous logons? Do you have openfire debug logs enabled? How many users?

Clients: Pidgin, Adium, Pandion, Trillian

Logins are not restricted

Debugs were enabled, but I was unable to find specific mention of usernames/source IPs

Approximately 300 users.

Thanks

Tom Rodriguez

It may not be an error with any one user but instead a process. Is there a repeating error on the server? How often does the server error out?

When it was failing, it was within maybe 5 minutes of the server coming up. It’s fairly stable now, but I don’t know for how long, or if it’s going to remain that way.

This is the only message I can see in the error log that appears to repeat at the time of the last outage:

2008.03.03 12:17:01 org.jivesoftware.openfire.handler.IQvCardHandler.handleIQ(IQvCardHandler.java:91)

java.lang.UnsupportedOperationException: VCard provider is read-only.
    at org.jivesoftware.openfire.vcard.VCardManager.setVCard(VCardManager.java:124)
    at org.jivesoftware.openfire.handler.IQvCardHandler.handleIQ(IQvCardHandler.java:82)
    at org.jivesoftware.openfire.handler.IQHandler.process(IQHandler.java:48)
    at org.jivesoftware.openfire.IQRouter.handle(IQRouter.java:348)
    at org.jivesoftware.openfire.IQRouter.route(IQRouter.java:100)
    at org.jivesoftware.openfire.spi.PacketRouterImpl.route(PacketRouterImpl.java:67)
    at org.jivesoftware.openfire.net.StanzaHandler.processIQ(StanzaHandler.java:303)
    at org.jivesoftware.openfire.net.ClientStanzaHandler.processIQ(ClientStanzaHandler.java:78)
    at org.jivesoftware.openfire.net.StanzaHandler.process(StanzaHandler.java:268)
    at org.jivesoftware.openfire.net.StanzaHandler.process(StanzaHandler.java:167)
    at org.jivesoftware.openfire.nio.ConnectionHandler.messageReceived(ConnectionHandler.java:132)
    at org.apache.mina.common.support.AbstractIoFilterChain$TailFilter.messageReceived(AbstractIoFilterChain.java:570)
    at org.apache.mina.common.support.AbstractIoFilterChain.callNextMessageReceived(AbstractIoFilterChain.java:299)
    at org.apache.mina.common.support.AbstractIoFilterChain.access$1100(AbstractIoFilterChain.java:53)
    at org.apache.mina.common.support.AbstractIoFilterChain$EntryImpl$1.messageReceived(AbstractIoFilterChain.java:648)
    at org.apache.mina.filter.codec.support.SimpleProtocolDecoderOutput.flush(SimpleProtocolDecoderOutput.java:58)
    at org.apache.mina.filter.codec.ProtocolCodecFilter.messageReceived(ProtocolCodecFilter.java:173)
    at org.apache.mina.common.support.AbstractIoFilterChain.callNextMessageReceived(AbstractIoFilterChain.java:299)
    at org.apache.mina.common.support.AbstractIoFilterChain.access$1100(AbstractIoFilterChain.java:53)
    at org.apache.mina.common.support.AbstractIoFilterChain$EntryImpl$1.messageReceived(AbstractIoFilterChain.java:648)
    at org.apache.mina.filter.executor.ExecutorFilter.processEvent(ExecutorFilter.java:239)
    at org.apache.mina.filter.executor.ExecutorFilter$ProcessEventsRunnable.run(ExecutorFilter.java:283)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at org.apache.mina.util.NamePreservingRunnable.run(NamePreservingRunnable.java:51)
    at java.lang.Thread.run(Unknown Source)

This was being logged once every few seconds.

None of the other logs seem like they have any issues with massively repeated messages at that time.

Hey Tom,

You should add -XX:+HeapDumpOnOutOfMemoryError as a java parameter when launching the server. With that extra parameter the Java Virtual Machine will create a heap dump when it runs out of memory. You can send me that heap dump file (once compressed) so I can analyze it. You can also analyze it using jhat. BTW, which version of the server are you using?
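For example, the option could be added wherever the startup script assembles the JVM arguments. The variable name and file location below are assumptions; where the options actually live depends on how Openfire was installed:

```shell
# Hypothetical example: an RPM install might read /etc/sysconfig/openfire.
# The heap sizes are just the values discussed earlier in the thread.
OPENFIRE_OPTS="-Xms256m -Xmx512m -XX:+HeapDumpOnOutOfMemoryError"
```

The resulting .hprof file can then be opened with jhat (bundled with the JDK), which serves a browsable view of the heap over HTTP on port 7000 by default.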

Regards,

– Gato

Version is Openfire 3.4.5.

I’ll add that parameter when I restart the system, and I’ll provide a log to you, should this happen again.

Thanks

Tom Rodriguez

It crashed again, I added the parameter, and now it’s in the process of crashing a second time. Where will it put the dump file, and will it actually create a dump if the Java process hangs indefinitely?

Thanks

Tom Rodriguez

Hey Tom,

It’s going to write the heap dump to the folder [openfire home]/bin. I think a single heap dump will be generated when the JVM hits an OOM. The file name will start with something like “java”.
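If the dump location matters, the companion flag -XX:HeapDumpPath can pin it to a known directory. A sketch (the OPENFIRE_OPTS variable name is an assumption; where the JVM options live depends on the install):

```shell
# Direct the JVM to write OOM heap dumps to a known directory.
OPENFIRE_OPTS="$OPENFIRE_OPTS -XX:HeapDumpPath=/opt/openfire/logs"
```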

Thanks,

– Gato

Looks like it actually put it in /opt/openfire/logs

However, the dump is huge. I’m restarting the server with the default 64 megs of heap memory to try and get a smaller dump.

Hey Tom,

64 MB is going to be too small, and the server may run OOM for some other reason. Try compressing the file; it should shrink a lot.

Thanks,

– Gato

I’ll re-run it at 256.

The 500 MB dump is well over 30 megs compressed, and it’s still not done bzipping. I can send you this file, but do you have a place I can put it, or do you want me to split it and send it over email?
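For what it’s worth, the dump can be compressed and chopped into mail-sized pieces with standard tools. The dump filename below is made up:

```shell
# Heap dumps compress well: they contain lots of repeated object headers.
bzip2 -9 java_pid1234.hprof                    # produces java_pid1234.hprof.bz2
# Cut the archive into 10 MB chunks that fit through most mail servers.
split -b 10m java_pid1234.hprof.bz2 heapdump.part.
# The receiver reassembles it with:
#   cat heapdump.part.* > java_pid1234.hprof.bz2
```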

Hey Tom,

Send me an email at gaston at jivesoftware dot com and I will give you an FTP server where you can upload the file. 30 MB is not much.

Thanks,

– Gato