BOSH client out of memory issue causes Openfire to consume all server resources

I’m currently using Openfire 4.1.6 and Java 8 update 162 on Windows. I’ve created a JMeter load test, which I can share, that simply connects using BOSH, logs in, sets its presence, waits and then disconnects. I gave the test only a small amount of memory by setting JVM_ARGS="-Xms128m -Xmx128m". After a few clients connect, JMeter runs out of memory. This then causes the Openfire server to start consuming CPU and an increasing amount of memory, beyond the initial JVM allocation. This happens reproducibly.
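For clarity, each simulated client in the test does roughly the equivalent of the following. This is only a minimal sketch of the BOSH connect/login/presence/disconnect cycle, not the actual test code: it assumes Smack 4.2+ with the smack-bosh module, and the host, domain and credentials are placeholders.

import org.jivesoftware.smack.bosh.BOSHConfiguration;
import org.jivesoftware.smack.bosh.XMPPBOSHConnection;
import org.jivesoftware.smack.packet.Presence;

public class BoshClientCycle {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; 7070 is Openfire's default unencrypted BOSH port.
        BOSHConfiguration config = BOSHConfiguration.builder()
                .setUseHttps(false)
                .setHost("openfire.example.com")
                .setPort(7070)
                .setFile("/http-bind/")
                .setXmppDomain("example.com")
                .setUsernameAndPassword("testuser", "testpassword")
                .build();

        XMPPBOSHConnection connection = new XMPPBOSHConnection(config);
        connection.connect();
        connection.login();

        // Set presence, wait a while, then disconnect - mirroring the JMeter test steps.
        connection.sendStanza(new Presence(Presence.Type.available));
        Thread.sleep(30_000);
        connection.disconnect();
    }
}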

I see the following error repeated multiple times in the logs:

[Jetty-QTP-BOSH-45209]: org.jivesoftware.openfire.http.HttpBindServlet - Error when setting read listener
java.lang.IllegalStateException: state=EOF
	at org.eclipse.jetty.server.HttpInput.setReadListener(HttpInput.java:376)
	at org.jivesoftware.openfire.http.HttpBindServlet.doPost(HttpBindServlet.java:153)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
	at org.jivesoftware.openfire.http.HttpBindServlet.service(HttpBindServlet.java:116)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
	at org.eclipse.jetty.server.Server.handleAsync(Server.java:549)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:348)
	at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:262)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
	at java.lang.Thread.run(Unknown Source)

I can see similar issues in the past (here and here). I’ll try updating Openfire to 4.2.2 and report back. Do you have any further suggestions on how to resolve this?


I’ve updated to Openfire 4.2.2 and can still cause this issue to occur within a few seconds:

[Jetty-QTP-BOSH-76]: org.jivesoftware.openfire.http.HttpBindServlet - Error when setting read listener
java.lang.IllegalStateException: state=EOF
	at org.eclipse.jetty.server.HttpInput.setReadListener(HttpInput.java:376)
	at org.jivesoftware.openfire.http.HttpBindServlet.doPost(HttpBindServlet.java:150)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
	at org.jivesoftware.openfire.http.HttpBindServlet.service(HttpBindServlet.java:113)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:221)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
	at org.eclipse.jetty.server.Server.handleAsync(Server.java:549)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:348)
	at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:262)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
	at java.lang.Thread.run(Unknown Source)

I’ve attached the JMeter test that I mentioned. It requires defining the serveraddress, username and password. It requires the BlazeMeter XMPP plugin for JMeter, and I’m using JMeter 4.0.
Openfire XMPP over BOSH Load Test.jmx (15.6 KB)

I performed the same test using a direct XMPP connection and the problem did not occur.

I’ve seen a lot of these errors too, but they never caused any issues besides polluting the log files.

It may be that the error logged isn’t related, but it is all that I can see being added to the logs.

I’ve set Openfire to use -Xms768m -Xmx768m -Xmn512m. When I run the test with XMPP over BOSH, the Openfire service consumes almost all of the CPU, and its memory usage keeps growing to over 8 GB, far beyond the allocation that I’ve set the JVM to use.

Have you tried hooking up a profiler (like JProfiler)? That might give you an indication of where that memory and/or CPU is being consumed.
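If attaching a profiler to that machine is awkward, a heap dump of the Openfire process can also be captured and analysed offline (for example in Eclipse Memory Analyzer). Here is a minimal sketch, assuming a HotSpot JVM; the same dump can be taken from the command line with jmap -dump:live,format=b,file=openfire.hprof <pid>.

import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapDumper {
    public static void main(String[] args) throws Exception {
        // Obtains the HotSpot diagnostic bean of the current JVM. To dump the Openfire
        // service itself, run this inside that JVM (e.g. from a plugin) or connect
        // to it over JMX instead of using the local platform MBean server.
        HotSpotDiagnosticMXBean diagnostics = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);

        // Writes a binary .hprof snapshot; "true" limits it to live (reachable) objects.
        diagnostics.dumpHeap("openfire-heap.hprof", true);
    }
}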

I’m having trouble figuring out how the memory allocation can go (much) beyond Xmx. I’d think that the JVM would prevent that from happening. I do know that Xmx does not cover the non-heap part of memory, but that’s typically pretty small.
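To narrow down where growth beyond Xmx could come from, it may also help to poll the JVM’s own memory counters while the test runs. Heap and metaspace are capped, but direct (NIO) buffers, which Jetty allocates for its connectors, live outside the heap and are reported by the BufferPoolMXBeans instead. A rough sketch using the standard java.lang.management API (run it inside the server JVM, or adapt it to read the same beans over JMX):

import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.util.List;

public class MemoryWatcher {
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        List<BufferPoolMXBean> pools = ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class);

        while (true) {
            // Heap is bounded by -Xmx; non-heap covers metaspace and the code cache.
            System.out.printf("heap used=%,d committed=%,d%n",
                    memory.getHeapMemoryUsage().getUsed(),
                    memory.getHeapMemoryUsage().getCommitted());
            System.out.printf("non-heap used=%,d committed=%,d%n",
                    memory.getNonHeapMemoryUsage().getUsed(),
                    memory.getNonHeapMemoryUsage().getCommitted());

            // The "direct" and "mapped" buffer pools are not limited by -Xmx.
            for (BufferPoolMXBean pool : pools) {
                System.out.printf("buffer pool %s: count=%d memoryUsed=%,d%n",
                        pool.getName(), pool.getCount(), pool.getMemoryUsed());
            }
            Thread.sleep(5_000);
        }
    }
}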

@guus No, I haven’t tried a profiler. I’ll give that a try on Monday.

Yes, I was very surprised to see that it was able to consume so much memory, and also that the client running out of memory triggered this issue on the server.

Strangely enough, when I tried to reproduce it yesterday I was unable to, even though I reproduced it many times (>10) last week, directly after starting the service. What I have noticed is that the instance of Openfire that has only been allowing XMPP has around 1095 handles with a peak of 1168, whereas the instance that was using BOSH has 239266 handles with a peak of 334814, even though they should both be experiencing a similar amount of usage. Perhaps there is a resource leak when using BOSH, which might explain why the memory is expanding beyond the JVM’s configured limits.

Unfortunately I need to move on to other tasks, but I’ll post updates here when I’m able to look into this again.