We are evaluating the Openfire pubsub module as the core of our solution, and I need to find out pubsub's upper capacity. Does the DOM persist to a database, or is it loaded in memory?
Information is stored in the DB. Creating nodes or subscribing to nodes is immediately persisted to the DB. However, for performance reasons, newly published items and item deletions are queued in memory for 2 minutes and later flushed to the DB in a batch.
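For reference, each of those queued publications arrives as a standard XEP-0060 publish IQ like the sketch below (the JIDs, node name, item id and payload are made up for illustration):

```xml
<iq type='set' from='client@example.com/desk' to='pubsub.example.com' id='publish1'>
  <pubsub xmlns='http://jabber.org/protocol/pubsub'>
    <!-- the published item is acked right away but written to the DB later -->
    <publish node='mynode'>
      <item id='item1'>
        <entry xmlns='http://www.w3.org/2005/Atom'>
          <title>example payload</title>
        </entry>
      </item>
    </publish>
  </pubsub>
</iq>
```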
What's the maximum number of nodes it could handle? Does anyone have performance benchmarks we could look at?
We are currently not running any heavy load tests on the pubsub module. However, when we implemented it we took performance and memory footprint into consideration and did some profiling. If you are seeing performance issues I would be happy to optimize the server; all I would need is the load tests that you are running.
Nodes are not memory-eaters, so the number of nodes you can keep in memory should be pretty high. Besides nodes, you will also have to count subscriptions to nodes and published items. Subscribing to a collection node high in the hierarchy and listening for events published to any child (no matter the depth) is the most expensive operation you will find in pubsub.
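That expensive case corresponds to a XEP-0060 collection subscription configured with unlimited depth, roughly like this sketch (the node name and JIDs are made up; the form fields are from the spec's subscribe_options form):

```xml
<iq type='set' from='client@example.com/desk' to='pubsub.example.com' id='sub1'>
  <pubsub xmlns='http://jabber.org/protocol/pubsub'>
    <subscribe node='root-collection' jid='client@example.com'/>
    <options>
      <x xmlns='jabber:x:data' type='submit'>
        <field var='FORM_TYPE' type='hidden'>
          <value>http://jabber.org/protocol/pubsub#subscribe_options</value>
        </field>
        <!-- receive the published items themselves, not just node notifications -->
        <field var='pubsub#subscription_type'><value>items</value></field>
        <!-- 'all' means events from every descendant leaf, at any depth -->
        <field var='pubsub#subscription_depth'><value>all</value></field>
      </x>
    </options>
  </pubsub>
</iq>
```

With `subscription_depth` set to `all`, every publish anywhere under the collection has to be matched against this subscription, which is why it is the costliest pattern to support.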