I have a Wildfire server deployment currently running on version 2.6.2. There are 350 registered users, and with 19 users currently connected I am seeing that the Wildfire server is using 337.31 MB of 506.31 MB (66.6%) of memory.
The server is running on the embedded database at the moment, which is probably not ideal - is this the cause of the high memory usage?
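For context on where figures like "337.31 MB of 506.31 MB" come from: numbers of this shape can be derived from the JVM's own heap accounting via `java.lang.Runtime`. This is a minimal sketch of that calculation, under the assumption that the admin console reports standard JVM heap figures (the class name `HeapUsage` is just for illustration):

```java
public class HeapUsage {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long total = rt.totalMemory();   // heap currently reserved by the JVM
        long free = rt.freeMemory();     // unused portion of that reserved heap
        long used = total - free;        // memory actually occupied by objects
        double pct = 100.0 * used / total;
        // Prints a line in the same style as the admin console figure.
        System.out.printf("%.2f MB of %.2f MB (%.1f%%) used%n",
                used / (1024.0 * 1024), total / (1024.0 * 1024), pct);
    }
}
```

Note that `totalMemory()` is what the JVM has reserved so far, not the `-Xmx` ceiling (`maxMemory()`), so "total" can itself grow as load increases.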
The server runs on a Linux Fedora Core 2 system and was installed using the RPM from this site, which uses the Java HotSpot™ Server VM version 1.5.0_06 from Sun Microsystems Inc.
The server is started with the following options -
Hi, I know everyone is busy with the new release etc., but is my situation typical of a Wildfire install?
This is a public Wildfire installation; there are now 400 registered users and 30 concurrent connections, and I'm seeing upwards of 350 MB of memory being used. Is this to be expected?
There is nothing to worry about; that is how Java applications usually behave. Java is responsible for collecting garbage (i.e. objects that are no longer used) and cleaning up memory. The garbage collector balances CPU consumption against the amount of free memory: while there is still space in the heap it performs cheap incremental collections and postpones the heavy work, but once memory is nearly exhausted it runs a full GC and reclaims all the unreachable objects.
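That "memory climbs until a collection runs" behaviour can be observed directly. This sketch creates a burst of short-lived objects, the way a busy server would, then requests a collection and compares heap usage before and after (`System.gc()` is only a hint to the JVM, so the exact numbers will vary from run to run):

```java
public class GcDemo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Allocate a burst of objects that become unreachable immediately,
        // simulating the short-lived garbage a server produces under load.
        for (int i = 0; i < 100_000; i++) {
            byte[] garbage = new byte[1024];
        }
        long usedBefore = rt.totalMemory() - rt.freeMemory();
        System.gc();  // request a full collection (a hint, not a guarantee)
        long usedAfter = rt.totalMemory() - rt.freeMemory();
        System.out.println("used before GC: " + usedBefore / 1024 + " KB");
        System.out.println("used after  GC: " + usedAfter / 1024 + " KB");
    }
}
```

Typically the "after" figure drops sharply, which is exactly the sawtooth pattern you see in the admin console graph: usage climbs, a collection runs, usage falls.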
This means that if you decrease the amount of memory you give to Wildfire, the garbage collector will probably run full GCs more often in order to release memory and make room for new objects.
However, if you see that memory consumption keeps climbing and nothing is released even after a full GC, then there is a memory leak. As of Wildfire 2.6.2 there are no known memory leaks, so you should be fine.
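A simple way to apply that leak test is to force a full collection a few times and watch the post-GC baseline. This is a hedged sketch of the idea (the class name `LeakCheck` is illustrative; in practice you would watch the admin console graph or a profiler rather than run code like this inside the server):

```java
public class LeakCheck {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        for (int round = 1; round <= 3; round++) {
            System.gc();        // hint the JVM to run a full collection
            Thread.sleep(200);  // give the collector a moment to finish
            long used = rt.totalMemory() - rt.freeMemory();
            System.out.println("baseline after full GC, round " + round + ": "
                    + used / (1024 * 1024) + " MB");
        }
        // A baseline that rises steadily across rounds (under steady load)
        // suggests a leak; a flat baseline means memory is being reclaimed.
    }
}
```

The key point is to compare usage *after* full collections: the peaks of the sawtooth are normal, and only a rising floor indicates a real problem.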