Bug: Spark 2.0.8 - memory leak

We have been getting the error message “Insufficient system resources exist to complete the requested service” when trying to access shared network drives. All Internet sites also become inaccessible. This occurs after the computer has been left running for longer than 24 hours. Four computers in our company have been affected by this.

After diagnosing the problem with Microsoft PSS and submitting a full memory dump, they came back to me with the following:


We observe from the dump that the Spark software creates objects and fails to release them, so there is an object leak. This is probably the reason behind the insufficient-resources error message we are seeing.

Below are the findings from our dump:

Lookaside “nt!ObpCreateInfoLookasideList” @ 80553100 “ObCi”

Type = 0000 NonPagedPool

Current Depth = 0 Max Depth = 4

Size = 48 Max Alloc = 192

AllocateMisses = 1964522 FreeMisses = 0

TotalAllocates = 1964620 TotalFrees = 98

Hit Rate = 0% Hit Rate = 100%

Below are the processes which occupy excessive private bytes:

Total Private: 80646 ( 322584 Kb)

07f0 Spark.exe 11253 ( 45012 Kb)

043c svchost.exe 11150 ( 44600 Kb)

04ec avgrssvc.exe 10723 ( 42892 Kb)

0f14 firefox.exe 8126 ( 32504 Kb)


We have uninstalled Spark from the computer getting the error message most frequently. We are still waiting to see if the error appears again now that Spark has been removed.


I can’t believe that this is the fault of Spark, because it’s a Java application and Java has a built-in garbage collector. That means every object to which no reference points is deleted, so I can’t imagine how too much memory could be allocated.
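For what it’s worth, a garbage collector only reclaims objects that nothing references anymore, so a Java program can still leak if a long-lived structure keeps references alive. A minimal sketch (purely hypothetical, not Spark’s actual code):

```java
import java.util.ArrayList;
import java.util.List;

public class LeakDespiteGc {
    // A long-lived static list keeps every added object reachable,
    // so the garbage collector can never reclaim any of them.
    static final List<byte[]> CACHE = new ArrayList<>();

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) {
            CACHE.add(new byte[1024]); // roughly 1 MB retained in total
        }
        long used = Runtime.getRuntime().totalMemory()
                  - Runtime.getRuntime().freeMemory();
        System.out.println("retained entries: " + CACHE.size());
        System.out.println("heap in use (bytes): " + used);
    }
}
```

Forgetting to deregister listeners or letting caches grow without bound are the classic ways this happens in practice, GC or not.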

Hi,

Spark uses 50-60 MB, so I wonder how it can cause the trouble. It allocates about 50 MB right after startup, so there is no leak as you can see in “07f0 Spark.exe 11253 ( 45012 Kb)”.

If you want to track the memory usage, you may want to create a “Spark.vmoptions” file in the Spark folder with this content:

-Xloggc:logs/gc.log

-XX:+PrintGCDetails

If you want to make sure that Spark always allocates 64 MB, also add:

-Xms64m

-Xmx64m
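Putting the options above together, the complete Spark.vmoptions file would look like this (one JVM option per line; the logs/gc.log path assumes a logs folder exists next to the Spark executable):

```
-Xloggc:logs/gc.log
-XX:+PrintGCDetails
-Xms64m
-Xmx64m
```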

The JVM itself will also use some memory, but this shouldn’t be too much.

LG

I don’t think the problem was that the computer was running out of memory. Whenever I checked memory usage when the error occurred, it had plenty of free memory to spare.

From what I was told, what it was affecting was kernel memory.

From what I see, Spark stays between 30-55 MB when the roster list is open. Once I minimize the window, spark.exe drops down to 1.8-5 MB. I then initiated a conversation with a user and spark.exe went up to 40 MB. After sending a few files and doing a conference, spark.exe was up to 88 MB. If I minimize the main window (roster list), it seems like Spark does some garbage cleanup, because spark.exe then drops dramatically in memory usage, to between 4 and 25 MB.

My guess is the memory usage is GUI-related.

Hi,

the Windows Task Manager displays misleading memory values as soon as you minimize a Java application (Windows trims the process working set, which is the value Task Manager shows), so Spark also appears to use only 5 MB while it is still holding 50 MB.
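If you want a number that does not depend on window state, you can ask the JVM itself how much heap it is using. A minimal sketch using the standard java.lang.Runtime API:

```java
public class HeapUsage {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Heap the JVM actually holds, regardless of what the
        // Task Manager working-set column shows after minimizing.
        long usedBytes = rt.totalMemory() - rt.freeMemory();
        System.out.println("heap used: " + (usedBytes / (1024 * 1024)) + " MB");
        System.out.println("heap max:  " + (rt.maxMemory() / (1024 * 1024)) + " MB");
    }
}
```

With -Xms64m/-Xmx64m set as above, the “heap max” line should report around 64 MB.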

LG