Spark Memory Usage - minimize to reduce Windows footprint

I am evaluating Spark and OpenFire as a solution for IM for our company. I prefer it to IBM’s Sametime and it sure beats the stuff coming out of Redmond!

I need to decide on an IM client, and Spark is a natural choice. My problem is that the default Windows footprint can be as much as 60MB!! I say “default” because, with a little mouse magic, it can be reduced to about 1.5MB.

Here’s what I mean:

  • Open the main Spark Dialog

  • Open the Task Manager and see the memory usage of Spark

  • Do some stuff in Spark; watch memory usage climb

  • Click the ‘X’ to “hide”; memory usage stays huge

  • Open again and now click “Minimize”; memory usage drops to a respectable 4-5MB

  • Right-click the TaskBar icon (NOT the tray icon) and select “Close”

  • Check Task Manager and you’ll see that Spark is now behaving itself

This assumes the program is set to hide when the “close” button is pressed, which I think is the default.

Of course, when I showed this to the IT manager at my company, he was not happy about the 50MB sucked up by the program when idling, and he correctly pointed out that his 1000 users are not about to go through this process every time they want to hide their chat program. “Find a smaller chat program” was his comment.

This behavior is a result of a combination of how Windows handles windows and maybe a little bit of lazy programming. When an application is “sent to the tool tray” (notification area), that’s not actually what happens. What really happens is that the main application window is simply hidden. It’s still there, doing whatever it was doing; just hidden from view. The user gets the illusion that the application is popping in and out of the tray. Since a “hide” operation does not really do anything to the window, the memory consumption stays the same. If, however, you first minimize a window and then “hide” it, you get the benefit of Windows memory management releasing resources first.

So, my question is: Is it possible in a future release to “minimize” first and then “hide” when the “close” button is pressed?

I know this is possible since, as a programmer, I have been doing this for years with “tray-based” apps, so I know it’s a trivial change, and even Java-based apps can be written that way. This would reduce the footprint by a factor of 10.
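
For the Swing case, the change I’m asking for can be sketched roughly like this. The class and method names here are mine, not Spark’s, and I’m assuming the main window is an ordinary JFrame:

```java
import java.awt.Frame;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;
import javax.swing.JFrame;

public class MinimizeThenHide {
    // Instead of hiding directly on "close", iconify first so Windows can
    // trim the window's working set, then hide it to finish the tray effect.
    static void sendToTray(JFrame frame) {
        frame.setExtendedState(Frame.ICONIFIED); // minimize: lets the OS release resources
        frame.setVisible(false);                 // then hide, as the close button already does
    }

    // Hook the close button so it runs the minimize-then-hide sequence
    // instead of the default hide-on-close behavior.
    static void install(final JFrame frame) {
        frame.setDefaultCloseOperation(JFrame.DO_NOTHING_ON_CLOSE);
        frame.addWindowListener(new WindowAdapter() {
            @Override public void windowClosing(WindowEvent e) {
                sendToTray(frame);
            }
        });
    }
}
```

Calling install(frame) once at startup is all it would take; the iconify step is what gives Windows the chance to trim the working set before the window disappears.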

Sorry for the long-winded post but I wanted to provide the technical information as well.

Okay, so, I decided to do some comparisons of programs that were running on my PC:

Internet Explorer = 76 MB

Windows Media Player = 35 MB

Dreamweaver = 31 MB

Netscape = 444 MB (although that is with 80 tabs open :stuck_out_tongue: )

MSN Messenger = 35 MB

Spark = 45 MB

In comparison, Spark doesn’t really use the amount of memory that people seem to be calling “too much”. If Spark used 80+ MB, I could understand. Most systems these days have at least 512 MB, so a system should be able to run Spark fine.

I agree that Spark is not bad by today’s standards (although somehow bloatware has become the standard), but that was not really my question.

It’s obvious that Spark CAN be convinced to take up a very small amount of memory; it’s only a matter of allowing it to do so. Read the part of my post about minimizing and then closing. All I am asking is whether there are any development plans to do this. We’re talking about two lines of Java code or three lines of C++ code at most.

Unfortunately, it’s not that simple. It’s obvious upon minimizing that there is a memory leak right there, and the fact that these memory usages are flying about as much as they are shows there is a problem somewhere. This will need to be investigated further.

I am afraid I am not sure what you mean by “memory usages are flying about” and how that somehow indicates a memory leak.

Computer programs allocate and deallocate memory as needed. Java apps rely on the virtual machine to perform garbage collection, which is more or less the same thing.
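
To make that concrete: from inside a Java app you can only see the heap the VM manages, which is a fraction of what Task Manager reports (Task Manager also counts the VM itself plus native allocations). A small sketch, nothing Spark-specific:

```java
public class MemReport {
    // Rough snapshot of JVM heap usage in megabytes. Runtime reports the
    // Java heap only, so Task Manager's number will always be larger.
    static long usedMb() {
        Runtime rt = Runtime.getRuntime();
        return (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.gc(); // a hint only; the VM is free to ignore it
        System.out.println("Heap in use: " + usedMb() + " MB");
    }
}
```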

When a Win32 window is minimized, a WM_SIZE message is sent to the window, and the operating system (Windows) does a bunch of “stuff”, including releasing unused resources for that window, before clearing it from the screen. With a “hide” operation, the window is hidden via SW_HIDE, and when the window receives the resulting message the operating system does very little other than remove the window and turn off painting for it.

To all you programmers out there: I realize I am grossly oversimplifying, and a Java app (as I believe Spark is) is a little more complicated and may even use polling instead of messaging, but the end result is more or less the same at the OS level.

My point was that the expected result of a minimize operation is to release resources, and any program you minimize will consume less memory in its minimized state.
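
The same distinction is even visible from inside a Java program: minimizing fires a WINDOW_ICONIFIED event (the OS has actually changed the window’s state), while setVisible(false) only fires COMPONENT_HIDDEN (the window just stops being painted). A small illustration, not taken from Spark’s code:

```java
import java.awt.event.ComponentAdapter;
import java.awt.event.ComponentEvent;
import java.awt.event.WindowAdapter;
import java.awt.event.WindowEvent;

public class EventDistinction {
    static String lastEvent = "none";

    // Fired when the user minimizes the window: the OS has resized it away.
    static WindowAdapter onIconify = new WindowAdapter() {
        @Override public void windowIconified(WindowEvent e) { lastEvent = "iconified"; }
    };

    // Fired on setVisible(false): the window is merely no longer painted.
    static ComponentAdapter onHide = new ComponentAdapter() {
        @Override public void componentHidden(ComponentEvent e) { lastEvent = "hidden"; }
    };
}
```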

Anyway, we’re getting a little off topic. I hope someone on the Spark development team sees this post sometime and adds the few lines of code needed to fix this. I downloaded the source to Pidgin (aka GAIM), made the same change, and its “iconized” memory usage dropped from 21MB to 2MB. Is there a community version of Spark that people can modify and compile?

To people who manage thousands of computers, this stuff really is important, so I understand the reservations my company has about using Spark until this is fixed.

What I meant by memory leak is that, upon minimizing, you can clearly see the memory usage drop and then start climbing again; that’s a leak. It’s a problem because it could keep growing until there is no more memory to allocate.

Well, sometimes memory usage does decrease just by hitting the X button (I have just tried that, and it decreased by 50 MB). What I’m concerned about is that Spark becomes slow when you untray it after a long idle (even while not releasing resources). It could become even more of a pain if it first has to take back some system resources. Overall, it’s not an ideal client for a 1000-PC environment (with varying resources). So I do suggest trying more clients.