Possible resource leak?

Just confirming that I did indeed kill the local profiles before installing the 604 beta build… so it’s not a profile/cached-resources thing.

Thank you, Serge, for the information. A time window like you described will certainly help narrow down the search for the offending commit! (Too bad Jira doesn’t have the GitHub “Blame” feature, lol!)

I noticed the announcement by rcollier regarding the memory leak in Smack 3.3.0 – and 3.3.0 was integrated into Spark at some point this year. Perhaps this has something to do with our problem?

It could be. It was released in May, and that KeepAliveManager looks very related. The only way to test this is probably to build a previous Smack version and build Spark with that.

UPD: or build Spark with this: http://bamboo.igniterealtime.org/artifact/SMACK-NIGHTLYBINDIST/shared/build-1105/Project-binary-files/smack_3_3_0.zip

Ok, I checked out the Smack source, reverted to revision 12965 (Smack version 3.2.1), and built the jars. I simply overwrote the Smack jars in the Spark/lib/ directory on my machine, as well as on a couple of other machines here, as a test. I have a version of SPARK-33 in my build, so the about box confirmed the 3.2.1 Smack version is indeed being used now. I guess we’ll know in a few days if this did the trick…
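If anyone wants to double-check which Smack build is actually being picked up (independent of the about box), a tiny check like this should work – just a sketch, assuming the Smack 3.x SmackConfiguration API:

```java
import org.jivesoftware.smack.SmackConfiguration;

public class SmackVersionCheck {
    public static void main(String[] args) {
        // Reads the version string from whichever smack.jar is on the
        // classpath - should print "3.2.1" after swapping in the old jars.
        System.out.println("Smack version: " + SmackConfiguration.getVersion());
    }
}
```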

For anyone else wanting to test – I attached the 3 jars you will need. Just replace (after backing up) the ones in your Spark installation directory’s lib folder.

The 3 jars you’ll need (attached, with sizes):

smack.jar (307100 Bytes)
smackx.jar (666602 Bytes)
smackx-debug.jar (56349 Bytes)

Looking good here.

The client on my station has been running for a little less than a day with the replacement jars, and it currently uses ~40 MB where it would have been around 200 MB before.
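For anyone curious how to sanity-check that number from inside the JVM, here’s a rough sketch – note these heap figures won’t exactly match the OS-level process size that Task Manager shows:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Approximate heap currently in use by this JVM. This is heap only;
        // the full process footprint seen by the OS will be higher.
        Runtime rt = Runtime.getRuntime();
        long usedBytes = rt.totalMemory() - rt.freeMemory();
        System.out.printf("Approx. heap in use: %d MB%n", usedBytes / (1024 * 1024));
    }
}
```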

At the risk of being premature, congrats on finding and fixing the issue.

Well, not actually a fix, but a downgrade. There can be other issues with the older Smack library. The real fix would be to wait for a fixed Smack library and update it in Spark.

It appears they are working on a 3.3.1 release, which hopefully will resolve the resource leak.

I’d say it’s a little too early to call this one definitive; however, I should know by Tuesday morning (Monday is a holiday for us) whether this issue is still present with these jars in place.

Actually, it does look like they have fixed this in the Smack nightlies. If my test with these 3.2.1 jars works over the weekend, then I’ll try out their nightlies for a few days and confirm there as well.

So, any verdict yet? Old or new? Thanks!

Well, I’m going to say this is confirmed… the bad Smack resource leak was the issue. The systems I set up with the 3.2.1 Smack jars are still running normally…

Ok, I’ll set up a ticket in Jira to integrate the latest Smack library after they release their next version (thinking it’s 3.3.1). wroot (or someone from the Smack team), do you know when the Smack team may be gearing up for the next release?

Ok, it’s logged in Jira: SPARK-1550

I’ll try to keep an eye out for when the Smack team releases the next version… until then, users can use the jars I posted above (Smack version 3.2.1) in their custom builds. I believe 3.2.x is what shipped in Spark 2.6.3, so it’s basically no different from the off-the-shelf release of Spark.

Thanks for all the help in tracking this down, guys! (It was a pretty frustrating issue, in my office at least.)

I don’t know when the next Smack release could be. We should ask Robin Collier, but he is a busy man, so it could take a long time. Though we are not experiencing the same issue, some users have issues with Spark freezing, so I would like to have the memory leak fixed. What about using a beta of Smack for the current beta builds of Spark, and updating it once the Smack final is released?

A beta would need to be tested before pulling it into Spark TRUNK, imho.

I just pulled the latest smack_3_3_1 branch of Smack and compiled it… the version still says 3.3.0, but you can see in the tree that it’s well beyond that in commits now… I’m going to test this out on my machine for the next few days and see if the issue is truly resolved in the betas.

For anyone else who cares to test the beta Smack lib – I’m attaching the compiled jars (from branches/smack_3_3_1 on Sept 6, 2013 – revision 13713).

Same deal as above:

Just replace (after backing up) the ones in your Spark installation directory’s lib folder.

The 3 jars you’ll need (attached, with sizes):

smack.jar (349160 Bytes)
smackx.jar (706300 Bytes)
smackx-debug.jar (56756 Bytes)

Someone should have PM’ed me; I don’t tend to read the Spark-related discussions. This has been fixed for a while and has been in the nightlies for a couple of months. I will try to review the contributions in the current 3.3.1 branch this week and see if I can push out a release in the next week or so.

That being said, there is a trivial workaround for this bug as well, which is mentioned in the announcement.

Thanks for the input, rcollier. It took a while before it was discovered that Smack was the culprit :stuck_out_tongue_winking_eye:

I can confirm that the beta Smack (revision 13713) is indeed fixed, as I’ve been running it for a few days in-house without incident. I think it would be best to have Spark use a released version of Smack instead of a nightly, or to modify it per the announcement. But if you determine it will be a bit before you can release, I think we’ll just pull the latest nightly of Smack into Spark TRUNK for now.

Agreed on the release for Spark. I will try to get a release out in the next week. If that doesn’t happen, then my suggestion is to put the workaround in Spark as opposed to using the nightly build.

Just to confirm: from the announcement, it appears the trivial fix is to add a missing method to the KeepAliveManager Smack class. So for Spark, we’d have to pull in a customized 3.3.0 release jar to fix this until the next Smack release… right?
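I haven’t diffed the actual commit, so treat the following as a purely hypothetical sketch of the shape of that missing method – a cleanup callback that releases the per-connection manager when the connection closes. The registry name and value type here are my own stand-ins, not the real Smack internals:

```java
import java.util.HashMap;
import java.util.Map;

import org.jivesoftware.smack.Connection;
import org.jivesoftware.smack.ConnectionListener;

// Hypothetical illustration only - not the actual Smack 3.3.1 patch.
// Shows the leak pattern: a static per-connection registry that nothing
// removes entries from, plus the missing-cleanup style of fix.
public class KeepAliveLeakSketch {

    // Stand-in for KeepAliveManager's per-connection instance map.
    private static final Map<Connection, Object> INSTANCES =
            new HashMap<Connection, Object>();

    static void register(final Connection connection, Object manager) {
        INSTANCES.put(connection, manager);
        connection.addConnectionListener(new ConnectionListener() {
            // The fix, in spirit: drop the strong reference on close so the
            // Connection (and everything it references) can be collected.
            public void connectionClosed() {
                INSTANCES.remove(connection);
            }
            public void connectionClosedOnError(Exception e) {
                connectionClosed();
            }
            public void reconnectingIn(int seconds) { }
            public void reconnectionSuccessful() { }
            public void reconnectionFailed(Exception e) { }
        });
    }
}
```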

I’m wondering how many people pull from the Spark TRUNK per week – whether it’s worth adding the fix to Spark now and updating it once the next Smack is released, or whether we should just wait. I’d hate for new contributors or users to pull the trunk and wind up burned… this was quite the headache at my company… but if it’s not many people (assuming most users use the released Spark version), then maybe we can just wait it out.

Thoughts, anyone?

I am pretty confused now. I’m not an expert developer, so I was wondering if someone could spell it out for me. Is a release a new version of what is currently posted? Sorry I’m not much help here, but I’ve been trying to resolve this problem for quite some time now. Thanks!

@Vinh – we’re talking about waiting for the next Smack library release, since we don’t necessarily want to pull a beta/nightly build of that lib into Spark’s TRUNK – a beta/nightly is not vetted/tested as much and may change between now and when the next Smack lib version is released. The next Smack version will be 3.3.1 if I’m not mistaken; the current release of Smack is 3.3.0.

The jars I posted most recently are to be considered beta/nightly builds, since I took them from the development branch of Smack and compiled them myself. They work well in my testing and do resolve the issue, but they may or may not end up being the same code that finally ships in the 3.3.1 Smack lib. For that reason, you are welcome to use those jars until we integrate the next Smack lib release (after rcollier is able to finalize things and push out the Smack 3.3.1 release).

To use them in a fresh build, you can simply drop those jars into your /build/lib/dist directory prior to compiling your custom Spark. To use them in an existing install of Spark, you can replace the ones in your Spark installation’s lib/ directory and restart Spark.

Hope that doesn’t confuse things more! lol