Spark Memory Settings

To limit Spark's memory usage one can pass parameters to the JVM; if the values are too low, these may lead to OutOfMemoryError. Such parameters only limit the Java heap and the PermGen size, not the native heap of javaw.exe, which manages the threads, the garbage collector, etc. within the JVM.

See Spark JVM Settings for a general description of how to add Java system properties to Spark. For example, one could add:
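The original example appears to have been lost here; a plausible Spark.vmoptions, one JVM option per line (the 64 MB figure is an assumption based on the discussion below):

```
-Xms64m
-Xmx64m
-XX:PermSize=12m
-XX:MaxPermSize=12m
```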

But what would be a good value? 64 MB is a lot, and I don't think Spark really needs that much. I don't want to compile it, share it on our network, and then find out that in some cases the memory is not enough.

I have put Spark.vmoptions under Ubuntu Jaunty in /usr/bin and also in /usr/share/spark/bin, but Spark still grows to 180 MB.
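One way to check whether the JVM is picking up the -Xmx limit at all is a tiny test class (my own sketch, not part of Spark) that prints the heap ceiling the running JVM actually applied:

```java
// MaxHeapCheck.java — reports the maximum heap the JVM will use.
public class MaxHeapCheck {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}
```

Launched as `java -Xmx56m MaxHeapCheck`, it should report a value close to 56 MB; if it reports the JVM default instead, the options are not reaching the JVM. Note also that top's RES figure includes the JVM's own native memory, so it will always sit above the -Xmx value.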

What is wrong?

I have Java and Spark 2.6 beta 2.

Thanks a lot,


I have edited the file /usr/bin/spark to:


/usr/lib/jvm/java-6-sun/jre/bin/java -Xms56m -Xmx56m -XX:PermSize=12m -XX:MaxPermSize=12m -Dappdir=$SPARKDIR -cp $SPARKDIR/lib/linux/jmf.jar:$SPARKDIR/lib/startup.jar:$SPARKDIR/lib/linux/jdic.jar:$SPARKDIR/resources org.jivesoftware.launcher.Startup

(Note: the JVM options must come before the class name; placed after org.jivesoftware.launcher.Startup, as in the original command, they are passed to the application as arguments and ignored by the JVM.)

But I still see in top:

Mem: 3999640k total, 1156016k used, 2843624k free, 51600k buffers
Swap: 5863684k total, 0k used, 5863684k free, 379708k cached

PID  USER     PR NI VIRT  RES  SHR S %CPU %MEM TIME+   COMMAND
8406 root     20 0 1401m 235m 10m S 0 6.0 0:12.35 java
8807 aanibald 20 0 1421m 169m 11m S 0 4.3 0:08.10 java

This server has two users running Spark; one is consuming 235 MB of resident memory and the other about 169 MB.

How can I reduce this?

I am trying to deploy an LTSP server, so 200 MB of memory is too much for a single application.

Thanks a lot,