Spark Limit Memory under Linux

I have put Spark.vmoptions under Ubuntu Jaunty in /usr/bin and also in /usr/share/spark/bin, but Spark keeps taking too much memory.

I have Java 1.6.0_13 and Spark 2.6 beta2.

Also, I have edited the file /usr/bin/spark to:

#!/bin/bash
SPARKDIR=/usr/share/spark/

/usr/lib/jvm/java-6-sun/jre/bin/java -Dappdir=$SPARKDIR -cp $SPARKDIR/lib/linux/jmf.jar:$SPARKDIR/lib/startup.jar:$SPARKDIR/lib/linux/jdic.jar:$SPARKDIR/resources org.jivesoftware.launcher.Startup -Xms56m -Xmx56m -XX:PermSize=12m -XX:MaxPermSize=12m

But I still see this in top:

Mem: 3999640k total, 1156016k used, 2843624k free, 51600k buffers
Swap: 5863684k total, 0k used, 5863684k free, 379708k cached

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
8406 root 20 0 1401m 235m 10m S 0 6.0 0:12.35 java
8807 aanibald 20 0 1421m 169m 11m S 0 4.3 0:08.10 java

This server has two users running Spark; one is consuming 235 MB of resident memory and the other about 169 MB.

How can I reduce this?

I am trying to deploy an LTSP server, so 200 MB of memory is too much for a single application.

Thanks a lot,

Andrew.

Hi,

You may want to use a C-based client. Java usually needs a lot of memory; you can set -Xmx and -XX:MaxPermSize to limit the heap and permanent generation, but the JVM itself (the JIT compiler, thread stacks, and so on) will use additional memory on top of that.
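One thing worth checking in your launcher: the java command only treats options placed before the main class as JVM flags; anything after org.jivesoftware.launcher.Startup is handed to the application as a program argument and never reaches the JVM. A sketch of your script with the flags moved, using the paths from your post:

```shell
#!/bin/bash
# Sketch only: paths copied from the script above; adjust for your install.
SPARKDIR=/usr/share/spark

# -Xms/-Xmx/-XX:... must come BEFORE the main class name;
# placed after it, they are passed to Spark as arguments and ignored by the JVM.
/usr/lib/jvm/java-6-sun/jre/bin/java \
    -Xms56m -Xmx56m \
    -XX:PermSize=12m -XX:MaxPermSize=12m \
    -Dappdir=$SPARKDIR \
    -cp $SPARKDIR/lib/linux/jmf.jar:$SPARKDIR/lib/startup.jar:$SPARKDIR/lib/linux/jdic.jar:$SPARKDIR/resources \
    org.jivesoftware.launcher.Startup
```

With the flags in effect, the heap is capped at 56 MB, though resident memory will still be somewhat higher because of the JVM's own overhead.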

Or you could try IBM's JRE, which allows sharing classes between applications within the same group; see

http://www.ibm.com/developerworks/library/j-sharedclasses/?ca=dgr-lnxw16ClassShareJRE&S_Tact=105AGX59&S_cmp=GRsitelnxw16 and

http://www.ibm.com/developerworks/java/library/j-ibmjava4/ - Security
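On IBM's JRE, class sharing is turned on with the -Xshareclasses option. A rough sketch; the install path and cache name here are assumptions, so check the articles above for the exact syntax on your version:

```shell
# Sketch, assuming IBM's J9 JRE is installed under /opt/ibm-java (hypothetical path).
# -Xshareclasses loads classes into a named shared cache that multiple
# JVMs on the same machine can map, instead of each keeping its own copy.
/opt/ibm-java/jre/bin/java \
    -Xshareclasses:name=spark \
    -Xmx56m \
    -Dappdir=/usr/share/spark \
    -cp /usr/share/spark/lib/startup.jar \
    org.jivesoftware.launcher.Startup
```

With two Spark users on the box, the shared cache would mostly help the second instance, since the class data is paged in only once.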

Anyhow, I don’t know how much this will help.

LG

Ok, thanks for the suggestion.

I will try another client, although I think Spark is the best one.