Methods for reducing Spark memory footprint?

During a Spark rollout, I was surprised to hear that users were concerned about Spark's 50 MB memory footprint relative to other clients.

To address this, is there any way to reduce Spark's memory utilization on Windows, even at the cost of performance? I tried deleting plugins, but that didn't help.

I also tried to launch Spark from a java command so I could adjust the JVM memory settings, but I could not even get Spark to run with the defaults on Windows this way. Can it be done?
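
For reference, the kind of command I tried looked roughly like this (the jar name and install path are guesses on my part, which may be part of the problem):

java -Xms32m -Xmx64m -jar "C:\Program Files\Spark\lib\startup.jar"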

Thanks!

Hi,

If you want to modify the standard JVM parameters, create a "Spark.vmoptions" file (the file type must be vmoptions, not text, and make sure to use an upper-case S). Spark will look for this file and use the parameters in it, one JVM parameter per line, but as far as I can tell you can only speed up the start-up of Spark. It may look like:

-Xms32m

-Xmx64m

Setting the -Xmx value to 64 MB or even lower may cause a lot of GC cycles and make Spark much slower, or cause OOM errors.

Regards