During a Spark rollout, I was surprised to hear that users were concerned about Spark's roughly 50 MB memory footprint relative to other clients.
To appease them, is there any way to reduce Spark's memory usage on Windows, even at the cost of performance? I tried deleting plugins, but that didn't help.
I also tried launching Spark directly from a java command so I could adjust the JVM memory settings, but I couldn't even get Spark to run with the defaults on Windows that way. Can it be done?
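For reference, something along these lines is what I would expect to work. The install path, the bundled JRE location, and the `org.jivesoftware.launcher.Startup` main class are assumptions on my part from poking around the install directory, so corrections are welcome:

```shell
:: Sketch of launching Spark manually with a smaller heap (Windows batch).
:: Paths and the main class name below are guesses -- adjust to your install.
cd /d "C:\Program Files\Spark"

:: -Xms/-Xmx cap the JVM's initial and maximum heap size.
jre\bin\java.exe -Xms16m -Xmx64m ^
    -cp "lib\startup.jar" ^
    org.jivesoftware.launcher.Startup
```

Alternatively, if Spark.exe is an install4j-style launcher, it might pick up a `Spark.vmoptions` file placed next to the .exe containing a line like `-Xmx64m`, but I haven't verified that.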