Problems with Spark 2.5.0 and JRE 1.6.0_01-b06

Hi,

the JRE crashed a few times during startup of Spark. This could also be a problem with my sound driver.

# Problematic frame:
# C  [jmdaudc.dll+0x1332] (one crash reported the DLL, the others only an address)
...
Stack: [0x04700000,0x04750000),  sp=0x0474f334,  free space=316k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C  [jmdaudc.dll+0x1332]
j  com.sun.media.protocol.dsound.DSound.close()V+14
j  DirectSoundAuto.supports(Ljavax/media/format/AudioFormat;)Z+17
j  DirectSoundAuto.<init>()V+219
v  ~StubRoutines::call_stub

Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j  com.sun.media.protocol.dsound.DSound.nClose(J)V+0
j  com.sun.media.protocol.dsound.DSound.close()V+14
j  DirectSoundAuto.supports(Ljavax/media/format/AudioFormat;)Z+17
j  DirectSoundAuto.<init>()V+219
v  ~StubRoutines::call_stub
j  sun.reflect.NativeConstructorAccessorImpl.newInstance0(Ljava/lang/reflect/Constructor;[Ljava/lang/Object;)Ljava/lang/Object;+0
j  sun.reflect.NativeConstructorAccessorImpl.newInstance([Ljava/lang/Object;)Ljava/lang/Object;+72
J  sun.reflect.DelegatingConstructorAccessorImpl.newInstance([Ljava/lang/Object;)Ljava/lang/Object;
J  java.lang.reflect.Constructor.newInstance([Ljava/lang/Object;)Ljava/lang/Object;
j  java.lang.Class.newInstance0()Ljava/lang/Object;+118
j  java.lang.Class.newInstance()Ljava/lang/Object;+15
j  org.jivesoftware.smackx.jingle.mediaimpl.JMFInit.detectCaptureDevices()V+13
j  org.jivesoftware.smackx.jingle.mediaimpl.JMFInit.run()V+9
j  java.lang.Thread.run()V+11
j  org.jivesoftware.smackx.jingle.mediaimpl.JMFInit.<init>([Ljava/lang/String;Z)V+76
j  org.jivesoftware.smackx.jingle.mediaimpl.jspeex.SpeexMediaManager.setupJMF()V+131
j  org.jivesoftware.smackx.jingle.mediaimpl.jspeex.SpeexMediaManager.<init>()V+19
j  org.jivesoftware.sparkplugin.JinglePlugin$1.construct()Ljava/lang/Object;+133
j  org.jivesoftware.spark.util.SwingWorker$2.run()V+8
j  java.lang.Thread.run()V+11
v  ~StubRoutines::call_stub

This always happens when two instances of Spark are running at the same time.
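
Since the crash only shows up with two instances, it looks like both processes run the JMF/DirectSound capture-device probe (DirectSoundAuto via JMFInit.detectCaptureDevices) at the same time. Below is a minimal sketch of a possible workaround, not anything that exists in Spark: the class name, lock-file path, and the idea of skipping JMF detection in a second instance are my assumptions. The first instance takes a file lock before the JMF detection runs; a second instance sees the lock and skips the native probe.

import java.io.File;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

// Hypothetical guard: only the first Spark instance acquires the lock and
// runs the JMF/DirectSound capture-device detection; a second instance
// skips it instead of hitting the native code in jmdaudc.dll.
public class JmfDetectionGuard {

    private FileLock lock; // kept for the lifetime of the process

    // Returns true if this process got the lock, i.e. it is safe to run setupJMF().
    public boolean tryAcquire() {
        try {
            File lockFile = new File(System.getProperty("user.home"), ".spark-jmf.lock");
            FileChannel channel = new RandomAccessFile(lockFile, "rw").getChannel();
            lock = channel.tryLock(); // null if another instance already holds it
            return lock != null;
        } catch (Exception e) {
            return false; // when in doubt, skip the native device probe
        }
    }
}

The idea would be to call tryAcquire() before SpeexMediaManager.setupJMF() and only run the detection when it returns true; I have not tried this, it is just meant to illustrate where a guard could go.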