How to use DLLs in a sparkplug?

hi

I’m using DLLs in my sparkplug, and I don’t want to put them in the system directory; I want them to sit alongside Spark. But if I do that, the dependent DLL can’t be found, and I don’t know where to put it.

What should I do with the dependent DLL? For example, Spark\lib\windows\jmutil.dll (the other JMF DLLs depend on jmutil.dll).

thanks very much.

Hi Solar,

You can use the code below to add DLLs that are contained within the lib dir of your sparkplug:

// Requires: import java.io.File; import java.lang.reflect.Field;
// (plus Spark's PluginManager and Log classes).
private static void addClasspath() {
   try {
      // Directory that holds the plugin's DLLs, e.g. <plugins>/mysparkplug/lib
      File libDir = new File(PluginManager.PLUGINS_DIRECTORY, "mysparkplug/lib").getAbsoluteFile();
      String path = libDir.getAbsolutePath();
      // "usr_paths" is ClassLoader's private cache of the parsed
      // java.library.path entries (a Sun JVM implementation detail).
      Field field = ClassLoader.class.getDeclaredField("usr_paths");
      field.setAccessible(true);
      String[] paths = (String[]) field.get(null);
      // If the lib dir is already on the search path, nothing to do.
      for (int i = 0; i < paths.length; i++) {
         if (path.equals(paths[i])) {
            return;
         }
      }
      // Otherwise append it to the cached search paths.
      String[] tmp = new String[paths.length + 1];
      System.arraycopy(paths, 0, tmp, 0, paths.length);
      tmp[paths.length] = path;
      field.set(null, tmp);
   } catch (Exception e) {
      Log.error(e);
   }
}
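
For what it’s worth, here is a minimal sketch of where to call it from; the class name MySparkPlug is made up, and it assumes the standard Spark Plugin interface. Calling it at the top of initialize() makes sure the paths are extended before anything touches the DLLs:

public class MySparkPlug implements Plugin {
   public void initialize() {
      addClasspath(); // extend the native search paths first
      // ... now code that loads the DLLs can run safely
   }
   public void shutdown() { }
   public boolean canShutDown() { return true; }
   public void uninstall() { }
}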

Hope this helps,

Ryan

Thank you so much, it does help me! :slight_smile:

but,

1: What’s the meaning of “usr_paths” and paths.length?

2: When should I call this function? Is the initialization of my plugin the right place?

thanks

My experiences with the same issue …

  1. Attempting to load DLLs by changing the value of “java.library.path” from within code, as shown above, did not work. The value of “java.library.path” did change, though.

  2. Adding DLLs to one of the directories already on the existing “java.library.path” does work (see the sketch after this list for a quick way to print those directories).

  3. I downloaded and built Spark from source. In the target/build/bin/startup.bat script I added the required directories to “java.library.path”, and voilà, that works. This is the best solution for me to date.
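
Since a couple of these hinge on what is actually on “java.library.path”, here is a quick sketch (plain Java, nothing Spark-specific; the class name PrintLibraryPath is made up) that prints each directory currently on it, so you can pick one to drop the DLLs into or confirm a startup.bat change took effect:

import java.io.File;

public class PrintLibraryPath {
   public static void main(String[] args) {
      // java.library.path is a path-separator-delimited list of the
      // directories the JVM searches for native libraries.
      String libPath = System.getProperty("java.library.path", "");
      for (String dir : libPath.split(File.pathSeparator)) {
         System.out.println(dir);
      }
   }
}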

/rk