I’m trying to use plugins in Spark that are in the project, not the ones in APPDATA. I looked at the source code to see how Spark loads plugins and did not find any way to load plugins from the classpath.
I need this in order to debug the plugin and understand it better. Is there a way to launch Spark and its plugins for debugging from the IntelliJ IDEA IDE?
Right now you can’t run a plugin directly from the IDE. I do this by manually building the plugin jar and then moving the resulting jar to ~/.Spark/plugins, like:
cd plugins/translator/; mvn package
mv plugins/translator/target/translator-3.0-spark-plugin.jar ~/.Spark/plugins/translator.jar
rm -fr ~/.Spark/plugins/translator/
Then, when starting Spark from the IDE in debug mode, all breakpoints inside the plugin work as expected. If I make a change, I manually recompile the class and hot-reload it: in IntelliJ, “Build” / “Recompile TranslatorPlugin.java”.
There used to be a mechanism to load a plugin directly from the sources folder by using the -Dplugin option to specify the path to plugin.xml, e.g. -Dplugin=~/src/Spark/plugins/translator/target/classes/plugin.xml. But currently it doesn’t work, because Spark doesn’t load plugin classes dynamically. If it can be fixed, I’ll create a task for this.
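The missing piece here (Spark not loading plugin classes dynamically) boils down to putting a class loader over the plugin’s target/classes directory. A minimal, hypothetical sketch of that idea; the property names plugin.classes and plugin.class are my own assumptions for illustration, not real Spark options:

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Path;

public class DynamicPluginLoad {

    // Resolves a plugin class from a classes directory at runtime via a
    // URLClassLoader, instead of requiring a packaged jar in ~/.Spark/plugins.
    // Names not found in the directory fall back to the parent loader.
    static Class<?> loadPluginClass(Path classesDir, String className) throws Exception {
        URLClassLoader loader = new URLClassLoader(
                new URL[] { classesDir.toUri().toURL() },
                DynamicPluginLoad.class.getClassLoader());
        return Class.forName(className, true, loader);
    }

    public static void main(String[] args) throws Exception {
        // Both defaults are assumptions; a real integration would read the
        // class name and directory from the plugin.xml pointed at by -Dplugin.
        Path classesDir = Path.of(System.getProperty("plugin.classes", "target/classes"));
        String name = System.getProperty("plugin.class", "java.lang.String");
        System.out.println("Loaded: " + loadPluginClass(classesDir, name).getName());
    }
}
```

In a real fix the loader would have to stay open for the plugin’s lifetime and be registered with PluginManager, which this sketch does not attempt.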
Which plugin are you trying to debug?
Sorry for the late response.
Hello. I’m very glad that my question got some attention.
Some time later, while studying the code in search of a solution, I found what looked like a hint of what I want in PluginManager.java at line 286, but I was wrong: it doesn’t do what I expected, and it doesn’t work.
But I didn’t give up.
As a result I understood that I really do need the jar file of the plugin I want to debug, but I don’t want to constantly move it into ~/.Spark/plugins. So I kept studying the code and came up with the following:
APPROACH 1 (only a temporary solution, to reach my goal as quickly as possible)
- Build the jar file of some plugin (for example the meet plugin) into the directory meet\target\plugins. Before building, in meet\pom.xml I specify:
<properties>
    <jar.file.path>${project.build.directory}/plugins</jar.file.path>
</properties>

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <outputDirectory>${jar.file.path}</outputDirectory>
    </configuration>
</plugin>
Here ${project.build.directory}/plugins resolves to C:\Users\User\IdeaProjects\Spark\plugins\meet\target\plugins
- Specify in VM options: -Dappdir=C:\Users\User\IdeaProjects\Spark\plugins\meet\target
- Run Spark in debug mode. The plugin jar will then be copied from ${project.build.directory}/plugins to ${env.APPDATA}\Spark\plugins, and the old and new files are compared by checksum (PluginManager.java line 196, which is nice!).
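The checksum comparison mentioned above can be pictured with a small sketch. This uses SHA-256 over the file contents as an assumption; it is an illustration of the idea, not necessarily the algorithm PluginManager actually applies:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;

public class JarChecksum {

    // Hex digest of a file's contents; two jars with equal digests are
    // treated as the same version.
    static String checksum(Path file) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        return HexFormat.of().formatHex(md.digest(Files.readAllBytes(file)));
    }

    // The installed copy only needs replacing when it is missing or its
    // checksum differs from the freshly built jar.
    static boolean needsUpdate(Path installed, Path fresh) throws Exception {
        return !Files.exists(installed)
                || !checksum(installed).equals(checksum(fresh));
    }
}
```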
APPROACH 2 (does the same thing you suggest)
- Before building, in meet\pom.xml I specify:
<properties>
    <jar.file.path>${project.build.directory}</jar.file.path>
    <spark.plugins.dir>${env.APPDATA}\Spark\plugins</spark.plugins.dir>
</properties>

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <outputDirectory>${jar.file.path}</outputDirectory>
            </configuration>
        </plugin>
        <!-- Delete the plugins directory before copying -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-antrun-plugin</artifactId>
            <version>3.1.0</version>
            <executions>
                <execution>
                    <id>clean-and-copy-jar</id>
                    <phase>package</phase>
                    <goals>
                        <goal>run</goal>
                    </goals>
                    <configuration>
                        <target>
                            <!-- Delete the plugins directory -->
                            <delete dir="${spark.plugins.dir}"/>
                            <!-- Create the directory -->
                            <mkdir dir="${spark.plugins.dir}"/>
                            <!-- Copy the JAR -->
                            <copy file="${project.build.directory}\${project.build.finalName}-spark-plugin.jar"
                                  tofile="${spark.plugins.dir}\${project.build.finalName}-spark-plugin.jar"/>
                        </target>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
Run mvn to build the jar file. Now the plugin jar is located in ${env.APPDATA}\Spark\plugins. But before copying the jar I have to delete the contents of ${env.APPDATA}\Spark\plugins to remove the old version of the plugin, which is not great, because it deletes everything in the Spark\plugins folder.
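The wholesale delete could be narrowed to just the plugin being rebuilt. A hedged sketch of that idea: matching jars by a name prefix (the "meet" prefix below is an assumption taken from the example plugin) so other installed plugins survive:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class CleanOldPlugin {

    // Deletes only the jars belonging to one plugin (matched by file-name
    // prefix) instead of wiping the whole Spark\plugins directory.
    static int deleteOldJars(Path pluginsDir, String pluginPrefix) throws IOException {
        int removed = 0;
        try (DirectoryStream<Path> jars =
                Files.newDirectoryStream(pluginsDir, pluginPrefix + "*.jar")) {
            for (Path jar : jars) {
                Files.delete(jar);
                removed++;
            }
        }
        return removed;
    }
}
```

The same effect could likely be had in the antrun target itself by replacing the delete dir with a delete of the single jar file, which would avoid the side effect described above.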
- In VM options I didn’t specify anything here.
- Run Spark in debug mode.
As a result I achieved my goal, but I realize these two approaches may not be the final, correct solution.
I think creating a task for this would be a good idea.
Also I want to ask: is it possible to add slf4j to the project? The current logger is not convenient.
To enable the logger, I specify in VM options: -Ddebug.mode -Ddebugger=true