How to update a Spark plugin?

Hi, I’m elton.

As a beginner, I’m puzzled by Spark plugin development.

I have developed a rather simple plugin for Spark and it works well.

But after I added more code, rebuilt the plugin, deleted the old jar file from the “plugins” directory of Spark, and copied the new jar file into “plugins”, I found that Spark still loaded the old jar instead of the new one when I restarted it. Even after I deleted my plugin from the “plugins” directory entirely, Spark still loaded it. Why? How can I get the new jar to load?

thanks a lot!

Spark: 2.6.3

JRE: 1.6.0_29

From which directory exactly did you delete your old plugin? You have to delete it from your Program Files folder or your build catalog.

I have deleted my old plugin from the **“spark_install_path/plugins”** folder and even deleted my whole build catalog; “spark_install_path” is the installation path of Spark.

But when I start Spark, it still loads my plugin!

I have to delete my plugin from the “installed plugins” list in order to unload it.

But the next problem is this: when I put the new version of the plugin into **“spark_install_path/plugins”** (the new jar has the same name as the old one), I found that Spark still loads the old plugin, according to my program output. Why does this happen? Does Spark have some kind of plugin cache?
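One way to confirm which copy is actually being loaded is to log the plugin class’s code source at startup. A minimal sketch, where MyPlugin stands in for whatever class the plugin really uses:

```java
// Minimal sketch: print where this class was loaded from, to see whether
// Spark picked up the new jar or a stale copy. "MyPlugin" is a placeholder
// for the plugin's own class; call logCodeSource() during initialization.
public class MyPlugin {

    public void logCodeSource() {
        java.net.URL location = getClass()
                .getProtectionDomain()
                .getCodeSource()
                .getLocation();
        System.out.println("Plugin loaded from: " + location);
    }
}
```

If the printed path still points at the old copy after an update, that copy is the one Spark is really loading.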

I’m not a developer. As far as I know, Spark checks for plugins in the installation folder and then extracts them into the user’s application data folder. Maybe it can’t delete the old one from the user’s folder (/user/AppData/Roaming/Spark/plugins on Win7).
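If that is what is happening, clearing the stale copy out of the user’s data folder before restarting Spark should force it to re-extract the new jar. A minimal sketch, assuming the default Windows location mentioned above and “myplugin” as a placeholder for the real plugin name:

```java
import java.io.File;

// Minimal sketch: delete the copy of a plugin that Spark extracted into the
// user's data folder, so the jar in the installation folder is re-extracted
// on the next start. Assumes the default Roaming location on Windows;
// "myplugin" is a placeholder for the actual plugin name.
public class CleanStalePlugin {

    public static void main(String[] args) {
        File userPlugins = new File(System.getenv("APPDATA"), "Spark" + File.separator + "plugins");
        deleteRecursively(new File(userPlugins, "myplugin"));      // extracted folder
        deleteRecursively(new File(userPlugins, "myplugin.jar"));  // copied jar, if present
        System.out.println("Removed stale plugin files from " + userPlugins);
    }

    // Delete a file, or a directory and everything underneath it.
    private static void deleteRecursively(File file) {
        if (!file.exists()) {
            return;
        }
        File[] children = file.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        file.delete();
    }
}
```

After that, restarting Spark with only the new jar in “spark_install_path/plugins” should make it extract and load the new version.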

thanks a lot.