Hi, I’m elton.
As a beginner, I'm puzzled by Spark plugin development.
I have developed a rather simple plugin for Spark and it works well.
But after I added more code and rebuilt it, I deleted the old jar file from Spark's "plugins" directory and copied the new jar file into "plugins". When I restarted Spark, it still loaded the old version of the jar instead of the new one. Even after I deleted my plugin from the "plugins" directory entirely, Spark still loaded it. Why does this happen, and how can I get the new jar to take effect?
Thanks a lot!
sqlerror.txt.zip (681 Bytes)