Installing Plugins - Can't for the life of me get them working

I'm currently trying to learn how to write and use Sparkplugs, but I can't for the life of me get them to work! I've got them compiling OK with the ant build stuff, but they aren't working in Spark after I drop them into the plugin directory. For now I've just been messing with the ExamplePlugin stuff. I'm using Spark V.

First line of my plugin source file:

package com.jivesoftware.plugin;

DIR listing:

pr0zac@Phoenix:~/spark/sparkplugs/builder/src$ ls


Can anyone help me out? Thanks a bunch.

hey pr0zac,

Check in the home_dir\Spark directory for a logs directory. This is where all your exceptions get dumped. Look at the very bottom entry; they have time stamps on them.
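To illustrate the "check the newest log" step, here is a small sketch of sorting a logs directory by modification time. The real Spark logs live under your home directory (e.g. ~/Spark/logs, or home_dir\Spark\logs on Windows); a temporary directory is used here only so the commands run anywhere:

```shell
# Demo: find the newest file in a logs directory by modification time.
# (A temp dir stands in for Spark's real ~/Spark/logs directory.)
LOGS=$(mktemp -d)
touch "$LOGS/old.log"
sleep 1
touch "$LOGS/newest.log"
NEWEST=$(ls -t "$LOGS" | head -n 1)   # ls -t sorts newest first
echo "$NEWEST"
```

Run against the real logs directory, the last file `ls -t` prints first is the one holding your most recent exception.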

This is just a guess looking at your plugin.xml file.

Possible Solution

At the end of your plugin.xml file, put this:

Here is an example of a whole plugin.xml
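The actual plugin.xml example appears to have been lost from this post. As a rough sketch of what a whole Sparkplug descriptor generally looks like (the name, version, author, and class values below are placeholders, not the original poster's snippet):

```xml
<plugin>
    <name>Example Plugin</name>
    <version>1.0</version>
    <author>Your Name</author>
    <email>you@example.com</email>
    <description>An example Sparkplug</description>
    <minSparkVersion>1.0.3</minSparkVersion>
    <!-- Fully qualified name of the class implementing the Plugin interface -->
    <class>com.jivesoftware.plugin.ExamplePlugin</class>
</plugin>
```

The `<class>` entry must match the package and class name of your compiled plugin exactly, or Spark cannot load it.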



I hope this works for ya.

Otherwise, copy and paste the last exception in the logs folder.

Happy Coding,


Message was edited by: cyclone

Read this a bit more carefully :stuck_out_tongue_winking_eye:

You need to create the correct directories to represent the package.



cd src
mkdir com
cd com
mkdir jivesoftware
cd jivesoftware
mkdir plugin
cd plugin
mkdir myplugin


A pain… I know, but it works out in the end.
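The step-by-step mkdir/cd sequence above can also be done in one command with `mkdir -p`, which creates the whole package path at once (`myplugin` here is the same example package name used above):

```shell
# Create the full directory tree for package com.jivesoftware.plugin.myplugin
mkdir -p src/com/jivesoftware/plugin/myplugin
ls src/com/jivesoftware/plugin
```

Either way, the directory layout under src must mirror the Java package declaration at the top of your source files.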