Installing Plugins - Can't for the life of me get them working

I'm currently trying to learn how to write/use Sparkplugs, but I can't for the life of me get them to work! I've got them compiling OK with the ant / build stuff, but they aren't working in Spark after I drop them in the plugin directory. For now I've just been messing with the ExamplePlugin stuff. I'm using Spark v2.0.0.3.

First line of ExamplePlugin.java:

package com.jivesoftware.plugin;

First line of SearchMe.java:

package com.jivesoftware.plugin;

DIR listing:

pr0zac@Phoenix:~/spark/sparkplugs/builder/src$ ls

ExamplePlugin.java SearchMe.java

plugin.xml:

Can anyone help me out? Thanks a bunch.

hey pr0zac,

Check in the home_dir\Spark directory for a logs directory.

This is where all your exceptions will be dumped.

Check the VERY bottom one. They will have time stamps on them.
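A quick way to pull up the newest log from a shell (the path follows the home_dir\Spark\logs location described above; the exact location varies by install, so treat it as an assumption):

```shell
# Go to Spark's log directory (path assumed; adjust for your install).
cd ~/Spark/logs
# ls -t sorts newest-first, so head -n 1 picks the most recent log file;
# tail shows its last 40 lines, where the latest exception will be.
tail -n 40 "$(ls -t | head -n 1)"
```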

This is just a guess from looking at your plugin.xml file.

Possible Solution

At the end of your plugin.xml file

put this:

<minSparkVersion>2.0.0.1</minSparkVersion>

Here is an example of a whole plugin.xml

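A representative sketch of a whole descriptor (element names follow the usual Spark plugin.xml layout; the name, author, email, and version values here are placeholders, and `class` must be the fully qualified name of your plugin class):

```xml
<plugin>
    <!-- Metadata shown in Spark's plugin manager (placeholder values) -->
    <name>Example Plugin</name>
    <version>1.0</version>
    <author>Your Name</author>
    <email>you@example.com</email>
    <description>An example Spark plugin.</description>
    <!-- Fully qualified name of your plugin's main class -->
    <class>com.jivesoftware.plugin.ExamplePlugin</class>
    <!-- Minimum Spark version this plugin requires -->
    <minSparkVersion>2.0.0.1</minSparkVersion>
</plugin>
```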

I hope this works for ya.

Otherwise, copy and paste the last exception in the logs folder.

Happy Coding,

-Aaron

Message was edited by: cyclone

Read this a bit more carefully :stuck_out_tongue_winking_eye:

You need to create the correct directories to represent the package.

com.jivesoftware.plugin.myplugin.

=----

cd src

mkdir com

cd com

mkdir jivesoftware

cd jivesoftware

mkdir plugin

cd plugin

mkdir myplugin

=----
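The mkdir/cd sequence above can be collapsed into one command: `mkdir -p` creates every intermediate directory in the path (and does nothing if they already exist):

```shell
# Run from the plugin's src directory; creates the whole package path at once.
mkdir -p com/jivesoftware/plugin/myplugin
```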

A pain… I know, but it works out in the end.

-Aaron