Need help setting up Spark to run in an IDE (NetBeans/Eclipse)!

Hi all,

I’m very new to Spark and also quite new to Java.

I would like to know how to set up the environment to be able to run Spark in an IDE.

I have read the instructions in the development guide, but they’re not clear to me.

The developer documentation isn’t helpful enough for a newbie to get started.

To set up a project to run Spark within your IDE, you will need to do the following:

  • It is required that you use the 1.4 JRE to build Spark.

  • Add all *.jar files in the Sparkplugs/spark/lib and Sparkplugs/spark/lib/windows directories to your classpath.

  • Add the resource directory (Sparkplugins/spark/resource) to your classpath for the native libraries.

  • Main Class - org.jivesoftware.Spark

  • VM Parameters - -Dplugin=path_to_your_plugin.xml (i.e. the path to your plugin.xml file). This allows you to run your plugins within your IDE without deploying them (see the plugin sketch after this list).

  • That’s it.
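
If it helps to see the plugin side of that last step, here is a minimal sketch of a Sparkplug class, assuming the standard org.jivesoftware.spark.plugin.Plugin interface that ships with the Sparkplug Kit. The package and class names below are placeholders; your plugin.xml (the file you point -Dplugin at) would reference whatever class you actually write.

    package com.example.myplugin; // placeholder package name

    import org.jivesoftware.spark.plugin.Plugin;

    public class MyPlugin implements Plugin {

        // Called by Spark once the client has finished loading.
        public void initialize() {
            System.out.println("MyPlugin loaded inside the IDE");
        }

        // Called when Spark is shutting down.
        public void shutdown() {
        }

        // Return true to let Spark shut down.
        public boolean canShutDown() {
            return true;
        }

        // Called when the user removes the plugin.
        public void uninstall() {
        }
    }

With the classpath, main class, and -Dplugin VM parameter set as above, launching org.jivesoftware.Spark from the IDE should pick this plugin up without a separate deploy step.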

Do I need to create my own project in NetBeans/Eclipse?

If so, how do I do that?

I’ve downloaded sparkplug_kit_2_0_7.zip.

Do I also need to install spark_2_5_8.exe on my machine?

I also tried searching the forum but didn’t see any helpful topics.

Maybe I missed something.

Can anybody help me with the steps needed to run Spark in an IDE?

Kindly provide the detailed steps if possible.

Thanks a lot.

-Julia

I have the same question.

I can’t get NetBeans to build the example plugins.

Anyone have a suggestion?

Hi,

Are these the docs you’re talking about for setting up your development environment?

http://www.igniterealtime.org/community/docs/DOC-1521

http://www.igniterealtime.org/community/docs/DOC-1020

They’re very good. Instead of using the trunk branch of the repository, use something from the tags branch. Those are guaranteed to compile and work, so if you grab, say, svn.igniterealtime.org/svn/repos/spark/tags/spark_2_5_8 and it compiles, that gives you a good baseline before you start looking at the trunk branch. This is what I did to make sure my environment was set up correctly.
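
In case it helps, checking out that tag with a standard Subversion client would look something like this (the trailing directory name is just an example, and I’m assuming plain http access to the repository):

    svn checkout http://svn.igniterealtime.org/svn/repos/spark/tags/spark_2_5_8 spark_2_5_8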

For some more help: in NetBeans, make sure that Spark can Clean and Compile successfully first (set it up as per the NetBeans doc above). Once that is done, go down to, for example, the src/plugins/fastpath/build folder, right-click build.xml, and click Run on the jar option (it should already be bolded by default). This should build the plugin’s jar file (usually located in the src/plugins/fastpath folder).
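
If you’d rather drive that build from the command line than from the NetBeans UI, pointing Ant at the same build file should do the same thing (this assumes Ant is on your path and that the target really is named jar, as it appears in NetBeans):

    ant -f src/plugins/fastpath/build/build.xml jar

The resulting jar should land in the same src/plugins/fastpath location mentioned above.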