Spark SVN + Eclipse 3.3 + Subversive Installation Guide

This guide assumes that you are installing everything from scratch. If you’ve already done some of the steps, it may still be useful. I compiled this guide to the best of my knowledge; I apologize if it doesn’t work for you.


  • This guide assumes that you want the latest updates of the source, i.e., from the project’s trunk directory. If you only want a released/stable version, check out the desired release from under the tags directory.

Install JDK

  • Download the JDK and install it. The minimum version is 1.5; I use 1.6. Sorry, no instructions for this.
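After installing, you can double-check which JDK the command line picks up; a quick sanity check (output depends on your installation):

```shell
# Print the active JDK version; Spark needs at least 1.5.
java -version
javac -version
```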

Install Eclipse 3.3

  • Download Eclipse 3.3 from the Eclipse website. I use Eclipse IDE for Java EE Developers; you should at least use Eclipse IDE for Java Developers.

  • Extract the downloaded zip file into C:/Program Files/Eclipse.

  • Open C:/Program Files/Eclipse folder.

  • Right click and drag eclipse.exe on to your desktop (or Windows taskbar) to create a shortcut icon.

  • Right click the shortcut icon and choose Properties. The Eclipse Properties window will show.

  • The Target textbox should read something like “C:\Program Files\Eclipse\eclipse.exe” -vm “C:\Program Files\Java\jdk1.6.0\bin\javaw”, depending on the JDK that you use and where you installed it. If the -vm argument is missing, add it so Eclipse runs on your chosen JDK.

  • Close the Eclipse Properties window.

Install Subversive Plugin

  • Double-click the shortcut icon to start Eclipse.

  • Select/enter your preferred workspace and click OK to open Eclipse main IDE window.

  • Click on the Workbench icon to close the welcome screen.

  • Click Help::Software Updates::Find and Install… menu.

  • Click on Search for new features to install and click Next.

  • Click on New Remote Site… button.

  • Enter Subversive in the Name box and the Subversive update site URL in the URL box (check the latest URL on the Subversive website), then click OK.

  • Click Finish to install Subversive. Eclipse will search the update site and show the result in the next window, where you will select the features to install. I chose everything under Subversive SVN Team Provider Plugin and Subversive Client Libraries.

  • Click Next to continue and follow the prompts until the installation ends. You normally want to restart Eclipse when the installation ends.

Check Out Spark SVN

  • Click Window::Open Perspective::Other… menu.

  • Click on SVN Repository Exploring on the Open Perspective window and click OK.

  • Right-click on SVN Repositories screen and choose New::Repository Location…

  • On the New Repository Location window, enter the Spark repository URL in the URL box and click Finish. You’ll see the URL location in the SVN Repositories screen.

  • Expand the URL location.

  • Expand the spark tree.

  • Right-click on trunk and choose Check Out. Make yourself some Mocha while waiting for the checkout to complete.
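For reference, the GUI checkout above corresponds roughly to the following command-line steps. The repository URL here is a placeholder, since this document does not give the current address; substitute the real one.

```shell
# Placeholder URL -- substitute the current Spark SVN repository address.
SPARK_REPO="http://svn.example.org/repos/spark"

# Latest sources from trunk, into a local "spark" folder:
svn checkout "$SPARK_REPO/trunk" spark

# Or a stable release from under tags:
# svn checkout "$SPARK_REPO/tags/<release>" spark
```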

Create Spark Project

  • Click Window::Open Perspective::Java menu.

  • In the Project Explorer screen, if there is a spark project, delete it. This project was created during the Spark checkout process. Yes, you read that correctly: DELETE the project! Otherwise you’ll have to set up your Spark development environment manually. On the Confirm Project Delete window choose Do not delete contents, then click Yes.

  • Click File::New::Project… Notice the ellipsis!

  • Select Java::Java Project and click Next.

  • On the New Java Project window, choose Create project from existing source and browse to where the spark folder is located under your workspace.

  • In the Project name box, enter exactly spark; otherwise, the Next and Finish buttons remain disabled. Click Next. Eclipse will read the directory structure to set up the environment (almost) automatically for you, and you can see what it does on the next screen. Then click Finish.

  • If the Open Associated Perspective windows opens, click Yes.

Build Spark

  • Click Window::Show View::Ant menu.

  • Right-click the Ant screen and choose Add Buildfiles…

  • Expand the spark::build folder and select build.xml, then click OK.

  • On the Ant screen, expand Spark and double-click the release Ant task. The build may fail because you’re checking out the daily updates of the Spark sources, which may contain bugs. If so, wait for another day and hope that the developers discover and fix the bug, or dare to fix it yourself. During this first-time setup, a successful build is necessary before you can proceed with the remaining tasks below.
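If you prefer the command line, the same release target can be run with Ant directly; this assumes the checkout layout described above, with build.xml under spark/build.

```shell
# Run the "release" target from Spark's Ant build file.
cd spark
ant -f build/build.xml release
```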

Create Project Builder

  • Click Run::Open Run Dialog… or Run::Open Debug Dialog… menu. A Run window shows.

  • Select Java Application and click on the New button.

  • On the Main tab of the Run window, change the New_configuration name to Spark or anything you like.

  • Click on Project::Browse button and select spark and click OK.

  • Click on Main class::Search button and select Startup - org.jivesoftware.launcher and click OK.

  • I’d suggest that you select the Stop in main check box so that you can later verify that debugging works.

  • Click on Classpath tab.

  • Select User Entries so that the Advanced… button will be enabled.

  • Click on the Advanced… button.

  • On the Advanced Options window select Add Folders and click OK.

  • On the Folder Selection window select spark::src::resources folder and click OK.

  • Click on Common tab.

  • Select the Debug and Run check box.

  • Click on Apply button.

  • Click on Close button.


  • The setting is now complete for Spark.

  • You may test running and debugging by clicking Run::Run History::Spark and Run::Debug History::Spark respectively. If you choose the latter and have followed these instructions closely, execution will stop in the main method of the Startup class.
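For comparison, the run configuration above corresponds roughly to a plain java launch like the one below. The workspace path and classes directory are assumptions; only the main class (org.jivesoftware.launcher.Startup) and the src/resources classpath entry come from the steps above.

```shell
# Rough command-line equivalent of the Eclipse run configuration.
# WORKSPACE and the classes directory are assumptions -- adjust to your setup.
WORKSPACE="$HOME/workspace"

java \
  -cp "$WORKSPACE/spark/classes:$WORKSPACE/spark/src/resources" \
  org.jivesoftware.launcher.Startup
```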

Thanks to @Michaela V who inspired me to write this document.

Thanks for the manual, but I have one problem…

After doing all this,

I have some errors in the project.

The reason is the plugins…

I tried to add the plugins’ build.xml files, but it did not help, or I did it in the wrong way.

The svn repository address needs to be updated from the old location to the new one.

A lot of the files for the Mac version are missing, how would I go about obtaining them?

Can anyone please post step-by-step instructions to package for Windows via Eclipse? The above steps, I believe, did not mention anything about packaging and deployment for Windows or any other platform.

Thanks a lot for the above valuable step by step tutorial for the build though.

This document does work for Windows. I followed it step by step and my Windows build works flawlessly.

It’s a good how-to for Eclipse and Spark, but how do you create an .msi or .exe with this solution?


Excuse my question; I searched and found the solution.

Why does the source code I downloaded have so many errors? All the packages are named with a java prefix, but inside the code there is no java; the package names start with org.

I have used both the links, but they are not downloading the repository.

Is there any new link for this?

Please help me…

Aznidin - excellent job - thank-you for sharing this with the community

Is there a free alternative to using “Install4j”? Thank you.

Nice document. Just to provide some additional information on plugin (such as Fastpath) running and debugging in the Eclipse IDE:

  1. In the “Create Project Builder” step (maybe “Create Debug and Run Configuration” is more proper), add the following VM arguments on the “Arguments” tab:

-Dappdir="${workspace_loc:spark/target/build}" -Djava.library.path="${workspace_loc:spark/target/build/lib/windows}"

  1. On “classpath” tab, in addition to adding the “spark::src::resources” folder to classpath, the plugin’s src::resource folder should also be added.

That’s all.

Any suggestions for the following error:

java.lang.SecurityException: Prohibited package name: …
    at java.lang.ClassLoader.preDefineClass(…)
    at java.lang.ClassLoader.defineClassCond(…)
    at java.lang.ClassLoader.defineClass(…)
    at … (Native Method)
    at java.lang.ClassLoader.loadClass(…)
    at sun.misc.Launcher$AppClassLoader.loadClass(…)
    at java.lang.ClassLoader.loadClass(…)

(The same SecurityException and stack trace appeared a second time; the package name and the file/line details were lost from this comment.)

I did try removing the src/java folder from the build path, but then it would complain about classes not being found from this package!

This one resolved itself! I just did a clean, restarted Eclipse, and built again.

Did anybody get the plugins working? After the installation, I didn’t get the translator plugin working.

I don’t even see the drop-down menu for the translator plugin… but under installed plugins I see the translator plugin name.

Is there any way to get it to work?

Thank you


Hello. Thank you for your instructions. I was able to follow them without any problems until I got to the Build Spark section.

The Build Spark section yields the following results:

***** Results from Build Spark *****

Buildfile: C:\Documents and Settings\eadeleye.DELANEYRAD\workspace\spark\build\build.xml
[checkstyle] Running Checkstyle 5.3 on 338 files
[echo] Apache Ant version 1.7.1 compiled on June 27 2008
[echo] Java Version: 1.6
Total time: 7 seconds

****** End of Results from Build Spark ******

I am unsure how to proceed further with the edit. Do I edit line 409 using “Eclipse IDE for Java Developers”? Once I successfully edit the file, how do I create the .exe file for installing the custom build of Spark on workstations and computers on my network? My goal is to create a custom executable of Spark Messenger that doesn’t have the preference options in the client, to prevent end-users from modifying any of the settings.

Hi all,

I am done with the environment setup in Eclipse, but when I try to run Spark, I get the following error:

SEVERE: Profile contains wrong format: “login name” located at: %APPDATA%\Spark\user\username

Kindly help me solve this issue.