Hi, I’m new to Spark; I tried to build it from the SVN source code, and I’m having trouble with all the non-jivesoftware packages it imports. I (painfully) resolved the XMLPull API and XPP3 ones, but there are still a fair number left, some of which you have to BUY, e.g. install4j.com. Any tips on how to build properly from source? Or can some of these external packages (like Synthetica) be bypassed? (I’m not interested in look-and-feel, for example.) Thanks!!
Hi Michaela, welcome to the Ignite Realtime community B-).
What tool are you using to compile Spark? If you use Eclipse, you might want to have a look at the Openfire SVN + Eclipse 3.3 + Subversive Installation Guide. It is written for Openfire, but perhaps it could also work for Spark. Your interest would probably begin at the “Create Openfire Project” topic.
Hope that helps.
I tried building Spark using the guides from the doc just now, and it succeeded on the first run! Admittedly, you need install4j to execute some of the ant tasks, but the good news is you only need to worry about install4j; everything else is well taken care of.
FYI, I downloaded install4j from http://www.ej-technologies.com/download/install4j/files.php. If you just care about being able to build Spark, an evaluation license should suffice.
Thanks so much aznidin, it all worked much better following your instructions. Below I attach the changes you would need to make to your Openfire doc so that it applies to Spark (besides the obvious openfire -> spark substitutions). However, I still have build errors in the plugins directory; most of the files there complain that imports such as com.apple or com.google cannot be resolved. I think we are very close now.
– changes to DOC-1020:
In the “Create Project Builder” section, instead of selecting ServerStarter - org.jivesoftware.openfire.starter, select Startup - org.jivesoftware.launcher
Instead of -DopenfireHome="$/target/openfire", use -DsparkHome="$/target/spark" (i.e., replace openfire with spark).
Also, instead of the openfire::src::i18n folder, use spark::src::resources::i18n; and for openfire::src::resources::jar, do not select anything (there is no “jar” directory).
At this point there are still compile errors under src. In particular, spark\src\java\org\jivesoftware\GSSAPIConfiguration.java has an error that is corrected by moving GSSAPIConfiguration into the org.jivesoftware package. But most of the errors remain in the src/plugins directory.
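For reference, the GSSAPIConfiguration fix above amounts to making the package declaration match the file’s location under the source root; Eclipse flags any mismatch as a compile error. A minimal sketch of the idea (the class name here is hypothetical, not Spark’s actual file):

```java
// Hypothetical illustration: a file living under src/java/org/jivesoftware
// must declare the matching package, or Eclipse reports a
// "declared package does not match the expected package" error.
package org.jivesoftware;

public class PackageCheck {
    public static void main(String[] args) {
        // The fully qualified name is derived from the package declaration.
        System.out.println(PackageCheck.class.getName());
    }
}
```

So moving the class into org.jivesoftware (or fixing its package line to match its directory) resolves that particular error.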
I suppose you have the auto-build feature on, and you’re looking at Eclipse’s “Problems” view and seeing all those plugin-related errors. You need not worry about them. Those are part of Spark’s SVN directory structure, files that Eclipse included in the project when you created it. You can safely ignore them unless you plan to customize and build the plugins.
I think most of those libraries are not available for download. In fact, if you look at Spark’s main build.xml, there are no build tasks for them except for apple and growl in the mac.build.plugin ant task. I don’t think you want to run that task; those plugins don’t even exist as part of the official Spark distributions or “Available Plugins”. Also, you don’t need to define -DsparkHome="$/target/spark".
Anyway, you inspired me to write a document for Spark! Thank you.
I went through and tried your suggestion, and it worked. I managed to debug and stop at the main() method in Startup.java. Well, there’s actually one tiny problem: you need to drop i18n from spark::src::resources::i18n.
Worked wonderfully. A couple of mods to the new doc: the main class should be org.jivesoftware.launcher.Startup (it doesn’t work without Startup), and after the “Add classes” step you need to insert the following step:
-On the Folder Selection window, select the spark::src::resources folder and click OK.
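The reason the full class name matters: an Eclipse Run Configuration needs a class with a public static void main entry point, and org.jivesoftware.launcher on its own is only a package. A hypothetical stand-in (not Spark’s real Startup, which does far more) showing the entry point and how a VM argument like -DsparkHome would be read:

```java
package org.jivesoftware.launcher;

// Hypothetical stand-in for the launcher entry point. The JVM (and
// Eclipse's Run Configuration) resolves the fully qualified class name
// org.jivesoftware.launcher.Startup and invokes its main method.
public class Startup {
    public static void main(String[] args) {
        // VM arguments of the form -Dname=value surface as system properties,
        // so a -DsparkHome="..." setting would be visible here.
        String sparkHome = System.getProperty("sparkHome", "<not set>");
        System.out.println("sparkHome = " + sparkHome);
    }
}
```

If -DsparkHome really is unnecessary (as noted earlier in the thread), the property simply comes back unset and the launcher falls back to its defaults.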