Spark installer .jar for OpenSolaris/OpenIndiana


Around the 2.6.0 beta of Spark, there was a .jar installer available which I used to test on OpenSolaris. I note this is no longer available, and the existing installers appear to be Linux-only, if the error message generated when attempting to run the .tar file installer is anything to go by:

Preparing JRE …

unpack200: Cannot find /lib/

./Spark: line 150: 2477: Killed
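For context: unpack200 is the JDK tool that such launchers use to expand the compressed .pack files of their private JRE, so the failure above suggests the Linux JRE bundled inside the installer simply cannot run on OpenIndiana. A possible (untested) workaround, assuming the launcher was generated with install4j (the "Preparing JRE …" message suggests it was), is to point it at a system JVM so it skips the bundled JRE. Both the override variable name and the JDK path here are assumptions from memory; the generated ./Spark script itself (near the line 150 the error mentions) should show the exact variable it checks:

```shell
# Untested sketch: point an install4j-style launcher at a system JVM
# instead of the bundled Linux JRE.  The variable name and the JDK path
# are assumptions; verify them against the generated ./Spark script.
INSTALL4J_JAVA_HOME_OVERRIDE=/usr/jdk/latest
export INSTALL4J_JAVA_HOME_OVERRIDE
echo "Would launch with JVM at: $INSTALL4J_JAVA_HOME_OVERRIDE"
# ./Spark    # then re-run the launcher
```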

Is there any interest or resource from the current Spark maintainers to make Spark available for OpenIndiana? Alternatively, is there straightforward documentation I could point a developer to for creating a working Spark installer for OpenIndiana?



You can search the forums for ‘izpack’. Those jar installers were made with that software. Maybe you’ll be able to create your own installer. I have also filed the request SPARK-1430.
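For anyone who wants to try the IzPack route: an IzPack installer is driven by an install.xml descriptor that lists the app metadata and the file sets to install. A minimal sketch might look like the following; the application name/version, window sizes, and the build directory are placeholders, not taken from Spark's actual build:

```xml
<!-- Minimal IzPack install.xml sketch; paths and versions are illustrative -->
<installation version="1.0">
    <info>
        <appname>Spark</appname>
        <appversion>2.6.0</appversion>
    </info>
    <guiprefs width="640" height="480" resizable="yes"/>
    <locale>
        <langpack iso3="eng"/>
    </locale>
    <panels>
        <panel classname="InstallPanel"/>
        <panel classname="FinishPanel"/>
    </panels>
    <packs>
        <pack name="Core" required="yes">
            <description>Spark application files</description>
            <fileset dir="target/build" targetdir="$INSTALL_PATH"/>
        </pack>
    </packs>
</installation>
```

Compiling that descriptor with the IzPack compiler would produce a cross-platform .jar installer similar to the old 2.6.0-beta one.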

Well, I am afraid there will be no Solaris version of Spark in the near future, unless you do it yourself or find a Java developer on Solaris and encourage him/her to dig into the Spark code. The old Solaris code may still be available in older versions of the Spark source (in SVN).

Lots of code segments in Spark are only open source as of now (compared to the last commercial Spark client). I would dare to guess that this will cause issues on Solaris.

The Mac edition of Spark was and is pretty close to disappearing, since no Mac people are around and Apple is working to get rid of Java on Mac OS.

Thanks wroot,

Cheers Dave

Thanks Walter. To be clear, I am not talking about Solaris: OpenIndiana is presently completely open source. I will therefore be looking to engage a developer to build an OpenIndiana version of Spark. I mean, Spark is available for Windows, and that’s not exactly open source.

Your comments about Mac OS are interesting. I can understand there being a lack of developers with Mac OS experience working on Spark, but can you point me to any concrete evidence that Apple is “working to get rid of Java on Mac OS”, and why that would make engaging a developer to improve Spark for Mac a waste of time? Mac OS is not my platform of choice these days, but as far as I understand they have simply stopped bundling the JRE with the OS (i.e. the same situation Windows has always had), and installing it if needed is very straightforward; they are also contributing proprietary bits to the OpenJDK project.



As far as I know, Apple has removed (or blocked) some Java functionality on Mac OS X, so it was very hard to create a stable version of Spark 2.6.x for that platform. A few developers working on Spark this spring/summer had to use workarounds because of these changes, and this may cause problems in the future.

There was no intent to suggest that OpenIndiana is not a good platform. My statement was meant to show the issues this project has. We barely manage to keep the primary platform (Windows) supported. Linux already causes regular issues during the build process (maybe only due to imperfect configuration of Bamboo).

Apple has reduced the integration of Java in Mac OS X. Apparently several standard Mac functions (like pop-ups) are no longer reachable via Java. Java still runs, but it is not tightly integrated anymore.

Hi Walter,

Thanks for the info, and sorry if I came off as sounding a bit defensive.



Hi Walter/All

I potentially have a developer available who is looking into the feasibility of building Spark for OpenIndiana and creating an attendant installer. If this works out it would be beneficial to both Spark and OpenIndiana development and project visibility.

The developer has asked for source code access - I am not a developer myself so would you be able to let me know how this can be arranged? And in general, what would be the most appropriate/effective way for him to communicate with the rest of the Spark developers (such as yourself) and the wider Spark development community - basically how best to get started?

I would like to contribute something to this project, rather than make vague claims of being able to donate “time” etc. I feel this would be something concrete.



You can access the source in read-only mode via SVN. If you need to edit the source, you should talk to Walter. If it is a one-time edit, then maybe just a patch will do.
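The read-only workflow for your developer would be roughly: check out the trunk, build and modify locally, and post a patch for review. A sketch follows; the repository URL is my recollection of where the Spark trunk lived and should be verified on igniterealtime.org first, which is also why the commands are left commented out here:

```shell
# Sketch of the read-only SVN workflow (URL unverified; confirm it first):
#
#   svn checkout http://svn.igniterealtime.org/svn/repos/spark/trunk spark
#   cd spark
#   ... build, make OpenIndiana changes ...
#   svn diff > openindiana-build.patch   # patch to post on the forum
#
FLOW="checkout -> edit -> svn diff -> post patch"
echo "$FLOW"
```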

There is no special place to talk with developers (we don’t have many of them) other than this forum. You can also join the open_chat room on the server. Usually some of us gather there on Wednesdays (at 10:00am PST (17:00 UTC/GMT), I believe). You can also arrange to meet someone there and discuss issues.

Walter, I wonder whether it is possible to tune Bamboo to produce a jar installer? It would be nice to have all installers built automatically, so we won’t have to wait until a Solaris installer is provided. If that is not possible, then Dave or his developer would have to send/upload the installer for us every time a new version of Spark is released. Or maybe OpenIndiana also has a package manager, and Spark could be provided as a package to its users?

Hi Dave,

everything you need to get started with development is documented here:

If you want to have an automated build for Spark, you would need a Bamboo agent to be set up and tested. Bamboo is our nightly build generator. Creating a plan requires some testing and experience. The best way would be to get a Bamboo instance up and running that fits the needs of the community. There is a $10 Bamboo license available where the $10 goes to charity. It might be worthwhile to spend.

With respect to OpenIndiana enhancements, I would suggest discussing a Spark/OpenIndiana roadmap in the Spark Dev forum and submitting patches to the code via Spark Dev. I would prefer to have issues in Jira for the OpenIndiana changes, just to ensure that everything is documented.

I have no idea about OpenIndiana installers/packagers etc. I am sure that you will have something available on that platform.