Spark 2.6.0 does not work on Mac OS X Leopard

I noticed I couldn’t get Spark 2.6.0 to work even in Snow Leopard until I did a Mac software update and installed all of the latest updates. Once I did that, I was very pleased with the progress that was made on the Mac side from RC1 to the final version.

I’m actually OK with Spark not working in Leopard, as I’d rather you focus on current gen software. But at the moment, I do have a couple of Mac clients that won’t be able to update to Snow Leopard for a bit and I thought I’d see if anyone could offer me some guidance:

  1. Any suggestions to try and get Spark 2.6.0 working in Leopard? (I already tried doing all available updates for Leopard)

  2. Failing that, I need to downgrade to Spark 2.6.0 RC1 (which worked in Leopard, though with several bugs). However, I don’t see anywhere to download old versions of Spark. Where can I get the Spark 2.6.0 RC1 dmg file?

You can always build it yourself; OS X ships with everything needed (except for svn).

  1. Check out the RC1 sources (in Terminal):

svn checkout -r12000 http://svn.igniterealtime.org/svn/repos/spark/trunk ~/spark

  2. Compile with ant (still in Terminal):

ant -f ~/spark/build/build.xml mac.app

or

ant -f ~/spark/build/build.xml installer.mac

All of this assumes that ~/spark points to a folder called spark in your home directory.
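Before running the two steps above, it may help to confirm the needed tools are actually installed. This is only a sketch; the `check_tools` helper is a hypothetical name, not part of Spark's build:

```shell
# Sketch: verify the tools the checkout/build steps above need.
# A stock OS X install ships ant and a JDK; svn must be added separately.
check_tools() {
    missing=0
    for tool in "$@"; do
        if command -v "$tool" >/dev/null 2>&1; then
            echo "$tool: found"
        else
            echo "$tool: MISSING"
            missing=1
        fi
    done
    return $missing
}

check_tools svn ant java || echo "install the missing tools before building"
```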


By “does not work in Leopard”, what exactly do you mean?

Wolf_P. wrote:

By “does not work in Leopard”, what exactly do you mean?

Same behavior I had in Snow Leopard until installing all OS updates: when I click to start it, the Spark icon bounces once in the Dock to indicate it is trying to start, then nothing. It does not show up in the running-processes list, so I assume it is crashing at startup.

Could you post your error log?

It is located at ~/Library/Application Support/Spark/logs/error.log

I would advise deleting it first, starting Spark once, and then posting it, to make sure the error log reflects the current issue and not older ones.

There is also Console.app, in /Applications/Utilities, which will show some additional output about Java-specific issues, if that is what is going on.
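Clearing the old log before the next launch can be scripted like this sketch; the `clear_log` helper is a hypothetical name, the path is the default one given above, and the demo runs against a temporary file rather than a live install:

```shell
# Sketch: delete the old error log so the next Spark launch writes a
# fresh one.
clear_log() {
    if [ -f "$1" ]; then
        rm "$1" && echo "cleared: $1"
    else
        echo "no log at: $1"
    fi
}

# On a real system:
#   clear_log "$HOME/Library/Application Support/Spark/logs/error.log"
# then start Spark once and post the newly written error.log.
DEMO=$(mktemp)
clear_log "$DEMO"
```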

You might also try removing the apple and growl plugins located at

/Applications/Spark.app/Contents/Resources/plugins

and

~/Library/Application Support/Spark/plugins

as these only work with the newest Apple Java.

There is no Spark folder under Application Support, nor is there a Spark.app under Applications.

I should note that the two systems running Leopard are also set to Spanish, so Library shows as Librería and Applications as Aplicaciones, but I’m sure these are just localized display names and it shouldn’t affect anything?

If there’s no Spark.app in your Applications folder, where is it then?

Also note that Spark.app is a bundle and might be displayed as “Spark” instead of “Spark.app”.

Try locating the Spark folder:

Open Finder, press SHIFT+CMD+G, and type:

~/Library/Application Support/Spark/

it should bring you to the right folder

For Spark.app:

Open Finder, press SHIFT+CMD+G, and type:

/Applications/

Then navigate to Spark, right-click it, and select “Show Package Contents”.

Then navigate to …/Contents/Resources/

For the rest, proceed as described in the post above.

Or in Terminal:

rm /Applications/Spark.app/Contents/Resources/plugins/apple.jar

rm /Applications/Spark.app/Contents/Resources/plugins/growl.jar

rm -r ~/Library/Application\ Support/Spark/plugins/growl*

rm -r ~/Library/Application\ Support/Spark/plugins/apple*
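The same removal can be done defensively, deleting only what actually exists (the plain `rm` commands above fail with an error if a file is already gone). This is just a sketch; `remove_plugins` is a hypothetical helper, demonstrated here against a mock directory rather than a live install:

```shell
# Sketch: remove the apple/growl plugins only if they are present.
remove_plugins() {
    for p in "$1"/apple* "$1"/growl*; do
        # an unmatched glob stays literal in sh, so test before removing
        [ -e "$p" ] && rm -r "$p" && echo "removed: $p"
    done
    return 0
}

# Demo against a mock layout; on a real system pass
# /Applications/Spark.app/Contents/Resources/plugins and
# "$HOME/Library/Application Support/Spark/plugins" instead.
MOCK=$(mktemp -d)
touch "$MOCK/apple.jar" "$MOCK/growl.jar" "$MOCK/other.jar"
remove_plugins "$MOCK"
ls "$MOCK"    # only other.jar remains
```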

I followed your instructions exactly, but the ~/Library/Application Support/Spark folder does not exist.

I was able to delete apple.jar and growl.jar by viewing the package contents, but the application still does not start, and no logs are created.

Not very helpful I know.

I’ll just try to compile Spark RC1 for now.