Compilation Errors in Spark

Hi guys, I'm pretty new to Spark development. I downloaded the source from SVN and tried compiling with Ant, but I received the following errors. Do I need another set of plugin libraries, or some Apple dev kit?

build:

Compiling 295 source files to C:\Spark\target\classes

C:\Spark\src\plugins\apple\src\com\jivesoftware\spark\plugin\apple\ApplePlugin.java:12: package com.apple.eawt does not exist
import com.apple.eawt.Application;
^
C:\Spark\src\plugins\apple\src\com\jivesoftware\spark\plugin\apple\ApplePlugin.java:13: package com.apple.eawt does not exist
import com.apple.eawt.ApplicationAdapter;
^
C:\Spark\src\plugins\apple\src\com\jivesoftware\spark\plugin\apple\ApplePlugin.java:14: package com.apple.eawt does not exist
import com.apple.eawt.ApplicationEvent;

Have you found a solution to this problem? The same thing is happening to me.

Hi,

It may help to delete the apple plugin folder if you are not compiling on a Mac. I assume these classes are only included in Apple's OS X JDK.
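If you want to keep the apple sources around but still build on Windows, the usual trick is to reach the Apple classes through reflection so there is no compile-time dependency on com.apple.eawt. A minimal sketch, my own illustration rather than the actual ApplePlugin code: com.apple.eawt.Application and its static getApplication() are the real Apple API, everything else here is hypothetical.

    import java.lang.reflect.Method;

    public class AppleHooks {

        // Touch the Apple classes only at runtime, and only on a Mac,
        // so this file compiles on any JDK.
        public static void installIfMac() {
            String os = System.getProperty("os.name").toLowerCase();
            if (!os.startsWith("mac")) {
                return; // not OS X, nothing to install
            }
            try {
                Class<?> appClass = Class.forName("com.apple.eawt.Application");
                Method getApplication = appClass.getMethod("getApplication");
                Object application = getApplication.invoke(null);
                // register dock/quit handlers on 'application' via further reflection
            } catch (Exception e) {
                // Apple classes missing or changed; run without the integration
            }
        }
    }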

Kind regards

How can I get com.apple.eawt.*? I cannot find it in the source or SVN. If you know, please tell me! Thank you!

It's probably a plugin. Check the Spark source --> plugins directory, compile the apple plugin, and move it to your classpath (lib directory).
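In case it is unclear what "the apple plugin" actually is: a Spark plugin is just a class implementing Spark's Plugin interface, packaged as a jar. A bare-bones skeleton for reference; the method names below are what I recall from org.jivesoftware.spark.plugin.Plugin, so double-check them against your source tree.

    import org.jivesoftware.spark.plugin.Plugin;

    // Minimal Spark plugin skeleton; the apple plugin has the same shape
    // but adds com.apple.eawt calls, which only resolve on Apple's JDK.
    public class ExamplePlugin implements Plugin {

        public void initialize() {
            // called when Spark loads the plugin
        }

        public void shutdown() {
            // called when Spark shuts down
        }

        public boolean canShutDown() {
            return true;
        }

        public void uninstall() {
            // cleanup when the plugin is removed
        }
    }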