Spark source problem

I am debugging the Spark source in Eclipse, and it reports an error like the following:

Exception in thread "AWT-EventQueue-0" java.lang.ExceptionInInitializerError

at org.jivesoftware.LoginDialog$LoginPanel.(Res.java:33)

… 12 more

Afterwards, I checked the code and found that the required source files are present in that package.

I don't know what I can do to resolve this problem.

PLEASE HELP ME!

THANKS!

It's this portion of your stack dump that is important:

Caused by: java.util.MissingResourceException: Can't find bundle for base name i18n/spark_i18n, locale es

at java.util.ResourceBundle.throwMissingResourceException(ResourceBundle.java:836)

You just need to include the spark/src/resources directory with your source.

Not sure how Eclipse does this, but Netbeans just has a section to load in all the source code; you would include this directory there as well.

It contains the i18n/spark_i18n file and many more.
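For context, the failing call is the standard `java.util.ResourceBundle` lookup. A minimal standalone sketch (the `BundleCheck` class name and the printouts are mine, not from Spark's code) shows why the exception appears whenever `spark/src/resources` is missing from the runtime classpath:

```java
import java.util.MissingResourceException;
import java.util.ResourceBundle;

public class BundleCheck {
    public static void main(String[] args) {
        try {
            // Spark resolves its UI strings through this bundle; the lookup
            // only succeeds if the resources directory (which contains
            // i18n/spark_i18n*.properties) is on the runtime classpath.
            ResourceBundle bundle = ResourceBundle.getBundle("i18n/spark_i18n");
            System.out.println("Bundle found, locale: " + bundle.getLocale());
        } catch (MissingResourceException e) {
            // Without the resources directory you land here, which is the
            // same failure wrapped in the ExceptionInInitializerError above.
            System.out.println("Missing: " + e.getMessage());
        }
    }
}
```

Run it with only the compiled class on the classpath and you get the "Can't find bundle" message; add the resources directory to the classpath and the lookup succeeds.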

_Aaron

Thanks for your help!

In fact, I had already copied the spark/src/ directory into my Spark workspace, because I have to do that when I run ant on the build.xml file. But I have not found where i18n/spark_i18n is.

And when I run the Java app, even though the build file reports BUILD SUCCESSFUL, it always has that problem.

Now I suspect something is missing on my PC or in Eclipse.

I am reinstalling Eclipse now!

Hello Aaron

I am having the same problem, but I debug the Spark source with IntelliJ IDEA 5.0. Any suggestions?

Another thing: I saw the same question on the Spark Support community, and the suggestion that was given is:

Simply copy the spark_i18n_en.properties file into spark.jar\i18n\ under the name of your locale, e.g. spark_i18n_zh_CN.properties or something like that.

I cannot understand this suggestion either. Any ideas?

thanks

Regards

Rita

Hey Rita,

It seems that was for someone who reads Chinese, maybe?

I don't know how IDEA does it either; it has been a long, LONG time since I used IDEA (since it stopped being free).

I'd think it would be the same way you imported the source files for your Spark project.

Just import the resource directory as well.

You should have gotten this directory with your subversion checkout.

directory is: spark\src\resources\i18n\spark_i18n.properties
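As for the copy-the-properties-file suggestion above: it relies on the standard `ResourceBundle` file-naming convention. For locale `zh_CN` the loader looks for `spark_i18n_zh_CN.properties` first, so copying the English file under your locale's name satisfies the lookup. A small self-contained demo of the same mechanism (the temp directory, the made-up `title` key, and the `LocaleFallback` class name are mine, for illustration only):

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Locale;
import java.util.ResourceBundle;

public class LocaleFallback {
    public static void main(String[] args) throws Exception {
        // Simulate the suggested fix: put a copy of the strings under a
        // file named for the requested locale (here "es").
        Path dir = Files.createTempDirectory("i18n-demo");
        Path i18n = Files.createDirectories(dir.resolve("i18n"));
        Files.writeString(i18n.resolve("spark_i18n_es.properties"),
                          "title = Spark");
        // Load the bundle through a class loader rooted at the temp dir,
        // just as Spark loads i18n/spark_i18n from its classpath.
        try (URLClassLoader loader =
                 new URLClassLoader(new URL[] { dir.toUri().toURL() })) {
            ResourceBundle b = ResourceBundle.getBundle(
                "i18n/spark_i18n", new Locale("es"), loader);
            System.out.println(b.getString("title")); // prints "Spark"
        }
    }
}
```

Delete the `_es` file and the same `getBundle` call throws the MissingResourceException from the original stack trace, which is presumably why the locale-named copy was suggested.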

How to import source into an existing project in your IDE is something you should look into yourself, though.

If I had more time I would go through their website and try to find it for you.

Hope this somehow helps,

_Aaron

Message was edited by: cyclone

Hey Aaron… thanks for your reply.

I can try Netbeans. Can you elaborate a bit more on how to solve this issue in Netbeans?

Regards

Rita

It is pretty easy with Netbeans, and I don't want people thinking I am trying to convert everyone TO Netbeans.

Especially some of the Eclipse users out there.

Here are the steps with Netbeans though:

Just use the Project Creation Wizard.

=-----

Click File>New Project

Under Categories Select >General

Under Projects> Project With an Existing Ant Script

Click Next

Under this next step you'll want to select the directory where you put Spark.

Then in that directory go to the build directory for the build.xml Ant script.

Click Next

Here is where the trick is.

You will want to add in the directory path of spark\src\java for the source

AND you need to add spark\src\resources <— This is where you were having your problems with IDEA

Then, if you want, you can create a tests directory and add it. It is usually a good idea.

Click Next

Here you will want to put in ALL the .jars in spark\build\lib

This is including the jars in the sub directories of lib as well.

You should be finished after that.

=----

Their website has a better tutorial for this type of example.

It is just a typical Project setup.

Make sure you read the steps VERY carefully in the Netbeans Project Wizard.

Good luck and Happy Coding,

_Aaron

Thanks for your detailed reply, Aaron.

I followed the above-mentioned steps using Netbeans.

After this I built the Main Project, which was successful. Then I ran the Main Project, and the Output window shows the following contents:

init:

resources:

base:

Building jar: E:\Spark\target\build\lib\base.jar

build:

build_tests:

run_tests:

BUILD SUCCESSFUL (total time: 4 seconds)

When I run the project, I should get the Spark messenger client window (where I can enter my user ID and password) instead of this output window.

How can I get the required output? I just followed your given steps and did nothing else. Are there any other steps that I need to follow?

Regards

Rita

You may want to use the startup.bat in spark\target\build\bin AFTER you have compiled Spark though.

This way it loads all the necessary libs and gives you a console screen with debug information as well.

I also recommend reading some documentation on Netbeans at netbeans.org.

They should have any answers you need for netbeans questions.

If I had more time I would answer all your Netbeans questions, but it would be easier for you to check out their site.

You mean after compilation I should go to the command prompt and then use the startup.bat file to execute the project, right?

Can I somehow use the startup.bat file in Netbeans to execute the project?

thanks

regards

Rita

yes…

And I don't think you can with Netbeans.

Netbeans is just using the ant script to run the project.

So it is limited to whatever rules you can set with Ant.

What you are saying about Netbeans only applies to build.xml.

Yes, the code builds (BUILD SUCCESSFUL) with the build file.

But that error always occurs when I RUN the Java application in ECLIPSE.

Why?

And Netbeans, I think, is like Eclipse.

Thanks for your help.

alonlier:

I am taking a guess at what your question might have been.

You will not want to do a BUILD RUN from Netbeans.

Instead use the supplied startup.bat and startup.sh scripts.

These will also output debugging information that could be useful.

thanks

Aaron

Hi cyclone:

Thanks for your help, but I still do not have a good idea of how to fix it yet.

Please tell me more detail about what you said. :slight_smile:

Thank you!

Can you refine your question a bit?

Are you just trying to figure out how to run spark once you have compiled it?

That's actually what I'm wondering…

I followed what you said, and my output console says this…

init:

resources:

base:

Building jar: /Users/stevenguitar/Development/Spark/target/build/lib/base.jar

build:

build_tests:

run_tests:

BUILD SUCCESSFUL (total time: 1 second)

I guess I was sort of expecting to see the Spark application fire up. I am running on Mac OS X, btw, so I guess I should try to run that .sh file from the command prompt? I noticed that when I built it in Netbeans it created a "target" directory. Is this where I need to get the startup.sh file from?

Hi,

Stguitar, I think the "target" directory already exists within the Spark directory (spark/target); NetBeans creates a folder called /nbproject.

AND

I'm having the same problem too: no exceptions appear while running,

but Spark itself is not executed!

I also launched Spark using startup.bat (for Windows) in the spark/target/build/bin folder, and got this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/jivesoftware/Spark

I hope someone can help us…

Thanks,

Hi cyclone,

Thanks for your support, as always.

I had dropped the Spark work for a long time, and now I am working on it again, so this problem is reappearing in front of me. But this time I must modify or write small pieces of code as a Spark plugin to add into Spark.

So I also have a lot of other problems that I haven't run into yet, and I will post them here; I hope you can help me. I think the problems will be in the code!

Another thing about this problem: I run Spark in Eclipse, not in Netbeans. I compiled it successfully, but when I run the Java application in Eclipse, it reports the error. When I packaged the source with the install4j tool, it worked OK.

So I think the Spark source cannot be run in Eclipse. As for Netbeans, I don't know what would happen, because I have never used it. Sorry!

Thanks again!

Message was edited by: alonlier

I use Netbeans as just my editor and compiler.

I then use startup.bat/startup.sh for the actual running of Spark.

Another thing you can do is copy over the spark.jar from the target\lib directory into your Spark\lib directory.

Then just run Spark normally like you usually do, double-clicking it from the desktop or such.

I've never really concentrated on running Spark from Netbeans or Eclipse, mainly because the required libs have been changed a little bit from release to release.

Just make sure the paths are correct in your startup.bat or startup.sh file.

_Aaron