Changing Spark's language entirely to Chinese

Hi everyone,

I would like to ask a question: how can I make Spark display all of its features and content in Traditional Chinese?

If the code needs to be changed, could someone tell me which part of the code is language-related and needs to be modified?

Thanks for the help.

Language files are located in src\resources\i18n
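They are standard Java resource bundle files, so the Traditional Chinese file is picked up by the usual locale lookup. A rough sketch of how that resolution works is below; the bundle base name and the key are only illustrative guesses, not copied from Spark's source:

    import java.util.Locale;
    import java.util.ResourceBundle;

    public class BundleCheck {
        public static void main(String[] args) {
            // Locale.TAIWAN is zh_TW, so i18n/spark_i18n_zh_TW.properties wins if it
            // exists on the classpath; otherwise the lookup falls back through zh to
            // the default spark_i18n.properties. Base name and key are placeholders.
            ResourceBundle bundle = ResourceBundle.getBundle("i18n.spark_i18n", Locale.TAIWAN);
            System.out.println(bundle.getString("title.login"));
        }
    }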

If you make an updated translation file and would like it to be included in the next Spark version, attach it here. Attaching the i18n file itself is enough, though it is better to make a diff containing only your changes, so I can apply it as a patch to the source.

Thank you for your reply

I followed your reply to look for that location, but I could not find the i18n file,
so I am asking you once again.

Could you help? Thank you.

I pointed to where it is located in the source code of Spark, since you mentioned “code”. E.g. igniterealtime/Spark · GitHub

If you want to change it just for your own copy of Spark, you can also open Spark\lib\spark.jar (in the folder where you have Spark installed) with an archive program, go to the i18n folder inside it, extract the language file you need, edit it, and then add it back to spark.jar, letting the archive program repack the archive.
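If you would rather script that step than use an archive program, a small Java program roughly like the following can overwrite the entry inside spark.jar. This is only a sketch; the language file name is an example, so use the one you actually edited:

    import java.net.URI;
    import java.nio.file.*;
    import java.util.Collections;

    public class UpdateSparkJar {
        public static void main(String[] args) throws Exception {
            // Open spark.jar as a zip file system and replace one entry in place.
            // Run this from the Spark\lib folder; the properties file name below
            // is just an example.
            URI jarUri = URI.create("jar:" + Paths.get("spark.jar").toUri());
            try (FileSystem jarFs = FileSystems.newFileSystem(jarUri,
                    Collections.<String, Object>emptyMap())) {
                Files.copy(Paths.get("spark_i18n_zh_TW.properties"),
                           jarFs.getPath("/i18n/spark_i18n_zh_TW.properties"),
                           StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }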

Thank you for your attentive reply.

After making the change, do I need to restart Spark for the settings to take effect?

Thank you.

You won’t be able to edit spark.jar while Spark is running, so yes, you would have to stop it, edit the file, and run it again.

Sorry to ask again …

May I ask which program can be used to edit it?

Thank you

The i18n files use Unicode escape codes for non-Latin characters, like \u63a5 (a Chinese character). You can edit the file with Notepad, but you will have to use those codes instead of the characters themselves.

I usually edit it in Eclipse, which I also use to edit the source code. It converts non-Latin characters to Unicode escape codes automatically (I think it does; I haven’t done this in a long time, but at least NetBeans did). But it is a huge IDE, and installing and configuring it just to edit language files seems like too much. There may be other programs (like Popeye) for this, but I haven’t used any of them.
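If you do not want to install an IDE just for this, a small standalone Java program along these lines could do the conversion after you edit the file in UTF-8. Again, just a sketch; the file names are placeholders:

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.*;

    public class EscapeProperties {
        public static void main(String[] args) throws IOException {
            // Read a properties file that was edited and saved as UTF-8, and write a
            // copy where every non-ASCII character is replaced by its \uXXXX escape.
            String text = new String(
                    Files.readAllBytes(Paths.get("spark_i18n_zh_TW.properties")),
                    StandardCharsets.UTF_8);
            StringBuilder escaped = new StringBuilder();
            for (char c : text.toCharArray()) {
                if (c < 128) {
                    escaped.append(c);                                // keep plain ASCII as-is
                } else {
                    escaped.append(String.format("\\u%04x", (int) c)); // e.g. 接 -> \u63a5
                }
            }
            Files.write(Paths.get("spark_i18n_zh_TW.escaped.properties"),
                        escaped.toString().getBytes(StandardCharsets.US_ASCII));
        }
    }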

I’ve been using Notepad to edit the file and saved it in Unicode format, but after overwriting the original file in spark.jar, I found that Spark was unable to open. What should I do?