Hi, I want to make some modifications to the Spark project and generate a build. Can anyone help me with how to do it? Is there any documentation? I'm facing a little trouble generating the build using the Eclipse IDE.
There is a guide on how to set it up in an IDE in core/src/documentation in the Spark source.
I found it. After setting up the project, it's just a matter of following the Maven build process, right? Or is there any other configuration?
Don't know, I'm not a real developer myself. I use IntelliJ a bit and I just run the Startup class if I want to test something in Spark. But I don't build an installer or package.
So, how or where can I find something to help me build this project? I'm using Eclipse and facing some problems when running Maven in it. Is there no guide on how to build?
Depends on what you mean by "build". In development terms, build is almost the same as compile. So your IDE builds the project when you try to run the Startup class. This process builds all the libraries and jar files.
If you mean creating an exe or installer, then this is not covered and is left for developers to figure out on their own. On this site the commercial install4j is used to create the exe and installer, but it costs money, so it's probably not an option for most developers, and it isn't used directly: our Bamboo system uses install4j to produce installers automatically. I personally don't know exactly how it works. You can search the forums and the internet about installers. I know some people were using IzPack in the past; maybe Advanced Installer (although it is also paid) or some other options can produce an exe or installer for Java projects.
Ok, I get it, thanks for your attention. I'm thinking of compiling the whole project and then using Inno Setup. Would that be possible? What do you think?
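For reference, an Inno Setup script is just a small config file with a few sections; a minimal sketch could look like the one below. Every name, path, and version here is a hypothetical placeholder, not Spark's actual layout, and it assumes you already have a jar that bundles its dependencies:

```ini
[Setup]
AppName=Spark
AppVersion=1.0
DefaultDirName={pf}\Spark

[Files]
; Hypothetical path to a fat jar produced by the Maven build
Source: "target\spark-jar-with-dependencies.jar"; DestDir: "{app}"

[Icons]
; Launch the jar with javaw.exe so no console window appears
Name: "{group}\Spark"; Filename: "javaw.exe"; Parameters: "-jar ""{app}\spark-jar-with-dependencies.jar"""
```

Note this sketch relies on Java already being installed on the target machine; bundling a JRE is a separate step that tools like install4j handle for you.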
Not familiar with Inno either. Isn't it only a packaging tool that bundles all the files into an installer (same as IzPack)? Unless you don't really want to generate an exe and will launch Spark.jar directly.
Yeah, Inno Setup packs everything into an installer, and I'm using it on another project. But thanks; my problem now is just generating the .jar of the Spark project with all its dependencies, which is probably because I'm not familiar with Maven and Java stuff. If you can guide me on this, great. If not, thanks for your time and patience.
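On the jar-with-all-dependencies question: a common generic Maven approach is the maven-assembly-plugin with its built-in `jar-with-dependencies` descriptor. This is a sketch of standard Maven usage, not necessarily how Spark's own pom is configured, and the main class name is an assumption you'd need to verify against Spark's source:

```xml
<!-- Added to <build><plugins> in pom.xml -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- Built-in descriptor: unpacks all dependencies into one jar -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- Hypothetical main class; check Spark's source for the real one -->
        <mainClass>org.jivesoftware.launcher.Startup</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn package` should produce an extra `*-jar-with-dependencies.jar` under `target/` that can be launched with `java -jar`.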
I'm not a developer myself, I just build Spark to test changes. I was using Eclipse in the past, but I didn't like how it works with Maven, so I switched to IntelliJ. I think @Alameyo is using Eclipse; maybe he can explain how to get everything built.
Sorry to say, but I have never worked with Inno or install4j, so I don't know how to create such an installer right now. Normally you could do File > Export > Java > Runnable JAR and deploy your app as a jar, but such a jar doesn't seem to work for me at the moment, and I never tried it earlier with Spark.
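For background on why a "Runnable JAR" works at all: the jar's MANIFEST.MF simply names a class with a standard `main` method via the `Main-Class` attribute, which is what Eclipse's export wizard wires up for you. A trivial stand-alone sketch (a hypothetical class, not Spark's real Startup):

```java
// Minimal entry point: a runnable jar's MANIFEST.MF points its
// Main-Class attribute at a class exactly like this one.
public class Launcher {
    public static void main(String[] args) {
        System.out.println("Starting application...");
        // A real launcher (like Spark's Startup class) would
        // initialize the client UI and connections here.
    }
}
```

If the exported jar fails to start, a quick check is `java -jar yourfile.jar` from a terminal; a missing `Main-Class` attribute or missing dependencies will show up as an error there rather than failing silently.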
If you want to run it from the source code, you just need to go to the Startup class and click Run As > Java Application; Eclipse builds the project automatically before running it.