Switch to JDK7 for install4j in Bamboo?

Daryl says Bamboo is still using Java 6 for building Spark with install4j. We can try switching to Java 7, as Spark is already bundled with Java 7. Maybe it will even fix the issue where Spark still hooks to Java 6 if it finds it on the system, though that probably needs some fixing in the code as well. Any objections, or insights into what this could break? Ideally I would want a separate new build (say 624) after switching to Java 7, so we can compare and roll back.

I build with Java 7 at my company with no issues discovered so far. I had to update the build script to get ant/the compiler to quit complaining, and the install4j script file needs to be updated as well. The compilation does throw a lot more deprecated method/class warnings, such as using URL instead of URI, etc., but it works fine.

– The attached spark-java7build.patch should update these.

I also updated all the other build/packaging script files to reflect the move to Java 7. Some/most of those look to have been added over time by people with special needs, and aren’t maintained/supported anymore, e.g. the AdvancedInstaller script.

I also changed the Java minimum from 5 to 6 when I ran across them, per a previous discussion somewhere on the forums. (I did leave the build/installer/spark_install4j_3_2_1.install4j file’s Java minimum at 5, since that install4j script looks to be for legacy use anyway.)

I should note that the current trunk HEAD is failing to build for me. It fails the CheckStyle check with 16,038 “errors”, over the configured limit of 16,000. These were likely introduced by some recent commits, but upon review of the log, almost all of the “errors” are simply stylistic differences, such as { } on different lines, extra spaces, etc. We can either enforce the CheckStyle check, or raise the “error” limit.
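For reference, if the build uses the Checkstyle Ant task, raising the ceiling is a one-attribute change. A sketch (the config file name and source path here are assumptions, not Spark’s actual layout):

```xml
<!-- Sketch: Checkstyle Ant task with a raised error ceiling.
     config path and fileset dir are illustrative assumptions. -->
<checkstyle config="build/checkstyle.xml"
            maxErrors="16500"
            failOnViolation="true">
  <fileset dir="src/java" includes="**/*.java"/>
</checkstyle>
```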

– The attached spark-checkstyle.patch should fix this by raising the “error” limit to 16,500. (This patch may conflict with spark-java7build.patch, depending on the order you apply them.)

I know nothing about the auto-search for Java 6, as all my systems run Java 7, so it’s not an issue for me. I haven’t stumbled upon anything in the codebase that would do it, so my guess is that something in install4j is doing it.
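If it is install4j, the likely culprit would be the launcher’s Java search sequence in the .install4j project file. A heavily hedged sketch (element and attribute names per install4j 5.x project files as I understand them; the exact names in our script may differ):

```xml
<!-- Sketch: ordering the bundled JRE first should stop the launcher
     from hooking a system Java 6. Element names are assumptions
     based on install4j 5.x project files, not verified against
     our spark.install4j. -->
<searchSequence>
  <directory location="jre"/>   <!-- bundled JRE, tried first -->
  <envVar name="JAVA_HOME"/>
  <registry/>                   <!-- Windows registry lookup, tried last -->
</searchSequence>
```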

I do not know the version of install4j we have on the build server… but it appears to be old (4.x?). The latest is 5.1.7. I have attached a new install4j script that is basically just the build/installer/spark.install4j file opened in install4j 5.1.7 then saved so it auto-updated the script. If no one else has a rapport with the ej-technologies guys, I can email them for updated license keys if necessary.

Install4j may refuse to build the installer for a Spark that’s built with Java 7. I get the following error:

Build failed.
Cause: com.a.a.d
Cannot instantiate action org.jivesoftware.launcher.Installer

Stack trace:
com.a.a.d: Cannot instantiate action org.jivesoftware.launcher.Installer
    at com.install4j.b.c.a.a(ejt:871)
    at com.install4j.b.c.a.a(ejt:765)
    at com.install4j.b.c.a.a(ejt:752)
    at com.install4j.b.c.a.a(ejt:734)
    at com.install4j.b.c.a.a(ejt:647)
    at com.install4j.b.c.a.a(ejt:629)
    at com.install4j.b.c.a.j(ejt:597)
    at com.install4j.b.c.a.b(ejt:186)
    at com.install4j.b.c.a.a(ejt:125)
    at com.install4j.b.b.aa.a(ejt:64)
    at com.install4j.b.b.c(ejt:363)
    at com.install4j.b.b.a(ejt:135)
    at com.install4j.b.h.a(ejt:435)
    at com.install4j.b.h.a(ejt:214)
    at com.install4j.b.h.d(ejt:106)
    at com.install4j.gui.a.run(ejt:86)

After a lot of head scratching and dead ends, it turns out all you have to do is copy a current Java 7 JRE directory over the one in the install4j install directory. This is much like how people have been replacing the Java 6 JRE directory inside the Spark install directory with that of a Java 7 JRE install. Or, depending on how the build server is set up, change the environment variable to point to a Java 7 JRE instead of Java 6.
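The two workarounds above can be sketched roughly as follows. All paths are hypothetical, and the INSTALL4J_JAVA_HOME variable is an assumption about what the build server honors, not a verified setting:

```shell
#!/bin/sh
# Sketch only: paths and INSTALL4J_JAVA_HOME are assumptions about the
# build server's layout, not verified settings.
INSTALL4J_DIR=/opt/install4j                 # hypothetical install4j dir
JAVA7_JRE=/usr/lib/jvm/java-7-oracle/jre     # hypothetical Java 7 JRE

# Option 1: replace install4j's bundled JRE with the Java 7 one
# mv "$INSTALL4J_DIR/jre" "$INSTALL4J_DIR/jre.j6-backup"
# cp -r "$JAVA7_JRE" "$INSTALL4J_DIR/jre"

# Option 2: point install4j at the Java 7 JRE via the environment
INSTALL4J_JAVA_HOME="$JAVA7_JRE"
export INSTALL4J_JAVA_HOME
echo "install4j JRE: $INSTALL4J_JAVA_HOME"
```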
spark-java7build.patch.zip (1546 Bytes)
spark-checkstyle.path.zip (468 Bytes)
spark_install4j_5_1_7.install4j.zip (5054 Bytes)

So, as I thought… a “trivial change”.

On my dev machine I compile Spark using JDK 7.

The build.xml is ready to compile Spark with JDK 7, and the resulting .jar packages are compatible with JDK 6, so there are no issues here. (We use -target=1.6 and -source=1.6.)
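In build.xml terms, that corresponds to something like the following javac settings (the source/target values are the ones quoted above; the directory names are illustrative assumptions):

```xml
<!-- Sketch: compile under JDK 7 while emitting Java 6-compatible
     bytecode. srcdir/destdir are illustrative. Without a Java 6
     bootclasspath, ant/javac will print a cross-compilation warning,
     which is the "complaining" mentioned in this thread. -->
<javac srcdir="src/java"
       destdir="target/classes"
       source="1.6"
       target="1.6"
       includeantruntime="false"
       debug="true"/>
```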

I am mostly using development builds and I am not familiar with what install4j does to create installers in Bamboo, but I do think that we should also switch to JDK 7 in Bamboo, to make sure Spark is compiled and packaged with JDK 7. I am recommending this because there is at least one memory-leak fix in JDK 7 that affects Spark (http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=6542440).

I suppose there’s no harm in leaving source=1.6 while building with Java 7; ant will just complain a little. Although target should probably be set to target=1.7 so that ant makes sure javac outputs Java 7 bytecode (it sets the compatibility mode for the compiler).

That bug appears to have been fixed in JDK 6, actually. However, there are a lot of other bugs in Java 6 that won’t ever be fixed due to its EOL, including a number of Swing bugs as well as underlying AWT bugs that affect the fluidness of Spark.

Updating the Java 7 patch.
spark_java7.patch.zip (617 Bytes)