Yes, version 2.7.0 of Spark is still in the works, although admittedly progress has been slow (lack of active developers at the moment, everyone’s been pretty busy!).
Spark 2.6.3 will indeed work with Java 7, and it does fix some of the Java 6-related issues some users experience.
I should note, though, that the regular Spark install uses its own embedded Java version, which can’t be exploited in the way your IT company is thinking. They are probably concerned about the recent Java exploits, which have all targeted Java applets. The embedded Java version can’t be used by your browser by default, and there’s no browser plugin, so it’s basically not vulnerable to those attacks.
As I mentioned earlier, Java 7 will work fine with Spark 2.6.3. To set up Java 7 on Spark, simply install Spark, then copy the Java 7 JRE directory from your Program Files and replace the jre directory found inside the Spark installation directory.
copy: C:\Program Files\Java\jre7 to C:\Program Files\Spark\
then rename jre7 to just jre, replacing the original jre directory that was inside the Spark installation directory. Restart Spark, and it will now be running on Java 7.
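If it helps, the copy-and-rename steps above can be sketched as a couple of Windows commands (paths assume the default install locations for Spark and the Java 7 JRE, so adjust them if yours differ; run from an elevated command prompt):

```shell
:: Keep the original embedded JRE around as a backup, just in case
move "C:\Program Files\Spark\jre" "C:\Program Files\Spark\jre.bak"

:: Copy the Java 7 JRE into the Spark directory under the name "jre"
:: (/E copies all subdirectories, /I treats the destination as a directory)
xcopy /E /I "C:\Program Files\Java\jre7" "C:\Program Files\Spark\jre"
```

After this, restarting Spark should pick up Java 7 from the replaced jre directory.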
If your IT company is still worried, they always have the option to compile their own custom version of Spark, or to use one of the beta/nightly 2.7.0 builds. The beta/nightly builds may not be as “stable” as a regular release, but many of us are using them in production environments just fine, and they come bundled with Java 7 already.