Interestingly, Spark runs on Ubuntu with OpenJDK 10 without any adjustment, though it prints a lot of errors to the console about plugins and an unsuitable Java version. But at least it runs.
I have tested Spark 2.8.3 and a not-yet-released 2.9.0 build with the AdoptOpenJDK 8 JRE 8.0.212 (latest at the moment), and both seem to work fine. No plugin errors like with Java 10+.
About the licensing change with Oracle's Java: I think it only affects distribution, and only starting with 8.0.212 (the April update). It was still fine to bundle the January 2019 update, and Spark 2.8.3 currently bundles an even older version. Again, I don't think this affects actually using Java with a desktop app. But if you want to be strict about it, you can remove Oracle's Java and install OpenJDK 8 instead.
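For reference, here is a rough sketch of how that swap might look on Ubuntu. The Oracle package names are an assumption (they depend on how Java was originally installed, e.g. via the old WebUpd8 PPA), so check what you actually have first:

```shell
# See which Java packages are currently installed
dpkg -l | grep -i -E 'jdk|jre'

# Remove Oracle's Java (package names are assumptions; adjust to what the
# command above actually shows on your system)
sudo apt remove oracle-java8-installer oracle-java8-set-default

# Install the OpenJDK 8 runtime from the Ubuntu repositories
sudo apt install openjdk-8-jre

# Verify which Java is now active
java -version
```

If `java -version` reports OpenJDK 8 afterwards, Spark should pick it up without further changes.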
There has been no progress on supporting newer Java versions, as there are no developers currently working on Spark.
In the future we might stop bundling Spark and Openfire with Java (even OpenJDK) to avoid any possible legal issues. It would then be up to users to install an appropriate Java themselves.