Error when opening Spark on Linux

I installed the Spark RPM package on CentOS 6.9. The following error is thrown when opening the Spark app:

Please help me resolve this issue.

[root@localhost bin]# ./startup.sh

using classpath: /usr/share/spark/lib/jdom.jar:/usr/share/spark/lib/log4j.jar:/usr/share/spark/lib/lti-civil.jar:/usr/share/spark/lib/fmj.jar:/usr/share/spark/lib/jspeex.jar:/usr/share/spark/lib/libjitsi.jar:/usr/share/spark/lib/zrtp4j-light.jar:/usr/share/spark/lib/jna.jar:/usr/share/spark/lib/bcpkix.jar:/usr/share/spark/lib/bcprov.jar:/usr/share/spark/lib/bccontrib.jar:/usr/share/spark/lib/ice4j.jar:/usr/share/spark/lib/osgi.core.jar:/usr/share/spark/lib/startup.jar:/usr/share/spark/lib/linux/jdic.jar:/usr/share/spark/resources

java.lang.reflect.InvocationTargetException

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:498)

at org.jivesoftware.launcher.Startup.start(Startup.java:88)

at org.jivesoftware.launcher.Startup.main(Startup.java:38)

Caused by: java.lang.ExceptionInInitializerError

at com.jtattoo.plaf.AbstractLookAndFeel.&lt;clinit&gt;(AbstractLookAndFeel.java:42)

at java.lang.Class.forName0(Native Method)

at java.lang.Class.forName(Class.java:348)

at javax.swing.SwingUtilities.loadSystemClass(SwingUtilities.java:1874)

at javax.swing.UIManager.setLookAndFeel(UIManager.java:582)

at org.jivesoftware.Spark.loadLookAndFeel(Spark.java:258)

at org.jivesoftware.Spark.startup(Spark.java:173)

… 6 more

Caused by: java.awt.HeadlessException

at sun.awt.HeadlessToolkit.getScreenSize(HeadlessToolkit.java:284)

at com.jtattoo.plaf.JTattooUtilities.&lt;clinit&gt;(JTattooUtilities.java:44)

What version of Spark?

Spark version 2.8.3
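For what it's worth, the root cause in your trace is java.awt.HeadlessException: the JVM decided there is no display to draw on, which typically happens when Spark is launched from a plain console or SSH session rather than from a desktop session. A quick sanity check before launching (a minimal sketch; it only inspects the DISPLAY environment variable, which an X11 desktop session normally sets):

```shell
# Swing apps such as Spark need a graphical display; an empty DISPLAY
# usually means the JVM will run headless and the UI cannot start.
if [ -n "$DISPLAY" ]; then
  echo "display found: $DISPLAY"
else
  echo "no display - run from a desktop session or use ssh -X"
fi
```

If DISPLAY is empty, try launching Spark from the machine's graphical desktop, or connect with `ssh -X` so X11 forwarding provides a display.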

I have tried this in a CentOS 7 virtual machine. I downloaded spark-2.8.3.rpm. It ships without Java, so first:

yum install java

then:

yum install spark-2.8.3.rpm

then:

/usr/share/spark/bin/./startup.sh

Starting with 2.9.0 it will be installed into /opt/Spark instead. To run it, you will first need to give the startup script execute permission:

chmod +x /opt/Spark/bin/startup.sh

/opt/Spark/bin/./startup.sh
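Before launching, you can confirm the chmod took effect (a small check using the /opt/Spark path mentioned above; adjust if your install location differs):

```shell
# -x tests that the file exists and is executable by the current user
if [ -x /opt/Spark/bin/startup.sh ]; then
  echo "startup.sh is executable"
else
  echo "startup.sh is missing or not executable"
fi
```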