
High CPU usage when chat window is minimized

One of our users is seeing very high (50%) CPU usage with Spark 2.8.3 on Windows 10 x64. Task Manager shows that it’s Spark using the CPU. This almost always happens when the chat window is minimized (everything works fine as long as the chat window is not minimized). Once the window has been minimized and CPU usage is high, restoring it doesn’t help; the user needs to close the chat window and sometimes even restart Spark. We’ve reinstalled Spark, deleted the Spark folder under AppData, and updated Java to the latest version. The logs don’t show any errors.
Is anyone familiar with this issue or have suggestions on how to troubleshoot it?
We have lots of PCs with identical setup but only have this reported from one user.

Spark produces a number of log files; the errors might be in one of them. I haven’t seen such behavior myself, but there was a post a few days ago about Spark freezing after being minimized, which could be related. Another user asked whether that poster was using an Nvidia card, so it might be related to drivers.

I did find two logs under AppData, but the only error is about a missing dictionary, and I see that on my system as well. Not sure if that could be related?
Do you know if there are any other logs we could check?
I’ll see if I can reinstall the graphics drivers on that PC. It’s using an Intel IGP and it’s an older CPU, so the only driver available is a WDDM driver from Windows Update.

Spark might not log such issues, as it probably operates normally and the CPU usage is caused by some external factor. Attaching a Java debugger might turn up some information, but I’m not good at that. I have tried using VisualVM with Spark in the past, but it often can’t connect to the Spark process.
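If VisualVM won’t attach, a rough fallback is to see which JVM threads are burning CPU using the standard `java.lang.management` API. This is only a minimal sketch of the idea (run inside any JVM you control, not injected into Spark itself); the class name `ThreadCpuDump` is my own, not anything from Spark:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class ThreadCpuDump {
    public static void main(String[] args) {
        // ThreadMXBean exposes per-thread CPU time on most HotSpot JVMs
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();
        if (!bean.isThreadCpuTimeSupported()) {
            System.out.println("Per-thread CPU time not supported on this JVM");
            return;
        }
        for (long id : bean.getAllThreadIds()) {
            ThreadInfo info = bean.getThreadInfo(id);
            if (info == null) {
                continue; // thread already died
            }
            long cpuNanos = bean.getThreadCpuTime(id);
            // A thread with steadily growing CPU time here is the one to suspect
            System.out.printf("%-40s %10.1f ms%n",
                    info.getThreadName(), cpuNanos / 1_000_000.0);
        }
    }
}
```

For Spark itself, the simpler route is the `jstack <pid>` tool shipped with the JDK: take two thread dumps a few seconds apart and look for a RUNNABLE thread (often a paint/rendering one, given the minimize trigger) that appears busy in both.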

For anyone else seeing high CPU usage…
It may be related to the installed Java version. Since my first post, we started seeing this happen on more and more computers and suspected it was related to Java updates, as we were using the online installer without Java included. We uninstalled Spark and installed the version with Java bundled, and that seems to help.

Can you tell which version of Java you were using when you observed the issue?

I have tried running Spark 2.8.3 with Java 8u162 and it was fine (Windows 7 x64). Oracle usually releases an odd and an even version on the same day, and auto-update usually installs the odd one (161). The odd release has fewer feature updates and changes and is mostly security fixes; even releases (162) contain more updates. We usually use even versions to bundle with Openfire/Spark. It might be a long shot, but one can give 162 a try. http://www.oracle.com/technetwork/java/javase/downloads/jre8-downloads-2133155.html
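Since the non-bundled installer picks up whatever JRE is on the system, it’s worth confirming which one Spark would actually run on before and after swapping 161 for 162. A tiny check like this (my own throwaway class, not part of Spark) prints the relevant system properties:

```java
public class JvmInfo {
    public static void main(String[] args) {
        // The exact update level, e.g. 1.8.0_161 vs 1.8.0_162
        System.out.println("java.version = " + System.getProperty("java.version"));
        // Oracle vs some other vendor's build
        System.out.println("java.vendor  = " + System.getProperty("java.vendor"));
        // Confirms which OS the JRE reports
        System.out.println("os.name      = " + System.getProperty("os.name"));
    }
}
```

Running it with the same `java.exe` that launches Spark (check the shortcut or `java -version` on the command line) tells you whether auto-update has silently moved the machine onto the odd release.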

I saw it on my PC with 8u161 running on Windows 10 x64 (1709).
I’ll try to do some tests with 162.