Spark crashing when run over Remote Desktop

Since most everyone is now working from home, everyone uses Remote Desktop Gateway to RDP into their individual desktops to work. However, we have been getting a lot of reports, and I have seen it myself, of Spark IM crashing much more frequently, usually with the following error message:

“LoadLibrary failed with error 126: The specified module could not be found.”

This only seems to be an issue when run through a Remote Desktop session. Has anyone else seen this, or does anyone know what might be the cause?

Thank you.

I myself have been using Spark for many years through RDP. What version of Spark are you using? Spark 2.8.3?

2.8.3. Correct.

Hello,

We are also seeing a similar crash over RDP, but only for a few users; most are working normally.

For all but one user, a reboot of the host workstation has resolved the crashing. For one stubborn holdout, however, it continues. We have tried all the standard troubleshooting steps, including but not limited to: reboots, clearing temp files, uninstalling/reinstalling, and removing the Spark profile folder.

It throws up an error, “LoadLibrary failed with error 87: The parameter is incorrect”, then freezes, requiring us to end the task.

Spark 2.8.3, build 960, host is Windows 10 Pro.

Thank you.

Another sufferer here. We’ve now got all staff working remotely via VPN and RDC and we’ve seen an upturn in Spark freezing and throwing up a “LoadLibrary” error that then closes all Spark windows.
Anecdotally, users haven’t been receiving messages while frozen, although I’ve not been able to corroborate this fully while being remote myself.

We hadn’t noticed any crashes before, even with the occasional user on Remote Desktop for a day or so, but with 8-10 users remote it seems to have increased.

All machines have had a restart or two over the last fortnight due to antivirus/windows updates.

We’re on 2.8.3 build 960, Windows 10.

Any guidance/suggestions would be appreciated.

Hey Matt,

We have been unable to find a solution. There is no rhyme or reason to when and why it crashes. I see it frequently myself: I can go days with no problems, and then on other days it will crash every other message, all day long.

We have about 75 people working remotely, and it’s a mixed bag of who experiences these crashes and who doesn’t. One thing to note is that it does appear to be more frequent when in a conference room, although I can’t confirm this to be the case 100%. I have not seen it crash with the LoadLibrary message and still be able to send messages; when that crash does happen, the entire application appears locked and unresponsive.

One thing we do is Active Directory integration for groups and login. Not sure if maybe you are doing that too? Maybe it’s related to that, and that’s why others might not experience the issue?

A few of us have switched to a different client, although this is not a desired solution, since Spark seems to be the best at doing everything we need it to, except for the crashing.


Thanks for the above. We agree, for most things it seems to be ideal, and it’s only the current circumstances that have shown up the bug. We are Active Directory linked, so you might be right; it could be a thread to pull.

Find two users:
1 - one for whom it is working with no issues over RDP
2 - one for whom it is not working over RDP

  • In Active Directory Users and Computers, enable “Advanced Features”.
    Double-check that everything for each user is the same, or find what is different (a rough comparison sketch follows below).

An example: user 1 is able to connect to a work computer or terminal server with no issues and can use Spark.
User 2 can RDP to a workstation or terminal server, but has a roaming profile, as viewed on the “Profile” tab of their account in AD.

Just an avenue to double check.
If this is on terminal servers, check the “Remote Desktop Services Profile” tab in AD as well.
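
If it helps, here is a rough way to dump and diff the two accounts instead of eyeballing every tab. It is only a sketch using the Python ldap3 library; the server name, bind account, search base and the two sAMAccountNames are placeholders you would swap for your own domain.

```python
# Sketch: pull all AD attributes for a "working" and a "broken" user and print
# only the attributes that differ. Server, bind account and base are placeholders.
from ldap3 import Server, Connection, ALL

SEARCH_BASE = "dc=example,dc=local"          # adjust to your domain
USERS = ["user.working", "user.broken"]      # sAMAccountNames of the two test users

server = Server("dc01.example.local", get_info=ALL)
conn = Connection(server, user="EXAMPLE\\svc_readonly", password="...", auto_bind=True)

attrs = {}
for sam in USERS:
    conn.search(SEARCH_BASE, f"(sAMAccountName={sam})", attributes=["*"])
    attrs[sam] = conn.entries[0].entry_attributes_as_dict if conn.entries else {}

# Show only what differs between the two accounts
for key in sorted(set(attrs[USERS[0]]) | set(attrs[USERS[1]])):
    a, b = attrs[USERS[0]].get(key), attrs[USERS[1]].get(key)
    if a != b:
        print(f"{key}:\n  {USERS[0]} = {a}\n  {USERS[1]} = {b}")
```

Anything that shows up as different (roaming profile path, home drive, group membership) would be a candidate to chase.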

Thanks for the avenues. Unfortunately, all remote users seem to be affected. All are Remote Desktop connections over IP to individual machines rather than a terminal server, and I can’t test the machines physically … yet. I’ll see what I can do to feed back.

Just to add the full error message we receive:

 LoadLibrary failed with error 87: The parameter is incorrect

Do you have a workstation/user where this happens regularly and is easily reproduced?
Are you using the embedded JRE or a system JRE? What Java version does Spark report?
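
If you are not sure which JRE it is actually running with, something like this can confirm it. It is only a sketch; the install path is an assumption (adjust it if you installed Spark somewhere else, e.g. C:\Spark), and it assumes the bundled JRE lives in a jre subfolder of the install directory.

```python
# Sketch: check whether a Spark install carries a bundled JRE and print its version.
import subprocess
from pathlib import Path

SPARK_DIR = Path(r"C:\Program Files (x86)\Spark")   # assumption: default install location
bundled_java = SPARK_DIR / "jre" / "bin" / "java.exe"

if bundled_java.exists():
    # "java -version" prints its output to stderr
    result = subprocess.run([str(bundled_java), "-version"],
                            capture_output=True, text=True)
    print("Embedded JRE found:")
    print(result.stderr.strip())
else:
    print("No embedded JRE under the Spark folder; Spark is probably using a system JRE.")
```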

No issues over RDP with our Spark to Openfire connections here.
We changed the install location to be C:\Spark

Would enjoy some after-hours tinkering to recreate that issue.

In our case, we are using the integrated JRE. Spark reports this version to be “1.8.0_121”.

If changing the install location resolves the issue, then it’s likely UAC is causing some sort of issue.

You might be hitting a UAC issue or AV. It might be worth trying to disable UAC and/or AV.
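
If you want to confirm what UAC is actually set to on an affected machine before and after changing it, the EnableLUA registry value is the quickest check. A minimal, read-only sketch (nothing is changed):

```python
# Sketch: report whether UAC is enabled by reading EnableLUA (1 = on, 0 = off).
import winreg

key_path = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
    enable_lua, _ = winreg.QueryValueEx(key, "EnableLUA")

print("UAC is", "enabled" if enable_lua == 1 else "disabled")
```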

I tried running Spark with elevated permissions (as an Administrator), with no luck; it continued to crash even then. I also checked my AV logs, both on the local system and server side, but saw nothing that would lead me to believe it’s AV getting in the way.

One thing that I do want to point out: these desktops that the users are all RDPing into have more than one monitor. Obviously, that means little when RDP is in use, unless the session is set up that way, but multi-monitor configurations have caused issues with other apps before, so I wouldn’t be surprised if that was involved in one way or another.

Following the hunch that this was happening with odd regularity, I’ve had a team member up their auto status/idle time from about 15 minutes to 800 today. No drops yet. I am going to check with a couple of others. Probably not a full solution, but I thought I might as well look at the moving parts while I’m unable to access certain areas noted above.


Hello, I had a similar problem. Can you try on a computer where there is no antivirus? I didn’t have any logs in the antivirus either, but after I added the entire C:\Program Files\Spark folder to the antivirus exceptions, the problem was solved. 250 people now work through RDP on their computers (1 or 2 monitors, no difference) with no problems. Spark 2.8.3, Openfire 4.5.1.
We have also turned off UAC on all computers.
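
If you are on the built-in Windows Defender, the exclusion can also be added from a script; this is only a sketch, the folder path is an assumption for whatever your actual Spark install location is, other AV products have their own way of adding exclusions, and it needs an elevated prompt.

```python
# Sketch: add the Spark folder to the Windows Defender exclusion list via PowerShell.
import subprocess

spark_dir = r"C:\Program Files\Spark"   # adjust to your actual install path

subprocess.run([
    "powershell", "-NoProfile", "-Command",
    f"Add-MpPreference -ExclusionPath '{spark_dir}'",
], check=True)
```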


Just to be sure, I completely stopped AV on my computer, but was still able to get Spark to crash easily.

Wish it was AV, as it would be an easy fix with an exclusion or two. But, it appears not to be.


By the way, if the problem is still relevant, you can run Spark in compatibility mode with Windows XP SP3.
As far as I can see, this solves the problem with Spark hanging up during a network break.
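
If you want to apply that compatibility mode to many machines without clicking through the file Properties dialog, it is just a per-user registry value. A sketch, assuming the default path for Spark.exe (adjust it to your own install location):

```python
# Sketch: set the "Windows XP SP3" compatibility layer for Spark.exe for the current user.
import winreg

spark_exe = r"C:\Program Files (x86)\Spark\Spark.exe"   # assumption: default install path
layers_key = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, layers_key) as key:
    # "~ WINXPSP3" is the value the Compatibility tab itself writes for XP SP3 mode
    winreg.SetValueEx(key, spark_exe, 0, winreg.REG_SZ, "~ WINXPSP3")

print("Compatibility layer set for", spark_exe)
```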