Crashing - Spark

Hi
I am having issues with Spark on one particular computer only. I will explain what I have done so far. When Spark crashes, it crashes with the error "failing to load library 82".
Openfire/Spark is local, no proxy, no SSL…
Spark is running on Windows 10 Pro.
The user works remotely from home, connecting directly to that machine via VPN and RDP.

What I have done so far:

  • Uninstalled and reinstalled Spark a few times
  • Cleared the profile and reinstalled again
  • Installed Spark directly under C:\ in its own folder
  • Installed the latest nightly build
  • Gave full permissions to that folder and its files (see the sketch below)
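
For anyone checking the same thing, here is a minimal sketch of how to verify from Java that the install folder is actually readable and writable for the current user. The C:\Spark path is an assumption; substitute your own install location.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Quick check that the Spark install folder is accessible to the current user.
public class InstallDirCheck {
    public static void main(String[] args) {
        Path sparkDir = Paths.get("C:\\Spark"); // hypothetical install location; adjust as needed
        System.out.println("Exists:   " + Files.exists(sparkDir));
        System.out.println("Readable: " + Files.isReadable(sparkDir));
        System.out.println("Writable: " + Files.isWritable(sparkDir));
    }
}
```

If any of these print false when run as the affected user (for example over the RDP session), the permissions change did not take effect.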

Not sure what to do next. Has anyone else seen the same issue or behavior?

First off, which version of Windows 10 Pro: 32- or 64-bit?
Then, which version of Spark: packaged with Java or not, 32- or 64-bit?

I had a similar issue when I installed the version that was packaged with Java. I uninstalled it, then downloaded and installed the 64-bit version of Java, making sure to uninstall any 32-bit versions that were already installed (they can cause conflicts). Then I installed the 64-bit version of Spark, and everything worked fine after that. I'm betting you are experiencing a 64- vs 32-bit conflict. If your version of Windows is 64-bit, you should install the 64-bit version of Java.
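
One quick way to confirm which JVM is actually on the machine and whether it is 32- or 64-bit is a sketch like the one below. Note that sun.arch.data.model is a HotSpot-specific property, so treat that key as an assumption about Oracle/OpenJDK builds.

```java
// Prints the bitness and location of whichever JVM runs this file.
public class BitnessCheck {
    public static void main(String[] args) {
        // "64" on a 64-bit HotSpot JVM, "32" on a 32-bit one (HotSpot-specific key)
        System.out.println("Data model: " + System.getProperty("sun.arch.data.model") + "-bit");
        // "amd64" on 64-bit Windows builds, "x86" on 32-bit ones
        System.out.println("Arch:       " + System.getProperty("os.arch"));
        System.out.println("Java home:  " + System.getProperty("java.home"));
        System.out.println("Version:    " + System.getProperty("java.version"));
    }
}
```

Compile and run it with the java found on the PATH; if the reported bitness does not match your Windows install, that mismatch is the first thing to fix.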

Hope this helps.

Thank you

Windows 10 Pro, 64-bit.
Originally I used the 2.8.3 (official) offline installation, which includes the Java JRE (recommended).
Will try your suggestion as well and see if it helps.

I have removed Java and left just the x64 Java.
I have downloaded nightly release 2.9.0 with JRE 1.8.0_202.

Will see what I hear back.