I’m having the same issue as this thread. There didn’t seem to be a final answer re: a true fix.
- We are running a master image deployed to hundreds of machines. None of them have this issue except one user.
- The machine has been completely re-imaged, and the Spark install has been completely removed and reinstalled.
- We have deleted and re-created the user’s Spark account. That account works fine on any machine except the user’s primary workstation.
- We are running Spark 2.6.3 and cannot downgrade or upgrade (the previous thread recommended 2.0.x).
- We have tested with Java 6u45, 7u25, and 7u45, with no change in behavior.
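One piece of per-user state the list above may not cover is Spark's local profile directory, which survives both a reinstall and a server-side account re-creation. A minimal sketch of what we plan to check next, assuming the default Windows location (`%APPDATA%\Spark`); the path is an assumption, so confirm it on the affected workstation before deleting anything:

```shell
# Spark (the Ignite Realtime client) keeps per-user state outside the
# install directory; reinstalling Spark or recreating the XMPP account
# does not touch it. Default Windows location is assumed here.
PROFILE_DIR="${APPDATA:-$HOME/AppData/Roaming}/Spark"

echo "Per-user Spark profile to back up, then remove: $PROFILE_DIR"
# With Spark closed, rename rather than delete so it can be restored:
#   mv "$PROFILE_DIR" "$PROFILE_DIR.bak"
# Relaunching Spark should regenerate spark.properties from defaults.
```

If the problem follows this directory (i.e. disappears after the rename), the culprit is a corrupt `spark.properties` or cached transcript rather than the install or the account.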
What else can we test? What else does Spark rely on? Has anyone experienced this and actually fixed it without downgrading to an earlier version?
Thanks in advance.