I am not sure whether this is a Wildfire problem or a Spark problem, but I believe it is a problem with Spark.
I have a Windows 2003 Server running AD and Wildfire with LDAP authentication. I can log into the admin console, and I used to be able to log on as users in Spark, but now I cannot. My problem is that most users have auto-login enabled (they log on to the machine, and Spark starts from their profile and logs them in). However, network policy is to change AD passwords every 42 days. Now, after many users have changed their passwords, they cannot connect to the server. Spark will load, try to log on (auto-login), and hang at “Authenticating…” with no option to cancel and enter a different password. As far as I can tell, it never times out.
Any ideas on how to get around this problem?
Looking into this a bit more, I found that when I attempt to log in with a user who has changed their password, I get:
javax.naming.AuthenticationException: [LDAP: error code 49 - 80090308: LdapErr: DSID-0C090334, comment: AcceptSecurityContext error, data 52e, vece
in my debug log. I believe that 49 is an invalid-credentials error. So why is Spark not telling me that my credentials are bad and letting me use my new password? I deleted my settings.xml file so I could enter my new password, but I found that if I type my username and just gibberish for a password, it still gets stuck at the “Authorizing…” screen. No matter what wrong password I use, it never tells me that it is incorrect.
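For anyone else hitting this: LDAP error code 49 covers several distinct AD rejection reasons, and the “data” field in the message is what distinguishes them (52e is indeed bad password). A minimal sketch of decoding that field from a log line like the one above — the sub-code table is the standard AD mapping, and the helper name is just mine:

```python
import re

# Well-known AD sub-codes reported in the "data NNN" field of an
# LDAP error 49 bind failure (all of them mean the bind was rejected).
AD_SUBCODES = {
    "525": "user not found",
    "52e": "invalid credentials (bad password)",
    "530": "not permitted to log on at this time",
    "531": "not permitted to log on at this workstation",
    "532": "password expired",
    "533": "account disabled",
    "701": "account expired",
    "773": "user must reset password",
    "775": "account locked out",
}

def decode_ad_error(message: str) -> str:
    """Extract the 'data NNN' sub-code from an AcceptSecurityContext error."""
    match = re.search(r"data\s+([0-9a-fA-F]+)", message)
    if not match:
        return "no AD sub-code found"
    code = match.group(1).lower()
    return AD_SUBCODES.get(code, f"unknown sub-code {code}")

log_line = ("[LDAP: error code 49 - 80090308: LdapErr: DSID-0C090334, "
            "comment: AcceptSecurityContext error, data 52e, vece")
print(decode_ad_error(log_line))  # invalid credentials (bad password)
```

So the server side is behaving normally here: AD is rejecting the bind with a clear reason, and the question is why Spark swallows it instead of surfacing it.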
Could this be a bug in the beta or some sort of misconfiguration on my Wildfire side? Any ideas on a workaround or a solution?
My LDAP settings are as follows:
Also, if I log in as a local administrator and try to use Spark with incorrect credentials, it tells me that I cannot log in. As a regular AD user, however, it tells me nothing and just locks up at “Authenticating…”. Meanwhile, I can use Gaim to connect to the server, and it tells me my credentials are incorrect as both an admin and a regular user. This seems to be a problem with Spark.
Message was edited by: paco36