
Duplicate entries in message log

I have auditing turned on for message packets only. In the jive.audit-*.log file, the messages are being duplicated. When I parse the log file and pull the data I want, it's duplicated, which is bad for running reports against. I also have the Open Archive plugin, which I thought was causing it, but I've turned it off, restarted the server, and get the same thing. (Open Archive logs to a MySQL table.)

Any help would be appreciated. Thanks

Openfire 3.4.3


I should mention that the clients are using Spark. Also, when I open a chat window as if I'm going to talk to someone and then close it without saying anything, that shows up in the log too. That's not as big a deal.

I'm facing exactly the same problem. Openfire 3.3.2. Is this a bug?

sbrissen: Have you found a solution?

I have not found a solution yet, sorry. For now my workaround is to grab the log info I need, write it to a file (of course it's all duplicated), then pipe it through the uniq command.
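For anyone else stuck on this, here is a minimal sketch of that kind of pipeline. The file names and extracted line format are made up for illustration; the real jive.audit-*.log entries are XML packets, so you'd first extract whatever fields your report needs before collapsing duplicates:

```shell
#!/bin/sh
# Hypothetical extract of audit data: date, time, sender, receiver, body.
# In practice these lines would come from parsing jive.audit-*.log.
printf '%s\n' \
  '2008-01-10 10:01 alice bob hello' \
  '2008-01-10 10:01 alice bob hello' \
  '2008-01-10 10:02 bob alice hi' > audit-extract.txt

# sort -u collapses exact duplicate lines (same effect as sort | uniq).
# Note uniq alone only removes *adjacent* duplicates, hence the sort.
sort -u audit-extract.txt > audit-dedup.txt
cat audit-dedup.txt
```

One caveat: this only removes byte-for-byte identical lines, so if the two copies of a message differ in any field you extract (e.g. a per-copy timestamp), you'd need to key the deduplication on the fields that match.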

I spent too much time trying to figure it out, so I'll stick with my workaround until a permanent solution is found.

I have read elsewhere in the community that the duplication in the logs happens because each message is written once for the sender and once for the receiver, so that in Enterprise, when you look at the logs for a person, you see the complete conversation.

I thought it was something along those lines. I don't have the Enterprise plugin, but I did have the Open Archive plugin (I have since stopped logging with it). It's possible that it's still causing the problem. We will be purchasing the Enterprise plugin, so I guess I'll live with how it's working now. My script is doing a good job of grabbing only what I need, so I won't worry about it.

Once the plugin is disabled, I doubt it will cause any issues. Depending on how complex your setup is, you could reload Openfire into a clean database. I have it down to a science and can do it in about 30 minutes. But I use LDAP.

We are using LDAP as well. I think I'll just live with it like this for now; we may be using the Enterprise logging at some point, so the problem would come back even if we loaded it into a new database.

I'm going to call this resolved, as I'm pretty certain you are correct about the cause.

Thanks for the help.