Error.log filled with "Data too long for column 'body'"

I'm using Openfire 3.8.2 with MySQL 5.5 on CentOS 6.4 (32-bit).

If someone posts very long content in a MUC, Openfire's error.log fills up with exceptions like the following:

2013.07.05 13:56:54 org.jivesoftware.openfire.muc.spi.MUCPersistenceManager - Error saving conversation log entry
com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'body' at row 1
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2983)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
        at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
        at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
        at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1604)
        at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1519)
        at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1504)
        at sun.reflect.GeneratedMethodAccessor58.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.logicalcobwebs.proxool.ProxyStatement.invoke(ProxyStatement.java:100)
        at org.logicalcobwebs.proxool.ProxyStatement.intercept(ProxyStatement.java:57)
        at $java.sql.PreparedStatement$$EnhancerByProxool$$ac2d43e7.executeUpdate(<generated>)
        at org.jivesoftware.openfire.muc.spi.MUCPersistenceManager.saveConversationLogEntry(MUCPersistenceManager.java:1023)
        at org.jivesoftware.openfire.muc.spi.MultiUserChatServiceImpl.logConversation(MultiUserChatServiceImpl.java:451)
        at org.jivesoftware.openfire.muc.spi.MultiUserChatServiceImpl.access$100(MultiUserChatServiceImpl.java:96)
        at org.jivesoftware.openfire.muc.spi.MultiUserChatServiceImpl$LogConversationTask.run(MultiUserChatServiceImpl.java:437)
        at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
        at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
        at java.util.concurrent.FutureTask.run(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)

The exception above is logged 50 times every 3 minutes (my group chat conversation logging settings: flush interval = 300, batch size = 50).

My question is: is there any configuration that can be set to simply ignore very long content, instead of trying to insert it into the database again and again?

If not, any suggestions on how to avoid this?
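
For reference, the only workaround I can come up with so far is widening the column by hand. This is just a sketch, assuming the MUC log table is ofMucConversationLog and its body column is currently defined as TEXT (which is limited to 64 KB in MySQL); I have not verified the schema on my install yet:

    -- assumption: table is ofMucConversationLog and body is currently TEXT
    ALTER TABLE ofMucConversationLog MODIFY body MEDIUMTEXT;

I'd rather not modify the schema if there is a supported Openfire setting for this, though.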

Thanks a lot.