We have Openfire installed on a server using the embedded database. Spark/Openfire is working out so well that we are starting to run out of hard drive space on the machine. What is involved in moving all the data from the embedded database into a MySQL database and telling Openfire to use the external database? Can it even be done, or do we have to rebuild? We are using 3.6.2. Thanks
I would be interested to see what others have done in this space as well. I’m also currently working on a project that will include migrating the embedded DB to MySQL. What I’ve come up with so far is basically to read each line of the openfire.script file (except the initial table-creation section) and execute each line as a SQL statement. I split it all up into smaller chunks first based on table name, then do each table as a separate transaction. It seems to work okay, although a full import takes about 45 minutes.
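The chunking step described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual script: it assumes the openfire.script dump contains one statement per line in the usual HSQLDB form (`CREATE TABLE ...`, `INSERT INTO OFUSER VALUES(...)`, etc.), and the `import_chunks` helper takes any DB-API connection (e.g. one from a MySQL driver) as a hypothetical example.

```python
import re
from collections import OrderedDict

# Matches HSQLDB script INSERT lines such as
#   INSERT INTO OFUSER VALUES('admin', ...)
# The exact statement shape is an assumption about a typical
# openfire.script; adjust the pattern for your own dump.
INSERT_RE = re.compile(r"^INSERT INTO (\w+) ", re.IGNORECASE)

def group_inserts_by_table(lines):
    """Group INSERT statements by table name, skipping everything
    else (the CREATE TABLE section, SET statements, and so on)."""
    chunks = OrderedDict()
    for line in lines:
        m = INSERT_RE.match(line.strip())
        if m:
            chunks.setdefault(m.group(1), []).append(line.strip())
    return chunks

def import_chunks(conn, chunks):
    """Replay each table's INSERTs as one transaction on a DB-API
    connection (hypothetical; e.g. obtained from a MySQL driver)."""
    cur = conn.cursor()
    for table, stmts in chunks.items():
        try:
            for stmt in stmts:
                cur.execute(stmt)
            conn.commit()  # one commit per table, as described above
        except Exception:
            conn.rollback()
            raise
```

Committing once per table rather than per statement is what keeps the transaction count down; with autocommit on, each INSERT would be its own transaction, which may explain a slow import.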
I’d love to hear if anyone’s come up with a better migration path?
I wonder why the import of the embedded database takes so long for you; a MySQL dump is in a very similar format. I also wonder whether non-ASCII characters are imported properly.
The “MySQL Migration Toolkit 1.1” could help. It is available here: http://dev.mysql.com/downloads/gui-tools/5.0.html and documented here: http://dev.mysql.com/doc/migration-toolkit/en/index.html
http://wiki.alfresco.com/wiki/Migrating_from_HSQL may also be interesting.
Thanks for the info. A couple of questions: what do I use for the source connection information? Also, will this work if I install the Migration Toolkit on my workstation? I am not able to easily install the toolkit on either the source or the destination server.
As Openfire uses the internal database, you may want to copy the whole embedded-db folder to your local workstation.
You must have JDBC access to the target database.
The connection settings will look like this, though you may need to tweak the URL, perhaps to “jdbc:hsqldb:/temp/embedded-db/openfire”.
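As a sketch, the source-side JDBC parameters for the copied embedded-db folder might look like the following. The `sa` user with an empty password is the HSQLDB default that Openfire's embedded database normally uses, but treat both the credentials and the path as assumptions to verify against your own installation:

```text
Driver class: org.hsqldb.jdbcDriver
JDBC URL:     jdbc:hsqldb:/temp/embedded-db/openfire
Username:     sa
Password:     (empty)
```

Note that the URL points at the base name of the database files (openfire.script, openfire.properties) without any extension.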