Patch [#SPARK-1184]

Hey,

you'll find the patch file here http://pastebin.de/15496 or attached.

What currently needs to be properly discussed is how the plugin folder should be handled: delete it entirely, or only copy the plugins that would still work.

Some testing should also be done, but it seems to run fine on my system.

Another version displays an OptionPane where the user can choose whether to migrate their profile:

http://nopaste.info/39cb314e25.html
migrateOptionPane.java.patch.zip (1774 Bytes)
Spark.java.patch.zip (1593 Bytes)

IMHO we should not copy the plugins from the old profile. Instead, the plugins from the Spark install folder (…\Spark\plugins) should be copied to the new profile location. There is no guarantee that old plugins will work with versions > 2.5.8.

With respect to the option pane: a profile migration is a purely technical issue that should not be presented to the user as an option. He/she expects that his/her data is still available after installing a new version. Being asked is somewhat unexpected. I would go for a "silent" migration.

It is good to have more contributors and patches. But, Walter, another person is already working on this issue, and changing the assignee without asking him isn't nice. I suggest that Wolf contact Mike and cooperate. Recently we were discussing how it should work in this thread: http://community.igniterealtime.org/message/209307#209307

I'm voting for the silent mode too.

I like the patch, and I much prefer a silent migration as well. My one suggestion is to move the copying/moving logic into a supporting class rather than keeping it in the main Spark.java (to keep that class as clean as possible).

I'm going to work with him and compare/contrast what we have.

Great stuff.

SPARK-1184 has been updated with a slight modification to this patch. Cheers and happy coding!

The transfer method should be generalized; right now it only skips the plugins directory. I don't expect it, but it could happen that future versions need to skip more directories.

It seems unlikely that Windows will switch user directories in the near future.

I would like it to be something like this:

http://pastebin.de/15519


public void transferConfig(String userSparkHome, Collection<String> skipfiles)
...

private void copyDirectory(File src, File dest, Collection<String> skipfiles)
...
if (!skipfiles.contains(children[i])) {
...
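
For illustration, here is a minimal, self-contained sketch of what such a generalized transfer could look like. The class name (ProfileMigrator), the assumed old profile location, and the exception handling are my own assumptions; the method names follow the outline above, and the actual patch may differ:

import java.io.*;
import java.util.Collection;

public class ProfileMigrator {

    // Assumed old profile location; the real patch would use Spark's own constant for this.
    private static final String OLD_USER_SPARK_HOME =
            System.getProperty("user.home") + File.separator + "Spark";

    // Copy everything from the old profile into the new Spark home, skipping named entries.
    public void transferConfig(String userSparkHome, Collection<String> skipfiles) throws IOException {
        copyDirectory(new File(OLD_USER_SPARK_HOME), new File(userSparkHome), skipfiles);
    }

    private void copyDirectory(File src, File dest, Collection<String> skipfiles) throws IOException {
        if (src.isDirectory()) {
            if (!dest.exists() && !dest.mkdirs()) {
                throw new IOException("Could not create " + dest);
            }
            String[] children = src.list();
            if (children == null) {
                return;
            }
            for (String child : children) {
                // Entries named in the skip list (e.g. "plugins") are not copied.
                if (!skipfiles.contains(child)) {
                    copyDirectory(new File(src, child), new File(dest, child), skipfiles);
                }
            }
        } else {
            copyFile(src, dest);
        }
    }

    private void copyFile(File src, File dest) throws IOException {
        InputStream in = new FileInputStream(src);
        OutputStream out = new FileOutputStream(dest);
        try {
            byte[] buffer = new byte[8192];
            int length;
            while ((length = in.read(buffer)) > 0) {
                out.write(buffer, 0, length);
            }
        } finally {
            in.close();
            out.close();
        }
    }
}

A caller would then do something like new ProfileMigrator().transferConfig(newHome, new HashSet<String>(Arrays.asList("plugins"))), so skipping additional directories in future versions is just a matter of extending the collection.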

But anyhow, the patch provided by Mike works fine, and I vote for committing it.

Another thing: why are plugins loaded from the user directory and not from the installation directory?


Definitely, I like that idea as well, and I think it should be implemented.

One thing to consider: with the current patch, if a username is for some reason 'plugins', or contains that word for whatever reason (a strange name, I know, but stranger things have happened), it will completely skip copying the aforementioned directory.

It is probably important that we ensure we are skipping the USER_SPARK_HOME/plugins directory and not just anything containing the word 'plugins'.
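
As a rough sketch of that check (the helper and variable names here are assumptions, not the ones from the patch), one could compare the candidate entry against the resolved plugins directory itself rather than matching on the name:

import java.io.File;
import java.io.IOException;

final class PluginsDirCheck {
    // Returns true only for the actual plugins directory under the old profile root,
    // never for an entry whose name merely contains the word "plugins".
    static boolean isOldPluginsDir(File oldSparkHome, File entry) throws IOException {
        File pluginsDir = new File(oldSparkHome, "plugins");
        return entry.getCanonicalFile().equals(pluginsDir.getCanonicalFile());
    }
}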

One more thing to note: this will happen any time a user's Spark home dir does not exist and their old dir does, not just on first launch, which may or may not be desired; that really depends on how we want this import to work. If we want this import to happen any time the new directory layout does not exist (and the old one does), then we are set; however, if we only want this to happen after a new installation, then this only answers half the question.
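
As a rough sketch of that trigger condition (both paths here are assumptions about where the old and new profiles live, not the values used by the patch):

import java.io.File;

final class MigrationTrigger {
    static boolean shouldMigrate() {
        // Assumed locations; the real code would derive these from Spark's own constants.
        File oldHome = new File(System.getProperty("user.home"), "Spark");
        File newHome = new File(System.getProperty("user.home"), ".Spark");
        // Migrate whenever the new layout is missing and the old one is present,
        // i.e. not only on the very first launch after installation.
        return !newHome.exists() && oldHome.exists();
    }
}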

I tried mocking up some new preferences in the LocalPreferences area to determine whether the launch was a first-time launch; however, the very first call to LocalPreferences will create the spark.properties file and the USER_SPARK_HOME directory (but none of the other sub-dirs/files, of course).

It all depends on how we want this import to work!

I would vote to keep the patch as simple as possible. It should run every time after startup. There are a lot of open issues (e.g. NPEs) that we have to fix. Let's get the patch into the trunk and let it bake for some time. We may consider the "run once" option later on.

SPARK-1184

Updated again per your advisements. If you like what you see, I'll push that into the trunk and let the nightly build catch it.

A bit late, but I agree that it should be simple and run on every startup. Will test it shortly.

Can't build that patch. So maybe you should push it and then I will try with the Bamboo automated build.

Won't it apply, or won't it build when you apply the patch? I would hate to put something into trunk that's bad.

The code has been pushed in.

It applies, but won't build. Bamboo has problems with it too: http://bamboo.igniterealtime.org/build/viewBuildLog.action?buildNumber=56&buildKey=SPARK-INSTALL4J

Should be corrected now; Eclipse decided to auto-import the wrong Log class.

  1. When you create new files in the Spark project, please insert the Apache license before the package definition (see the sketch after this list).

  2. I removed Log.debug() because you can't use the Log class before LOG_DIRECTORY is initialized.

  3. Never commit files to svn until they have been tested.
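
For example, a generic ASF header sketch showing the placement; the project's actual header text and the package shown here are assumptions and will differ:

/**
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.jivesoftware.spark.util; // hypothetical package; the header sits above this line

public class NewMigrationHelper {
    // class body
}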

Konstantin, it was me who asked to push it to svn. It is easier to test changes with the installer than having to start a heavy IDE, apply patches, sync svn and all that heavy stuff. As far as I know, it is easy to revert any svn commit, so I see no problem here. Bamboo probably gave a more detailed log than my NetBeans to find the problem.

Anyway, I have tested this now and it works (clean installation, importing of older settings, starting again after the import). It takes a few moments longer to start up if a profile is a bit large (mine is 20 MB), but there is nothing horrible in this. We will warn users about that in the release blog post. Closing the ticket for now. Thanks to Mike and Wolf (and Konstantin).

Understood, Konstantin; however, as wroot mentioned, there were some extenuating circumstances which pushed me to get it into svn sooner rather than later. Which I know comes with its risks (thankfully Bamboo is a smart system and catches them).

Another unfortunate circumstance is not knowing all of the ropes, as there aren't as many people around to guide others as there used to be. I'll be sure to add the Apache license to the top of each class and add it to my templates so any future classes get the license definition added in there.

Thanks for the heads up.

I think we could create a branch for testing, where we could commit untested patches, beta libs and so on, and bind it with Bamboo.


That would be pretty awesome.