I have just noticed that Spark’s nightly builds list is empty, though there were a number of builds a few weeks ago. I suppose builds are removed on a schedule (@Daryl Herzmann)? Maybe the script can be changed to remove builds based on their number rather than their age, so the latest builds stay available instead of an empty list when there has been no recent activity. Of course, one can find recent builds on Bamboo, but the Downloads page is faster and more approachable for users.
P.S. Or maybe this is some error, as it seems only Spark is affected and other projects still show their older builds.
wroot, indeed there was this cron job running on the website to cull files older than 5 days:
1 1 * * * tmpwatch -m 5d /staticfiles/builds/spark/dailybuilds
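For reference, `tmpwatch -m 5d` removes files whose modification time is older than 5 days. A rough, portable sketch of the same policy using find(1), exercised against a throwaway directory rather than the real build path (the file names here are made up for illustration; `touch -d` is the GNU coreutils form):

```shell
#!/bin/sh
set -eu
# Throwaway stand-in for /staticfiles/builds/spark/dailybuilds
dir=$(mktemp -d)
touch "$dir/fresh_build.exe"                  # modified just now
touch -d '10 days ago' "$dir/stale_build.exe" # modified 10 days ago
# Delete files with mtime older than 5 days, like `tmpwatch -m 5d`:
find "$dir" -type f -mtime +5 -delete
ls "$dir"    # only fresh_build.exe survives
rm -rf "$dir"
```

With commits landing only every few weeks, everything in the directory eventually crosses the 5-day threshold, which is exactly how the list ends up empty.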
I’d prefer to change the bamboo logic and have it make daily builds each day, to match the name of the directory. I have it scheduled now to build each day.
Actually, I object. I see no value in meaningless builds of the same code over and over, wiping out the builds that matter (Bamboo also keeps only a few builds in its history). Yesterday Spark had its first code commit in 34 days. The pace of development here does not fit a scheduled nightly builds scheme. As I oversee this project, I want it to be convenient to quickly download an installer of a build with a particular change and distribute it between virtual machines to test a regression or a new feature, or to point forum users to a particular build to test the change they asked for.
So, can we have it the old way, where Bamboo only builds when a change happens and the site keeps the 10-20 latest builds? If the name is not right (though it has been like that for years), we can change it to Test builds, etc.
OK, I have changed the cron script to this:
1 1 * * * ls -t /staticfiles/builds/spark/dailybuilds/* | sed -n '20,100p' | xargs rm -f
and adjusted bamboo to not auto-run each day.
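For anyone reading along, here is how that pipeline behaves, run against a throwaway directory instead of the real build path: `ls -t` lists newest first, and `sed -n '20,100p'` passes everything from the 20th entry onward to `rm`, so the 19 newest files survive each run (the `build_N.exe` names below are invented for the demonstration):

```shell
#!/bin/sh
set -eu
# Throwaway stand-in for /staticfiles/builds/spark/dailybuilds
dir=$(mktemp -d)
# Create 25 fake builds, build_1 newest and build_25 oldest:
for i in $(seq 1 25); do
  touch -d "$i minutes ago" "$dir/build_$i.exe"
done
# ls -t sorts newest first; sed selects entries 20-100; those get deleted:
ls -t "$dir"/* | sed -n '20,100p' | xargs rm -f
ls "$dir" | wc -l    # 19 files remain (the 19 newest)
rm -rf "$dir"
```

One small caveat: this keeps 19 builds, not 20 (sed line 20 is the 20th-newest file, and it gets removed), and entries past the 100th would escape deletion, though the directory should never grow that large between runs.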