Six-month cycles are great. Now let’s talk about meta-cycles: broader release cycles for major work. I’m very interested in a cross-community conversation about this, so will sketch out some ideas and then encourage people from as many different free software communities as possible to comment here. I’ll summarise those comments in a follow-up post, which will no doubt be a lot wiser and more insightful than this one 🙂

Background: building on the best practice of cadence

The practice of regular releases, and now time-based releases, is becoming widespread within the free software community. From the kernel, to GNOME and KDE, to X, to distributions like Ubuntu and Fedora, the idea of a regular, predictable cycle is now better understood and widely embraced. Many smarter folks than me have articulated the benefits of such a cadence: energising the whole community, REALLY releasing early and often, shaking out good and bad code, rapid course correction.

There has been some experimentation with different cycles. I’m involved in projects that have 1 month, 3 month and 6 month cycles, for different reasons. They all work well.

…but addressing the needs of the longer term

But there are also weaknesses to the six-month cycle:

It’s hard to communicate to your users that you have made some definitive, significant change.

It’s hard to know what to support, and for how long; you obviously can’t support every release indefinitely.

I think there is growing insight into this, on both sides of the original “cadence” debate.

A tale of two philosophies, perhaps with a unifying theory

A few years back, at AKademy in Glasgow, I was in the middle of a great discussion about six month cycles. I was a passionate advocate of the six month cycle, and interested in the arguments against it. The strongest one was the challenge of making “big bold moves”.

“You just can’t do some things in six months” was the common refrain. “You need to be able to take a longer view, and you need a plan for the big change.” There was a lot of criticism of GNOME for having “stagnated” due to the inability to make tough choices inside a six month cycle (and with perpetual backward compatibility guarantees). Such discussions often become ideological, with folks on one side saying “you can evolve anything incrementally” and others saying “you need to make a clean break”.

At the time of course, KDE was gearing up for KDE 4.0, a significant and bold move indeed. And GNOME was quite happily making its regular releases. When the KDE release arrived, it was beautiful, but it had real issues. Somewhat predictably, the regular-release crowd said “see, we told you, BIG releases don’t work”. But since then KDE has knuckled down with regular, well managed, incremental improvements, and KDE is looking fantastic. Suddenly, the big bold move comes into focus, and the benefits become clear. Well done KDE 🙂

On the other side of the fence, GNOME is now more aware of the limitations of indefinite regular releases. I’m very excited by the zest and spirit with which the “user experience MATTERS” campaign is being taken up in Gnome, there’s a real desire to deliver breakthrough changes. This kicked off at the excellent Gnome usability summit last year, which I enjoyed and which quite a few of the Canonical usability and design folks participated in, and the fruits of that are shaping up in things like the new Activities shell.

But it’s become clear that a change like this represents a definitive break with the past, and might take more than a single six month release to achieve. And most important of all, that this is an opportunity to make other, significant, distinctive changes. A break with the past. A big bold move. And so there’s been a series of conversations about how to “do a 3.0”, in effect, how to break with the tradition of incremental change, in order to make this vision possible.

It strikes me that both projects are converging on a common set of ideas:

Rapid, predictable releases are super for keeping energy high and code evolving cleanly and efficiently; they keep people out of a death-march scenario, they tighten things up, and they allow for a shakeout of good and bad ideas in a coordinated, managed fashion.

Big releases are energising too. They are motivational, they make people feel like it’s possible to change anything, they release a lot of creative energy and generate a lot of healthy discussion. But they can be a bit messy, things can break on the way, and that’s a healthy thing.

Anecdotally, there are other interesting stories that feed into this.

Recently, the Python community decided that Python 3.0 will be a shorter cycle than the usual Python release. The 3.0 release is serving to shake out the ideas and code for 3.x, but it won’t be heavily adopted itself so it doesn’t really make sense to put a lot of effort into maintaining it – get it out there, have a short cycle, and then invest in quality for the next cycle because 3.x will be much more heavily used than 3.0. This reminds me a lot of KDE 4.0.

So, I’m interested in gathering opinions, challenges, ideas, commitments, hypotheses etc. about the idea of meta-cycles and how we could organise ourselves to make the most of this. I suspect that we can define a best practice, which includes regular releases for continuous improvement on a predictable schedule, and ALSO defines a good practice for how MAJOR releases fit into that cadence, in a well structured and manageable fashion. I think we can draw on the experiences in both GNOME and KDE, and other projects, to shape that thinking.

This is important for distributions, too

The major distributions tend to have big releases as well as more frequent releases. Red Hat has RHEL and Fedora, Ubuntu makes LTS releases, and Debian takes cadence to its logical continuous-integration extreme with Sid and Testing :-).

When we did Ubuntu 6.06 LTS we said we’d do another LTS in “2 to 3 years”. When we did 8.04 LTS we said that the benefits of predictability for LTS’s are such that it would be good to say in advance when the next LTS would be. I said I would like that to be 10.04 LTS, a major cycle of 2 years, unless the opportunity came up to coordinate major releases with one or two other major distributions – Debian, Suse or Red Hat.

I’ve spoken with folks at Novell, and it doesn’t look like there’s an opportunity to coordinate for the moment. In conversations with Steve McIntyre, the current Debian Project Leader, we’ve identified an interesting opportunity to collaborate. Debian is aiming for an 18 month cycle, which would put their next release around October 2010, which would be the same time as the Ubuntu 10.10 release. Potentially, then, we could defer the Ubuntu LTS till 10.10, coordinating and collaborating with the Debian project for a release with very similar choices of core infrastructure. That would make sharing patches a lot easier, a benefit both ways. Since there will be a lot of folks from Ubuntu at Debconf, and hopefully a number of Debian developers at UDS in Barcelona in May, we will have good opportunities to examine this opportunity in detail. If there is goodwill, excitement and broad commitment to such an idea from Debian, I would be willing to promote the idea of deferring the LTS from 10.04 to 10.10 LTS.
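One small sanity check on that alignment: two fixed cadences coincide every least common multiple of their cycle lengths. A quick sketch (the cycle lengths come from the discussion above; everything else is illustrative):

```python
from math import lcm  # Python 3.9+

# Cycle lengths in months, as discussed above: Ubuntu releases every
# 6 months; Debian is aiming for an 18-month cycle.
UBUNTU_CYCLE = 6
DEBIAN_CYCLE = 18

# Releases from the two projects land in the same month every
# lcm(6, 18) months, i.e. every third Ubuntu release.
alignment = lcm(UBUNTU_CYCLE, DEBIAN_CYCLE)
print(alignment)  # -> 18
```

So if the October 2010 releases line up, the next natural co-release window falls 18 months later, unless one of the cadences changes.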

Questions and options

So, what would the “best practices” of a meta-cycle be? What sorts of things should be considered in planning for these meta-cycles? What problems do they cause, and how are those best addressed? How do short term (3 month, 6 month) cycles fit into a broader meta-cycle? Asking these questions across multiple communities will help test the ideas and generate better ones.

What’s a good name for such a meta-cycle? “Meta-cycle” seems… very meta.

Is it true that the “first release of the major cycle” (KDE 4.0, Python 3.0) is best done as a short cycle that does not get long term attention? Are there counter-examples, or better examples, of this?

Which release in the major cycle is best for long term support? Is it the last of the releases before major new changes begin (Python 2.6? GNOME 2.28?) or is it the result of a couple of quick iterations on the X.0 release (KDE 4.2? GNOME 3.2?) Does it matter? I do believe that it’s worthwhile for upstreams to support an occasional release for a longer time than usual, because that’s what large organisations want.

Is a whole-year cycle beneficial? For example, is 2.5 years a good idea? Personally, I think not. I think conferences and holidays tend to happen at the same time of the year every year and it’s much, much easier to think in terms of whole number of year cycles. But in informal conversations about this, some people have said 18 months, others have said 30 months (2.5 years) might suit them. I think they’re craaaazy, what do you think?

If it’s 2 years or 3 years, which is better for you? Hardware guys tend to say “2 years!” to get the benefit of new hardware, sooner. Software guys say “3 years!” so that they have less change to deal with. Personally, I am in the 2 years camp, but I think it’s more important to be aligned with the pulse of the community, and if GNOME / KDE / Kernel wanted 3 years, I’d be happy to go with it.

How do the meta-cycles of different projects come together? Does it make sense to have low-level, hardware-related things on a different cycle to high-level, user visible things? Or does it make more sense to have a rhythm of life that’s shared from the top to the bottom of the stack?

Would it make more sense to stagger long term releases based on how they depend on one another, like GCC then X then OpenOffice? Or would it make more sense to have them all follow the same meta-cycle, so that we get big breakage across the stack at times, and big stability across the stack at others?

Are any projects out there already doing this?

Is there any established theory or practice for this?

A cross-community conversation

If you’ve read this far, thank you! Please do comment, and if you are interested then please do take up these questions in the communities that you care about, and bring the results of those discussions back here as comments. I’m pretty sure that we can take the art of software to a whole new level if we take advantage of the fact that we are NOT proprietary, and this is one of the key ways we can do it.

This entry was posted on Friday, April 17th, 2009 at 2:49 pm and is filed under free software, thoughts.

288 Responses to “Meta-cycles: 2-3 year major cycles for free software?”

6-month cycles only have support until the next 6-month release and are very close to the bleeding edge as they are now. Regular users are recommended to stay with LTS versions that have a proper length of support. Then make normal LTS releases supported for 1.5 or 2 years, and keep server LTS support longer.

This way you take advantage of both worlds: fast development, but with time to fine-tune the new features for the 1.5-year release to the consumer, at which point they are prompted to upgrade to the newest LTS version.

To me (an average user), there are 2 issues that keep long-term releases from being accepted:

1) Drivers! Recent hardware takes some time to be supported properly. What if I have to wait 2 more years for a graphics board to work properly?

2) Contrary to closed-source OSes, it’s difficult to tell the OS apart from the software. The ideal would be to have “core components” that remain the same (with bugfixes) for a long time, while apps are updated more regularly (because new features may be needed, because of performance improvements…). However, every piece of Linux software is more or less dependent on the other layers.

However, short-term releases are not ideal either, because drivers sometimes get broken, some broken features are added, etc.

So, would it be possible to have a stable core system, up-to-date drivers, and not-too-old stable apps?

Can I pitch in and provoke a little discussion from a completely different perspective? The main point I want to make is to question whether synchronisation of release cycles between major distros and projects can be assumed to be a good idea. The only advantage, as far as I can see, is that it would make the development calendar easier to understand and predict. However, I don’t think it follows that a simple calendar would be much help in delivering a better OS.

I have a concept in my mind that, at its most simple, could be seen as a “seasonal development calendar” for Linux as a whole. Say, for example, we do tie Ubuntu releases to Debian releases. Firstly, would the Ubuntu devs be concentrating on their release targets so hard that they lose focus on upstream events at Debian? Then Debian and Ubuntu have to compete for the small, limited media space available, and folk wishing to try out the new releases are hit with the choice at the same time. It just seems like everybody on the same side is set up to trip over each other to get to the same objective. Why not deliberately off-set or stagger releases, so that they are created and distributed evenly throughout the year? So, for example, Debian put out their major 18-monthly release in January. Ubuntu then have three months to analyse and incorporate those advances and still release on schedule, and the big Linux news in March isn’t trumped by a big Debian release. In turn, the Mint and Crunchbang folk would then have a couple of months to get their major version updates highly polished, and so on downstream.

Comments are tl;dr, so forgive me if someone has already offered this suggestion. What about mixing in a year-long cycle between every few 6-month cycles? For example:

6 month
6 month
12 month
6 month
6 month
12 month
etc…

This way you are never more than a year away from the start of a long cycle and short term goals can be accomplished before the start of the next long cycle. We really don’t need symmetry in our release patterns – what we need is to eliminate half baked ideas making it into short term releases.
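For what it’s worth, a cadence like this is easy to simulate. A minimal sketch (the start date and the 6/6/12 pattern are just the commenter’s proposal, not anything any project has committed to):

```python
from datetime import date

def release_dates(start, pattern, count):
    """Yield `count` release dates from `start`, stepping through the
    repeating list of cycle lengths in `pattern` (in months)."""
    d = start
    for i in range(count):
        yield d
        months = pattern[i % len(pattern)]
        total = d.month - 1 + months          # advance by whole months
        d = date(d.year + total // 12, total % 12 + 1, d.day)

# Two 6-month cycles followed by a 12-month one, repeating.
dates = list(release_dates(date(2009, 4, 1), [6, 6, 12], 5))
print([f"{d.year}-{d.month:02d}" for d in dates])
# -> ['2009-04', '2009-10', '2010-04', '2011-04', '2011-10']
```

Under this pattern the gap between any release and the start of the next long cycle never exceeds a year, which is the property the commenter is after.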

I think both cycles are good. I use both. I use the short cycle on my computer, but the long cycle on my wife’s and kids’ computer. It is easier for me not to have to mess around with installing/upgrading every 6 months for them. They are running and using Hardy. I have noticed that they are (for me) very outdated, and I really like the new improvements and miss them when I sit down at their computer. That being said, I would say a 2-year major cycle would be better for a desktop user. On the other hand, I could see where a major company, or any business for that matter, would only want to update every 3 years; it is work and time, especially when dealing with more than one machine.

Ubuntu 9.04 is looking to be a pretty boring release. Please forgive me for saying this – but this is the problem when you force yourself to release every 6 months. I like the idea of regular releases, but maybe we should remove the symmetry from the release schedule and add a longer development cycle after every couple of short ones. For example:

6 month
6 month
12 month
6 month
6 month
12 month
etc

This way you can upgrade upstream software and fix bugs in the shorter cycles and have time to fully bake in new features in the longer cycles. You would never be more than 1 year away from the start of the longer cycle and would never get out of date thanks to the short cycles.

I always look forward to a new release of Ubuntu, but just recently I’ve been inclined to look more at updates in software like Thunderbird and OpenOffice, and at updates in kernels. Obviously there is no need to switch to the latest offering, but with the six-month release cycle it feels, in my opinion, that there is a need to update to the current version.

For me personally, I would feel the benefit more if a longer cycle of releases were used, making it feel as if the user is actually experiencing the latest and greatest offering from the developers.

I am a total newb on Ubuntu. Not because I didn’t try to get it on my laptop before. I did. I also like to think I am not too stupid or computer-illiterate when compared to the great masses of computer users; nevertheless, installing Ubuntu on my own in a way that “just worked” was essentially too time-consuming and “difficult” for that reason.

Fast forward to a couple of months ago. Despite Dell apparently reducing what is available with Ubuntu pre-installed to just one (tiny) notebook, I bought it. Essentially because I am so sick of bug number one (Windows).
Well… how can I describe it…

It was like a revelation of spiritual significance. Everything just WORKS.
My God…I thought…this is how computers are supposed to be. We have been lied to and enslaved for too long.
It was like eating the right coloured pill in the Matrix but waking up to a much better world you only suspected existed in dreams.

EVEN SO… I still don’t have the time to make Ubuntu work on my current Windows laptops… soon I will be unemployed (by choice) and I will then make the time. However, what has been driven home to me is that there must be millions of people like me out there who would switch to Ubuntu in an instant if it were just idiot-proof easy.

On this note, I have given it some serious thought and, given my soon-to-be-found new independence, have been constantly thinking… man… why doesn’t someone organise a factory to just produce Ubuntu-pre-installed laptops for the masses.
I still don’t get it. And I still think it would be relatively easy to set up and very profitable.
No one seems interested in doing it, or maybe they don’t have the money.
So…

I am willing to step up. If you can help fund the startup (and retain major interest/control/shares, obviously), I am willing to give of my time to make it work. I don’t think my relative ignorance of the technical aspects is necessarily very relevant at this stage for this kind of project, but I do have a clear idea of how to go about it, and I am willing to devote my time free of charge until it works (at which point I’d be wanting some money/shares/gold ingots/whatever). If you are willing to hear me out on this, please let me know. I will also try to contact you again more formally in a few months with a proper proposal with some clearly defined points.

If however you are for some reason against even looking at this please just let me know now so I don’t invest too much time in it over the next couple of months.

I’m a new computer/GNU/Linux/Ubuntu user with no other OS experience (a newbie across the board). I’ve chosen to use Ubuntu 8.04 for its stability and longevity. This is what a newbie needs to begin to learn about OSes: not something that is changing so fast that it overwhelms with too high a learning curve. Since freeware, open source, and GNU/Linux are aiming their sights at the masses, in making your decisions on how to evolve releases please keep in mind the needs of the clients you wish to serve.

From a newbie’s perspective, having a lengthy cycle of LTS releases is imperative: it facilitates and allows the accumulation of the knowledge base needed to eventually adapt to, and contribute in, the faster-changing technology of short-term releases with their inevitable challenges.

I sure hope that in the forthcoming 2-3 years Ubuntu will try to catch Windows in the ability to use voice as an input/output method too. A user with just a mouse and keyboard is much slower compared to just saying “open firefox” … “browse markshuttleworth.com” … “start to read”, like you can do with Windows (speech recognition and Narrator).

A three-year cycle to do what? Who can imagine the iPhone of 2012 today? Will computers still have keyboards and mice? Or will everybody be sending eVideo and eAudio to friends using 3D gesture input, and staying in contact by video twitter?

Facebook is then an always-on realtime app running on my mobile, displayed by a pico beamer – no need for cables, heavy hardware and clumsy human interfaces.

To plan Ubuntu Mobile, I would start now, collaborate with a decent open hardware developer, and get it on the market ASAP.

I haven’t read all the comments, so apologies if this has been said before. A common thread, from the OP to the responses, is that the question “how long between releases” is being asked from the point of view of the developer community. How about from the POV of the user community? E.g. the non-technical user sat in an office, or better still, the business decision makers who decide what OS their organisation uses, or the hardware manufacturer who decides what OS ships on their products.

6-month cycles for these people = ‘death’ for your release… it means change occurs every 6 months, which is unsupportable in a business environment, as one would need to retrain the support teams every 6 months, ideally before the changes go mainstream.

Why doesn’t Ubuntu follow a philosophy of providing a rolling release, which would do away with the 6-month release cycle and bring continuous upgrades to the end user? I am talking about an Arch Linux-style upgrade process.

Well, unfortunately, Mark seems to have pretty much given up on posting anything here. Maybe the discussion was not particularly helpful in his desire to improve the coordination of releases among open source developers… Envy among the different distributions – debian, suse, ubuntu – appears to be a major obstacle. Now, I sure wish the relationship with ubuntu’s mother distribution debian could be improved. This seems to be one of the major things one could wish for.

Since Ubuntu has become such a big player, I think upstream developers shall eventually adapt to Ubuntu’s schedule for new releases. Mark should just keep on doing his thing. The remaining players will – if they are not willing to compromise – eventually adapt to his release cycle in their own interest. He should not lightly deviate from his release cycle, because everybody else is counting on it. Only major contributors like Debian or GNOME should tempt him to deviate from this schedule. Everybody else, upstream developers as well as users, could be “pissed” if he doesn’t deliver on schedule. So that would be the price he would have to pay for deviation, and it really should be worth the price.

Final thoughts on Karmic: I think you did a wonderful job. Boot-up times as well as responsiveness have been improved considerably. The artwork is improving too, although not as apparently. Nautilus and the displayed folder icons come to mind – nice job; what about the theme colour? The audio interface. The combined Empathy/Evolution icon in the panel. The new software center is really foolproof. Ubuntu One is nicely integrated into the desktop. And everything seems to be running smoothly without any major problems – o.k., minor glitches… If you keep on improving at this pace, well, then OS X could be in your reach pretty soon ;-) I adore Ubuntu! My partner – a staunch Apple supporter – is becoming frustrated by your improvements. Upstream has delivered too. Both OpenOffice Writer and Evince have resolved bugs I reported – these are two applications I use a lot…

Well, I would like to make a few recommendations for the next release.

1. Gnome 3.0

Don’t integrate the next major GNOME update into the next long-term release. I fear that the same problems will arise as in KDE. I have looked at the new interface. It is nice, but doesn’t knock me over. I hear that the new GNOME interface (zeitgeist?) may not support Compiz. Please do not sacrifice all the features Compiz users have grown used to for – well, this bites – moving around a few buttons on the interface.

2. Package converter:

Integrate the package converter into your next release, or at least integrate it into the repositories. The package converter lets you convert RPM into DEB packages. It’s a very simple graphical user interface, and you no longer need to use alien.

3. Gnome-do:

I love Gnome Do; it’s magic for anybody who uses his computer a lot. Please take a close look at it and let it be part of your Ubuntu repository. It is part of my desktop and I am extremely fond of it. It can’t be described in few words other than magic.

If you have any problems or concerns with regard to intellectual property matters in Germany, please contact me. It’s my profession and I would be very glad to help as a way of contributing to Ubuntu.

I liked the idea to cooperate with the Debian release schedule and postpone the LTS to 10.10, or X.X 😉 Also, I would like a different versioning system. Let’s say the LTS will be 10.10; the next versions should be 10.50, 10.60, 10.70; and the next LTS would be 11.xx.
A short term release would then be an update to the LTS until a new LTS version comes out; no need to go from x.04 to x.10 and back again.
Also, with every new version of Ubuntu I find it difficult to discover the differences, because they are so close to each other. We need every major version update to show that it is a version update, and to feel and act like one. Major steps need to be taken from one LTS to the next to show the direction of the Ubuntu distro: things that show progress and a meaningful new release.
Also, it would be great to have a theming update, as the Ubuntu desktop looks like it is stuck in brown!
A more pleasant, bright and colorful theme would be more than welcome.

“In the US, one form of advertising we often see is “New and completely redesigned for 2009!” OK, GM or whoever you are, if your last car was as perfect and wonderful as you said it was, why did you have to scrap the design and start all over? I think this is relevant to software — and software advertising — as well.”

As with all organizations, it’s important to manage expectations properly.

Personally, one key issue here is managing expectations. Ubuntu’s past releases have created for me the following expectations:

* Even-numbered first-iteration releases are Long Term Support releases. (6.06, 8.04)
* Even-numbered second iterations are cleaned-up, more stable versions of the even-numbered first iterations (but are not LTS). (6.10, 8.10)
* Odd-numbered releases are usable playgrounds that are used to collect information for the next LTS. (7.04, 9.04)
* Odd-numbered second-iteration releases are cleaned-up, more stable versions of the odd-numbered first iterations. (7.10, 9.10)
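That parity pattern can be written down as a tiny classifier. This is purely the commenter’s folk rule, not Ubuntu policy, and the function name is made up for illustration:

```python
def expected_role(version):
    """Classify an Ubuntu version string ('year.month') by the folk rule
    above: even years are the LTS track, odd years the playground track."""
    year, month = (int(part) for part in version.split("."))
    even_year = year % 2 == 0
    first_iteration = month <= 6   # x.04, or the delayed 6.06
    if even_year:
        return "LTS" if first_iteration else "stabilised follow-up"
    return "playground" if first_iteration else "stabilised playground"

for v in ["6.06", "8.04", "8.10", "9.04", "9.10"]:
    print(v, "->", expected_role(v))
# 6.06 -> LTS, 8.04 -> LTS, 8.10 -> stabilised follow-up,
# 9.04 -> playground, 9.10 -> stabilised playground
```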

Whether or not the above is true, for some reason it’s the message I have heard since starting with Ubuntu 5.04. It’s now what I expect from Ubuntu releases.

If Ubuntu decides to implement a new release cycle it should have a clear understanding of current customer expectations. Otherwise it risks putting serious strain on customer relationships with Ubuntu. I know I’ve stopped being loyal to company brands because of failed expectations. It’s totally fine to change things up, but it needs to be well communicated to users.

I do believe that the only way for Ubuntu distros to gain a noticeable market share in major corporations is to support LTS releases for much longer than 3 or even 5 years; something like 8 to 10 years seems to me a minimum.
Most IT directors want to know that they won’t be “forced” by some Linux geeks 😉 to migrate their tens, hundreds, or thousands of computer systems because of a lack of support. They want to keep that decision for themselves or their executive boards. They’re ready to pay big bucks for it. They don’t really give a damn about free software or proprietary software. They want software which suits their needs in the long run, which they trust will still be around to protect their investments and their careers, and with a clear way to get support from a well-established company they entirely trust.
Red Hat rules at this stage. A company like Canonical and its associated foundation is clearly mandatory to allow Ubuntu distros to enter big corporations’ server rooms and big data centers. Otherwise they’ll choose another OS for their servers, as they won’t be confident at all if they can’t contract anything with anyone. For desktops the path seems to me much longer for Ubuntu vs Windows than for server rooms, where “only” a few people, often technically aware, need to be convinced to switch from Windows or big Un*x brands, and “only” server stuff needs to be migrated. Firms have countless MS Office or Windows-only docs with macros they will never ever want to have trouble with, as these run their businesses and are used by everyone around them. Firms don’t want to alienate their staff, which is what will mostly happen in such moves from MS to Linux desktops, as people usually resist changes, even changes for the better :-).
Mammoth corporations are probably the most difficult market for Ubuntu to enter, but I believe it is necessary to succeed in entering them, as they will then ask their suppliers to deliver Ubuntu-compatible software. Imagine … 🙂
Another important thing for Ubuntu is to deliver a clear, easy and safe way to migrate servers from one LTS version to the next on an industrial scale. 10 servers can be migrated one by one. Not 5000; too expensive.
I know my post is not very free-software-minded, but it is real-world-minded. I read a post a few days ago saying that open-source software represents 2.5 billion lines of code. I thought: “whoa!”. But it also said that the COBOL software running in firms, written since the 70s, represents 200 billion lines of code, growing by another 5 billion every year … That’s the real world of IT. Firms need to be confident about how long an OS they plan to use will be supported, and about who will sign the contract. And the current 3 to 5 years of Ubuntu LTS is not enough for most of them, as they often run systems for 8, 10 years or more. I see that all year long in my job: brand new software, sometimes entirely built upon free software, working alongside some old un*x box running since 1990, for stuff corporate bosses don’t want to reinvest a single dollar in but don’t want to retire either.

B. Create an online payment service called “Ubuntu Shopper Partner” (USP), which the online shopper can use to purchase products from OPI. This service will integrate with the package management software on the user’s PC so that stuff like product keys will be handled in a manner transparent to the user. USP will make money for Canonical through fees.

C. There will be a program offering to include OPI in default Ubuntu installations in return for monetary compensation. All products in such OPI must be fully compatible with the Ubuntu release in which they are included.

D. This obviously isn’t meant to exclude Free Software, only to add commercial options as well.

I recently made the switch from the world of windows to Linux. Actually, I installed Ubuntu 9.10. Nice release, thank you! I just now stumbled across your blog, and this thread. Sorry for the belated post.

I would suggest an 18-months-to-Milestone / 18-months-to-LTS release cycle. These would be broken up into 6-month releases.
1) The Milestone is the stable desktop/workstation-type release. It is supported for the duration of the 18 months between Milestone and LTS. It is also the staging area for the upcoming major API-type upgrades/updates.
1.a) Milestone +1 would be the time for a weighted share of major updates.
1.b) Milestone +2 would be the time for a weighted share of stabilization, but would still include some new, somewhat less risky updates.

The third post-Milestone release, of course, would be the LTS, heavily weighted towards polish, polish, polish and more polish.
1.c) LTS: this release is about stable APIs, stable package updates, and a development environment that fosters intermediate- to long-term growth for its intended audiences. Its security and major packages should be maintained throughout its life of 3 years.

From the final user perspective, I and my friends dislike the 6-month cycle, because we need to upgrade the system every 6 months, and final users don’t like to change things every 6 months. I believe that a one-year cycle is a nice period to change a system – like Mandriva, we can have Ubuntu 2010, 2011, 2012… that is sufficient for final users. For ‘long term service’, I believe that 3 years is nice; for example, today I’m using Ubuntu 8.04 LTS because I don’t need to change things in my production computer every 6 months… Keep these considerations in mind!

“Long or Short” — That seems to me to not be the central question, but artifacts of something else.

Of course, the benefit of short cycles is that you see things work (or not work, so you can correct them). I have always been in favor of 15-minute (max) coding cycles, but I would not recommend a 15-minute release cycle.

You will recall that any college course you took had a prerequisite – there had to be some overlap for you to bootstrap to the next level. Can you do short cycles to accomplish big changes? (Yes, of course – but what will that natural process look like?)

Let me get back to the 15 minute coding cycle for a minute (or even, if you will, test-driven coding):

If I know what I want to do, I (let’s say) write a failing test. The process forces me to work out little details that a simple specification might miss. That is, it’s a discovery process.

How do I build a group of 15 minute cycles into a short term (usable, complete feature) release?
I discover the shape of the test; I make parts of the test (incrementally) succeed; I may discover things along the way that the feature hadn’t considered about interaction; I maybe go update a collaborating part of the system with the new knowledge – that is to say, I work on the component, and on the interface.
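As a minimal sketch of one such 15-minute cycle (the feature and names here are purely hypothetical, not from any real project): the test is written first, and making it pass forces out details an informal spec would have missed.

```python
# Hypothetical feature: a parser for "key=value" config lines.

def parse_line(line):
    # Just enough implementation to make test_simple_pair pass.
    key, _, value = line.partition("=")
    return key.strip(), value.strip()

def test_simple_pair():
    # Written before parse_line existed; the act of writing it
    # forced the decision that whitespace around "=" is ignored,
    # a detail the one-line description of the feature never mentioned.
    assert parse_line("color = blue") == ("color", "blue")

test_simple_pair()  # one cycle: red, then green
```

Each cycle ends with a small, demonstrably working increment, which is exactly what makes groups of them composable into a short-term release.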

Now, let me talk about the big change: I lay out what new things I want. I work out the big pieces of the problem. Things I don’t know how to do, I prototype (quickly) – that is, I instigate a discovery phase to feed the abstraction level I am shaping (this way I avoid “analysis paralysis”). Once the new function is worked out, I work out how it integrates, how it fits into something existing (the system interface). Maybe I identify new technology that would be desirable (think of the first CD recorder, and how that process came about); I use a “second choice” technology until then. I make tradeoffs. Once I have the big-system plumbing (interfaces and technologies) chosen, I might stub out a prototype for those interfaces (instead of making some connection or calculation for the user, I will print “this will make a connection”, etc.).
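A rough illustration of that stubbing idea, with hypothetical names – the placeholder only reports what the real plumbing would do, so the surrounding system can be structured and exercised before the plumbing exists:

```python
class ConnectionStub:
    """Stand-in for a real network layer that hasn't been built yet."""

    def connect(self, host):
        # Instead of making a real connection, announce what would happen.
        print(f"this will make a connection to {host}")
        return True

def sync_profiles(conn, hosts):
    # Callers are written against the final interface, so swapping in
    # the real implementation later requires no changes here.
    return [h for h in hosts if conn.connect(h)]

result = sync_profiles(ConnectionStub(), ["alpha.example", "beta.example"])
```

The design point is that the interface is real even when the implementation is fake; structure and discovery can proceed in parallel.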

I structure my part of the system (that which I have control over), and I repeat the processes of structure & discovery, and of stubbing and functionality, until there is something useful.

You see – with goals of discovery, the issue is not one of long term vs. short term – it’s one of integrating the two, and when (and to whom) you expose the short term at the state you’ve achieved.

If you have no overriding plan, no big bang happens. But if you have an overriding plan, discovery (or a change in available technology) will modify your big plan – maybe a little, maybe a lot.

But the big unifying factor of BOTH events (to use the KDE and Gnome examples) is that they fit into some plan; some part of a big new plan will not be ready for prime time; when it is, it will go through the phase of discovery inevitable when exposed to an ever larger user base. This is where incremental and big-plan converge and overlap. I would argue this is the new (higher-abstraction) ever-ongoing incremental: it’s nothing new, it’s just a higher-level view of the same things, scaled differently and integrated differently.

There is a process for identifying the important things at the higher level of abstraction (call it Architecture, but it is more than that, and architectural rules alone don’t cover it). In fact, I think of 3 different layers, or kinds of architecture in designing systems.

The process is a complete parallel to team/social processes; human systems (how they operate) and big system design match well, and in fact must intersect / interact.