
IraLaefsky writes "The appropriately titled Becoming Agile: In An Imperfect World by Greg Smith and Ahmed Sidky offers a realistic path to the family of Agile practices which have become prevalent in software development in the last few years. This family of approaches has been widely adopted in the past decade to replace the traditional Waterfall Model of software development, described in a 1970 article by Winston W. Royce, 'Managing the Development of Large Software Systems.' The Waterfall Model stressed rigid functional and design specification of the program(s) to be constructed in advance of any code development. While this methodology and other early formal tools for Software Engineering were infinitely preferable to the chaos and ad-hoc programming-without-design practices of early systems, these first tools ignored the fallibility of the initial interviews used to construct the design, and often resulted in massive time and cost overruns." Read below for the rest of IraLaefsky's review.

Becoming Agile: In An Imperfect World

author: Greg Smith and Ahmed Sidky
pages: 408
publisher: Manning
rating: 9/10
reviewer: IraLaefsky
ISBN: 1933988258
summary: provides the tools to introduce and adapt agile practices in a variety of corporate cultures

The Agile methodologies described in this text stress an iterative approach to software development, with the continuous involvement of users (or user surrogates). Each iteration spans several weeks (at most two months), during which a concise partial design requirement, or 'story,' is translated into a complete executable version of the program that can be demonstrated to users for immediate criticism and controlled feature addition. These practices have undergone various codifications since the Agile Manifesto of 2001. Among the more popular Agile methodologies are Extreme Programming (XP), Crystal Clear and Scrum.

In describing these development methodologies, this practical handbook takes an approach sorely needed in descriptions of Information Technology (IT): it assumes that the purchaser is considering employing the technologies described within the context of a real corporate environment with existing strengths and limitations, an existing approach to the problems addressed, and cultural biases concerning the adoption of new technologies. This approach enables the book to be used as a virtual consultant, taking the experiences described in a case study based upon the authors' advisory experience, together with the test of organizational readiness for adoption and the need for customization, as a true guideline for introducing these practices in a culturally and technologically appropriate fashion. During the mid 1980s I served as an internal consultant at a large insurance firm; at the time, we were considering the introduction of Expert Systems methodologies into the IT organization. I purchased several handbooks intended to introduce this new technology from academia to companies in the financial industries. Most of these books did an adequate job of describing the nature and basis of the technology to IT staff and Business Analysts trained in existing approaches. But all of the available books failed to chart a path for an IT organization with traditional development practices to successfully migrate to the new technology and appropriately translate it for business management. Becoming Agile introduces a new, effective method for describing the risks, benefits and appropriate adaptation of a radically new technology to organizations with existing successful and unsuccessful software development practices and a particular business culture.

Important features of this guide include the Sidky Agile Measurement Index (SAMI), which provides guidelines for moving your particular organization to Agile practices; the non-religious presentation of multiple Agile methodologies and approaches (specifically XP and Scrum); appendices on organizational readiness assessment; phased development within the Agile context; an overview of the Agile process (suitable for business presentation); and the author forum. The importance of recognizing that new technology methodologies such as Agile practices must be introduced and carried out in the context of a specific organization, with its own strengths and foibles, cannot be overemphasized. Step-by-step directions and illustrations are given for choosing an appropriate target application for the initial introduction of these methodologies, and each stage of implementation and its possible stumbling blocks is carefully outlined.

That it provides the tools to introduce and adapt these practices in a variety of corporate cultures, with varying degrees of technical sophistication, is an invaluable advantage over other Agile texts, and will save an organization many thousands of dollars in consulting fees.
My only minor nit with this exceptionally fine introduction to Agile Methodologies is that some of the illustrations appear to have been formatted in PC-based tools such as Visio and PowerPoint, and require a bit of squinting to study in the smaller book format. With this trivial exception, I award this excellent guide and virtual consultant an almost perfect nine out of ten, and recommend it to any organization seeking to intelligently adopt Agile Practices.

The print edition is available at all retailers, while the ebook can be purchased exclusively through the Manning E-Book Storefront.

It does work at the right granularity. Implementing a straightforward, well-understood feature works best without wasting time being interrupted by process, creating more documents than actual code, or carving deadlines in stone.

At some point, the actual work needs to be done, real code gets typed, and the results tested. That's the point when the process needs to back off and let stuff happen.


If you think you can just ignore all the design work when you start coding, I'm afraid you are not going to be very successful. You seem to have a 'go away and I'll get you the answer' approach, which is the same as throwing it over the wall.

Good developers need to learn how to work cooperatively in teams. Too many times I see people who *could* be good developers but have this isolationist attitude and don't want to be involved in the big picture.

But what do you do if the spec. sheet is wrong? What do you do if the spec. sheet is unclear? How about if you can't meet the requirements given by the time specified? Methodology answers all those questions.

They even seem to come with the weird religious rhetoric, too, promising that if only you embrace $Methodology, your life will change. Well, except the ultra-oppressive ones like CMMI: it demands that you accept CMMI or be destroyed.

It seems to be a fact of life that these things are going to float around, so I guess what makes sense is figuring out which ones are relatively better or worse, and what ideas from them are relatively useful or not. As (excellent tech blogger) YosefK put it [yosefk.com] in a review of an Ex

I'm almost certain that the language fad will be functional programming, unless of course something even bigger and sloppier comes along to counteract Moore's Law. The management and process fad? Who cares? Except for a lucky few, we'll all have to nod and smile and suck it up until the next silver bullet comes along.

The only Agile Programming method is to trust your workers, and give them what they need to get it done.

My company said it was agile, but every process was waterfall with the names changed. More gates, more reviews, more layoffs and offshoring.

I could support my application 100% and have time for other development, but instead I have to train other teams (10 people at least at this point) and produce enough documentation that they can fire me when I get too expensive and hire a homeless guy to replace me.

Trust your coders, give them access they need. Then, when someone inevitably breaks your rules, hold them accountable. That's the key. Don't change the rules for everyone so you can answer "What have you done to prevent this from happening again?" The answer should be "We enforced our current policies, the person has had all access stripped, then terminated, and a reminder sent out."

I can't do agile development if .NET has built-in limitations which require a change request to an offshore server-admin support group that doesn't even work my hours. So I sit idly until the workday ends, wake up the next morning, and realize the SA team wants more information. I just bypass the whole thing and develop locally, but I can't always demo locally. If I could configure the box myself, I'd be able to document what I did so I could write an implementation guide, but I can't even do that - the dev servers are locked down. I am expected to document steps that I can't perform myself, nor test. That adds time to development.

Trust your coders, work closely with them, and get something working. Then plan for changes, because there will be design updates as well as requirement updates.

It all boils down to hiring people who can code either quickly or generically, so that when something changes they can respond - and then allowing them to respond.

Fully agree with you on this. What the Agile process has shown me so far is that it's a very fragile process, where management wants to break down the developers to the point of making replaceable parts of even the very best programmers. Add the war room to the mix, keep poking developers daily with "what I did yesterday, what I'm going to do today, what's holding me back" even when there is no work planned, so that you keep everyone in check, all the time. Every job in the world has some slack built in,

As far as answering the phone goes, it helps to give out your cell phone number to prospects, so that, when they call, you can step out of the room as if you were discussing a private issue with a friend or a family member.

We're a small company, so we can probably get away with it, but I require my coders to show up twice a week to the office for regular meetings and then be available if something comes up. All our code is either in SVN or Git repositories depending on the project. (Our Java team likes SVN, the web development team likes Git. Usually one isn't dealing with the other). We just come up with a list of milestones and due dates and then I get out of their way. Once a new feature is added, we test for bugs, an

and produce enough documentation that they can fire me when I get too expensive and hire a homeless guy to replace me.

Or find someone who can quickly pick up the work, when you inevitably die horribly in a plane crash on New Years, when the commercial airliner experiences a catastrophic failure and both sets of stabilizers snap off, causing it to crash right into your living room.

It's terribly precise, I know, but hey - you were being overly paranoid.

A problem with many of these methodologies is that they claim to be the silver bullet, or at least a one-size-fits-all approach. The waterfall model may be inefficient for short, quick feature changes to existing products, but on the other hand the Agile approach is not very good for long-term project planning and infrastructure. Often these methodologies become ideologies, with proponents pushing the ideas religiously. E.g., the project manager who has the attitude that everything will be great if ever

I hate fads, but this new agile nonsense isn't your regular fad. It's much, much worse. I've seen "agile" used as an excuse to deliver a substantially changed final spec well into the final stages of development, and then people wonder why the product, which started off as one thing and has been so completely MANGLED to become that other thing, performs poorly and is buggy. It's like starting off with a spec for a bridge, then in the months where the bridge is almost complete, despite incomplete documentation, asking for that

I must agree. As a Project Manager my boss has sent me off to several of the "Agile" development workshops and I came to the same conclusion I do about most buzzwords - it's a solution looking for a problem. Too many PHB's walking around looking for a way to keep themselves busy.

Oh, right, yet another valiant effort at demolishing a strawman of the Waterfall Model... which never really meant the caricature opposed by the "agile" crowd, and wasn't applied that way. Ever heard of iterations? Right. Apparently the agile crowd still hasn't heard that anyone else uses those.

I'm uncertain as to what waterfall means where you work...but despite loud protestations from several developers...where I work, it means precisely the caricature that we Agile folks oppose. And it's basically mandated that a formal, pre-planned, no-iterations "waterfall" approach is used, as per the guidelines pushed by PMI and CMMI. I wish you were right...but it ain't a strawman.

Actually, I hope you realize that -- since it's referenced even in the review here -- the 1970 article by Royce which described it (albeit not actually using the term "waterfall") was a description of a dysfunctional process which doesn't work right for software. That article wasn't a formalization of the waterfall model, but a critique of it. (And a modification to something that works better, at that.)

It's not even the only one. Several authors and books exist about making software development work, _long_ before there even was such a thing as the XP manifesto or the word agile.

So at the very least we have the first strawman: that we needed the Agile crowd to come along and wake us up to the fact that the waterfall doesn't work. It was widely known that it doesn't work, at _least_ since 1970. You know, the fun times before microprocessors and personal computers and when even Unix was still just a cutesy personal experiment.

The model actually doesn't come from computing but from real-life construction and partially from manufacturing. That's the place where the costs get prohibitively higher if you discover a mistake after you've built the whole building. And while _some_ misguided MBAs have always tried to treat programming like a generic assembly-line operation -- including using methodologies like this one, which are utterly inappropriate -- it's hardly been the standard, de facto and de jure, methodology in computing. The pretense that somehow if you're not doing a crap^H^H^H^H agile job, you're one of those using the waterfall model (never mind that it was discredited for 3 decades already) is the strawman that irks me the most. Yes, some islands of insanity exist. You have my sympathy if you work for one. No, it's not the standard in programming. No, we don't need a wakeup call or extreme methodologies to know it doesn't work. We knew that already. At least since 1970.

And in reality even the places which go somewhat waterfall-y... well, let's just say that any place that's ever had a change request or made a version 2.0 of a product, essentially has iterations. Not the best form of them, to be sure, but it is iterations. Requirements change after code has been written too.

If your CMMi consultants are pushing waterfall as official CMMi canon, you might want to find yourself a new batch of CMMi consultants. CMMi is about being able to measure and manage your work, not about using a specific methodology to do so.

But a lot of it depends on the size of the company and the average project size. If I have an estimated 2-month effort and 4 programmers (all internal), it's unlikely in the extreme I'm going to deal with the complexity of an iterative model. Plain old waterfall fits

I'm uncertain as to what waterfall means where you work...but despite loud protestations from several developers...where I work, it means precisely the caricature that we Agile folks oppose. And it's basically mandated that a formal, pre-planned, no-iterations "waterfall" approach is used, as per the guidelines pushed by PMI and CMMI. I wish you were right...but it ain't a strawman.

This is ridiculous, and shows that a lot of developers simply don't understand processes.

1) Nothing in waterfall mandates that it cannot be iterative. In fact it *must* be iterative, allowing you to jump up levels as well as flow down. The waterfall levels are simply to be used as milestone gates so progress can be measured. Companies simply cannot work with the "just give me unlimited funds and resources, leave me alone, and I'll get you the product sometime" approach that some so-called agile developers th

Waterfall-ish approaches ought to be iterative. I agree. Enough phase-gating, Change-boards, and weight of documentation, however, and it becomes infeasible to go backwards up the chain

Waterfall is iterative. If your company implements it stupidly then that's not the fault of the process model, just your implementation of it. Each gate does not require 100% completion of the current phase before starting the next phase; you just need to complete enough to reduce the risk of proceeding. Good guidelines are: at PDR you need 60% of high-risk design completed; at CDR, 80%.

Waterfall asks for LOTS of decisions to be made LONG before the last responsible moment.

No, only the high risk. And if you can't decide on things like basic architecture early in the program, then you shouldn't be dec

The software implementation of the waterfall process was never intended to be purely iterative. This clearly does not work. The intent of the software waterfall process is exactly as I described, without having to invent stupid cutesy monikers and pretend it is something new.

What you described is not waterfall, not by any measure. It is a contrasting methodology.

It has nothing to do with cutesy monikers (which, btw, is a fair dig at Sashimi). The reason it's not waterfall, however, has nothing to do with the overlap of phases.

What it does have to do with is the fact that it's iterative. Iterative design was developed (or evolved, or whatever) in response to the weaknesses of waterfall methods. The iteration is what makes it !waterfall.

Shows what you know about PMI then. In fact, right up front in PMI's definition is that a project is "progressively elaborated."

Both are perfectly at ease with an iterative model, simply noting that

1) each iteration is itself a project (or at least: a project phase which requires most of the same initiation, planning, execution, monitoring & control and closing disciplines as a whole project)

2) if you're changing your mind as you go along, there's a real Risk that you'll have to rework earlier outputs to suit later decisions, with impacts to some or all project constraints.

But of course, handling Risk within a mature framework - like PMI's - you'll be able to assess this against the Risks of not doing this, and understand which - in any given situation - is the better option.

In fact, I'm still trying to see one Agile practise that is unique and couldn't be applied to any other project:

* Daily standup meetings? Check
* Ongoing stakeholder involvement? Check
* Defining tests as a way of validating Quality (i.e. that the product meets requirements)? Check
* Delivering end-to-end working software each release/iteration? Check
* Late changes in requirements? Check, but each one gets a real decision about whether its impact is worth the benefit
* Trust of the individuals & self-organising teams? Check - good old McGregor's Theory Y (as described in the PMBoK)
* Co-location? Check

And far too often, I've seen Agile used as an excuse for cowboy coding and sheer laziness in producing any documentation (try getting your AMS team to support a complex system and diagnose a production outage based on Googlechat records); I've seen developers rush for it as a simple power grab (but hey, everyone thinks they should be in charge), which is slightly better than rushing for it because it's trendy; I've seen Agile projects swallow 10s of millions of dollars with no business benefit (and yes, I've seen waterfall projects do the same).

While Agile is a particularly useful approach when dealing with customers who genuinely can't know their destination until they're along the track of discovery, and so evolving requirements within the project lifecycle are really called for (User Interfaces being a major example), generally it's not development's responsibility to cover up for customers' failure to plan and decide.

And to anyone who's done a few Agile projects with teams of 5, or 10, or 50... come back and boast when you've done something *hard*. As the classic book on the PMP exam says: it tests from the perspective of a large project, that involves 200 people from many countries, takes at least one year, has never been done before in the organisation and has a budget of US$100m or more.

When you've done that kind of project, come back and complain about PMI's methodology.

Hard methodologies are definitely the way to go in some environments, to solve some kinds of problems. I've been in those situations, and been glad the structure was there. However, there are situations where agile-like project organizations will do the job better. The trick is realizing that no one size fits all, and that each development model fits different organizations, and sometimes even different projects within an organization, very differently.

Actually, neither PMI nor CMMI pushes for the use of waterfall methodologies. I've worked in a CMMI level 3 environment and we did XP. And I've also worked on projects "guided" by PMBOK managers and using Scrum.

But I know what you're talking about; I guess I've worked in similar situations, where people think this push exists. It usually happens because, during their training, whoever talked about CMM or PMBOK used waterfall-like phases or tools like a WBS as examples somewhere, and people just assume it needs to be t

Nevertheless, I fail to see why XP and generally the "Agile" crowd has to compare itself only to the Waterfall Model, every single time. It's getting like listening to the Creationist crowd: they can't even make their point without devolving into just bad-mouthing Darwinism. WTF of a claim is it, anyway? "Yay, we're better than a methodology known to be the absolute worst at least since 1970"? There's a difference between it being alive in some places, and acting as if that was the only alternative to the "

The first link starts with a mistake and is from 2003. A lot has changed in "agile development" since then.

And the second actually praises the way Google does things (quite "agile" if I may say so) while criticizing fads and the methodology-of-the-month.

I couldn't agree more with Steve about how bad it is with the agile fad and all the courses and whatever people keep selling and hyping. And I do hate using the word "agile" to mean so much different stuff.

Steve Yegge's post is (line for line) more of a masturbation session on how cool Google is.

Although he also makes the point quite well that if you're all rock star coders in a company driven by engineering instead of marketing, you don't need no "steenkin'" methodology. But then again, that's more of my first complaint, now isn't it? ;)

"Agile has turned out to be the worst of Micromanagement. Make replaceable parts out of every developer."

That's as much a strawman as the (implicit) claim that everything that is not "agile" must be pure-shit waterfall. Agile *empowers* developers: it allows developer-to-customer feedback instead of a top-down "that's what you are gonna do." It rewards the problem-domain knowledge the developer gains through successive iterations and feedback with the customers. It gives the developer the chance to have some

Absolutely not. Writing something in an "agile" way is not the same thing as producing something that is "agile". As an example, a project here at work was a 2-year development effort for a (in the grand scheme of things) pretty boring CRUD/workflow-type app. What would be the reasonable approach to take to build 20 modules, all with pretty similar features? Write (or buy) an engine, pushing as many details as possible into data (configuration files). Adding a new module should then take days; changing modul

The best programmers utilize domain specific knowledge gained through years of experience to perform the project design and development tasks that make sense. Trying to generalize one model to fit all domains is doomed to failure. Mainframe COBOL screens work differently than web screens which work differently than low level screen drivers and so on.

If you're starting work in a new domain, no methodology is magically going to make things work. New domains of development require plenty of experimentation and failure. How to best build the project is going to depend on what comes out of that experimentation.

And above all, the most important factor is people. You need smart people. No amount of clever methodology is going to make mediocre programmers create a great project. And for smart people, SDLC usually stands in the way of what they already know works best.

And above all, the most important factor is people. You need smart people. No amount of clever methodology is going to make mediocre programmers create a great project. And for smart people, SDLC usually stands in the way of what they already know works best.

Well said! In fact, in my 25+ years of programming I have found that methodologies are put in place for PHBs and the talentless. I recall that when I took my first programming class, the teacher insisted we write the flowchart first. I found it easier to write the code and reverse-engineer the flowchart. To this day I find the formal process to be an anchor weighing down creativity and speed of development.

Smart people can still make a project fail miserably. Fifty-fifty, they also get in the way of a project (by miscalculating priorities, for instance).

You don't just need smart people: you need the right people. That's called a team. To the MBAs, that's Management 101, and it's no different for the s/w field. Bring the wrong team, and I guarantee it will fail.

The problem with right-designing (and I do like the designation, btw) is that having multiple methodologies can create confusion. Consider the corporate IT shop: hundreds of IT folk of various skill sets working in an organization with strict auditing and quality controls. Repeatable, predictable processes work best in this environment... pick one and stick with it as best you can so everyone's on the same page. It's a trade-off to be sure, but in some environments you have to learn to work within s

Agile Development is a monthly waterfall. You have short cycles, but if anything there is even MORE up-front planning for each sprint than you would get in a waterfall model, and precious little time to change if things need to be retasked.

The only real advance in methodologies as of late has been to include automated testing.
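To make that remark concrete, here is a minimal sketch of the kind of automated test being referred to; the `slugify` helper and its requirements are hypothetical examples, not anything from the book or this thread:

```python
import unittest

def slugify(title):
    """Turn an article title into a URL-safe slug (illustrative helper)."""
    cleaned = "".join(c.lower() if c.isalnum() else " " for c in title)
    return "-".join(cleaned.split())

class SlugifyTests(unittest.TestCase):
    # Each test encodes one requirement; a CI job can run the whole
    # suite on every commit, which is the "automated" part.
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Becoming Agile"), "becoming-agile")

    def test_punctuation_is_dropped(self):
        self.assertEqual(slugify("Agile: In An Imperfect World!"),
                         "agile-in-an-imperfect-world")

if __name__ == "__main__":
    unittest.main()
```

The point isn't the helper itself but that the requirements live in an executable form, so any regression shows up the moment the suite runs.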

Our company is trying to switch to Agile methods and has bought some software. We're hoping to get training scheduled soon. But from what I see in the intro so far, all the examples are from GUI development, web support, or IT, where a large number of coders with very similar skill sets is used to implement a new application from scratch for deployment.

But our company's software has a large installed base, and we need to fix bugs in existing code and somehow graft new functionality onto the existing architecture with full backward compatibility for old saved data. And the skill sets of our coders vary widely. There are just a couple who can even touch isoparametric element stiffness matrix code, to name just one example. I still don't know how agile is going to change the way those two guys work.

I see the advantages of early feedback, early testing, testing partial implementations, etc. But at some point, for some kinds of code development, Agile may not be the best way to do the code. And I am hoping the training will shed light on where I can use Agile and where I should stay clear of it. I don't want to jump on a bandwagon because it is the latest thing and then have a minor revolt among my padawans.

I ask that you go into it with an open mind. Agile stuff has been treated as an overhyped bogeyman (look at all the posts here) and understandably many of us geeks are very cautious about approaching it. However, if you pull the layers of hype and bullshit away, you recognize that it's just a different mindset guiding the work. This mindset happens to work a lot better with people and software (I am now convinced of this).

I'm hoping that you have a good instructor for training. But also, I am hoping that

My preferred approach these days is what I think is technically called Evolutionary Prototyping. Basically you start with some rough requirements, make something to show people, get more refined requirements, and repeat. At some point when the product is useful, you go live, but in reality you're never done and just the time between deployments gets longer.

This approach is horrible for things that have to work perfectly the first time (e.g. rockets to Mars), but for web development seems to be a decent approach. It's also hard to estimate how long something will take, as the requirements aren't known up front. Still, it's what I've been using for years. I don't think I knew what it was called until our organization brought in a consultant to talk about this stuff.
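The build-show-refine loop described above can be sketched in a few lines; the function names, the toy "feedback" sequence, and the stopping rule are all made up for illustration, not taken from any particular methodology text:

```python
def evolutionary_prototype(initial_requirements, build, get_feedback, max_rounds=10):
    """Iterate: build something, show it to people, refine requirements, repeat.

    `build` turns a requirements list into a demoable artifact;
    `get_feedback` shows the artifact to users and returns refined
    requirements, or None once the artifact is useful enough to go live.
    """
    requirements = list(initial_requirements)
    artifact = None
    for _ in range(max_rounds):
        artifact = build(requirements)       # make something to show people
        refined = get_feedback(artifact)     # users react, requirements sharpen
        if refined is None:                  # useful enough: ship it
            break
        requirements = refined               # and repeat
    return artifact

# Toy usage: simulated users ask for "search" once, then are satisfied.
feedback = iter([["login", "search"], None])
result = evolutionary_prototype(
    ["login"],
    build=lambda reqs: "prototype with: " + ", ".join(reqs),
    get_feedback=lambda artifact: next(feedback),
)
# result == "prototype with: login, search"
```

The loop never "finishes" in the waterfall sense; `max_rounds` just stands in for the observation above that the time between deployments grows rather than the project ever being done.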

Yeah, this is the only type of approach that really works at my job. Most of the time the system owners and end users don't *know* what they really want at the start. They have a vague idea of a few things, and that's it. Trying to develop detailed specs based on that has never worked out; they agree to anything you put under their nose, then in 6 months realize it's all wrong.

Instead, start building based on what you do know. Refine that. Get a lot of feedback. People may not know what they want, but they kno

Evolutionary Prototyping probably works well with small systems. With large systems, the sheer weight and cost of producing many prototypes is going to get in the way. Just testing a large system is a significant expense.

Another area it probably won't fly is security systems for organizations like NSA. You won't be going back and forth with those guys over some system, they'll simply conclude you have no idea what you are doing and your contract won't be renewed. All of your security mechanisms and overall

This approach is horrible for things that have to work perfectly the first time (e.g. rockets to Mars), but for web development seems to be a decent approach.

Doesn't NASA use a fair amount of prototyping when they're designing new rocket systems? I mean, there's nothing like building a small scale version to see a preview of what you've got to deal with when building the full scale system.

I swear, between O'Reilly's [oreilly.com] animals and Manning's [manning.com] depictions of people from different eras and cultures, I'd say a picture of a guy with a dead bird tucked in his belt is the most random choice for the cover of a programming book that I've ever seen.

and everybody is theoretically supposed to be able to do system analysis, coding, and QA. That way nobody is unique and it's much easier for them to lay you off and swap some new cog into the machine. (Ok, so I just got laid off and I was on one of the agile teams. Actually we had a bunch of agile teams and they pretty much dumped most of the people on them and kept people who weren't on them because they had unique skills. I know, I should have seen that one coming.)

The attempt to write a Python implementation in Python, PyPy [codespeak.net], turned into a death march. The project has been underway since at least 2003 (when they had their first "sprint"), never produced a usable system, and the European Union pulled the plug on funding. But the project limps on. There's a released version. It's slower than CPython. There's supposed to be a "just in time" compiler Real Soon Now. (This is try #2 at a JIT, not counting the schemes for outputting Java bytecode and Javascript.) Six years in on a compiler project, and no product.

The PyPy project is very "agile". They have "sprints". They have "flexibility". They have nightly builds. They have mailing lists and trackers. They support multiple output back-ends. They have about 50 contributors. What they don't have is a usable product.

Meanwhile, one programmer produced Shed Skin [blogspot.com], which compiles Python to C++, with a speed gain of 5x to 50x over CPython.

When the problem is dominated by design and architecture, "agile" doesn't help.

"Agile" designs for the business logic part of web sites makes sense. The problems aren't technological. Writing an efficient compiler for a language that's never had one is a design and theory problem. A big team hacking away somewhat randomly won't cut it.

Designing is much harder than writing, but figuring out what to design is harder still. The difference between what a user wants, what they need, and what they need that can be built under budget can be very large. Few things in development are more disheartening than developing a system as demanded by the customer, and then seeing it go unused because the requirements that were written down had little to do with real-life needs.

That one about working software vs. comprehensive documentation. I mean, what company actually had a problem with too much documentation? All my experience in IT has been that people don't want to document anything, and that dogma would probably make them think that not documenting is not only OK but actually a good idea.

I think the appeal of agile development is that it removes any barriers that programmers might have, such as rigid milestones, and basically allows management to do what they want in terms of setting goals. It also appeals to management because the knowledge sharing implies that they can get rid of their most expensive employees after a period of time (once the knowledge has dispersed). Specialized knowledge is anathema to management, as it means you have to pay that person more, and if it's critical to the business, it's harder to fire them.

We have to evaluate agile based on its real-world results, not what the books describe. In the real world, agile creates a very high-pressure work environment, where personal space is non-existent, everyone is watching you, and your work is constantly on display. This pressure can produce productivity gains, but I would say that in the long run these gains aren't sustainable. I think agile is a very poor fit for your average introvert, which, imagine that, describes most programmers very well. What I believe will happen is that over time the better developers will move to a workplace where things aren't quite so agile.

In the meantime, throwing out ideas such as design-first is going to cost us, big time. I think that software quality will drop, but it won't be obvious, as "quality" and "productivity" aren't easily measurable. Often times, managers walk through a room, and if they see a bunch of people typing away or debating some design issue, they read that busyness as productivity. No, I think the drop will only become apparent when non-agile competitors clean their clocks, but by then it will be too late.

Right, because goofing off is only allowable for managers, who have their own offices, not programmers. After all, we all know that having a desk in a quiet, distraction free location means that one is up to no good. What other reason would someone want this kind of environment? The same goes for libraries and other quiet settings, it clearly means that people are up to no good.

You're going to overlook quite a few very talented programmers with that kind of attitude.

Often times, managers walk through a room, and if they see a bunch of people typing away or debating some design issue, then they see that busyness as productivity.

Very true.

The biggest surprise I had in my professional life was when I was stuck on a particular problem. Completely stuck. So I tried doing other stuff in the meantime. Then I ran out of stuff to do. Posted on newsgroups and forums, but still no answers.

So I asked my boss if he had anything else I could do, while waiting. "Go play solitaire, read a book, go for a run. But not too far, and keep your cell on you if we need you."

He knew that some things cannot be forced. So I got paid to sit on my ass and read books for two days until the answer popped into my head. Sometimes the best way to be productive is to just sit back, relax and do nothing.

My hope is that it will fail. However, I've seen over and over that just because something should fail, doesn't mean it will. It's all a matter of what people will put up with. Unfortunately, markets don't always teach us the correct lesson, or measure the correct variables.

If anything, Agile requires high-quality developers to be very successful: since the design process is diluted, you can't get away with having one or two people who run the show technically and a bunch of drones. Bad developers who can't shape up throw a wrench into the system and send quality down the toilet.

An agile system where the developers are actually talented raises their skill levels as they go along, and it can be very rewarding to everyone involved. Eventually management can just set them loose on a project, knowing not just that the end result will be good, but that most issues with requirements will be dealt with before they become a serious problem.

If anything, the problem I've had getting Agile implemented is getting management to accept the budget issues that come from leaving behind a model of two very good developers/architects and a bunch of bums paid a pittance.

My biggest problem with quick prototypes is the users who expect too much.

Me: Here's the prototype. It's black and white, but the finished product will be in full color.
Them: This menu item is supposed to be green.
Me: That's because the prototype is in black and white. We're just trying to get the text and spacing correct.
Them: That item is supposed to be red...

Managing users' expectations during the prototype phase can be a full-time job. (:-(

There are plenty of different methodologies for writing software. IBM's Rational system combines project management, a feature-request system, and a code-management system into an iterative development structure that I found to be very robust and fun to work with. The idea is to start writing code now. Write the code, make sure your concepts work, and do it before it's been written into a budget and had time allocated to it. With each iteration of the development process, you bring together more code and more functionality.

No, you missed the point. What I said was that it gets you to start coding to make sure your concepts will actually work, BEFORE you blow your budget because it turns out you can't do something as originally intended. Traditionally you'd write up the whole project plan, budget time, resources, money, etc., before writing a line of code. The Rational way, you still do some of that, but you can avoid the major pitfalls of budget (be it time, money, resources, whatever) underruns and overruns.

Is that everyone gets so caught up in "hours burned" for each scrum that what is actually getting built takes a back seat to the hours figure. I was on a sprint where I had no hours burned for three days, which really, really sucked. But at the end, I was the only one who actually had a product you could do a sprint demo with.

While the this methodology and... The Agile methodologies which are described...In describing these development methodologies...non-religious presentation of multiple Agile methodologies...

The word method may be defined as "a systematic way of accomplishing a task"; near-synonyms include "procedure" or "technique". The word methodology means "a study of methods", or perhaps "a comparison of methods", or "a science of methods".

This substitution of the word "methodology" for "method" happens so frequently that it has practically become accepted usage.

Scrum has one good point: show your code to your customer often and apply their comments. But that works only for GUI or web work. I cannot imagine it in, say, medical software. For instance, one of my colleagues had to optimize a crucial piece of code (some heavy math for a CAT scanner). It took him several months. The only thing he could say to the customer was "it is now two times faster" or "it is now 3.45 times faster." Not something you can really get useful feedback on.
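For what it's worth, those "N times faster" progress reports boil down to a benchmark ratio. A minimal sketch of how such a figure might be produced in Python (the workload here is a toy stand-in for the scanner math, not anything from the actual project):

```python
import timeit

def reference_impl(data):
    # Original, straightforward version: a sum of squares
    # stands in for the heavy numeric kernel being optimized.
    total = 0.0
    for x in data:
        total += x * x
    return total

def optimized_impl(data):
    # "Optimized" version of the same computation.
    return float(sum(x * x for x in data))

data = list(range(50_000))

# Sanity check: both versions must agree before a speedup number means anything.
assert reference_impl(data) == optimized_impl(data)

t_ref = timeit.timeit(lambda: reference_impl(data), number=10)
t_opt = timeit.timeit(lambda: optimized_impl(data), number=10)

print(f"speedup: {t_ref / t_opt:.2f}x")
```

The point of the sketch is that a correctness check plus a timing ratio is the *entire* customer-visible deliverable for months of work, which is exactly why sprint-by-sprint feedback adds so little here.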

In another company, we were supposed to do a project for GE, and there was some serious GUI work there, but their office was several thousand miles from us and I really didn't know how we would hold the regular meetings with them that Scrum requires. Also, they had some very elaborate specs and did not see much point in the Scrum methodology. In other circumstances, that project would have been a perfect one for Scrum.

PS: One thing I don't like about Agile evangelists is that Agile (Scrum in this case) is always right. It's either "what you are doing is not Scrum" or "you did not adapt Scrum to your needs," but the problem is never with Scrum itself.

Waterfall? Agile? All are passing fads. There is only one correct methodology, and you are probably using it already: the Avalanche Software Development Process: http://www.youtube.com/watch?v=yR_XIPOkSmQ [youtube.com]

I would hope so. Accounting is so formalized that the prototype essentially has already been written; it's in the way accountants would do it without a computer. The same rules have to be followed whether you are doing it by hand or by machine.

You bring up great points. If a job is currently being done, and we want it done the exact same way but cheaper and faster, a methodical approach works well.
If, on the other hand, you want to create something that's never been done before, an agile methodology makes much more sense.