Posted
by
timothy
on Wednesday December 23, 2009 @02:27PM
from the productivity-is-a-blunt-edged-word dept.

theodp writes "John D. Cook takes a stab at explaining why programmers are not paid in proportion to their productivity. The basic problem, Cook explains, is that extreme programmer productivity may not be obvious. A salesman who sells 10x as much as his peers will be noticed, and compensated accordingly. And if a bricklayer were 10x more productive than his peers, this would be obvious too (it doesn't happen). But the best programmers do not write 10x as many lines of code; nor do they work 10x as many hours. Programmers are most effective when they avoid writing code. An über-programmer, Cook explains, is likely to be someone who stares quietly into space and then says 'Hmm. I think I've seen something like this before.'"

Programming is usually teamwork, and as such it's hard to measure compared to a salesman, who pulls only for himself. Another thing is that coders usually aren't that good at expressing themselves, so it may not be obvious who is more productive than others.

And how do you measure that productivity? Is it the amount of code you write? What if it's bad code? Is it the quality of code? What if that shows up as less productive? No one notices unless you make it visible and show your boss or fellow developers that you're the man.

But being an awesome coder and making the upper levels see it won't get you 10x the salary. It might get you a better salary, but at that point you should probably aim for a lead developer or management position, because that's what will happen eventually.

I know a person who used to run an application company. There was a coder who worked as such for some years, but he also took on more important responsibilities in the company. His boss always said what a good coder he was and definitely noticed him over the others working there. Later he became the boss running that company, when the old one stepped down and remained only as its owner.

Writing a new routine for an accounts payable system is one thing, but there are only so many Gary Kildalls, Bill Gates and Paul Allens, Wozes and Jobses, or John Carmacks in the world, and these are paid by the universe accordingly.
Of course, there are also many Phil Katzes out there too.

I'd argue that there are more of them than you think. It's just that all the hard (and cool) stuff has already been done. So the guy who 30 years ago might have developed the first viable JIT compiler is now working on some esoteric feature of some esoteric codebase that you've probably never heard of. There are a lot more programmers now than there were when those guys got their start.

And for the record, I'm probably a better coder than Bill Gates ever was (as a businessman, not so much).

I actually like a lot of the .NET Framework and related architectures myself. It is a bit bloated, but not much more so than other frameworks, and it offers a big productivity gain over lower-level constructs.

I'm not aware of any evidence that makes me believe BG is a particularly impressive programmer, and coding to 64k limits is hardly a metric for skill. You're obviously too young, but a lot of us were coding to 4k limits or even much less. 64k is downright roomy, especially with assembler or procedural languages. When I finally got a Commodore 64, I didn't know what to do with all that memory. It was hard to imagine how to use it all. Shit, I used to write custom databases for the military in Turbo Pascal that compiled to under 8k.

Heck, forget memory limits, those were easy (he said, using his tongue to push his dentures back onto the roof of his mouth while tugging his pants up over his belly button). One time my lab partner and I re-coded our elevator simulator (written in machine code, not assembler, you wimps!) so that we could enter it with a hexadecimal keypad whose "E" key debounce was broken. For you whippersnappers who never entered machine code with a keypad (not keyboard!) or switches, that means we rewrote it so that no machine instruction (one-byte instructions) or data byte had "1110" as the lowest four bits. Now that was programming.
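For the whippersnappers following along, the constraint above boils down to a check on every byte of the program image. Here's a sketch of that check in Python (the example bytes are invented for illustration; the real thing was, of course, verified by eyeball and keyed in by hand):

```python
# Sketch: confirm a machine-code image can be keyed in on a pad with a
# broken "E" key, i.e. no byte's low nibble is 0xE (binary 1110).

def avoids_broken_key(image, bad_nibble=0xE):
    """Return True if no byte in the image ends in the bad nibble."""
    return all(b & 0x0F != bad_nibble for b in image)

program = bytes([0x3C, 0x47, 0xD2, 0x10, 0x76])  # invented example bytes
assert avoids_broken_key(program)                # safe to key in
assert not avoids_broken_key(bytes([0x2E]))      # 0x2E ends in 1110
```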

Cyril, my lab partner, wound up being Bob Moog's protege and has become the key designer at Moog. Wonder if he remembers that afternoon in EE lab.

Then there was Bob in high school, who reconfigured RSTS control blocks through the front panel switches on the PDP-11/40 to enable root-like privileges. Now, that was art: several levels of indirection, the machine needed to be halted to use the panel, but it was a timeshare system and you had to get it running before any of the users noticed. Pure art, until that one time he made a mistake and caused a crash that rewrote the master file directory with all zeroes. That was a long night writing, testing, and running a program in BASIC that used heuristics to read the disks and a three month old backup (for getting user IDs and old passwords) to recover the directories on the disk. When the security guard came in at 3 AM, Chris (the friend helping me fix Bob's mistake) had to talk the security guard into not waking up the Dean of Students to report us. Bob got kicked off the admin staff for a while. Things got boring after that.

I think the point is that those more recent theories are a lot more complex than the previous ones.

Same goes for programming; in order to stand out amongst the crowd, you'll have to create something much more complex than you would have some 20-30 years ago.

Most of us could have invented the sorting algorithms we use. Most of us could have invented the data structures we use. Most of us could have invented a lot of stuff that made other people famous some 20-30 years ago. I know I "invented" some algorithm

Oh, I don't know about most programmers being good enough to invent data structures. In the 1980s, my friend Karen was credited by her co-workers at Sperry as the inventor of doubly linked lists. She encountered a situation that obviously needed them and got rid of a bunch of old, inefficient code. Exactly none of her co-workers had seen them before, nor had they thought of using something like that to fix the performance problems they were encountering due to poor data structure choice.
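For anyone who hasn't bumped into them, the structure Karen reinvented is simple enough to sketch in a few lines. This is a generic illustration, not her actual code: each node points both forward and backward, so removing a known node is O(1) with no rescan of the list.

```python
# Minimal doubly linked list: nodes carry prev/next pointers so that
# insertion and removal at a known node need no traversal.

class Node:
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

class DoublyLinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, value):
        node = Node(value)
        if self.tail is None:
            self.head = self.tail = node
        else:
            node.prev = self.tail
            self.tail.next = node
            self.tail = node
        return node

    def remove(self, node):
        # O(1): relink the neighbours, no traversal needed.
        if node.prev:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next:
            node.next.prev = node.prev
        else:
            self.tail = node.prev

    def values(self):
        out, cur = [], self.head
        while cur:
            out.append(cur.value)
            cur = cur.next
        return out

lst = DoublyLinkedList()
nodes = [lst.append(v) for v in ("a", "b", "c")]
lst.remove(nodes[1])            # drop the middle node in O(1)
assert lst.values() == ["a", "c"]
```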

Somehow I think that Bill may have had a slight "edge" over other applicants for the chief software architect position.

This is made clearer if you investigate his full name, William Henry Gates III. His father, William Henry Gates II, was a fairly successful businessman who had the money to send him to Harvard, where he made the contacts needed to start a business with inside access to IBM's management level. Yes, he was a geeky kid who got into the new small-computer stuff, and he even learned a bit of

But code is a product, and it's expected to be created. The value is obvious when it's completed, but it's still worthless to the bean counters until someone in sales sells it to a customer. The more customers they sell the code to, the more profitable it becomes.

The thanks never comes down to the programmers. When the product is completed, it's likely they'll be let go, since no more work needs to be done. The sales staff could continue selling it for years, and making a profit.

I was told I have to be able to sell the product. That's not where I want to be. I like creating things. I prefer to leave it up to sales to make it profitable. Unfortunately, the way most bosses run the show, development will always be a negative-cashflow area and sales will always be positive. Because of that, they consider development bad for the company and forget that without our work, they'd never turn a profit.

That's probably where the term "code monkey" comes from, too. It's sad but true. The only solution is probably to start your own company or do things independently. However, that usually requires you to create the idea too, not just follow specs. But there's one good thing: in the internet age it's easy to find marketers who will promote the product for a share of the income (CPA marketing).

The thanks never comes down to the programmers. When the product is completed, it's likely they'll be let go, since no more work needs to be done. The sales staff could continue selling it for years, and making a profit.

This is why I always leave lots of bugs in the code, and name the variables: a, aa, aAa, Aa, etc. They can never fire me.

... they designed the system, got the experience and then left to either set up their own company, to go abroad or become a contractor, and leave the bug fixing to someone else.

There used to be a solution for that back in the 90's. It was called equity ownership. If you tell a programmer, "build this app" he'll build the app. If you tell a programmer, "help build this company" then there's a good chance he'll help build the company.

This is why I always leave lots of bugs in the code, and name the variables: a, aa, aAa, Aa, etc. They can never fire me.

Hey, Intron, good to hear from you again. Seriously, we are really sorry we never sent you your last check after we fired you (your code had a bug in it which corrupted our terminated employee database beyond repair so we didn't have your address anymore).

In the modern world, those who create wealth are *never* paid anything close to the value of what they created. The lion's share goes to the rich fat-cats that sit at the top of the corporate ladder, contributing nothing that wouldn't have been present otherwise.

When the product is completed, it's likely they'll be let go, since no more work needs to be done. The sales staff could continue selling it for years, and making a profit.

Software that's finished in finite time? (Forever-finished, not just this-release-finished.)
What a concept! Exactly what segment of the industry are you working in over there? If my organization stopped development for a year or two just to sell the existing stuff, our competitors would soon crush us handily.

This release finish is usually enough for most companies. They can thin the herd of most of their developers, keep just a very few on, and when it's time to start on the next release, hire on fresh meat for a fraction of the cost of the last crew.

There used to be company loyalty. That's long since gone. Back in the day, if you had a job and were good at it, you would continue the job for the rest of your life, get yearly raises and promotions. Now, once they can terminate y

The thanks never comes down to the programmers. When the product is completed, it's likely they'll be let go, since no more work needs to be done. The sales staff could continue selling it for years, and making a profit.

Actually, this is the way that "creative" professions have generally worked. Consider the typical sculptor or painter. Even those that reached a level of fame have usually been paid only once for each creation. It is then owned by the client, who can resell it and not give the creator any part of the sale. There are a few countries that have dabbled with royalties for resale, but this is rare, and the royalties are typically small. The real profit from art goes to the sponsors and investors.

Authors and musicians have had some small success in getting royalties for their work. But this is most often "honored in the breach". It's well known that recording artists don't get any royalties at all, and may lose money, unless the recording sells around 1.5 to 2 million copies. Before that, all the income goes to the owner of the recording, which is the corporation that produced and marketed it. Even after a recording reaches the profitable stage, the artist typically gets only a few percent of each sale. The situation is similar with authors, who may be paid a small "advance" before production but rarely make a profit until several million copies have been sold. Most writers have worked for corporations such as newspapers or other periodicals, which pay a salary and claim all income from sales.

The movie industry has a few showcase stars who have made a small fortune in royalties. But most actors are "starving artists" who have to work at part-time jobs to get rent and food money. Movies are owned by the producers, not the actors. The few stars are held out as bait to attract the many workers who will never be stars and will never make a decent living from their creativity.

Software programmers like to think that they're something new that the world has never seen. But in reality they are merely creators in a new medium, and they are treated as the commercial world has always treated creative types. They're workers who can be paid a small salary to produce, and when they produce something that sells, the corporation can claim the profits. A few stars can be paid some royalties (still only a few percent of sales) and held up as public examples to attract the many workers that the industry needs.

The thanks never comes down to the programmers. When the product is completed, it's likely they'll be let go, since no more work needs to be done. The sales staff could continue selling it for years, and making a profit.

Cry me a river. If a programmer wants lots of money, let him/her make and sell a product or service. It's a free country, and anyone who aspires can make the effort. Even people who aren't "smart" or "productive" can sell vast quantities of crapola for a fortune.

Perhaps instead of trying to measure productivity directly, it could be done in other ways. For instance, when there's a project to be done, get two small teams together to tackle it: the first team to finish a functional prototype has its technical director given the job, and gets to assemble the next team from both. Notice which people always get chosen for a team in both phases, promote those people, and let go of the people who are never chosen.

No worker in America has pay which is "proportional to productivity". That's not how our system works.

As long as you've got CEOs making 200-400 times the pay of the average worker in the same corporation, it is impossible to have any pay which is "proportional".

The specific kind of profits which most American companies strive for, the short-term profits that they return to their equity shareholders, make it necessary to pay all workers less than they are worth. And the trend is accelerating. If the same reduction in real income for workers that started during the Reagan administration continues, in 20 years the majority of American workers will be making about ten percent over minimum wage.

This very problem was written about [wikipedia.org] (at great length) by some guy named Karl Marx. Basically, his point was that the capital owners will always pay their employees less than they're worth to the capitalist, because that creates profits.

Basically, his point was that the capital owners will always pay their employees less than they're worth to the capitalist, because that creates profits.

Except that you could also say the capitalist always pays people exactly what they are worth, and increases costs to consumers to create profits.

Both are nonsensical. That's why, in reality, someone decides whether the payment being offered is worth working for the company. Pretty much by definition, you are being paid what you are worth, because only you can really decide that by accepting an offer. If you think you are not "being paid what you are worth," then you need to find someone who will pay you that, or at least leave and not suffer the insult of a continued paycheck.

Except that those with capital are the ones who decide how much you get paid. If you have no capital, you have no opportunity to increase your worth. It's all fine and dandy to say "If you really think you're worth more, go somewhere else." but if the jobs all pay the same amount, you can't go somewhere else. It's also very difficult to start your own business because the established businesses, with their economies of scale, can crush small ones and push them out of the market (see: Wal-mart).

The biggest problem with capitalism is that, for it to truly work, it requires that every party have equal information. This just isn't the case: information requires time to gather, and time costs money. Therefore, those with money get the information and use it to their advantage whenever they can. There's a reason that Harvard grads get the best-paying jobs, and it's not because they are the "best and brightest," as they'd have you think they are.

Don't get me wrong, I'm a capitalist, but to assume that it is a perfect system is silly. To abuse a Winston Churchill quote about democracy, "Capitalism is the worst economic system there is, except all the others."

"Conditions at hand" can be and often are, illegally, deceptively, or destructively manipulated. If you even imagine that I'm kidding, look at the history of child labor, indentured servitude, diamond mining, or the music industry to see how workers have been abused to focus wealth in the power of a select few.

Or like someone who wants to buy a sammich from Subway. Oh you don't give them $20 for a footlong sub? Greedy pig.

Your comment is, sadly, more insightful than someone quoting Karl Marx. Individual customers try to maximize the utility of their dollar by buying the cheapest thing, and as a consequence they lower the value of another person's work. When everyone does this, wages get minimized. Hence outsourcing is popular, because we can usually pay someone else cheaper wages. CEOs who do the min/max-ing get paid a lot, but how much they're paid is probably small compared to how much they save; usually in the short term. Tho

It's not just pockets, it's a mass phenomenon. Simply due to the laws of the market: Everyone is looking out for their own, personal goal. Companies want to produce cheaply to maximize profit. People want to buy cheaply to get the most for their money. This works as long as customer and producer are in the same trade circle. If I produce something, I get paid by whoever buys it, thus enabling me to buy something else and so on.

The system breaks if there is a net cash flow away from a group of people towards

This equilibrium is what they hard-sell to the populations as the justification for their policies, but no way do they ever actually want to achieve it.

The globalist fatcats will *always* make sure there is a great imbalance in wages (a lack of equilibrium) so they can take advantage of wage arbitrage, plus get paid to destroy any approaching equilibrium, then get paid again to start to reconstruct it..but they will never let it actually get there. It isn't nearly as profitable for them after that point of equili

Sorry, no sale. One of the core reasons why these jobs are shipped overseas is that the wages are much lower there. Lower than what would be necessary to purchase the goods produced. It's essentially a repetition of history; the scale is just bigger and more people are involved. In 1930, the core problem was that the factory workers earned too little money to purchase the goods they produced. Today, we have Chinese workers who don't have the money to buy the goods they produce (ok, that wasn't part of the plan anyway; they weren't supposed to buy them), but the people who are supposed to buy them can't either, because they don't earn any money at all. The whole system kinda-sorta worked for as long as people had some savings to fall back on. When the savings were gone, they started refinancing their homes. Now that the real estate bubble has popped big time, they can't do that anymore either. People started cashing in their insurance, and if people do that in droves, the insurance and financing companies start to quiver. And, as we have seen, crumble.

I'm quite sure that one of the core reasons for the mess our economy is currently in can be found in the offshoring of jobs. If you want a market economy to work, people have to buy. People have to have money to buy. People need jobs to have money. Companies have to offer jobs in the country they want to sell in, so people have those jobs, thus money, thus can consume. I'm no BA major, but even I know that much.

We can debate the relative merits of the real value of the minimum wage at another time, but in the interim, in the interests of accuracy, for a slightly less anecdotal analysis of the relative value of the US dollar, see MeasuringWorth.com [measuringworth.com] (which suggests $8.48/hr as an equivalent minimum wage, not $17.50, based on the consumer price index). That's a lot closer to par.

I believe most economists suggest that the CPI slightly-overstates inflation by failing to make any adjustment for increases in product q

Just because the CEO's work results in a larger financial transaction doesn't mean he is more productive; he's just doing his job, same as the coder. If they both come in and work hard for eight hours every day, their productivity is equal. The CEO needs to be paid more because the requirements are higher; the pay seeks to attract the most applicants in an effort to land the most qualified one. Beyond the amount of money needed to effect that attraction, CE

There is a very simple counterexample to this. CEO pay has grown sixfold since 1990 (Forbes [forbes.com]). The economy hasn't. Median salary hasn't. Have CEOs somehow become six times rarer or six times more effective without the economy noticing? The market doesn't drive CEO salary. Productivity doesn't drive CEO salary.

Oh, what the hell. I'll say it. That "good CEO" couldn't do the job without standing on the shoulders of everyone underneath. And emotions are *important*, because otherwise we'd be a bunch of robots (and some CEOs would love that, darling little sociopaths that they are).

Ability to shoulder risk? Stability? How many billions have we had to throw away on bailouts because a bunch of those CEOs turned out to be incapable of giving a damn about the risks - to other people - of destabilising the economy?

Frankly I don't think many here would mind that CEOs can make many times average worker pay if they didn't also see CEOs sailing off in their new yacht/plane/limo while the company retrenches a quarter of its workforce because times are "tough"...

Gross disparity during adversity (whether real or PR snow job) is poisonous to morale - and, for those who insist on "rational analysis", also to productivity.

Finally, I do think there are good CEOs out there. More than the bad. But it doesn't require a lot of bad ones to break the system, and when the system itself rewards sociopathic behaviour, that's not good and does not bode well.

As long as you've got CEOs making 200-400 times the pay of the average worker in the same corporation, it is impossible to have any pay which is "proportional".

This sort of analysis of the business is pretty shallow and based more on emotion and prejudice than on reality and facts. I don't think it's utterly beyond belief that a good CEO can make deals with other bigwigs and boost the company's bottom line at least 200x as much as an average worker can. Furthermore, you seem to be conflating the portion of value created by the corporate structure (and its access to capital, its ability to shoulder risk, its stability in being there for its clients, and its economies of scale) with "exploitation of the worker" (which may still be present, but is probably less than what you're making it out to be).

But then, I suppose I'm wasting my breath: who would ever want to sully political rhetoric with a modicum of rational thought when dealing with a nuanced issue?

This is part of the CEO myth. The fact that the CEO gets paid that much more is part of the myth that CEO talent is rare and hard to find. I have yet to meet a CEO or VP who actually lives up to that claim of talent and ability. There are, of course, some CEOs who do live up to this, but for the most part their claim to success is due much more to being semi-competent at managing and masters of selling their own worth. That 200x multiplier is due to dumb luck and to them taking credit for the sweat of th

And yet a good CEO can execute a strategy that will increase profits hugely.

But guess what? Even the worst CEO, who drives his company into the ground, is making more than 100 times the average worker, not including the fat golden parachute he's going to walk away with when he gets fired.

If someone working the line makes a mistake and gets fired, guess what he walks away with?

I think the problem most people have with CEO pay is that we see them perform their jobs so incompetently, it is painful to know that they get paid so much for such lousy work. I know I could do much better, but I'm not 'qualified' for the work, so I can't get the job. I'd be happy to do the job twice as well for a quarter of the pay.

Another thing is that coders usually aren't that good at expressing themselves, so it may not be obvious who is more productive than others.

Bullshit. Good programmers are great at expressing themselves; that's what programmers DO. That excuse is made by crappy 'programmers' who are really just introverts who aren't actually good at programming, but rather are even worse at dealing with other living creatures.

A programmer's job is to take an idea and express it in a way a computer can understand. All we DO is express ourselves; if you aren't good at expressing yourself, you aren't a good programmer.

A programmer's job is to take an idea and express it in a way a computer can understand. All we DO is express ourselves; if you aren't good at expressing yourself, you aren't a good programmer.

If I understand you right, you consider expressing yourself in a computer-understandable fashion makes you good at expressing yourself. I would like to introduce you to the rest of the world, where being good at expressing yourself is measured by how well other people understand you. People, not computers.

The other item that almost everyone overlooks is that an Uber-coder writes READABLE code. If you look at what a really good programmer writes you will be able to understand what is going on, even 10 or 20 years after it was written. Unfortunately, most people suck...

Another factor is that the manager likely recognizes the uber coders, and any piece that is particularly difficult or important gets assigned to the uber coder. So their productivity may appear to be no better than others because the lead has compensated by giving them the pieces that nobody else can be trusted to do.

One guy has great productivity creating a frequency distribution report. It works, looks good, and everyone is happy. It took him a week to do. The uber coder could have batted that out in an afternoon, but instead spent a week ensuring that the histogrammer behind the report was multi-core aware and could scale to billions of data points without dragging the system to its knees. The fact that the report programmer would have floundered at that task for weeks is not going to be apparent to most people, even many other people on the team. So the uber-ness of the uber programmer is hidden by the work they are assigned.
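For a flavor of what "multi-core aware" means here, a rough sketch of that kind of histogrammer: split the data into chunks, count each chunk in parallel, then merge the partial counts. The names and chunking policy are invented for illustration, and a real version handling billions of points would use worker processes or numpy rather than CPython threads (which share one interpreter lock):

```python
# Sketch of a parallel histogram: per-chunk counting, then a merge step.

from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def histogram(data, workers=4):
    """Count value frequencies by splitting the work across a pool."""
    chunk = max(1, len(data) // workers)
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(Counter, parts)   # one Counter per chunk
    total = Counter()
    for p in partials:
        total.update(p)                       # merge partial counts
    return total

counts = histogram([1, 2, 2, 3, 3, 3] * 1000)
assert counts[3] == 3000
```

The merge step is what makes this scale: each worker touches only its own chunk, so there's no shared mutable state to lock.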

As I pointed out previously, incompetent programmers require more servers. Their code spends more time not running, requires a larger support infrastructure to deal with the problems created and generally reduces profits all round.

These days it's difficult to point at a specific individual, but teams are easy. You can see which teams are a group of competent engineers and which are just a clusterfuck[1] of developers.

To be accurate, you have increased profits by $11.80/hour. How much this increases profits in relative terms is impossible to determine without knowing what the company is making in total. The 6000% figure only applies if the company was so far making an hourly profit of around $0.20, in which case it's amazing they are still able to pay you. ;-)
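Working that arithmetic backwards, just to show where the roughly-$0.20 figure comes from:

```python
# A gain of $11.80/hour is a 6000% increase only if the baseline
# hourly profit was about twenty cents: baseline = gain / 60.
gain = 11.80
pct_increase = 6000                      # percent
baseline = gain / (pct_increase / 100)   # roughly $0.20/hour
assert round(baseline, 2) == 0.20
```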

Some of my most "productive" days have resulted in a net deletion of many hundreds of lines of code. Mostly this is cleaning horrendous cut & paste jobs, and refactoring APIs to dump buggy, unnecessary functionality. That one day of effort probably saves weeks of bug-hunting and spaghetti-unwinding further down the road. It would appear to be negatively productive by any naive metric.

I'd argue coder pay should be proportional to productivity. It's just that there are no shortcuts to measuring a coder's productivity.

I knew a couple of folks in my small development shop (~20 people) who were always being rewarded because the informal metric was lines of output. I had to take over for one of the top performers after she left for vacation. Looking through her code, I discovered that it was merely average, much like mine. I asked another top performer if I could look through his code because I wanted to better understand his interface. His was also mediocre code, with roughly the same ratio of lines to output as mine.

When the other top performer came back from vacation, I took the two of them into the break room and asked them why they were getting undue credit based on the "lines of output" metric. They both chuckled and gave each other knowing glances before one of them said, "No, silly, it's how many lines of cocaine we bust out to the boss...see?" The woman pulled out a small bag of whitish powder, a razor blade, and a scratched-up mirror tile. The guy rolled up a 20 dollar bill, tight as a drum, and passed it to me. "Go! Go! Go!", they whispered as I bent down with the tooter in my nostril, snorting 3 medium-sized lines of sweet Colombian. I felt a strong euphoria, like 1,000 cups of coffee, overwhelm my body. The guy giggled sheepishly in a high-pitched voice as he went back to work. The woman who was still with me chopped up 3 more gaggers and snorted them up before we fucked madly in the utility closet like wild beasts during the rut. Oh, what a day that was!

"Measuring programming progress by lines of code is like measuring aircraft building progress by weight" -- Bill Gates

The problem is that design progress of almost anything is very difficult to measure, and when multiple people work on the same project it's almost impossible to sort out who did what. Sales does not have that problem, because progress is not measured; results, however, can be measured before the paycheck is cut, and everyone is responsible for themselves only. De

I have, on rare occasions, been Amazingly Productive. There are very narrowly-defined kinds of work where I am super fast. One of them is debugging. So, when we were doing our "no new features, clear out every P1 and P2 bug in this branch" run, I was awesome -- I regularly fixed many more bugs than anyone else. On the other hand... A lot of the time, I'm not much good. If I have a bad-ADHD week, I can have an entire day go by where I simply never quite get around to doing anything but mostly keeping up on my inbox.

So am I super productive, or not very productive, or what? I don't know. Realistically, the answer is probably "if you give me the sorts of work I'm good at, I'm great, otherwise I'm sorta mediocre." But I'm not sure how you'd measure that.

There's also a much more basic failure-to-apply-economics in the article. The value of something which does 10x as much is not necessarily exactly 10x. Is a monitor with 3x as many pixels worth exactly 3x as much? No. Is a video card which can render exactly 2x as many polygons worth exactly 2x as much? No. On the high end, you might see people paying 2x as much for 20% more polygons. On the low end, you might see people paying 20% more for 5x more polygons. Or there might be other factors; you might care about power consumption, or form factor, or...

I just bought a new Eee. It's SLOWER than the previous one I was using. I paid about the same amount for it, several months later. But it has a higher resolution display, and better battery life... So is it worth the same amount? I have no clue.

Long story short: The marginal value of the "more productive programmer" is not necessarily linear with productivity. Add in other complexities (plays-well-with-others, can do trade shows, reliable about giving feedback on progress) and general market forces, and I don't think it's just a question of measurement; I think it's largely that, in general, programmers are willing to work for comparable amounts of money, and the marginal benefits aren't as large as you might think they would be if you looked only at some measure of productivity. Even if it were a very good measure.

> So am I super productive, or not very productive, or what? I don't know. Realistically, the answer is probably "if you give me the sorts of work I'm good at, I'm great, otherwise I'm sorta mediocre." But I'm not sure how you'd measure that.

As much as programmers often hate them, this is where good management comes in. A manager who knows when to apply you to the project and where to put you on a team is going to get the most out of your abilities, and you will both benefit.

Programmer "A" is an expert with a strong opinion that approach "Y" is the best approach, and it is a solid approach. Programmer "B" is an expert with a strong opinion that approach "P" is the best approach, and it is a solid approach. Programmer "C" is an expert with a strong opinion that approach "3" is the best approach, and it is a solid approach.

I've seen A,B, and C get into very loud, very heated arguments over this (I've been programmer A at times when I thought the "solid" approach was missing something that I saw intuitively which they wouldn't accept until I proved it to them laboriously).

Programming is not plumbing. The goal posts are subject to change.

What is efficiency?

Delivering a 100% perfect product 3 months late? Delivering a 99% perfect product 1 week early? Delivering a 100% perfect product 3 weeks early, only to have them change the scope and (as one manager said to me) insist "this isn't scope creep"? (I turned to my programmer and asked, "Can you deliver this change by the previous deadline?" She said no. I asked, "What date can you deliver it by?" She said five days later. I turned back to the sheepishly smiling manager and asked, "Is that date acceptable?" I mention this because it's a great negotiating technique: you avoid delivering the product later than the promised deadline without being an ass and refusing changes.)

I've known "great" programmers who were great only as long as they were the sole programmer in the company, because they relied on operating system cheats that worked only as long as nobody else used them too.

A lot of great programmers fail to understand the business side of things.

And you can never control being put on a crappy project with a bad deadline and a bad manager.

---

However, fundamentally, the compensation isn't there because there are too many people willing to do the work. I no longer recommend that people who ask me enter the IT field at all. Its pay is not sufficient to cover the low status, increasing lack of freedom, required holiday work, and offshoring risk.

I've always been a firm believer in the 80-20 rule. (Keep in mind it's a rule of thumb, so it wavers a bit.) You can achieve 80% of a program's functionality with 20% of the effort. That leaves 20% of the bugs, which is a lot, but from the business side you've spent only 20% of what it would take to perfect it. Most people agree that's a decent trade-off, and that's where you should set the first goal post. Once you reach it, something else may have come up; perhaps clients have requested new features. Bam, another 80-20 you can fire off. If there isn't anything else to add, work on reducing those bugs.
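The repeated 80-20 cycle above can be sketched with stylized arithmetic. This is an illustration of the rule of thumb, not an empirical law: assume each pass closes 80% of whatever functionality gap (or bug count) remains.

```python
# Stylized illustration of firing off repeated 80-20 passes:
# each pass closes 80% of the remaining gap. Illustrative numbers only.
def coverage_after(passes):
    gap = 1.0                # fraction of functionality still missing
    for _ in range(passes):
        gap *= 0.2           # each pass leaves 20% of the remaining gap
    return 1.0 - gap

for n in range(1, 4):
    print(f"after pass {n}: {coverage_after(n):.1%} of functionality")
# after pass 1: 80.0%, pass 2: 96.0%, pass 3: 99.2%
```

The climb from 96% to 99.2% costs a whole pass for a few points of gain, which is why the "perfected" tail is rarely worth chasing until the feature requests dry up.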

> And if a bricklayer were 10x more productive than his peers this would be obvious too

And he'd end up getting shoved off the top of a building by the bricklayers that he made look bad.

Many years ago, I had the opportunity to assist on a s/w project to replace a (broken) legacy system. It had been identified by the FAA as not providing control over engineering data sufficient to maintain our production certification. And over the years it had cost the company about $250 million to build and maintain. So we (myself and five other developers) built a new system over the course of about 6 months. It was blessed by the FAA and manufacturing loved it (it actually worked). After it was all done, my team got....

...laid off.

Aside from actual coding shops, where the s/w IS your company's product, the whole free market capitalist model breaks down. The further you are from the finished product, the more the corporation resembles a socialist economy, where headcount matters more than productivity. And much, if not most, software is produced in this setting. MS Word may sell millions of copies, but there are more lines of code (or kilobytes of executable) developed internally. My boss only had 5 people under him; he was a first-level manager. The legacy system employed over 100, making its manager a unit chief over several layers of PHBs. Guess who had the political power in that organization.

If you've ever dealt with a private bureaucracy, you know that they can be just as bad as government. The problem is more that the organizations don't scale. Also, the tendency for all these corps to behave in a similar way dulls the effect of competition.

As individuals we don't have much power, but we can start by patronizing small businesses even if it costs more. Think of the added cost as a tax paid to a shadow government, the true government of the people, the one that fights the big corporations instead of working for them.

No, this is not communism. Communism is dead. It's a 19th-century idea born out of the first wave of industrialization. We need 21st-century ideas, so forget the traditional worker-vs-capitalist tension. Please, please forget it. Let's not relive that.

You've got a point about big corporations. But some of the worst office politics I've seen has been in very small, privately owned businesses. You get the same empire-building, favoritism and cronyism as in big bureaucracies, plus blatant corruption and things being run on the side to line individual managers' pockets. And compared to larger companies, there aren't nearly as many options for trying to go over or around a troublesome individual.

"If you've ever dealt with a private bureaucracy, you know that they can be just as bad as government. "

Actually, they're usually far worse. Government bureaucracy can be figured out and has consistent rules within its specific framework. And you can go over the head of the bureaucrat to your elected officials.

You are thinking about how the corporation faces the outside world. I'm thinking about the internal organization. Dealing with the government or laws (within reason) isn't an issue.

I've seen departments within a company misappropriate millions of dollars of their budget. Was this fraud? Theft? Nope. Not as seen from the outside world (law enforcement and the court system). It was all internal funds, spent (as far as the legal system is concerned) on legitimate company programs. OK, so they were the wrong programs.

When I have a task, I find myself "procrastinating" for days on end, unable to commit myself directly to writing the code. During that period, the task regularly comes to mind in sudden, odd places, doing odd things: in the WC taking a dump, trying to go to sleep, going to the grocer's, and so on. Then, after a few days, I suddenly sit down and swiftly complete the task. It seems like I'm hatching things, dealing with the problem in my subconscious before doing it.

The good side: it works, and well. The bad side: I feel like I'm procrastinating and being irresponsible during the hatching period, and it's annoying.

I am glad that there are other people who have the same symptoms as I do when it comes to programming. Last Thursday I wasted 6 hours doing nothing at my job. Friday it got better, and Monday I was back up to speed. I have the same behaviour on my own hobby projects; yes, it really feels as if you are hatching something.

My dad was a programmer for the Star Tribune, back in the seventies and eighties.

Two things he said stick in my mind.

1. He had his own office, and sometimes he'd put up his feet and stare off into space. He told me that people passing by his office assumed that he was "doing nothing." But, he told me, he wasn't doing "nothing", he was very much doing something: thinking.

2. When he got, say, a directive from On High that he must "write a new program for the secretaries", the first thing he did was go and sit down with the secretaries, ask them about their work, and stick around for a while to actually watch them work. He called this the "going native" phase (he took his degree in anthropology). If he'd started coding on the basis of the directive from On High, the end result would be something the secretaries didn't need and wouldn't use.

> 1. He had his own office, and sometimes he'd put up his feet and stare off into space. He told me that people passing by his office assumed that he was "doing nothing." But, he told me, he wasn't doing "nothing", he was very much doing something: thinking.

I'll go even further. I have the privilege of working from home / running my own outfit.

I frequently simply go to sleep if I feel like it. For a while I felt guilty about this, but the reality is that I usually only doze for 10 minutes or so and when I wake up I have 5 solutions sitting in my head for what I need to do next. I'm not sure how or why it works, but I can struggle through a whole afternoon feeling sleepy and doing mediocre work or I can take a 10 minute nap and be a rock star for an hour... so I do. I wish this was accepted practice in workplaces because I'm sure productivity would rise overall.

The Austrian School of economics, in determining the value of products, actually discounts the idea that the value of the end product is somehow connected to the labor expended in producing it. There are many examples of this in tangible products. In the art market, for instance, a painter may be unable to sell a painting at all, or only for a few dollars, before earning fame; after the painter earns fame (and is probably dead), that same painting may be worth tens of thousands of dollars. The labor that went into the product didn't change; it's still the same product. But the value of that product to society increased through unmeasurable and intangible factors.

The same amount of code and development time may have gone into a $20 shareware game and a $500 business app. Assuming both sell equal copies, which has more value? Which was the more "productive"? By lines of code and development time alone their value should be equal, but that's not the case. True, the idea behind each of those apps contributed to the overall value differently, but even then the ideas may have taken the same "labor" to develop while producing uneven value.

I've managed development teams myself. Over time I've learned how long certain types of features take to develop and how well they should work in that given period of time: a sort of baseline. If a developer provides the product in less than that time with the same quality, that developer is clearly more productive than one who fails to meet the baseline. This could be formalized to a degree, but it would still rest on subjective standards of quality and estimates of effort. I agree with the premise of the posting, however: you cannot judge productivity by scientifically measured quantities like lines of code or number of bugs. Coding is too creative an endeavor for that, and it starts to look like judging value in the way the Austrians rejected long ago.

Most of those people on the above list were just programmers starting out without much more than a computer and an idea. Most of them went on to be billionaires. Below them are another tier of thousands of nameless programmers who are millionaires, and below them are millions who form the backbone of their departments.

It's pretty much, you get paid great not to just code, but more importantly, to have great ideas and code them.

The best coders I have seen wrote amazingly little code. I am not talking about crazy pointer arithmetic, just way less code than lesser programmers. Often the best programmers also deployed the available resources far better. When all is said and done, the best programmers leave code that everyone worships as pure genius and builds on with ease.

A great example was someone who took the bull by the horns and moved the project into proper multithreading and some clever memory usage. The server went from using maybe 10 megs per process to a collective 8 gigs spread across many threads. Sounds complex, but every programmer took one look at the code and went wow. 20 of 23 previously heavily loaded servers were shut down as unneeded. Even with 50% client growth every year, our next server purchase will probably be in a decade. That super programmer moved on, and we just kept building on his code for a long time. Programming and debugging went from a chore to a joy. Anyone could tell which code was new because it was ugly and complex compared to the simple elegance of the original. Without a doubt that programmer could replace the 50 pretty good programmers we have on staff now. Plus his code eliminated 3 full-time system admins and has resulted in zero downtime in two years, avoiding millions in losses over the last and next few years. So what should his pay have been? $5 million a year?

On a different topic, in my travels I have seen sysadmins who ran well-oiled machines that were amazing. At the same time I have seen sysadmins who weren't properly backing up critical data; critical as in the company would go bust in the event of an HD failure. In these same companies, the HR people, CFOs, and salespeople were paid multiples of what the admin made. These "senior" managers, whose screwups would be hard pressed to completely wreck the company, usually saw the various computer people as a bit of a joke.

It doesn't matter if you don't get paid what you're worth, as long as you aren't going to be paid better anywhere else. Because what are your options? Quit and get a different job that sees even less of your value? Go independent and try to bill rates that high? Join a start-up and try to get that much of the total? Quit or take a long vacation and pray they'll miss you enough to take you back at a higher salary? Yeah, right.

A lot of people might know internally what you did, but it's hard to convince outsiders that the projects you did really were that hard, that you were that crucial to the solution, and that your solution was that great. Maybe your boss even knows you're brilliant and would rather fire the whole team and hand the money to you if that's what it took to make you stay, but it will never come to that. Because who else would pay you that much money? Nobody. I guess if you have some entrepreneurial skills and build the company around yourself it might happen, but that takes a very special kind of person, which rarely overlaps with mastering coding.

Joel is always looking down his nose at other coders who don't have degrees from MIT. Yet he thinks pointers are the ultimate test of a programmer. He has written one tool that is of note - Fogbugz. That is, if he even wrote the code.

He just reeks of "I know better". He wrote his own language to code-gen classic ASP applications, along with PHP. Right there is a red flag. Did they move to the new ASP.net platform? Nope. That wasn't good enough I guess. No they decided to stick with classic ASP and write a language that outputs both ASP and PHP. Epic arrogance combined with ignorance IMHO.

Then look at Fogbugz. It's just a typical bug tracking application. That's it. Did it need a new language? Hardly. So now these guys wasted all that time on something only they can use and it makes zero dollars. Way to go. Real top notch development there. Fact is his company is small potatoes.

Why do I rant on Joel? Because this guy is believing the shit he spouts and extrapolating from it. Frankly I'm sick of hearing from him about what makes a good programmer. If you aren't a good programmer yourself then STFU about what makes a good programmer. Writing a few insignificant applications doesn't make you a rock star.

He also wrote an article about how exceptions are pointless and a waste of time, and how we should track "ErrorNumbers" ourselves manually.

He completely ignored the fact that exceptions were developed to solve the problem of working out where in the stack the error happened, and when people pointed out how ridiculous his solution was, he refused to change his mind. So screw it.
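For what it's worth, the difference the thread is arguing about is easy to demonstrate. Here is a minimal Python sketch (the function names and the error number 22 are hypothetical, invented for illustration) contrasting manual error numbers, where every layer must check and forward a status code, with exceptions, which unwind the stack on their own and record where the failure happened:

```python
# Style 1: manual error numbers. Every caller must check and forward the code,
# and the number alone says nothing about where in the stack it originated.
def read_config_errno(path):
    if not path.endswith(".cfg"):
        return None, 22          # caller must know 22 means "bad argument"
    return {"path": path}, 0

def start_app_errno(path):
    cfg, err = read_config_errno(path)
    if err:                      # this boilerplate repeats at every level
        return err
    return 0

# Style 2: exceptions. The failure propagates up the stack on its own,
# carrying a traceback that records exactly where it happened.
def read_config(path):
    if not path.endswith(".cfg"):
        raise ValueError(f"not a config file: {path}")
    return {"path": path}

def start_app(path):
    return read_config(path)     # no per-level error plumbing needed
```

With a deep call chain, style 1 forces the `if err: return err` dance into every intermediate function, while style 2 needs a single try/except at whatever level actually knows how to handle the failure.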

This topic is terrifying!
Productivity only makes sense when you have a static goal, which is not the case in any working environment I've encountered. Instead, I've found that I'm paid for tolerance. When a manager asks me to deliver X, but a marketer suddenly promises Y, I get paid for not killing both of them. When my manager asks me to make 1 + 1 = 3, and a marketer promises a client that 1 + 1 = 6.255, I get paid for not going on a murderous rampage.
Seriously - if it weren't for these wages - programmers would have a worse reputation than postal workers. We get paid to be driven crazy.

The worst programmers I've met are the ones who are heads down and program. They are usually very arrogant and think they are gods. Case in point, there's a guy I currently work with who is a disaster. People are in awe of him because he will work until 4am and has improved the performance of our application 100-fold.

The problem is that during the design phase, he completely disregarded all of our design recommendations and did things his way. It turned into a complete disaster: nothing working as it should, deadlocks, a complete lack of scalability, etc. So yes, he worked until 4am to improve things, and he did improve the performance from the initial disastrous numbers, but it was all his own fault! As well, because he was so arrogant and stubborn, he ended up producing something that no one wants anymore because the interface is too abstract and hard to use. Now the product is being shut down before it has even launched, because we couldn't convince any consumers to "wait until the next release" to get it to do what they actually want. All the fellow programmers think he's an asshole, but all of the managers who don't understand what he does will undoubtedly promote him.

The best programmers are the ones who keep it simple, design things excellently, and program it once, with maybe a couple of iterations of performance enhancement. I've met plenty of brilliant programmers in my time, and these are the key traits they exhibit. The "brilliant", nerdy programmers who program heads-down are rarely any better than smart, easy-going programmers who work hard and spend more time listening to their customers and making common-sense design decisions.

It seems to me that it's probably true that it'd be very hard to come up with good metrics for a programmer, but I think people should be more careful about metrics in general.

Sure, you can measure a bricklayer by how many bricks he can lay in an hour, but is that really how you want to measure him? What about quality? Doesn't it matter if the resulting wall looks good? Doesn't it matter whether the resulting wall will hold together under stress?

But now even those are pretty simple things. Let's get a little more complicated. You're a contractor and you hire 6 bricklayers. One guy doesn't seem to work as quickly as the rest, and they all give you comparable results. You fire the slow guy and suddenly all the other guys slow down. Quality drops. The client is less happy. What happened?

Maybe if you look into the situation, you find that the slow guy was slow because he was spending some of his time communicating with the client. He was spending part of his time overseeing the other bricklayers, keeping them on task, and keeping them from being too sloppy with their work. He's been serving a vital role in your team, but you don't see that just by measuring a couple simple metrics.

Like all statistics, productivity metrics can be useful, but they can also be misleading. You should make sure you really know what they mean before you make too many judgements on them. In evaluating your employees, it's better if you actually know your employees and have a sense for who they are, how they work, and how they fit together as a team. The value of a person just can't be represented in a couple of numbers.

All great points. There's also one possible effect that is even harder to measure (perhaps impossible) and that's morale. You can watch the slower worker all day and not realize that he's the one that's keeping all the other faster guys happy and doing good work at a good pace.

Companies in my part of the States make it a point of pride to drive you to make work your life, where managers who work 50 hrs a week make their staff feel guilty even when they work 80 hrs near crunchtime, and the 40-hr workweek is an illusion even at the beginning of a project. And vacation? Heh. 10 days, maybe?

American workers aren't that much more productive than European ones, and I can see why. Just the typical failed "more pain, more gain" mentality.

And what, pray tell, does your uber-programmer do when he's *not* writing out algorithms "as quickly as his fingers can move"?
Or are you suggesting that this tremendous programmer you describe has a near infinite workload where he's constantly typing out new and revised algorithms?

Given two bugs, one can be as simple as adding a forgotten quote somewhere, while the other can amount to weeks of digging through the lowest levels of some code base.

There's also a third category, my favorite: weeks of digging through the lowest levels of some (old, undocumented, messy) codebase, which is ultimately followed by a fix that adds a forgotten quote. How do you even quantify that kind of thing?