The cost of employing a programmer in my area is $50 to $100 an hour. A top-end machine costs only $3,000, so buying a truly great computer every three years comes to $0.50/hour ($3,000 / (150 weeks × 40 hours)).

Do you need a top-end machine? No; the $3,000 here represents the most that could possibly be spent, not the amount that I would expect. That's roughly the cost of a top-end iMac or MacBook (17-inch).

So suppose you can save $2,000 every three years by buying cheaper computers, and your average developer costs $60 an hour. (These are the most charitable numbers that I can offer the bean-counters. If you only save $1,000, or $750, it only strengthens my case.) If those cheaper computers cost you just 10 minutes of productivity a day (not at all a stretch; I'm sure that my machine costs me more than that), then over 3 years the 125 lost hours add up to a loss of $7,500. Even a loss of 1 minute a day ($750) would leave a net gain of only $1,250, which would hardly offset the cost of poor morale.
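The arithmetic above can be checked with a quick back-of-envelope script. All the constants below are the question's own hypothetical assumptions, not measured values:

```python
# Back-of-envelope model of the question's numbers (all assumptions, not data).
HOURLY_COST = 60        # fully loaded developer cost, $/hour
SAVINGS = 2000          # saved over 3 years by buying the cheaper machine
WORKING_DAYS = 150 * 5  # ~3 years at 5 days/week

def lost_value(minutes_per_day):
    """Dollar value of productivity lost over the 3-year life of the machine."""
    lost_hours = minutes_per_day * WORKING_DAYS / 60
    return lost_hours * HOURLY_COST

# Hardware cost per working hour: $3,000 over 150 weeks * 40 hours
print(3000 / (150 * 40))        # 0.5
# 10 lost minutes/day -> 125 hours -> $7,500, dwarfing the $2,000 saved
print(lost_value(10))           # 7500.0
# Even 1 lost minute/day leaves a net gain of only $1,250
print(SAVINGS - lost_value(1))  # 1250.0
```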

Is this a case of "penny-wise and pound-foolish" or have I oversimplified the question? Why isn't there universal agreement (even in the 'enterprise') that software developers should have great hardware?

Edit: I should clarify that I'm not talking about a desire for screaming-fast performance that would make my friends envious, or an SSD. I'm talking about machines with too little RAM to handle their regular workload, which leads to freezing, rebooting, and (no exaggeration) approximately 20 minutes to boot and open the typical applications on a normal Monday. (I don't shut down except for weekends.)

I'm actually slated to get a new machine soon, and it will improve things somewhat. (I'll be going from 2GB to 3GB RAM, here in 2011.) But since the new machine is mediocre by current standards, it is reasonable to expect that it will also be unacceptable before its retirement date.

Wait! Before you answer or comment:

$3000 doesn't matter. If the machine you want costs less than that, that's all the more reason that it should have been purchased.

I'm not asking for more frequent upgrades. Just better hardware on the same schedule. So there is no hidden cost of installation, etc.

Please don't discuss the difference between bleeding edge hardware and very good hardware. I'm lobbying for very good hardware, as in a machine that is, at worst, one of the best machines made three years ago.

$50 - $100 / hour is an estimate of employment cost, not salary. If you work as a contractor, it would be the billing rate the contracting agency uses, which includes their expenses and profit, the employer's Social Security contribution, the employer's health care contribution, etc. Please don't comment on this number unless you know it to be unrealistic.

Make sure you are providing new content. Read all answers before providing another one.

This question exists because it has historical significance, but it is not considered a good, on-topic question for this site, so please do not use it as evidence that you can ask similar questions here. This question and its answers are frozen and cannot be changed. More info: help center.


Maybe they do, but not as often as you'd like? Any workstation you buy will only be "the best" for 6 months, at best. Usually a better model comes out the next quarter. To always have the best, you'd have to upgrade every 3-5 months. That's hard to maintain.
– FrustratedWithFormsDesigner, Jul 18 '11 at 20:01


There's a human factor, too. Buy a fast machine and gain all of that productivity, then spend 10 minutes per day at the water cooler and lose it all and then some. The boss sees both sides, so the pure productivity argument loses some weight.
– JeffK, Jul 18 '11 at 21:39


I definitely know I could use a little more punch in my machine. Not so much CPU power but RAM. Between running multiple instances of an IDE, browsers, and misc other programs another 4GB and a second monitor wouldn't hurt...
– Rig, Jul 19 '11 at 0:40


A developer without an SSD is a sad sight indeed...
– ShaneC, Jul 19 '11 at 2:25


We spend 4-5k on average for a dev setup here at SE ...
– Zypher, Jul 19 '11 at 18:26

39 Answers

Seriously. If you asked 10,000 tech managers, "Let's say you paid Danica Patrick $100,000,000. Do you think she could win the Indianapolis 500 by riding a bicycle?", I'm sure not one of them would say, "Yes."

And yet a good percentage of these same managers seem to think that highly-paid software developers ought to be just as productive with crappy tools and working conditions as they are with good ones - because, of course, those lazy, feckless programmers are getting paid lots of money and ought to be able to pedal that bicycle faster.

Now, what exactly good tools and working conditions consist of depends on the job to be done. People who code the Linux kernel need different kinds of hardware than web site designers. But if the company can afford it, it's crazy not to get people what they need to be as productive as possible.

One company I worked for had a 9 GB source code base, primarily in C, and the thing we most needed was fast builds. Unfortunately, we were mostly working with hardware that had been mediocre five years before, so people were understandably reluctant to build much other than what they were working on at the moment, and that took its toll via low productivity, quality problems, and broken builds. The company had money to upgrade the hardware, but was strangely stingy about it. They went out of business last summer after blowing through over $100 million, because their two biggest clients dropped them after repeatedly missed deadlines. We were asked one time to suggest ways to improve productivity; I presented the same kind of cost-benefit analysis the OP did. It was rejected because management said, "This must be wrong - we can't possibly be that stupid," but the numbers didn't lie.

Another company I worked for had fine computers for the programmers, but insisted everybody work at little tiny desks in a big crowded bullpen with no partitions. That was a problem because a lot of us were working with delicate prototype hardware. There was little room to put it on our desks, and people would walk by, brush it, and knock it on the floor. They also blew through $47 million in VC money and had nothing to show for it.

I'm not saying bad tools and working conditions alone killed those companies. But I am saying paying somebody a lot of money and then expecting them to be productive with bad tools and working conditions is a "canary in the coal mine" for a basically irrational approach to business that's likely to end in tears.

In my experience, the single biggest productivity killer for programmers is getting distracted. For people like me who work mainly with compiled languages, a huge temptation for that is slow builds.

When I hit the "build and run" button, if I know I'll be testing in five seconds, I can zone out. If I know it will be five minutes, I can set myself a timer and do something else, and when the timer goes off I can start testing.

But somewhere in the middle is the evil ditch of boredom-leading-to-time-wasting-activities, like reading blogs and P.SE. At the rates I charge as a consultant, it's worth it for me to throw money at hardware with prodigious specs to keep me out of that ditch. And I daresay it would be worth it for a lot of companies, too. It's just human nature, and I find it much more useful to accept and adapt to normal weaknesses common to all primates than to expect superhuman self-control.

+1 for mentioning the zone. I once worked for a company where it was common for developers to also do direct customer support. Now, even if you are writing highly maintainable and really good code, there are sometimes moments where you are juggling five or six packages of information in your brain, and you must put those down again atomically. If a call comes in at such a moment, 3 hours before heading home, it can really destroy the rest of your day. Not specifically because of the guy on the other side, but because of the state destruction. ...
– phresnel, Jul 19 '11 at 11:55


But the managers don't think of you as Danica Patrick, they think of you as the UPS delivery guy, and why do you need a new truck when the 5-year-old truck runs just fine?
– Mark Ransom, Jul 19 '11 at 15:36

@Mark Ransom: All too true - and it's worse, because we're salaried. UPS drivers get paid extra for working overtime. Lots of them love the holidays: exhaustion, but paycheck happy-time! But programmers' overtime is free for their employers. If tech companies had to pay programmers time-and-a-half for work beyond forty hours in a week, we'd all have screamin' machines and interns to bring us coffee in our cubes.
– Bob Murphy, Jul 19 '11 at 16:07


@Bob Murphy "But programmers' overtime is free for their employers." This is only true if you aren't willing to draw lines, and only if you aren't willing to demand a salary commensurate with what you bring to the table.
– PeterAllenWebb, Jul 20 '11 at 16:02

I would suggest that, in reality, one cost is visible and quantifiable, while the other cost is neither.

If failing to upgrade the hardware bleeds even as much as $1000 per developer per week from the budget, no one outside (read: above) the tech department ever sees that. Work still gets done, just at a slower rate. Even in the tech department, calculating that figure is based on numerous unprovable assumptions.

But if a development manager asks for $3000 per developer, particularly in a company with 50+ developers, then this takes a lot of justification. How does he do that?
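As an illustrative sketch of that asymmetry - all numbers here are the hypothetical figures from this answer and the question (a 50-developer shop, a $1,000-per-developer-per-week bleed), not data:

```python
# Visible vs. invisible costs, with illustrative numbers only.
DEVELOPERS = 50
UPGRADE_COST = 3000       # per developer: visible, a single line item
BLEED_PER_WEEK = 1000     # per developer: invisible, never on any report
WORK_WEEKS_PER_YEAR = 48

visible_ask = DEVELOPERS * UPGRADE_COST
invisible_annual_loss = DEVELOPERS * BLEED_PER_WEEK * WORK_WEEKS_PER_YEAR

print(visible_ask)            # 150000 -- the number management sees
print(invisible_annual_loss)  # 2400000 -- the number nobody sees
```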

If the manager has to ask for $3000 per developer, yes, that's painful. However, if he can ask for $83 per developer per month, that might be more palatable.
– regularfry, Jul 19 '11 at 9:30


I think it is the manager's responsibility to justify the cost of adequate machines for his/her team. In the past I have found it useful to categorize computers according to roles. Computers used by developers and designers get classed as "for content creation". You just list the invariably beefy application requirements for your shop's IDE along with some overhead and make a short list of acceptable machines from HP, Lenovo, etc. If this is not accepted and the team ends up with ridiculously under-performing hardware, the manager should really shoulder the blame for failing to justify better machines.
– Angelo, Jul 19 '11 at 12:18


If the manager staggers his/her requests to stretch them over three years (not every developer needs a new machine at the same time), that works out to about 17 machines per year (50 / 3), or roughly two per month (17 / 12 ≈ 1.6 - two each month for the first quarter and one each month thereafter). At 2 × $3,000 = $6,000, asking for at most two computers per month is a much more attainable goal than asking for 50 × $3,000 = $150,000 at once.
– Michael Eakins, Jul 19 '11 at 12:58
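The staggering arithmetic in the comment above, spelled out with the same hypothetical 50-developer, $3,000-per-machine numbers:

```python
import math

DEVELOPERS = 50
REFRESH_YEARS = 3
COST_PER_MACHINE = 3000

machines_per_year = math.ceil(DEVELOPERS / REFRESH_YEARS)  # 17
machines_per_month = math.ceil(machines_per_year / 12)     # 2 (early months)

print(machines_per_month * COST_PER_MACHINE)  # 6000: the monthly ask
print(DEVELOPERS * COST_PER_MACHINE)          # 150000: the lump-sum ask
```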


Many megacorps are such that dev time is wasted for much sillier reasons (like poor allocation of workload) -- so this is not a surprise to me at all.
– singpolyma, Jul 19 '11 at 14:34

I will put my 2 cents in here from the employer's side ... who is also a developer.

I agree that low end machines are useless but top end machines are overkill.

There are a number of reasons why you don't get the top end machines:

Cashflow is a real issue, not just a theory. You might be getting paid $60K-$80K per year, but this month we have a total amount in the bank which has to be split amongst every competing thing in that month.

There is a sliding scale of price and benefit. Low-end machines are on the whole pretty useless ... if you're getting a Celeron or low-power chip then whinge away ... mid-range machines have good overall performance; once you get into the top end you are starting to tune for a given purpose (CAD, gaming, video encoding, etc.) ... and the tuning costs extra.

Common parts are generally cheaper; replacements, warranties and insurance all play a part in the overall running costs, as does the downtime while you source a replacement.

Top-end machines depreciate faster than ones a third of the price.

If you're doing high end graphics programming or CAD work then the extra grunt is valid; if you're just writing standard business software, running visual studio or eclipse and surfing Stackoverflow for answers then the extra power is cool bragging rights, but realistically a mid range machine will not max out the CPU or memory in a standard box today.

Mid-range machines built today hammer, and in 2 years' time they will be twice as fast (well, kind of). Seriously, they are lightning quick.

At the end of the day, most of what you do is type raw text into text files and send it to the compiler ... that bit really hasn't changed since vi in the 1970s, and the low-end machines today are a million times faster than the ones back then ... your pace of coding really isn't that different.

So to summarize: you should have good gear and good tooling, and it makes a big difference, but top-end machines are not really justifiable for the "general developer".

... Ah, and now I've read your edit, and that is what you're talking about. I'll leave the above since I've written it now ... Yeah, your machine is under-specced for the tooling.

To clarify, a mid-range machine should have:

2 cores minimum, 4 cores is good; any more at this stage is overkill.

4 GB is the minimum, 8 GB is good, and any more is nice to have.

An SSD should be standard, but really a 10K RPM WD or Seagate 80-100 GB drive should do fine.

My machine fails all 4 of your bullet points - I had to beg to go from 512 MB to 1 GB of RAM, for example. We are not all just whining about not having the latest Alienware setup with cool LEDs and diamond plate.
– Peter Recore, Jul 19 '11 at 4:11


"your pace of coding really isn't that different", that might well be true (if we ignore today's tools being huge resource hogs compared to back then), but I think it's fairly safe to say that what most developers gripe about isn't pace of coding, but turnaround time: how long does it take to make a change and see the effects of it in the running application? If the turnaround time from hitting run to seeing the change in action is 10-15 seconds, that's a completely different beast than, say, 5-10 minutes. Yet the amount of time spent coding can be essentially the same.
– Michael Kjörling, Jul 19 '11 at 7:43
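The turnaround numbers in the comment above translate directly into lost time. As a sketch - the builds-per-day figure here is an assumption for illustration, and varies wildly by developer and project:

```python
BUILDS_PER_DAY = 40  # assumed for illustration

def daily_wait_minutes(turnaround_seconds):
    """Minutes per day spent waiting on the edit-compile-run cycle."""
    return BUILDS_PER_DAY * turnaround_seconds / 60

print(daily_wait_minutes(15))   # 10.0 -- a fast machine: barely noticeable
print(daily_wait_minutes(300))  # 200.0 -- a slow one: over 3 hours a day
```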


If only I had a machine at work with your 'mid range' spec.
– yoosiba, Jul 19 '11 at 9:41


FWIW, a lot of companies would consider your mid-range machine to be server class hardware! I am fortunate in that I do work for a place where we get these specs, but not everyone does.
– Paul Wagland, Jul 19 '11 at 12:12


@Bob Murphy: You really need IncrediBuild or a similar distributed compilation setup. It's far easier to justify a 12 core build server with 16 GB as a shared resource, if only because there's no personal jealousy involved in shared resources (plus, you typically pay servers from different budgets)
– MSalters, Jul 19 '11 at 14:27

The difference of productivity between the "top-end" machines and "almost top-end" machines is negligible. The difference in price is significant.

Not to mention the IT support for different machines instead of having all the developers using the same HW and SW images (which you can't do if you're buying a top-end machine for every new hire, the top-end will be different every time). Also, people who got the last year's top-end will want to upgrade because that newbie next cube has a "better" machine than them, and they're oh so much more important, aren't they?

Unless you really need the top-end machine for your work, I see no reason to throw the money away.

But the difference in cost is also negligible. And my productivity takes a real hit when I have to close everything and reboot, which happens several times a week. If you have a different view of the relative costs, perhaps you could include numbers in your answer. Nevertheless, I agree that almost-top-end would be very satisfactory; I wish I had that.
– Eric Wilson, Jul 18 '11 at 19:42


In the same direction: the difference between almost-top-end and middle-of-the-pack hardware is enormous, and the difference in price is negligible. There is some amortization to be done on hardware, that's for sure, or we are just throwing money out the window; then again, in the case of devs, too much amortization also amounts to throwing money out the window! There is a sweet spot to be attained, and when taking into account the psychological aspects of keeping devs happy, it will tend to be closer to high end than to mid-pack.
– Newtopian, Jul 18 '11 at 19:44


@FarmBoy If your productivity takes a real hit - go to your boss and justify an upgrade. You asked a general question, and my answer is for a general case.
– littleadv, Jul 18 '11 at 19:45


The cost of supporting a wide variety of machines is incredible. Individual users tend to overlook this (and they should, it isn't their job), but I've been at three companies that all came to the same conclusion: cheap desktops + high-end VM servers makes the most sense.
– Christopher Bibbs, Jul 18 '11 at 19:51


This is a strawman; nobody's talking about top-end vs. near top-end. In my experience, it's between good vs. ridiculously insufficient.
– niXar, Jul 19 '11 at 11:50

Because most employers do not understand how developers think, act or work, or how top tools can save the company money while increasing productivity. This leads to the loss of a point on the Joel Test: failure to provide "the best tools money can buy". It also leads to a loss in productivity and job satisfaction. That's just the way it is. Maybe one day you can start your own company and score 13/13. Until then, ask questions up front with your employer so you know what to expect before ever taking the job.

As far as your current situation, if you feel they listen and trust you then bring up the discussion. See if they will give you an upgrade. I know I'd work a little bit longer if I had a top of the line rig with dual 50" monitors to work with. Stick me in the matrix.

Same reason people want a Mercedes CLS when a Toyota Camry gets you there just the same. Sure, you may only squeeze a few more seconds of compile time out with a new machine, but appearances do matter.

Your math doesn't seem to include the time required to manage the constant flow of hardware into and out of the company -- it would take an extra IT guy or two depending on the size of your company, so tack another $50-$100k/year on top of your numbers. Plus, you lose productivity on the day they swap your computer out. If they skimp on dedicated IT staff you'll have to do the backups and restores yourself, possibly losing a day or two in the process. In other words, I think it's a bit more complicated than you think it is.

It may well be more complicated than I'm figuring, but I'm not asking for more frequent upgrades, just better quality at the time new hardware is purchased.
– Eric Wilson, Jul 18 '11 at 20:09


What you say is true, however it also ignores the fact that most of this still needs to happen anyway. The poster's idea is to go from high to low on the scale, not from low to very low.
– Paul Wagland, Jul 19 '11 at 12:14

One problem with your argument is cashflow. If they don't have the money, the point is moot. The other is return on investment.

This may not apply to the companies where you've worked. Some companies are highly leveraged and/or cash poor. They would rather spend the savings you describe on something that will sell more widgets or software. You have to show that your gain in production outweighs an equal investment in other areas.

If a software company is in maintenance mode and needs more sales, there may be a better return on spending the money on sales and marketing.

I think you need to address the fact that, in your case, the money is better spent on a programmer than on another area of the company.

Be careful with this argument if you're on salary. They'll just want you to work harder to make up the difference ;)

Then they shouldn't be hiring developers. Sure, if you've no money or there's no prospect of the investment being repaid, you can't/shouldn't be spending. The irrationality is in spending a lot of money on expensive resource (developers) while pennypinching on a cheap resource (hardware). If the excuse is that these are separate budgets, that just pushes it back a step: the irrationality is in having a massive personnel budget combined with a tiny hardware budget.
– rwallace, Jul 19 '11 at 13:04

I made this argument at my work for switching from laptops to desktops. I said everyone should be on a desktop and if they need a computer at home - get them one there too.

The speed advantages of a good computer are not negligible, especially if you remove crashes from really old hardware.

Concerning "top of the line" and "near top of the line" - I would argue near top of the line is always where you should be. At "near top of the line" you can upgrade every 2 years instead of 3 and end up with better hardware on average.

I recommended cyberpowerpc.com and my company let me (the marketing guy) purchase a PC from them, but they bought all the programmers' PCs from Dell because the support was worth the extra cost. Think about that... it's 1.5-2x the price to buy a PC from Dell, but you can all appreciate that if the PC goes down and you can't fix it fast, you lose money.

There's also a question of budgets - usually developers are paid out of a different budget than hardware for said developers, and there might simply not be enough money available in the hardware budget.

Arguably that doesn't fully answer the question (it's more about the mechanics). The follow-up would then be why is the hardware budget undersized, if you accept the premise that you should spend e.g. 2% of developers' salary on workstations?
– Andrzej Doyle, Jul 19 '11 at 9:26


@Andrzej, you do make a good point. Part of it depends on the size of the organisation - large companies seem to be especially reluctant to give developers high-spec machines as they tend to have standardised their hardware on the Excel jockey level. Smaller companies usually are more flexible, but also have less money to throw around.
– Timo Geusch, Jul 19 '11 at 20:46

They can't do the math, or if they do, they somehow believe that it doesn't apply to them.
Budget and accounting for hardware and personnel are separate.
People in decision-making positions have never heard of the issue and are totally unaware that a problem exists at all.

Now, to the real question: "How do I handle this situation?"

It's essentially a communication problem. You explain the problem and the interlocutor hears "bla bla bla we want shiny new toys". They just don't get it.

If I were in your position, I would make a quick video titled "Can we afford old computers?":
Stills of a typical workstation. On the right side, a blank area titled "cost".

Run through at a quick pace, keep adding the numbers, then compare with the cost of a new computer, and don't forget to end with: "This video was produced at home on a $500 store-bought laptop that outperforms all the 'professional' development machines currently available."

If you are concerned that raising the issue will cause problems for you, you could also just bring in your own laptop to work.

If there is no way to get that issue across, then perhaps you should consider finding another job.

Personally, I have always had at least an OK development computer when I worked for a 'small' company, but when it comes to big companies, programmers are a dime a dozen compared to a project manager with a budget.

Especially if he/she is one of those with the great ideas - read: budget approved.

Whatever the 'good' idea, that person will need really good programmers to actually implement the "new, 'better' product", so they will pay the programmer the price needed.

Getting a new development computer, as far as I have seen, does not go through the same 'department' as the other budget, though, so do expect to work under bad conditions even if you are paid well :-)
My last job: Dell E5xxx + one 1280x1024 LCD ...

Nah, you've got it the wrong way around: what I'm trying to stress is that even if that project manager can pay you what you're worth, the guys over at 'buying the computers and maintaining them' don't run on the same budget. I earned more in a day at my last job than that computer cost... Had I stayed longer, I would probably have bought myself another computer + screen, but there were other problems, like working in an extremely hot and noisy environment (because that was cheap, not because there was any real need).
– Valmond, Jul 19 '11 at 8:09

Buying new hardware involves money, money involves decision makers and usually they're not developers if your company is big enough. Of course we have exceptions...

As @Rob explained, there are a lot of reasons why you won't get the best hardware. Your company may have a policy defining what kind of hardware is bought; as always with bureaucracy, it's hard to have a bleeding-edge policy. Many managers won't bother adapting it to your personal needs, etc.

Poor communication, risk aversion and other flaws:

Let's consider you have reaaally crappy hardware, it's no longer possible to work in these conditions and you want to do something about this.

Now you have to go convince your manager. Well, usually you'll have to convince your project manager, who tells your manager, who reports to his boss, and you'll need to make sure that that guy really understands your issues.
This involves communication skills and the technical understanding of the management.

Second step: if you're lucky enough, the management will think about it. What do they get?

You'll work faster with some uncertainties (they don't directly get money as you'll try to explain).

It'll cost money, now.

That means they'll have to trade money, and their actual plan for your work, for a possible opportunity to let you do something else in the future; that's an investment, but also a risk.
Sadly, many managers are risk-averse. Not to mention that the poorer their understanding of your issue, the riskier it appears. Some may also have a hard time admitting that someone did not buy suitable hardware in the first place.

Moreover, management usually has a shorter definition of what long term means. If they're asked to do some sort of monthly budget optimization, they may even have direct financial incentives not to buy you new hardware! And they won't care about the two weeks you may save six months later.

That works better if you have smart and open-minded managers who listen, understand your issues, are ready to take reasonable risks and trust you enough to let you explore creative ways to use the freed time.

That's not always the case: I waited 3 months to get a graphics card to connect my second screen while being forbidden to buy it myself (30€), lost 3 days for not having an extra 500 GB HDD, and regularly had to wait several hours when preparing data for the client because of the slow 100 Mbps network. After asking several times for 2 GB of RAM, I was told to buy it myself and stop bothering the management with those technical issues. And we were doing scientific computing for a big industrial client who was ready to pay the price.

Well said - good analysis of the why. However, if it gets too bad, you can dissipate some upgrade spray through the dedicated case openings (globalpackagegallery.com/…).
– peterchen, Jul 19 '11 at 15:45

Math aside, all of your users are not likely to have top end machines. Developing on a machine that is spec'ed more closely to something that is average in price will acquaint the developer more closely with the experience (and pains!) of their users.

Your QA department may have a min-spec machine, but how often is it used? Developing on a machine that is a realistic target environment exposes issues early on (unresponsiveness, poor performance, race conditions because of that slow performance, etc), which drives teams to fixing them sooner.

I've heard this before, and I think it's a false analogy. If it were true, then cars would be built using hand tools and power drills because that's what drivers have at home. A low-spec machine should be used as part of usability testing, but not for development.
– TMN, Jul 19 '11 at 12:06


This answer points out an interesting thing. I've seen a game that failed badly when it was released: most of the users couldn't read the text in the interface, because the developers had screens of at least 21-27 inches, and scaled down to those 15-inch laptops the characters were rendered at 6px. However, being close to users' specs is needed for tests, which should be done by testers, not developers.
– BiAiB, Jul 19 '11 at 14:13

I was asked to spec out the machine I wanted to use here, within a fairly tight budget. I managed to come up with a halfway decent system that works despite not being perk heavy.

I was originally thinking along the same direction as the OP here, the time I sit here waiting for compiles or loads is money out the window. As I've been moving along I also recognize that the time I spend going to get a coffee, or walking to the printer are also money out the window.

Rather than worry about the small amounts of time that I do have to wait, because we went with a less expensive development system, I've looked at my own habits and improved the larger amounts of time I spend doing nothing particularly useful (ahem... stackexchange is useful, and productive to boot, and I'm sticking to it!! :-) ) Of course we need breaks, but this is time other than "breaks".

So in a way, in a general sense, this question could be the "premature optimization" of work efficiency. Many great points about migration costs, losing out on volume purchasing, etc.

In your particular situation, where you are losing time on the order of a break in order to reboot/open programs, yes, it makes a lot of sense to upgrade to decent equipment as your productivity is seriously impaired, a halfway decent i3 system with 4 GB RAM is on the order of $500 ... I'm sure it won't take long to recoup that cost.
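Under the thread's own assumptions ($60/hour loaded cost from the question, the roughly 20 minutes per day the OP loses to boot-ups and reopening applications), the payback on that $500 box is quick. A minimal sketch, with all figures being those assumptions rather than measurements:

```python
MACHINE_COST = 500          # the halfway-decent i3 system mentioned above
HOURLY_COST = 60            # loaded developer cost from the question
MINUTES_SAVED_PER_DAY = 20  # roughly the boot/reopen time the OP describes

saved_per_day = MINUTES_SAVED_PER_DAY * HOURLY_COST / 60  # $20/day
days_to_recoup = MACHINE_COST / saved_per_day

print(days_to_recoup)  # 25.0 working days, i.e. about five weeks
```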

You need breaks regardless. But minimizing breaks in flow is critical to developer productivity. If a developer has to wait more than about 30 seconds to get feedback from the previous action, work will slow significantly.
– kevin cline, Jul 18 '11 at 20:11


+1, you can definitely get a sweet machine for not much money, if you optimize for developer productivity. Good graphics card? Almost certainly a waste of money. Huge hard drive? Often not necessary. But RAM? As much as you can get. If you spend smarter, not more, you'll do well.
– Carson63000, Jul 19 '11 at 0:52

One big factor is the kind of bloatware that the IT department in a typical big company tends to put on the laptop. If you have a Windows 7 machine at home with just some antivirus, a standard SSD-3GB-quad-core system will boot up in less than 10 seconds. Compare that to the bloatware my company puts in, and it takes forever to boot. I have seen some folks zap the OS completely and install their own to speed things up. I think that solves the problem to an extent, although it is a huge InfoSec violation. But seriously - 10 minutes?!

IT here, on loan from Server Fault. 10 minutes is inexcusable, but unfortunately common. When I start at a new shop, I spend the first few weeks turning off all the crap that someone thought would be a good idea to run on startup: antispyware scan -> antivirus scan -> hundreds of nested GPOs. My new Win 7 desktops boot so fast I had to tune the switches, because they were booting faster than the NICs could auto-negotiate. Hell, I can reimage a station in less than 10 minutes.
–
Ryan Jul 20 '11 at 19:52

In large corporate organisations the choice of hardware is pre-defined and locked down, because such organisations have fixed, centrally managed desktop and laptop specifications and configurations. Those specifications will have been dictated overwhelmingly by a combination of procurement and support considerations. The company I am currently working at, for example, has over 100,000 employees and works on the basis that one size fits all, and that size will have been driven primarily by commercials.

Once such policies are in place, they stay locked down, because support services usually invest a considerable amount of time in testing and deploying software to that "standard" machine specification. Arguments about developer productivity simply fall on deaf ears in such environments; production services are not going to make an exception for a small group on the basis that they may be more productive. If they did, they would quickly be swamped with requests for deviations, and in any event production support is incentivised to keep support costs as low as possible: more than one desktop/laptop configuration increases the support cost.

In an organisation whose primary product is the result of software engineering, such arguments are invalid, but the reality is that most organisations are not, and their key driver is keeping support costs low.

However you slice this question, the collective group "programmers" bears direct responsibility for any failure to buy the best tools in the workplace.

Business finance is incredibly complicated, with numerous conflicting motivations and levers. Without concrete knowledge of what your finance department is currently tracking (tax avoidance, managing quarterly expenses, driving up future capital expenses, maximizing EBITDA, or whatever else is on their radar), any discussion of true costs is irrelevant. How would you react to a marketing person bugging you about compiler optimizations for code you know is about to be transitioned to an interpreted language? If programmers cannot demonstrate in specific terms how their current tools fail to contribute directly to the bottom line, the business is correct to spend as little as possible. We also have to learn to listen to business finance, so we can understand the realities facing resource allocation.

We as a group vote with our presence in the workplace far louder than asking for better tools, submitting the most awesome white paper to our managers, or even posting on the internet. There are organizations that have created a culture of ensuring its employees either have the tools they justifiably need or understand the case as to why not at the moment. Until competitive pressure requires this from the majority of employers, we can only vote by seeking out employers we believe in.

Each of us has to either make this something that matters to the core, or let it go.

I used to be a developer at a large company and then a startup. Here are my two cents:

An 8 GB DDR3 DIMM kit (2 x 4 GB) costs $50-$55 today (circa July 2011).

A 21" LCD monitor costs $200 (circa July 2011).

If your company allows you to bring your own equipment, just use your own money and upgrade the RAM and LCD monitor. Why, you ask?

Isn't your own productivity something you value?

Aren't your eyes worth $200?

You can always take the monitor with you when you quit the job (remember to clearly label it as your personal property). I've done the above recipe (upgrading RAM and using my own LCD monitor) in both my previous jobs - and my current job.

I don't see how you can group all employers together in one basket. I've worked for a few employers, as an employee and as a consultant, and always got hardware that was more than sufficient for my needs. For my current job I was handed a bright, shiny new HP quad-core with 4 GB of RAM and Win64 on the first day. Not top of the line, but very sufficient (I use Delphi XE and XMLSpy as my main development tools); in fact so nice that I went and bought the same machine for myself at home. (Maybe I'm not all that productive! LOL.)

If you don't get good hardware, try asking for it. And if you feel you can't ask, you're probably not working at the right place, because they view developers not as a resource but as a liability.

So I guess the answer to your question is: companies that refuse to provide sufficient hardware for a developer are companies that consider their developers a liability, with jobs they'd rather outsource and not deal with at all.

The company has a lot of expenses. Every department needs more money in order to do better, and in every department the expense is a must.

When you come to choose the best way to use the available money, you take into account:

How much do they need? Smaller sums are easier to approve.

Will it increase sales? Better PCs usually don't contribute directly to an increase in sales.

Does the department like to spend money, or do they understand cash flow? Most R&D departments I have seen have an arrogant "we deserve the best" approach. This is understandable: they earn a lot, and when you do, you think you deserve the better things in life. The money requests of R&D teams often come across like a spoiled child asking for more toys while his parents are struggling. "A delicate genius."

The 10-minutes-a-day waste is not an argument that will work with most finance departments. Most R&D teams waste a lot more on all the non-programming activities they enjoy during the day. Let's chart all the waste in your department and see what can be done to improve productivity.

Simply put, purchasing decisions are often made by bean counters (accountants and middle managers) rather than by project managers.

Lots of people have given potential reasons, and all of them are a factor in one situation or another, so there isn't any single overriding one. Buying equipment at scale may mean the company loses some money on programmer productivity but gains money in other areas.

Still, it often just comes down to a budget. You have to fit in the budget, and that's all there is to it.

That doesn't explain why programmers can't talk to the bean counters and demonstrate how the business is leaving money on the table by failing to buy the correct tools. The budget serves the business need; programmers have to demonstrate which tools they need in order to expect budgetary consideration.
–
bmike Jul 19 '11 at 15:03


@bmike - I don't know about the companies you've been at, but in most cases programmers are not allowed to talk to the bean counters. I mean, nothing stops a programmer from catching one in the hall for an informal conversation, but they would ordinarily be told to "use the chain of command".
–
Erik Funkenbusch Jul 19 '11 at 20:42


+1 - to get it back at least to 0 - IMO this is a very well informed and accurate answer, particularly in larger shops. The developer should talk to an accountant about how he needs to spend $1000 more than regular people on his hardware? Hard to imagine...
–
Vector Jul 20 '11 at 13:41

I used to work for a networking company where they upgraded RAM from 512 MB to 1 GB last year. We were working with f**king CRT monitors in 2010. The funniest part was that the managers' hardware was upgraded to 2 GB of RAM. Why on earth anyone would want 2 GB to create damn PPTs, and how someone is supposed to develop applications with 1 GB of RAM, I will never know.

It comes down to who handles the money. In a larger organization, IT is given a budget of, say, $1M for the year. That includes support salaries, servers, etc., and they have to spread it around among all their resources. They cut deals with vendors like Dell or IBM to get x number of the same kind of computer, which they give to everyone from customer support to the programmers. They also get deals on support and the like when they only have to maintain a limited set of models. And they are not programmers; I have had numerous arguments with non-programmers about computers. When I went over my IT manager's head for some new hard drives one time, the CEO said buy them, and boom, everybody finally had enough disk space to run virtual machines.

I actually blew up and cussed out my boss because IT was going to take away my 19" second monitor because I had a laptop. They stiffed me on that too, giving me a 13" model when others were getting 15". That goes back to politics in IT, which is another problem; it's kind of an us-versus-them mentality sometimes.

From the perspective described by the asker, the question makes complete sense. However, there are more costs involved in keeping hardware current.

Here are some of the costs that also need to be considered:

requisition cost (research and details that goes into purchasing)

installation & configuration cost

support & maintenance cost

software licensing cost

disposal / upgrade cost

In some cases, these can be 2-5x greater than the cost of the hardware itself. Even more if there is sophisticated software licensing involved.

In general the scale of these costs depends on the size of the company or the complexity of the organizational structure. Smaller teams with direct access to purchasing power can keep these costs low, whereas in a larger organization these costs can get very high.
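A small sketch of why the overhead model matters (the dollar figures and multipliers below are my own illustrative assumptions, not data from this answer): if those ancillary costs are roughly flat per machine, better hardware barely changes the total; if they scale with the hardware price, the gap widens dramatically.

```python
# Two hedged models of total cost of ownership (TCO).
# Prices and multipliers are illustrative assumptions, not data.

def tco_flat(hardware, overhead=2000):
    """Requisition/support/licensing modeled as a flat per-machine fee."""
    return hardware + overhead

def tco_scaled(hardware, factor=3):
    """The same costs modeled as 2-5x the hardware price (here 3x)."""
    return hardware + factor * hardware

for hw in (1000, 3000):
    print(hw, tco_flat(hw), tco_scaled(hw))
```

Under the flat model the $3000 machine is only ~1.7x the $1000 machine in total cost; under the scaled model the full 3x ratio is preserved, which is when the bean counters' caution starts to look rational.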

My premise is that the other costs, while relevant, are the same either way. In other words, acquisition and installation costs don't increase because the devs get more RAM. Also, I've argued above for keeping the same purchase schedule.
–
Eric Wilson Jul 19 '11 at 15:37


What software do you run that costs 2-5x more to license if you put it on a faster desktop machine? @Farmboy is right, this is an anti-point. If a crappy computer costs $1000 to buy + $1500 in IT costs over three years, it's half the price of a great computer that costs $3000 up front + $1500 in IT costs. And in fact, the better computer probably costs less to support, because it breaks less often.
–
RoundTower Jul 19 '11 at 22:20

Because a lot of companies outside of typical tech start-ups are not interested in hiring rock stars. They're investing in someone who can just do the work. So if they don't care how you do your work, as long as you do it, why should they care what equipment you use? I've worked at places that still use 15-inch CRTs, and everyone does just fine. Sometimes when I read questions like this, I wonder if people realize that not everybody in the world works for a cool start-up.

I don't work for a cool start-up, and I don't think that everyone else does. But I do think my employer should care whether I have equipment that works well, whether they want rock stars or just effective developers. Primarily, I expect that my company would want to avoid wasting money paying me to watch my machine freeze up again. No one thinks wasting money is cool.
–
Eric Wilson Jul 20 '11 at 16:27

I've worked for companies that skimped on hardware in the past. It sucks, and if they need convincing the battle is likely to be a never-ending one.

Turns out that companies committed to using the best available tools are rare, but they do exist; I work for one. I've got a quad-core 17" 2011 MBP, 8GB RAM, Vertex 3 SSD, 2 x 24" external monitors, plus a quad-core desktop and a 4GB Xen slice; as well as quiet offices.

Could I get by with lesser hardware? Sure. But I think we'd all rather be bragging than bitching.

In my opinion, there are only two defensible objections a company could raise to keeping developers set up with solid workstations. The first is that they are undergoing a cash crisis. That had better be short-lived, or the company will not be a going concern for long. If you work for a company like that, you should keep your resume up to date.

The other is that their organization is simply not bottle-necked on software development capacity. That is, an increase in the quality or speed of software development output would not improve the bottom line. If the company's main business is selling software, that will be practically impossible. If software isn't their main business, and they aren't bottle-necked on it, they should be trying to reduce their software workforce by transferring or letting go of their weakest team members. Supplying poor equipment will reduce the size of their team from the opposite end, I'm afraid.

New machines and newer technologies mean newer problems. Not everyone at every company is a tech whiz, and not every company has the IT resources to train people and handle problems 24/7.

Yes, perhaps if you're a freelance programmer working on your own personal desktop, it would be worth blowing $1000 on a rig to squeeze out 10 minutes of extra productivity every day. But when you're deploying hundreds of these machines to people who may actually lose productivity because of the new equipment, the prospect looks a little grimmer.