For context, Alex St. John is co-creator of the DirectX family of APIs at Microsoft and founder of WildTangent Inc., so this isn’t just some whelp game journalist saying provocative things for links. This is an industry veteran who – even if I disagree with him on a few points – has a lot of experience and knows what he’s talking about when it comes to running a business and developing software.

The article isn’t long and you should read the whole thing, but if you insist on me distilling it down to a few bullet points then:

Game development isn’t physically demanding, therefore complaints about crunch times are just whining.

Working long hours is normal in the tech industry, therefore crunch mode in game development is nothing special.

Complete imbeciles have made millions in this industry as indies, therefore it shouldn’t be a problem to quit AAA development and go on to success.

The internet is often bad at coping with arguments in this “A, therefore B” format:

Your average internet commenter reads this and thinks, “I disagree strongly with B, so I must dedicate all of my energies to refuting A!”

But the problem here isn’t A. The problem is the “therefore”. This is an article where all of St. John’s examples of A are reasonable, and I anticipate a lot of people are going to waste their time and energy assaulting good advice rather than flawed conclusions.

Before I get into this, I should make it clear that when I rail against “crunch”, I’m actually talking about long-term crunch. Short-term crunch is a perfectly reasonable policy. You generally have a long-running marketing campaign building up to a release date. Regular project scheduling is fiendishly difficult, and game development is more so, which means you often need a big push at the end to make sure the team hits the required date[1].

Retailers get busy at Christmas, construction workers are busy in the spring, confectioners are overworked before major holidays, and game developers need to do a big push as the scheduled release date gets close. If you need six weeks of long days to hit your planned Christmas release, that’s all part of the business. What I object to are the development houses where 60+ hour work weeks are just part of standard day-to-day operations.

So let’s talk about St. John’s points:

Game development isn’t physically demanding, therefore complaints about crunch times are just whining.

Arguing against crunch mode on the basis of personal discomfort is a mug’s game, because there’s always someone out there who gets paid less to do something more unpleasant. But this doesn’t mean crunch mode is a good idea.

The problem is that anything is hard if you spend 70% of your waking hours doing it. If your job was hanging out on the beach drinking piña colada for 80 hours a week, then sooner or later you’ll start whining that you would love to sit on some grass, that you’d like to have less sand in your shorts, that you’d rather be punched in the face than drink another piña colada, and that you really, really want to spend some time with your family or friends.

The problem isn’t the physical challenge. It’s the time, and the staggering personal opportunity cost. Your daughter is never going to take her first steps again, say her first words again, or any of the other milestones we use to mark the road of parenthood. There’s always another videogame to make, but there’s not always a kid to raise, a spouse to love, or friends to hang out with. Missing out on that stuff is the real cost of eternal crunch mode, and no review score or game credit can ever offset it.

Working long hours is normal in the tech industry, therefore crunch mode in game development is nothing special.

Again, St. John is right. This happens all the time in startups. It’s normal for people to put in huge hours when launching a company. I did that myself during the dot-com bubble. The difference is that – like doctors working brutal internships – this is a temporary arrangement, and it’s tolerated because of the large rewards being offered in the future. If I spend a couple of years working 60+ hour weeks at this sexy new tech startup, then I’m probably doing it because I’m being offered stock options that will allow me to retire at 40[2]. Or maybe I’m here because getting in on the ground floor will put me in a good position to ascend to management when the company grows. And even if that doesn’t work out, someday this company will grow beyond our team of half-dozen engineers and we won’t need to work all these hours.

There is no such future for AAA game developers. When you’re done crunching on this game, you’ll be shoved along to crunch on the next game. And the next one. Forever. This isn’t some hungry, understaffed startup trying to bring a new idea to life. This is a billion-dollar corporation, and this crunch mode is just a normal part of their rumbling conveyor belt.

Complete imbeciles have made millions in this industry as indies, therefore it shouldn’t be a problem to quit AAA development and go on to success.

I'm not a vigilante. I prefer the term 'indie law enforcement'.

If you’re working at some exhausting 60+ hour development house and you hate it, then this really is good advice. And by “go indie” I mean, “Take a sane non-game job that pays the bills, and work on your indie dream on the weekends.”

While the original article points out that complete morons have gotten rich making terrible games, I’ll point out that those instances are the exception rather than the rule. We’ve seen idiots get rich and we’ve seen brilliant games struggle to pay for their own development. All we’ve proven is that there’s a large random factor involved in success, which means you probably shouldn’t assume you’re going to be the next Notch just because you’ve got more knowledge than the guy who made Flappy Bird.

Look, my game is doing really well by indie standards, but it isn’t remotely going to pay my bills. I depend on my Patreon, freelance work, and my wife’s job. Being an indie developer is like moving to Hollywood to be a star. For every one that strikes it rich, there are ten thousand that go broke and fade into obscurity. Go full-time indie if you like, but make sure you understand the risks first.

In any case, not everyone can “go indie”. You need a broad skillset to pull that off. You might be the most brilliant coder in the world, but if that’s your only skill then you’re screwed. You need to understand game systems. You need to be able to produce art. You need to have a head for marketing and self-promotion. You’re the ultimate startup: Every single position in the company is filled by one person. Even if you’re part of a little 5-person team, most of you will need to do more than one job.

But the “shut up and go indie” attitude people throw around misses the root of the problem. Right now the supply of prospective game developers far exceeds the demand. That’s the real cause of this mess. If there was a labor shortage, companies would be working hard to retain the talent they have. But as it stands, for every thirty-something who gets disgusted and leaves the industry, there are a half dozen eager beavers, fresh out of Gamedev college with dreams of greatness and a crushing load of student loans they need to start paying off[3]. Companies can just feed these kids into the meatgrinder forever. They will never run out.

Which is why we need to keep this conversation going. We need people to air this dirty laundry when they quit the industry. We need to know which studios and publishers are the worst offenders. Hopefully, this mess will warn off a few kids and they’ll look to other careers, or at least other employers.

Let me end this with a point of my own:

Perma-Crunch is the Policy of Simpletons and Sociopaths

Your beating will continue until my morale improves.

Eternal crunch mode involves massive cost (to employees) in return for a minuscule benefit to the employer. Anyone running perma-crunch has basically decided that morale, loyalty, and enthusiasm have no value. Which would be bad enough if the job was digging ditches, but this is a creative endeavor where those things are precious.

Sure, a short push[4] at the end of a project is reasonable. You can even crunch for a few months without losing morale as long as you’ve got a really juicy carrot hanging in front of you while you go. (Like stock options, time off, or a promotion.) Maybe a small number of heroes (like future doctors) can crunch for a couple of years. It’s a lot more tolerable if there’s some end in sight.

But burnout is a real thing, and it impacts creative jobs particularly hard. Any project manager worth more than his office chair will know this. People are not machines, and you can’t double their output by doubling the hours you spend running them. As hours climb, stress levels go up. People fight. You get office drama, which will hurt the output of people even if they’re not personally burned out yet. Productivity goes way down. Quality of output goes down. Learning goes down. In the end you’ve got a force of angry, squabbling, disloyal whiners who can’t give you their best work, and for all that trouble you’re not really finishing your game much faster.

Prolonged crunch is bad for employees, bad for the company, and bad for the quality of the games. It’s a stupid, short-sighted policy that should be mocked and derided at every opportunity. Even if this mocking doesn’t change the mind of the idiots running the show, maybe we can steer a few kids away from this circus of cruelty and incompetence.

Which is to say: Alex St. John’s advice to go indie (or leave the industry entirely) is probably good life advice if you’re miserable. But this doesn’t mean we shouldn’t keep complaining when we see development teams being run poorly, and it certainly doesn’t mean people should be grateful just because they landed a job where they will be treated like disposable, interchangeable cogs. Asking companies to stop being so needlessly destructive is a reasonable thing to do. And even if they don’t listen (and let’s be realistic, it would take a massive change in corporate values and culture for them to listen) it’s still good to tell these crunch-mode horror stories so the next generation of prospective game developers can make informed career decisions.

Keep telling your stories. Prolonged, unremunerated crunch mode is ugly, harmful, and short-sighted. It’s not a badge of honor. It’s not an opportunity. It’s a pointless waste of human potential and every manager that advocates it should be shunned as a callous idiot, unworthy of their position or our respect.

Footnotes:

[1] There are real costs to letting a game slip. Once the television spots have been paid for, you need to make sure the product will be on the shelves for the buying public.

[2] Assuming the company doesn’t fold. Which most do. But that’s all part of the gamble.

[3] The larger problem of taking out massive student debt to pay for a degree with low market value is a problem beyond the scope of this article, this website, and this author.

[4] How many weeks comprises a “short” push is another discussion entirely. For now, let’s just agree that we’re talking about a temporary increase in work hours.

It’s also not like ‘going indie’ is the stress-free option either. Running your own business and relying on your own productivity can be way more stressful than a 9-to-5 job with crunch time.

Although the one thing that always bugs me is that any IT project I have not had a hand in running has ALWAYS fallen way behind. People in IT just seem to always estimate for ‘best-case’ and then end up in crunch because stuff went wrong. The trouble is, if you give realistic estimates, you’ll never get the contract since another group will give their impossible estimates and take the hit by going overtime on the project.

I think in part that he (the article author) was arguing that making games is inherently entrepreneurial, so you’re going to work like one whether you work for yourself or someone else. So if working for someone else is pissing you off, work for yourself instead.

Even if you “work for” a company in the games industry, it’s a lot more like starting a business under a VC than it is like having a “job” – except for one thing: you don’t get downtime between projects. The minute you finish one, you get cycled to another.

In my experience, it could also be the opposite (the Starfleet Engineering effect?):

O’BRIEN: It’ll take about 36 hours, Captain.
SISKO: You’ve got 4.
O’BRIEN walks away with a “why do I work here” expression on his face.

This has happened to me personally. It may just be a running gag on Star Trek, but in real life it’s the most infuriating, boneheaded thing a manager can possibly do. Your boss (who has no background in computers whatsoever, let alone whatever specific thing your project is) says “no, it shouldn’t take that long” and gives you a deadline far shorter than your estimate. You work your ass off and still look bad when you don’t meet the insane demands. I have a feeling this is a common relationship between publishers and developers.

Yyyuuup! Feature creep, hardware that isn’t even remotely suitable for what the job will actually do (as opposed to what you were told the job was), midnight vendor swaps (Remember Unix? Remember how many different versions there used to be? Remember how many different processor architectures there used to be?), the list goes on and on.

Oh, and the multiple boss game.
Adam is your boss. OK
Adam got a promotion, Betty is your new boss but you still report to Adam. ….Okay?
Betty got a promotion, Chad is your new boss but you still report to Betty. And Adam. Ummm

And this wasn’t at Fly-by-Night Inc.; this was a TelCo. Logo looks like a death star? You might have heard of them.

SCOTTY: Do you mind a little advice? Starfleet captains are like children. They want everything right now and they want it their way. But the secret is to give them only what they need, not what they want.

GEORDI: Yeah, well, I told the Captain I’d have this analysis done in an hour.

SCOTTY: How long will it really take?

GEORDI: An hour!

SCOTTY: Oh, you didn’t tell him how long it would *really* take, did ya?

GEORDI: Well, of course I did.

SCOTTY: Oh, laddie. You’ve got a lot to learn if you want people to think of you as a miracle worker.

There’s a great episode of Stargate where this is inverted. Siler tells Hammond it’ll take x hours, Hammond halves it, Siler just goes ‘sorry, sir, that’s not how it works, it’ll take x hours.’ Loved that :)

In Star Trek (and sci-fi in general) there is almost always the excuse, “if you take that long bad things will happen, so do it faster.” An excuse that is almost entirely absent in real life. Professions where that would regularly come up are without exception built to avoid the situations where that justification would be necessary.

It was a running gag on Star Trek (and pretty much every other TV show). Though I think it is reasonable to give hard deadlines on a military spaceship.

ENGINEER: It'll take about 36 hours, Captain.
CAPTAIN: You've got 4.
ENGINEER: I can get it down to 12. But that’s it.
CAPTAIN: Then you better figure out how to hold your breath for 8 hours. You’ve got 4 hours because that’s all you’ve got on this military ship in space in a time of war!

In a regular office job that’s not going to happen most of the time. Though it can happen. Some things are just due when they are due because there is a hard unmovable deadline. For example if a project is going to take 36 hours and you are going to be evicted in 24 hours, well then you better figure out how to do it in 23.

That’s how it was for me in university. Projects were due at X time and they would not be accepted at X+1 second. Firm. You miss the deadline by 1 second and you get zero.

I can see it both ways. Sometimes adjusting the estimate is meaningless posturing. Sometimes it is because there is an important hard deadline.

I also started mentally prefacing every other sentence with “Back in my day”. To answer Tizzy: I would always rather interact with someone who knows they are an asshole than with someone who is one but unaware of the fact.

It’s his handle, it seems, and he also appears to take it pretty far. He’s got a personal site and if you go to the homepage it’s got Hebrew writing and biblical passages. It also has a new article doubling down on those hiring practices he promotes, including literally calling for companies to exploit autistic engineers as Richard mentioned. It’s pretty bad.

I have to disagree with you, Shamus (at least on a matter of scale): we should be refuting a couple of those A statements, because being the norm doesn’t make them right.

First, they aren’t physically demanding, but they are mentally taxing. You bring this up later, but I think it needs to be used to address point one, straight off the bat. Even short-term crunch has vastly diminishing returns. Just because you have a bunch of creatives and problem solvers throwing double the hours at the end of a project doesn’t mean you’re getting double the work out of them. Actually, by the end of week one, you’re likely starting to see multipliers below 1, as the quality of their work takes a drastic hit. The opportunity cost is high, but compounded on top of that is the gradual progress towards burnout.

And second, crunch should not be a thing in the first place. We’ve been doing this development thing (games and otherwise) long enough that we shouldn’t even be needing to use crunch. Every single company that relies on it as a normal part of operation has failed at project management on a fundamental level. I am not including startups, which — like you said — directly reward initial short-term crunch with long-term rewards; I’m talking about development studios and publishers that have existed for over a decade at this point.

This inline response by Rami Ismail is probably the best I’ve seen so far, and I strongly agree with everything he’s said.

On the second point, yes there should be the option for crunch. Businesses should be flexible enough to aggressively pounce on opportunities. It’s operating in perpetual crunch mode that should signal that there’s something wrong with your model.

Makes me wonder with the market glutted as it is why they don’t just hire more workers rather than making the same workers work longer. Are they all salary? Is that legal?

I’m not saying that they should never do it. It’s an emergency state, where something has gone wrong and you need to do it. But if a business — which has been in operation for years on end — still regularly needs 2+ week crunches, then something is not right. Crunch should be treated, outside of break-your-back-for-equity startups, as a failure state and recovery from that failure.

As for the second one, there are very serious diminishing returns on team size. Hiring more people to work simultaneously doesn’t work that well when you can’t properly chunk work out to give to each person, especially when there are cases where some general work is reliant on a specialist team finishing some part of the engine. Throwing more entry-level coders at a problem can do more harm than good as well.

That said, something like shift work where you have two teams working 8 hour shifts back-to-back might work. I’ve never seen it done, and I’m sure it must have handoff issues, but it might allow for the 14 hour workdays they seem to want (allotting something like 2 hours for a post-shift handoff and meeting). If anyone has experience with that in a development environment I’d love to hear about it.

If anyone has experience with that in a development environment I'd love to hear about it.

I have that experience (software, but not game dev). It was a mess, we would’ve done better to simply have everyone work the same hours.

Tasks are not neatly sized such that you can work for six hours, check in your completed code, and have it ready for handoff with the B team. There are two hour tasks and two day tasks. We ended up essentially working in parallel, rather than in series, only with the inconvenience that our work hours didn’t overlap well.

This is actually what I expected, but having not experienced it I didn’t want to assume too much. So it just kind of degenerates into a standard large development team, just inconvenient because of no work overlap?

Speaking from my own perspective, I couldn’t see it being super productive to try to hand over a problem I’m in the middle of solving while also somehow bringing the other person[s] up to speed on my current thought process and progress.

Seriously: I work as a welder and machinist (well, it’s a bit more complicated than that) at a company I’ve been with for quite a few years. We produce highly complex systems, we do our design in-house, and our production batches can be anywhere from hundreds of units to just half a dozen very specific ones, according to what a customer wants. Besides quality, design, and so on, keeping to deadlines is hugely important, because if we miss a deadline the cost to the company that hired us is usually seven digits per day, and to us those losses translate to something in the six-digit range, meaning we’ll deliver at a loss.

In the time I’ve worked there? There have been half a handful of ‘all hands on deck’ sort of situations, and those have been due to unforeseeable fuck-ups (one time a software update to our two main welding robot stations put them both out of action for a week and involved a technician flying in from Japan to unfuck them. During that time we did all of those GMAW welds by hand, which meant every weld had two welders present — one welding and the other inspecting the weld as it was laid, both signing off on it being done properly). Sometimes we do miss deadlines, and 99% of the time that is because those unforeseeable fuck-ups are the sort that no amount of manpower can solve.

Our equivalent of crunch is non-existent and we deliver product just like a software company would. If every product involved a rush to meet deadlines and expected goals where we worked insane hours and through weekends as a regular mode of operation? Me and pretty much everyone working there would be having some long conversations with our union representative. If those ‘crunch times’ were not only expected but expected without extra pay and just talking about ‘passion for the job’ we’d be having our union coming down on the necks of company management like a fucking trainload of anvils. Usually one way or another that’s the norm in our industry; the sort of shops that treat their workers like trash don’t stay in business for long.

So coming from that perspective? It seems absurd to me that tech — an industry where, to my mind, you should be able to map out time allocation far more readily and present realistic deadlines — operates under the assumption that weeks of ‘crunch time’ are just the norm of operation even for massive companies that have been in business for years and years. You really are right that it’s bad management all the way through for that to be going on.

…and honestly the weirdest thing in my opinion is that a lot of tech workers seem to tentatively defend the practice because it’s the expected norm. I’m trying to imagine my profession having the sort of work culture where-in we’d only mumble half-hearted criticism about the boot being at our throats and follow it up with, “Well the boot has always been there and you know it used to be a hob-nail boot? So really we have no right to complain so much — it’s a much nicer boot now!”

You’re right. Crunch shouldn’t exist (and as you said it doesn’t even work).

Saying this as someone who develops software for a company that also does a lot of welding, I have to say it is a lot harder to estimate time writing code than it is to estimate time welding. The only way to know how long it’s going to take, is to know exactly what problems you are going to run into and how you’re going to fix them ahead of time–which is actually the definition of “writing software.”

In other words, you have to finish a job before you know how long it’s going to take with certainty, and by that time….well, the job’s done. Whenever someone asks me for an absolute ETA for a project I quote them the maximum time I think I can get away with and cross my fingers hoping it’s right.
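The “quote the maximum I can get away with” heuristic is basically informal padding. For what it’s worth, here’s a minimal sketch of a more formal version of the same idea — PERT-style three-point estimation. This is purely my illustration, not anything the commenter described; the task names and hour figures are invented:

```python
# Illustrative only: PERT three-point estimation as a way to hedge a
# software ETA. All task names and hour figures below are made up.

def pert_estimate(optimistic, likely, pessimistic):
    """Weighted mean and rough standard deviation for one task (hours)."""
    mean = (optimistic + 4 * likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return mean, std_dev

# Hypothetical tasks: (name, optimistic, likely, pessimistic) in hours.
tasks = [
    ("parse vendor file", 4, 8, 24),
    ("robot arm driver", 10, 20, 60),
    ("integration test", 5, 10, 30),
]

total_mean = sum(pert_estimate(o, l, p)[0] for _, o, l, p in tasks)
# Variances add (assuming tasks are independent); standard deviations don't.
total_var = sum(pert_estimate(o, l, p)[1] ** 2 for _, o, l, p in tasks)
# Quote roughly two standard deviations above the mean as the "safe" ETA.
quote = total_mean + 2 * total_var ** 0.5

print(f"mean {total_mean:.1f}h, quoted ETA {quote:.1f}h")
```

The point of the weighted mean is that the pessimistic tail pulls the estimate up without letting the worst case dominate, which is roughly what the "maximum I can get away with" instinct is doing by feel.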

Well sure, welders weld, and if you have a set product range you can pretty easily estimate that. With bits getting cut to shape/machined, tacked, and then welded together by a robot, you can run a batch through and then pretty accurately calculate that you’ll be able to make x number of units during an 8 hour shift, and even accounting for maintenance and giving leeway for supply problems, that you can have a certain amount of product out the door per month. That’s great, you’ve got a factory line…

The thing is that a lot of companies — including the one I work for — don’t operate in a field where we have a set product range. The work that makes money for us is the custom work. That means we’ll do a run of product where the shipped number can be just five units, heck, even a single unit. That means we get approached for work, have a few meetings to figure out if this is what they actually need and if we can even in theory produce it, and then we sign into a contract with an estimated delivery date for the first unit and the last one.

…And after that we start design/engineering on the unit that the customer needs. If we’re lucky we can modify an existing design to fit their needs, but usually it means we are going to do a completely unique batch of units. After we’ve finished our first prototype we have the roughest possible estimate for how long a production run of just five units will take. Sometimes between the first one and the last one we’ll streamline the process of making them enough to knock a quarter of the production time out.

So sure… we can estimate the time it takes to weld something but.. uhm? It doesn’t mean much. It’s just one variable we know about. And even welding with robots isn’t a variable we 100% know about the times for in advance, setting up jigs, re-orienting pieces, etc all takes time.

That’s just the welding. It doesn’t even get into cnc-plasma and water-cutting, cnc-machining of precision fit pieces, etc. Also a fair bit of by-hand GTAW welding for piping.

This isn’t a unique company, lots of companies do project work like this when it comes to working with metal, and at least where I work… there isn’t an expectation of doing lots of extra work on projects to get them out on time, even though there are very many unpredictable variables at play. Honestly, I have to think it’s due to it being the industry expectation in tech that ‘oh it’s so unpredictable thus x amount of time is crunch time to make up for unpredictability’ and not out of any realistic necessity — none of the unpredictability seems unique to writing software.

Games development runs into a weird trifecta of problems that have them differ wildly from welding though. Games generally have set times where they need to be released to have the most impact, but this needs to be decided far before that date. So, say, in late fall, you need to accurately decide if you can make next year’s Christmas season. That’s a long time to plan ahead, especially so for gaming where the technology radically changes each generation.

Then when you do get closer to the anticipated release date, you have to decide a few months ahead of time if you can actually make the deadline. Chances are it’s going to be a near thing anyway, as your company doesn’t want to spend too much time holding onto a finished product. They’ve been racking up months of interest, and they want to sell ASAP to cut down on that. Miss your date and you’re losing out on lots of sales, so even if you could do with a few more weeks at normal rates, you don’t want to wait until after. Better to do a bit more beforehand.

But mostly you’re looking at making predictions about how long it will take to do a project a year in advance, when you haven’t had the experience working with the new tech to make an accurate estimate. You’re always going to be off.

There’s a lot of time spent reinventing the wheel in the games industry, too, at least from what I can see. I don’t think I’ve seen another industry where “hey, we completely threw out all the tools you used to make that last product” is common between one production and the next.

Game development can be a very strange animal. I mean typically when making a movie, you don’t ask the director to build their own cameras and design their own film. I’m sure it helps that the projector is a known entity but so are consoles? I’m not sure where I’m going with this analogy. Maybe I should talk about cars…

At any rate it’s amazing how far games have come and depressing how far they still have to go. In some ways it’s very 1970’s with our own De Palma, Scorsese, Lucas and the death of cinema and in others it’s very modern with our remakes, special effects and homogenized actors and actresses.

There’s also the problems with going from a development environment to a production environment, or to a QA or from a QA environment.

Put simply – imagine if one of those welding robots you were planning on adjusting ran into a situation where the arm you were attaching worked fine in testing, but when you tried to get it into production, it all of a sudden wouldn’t move.

Someone manning that production tells the testing group that it’s not working, and when they go to test it in testing mode, it’s fine.

Now imagine this can happen for any robot, and the robots can upgrade at any time, so you might go from testing on one to find it works, then trying to get that robot to work, and all of a sudden it complains that it no longer supports that arm.

And this is just for the programming side – then you have stuff like the arm falling off when slicing at particular angles, which the product you’re building needs to be cut at, and only falls off after testing that it works at slow speeds.

So now you want to toggle that testing mode on when you’re doing that particular cut, but not when you’re doing the other cuts, that are supposed to be faster – in fact, some of the cuts don’t work in testing, but *do* work in production.

And then there’s the arm that only stops working if you try to add some other three components to the robot.

Disclaimer: I don’t have massive experience in large-scale game development (I’ve written a couple of small games, for my own enjoyment, though).

Having worked in multiple companies doing IT, I’d say that “maturity in project planning” probably is the deciding factor. I doubt some of the software systems I’ve seen built (and delivered on time) are any less complex than games (a real-time system, with millions of simultaneous users, composed of several hundreds of software systems, communicating using RPCs, written by tens of loosely cooperating development teams).

Some of the systems I’ve seen even manage to go from “doesn’t exist” to “launched” in 1-2 years (although they were not quite ready for “millions” from day one, more like “tens to low hundreds of thousands of simultaneous users”).

And, from what I understand, without much in the way of “crunch”. Maybe in the last 1-2 weeks before launch.

The game-specific problem is that they’re run by idiots who put crunch into the schedule to begin with. In properly-run software companies, they don’t do that and only add it when the project falls behind, which happens pretty often.

My first job out of college sounds a lot like what you do, though it involved considerably less custom welding than where I work now (most of the welds my Now-Employer makes are pipe-fitting welds, which require more expertise than a straight-up assembly line but less than designing custom welded parts from scratch).

My first job was as a design engineer working for an industrial instrumentation company. They assemble gauges and switches of different types (pressure, temperature, pH, RH, water/air flow etc.) to different degrees (some they actually cast and machine the parts, some they buy parts and assemble, some they just buy the gauge and private label it).

The company’s claim to fame is the customizability of their gauges. For their most popular product, they sell ~50,000/mo, and for their least popular they sell few enough for me to count on one hand in a quarter. The way it works is: customer suggests something to the sales department > sales evaluates it to see if it’s worth selling > sales sends it to the engineering manager, who assigns it to a design engineer (of which I was one) > the design engineer is responsible for releasing the product, enlisting expertise from certification, purchasing, drafting, and manufacturing departments as necessary. This process was used for brand-new products as well as for one-off “please use John Deere’s color scheme for the paint on the gauge face” requests.

Contrast that with my job now. I am a systems programmer for building automation–it is my job to program the brains in commercial and industrial HVAC systems. In your average office building–tens of thousands of square feet, not even talking high-rise office building here–there’s roughly fifty different micro-controllers that all need to function a certain way, coordinate with one another, and provide a user-friendly interface to maintenance personnel (who are generally NOT computer savvy) so the HVAC can keep the occupants comfortable and happy for a reasonable amount of cost.

Despite the fact that first-job was an absolutely terrible mess of inefficiency (manufacturing department swamped? Get in line with the rest of the dozen other projects that are backed up along with yours now because they’re also waiting for manufacturing approval…), I would still say that it was SO MUCH EASIER to make (and keep) a schedule for my first job.

What you have to realize about programming is that there is very little you can take for granted in either the tools you use or the way your program will be used later, yet everyone still expects it to work all the time. It’s a computer, right – how hard can it be to make all those ones and zeros behave consistently?

Let me give you an example of just how hard it is. Our company has a go-to VAV controller (a relatively simple controller that varies how much air a space gets) which we have used for five years now. Two months ago, these controllers started acting up. Sometimes, when the power went out, they would lock at reading 90°F (32.2°C) as the temperature for the space they were supposed to control, and then blast the occupants with cold air until a technician came in and cleared out the locked-in value.

After weeks of troubleshooting it was found that the problem was in the nightly backup the controller makes for itself. Every night, the controller takes all the current readings and setpoints in volatile RAM and flashes them to non-volatile RAM – unless it’s “busy,” in which case it skips the backup until the next night. The “busy” condition meant the controllers were going for weeks without a backup in some areas after we started them, and if they didn’t have a chance to back up, the controllers would intermittently revert to the manual override settings the startup technician used to trick the unit into cooling for testing purposes…

This bug took us three weeks to chase down and blew the schedule on five different projects, all because a manufacturer for a part we had used a hundred times made an update to a single chip. What’s more, the backup feature itself is not documented anywhere, nor was the change to the circuit board. I just got a call – at ten o’clock at night – because a system I had just installed went ape-shit after an electrical storm and nobody knew how or why it happened.
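The failure mode above can be sketched in a few lines of toy Python – everything here (the class, the field names, the “busy” flag) is a made-up stand-in for the real firmware, not the controller’s actual code:

```python
class VAVController:
    """Toy model of the backup bug: if the controller is 'busy' every
    night, the manual override used during startup testing never gets
    replaced in non-volatile RAM, so a power cycle restores stale data."""

    def __init__(self):
        self.ram = {"setpoint_f": 72, "override_f": None}
        self.nvram = dict(self.ram)  # factory defaults

    def apply_test_override(self, temp_f):
        # Startup tech tricks the unit into cooling for testing.
        self.ram["override_f"] = temp_f

    def clear_override(self):
        self.ram["override_f"] = None

    def nightly_backup(self, busy):
        # The bug: the backup is silently skipped whenever "busy".
        if not busy:
            self.nvram = dict(self.ram)

    def power_cycle(self):
        # After an outage, volatile RAM is restored from the last backup.
        self.ram = dict(self.nvram)


ctrl = VAVController()
ctrl.apply_test_override(90)        # startup testing
ctrl.nightly_backup(busy=False)     # override gets captured in NVRAM
ctrl.clear_override()               # tech clears it in RAM...
for _ in range(14):
    ctrl.nightly_backup(busy=True)  # ...but every backup is skipped
ctrl.power_cycle()                  # outage: the stale override comes back
print(ctrl.ram["override_f"])       # 90 -- and the space gets blasted
```

The nasty part is that the bug is invisible until a power cycle happens to line up with a long stretch of skipped backups, which is why it looked random in the field.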

By developer standards, the systems I deal with are tame. I know what manufacturer made the hardware I’m supposed to work with, complete with full documentation on how they designed their software. I know what the software will be used for, and that it won’t be used for anything else. The algorithms for this stuff are ancient in computer-science years–I’ve looked at legacy systems over twenty years old that use the same control loop I would use today.

Despite the relative consistency of the environment I work in, time-wasting shit like the VAV incident happens ALL THE TIME. On EVERY job. And when (not if) an unexpected bug crops up, I have no idea how long it will take to iron out the wrinkles – sometimes it’s a matter of hours, and sometimes it’s a matter of weeks.

See, I’m not going to sit down and program 50 VAV controllers from scratch, I’m going to buy 50 controllers that are pre-programmed with a sequence of operations that should nominally work. And by and large, the same goes for the other dozen sub-systems programmed by one of a dozen different manufacturers who all have their own idiosyncrasies; those systems will invariably break in new and stupid ways as time marches on.

Similarly, mass-market software will never be written completely from scratch – it will be cobbled together with bubble-gum and prayer using a mound of pre-written tools, because nobody has time to roll their own (e.g.) video driver from scratch. Not only that, but a developer can’t know with certainty what hardware their software will run on, what version of the operating system will load it, what other programs will be competing for hardware resources alongside it, or what doofy thing users will try that the dev didn’t account for. I’m frankly amazed any PC software ever gets released at all.

It’s like trying to weld when you can’t count on your material supplier providing the right alloy. This happened at my first job–Purchasing switched to a Chinese supplier that kept sending recycled steel for our parts instead of the alloy we specified, and we had to institute a 100% testing protocol because we needed to be able to count on that steel having the right properties. It was a mess that cost the company millions by the time it was sorted out, because when you write a spec, you expect the material you receive to follow the spec. Programming is never being able to count on the spec, but a thousand times worse because there are a thousand times more specs.

If it’s something that’s as broad as a job description, and not a single team (or person, if you’re really in a bad place), I wouldn’t call it specialized. The situation described before is some small number of people who are not easily replaced by somebody else already in the company.

In many companies there are a very small number of people who really, deeply know each particular subsystem, and simply know how to use (most of) the other subsystems.

If the problem is within a subsystem, it is usually faster and more reliable to wait for one of the team who really knows that particular one to spend some time on it.

Sure, I could take apart the Library Of Doom looked after by another team to find out why it doesn’t quite comply with the spec under particular circumstances. But it’ll take me months simply to learn how it works – during which I’m not doing anything else.

Or I could ask the team who look after the Library Of Doom to fix the problem, wait a few weeks while they do other things, and then get the fix in a few days once they start on it.

During that wait, I can usually do other things – perhaps fixes for my very own Library Of Doom used by other teams.

It seems that the problem is not so much that crunch can be used to finish off a project in case of overrun. The problem is that crunch has become the de facto way to end EVERY project. It is as if, when planning how long making a game will take, they go: n years and m weeks of crunch. And then when the project inevitably slides back, they extend the crunch even more.

From what I can see, crunch should be a thing of last resort, used in case of an unbelievable opportunity (moving your game’s release date ahead of your rival’s in order to undercut his sales) or to finally get a troublesome project out the door.

In addition to salary, hiring has substantial costs. Training time, benefits, taxes, workstations for each employee, etc. all add up to a large cost. I occasionally see some really rough estimates that salary is only about half the total annual cost to the employer for each employee.

If you can just hire one person to work an 80-hour week instead of 2 people working 40-hour weeks, that’s a massive savings. More so if you can get people to work 80-hour weeks for 40 hours’ worth of pay (or whatever the normal terrible amount of crunch time is) and say it’s an industry standard as an excuse.
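To make the incentive concrete, here’s the back-of-the-envelope math with made-up numbers (the salary figure and the exact overhead multiplier are illustrative, not real industry data):

```python
# Rough sketch of the hiring math. Rule of thumb from above:
# overhead roughly doubles the cost of each hire.
OVERHEAD_MULTIPLIER = 2.0  # salary is ~half of total annual cost

def annual_cost(salary, hires):
    """Total yearly cost to the employer for `hires` employees."""
    return salary * OVERHEAD_MULTIPLIER * hires

one_crunched = annual_cost(60_000, hires=1)  # one dev, 80-hour weeks
two_rested = annual_cost(60_000, hires=2)    # two devs, 40-hour weeks

print(one_crunched)  # 120000.0
print(two_rested)    # 240000.0 -- same labor-hours, double the cost
```

Under those (hypothetical) numbers, one crunched salaried employee delivers the same nominal labor-hours for half the money, which is exactly the temptation being described.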

On the other hand, if you need to train a new employee every six months because the old one is burned out, the whole training cost comes back, and you lose the chance of keeping people who could form the core of your staff for years to come.

On the gripping hand, employing a parade of fresh graduates is probably cheaper than having seasoned veterans who would probably want a raise.

And at the end of the day, I don’t know about you, but I have no numbers of any sort to actually try to judge these decisions. I do think that constant crunch mode is a very short-sighted strategy by the higher-ups, but this is not the opinion of someone who has studied the topic or, pun not intended, crunched the numbers.

Throwing more and more staff at the problem reaches some very sharp diminishing returns once you’ve reached the point where you just can’t divide the work up that neatly anymore.

And yes, most of them are salaried, so it’s cheaper to make one person work longer than to pay two people (people’s needs don’t really scale with the amount of time they work, so it’s easier to find someone willing to work 80 hours a week for 40,000 a year than two people willing to work 40 hours a week for 20,000 a year).

Spending more overall money on salaries isn’t really an option, since the companies that have this problem are already struggling to make a profit off a game that sells ten million copies.

If the labor supply did dry up for them, we’d see a major change in the AAA games coming out. They’d need to get far more value for their development time, which would mean a shift away from some of the more extravagant stuff AAA games spend their money on now. Procedural content would probably come more into style. EA and Ubisoft may or may not survive the transition.

The cost of hiring another worker is not just the wage – all workers come with massive overhead that usually far eclipses what you are actually paying them: their insurance, your insurance, the room and equipment they need, etc. It’s usually not feasible to hire more workers compared to just paying the people you’ve got overtime or, if overtime isn’t even a thing with the way they’re getting paid, just forcing fewer people to work more. I think video game jobs are usually salaried? So, yeah. But even if overtime pay were on the table, it would still be more attractive than hiring more guys.

Even going beyond pay, as has been mentioned, more people is usually horribly cost-ineffective in software development. The best you can do with more people is not “more work” but rather “more accurate work”, if instead of one guy at each terminal you’ve got two. One guy typing with the other guy there to spot-check for errors/bounce ideas off of/take over the keyboard when needed. This leads to slower releases and costs you more to get there, but in exchange you’ll have generally more stable software and it will reduce long-term maintenance costs. With something like video games I don’t think many people care as fast releases matter far more than reducing maintenance–video games are not software your clients will be using for years or that your company will be iterating for years.

Software development is complicated, in terms of accurately estimating the time needed to complete a project and, even worse, scheduling it out. If you’ve ever done any long-term construction projects it’s kind of like that only worse–you can’t do this work until the local bureaucracy clears you for it so that’s an unknown, you can’t do the next work until the work you’re getting cleared for is finished and that work preferably needs to be done before this other thing so…on and on, if you hit a snag at one point everything will get backed up. That’s just how large projects with many moving parts always, always work out.

With video games, as I understand the process, it’s all way worse because unlike most software development or construction projects there are no contracts to point to for each stage, making changes of plan and feature creep way more common than they should be. In software development or construction if a client wants something else mid-way through you simply point to your original contracts and tell them they’ll need to draw up new and separate ones, which will cost them, if they want what they want done. In video game development your “client” is within your company and there are no contracts that I’ve ever been aware of so they can sort of just have you do whatever without necessarily changing your allotted time to finish or your pay.

Even worse, the construction industry has thoroughly-proven ideas and blueprints about exactly what constitutes a structurally sound building, and how you can make one. You know before you break ground if your building is going to do what it’s supposed to, and only a series of catastrophic mistakes will mess that up. With games, you have to jump through all the project management hoops, and then you can realize two years in “Crap, the game’s not fun”. The ill-understood nature of what makes entertainment good adds a whole other dimension to scheduling.

I wouldn’t say that the mere fact of having any crunch at all in the first place is the result of incompetence. They probably willingly plan for crunch as a way to cut the deadlines shorter. They don’t need crunch, they just want it, expect it, and plan for it from the beginning.
At that point, the difference between a time-management success and a failure stops being that the good one has no crunch at all, and starts being that the good one has less crunch.
I wouldn’t call it a 100% failure, more like 60% failure and 40% being misguided by the industry standards. They are moderately competent at executing a bad idea.

What I’m really curious about is, is crunch mode actually useful in those dire circumstances, or is it actually counterproductive? Does the gaming industry, or the greater IT industry actually have the data backing up the claim that crunch mode is as useful or necessary as they seem to think it is?

This is an industry veteran who – even if I disagree with him on a few points – has a lot of experience and knows what he’s talking about when it comes to running a business and developing software.

The same thing can be said about a bunch of people, however. Bobby Kotick and John Riccitiello, for example. So I wouldn’t say that someone being an industry veteran means they automatically know their stuff well.

Besides, I’m pretty sure that in the days of the industrial revolution plenty of people were supporting nearly slave-like working conditions in factories that have since been proven to be incredibly counterproductive.

Unfortunately, I don’t have good links to peer-reviewed studies, but my recollection says that ~30 hours per week is optimal for “quality of intellectual work performed per hour”; you’re still making net gains at 40 hours per week, but once you hit 60-80 hours per week you’re on the verge of destroying quality, and much beyond that you’re probably both killing people and making negative progress.

Thank you. I read that article yesterday and was utterly disgusted. I didn’t have the time to reply, so I really needed somebody else to write my thoughts for me. Well, my thoughts had way more swearwords and insults, which is another reason why reading yours was better. :P

I know you made the reference to doctors being able to do it for a few years, but even the medical training industry has started to recognize the endless shifts of residency as harmful and is starting to roll them back a bit.

People that had to put up with crap get very invested in NOT changing that cycle of crap. It’s like, if things get better, then maybe all they went through was a waste and valueless, and they tolerated it for no good reason except not knowing they didn’t have to.

Actually, a lot of hospitals are trying to figure out how best to handle resident hours. One perspective says that residents are more prone to making mistakes when they work such long hours. The other perspective says that more mistakes are made when patient care changes hands, i.e. when doctors change shifts and continuity of care is disrupted, there is the greatest opportunity for error.

So, some hospitals are running clinical trials looking at both systems to see which results in a greater number of clinical errors. I’ll note that the hospitals couldn’t give a wet fart about residents’ quality of life.

Maybe we could accept this if we assume that “physically” means just the muscles. Maybe. If we disregard the fact that our brain consumes most of our body’s energy. But let’s say that he is correct. However, just because something isn’t physically demanding doesn’t mean it’s not physically taxing. Sitting for long periods of time does great damage to your muscles and bones, especially in your back, and unlike a plethora of other office jobs, working on software doesn’t require you to get up and “take that one file you need” from your coworkers. Then there is the irregular food and fluid intake that happens when you are concentrating on work, and prolonging those periods can lead to all sorts of horrible stuff for your body. And of course let’s not forget stress and lack of sleep and the influence they have on your whole body, especially your heart and blood vessels.

So yeah, even if we assume that that statement is not wrong, it still is wrong.

You’re clearly forgetting that the only real kind of physical work is the kind where you have to go out into the wild and hunt for your meals, RSI is not a real physical problem, and damage to your lower back and knees from sitting in cheap crunch-mode office environments for 14+ hours a day isn’t real damage to your body.

/Sarcasm (aimed at Mr St. John, not you Daemian)

It wouldn’t be a surprise to find that people taking less physically intensive roles are less physically strong (generally – obviously this is not true for everyone). As a result, it should also not be a surprise that they are more likely to suffer physical strain from less physically demanding work!

Alex St. John sounds like the worst kind of callous idiot, one that believes he is correct in absolutely everything because of (FILL IN LIFE STORY OR CONJECTURAL “EVIDENCE” HERE).

But like I said in the article, arguing about this is a sucker’s game. You can argue it’s bad for the body, but is it worse for your body than being a dockworker? Cab driver? Fry cook? Sanitation worker? Nurse? All of our jobs are killing us all the time, because entropy is a son of a bitch.

That’s great if you’ve got all the time in the world to haggle over wikipedia articles and actuary tables about vocational hazards with the St. Johns of the world, but that doesn’t look like the most productive angle to attack this line of thinking. And personally, if you asked me if I’d rather work at a desk for 40k a year or empty garbage bins into the garbage truck for 50k a year, I’d take the desk job every time.

But beyond the physical aspect, sidestepping entirely the fact that a serious desk job like programming is mentally exhausting is a huge gap in the argument.

Alex St John called it “pushing a mouse around”! I’d never seen that before. This is not factually describing the job, it is going out of his way to demean the guys.

[I guess what I’m trying to say is: beyond arguing about the logic of his argument, there is ample room here to argue that his article is clearly a hit piece more than a pure opinion piece. He has an agenda, and his piece is very emotionally charged, whether he is writing from that place or trying to get a rise out of people.]

I think the occasional bit of reductionism like this can put things in perspective. These jobs are cushy by comparison. That’s why I pushed until I could get into a modest front-end developer’s position. I’d rather do that than anything I did while working for pizza franchises, even if being on my feet helped keep my weight down.

Or even other desk work. Ever try being a collector or telemarketer? I can tell you from personal experience that both of those jobs are worse from a mental health standpoint. I’d rather work 80 hours a week as a game dev as opposed to 40 as a collector or telemarketer.

I’ve had monotonous physical work (manufacturing), I’ve had monotonous desk work (data entry), and I have had non-monotonous work. From my completely anecdotal, non-scientific observation, the most unhealthy aspect of any of these jobs is the repetition–sitting at my desk doing nothing but typing and reading was terrible for me when I was doing data entry, and it was not good for me to stand in the same spot performing the same task on 500 car parts a day.

I say this even though my current job often requires me to twist myself into some REALLY uncomfortable positions (you’d be surprised at some of the inconvenient places they cram the computerized controls for the HVAC). However, that strain doesn’t come every single day, so overall I’m better off even if I have the occasional backache.

Sure, it may not be worse, but that doesn’t make it substantially better.

Also, there’s the type of person you are. I too would pick a desk job for less pay, like you, but I know plenty of people who would turn down a better-paying desk job simply because they can’t force themselves to sit for so long. So rarely can it be said that a job is objectively better.

But isn’t that Daemian’s point, Shamus? It’s completely a false dichotomy. Every type of work is demanding so the refutation of A is that there is no such thing as a truly easy, non-tiredness producing job. Not that office jobs are not physically demanding.

His whole argument stands upon quicksand. I think that can and should be called out when having these discussions.

I could argue the comparison by saying “show me another job where repeated unpaid overtime is normal”.

Our accounting department at work crunches at period end and especially year end. It’s all given back in time off and overtime on the other side. Facilities will have someone in at 4am to snowplow the parking lot, but that someone will either be on their way home shortly after lunch or will get days off. Is there an industry beyond video games/tech where the idea of “oh, I’m just gonna boost your hours for no extra pay” wouldn’t get laughed out the door?

Only if you’re a manager. Everybody else typically works for a wage and gets overtime. These days, in fact, a lot of fast food places try to keep their employees at 30 hours or less to avoid health insurance requirements (worse, last I checked, if an employee gets two jobs, their combined hours count towards this requirement, which makes it harder for willing workers to get the hours they need).

In the U.S., employees have two classifications: hourly/salaried (salaried employees make the same amount of money no matter how many hours they work–it’s interesting to note that this is technically true even if you don’t put in 40 hours), and exempt/non-exempt (“exempt” means “the employer is exempt from having to pay extra for overtime”)

Salaried, by definition, means you are exempt. Hourly employees can be either exempt or non-exempt (hourly exempt means you get paid for extra hours, but not at time-and-a-half). There are regulations on when and whether employers are required to classify an employee as “non-exempt,” which is broadly based on the nature of the work the employee does.

The spirit of the law is that white-collar workers like managers and engineers should be exempt, because they need to be there when the shit hits the fan and they’re damned expensive if all that extra time is overtime, while blue-collar workers don’t make enough money for employer to demand all their free time without paying them for it.

The actuality of the law is that companies are getting better and better at finding ways to classify an employee as “white-collar,” even when common sense says that they’re not. For example, when I worked at Walmart, the floor managers got paid less than the employees they managed – the promotion carried what amounted to a few-dollars-an-hour raise if they worked 40 hours, but they usually had to work 50 or 60 a week on average and they were salaried. For another example, I’m pretty sure there are quite a few publishing studios that get away with classifying testers as salaried/exempt even if they pay minimum wage.

The law is more complicated than that, and not all salaried workers are overtime-exempt – but it’s not important for the purposes of this conversation because one of the exempt categories is “computer programmers”, defined broadly enough that anyone who’d be in crunch time in computer game development counts.

I know it’s more complicated than that, but for the purpose of explaining how overtime works to people not from the US, I thought it was good enough.

Salary with overtime is a thing that officially exists? Is it just a law on the books, or do people actually get hired this way sometimes?

One of the places I worked I was salaried, but I would get comp time if I went over 40 in a week (not on time-and-a-half) because the owner was heavily dependent on one particular employee who demanded it and held the accounts for half the customers (three-quarters, if you go by how much they paid). That was all under-the-table, though.

Salary with overtime does officially, legally exist. There’s no law against paying your non-exempt employees on a salary basis, but you have to track their hours to make sure they’re paid minimum wage each week and overtime for anything over 40 hours a week. That’s one reason salary with overtime is uncommon–if the employer has to track hours worked anyway, and has to pay overtime anyway, they’ll probably end up spending less money if they pay on an hourly basis.
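A toy sketch of that mechanism, with illustrative numbers (not legal advice – real FLSA calculations of the “regular rate” have more wrinkles than treating the salary as covering exactly 40 hours):

```python
def weekly_pay_nonexempt_salary(weekly_salary, hours_worked):
    """Illustrative pay for a non-exempt employee paid a weekly salary:
    the salary is assumed to cover 40 hours, and anything beyond 40 is
    owed at time-and-a-half of the implied hourly rate."""
    regular_rate = weekly_salary / 40
    overtime_hours = max(0, hours_worked - 40)
    return weekly_salary + overtime_hours * regular_rate * 1.5

print(weekly_pay_nonexempt_salary(800, 40))  # 800.0
print(weekly_pay_nonexempt_salary(800, 50))  # 800 + 10 * 20 * 1.5 = 1100.0
```

Which shows the point above: the employer ends up doing hourly-style bookkeeping anyway, so paying a straight hourly rate is usually simpler and cheaper.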

That’s how it was for me, and was for my younger brother before he became a chef.

But where I lived, you could usually get teens to pad out the rush and then send them home. They still lived at home and just wanted money for the mall or whatever. Easy way to cover demand without having lots of full timers on payroll.

Any salaried role where a monthly salary is paid instead of an hourly rate (at least in the UK). I work a fairly standard 37.5 hour week with an expectation that I will work more to fit the business need as and when required.

Fortunately, it’s not possible for an employer to really mistreat workers in Europe due to the European Working Time Directive, which protects workers from being made to work what are deemed to be unfair/unhealthy hours. The biggest highlight would be that over any given 13-week period, your weekly average hours worked cannot be more than 48.

Of course, an employee can always waive their rights under the WTD and work whatever horrible hours their employer deems necessary. Such a waiver seems to be commonplace in many consulting firms over here, though it can never be made mandatory (outside of emergency services and armed forces).

It’s perfectly legal, because the unpaid overtime is technically voluntary. So long as the employees don’t mind being first on the chopping block come layoff time and last on the list for raises and promotions, they can work 40 hour weeks.

Like I said, technically voluntary. It’s not a firing offense, but when the company reduces its staff for other reasons (which happens pretty regularly; they don’t need the full team during the initial planning phase of the next game) they’ll start with the people they can’t convince to work overtime.

Regarding legality of unpaid overtime in the US: if the employees are exempt from overtime under the Fair Labor Standards Act (and I would have guessed most programmers are, but I’m surprised to hear all this talk of contracts, so my guess might not mean much), then the concept of overtime doesn’t apply to them. They’re paid the same amount regardless of how much time they work.

If the employees are non-exempt (and therefore paid by the hour), they cannot legally waive their right to overtime pay, no matter how much they want to. They can of course choose not to pursue legal action over it.

And you can be fired for almost anything–just not membership in a protected class (gender, race, a few others) or in retaliation for legally protected activity (whistleblowing is one example). An exempt employee can be fired for refusing to work more than the 40 hours/week that’s usually considered standard. A non-exempt employee can be fired for refusing to work properly paid overtime. I’m not really clear on whether it’s legal to fire a non-exempt employee for refusing to work unpaid overtime, but the employer can always say they were fired for a different reason and it will be hard and expensive to convince a court otherwise.

My guess* would be that it’s the combination of these being basically kids for whom this is their first “real” job, their contracts having clauses that make them easy to fire, and management never putting on paper that overtime is mandatory. I guess US labor law allows workers to voluntarily work unpaid overtime, so while management says they will be open longer from now on, it never puts in writing that these work conditions are mandatory. And so you work longer hours because EVERYBODY else is working longer hours. And if you start complaining and try to actually work only 8 hours, (a) your work is likely to be passed on to your colleagues, who will not look kindly on you not toiling with them, and (b) you will then be fired/laid off (is there a difference?) for some unrelated reason, like not completing your tasks in time or something. And these are young 20-somethings who likely don’t even know their rights.
And on top of it all, there are more junior devs than there are open positions, so you stay because you are rightly afraid that if you make waves and get fired you will need to do the whole stressful job-hunting thing all over again, only now you are older and have a firing on your record.
On top of it all, courts cost money that these young people most probably don’t have, and because of the above the cases would be quite shaky.

*a guess of somebody not from US so completely unaware of how their contracts actually work, so take with a large grain of salt.

Interestingly, our work laws, passed during the “good old” communist/socialist days, state that the work week is 5 work days of 8 hours for a total of 40 hours (with some SPECIAL! exceptions). Overtime is only allowed in cases “of unpredictable increases of demand in case of NATURAL DISASTER or STATE of EMERGENCY” and only with “written permission of the Ministry of Work”. Unfortunately this is largely unenforced, and in the private sector you are quite likely to be working 6 days of 8+ hours. On the other hand, since the overtime is illegal, these hours are unlikely to be compensated even if the employer were willing.

“Fired” implies incompetence or misconduct on the part of the employee, while “laid off” means the company wants to cut expenses and is no longer able or willing to pay someone to do their existing job.

If you are salaried in the US, your employer can demand you work overtime without compensation. There’s nothing illegal going on, no rights on the books being violated; that’s just how it works in the US.

The only way anyone could bring something remotely resembling a lawsuit is if they contested the company’s decision to classify them as salaried. Unfortunately, the labor laws haven’t really kept up with the times here, so unless you’re in a union or you’re a manual laborer at the very bottom rung of the corporate ladder, the language is vague enough to classify you as salaried.

Employees can quit any time. I have. It’s not always easy or pleasant, but you just keep looking till you find something that suits you. Some places get reputations for treating their employees poorly, and some get reputations for treating their employees well. You just have to ask around. There are good places to work.

Maybe it’s that way in really competitive fields where you’re shooting for a big, lofty goal. I chose comfort and reasonable hours and was willing to settle for middling pay. But at least we get that choice.

If I may ask, what was your particular labor market like when you quit?

Something I’ve noticed a lot of people–especially older people–tend to do is project the socio-economic environment they grew up in onto the socio-economic environment now, and it usually leads to something like “all you need to do is work hard and you’ll advance through the ranks like I did, even if you start on an assembly line” or “just quit your job and find something better like I did.”*

I have quit my job for greener pastures like you did. I have also been stuck without a job and without any prospects for a job even though I have two degrees in Engineering. The difference between the two is that it was 2009 when I had no job, and it was 2011 when I left for another.

It’s easy to say you will just quit if you don’t like where you’re working, as long as you ignore the context of the wider marketplace.

*I think St. John is particularly guilty of this sin–when he began his career, computers were new and their intricacies much simpler, so employees weren’t expected to have degrees or extensive experience with them. As far as he’s concerned, all it takes is a willingness to learn and a can-do attitude, and anyone should be able to land a job at a competitive tech company. That’s not how it works today.

Unless you guys have changed it, overtime is not illegal, but it is heavily regulated. If you work even half an hour of overtime, you have to be paid at least 4 hours of overtime. Overtime hours are worth 50% more. And you can’t have your overtime spill… this I forget, but I think you can’t have more than double shifts.

What you are talking about is mandatory overtime, when the employer can simply tell the workers to tough it out, instead of asking them (not so) politely to do it.

“even half an hour of overtime, you have to be paid at least 4 hours of overtime”

Where’s that? In my state there’s no minimum overtime payment; if you work one hour of overtime you only get one hour of overtime pay (though it is time and a half). The company I’m working for tracks stuff down to the minute too, though I think paychecks might be rounded up to 10 or 15 minute increments.
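For illustration only (not legal advice), the time-and-a-half and rounding rules being described might look like this in code. The 15-minute rounding increment, the $20 rate, and the 40-hour overtime threshold are assumptions for the example, not anything a specific law mandates:

```python
# Hypothetical sketch of time-and-a-half overtime pay with worked time
# rounded up to a pay increment. All numbers here are example values.
import math

def weekly_pay(minutes_worked: int, hourly_rate: float,
               round_to_minutes: int = 15) -> float:
    # Round total worked time up to the nearest increment (e.g. 15 min).
    rounded = math.ceil(minutes_worked / round_to_minutes) * round_to_minutes
    hours = rounded / 60
    regular = min(hours, 40)          # hours paid at the base rate
    overtime = max(hours - 40, 0)     # hours past 40 in the week
    # Overtime is paid at 1.5x the regular rate ("time and a half").
    return regular * hourly_rate + overtime * hourly_rate * 1.5

# 41 hours 7 minutes at $20/hour rounds up to 41.25 hours:
# 40 regular hours plus 1.25 overtime hours.
print(weekly_pay(41 * 60 + 7, 20.0))  # -> 837.5
```

With a 10-minute increment instead, the same week would round to 41 hours 10 minutes and pay slightly less, which is why the rounding rule matters to hourly workers.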

Like 4th Dimension said, the old communist Yugoslavia installed some pretty good laws when it comes to protecting workers. And now that the country has broken up, every politician is trying to remove them. They are doing a pretty good job of it (especially with retirement), but some protections still remain.

The problem is not so much that politicians are trying hard to remove the old limits, but that the old limits are not being respected, and employers de facto do whatever they bloody want. Six-day work weeks with full work hours and nearly mandatory meetings AFTERWARDS are not uncommon, and I was once offered work in such conditions.

So as much as it galls me to admit it, in a way the politicos are right*. The only sensible thing when a law is being violated by the majority of the economy is to legalize at least some of it, in order to provide some protection to the workers, and pursue the worst offenders afterwards. *That is, IF they are doing this for that very reason.

One interesting article that was deleted from the law? The one allowing work weeks and hours to be shortened in cases where modernization reduces the workload.

The thing is, those that are not abiding by said law don’t abide by a bunch of other laws either. Namely, they don’t register their workers, so that they can avoid paying social care for them. Which, like you’ve said, is mostly done in the private sector. Though in the public sector there are still those that don’t know their rights and are heavily exploited for it. The ones that know their rights, especially if they are in a union, will slam their employers hard if they even attempt to disobey a law.

Let me check the LAW. Uhhh, yeah you are correct, what I was referring to was mandatory overtime. I did not find any references to how it’s compensated though.
The way the LAW is written it never considers the possibility of basically voluntary overtime.
Article 49 introduces the term and sets down a maximum of 10 weekly hours overtime. The employer has to notify the worker in time of their decision to institute overtime in writing. The employer can do it orally but has to supply the employee with a written instruction to work overtime within 5 days of overtime stopping.
Article 50 states that the worker has to work overtime only in specific cases (floods, earthquakes, etc., AND other causes set out by the collective contract*).
Article 51 deals with doctors and nurses and how they have to work overtime if it’s the only way to support the necessary level of medical attention.
Article 52 states that the employer has to notify the work inspector of their decision to institute overtime within three days of making such a decision. The inspector WILL FORBID (not may) overtime in such cases where it conflicts with articles 49-51.

* A sort of contract that is a basis for contracts of workers in a specific field, so most of them have the same rights and duties and only person specific things change in specific contracts. I think the union is supposed to negotiate this. I don’t know what happens in the private sector with this though.

So you are right, nowhere does it state that a worker cannot decide of his own free will to work overtime, but it’s kind of implied that any decision by the business to institute overtime outside these rules is illegal. And for all intents and purposes what game development companies are demanding is mandatory overtime, and in many cases it involves more than 10 hours of overtime. So it’s still applicable.

I see. The tone of what he was saying sounded like “you’re talking out of your ass, this is what the law REALLY says”, but it didn’t sound remotely close to the way I understand labor law working in the US.

I mean, there are debates happening in my state congress right now over whether it’s fair to employers to demand they give more than a couple hours’ notice when telling an employee they have to show up for work that day. I’m positive there’s no law here requiring employers to give three days’ notice before instituting overtime.

I have known many people in the Health Service (doctors, nurses, support staff, etc.) who have worked (and in some cases still work) in excess of their paid hours. It’s not exactly expected of them, but many feel an obligation towards the patients to continue providing the service even when they’ve already had a hard day’s work. In some cases this is unavoidable, as it can be difficult to plan ahead for certain situations, and in other cases the time is adequately made up (either in overtime pay or time off in lieu), but some sectors are particularly bad for being overstretched on a regular basis.

I keep hearing from people “you like games, you can program, why don’t you get a job as a game programmer? They’re constantly on the lookout for new people it seems”.

Well, hearing all the stories about eternal crunch, and then hearing people like Alex St. John defending the practice, even sort of extolling the virtues of it, made me realize I’ll be much happier doing a “regular” programming job, having time to play games and have a family, and maybe do a bit of indie in the weekends.

There are enough jobs out there for me where the pay is OK, the hours are good, and my personal life doesn’t come second to work life, that I don’t need to consider such a degrading work environment. Which means I won’t.

It is too bad for the studios that do the right thing, though; I guess they’re missing out on a lot of people who made the same decision (either before they even tried, or after they burned out on game development). These are the kind of people who really should be railing against this type of article, since they are the ones who lose the most because of these words. The people affected by crunch should somehow act (I’m not exactly sure how, though: quit? unionize?).

Shamus mentioned the oversupply of labour. If the company can afford to let go all the unionized staff and replace them, then a union would have no leverage over the employer. Size and capital count here too. Corporations like Walmart are big enough to shut down entire stores rather than let staff unionize.

When I worked at Target, there was a big push to inform employees of the dangers of unionizing. This always sounded sketchy to me.

On a lighter note, when I explained that after I clocked out at Target (after closing time) I had to wait 25+ minutes for the manager to unlock the door to let me leave, my engineer cousin responded: “I would’ve been calling the cops, telling them I was being held against my will.” I joked and told her it was good she didn’t work retail, but really we can be too complacent in what we’ll put up with from our employers. Sure, calling the cops wouldn’t have been a great idea and would have gotten me fired, but with eight or more employees losing time waiting to be let out of the door, there were hours of unpaid time being lost every year.
We need to be better at standing up for our time, our health, and our sanity – in every industry.

Industries that need good programmers to do boring-ass shit are a dime a dozen. If you can program and happen to have other skills as well, then the world’s your oyster. It won’t sound as sexy, but gives you more time to enjoy things.

I think most programmers who like games would do well to keep in mind that the best way to enjoy them is to have a job that leaves you a lot of free time to play them. (Or, if you’re like me, to obsessively read about them, write about them, and watch other people play them.)

This. As an experienced programmer who mostly does web development and some utility work, I can basically go wherever I want. The company I work for hasn’t *stopped* trying to hire people in the last few years. On the other hand, the game development side of our company hasn’t needed to hire people in quite a few months. Even the “boring” side of things can have some really interesting problems to solve, though!

Thankfully, we aren’t a AAA studio on the games team side, so the hours are reasonable and they either pay *all* overtime or give equivalent time off plus gifts of the employee’s choosing.

It’s not just that people avoid game dev due to crunch, it’s that if you have the skills to program games, you probably have the skills to make 1.5-2x more money administering databases or making project management software or something else boring (source: I’m a programmer). Unless you went to one of those game dev colleges, but those are a trap.

The game industry pays terribly for the same reason it treats people terribly: it knows there’s an endless stream of eager kids who love games and have therefore made it their life’s ambition to work on games, whether or not that’s the best career decision to make. I don’t think we’ll significantly improve working conditions until we address that underlying supply and demand issue, and I can’t see that happening any time soon, because that requires changing some really fundamental human nature.

A perspective that I sorely miss in this whole discussion is the health perspective. It is pretty well established that one of the biggest factors when it comes to wellness is stress. Stress can cause not only mental but also physical illness and affliction. Here in Sweden, the national center for work and environmental medicine has established that stress and mental strain are the most common causes of work-related impairment for women, and the second most common for men. Now, obviously, it could be different in the US, but I would be surprised if stress-related diseases are not a large socio-economic problem there as well. (Western countries are usually more similar than we think.)

So when an industry systematically forces stressful conditions on its workforce, it is a big deal. This is not only a question about being an “attractive employee”. This is about working conditions that shorten people’s lives and increase the risk that employees suffer from, for example, heart disease, chronic pain, and mental health problems.

By the way, correct me if I’m wrong, but aren’t most people doing these crunches not paid overtime? Because if they are, you could argue that they are doing it for the extra money. But if they aren’t, that’s an extra nail in the coffin for support of the practice.

It might vary from place to place, but I think the vast majority of developers are on fixed salaries and don’t get paid for overtime. The only exception I knew of personally was the QA team who got overtime, but that was because they were on minimum wage and it would be illegal to get them to do unpaid overtime.

It is galling to be told by the guy who profits from your work that you need to do unpaid overtime to show your commitment to the project and to make sure it gets finished on time. Yeah, I need to work for free to increase *your* profits when you already pay yourself ten times what I earn.

What made it more fun was the way they kept tabs on trivial stuff like being ten minutes late back from lunch (after having started lunch 15 minutes late, so not even taking the full hour). They could never use it as a reason to fire someone (in the UK, you have to have a good reason), but they could certainly pretend it mattered and use it as leverage to try and force more unpaid overtime out of people, if the emotional blackmail of “you’re letting the team down” didn’t work. And they applied such leverage selectively against the people most resistant to doing overtime.

Yeah, I think something missing from the discussion is how there’s a BIG difference between choosing to put in the extra overtime, and being routinely obligated to put in overtime without any control over your own schedule or say in when you have to crunch.

I work programming building automation systems, and I do get “crunch” sometimes. Emergency calls for when million-dollar machines the size of houses take a dump, construction jobs with a deadline fast approaching when we’re the last ones who get to work on them, and of course system-wide upgrades that need to occur during off-time hours…it’s just part of the job.

However, I have no problem with it because A) I largely get to make my own schedule, so I can prepare myself for the “this will take you until three in the morning” stints, and B) if I feel like coming in late or working from home because I had a long night of troubleshooting, I can.

My job demands that I go the extra mile at times, but I choose to exceed reasonable expectations rather than having the choice forced on me, so I put up with it even though I’m not compensated. If I had a boss who demanded I work late nights on their whims, who gave me no autonomy in my hourly output, and didn’t pay me for it, I would tell them to shove off while I find a job somewhere else. I have told such a boss to shove off while I find work somewhere else.

Which, incidentally, is why I’m not in game development, even though I think I’d be good at it. And for the St. Johns of the world: entrepreneurship is for people who weren’t born with chronic health conditions that carry high monthly expenses; just because I don’t roll up my sleeves and foray out on my own doesn’t mean I’m a layabout.

Boy, that sucks. I had to do a long crunch once. I worked for two months without a single day off (just a bit over 60 days), and some of those days I worked 12-16 hours. And even though a lot of it involved business lunches, and only a few of those days involved me being glued to the chair the whole time, it was still extremely exhausting. I got hefty overtime for it (I basically got the equivalent of 5 regular salaries for that period), but I still would not want to repeat it. But if I had to pick between doing that regularly for no extra pay or starving, I think I would go with starving.

Also, that title is incredibly ironic. You would expect that an article titled “Game developers must avoid the ‘wage-slave’ attitude” would be against a practice that is turning young people into literal wage-slaves.

It is hard to decide whether Alex St. John is blinded by his own massive self-regard, or if he is purely a cynical hypocrite. Is he that naive? Or is he trying to perpetuate what is beginning to sound more and more like a Ponzi scheme, with him at the top reaping the benefits at the cost of crushing the spirit of thousands? And could those thousands please have their spirit crushed in silence?

The hypocrisy is felt the hardest when he implies that the true way to go is to be indie, i.e., entrepreneurial. But if everyone had gone indie 20 years ago, where would his own career be? Like he didn’t have some wage slaves to prop up his work when he was at Microsoft?

Either way, his arguments are so easy to punch holes into it’s more sad than satisfying. But his article made me angry and I feel the need to jot down my comments about it before my head explodes.

Me, me, me, me, me. The first paragraph really lets you know just how special Alex St. John is. And you know what? It’s not untrue. He has reasons to feel accomplished. But to suggest that everyone should emulate him is massive hypocrisy: it is not possible for everyone to be that successful. But peons in the industry deserve a decent life even if they are just peons.

Game dev is not physically hard. True, but it is mentally draining. It is a different kind of exhaustion, but it is exhaustion nonetheless. And it takes a toll on your well-being, including physically, and it also takes a toll on the quality of your work.

Just go indie already! Again, it is hard to believe that this argument is made in good faith. Indies don’t have the inside scoop on making their games successful. (Hint: no one can guarantee a game’s success. At any scale. Alex St John must know it.)

Be more passionate.
It is very difficult to continue to feel passionate about your chosen career when you are treated like a cog in a machine that will casually crush your life. Large projects like AAA games have an intrinsic problem with making all participants feel involved. Mistreating said participants won’t help.

Who needs a life? Many people. Don’t project your pathological values onto others. This is not the 1980s anymore. Game development is not a niche activity done in basements. It is a mature industry that employs a very large number of people in a large number of positions. Expecting all of them to give their all to their work forever is not realistic or fair.

(This last point is the part where I do believe he may be sincere rather than hypocritical. He appears to have walked that walk at least.)

I think he’s more blinded by a cultural perspective that’s weirder than he realizes. He got his start as a developer working for Microsoft, which has a notoriously proprietary attitude towards its entry-level programmers, and which is not really reflective of the way new graduates are treated at EA or Ubisoft.

Experience working for Microsoft has value on a resume, especially if you’re looking for an even better paying job, working for Microsoft.

They have a longstanding policy of hiring tech-savvy kids right out of college, then funneling them up the ladder and supporting the growth of their skill-set. High-level Microsoft executives tend to be veteran software developers who branched out into management as they climbed, and they tend to encourage the same culture of loyalty and continuing training that produced them. They expect a lot from their people, but there’s a reciprocal expectation that loyalty will be rewarded with a bright future.

You can say a lot of horrible things about the Microsoft corporate culture, but when it comes to treating their own people like teammates, instead of forklifts, their reputation is solid gold.

I don’t think he really grasps how shitty it is to work at a company that sees you as a disposable battery, to be drained and thrown away.

“Anyone running perma-crunch has basically decided that morale, loyalty, and enthusiasm have no vale.”

Value perhaps?

On the subject itself, we’ve recently had a change of manager at work (non-gaming industry, but plenty of programming)

Manager 1 pushed all staff to work extra time, berated staff for having the temerity to take lunch breaks, and, after the project went over budget and schedule, blamed everyone but himself.

Manager 2 supported all staff, understood that extra time was necessary occasionally, so gave those that put in extra hours time off in lieu. Additionally, after the project was finished (on time, and under budget) he gave the hardest working staff a sincere thanks, and a small bonus.

My experience in game development is minimal (I designed and ran a Neverwinter Nights server for a few years), but this attitude shows up occasionally in other business environments too (I work in accounting*). Fortunately, many employers are recognizing the value of “work-life balance” these days.

I’ve worked at jobs where they wanted you to put in 50-60 hours every week and clear your schedule to be available on weekends. Um, no. I did that once during a legitimate crunch, closing a merger deal for a month and a half, and it’s often par for the course during an audit, year-end, etc., but I refuse to do that as a normal course of business. Usually it means the company is too cheap to hire enough people, so they’re literally trying to profit off of wearing you out. There are too many other companies that will pay you the same or better to work a normal week. Hopefully the game industry will learn some sanity sometime soon and join them.

*This attitude is actually even worse in auditing. Auditors can work much more than 60 hours a week. The Controller at my last job told me that when she was in public accounting, they had to keep track of the auditors’ hours because there was a risk of their pay dropping below minimum wage once you compared salary to hours worked. I never went down that path, and there’s a reason most auditors are in their early 20s.

I greatly appreciate that you hit on the core problem that causes this: the endless supply of eager game developers who make it economically viable to treat your employees like garbage. Many articles are content to say “The industry treats people like garbage and Someone Should Do Something!”, then smile to themselves as though their statement will fix things.

Look, my game is doing really well by indie standards

Steam Spy puts it at 1500 ± 900 copies sold; is that inaccurate, or are indie standards much lower than I’d been imagining?

It won’t be fixed as long as there are new developers who are willing to work for low wages and pizza in place of overtime pay. And there always will be – there is an oversupply of developers for games development, and 20-something recent graduates often enjoy the working long hours thing. They are doing something they enjoy, and if they went home they’d be coding for fun at home instead, and having to pay for their own food. Working late with your colleagues is as good as a social life to a geek ;) (I say that as an introverted geek).

So you have a ready supply of recent graduates who are happy to work long hours for low pay, until they burn out or start having a family or interests outside work. At which point they are fired or leave the industry voluntarily, aside from the few who are needed for more experienced developer positions, because you can’t have a workforce made up entirely of recent graduates unless you want to produce games of the quality of “Tony Hawk’s Pro Skater 5” or “Arkham Knight”, or worse, every time.

Being a game developer is like being in a band. Lots of hard work and shitty hours until you are either one of the lucky few to make a decent career of it, or one of the great majority who eventually chuck it for a real job.

I greatly appreciate that you hit on the core problem that causes this: the endless supply of eager game developers who make it economically viable to treat your employees like garbage. Many articles are content to say “The industry treats people like garbage and Someone Should Do Something!”, then smile to themselves as though their statement will fix things.

Agree that the long line of eager recruits contributes to game developers being able to get away with worker-unfriendly practices. But that makes me unclear on what “the problem” really is.

The conditions in the game industry aren’t secret at this point. And the people going into the games industry are hardly lacking for options. These aren’t (in general) people with degrees in folklore and mythology. Many have computer science degrees, or computer art, or other highly sought-after skills. There’s no requirement to go into the games industry with those skills. They are among the most sought after skills you can graduate with, and if you’re good you have your pick of companies you can work for.

When I was graduating from college, I briefly considered going into Investment Banking. I knew the hours were long and the pay was good. I talked to a friend who graduated a few years before me and asked if it was really as bad as people say. She told me no – “I don’t need to get to work before 7am, and I’m usually home by 10pm most nights. I haven’t had to work a Sunday in over a month. It’s not that bad.” I thought about this, applied it to what I wanted out of life, and decided it wasn’t for me. That friend made (and continues to make) more money than me. I’m OK with that.

The games industry doesn’t even have the lure of “after 5-6 years, the hours get better and the money really rolls in” going for it. After 5-6 years, you haven’t made that much more than you started at, because the steady flood of newbies keeps costs down. And life isn’t much better. And if you don’t like it, you can leave to go somewhere else (though by this point you’ve invested in highly specialized skills that don’t transfer well…) Again, people know this.

I don’t think most people who go into the games industry are dupes. They know all of this going in (especially these days). For them, the lure of being part of VIDEO GAMES!! is so strong that they are willing to accept those conditions. It’s why Hollywood will never run out of attractive baristas.

Can you blame developers for not spending money to improve conditions that people are clearly willing to accept just for a chance to work for them?

There seems like there’s a market opportunity here – a company could offer less stressful conditions, reasonable pay, actually reward veteran talent (and so have their pick of all the veterans their competitors cast off every year), and be able to produce amazing quality games, albeit slightly more slowly. It continues to surprise me nobody does this (except Valve, but we all know they no longer make videogames).

Having to compete with companies with sane conditions is what will improve the industry. Not the corporate largesse of the companies that find themselves awash in talent despite doing nothing to make themselves attractive to that talent.

There seems like there’s a market opportunity here – a company could offer less stressful conditions, reasonable pay, actually reward veteran talent (and so have their pick of all the veterans their competitors cast off every year), and be able to produce amazing quality games, albeit slightly more slowly. It continues to surprise me nobody does this (except Valve, but we all know they no longer make videogames).

It seems like that should be an appealing strategy, because everyone likes good pay and reasonable hours, right? But every veteran of the game industry has demonstrated that they’re willing to put up with low pay and terrible conditions. The ones who weren’t quit before becoming veterans.

I’m not saying it wouldn’t work, but I think it would be less effective than it seems at first glance.

Well, fair enough, but when the inefficient use of human capital keeps driving up development costs (e.g., does endless crunch mode result in more otherwise avoidable errors?), doesn’t it have to bite them on the ass at some point? On the other hand, the customers just seem to accept major bugs on launch as business as usual at this point, and there’s not a big outcry against Day One patches and whatnot. Again–fair enough: if a game ships with bugs, no one dies or gets hurt, and the worst thing that happens is a bunch of complainers flood social media, AKA every other day on the internet. No wonder game publishers have no incentive to change.

The other reason why it doesn’t work is that games have to make a profit, and there is an increasingly large imbalance between how much gamers are willing to pay for a game and how much it costs to produce.

You can’t have a situation where all of the following are true:
* Gamers want games that exploit the full capabilities of their hardware, especially graphically
* Games are priced at a level where the average gamer can afford to buy a decent number of them
* Game developers (not just coders) have pay and conditions similar to equivalent workers in other industries

Currently AAA gaming manages to fail equally at the last two and only somewhat fail at the first. But as console capabilities continue to rise, and with them the amount of work required to match, particularly for artists, there is a danger that it simply becomes impossible to produce AAA games at a price anyone is willing to pay, and AAA gaming becomes economically non-viable: the prices people are willing to pay for the product simply do not cover the costs of making it.

(I personally wouldn’t miss it, since the number of AAA games I’ve played in the last decade is probably close to zero, depending on how you count them.)

The problem, as I see it, is that item #1 on that list is a terrible, terrible goal.

We have clearly gone beyond the point of diminishing returns when it comes to utilizing the graphical capabilities of computers. There are cases where it makes sense to push the envelope for the top-selling franchises, but companies continue to invest heavily in developing games that don’t need that level of fidelity. This is how we get ridiculous “Tomb Raider must sell 5 million copies” nonsense, and the way employees are treated is definitely connected to that (because it would have required, say, 6 million copies to break even if they used the proper number of programmers).
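The break-even arithmetic behind numbers like these is simple enough to sketch. Here is a minimal illustration in Python; the budget, marketing spend, retail price, and publisher’s share below are made-up assumptions for the example, not actual Tomb Raider figures:

```python
# Hypothetical break-even sketch for a AAA title. Every number used
# below is an illustrative assumption, not a real publisher's figure.

def break_even_units(dev_cost: float, marketing_cost: float,
                     retail_price: float, net_share: float) -> float:
    """Copies needed for publisher net revenue to cover total cost.

    net_share is the fraction of the retail price the publisher keeps
    after platform and retailer cuts.
    """
    total_cost = dev_cost + marketing_cost
    return total_cost / (retail_price * net_share)

# Assumed: $100M development, $50M marketing, $60 retail, publisher keeps 50%.
base = break_even_units(100e6, 50e6, 60.0, 0.5)
print(f"{base / 1e6:.1f} million copies to break even")  # 5.0 million

# Grow the dev budget by 20% (say, by staffing for sane hours) and the
# sales target climbs proportionally:
bigger = break_even_units(120e6, 50e6, 60.0, 0.5)
print(f"{bigger / 1e6:.1f} million copies to break even")  # 5.7 million
```

Which is the commenter’s point in miniature: once the cost side balloons, the required sales figure stops being a target and starts being a gamble.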

I don’t know whether the onus is on developers or publishers, but somebody needs to learn how to properly control costs so they don’t over-deliver to such a ridiculous degree.

That is, in my opinion, the real root of the problem: the amount of work being put into a product that people will play for a few hours, put aside, and completely forget about in a month is unsustainable.

Games need to be made more cheaply, even at the cost of super detailed environments, motion capture, and full voice acting. I don’t think the industry would suffer in the slightest artistically if game budgets came back down to a few tens of millions at most. I would actually be very happy if mechanics-driven games came back into prominence over cutscene-driven ones.

Having your AAA games come fewer in number but with more staying power would also help. Games aren’t being made to be replayed anymore, and they very rarely have the kind of interesting narrative ideas that would keep people coming back for years to sell new copies.

Most of the video game companies I’ve worked for in the UK have had very little crunch. The idea that “everyone does this” is a myth, which certainly means “this is the only way you can make good games” is a myth.

Many articles are content to say “The industry treats people like garbage and Someone Should Do Something!”, then smile to themselves as though their statement will fix things.

Sadly, that someone who should do something is the government. And the last time we needed it to pass laws prohibiting this type of practice in the factories, it still had to be forced to do so. Similar crap happened with the movie industry when it was young. Basically, we need a game developers’ union.

Working long hours is normal in the tech industry, therefore crunch mode in game development is nothing special.

Again, St. John is right. This happens all the time in startups.

I liked your general point that a lot of the time, the problem in arguments like “A, therefore B” is in the “therefore.” But here, I STRONGLY disagree with both St. John and you that the premise is reasonable.

I’ve worked for over a decade as a consultant in the tech industry. I’ve observed the inner workings of over 30 companies in that period, from hedge funds to large retailers to smaller indies to companies that do things you’ve never even thought about people writing software for. And I assure you that crazy hours are generally NOT the norm in the industry.

The vast majority of people in the tech industry tend to work standard 40-hour weeks. Sure, there are emergencies here and there that disrupt that (the site is down!!!!), but those tend to be rare in most mature organizations (which is where the vast majority of people in the tech industry work).

Sometimes there are crunches to get projects done, but in most companies, if the project isn’t going to be done by the projected end date, they slip the date rather than work 12-hour days for weeks at a time to get it done (and wring their hands over it and complain about it, but still). Companies that demand constant heroic measures to hit end dates for every project are rare (and, as a general rule, horribly dysfunctional). Frankly, the rule in most large companies I’ve observed is “it’s always going to be late, we already know that, so let’s hold blame sessions rather than fix our problems.” (Please observe I do not consider most large companies completely rational in how they think about tech.)

I’m not saying there AREN’T companies where “the last month of the project is 80 hour weeks for everyone.” I’m asserting they are fewer in number than companies where the norm is “the project is late, and the managers panic and drive everyone crazy, but ultimately we just accept that it’s late or cut features.”

You’re agreeing with him on a flawed basis that “if it happens in startups, it’s reasonable for him to call the practice normal.” But startups are NOT normal. The vast majority of jobs in the tech industry are not in startups. Most developers go most of their lives without working for a startup. The practices of startups are not unheard of. But they are by no stretch industry normal.

And that matters, deeply, in the context of this conversation. You can have a long, happy, productive career in the technology industry without ever experiencing the “massive workload, high potential reward” environment of a tech startup. But you really CAN’T have a long, happy, productive career in the games industry without experiencing months of crunch time every time a game ships.

To translate: Enslave the young, drive them as hard as you can, don’t reward them. The ones who fight back or die by the roadside will be replaced by fresh ones, the ones who survive can be promoted and eventually even treated well.

He’s a little bit on the nose about it, but those are the parts of this I have the least beef with. They might dress it in prettier dresses, but this is the fundamental problem of Human Development at any company with a substantial investment in training costs.

Before you pay someone to attend a bunch of expensive classes and conferences, to turn them into a better employee, you need a process to weed out the two types of people who won’t provide any return on that investment.

1. People who wouldn’t gain enough skills from the training to justify the expense. It doesn’t matter if the employee is stupid, lazy, unsuited to the task, mentally unfocused or burnt out. Whatever the cause, it’s a waste of resources.

2. People who won’t stick around and let you benefit from the skills you taught them. You go out of your way to put a promising new talent on a team of veterans who can teach him all the tricks of the trade, and you pay him to attend some classes that you also paid for, and how does he repay you? He jumps ship for a position at a rival company. You have just spent your hard-earned money training workers for the competition. It’s a double fail!

How do you retain skilled, veteran employees whilst filtering out people in the above categories? Remember that actually firing people is more expensive and may risk a lawsuit, while if you frustrate them into quitting there are no such risks, and they may even accept a cut to their severance package.

This does cast a shadow on the horror stories that come out, because it’s difficult to gauge what percentage are just sour grapes from people who were judged unworthy and got passed over for training and promotions too many times. Though we do know, from the way their teams are structured, that EA is basically not promoting or training anyone from the trenches in a meaningful way.

Additionally, I would have said that you retain “families” instead of saying that you retain “wives and girlfriends,” but that’s also standard HR industry wisdom. People with bills to pay and mouths to feed need stability, and are less open to going freelance or moving to a new area. They don’t call it “putting down roots” for nothing.

The kind of high-functioning, INTJ Asperger’s patients he’s looking for would actually appreciate the bluntness, because they lack the social skills to deal with the white lies and half-truths that constitute Politeness in modern society.

There’s nothing wrong with wanting a workforce of creative people who love their jobs and fit in well with your team.

I was just going to suggest you write something on this topic, but you were faster!

I agree on all points. I tried for a bit to think of a reason why he would phrase things this way, and came up with several theories (none of which I can confirm or disprove, so they’re probably all true to various extents):

1: The classical “I did not have it easy, so nobody else deserves to have it easy” thing. Great, so he grew up with outhouses in the woods. How does that qualify him to make light of other peoples’ problems?

2: Sampling bias: He was successful himself, and is probably devoted enough to still be spending copious amounts of time working, just because that’s very much his thing. Many of the people he works with in the industry are probably similar: development leads, successful start-uppers, etc. All have at some point given their all, and all have succeeded. There’s a clear correlation. He never meets the people who started out the same, gave their all, gave some more, burned out, failed, and were never seen or heard from again… those people are the majority, but if you’re living in the world of the successful you may not notice how much luck you actually had. It’s a bit like a lottery millionaire advising people to play the lottery: if it worked for him, why wouldn’t it work for everyone?

3: Self interest. He’s clearly on the upper scales of management, and most of his pals profit directly from treating employees as a disposable resource. They don’t even realize in how many ways they’re coercing people to do their bidding.

The only point in which he’s (kind of) right: Yes, by now most people should have an idea what they’re getting themselves into if they work for the gaming industry, so people should stop queuing up for those jobs. That’s the only way to get employers to realize the value of their employees: When it gets hard to find new ones and the old ones are running away because you treated them badly.

Apparently, though, that knowledge has not quite sunk in, and probably too many people are still blissfully unaware of what a deep black hole waits at the end of too much crunch time. When you’re 20 and you think you don’t need sleep, that’s one thing, but all the sleep deprivation accumulates and will eventually hit you over the head from behind… There was a super-interesting article by one of the Amazon co-founders (don’t remember the name, or the title, or anything… would find the link otherwise) where he says he thinks he’d have done better with Amazon if he had strictly worked no more than 40 hours a week. Above a long-term workload of roughly 35-40 hours a week (give or take a few, depending on what you do and how much you enjoy it), people become less effective, and then they need to work longer just to compensate for the physical and mental fatigue. Hearing those words from a person who made it through several successful start-ups was quite a thing.

As a young programmer who made an informed decision to stay out of games, I think it’s a completely fair burden to put on people. The info is out there, and you’d be a fool not to spend a couple of days (as in full, eight-hour days) researching the hell out of the field where you’re considering spending the next several thousand hours of your professional career. The fact that people continue to unwisely shirk the research doesn’t mean it’s unfair to expect it of them: people continue to unwisely commit DUIs, but we all agree that you should know better and it’s your fault if you do.

But then what? If you’ve never had a full-time job, it might not be obvious how to balance passion for the job with your own needs. And an article like Alex St. John’s propagates the idea that work conditions shouldn’t matter if you’re passionate enough.

So without giving young grads a complete pass, I think we should understand the naive place most come from. This naïveté is not only lack of prep work, it’s also lack of context.

I don’t think people are failing to do the research. I think they’re just letting their romanticism override their bitter, cynical sense of practicality.

As Adam Smith observed, people tend to have an unreasonable amount of optimism when it comes to predicting their own fortunes, which is why people do everything from going to war thinking they’ll be a hero (when they’ve got a much greater chance of ending up dead) to buying lottery tickets when it can be mathematically shown that they’re throwing money away.

Sometimes there’s nothing for it but to work a crappy job for a few years to work a little wisdom into them.

I think the simplest argument against AAA crunch is the state of AAA games on release. Maybe a robust culture of demanding refunds on broken games will help? They pull this shit because it doesn’t affect their bottom line.

Besides that, we hear about EA, Activision and Ubisoft pulling this perma-crunch shit, but not Valve, Bethesda, Nintendo, or the like. Unless the others are pulling the same shit but not drawing aggro, just don’t buy their stuff until they improve. Though I can’t remember the last Activision or Ubisoft game I bought, and I don’t think EA’s opening the bubbly over me picking up a second-hand copy of Henry Hatsworth.

I think part of it is the 2-year game development cycle that Valve, Bethesda, and Nintendo don’t employ.

Since throwing more and more people at a project has diminishing returns, those people still have to work more individually in order to get a game out in such a short period of time. Longer development cycles are probably just much more efficient overall.
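That diminishing-returns intuition is essentially Brooks’s law from The Mythical Man-Month: adding people adds coordination cost. A toy model in Python makes the shape of it visible; the per-pair overhead constant is an assumption chosen purely for illustration:

```python
# Toy Brooks's-law model: each pair of team members costs some fixed
# amount of coordination time, so effective output grows sublinearly.
# The 0.005 person-months-per-pair overhead is an illustrative guess.

def effective_output(n: int, pair_overhead: float = 0.005) -> float:
    """Person-months of real work per month from n people, after
    subtracting pairwise communication cost (floored at zero)."""
    pairs = n * (n - 1) / 2
    return max(n - pair_overhead * pairs, 0.0)

for team in (10, 50, 100, 200):
    print(team, round(effective_output(team), 1))

# In this model, doubling the team from 100 to 200 people raises
# output by only about a third, which is why compressing a schedule
# by hiring rarely works as well as lengthening the cycle.
```

The exact constant doesn’t matter; any positive per-pair cost produces the same qualitative result, because the number of pairs grows quadratically while headcount grows linearly.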

Disclaimer: I’m almost exclusively a consumer of games, I worked on translating and testing a few and that was pretty lightweight.

You know what’s most appalling? It’s not like the (mostly big) devs/publishers push the crunch to make the games polished, perfected and flawless. There’s both the crunch *and* the games are unpolished, riddled with bugs and seem to be released with an attitude of “we’ll patch it over the following weeks/months (if it sells well)”, said patching probably spawned by another crunch…

I can’t help but wonder if the St. John article is trolling. I mean, he literally calls people whiners over the issue of, in his words, “not being paid fairly”. We’ve got the golden line “To my great shock and disappointment, they never respond to this feedback with any sort of enlightenment or gratitude for my generous attempt at setting them free – usually, I just get rage”, which reads so much like self-aware parody that I had to go back and search the article for signs he wasn’t serious.

The internet business model is that you convert views into ad money, and the ad servers don’t discriminate between contented readers and angry people linked to your article from a takedown piece. And he’s certainly getting a lot of people linked to him from takedown pieces.

When I left grad school (life sciences) one of the consolations I took was that I’d never be subjected to the perma-crunch lifestyle that is academia and the world of PhD holders.

Finishing grad school would have meant getting a post-doc position, which is essentially a junior researcher position. You don’t get paid much more than grad students, and the position generally only lasts ~2 years, but you’re expected to be highly productive in those years, so most post docs are working 60-80 hour weeks. Even worse, most people will go through three post doc positions before landing a “real” job. Given the total time required for grad school, you’re almost 40 before you finally get out of the training/crunch phase of your career.

God help you if you go back into academia, however. Most junior faculty (assuming you’re not stuck in adjunct hell) are still working 60-80 hour weeks, because they don’t have the funding to staff their labs just yet with post docs and grad students. Most will still work those hours anyhow, because getting that funding is an endless cycle of grant submissions and publication writing. All of that to maybe have a middling career as an academic researcher.

As I like to say, sacrificing your life on the altar of science isn’t the key to a wildly successful career, it’s the price of admission to even try. I may not ever have the same opportunities as my colleagues with PhDs, but I won’t have to pay the price for those, either.

Wholly agree as an A.I. PhD. In academia, it’s indeed entirely expected to basically give up on everything outside your job. I’d love to do science, but I also like to enjoy life a bit.

As for Alex St. John’s article, it’s basically a longer version of “you young kids got it easy, I walked uphill through snow both ways”. So basically because other people have it worse off AND some people, through extraordinary skill and/or luck, could turn your situation into gold, you have no right to complain? Scratch that, we’re not talking about complaining but expecting normal competent management?

Not to make this too political, but I gotta say, the American way of treating holiday and free time makes me really glad I am not going into business there. The idea alone that anyone intelligent can argue *in favour of* burning people out with literally eternal crunch time, without even so much as a carrot to dangle in front of them, purely because you have enough fresh talent to feed into the grinder, is so incredibly capitalist that I can’t even fathom why anyone would defend him. And to defend it with “if you don’t like it, build your own (and risk sinking into poverty if it doesn’t work out) but don’t complain about how you’re treated” is just disgusting.

It’s very easily defended: Nobody is forcing anyone into the arrangement and they’re free to leave at any time. It stops being a problem really, really quickly when you don’t have the innate desire to control everybody else’s lives.

Jobs aren’t poverty-inducing. They’re the exact opposite. You make money doing them, which makes you *less* poor. If a job is taking money away from you, you can just quit.

The notion that a lack of higher-paying jobs is the fault of the people offering what jobs do exist is wrong-headed, and leads to killing off jobs by punishing people for offering ones that you just don’t think are good enough. EA isn’t about to start paying game devs overtime. They literally can’t afford it: the labor costs would put their games into strictly unprofitable territory. Those jobs would just vanish.

Also, the US spends most of its budget on social programs (with social security, unemployment, and medicare being the largest chunks of that). The military budget which people think is so extravagant is just under 16%.

Except that they are forcing you. They just don’t use physical force, but using economic force is just as effective and just as dangerous. You are using the same false logic as Alex St. John, where you treat only the physical as real and everything else as unimportant.

They’re not using force of any kind. They have no power over your economic situation whatsoever unless you get a job with them. It’s not their fault if you don’t have any better alternative prospects, and them not giving you a job on the terms you want is doing no more harm to you than everybody else in the world is by not offering you a job at all.

The “You’re making my situation objectively better, but not as much as I like so you’re actually hurting me” mentality is incredibly toxic. It encourages people to not help at all because at least that way they’re not singling themselves out.

You do realize that’s the exact wording used by war profiteers to justify their behavior? And you actually think the opposite of that is the toxic mentality? For real, no joking, you actually think that?

Actually, no, I’ve never seen any “war profiteers” use that exact wording, nor do I see how it’s not a complete non-sequitur. Are the war profiteers reducing casualties here? Because unless you can point to how they’re actually helping things, it really doesn’t apply to them. And if they are helping things, why should I be indignant at them? Am I supposed to be outraged at a company selling weapons that a nation is using to defend itself against an aggressor?

Look… this is a really, really desperate place you’re going to trying to make an argument. You should really step back for a bit and re-think things, because this direction you’re trying to take is guaranteed not to end well.

War profiteers aren’t the ones that sell weapons, but the ones that sell food and medicine for 10 times (or even more) the price they paid for it. So yeah, technically they are helping. By exploiting the misery of others. Why should you be indignant at them? Hm, I don’t know, perhaps because they are abusing the misery of others to fatten their wallets.

If that’s the definition of “war profiteering” you’re using then, well, they’re right. Raging against someone for providing relief to a war zone because they’re making money doing it is something I’d definitely call toxic and counterproductive. You want to condemn people to starve because of an irrelevant political sensibility. That’s pretty bad.

It’s not the game industry that’s responsible for the fact that humans need food to live, nor do they set the price of bread. People are forced by their circumstances to work in terrible jobs because the alternative is starvation, but the employer is not doing any of the forcing. The employer just put out a hiring ad and these people walked in the door.

And who cares if the conditions are atrocious. They agreed to it, so it’s OK. Kind of reminds me of something that was quite common (and is still a thing in some parts of the world): there used to be people who willingly sold themselves into slavery to pay off a debt, or so their family would have something to eat, or whatever the reason. I guess their masters were not to be considered slavers for being willing to buy some slaves.

I know where you’re coming from, but I find that view to be very narrow in scope, and it’s exactly what has driven the attitude of corporate America to be so extreme. “Don’t like it? Leave!” eliminates any hope for positive change, both politically and economically. A labour force that is ready to debase itself as much as it has to to secure what little scraps they can get, instead of setting a base level of what they can expect, puts all power in the hands of those that have zero incentive to be reasonable.

It actually has extreme potential for positive change, because once people start leaving companies *have* to improve conditions in order to keep going.

The labor force’s general woes are the product of much, much broader issues that EA and Ubisoft have nothing to do with, and focusing on them instead of trying to promote actual economic growth in order to create more competition in the labor market is strictly counterproductive. Especially when those companies will likely respond by cutting staff levels, because once you mandate the super-expensive AAA games out of profitability they’ll be forced to seek higher profit-per-employee ratio products.

Actually, it improved the standard of life so much that modern people now have the entirely mistaken idea that an 8 hour work day, children not having to do any work, and our current standards of living were the way things were before it happened.

The thing is, sure, it sounds nice that all the workers could just leave. But the point is that the pressure on them is high enough that not enough of them WILL leave. Because if you’re faced with “have a job in the industry you love, even if it’s soulcrushing”, “have no job and sink into poverty without a social system to catch you” and “desperately try to find something outside your skills, resulting in worse work”, pretty much no option is good.

The whole idea that “mandating” a humane work environment and reasonable hours is damaging to the industry is, not to be insulting, shortsighted – because if we don’t mandate that, it simply won’t happen.

As much as I’d love for people to start naming the companies that force crunch on their workers, so those companies can be avoided, surely those employers keep clauses in their employees’ contracts just to prevent such a thing, even after their jobs are terminated.

The only thing we’re left with, then, is the word of anonymous posters, which is as valid as the nonexistent paper it’s written on. These disgruntled employees can either be specific and risk a lawsuit they can’t possibly win, or be vague and risk not being heard due to legitimate suspicions about their veracity.

This is really the main reason why this keeps happening. It’s not a lack of care by the workers, it’s that the employers have put themselves in a position where they can’t lose.

Maybe some context will help here: the Alex St. John piece is in response to a VentureBeat article that is linked in Shamus’s opening paragraph.

That article is an interview with the IGDA (International Game Developers Association) president, where she explains that crunch is a major issue, based on their years of accumulated surveys of actual workers.

And, and maybe this is what is making St. John feel so threatened, the IGDA is planning to start awarding gold stars, and possibly demerits, to companies based on their track record.

You have to read the interview for details. They are planning this in a very thoughtful, rehabilitate-rather-than-condemn-outright fashion that makes me feel cautiously optimistic they are onto something.

How do these projects get so broken in the first place? This is a clear sign that something is desperately wrong. If your project planners are estimating that badly, there’s a critical flaw in the process somewhere. Maybe this is just because the industry is “different”, but that’s not a very satisfying answer.

His highlights are as follows:
* They have no social skills
* They generally marry the first girl they date
* Can’t make eye contact
* Resume and educational background is a mess… because they have no social skills
* They work like machines, don’t engage in politics, don’t develop attitudes and never change jobs

Yeah–as somebody with that diagnosis, it made me pretty angry that he was characterizing me that way. And then I read the section on women, and got mad again, because seriously? What the hell, dude?

Somebody up top linked his follow-up, and I can’t recommend *not* opening that link up enough, because he’s active in the comments section and it. is. a. mess. There’s clearly no link between his brain and the outside world at this point. It’s all just self-important exploitative rhetoric and spiders in there.

The way I always hear about the game industry chewing up new artists reminds me of the old Hollywood stereotype about starving actors and writers all putting up with bad conditions because they’re hoping for the chance to make it to the big time, but the thing is, there isn’t really a big time for game devs. There are only a very few superstars with big names, but they aren’t even that glamorous. Kojima was one of the biggest names in videogames, but he was unceremoniously given the boot.

Why is it reasonable for a game that is proceeding according to schedule to suddenly need more work done per day in the last two months? Shouldn’t the ordinary schedule be to have the game finished with normal working hours in time for release, and only if it is behind schedule close to release should crunch time be needed? (likewise, if it is behind on any milestone, crunch time is needed for that milestone; if it is behind on EVERY milestone, then the planning failed and the scheduling/staffing/scope should be reevaluated).

1. Incompetent management, for which the likes of St. John are in part responsible.
2. Meeting deadlines will require catching up on any delays, and may result in more work, so some crunches are necessary, but it shouldn’t be institutional.
3. An overabundance of labour is available for the likes of St. John to exploit: mostly gamers who’ve chosen this career because they love games, with relatively little life experience of being exploited. That means employers can coerce their staff into working crunch overtime, for which they may or may not be properly compensated, and certainly not at overtime rates.

Basically, a big problem with this is that labour laws in general are really terrible. A lot of effort goes into demonising unionisation, and lobbying by corporations to lower or remove employee protections, removing legal avenues for them to seek compensation. Ideally, underpaying people would result in dramatic fines and damages which would seriously financially injure their employers. That doesn’t happen.

Personally, I just got out of a job where my boss was violent and abusive, and regularly threatened employees (including with violence) should they not stay back to work overtime, because his incompetent ass could not roster correctly. This boy exploited his position of power to bully and threaten people who had no opportunity to respond in kind, because they depended on the job. I made a 3000-word complaint to corporate after I left (having already made one to the government office of fair work). To date, I have received no response from either. I don’t expect to, either, and will probably have to pay a lawyer to actually get any justice.

There’s a small possibility that the goal is psychological, to make sure that even the most driven workers are eager to take time off, and turn the end of the crunch into a bonding moment for the team, making them gel better in the next cycle.

I think there’s a bit more to it than you let on there, Shamus. This is a big industry, one of the biggest in the world, and it’s managed terribly. This is a problem with the St. Johns of this world. These people are paid small fortunes, and they do not do their jobs properly, but they punish those beneath them, who are paid less, have less security, and have a lot more to lose.

It’s something you see in bad managers the world over. It’s behaviour I expect from an incompetent kid at a fast food restaurant.

Crunch times are occasionally necessary, but not every game should need to crunch. Crunching is a sign of poor time management, and should not be expected from AAAs. And if people are working overtime hours, they deserve to be fairly compensated.

The difficulty of the labour just doesn’t come into it. Workers deserve fair compensation for their time.

And it’s not even a matter of their work being difficult because it occupies most of their time and burns them out. These people are literally being treated as something other than human beings by corporate dickheads like St. John. These people have lives, and expecting them to work those hours seriously cuts into their free time, and time with their friends, family, and significant others. It’s not for nothing that it’s compared to a form of slavery. Worse, the exploited don’t have any real avenue for complaint; these guys are their bosses. And at the end of it, they’ll likely be denied their bonuses (even if they’re Infinity Ward after Modern Warfare 2), based on some insane metric for success like the Metacritic threshold that hit Obsidian with New Vegas. And the response they get from a fat cat like St. John is that if they don’t like it, they can go indie.

No.

If St. John doesn’t like treating people as human beings, then he doesn’t deserve to be treated as such. He deserves to live in a society that has absolutely nothing to give him. He deserves to go to a place far worse than the boondocks he grew up in, to support himself.

We’re literally talking about a multimillionaire complaining that people feel hurt by not being paid a fair wage, and that the psychological toll it takes on them hurts their work.

His premise is wrong, and so is his conclusion.

Crunch time isn’t massively physically demanding. Neither is his position (for which he’s compensated more, for less work, than most programmers and software engineers), yet he feels entitled to his pay while insisting it’s right not to pay people for their work.

That’s just not how work functions.

Personally, I really liked the indie developer’s response that VentureBeat published. And I do think people should be very angry about this. This man should resign; what he’s said is utterly unacceptable. He’s a disgusting human being.

I’d rather he didn’t resign, so we can always have a solid example to point to about just how bad the job is. It’s one thing to talk about long hours in poor conditions, it’s another to point at an article like this and say “he’s the boss over there, you’ll be working for him and guys like him.”

The game industry is a wretched wasteland of abuse not because of “bad management” or “bad habits” or even “bad juju.” It’s a wretched wasteland of abuse because no form of modern macro-management theory works to “optimize” worker productivity through the usual inhuman methods when computer programming is involved.

Macro-management is a fixture of the modern workplace in most industries. “Matrix Management,” “Total Quality Management,” and “Six Sigma” are sophisticated ways to ensure that everybody appears to be working as hard as they can. Supposedly, they are ways to increase productivity and quality, but really they don’t do either of these things. Modern macro-management is aimed at just one thing–making certain that everyone is busy and stressed out. That is the only “metric” that can be made manifest and observed. It’s simply too difficult to actually manage a large organization using rational, productive methods. That type of thing leads to de-centralization, which leads to the potential for organized insubordination, and most large organizations would rather have bad, insane, corrupt management from the top than good management from below for that reason. It’s too dangerous–to the people on top.

So what do you do if you can’t delegate authority because you won’t trust anybody? Large, impersonal industries cannot be micro-managed, personally, by CEOs. So the CEOs end up lying awake at night worrying that the employees are “goofing off” unless some kind of modern macro-management is in place to make employees suffer. I know this sounds bizarre, but that’s how it is. Various macro-management tools are implemented in order to reassure the boss that the employees are exhausted so they don’t “get paid for nothing.”

Unfortunately, the video game industry does not lend itself to this type of macro-management. Using multiple bosses (matrix), fan-service as a religion (total quality), or twisted statistics (six sigma) doesn’t really work out in game development. It’s impossible to monitor and then corrupt the workflow to make “certain” that “everyone is busy” in this way, because most of the work is going on in somebody’s head–and that’s just crazy-making for the bosses, who need SOME kind of reassurance that the workforce is suffering. The only way to alleviate the boss’s “goof off anxiety” is to literally make everyone work ALL THE TIME.

This is a “meta-view” of the problem, and if you’re used to seeing these things at the micro level, you might not see it at first, but, believe me, this is the underlying motive for all this seemingly mindless cruelty. It’s the deepest and darkest fear of every employer that the “staff” will figure out a way to collect money for less work, and make the “old man” into a sucker. It makes no sense from the perspective of the productive trenches, but up in the ivory tower, it’s all they really think about, if they think about the “slaves” at all.

In my experience, managers worth more than their chair are a rare and wonderful commodity. I’ve worked with two in all my years, and only one of them was worth more than his chair and desk put together.

(“Project managers” aren’t necessarily “in management” in the usual sense, and some of those are actually pretty good at their jobs… but then, their opinions about things like morale and perma-crunch aren’t all that important or influential, unfortunately.)

“The problem isn’t the physical challenge. It’s the time, and the staggering personal opportunity cost. Your daughter is never going to take her first steps again, say her first words again, or any of the other milestones we use to mark the road of parenthood.”

This hits the nail on the head. If you forget that employees are human beings, it’s easy to justify demanding almost anything of them. Two hours of overtime may cost a company just $16 or $40 or $100 in extra wages, but for a worker it can mean missing something irreplaceable and priceless.