Simultaneous Shortage and Oversupply

The EA movement has a ton of programmers, many of them earning to give, and many of them interested in moving into some form of direct work.

Roles for programmers in direct work tend to sit open for a long time, and people trying to hire programmers have a really hard time finding people.

As far as I can tell, though, these really are both true! For example, I ran a small email survey (n=40, mostly engineers) and found 30% were interested in switching to something more valuable and another 40% were potentially interested. And there are a bunch of openings:

OpenAI has a Senior Software Engineer role that doesn’t require ML experience, and an ML Engineer role that requires an amount of knowledge an engineer could pretty easily pick up working on their own.

Wave has a bunch of openings (via 80k), including one for a Software Engineer. I have a bunch of thoughts about Wave in particular, but as a former employee I can’t share them.

So, why don’t these openings get filled quickly? Some guesses:

Location: the jobs aren’t where the people are, and the people don’t want to move. For example, I’m in Boston and don’t want to leave or work remotely.

Pay: top tech companies can offer very high compensation, and these organizations don’t pay as much. Though since postings don’t include comp, it’s possible they actually do pay similarly? But maybe people don’t apply because they assume it would be a large pay cut?

Experience: the jobs want someone who’s been programming for a long time, and the people who could take the jobs haven’t been.

Ability: the jobs want extremely talented people, and most EA programmers don’t pass their bar. But this doesn’t explain why I know a bunch of engineers at Google, which has a pretty high hiring bar, who are looking to do more directly valuable things.

Personal risk aversion: as a parent of young children, this makes a lot of sense to me! Moving across the country to work at a place that’s not as financially secure as, say, Google, would be a real risk. (And one that hit me when I was laid off from Wave.)

Working conditions: maybe these jobs aren’t as nice in ways other than pay? More hours, less free food, less ability to work on cool things? But this seems unlikely to me—lots of people want to work on ML.

Cause mismatch: the good jobs are all in AI safety, but the programmers looking to move are interested in global poverty, animal welfare, or something else.

Awareness: maybe people aren’t actively looking for jobs and don’t know what’s available? Maybe 80k should have some sort of recruiter/headhunter who tries to match EAs to specific roles? Maybe they already do this and I don’t know about it?

Imposter syndrome: people often don’t have a good model of where they stand, and so might think possible jobs aren’t for them. For example, MIRI posts that they’re looking for “engineers with extremely strong programming skills”, and probably some of the people who would do well there don’t realize their programming skills are good enough. Even if a job posting is framed in a friendly, welcoming way, an organization with a very strong reputation may by itself make some people think they couldn’t be good enough.

Combination: maybe there are jobs that do well on many different metrics, but not enough of them for any one person. For example, maybe there are jobs that pay well (OpenAI?) and jobs in global poverty (GiveDirectly), but if you want both there isn’t anything. Or there’s remote work (Wave, etc.) and there’s work on AI risk, but no options for both.

What’s going on? I’m especially interested in comments from programmers who would like to be doing direct work but are instead earning to give, but any speculation is welcome!

Thanks to Catherine Olsson for discussion that led to this post and for reading a draft. Cross-posted from jefftk.com.

The situation is different for organizations that cannot afford high salaries. Let me link to Nate’s explanation from three years ago:

I want to push back a bit against point #1 (“Let’s divide problems into ‘funding constrained’ and ‘talent constrained’.”). In my experience recruiting for MIRI, these constraints are tightly intertwined. To hire talent, you need money (and to get money, you often need results, which requires talent). I think the “are they funding constrained or talent constrained?” model is incorrect, and potentially harmful. In the case of MIRI, imagine we’re trying to hire a world-class researcher for $50k/year, and can’t find one. Are we talent constrained, or funding constrained? (Our actual researcher salaries are higher than this, but they weren’t last year, and they still aren’t anywhere near competitive with industry rates.)

Furthermore, there are all sorts of things I could be doing to loosen the talent bottleneck, but only if I knew the money was going to be there. I could be setting up a researcher stewardship program, having seminars run at Berkeley and Stanford, and hiring dedicated recruiting-focused researchers who know the technical work very well and spend a lot of time practicing getting people excited—but I can only do this if I know we’re going to have the money to sustain that program alongside our core research team, and if I know we’re going to have the money to make hires. If we reliably bring in only enough funding to sustain modest growth, I’m going to have a very hard time breaking the talent constraint.

And that’s ignoring the opportunity costs of being under-funded, which I think are substantial. For example, at MIRI there are numerous additional programs we could be setting up, such as a visiting professor + postdoc program, or a separate team that is dedicated to working closely with all the major industry leaders, or a dedicated team that’s taking a different research approach, or any number of other projects that I’d be able to start if I knew the funding would appear. All those things would lead to new and different job openings, letting us draw from a wider pool of talented people (rather than the hyper-narrow pool we currently draw from), and so this too would loosen the talent constraint—but again, only if the funding was there. Right now, we have more trouble finding top-notch math talent excited about our approach to technical AI alignment problems than we have raising money, but don’t let this fool you—the talent constraint would be much, much easier to address with more money, and there are many things we aren’t doing (for lack of funding) that I think would be high impact.

I don’t think this is quite right. The people working at OpenAI are paid well, but at the same time they are taking huge cuts in salary compared to where they could be working otherwise. (Goodfellow and Sutskever could be making millions anywhere.) And given the distribution of salary, it’s very likely that the majority of both OpenAI and DeepMind researchers are making under $200k—not a crazy amount for deep learning talent nowadays.

Generally, I feel like there are actually pretty few regular engineering positions around for EAs (maybe 8–15), and these both have fairly high bars and require work in the US/UK.

Small orgs have different needs from large ones, and most EA groups are small. This partly means they want senior and/or entrepreneurial types.

I do suggest that programmers learn ML or intensively learn functional programming, though not that many available people seem interested in either (especially those doing E2G outside of EA jobs). Either would be a significant challenge, for one thing.

I’m not looking for an engineering role, but for me the disconnect between what I’m looking for and the EA-adjacent opportunities I find advertised is 100% location. I live in a particular city and am not in a position to move in the short term, and since that city is not the Bay, NYC, or Oxford, it’s hard to find any useful postings or even guidance from the online EA community. I’d love for 80,000 Hours to have any advice whatsoever tailored to someone constrained to job-searching only within their own city, but so far I haven’t come across any.

The OpenAI and DeepMind posts you linked aren’t necessarily relevant: e.g. the Software Engineer, Science role is not for DeepMind’s safety team, and it’s pretty unclear to me whether the OpenAI ML engineer role is safety-relevant.

This seems plausible, but also quite distinct from the claim that “roles for programmers in direct work tend to sit open for a long time”, which I took the list of openings to be supporting evidence for.