Sunday, April 3, 2011

The death of Intel Labs and what it means for industrial research

Intel recently announced that it is closing down its three "lablets" in Berkeley, Seattle, and Pittsburgh. I know a lot of people who work at the Intel labs, and in fact I spent a year at the Berkeley lab before joining Harvard in 2003. (I should be clear that not all of Intel Research is closing down -- just the lablets.) All of the researchers have been told to find new jobs, though some of them are getting picked up by Intel-sponsored research centers at nearby universities.

The Intel Labs were a fantastic experiment to rethink how industrial research should be done. They started in 2001 under the model that full-time Intel researchers would work side-by-side with faculty and students from the nearby universities. All of the research was done under an open intellectual property model in which results were co-owned by the university and Intel. In fact, the labs were not inside the Intel corporate network and operated largely autonomously from the rest of Intel. This allowed projects to be done seamlessly across the Intel/academic barrier and allowed students to come and go without restrictions on the IP.

Some fantastic work came out of the Labs. The Berkeley lab drove most of the early work on sensor networks and TinyOS, especially while David Culler and his various students were there. The Seattle lab developed PlaceLab (the precursor to the WiFi-based localization found in every cell phone platform today); WISP (the first computing platform powered by passive RFID); and lots of great work on the security of wireless networks. The Pittsburgh lab did work on camera-based sensor networks, cloud computing, and robotics. All of these projects benefited tremendously from the close ties the Labs had with their universities.

Before the Labs opened, Intel Research was consistently ranked among the lowest of all major technology companies in terms of research stature and output. I feel that the Labs really put Intel Research on the map by involving world-class academics and doing high-profile projects. They attracted some of the top PhDs and offered a much more academic alternative to a place like, say, IBM Research.

I have no idea why Intel decided to close the labs. The official press release is devoid of any rationale, and obviously tries to spin the positive angle (the establishment of the university-based research centers which will replace the Labs). I've spoken with a number of the researchers there since the announcement, and have formed my own theories about why Intel is shutting them down. The most obvious possibility is that the Labs are incredibly expensive to run, and it's hard to link the work they do to Intel's bottom line. After all, very little of the work done at the Labs is picked up by Intel's product groups. The Labs' mission has always been to inform the five-to-ten-year roadmap for the company. It's unclear to me whether they have been successful in this, though at least they have inspired some entertaining commercials.

Personally, I'm worried about what this means for industrial computer science research. Here is one of the world's largest and wealthiest tech companies, closing down a set of labs that employs some of the top minds in the field and that by all measures has been really successful in producing novel and high-impact research. If Intel can't figure out how to leverage that amazing talent pool, it does not bode well for the rest of the industry.

Maybe this suggests that the conventional industrial research model is simply broken. The only (important) places left that use this model are Microsoft, IBM, and HP. These companies can afford to set up big labs with lots of PhDs and pay them to do whatever the hell they want with little accountability, but maybe this model is no longer sustainable. As I've written before, Google takes a very different approach, one in which there is no division between "research" and "engineering." The advantage is that it's always clear how the research activities relate to the company's priorities, although it does mean that researchers are not doing purely "academic" work, the main output of which is more papers.

One closing thought. Perhaps Intel realizes it can have far more impact by setting up large, high-impact research programs within universities rather than running its own labs. In some ways I can appreciate this point of view: help the universities do what they do best. But the way this is being done is unlikely to be successful. The first such Intel center, on visual computing, involves something like 25 PIs spread across eight universities. Each PI is getting only enough to fund work they were already doing, so this is something that looks good on paper but is unlikely to move the needle at all for these research groups. This seems like a missed opportunity for Intel.

Obligatory disclaimer: This is my personal blog. The views expressed here are mine alone and not those of my employer.

20 comments:

Your comment about researchers working on papers without accountability isn't really a good description of corporate research, though it may seem that way to their companies' product engineers. IBM Research and HP Labs don't really have an academic research mindset and haven't for a long time thanks to business unit based funding. Even within Intel Research, successful researchers (in terms of promotion beyond a certain key point) also had to have some kind of significant internal impact; however, the junior researchers were mainly just accountable to their own management, which is why you may have gotten the impression of paper-centricity you did.

BTW, Intel Labs (which is, for the most part, the former Corporate Technology Group and was always much larger than Intel Research ever was) is still in existence. All elements of what you knew as Intel Research were renamed and fully incorporated into Intel Labs. The lablets are the part of the former Intel Research that has been killed off.

Firing most of the researchers from the lablets seems absolutely crazy to me. I can understand trying to reorg the research budget to try to get more return-on-investment, but the Intel lablets had some absolutely first-rate people. I'd expect that those folks won't have any problem finding new employment, but it seems like a dumb move on Intel's part.

The first thing former Intel Labs people should do is form a mailing list so they can keep in touch with each other in the years to come. Those of us who used to work at BBN have xBBN.org, and it's really great to hear from people on a regular basis who have educated opinions on a variety of subjects. It's a great source of help, support and sanity in this crazy world. I treasure my days at BBN and the people I worked with.

I understand the guarded disappointment. It's not clear to me that this is bad for Berkeley in particular; Intel may reduce its total investment in research but increase its direct grants to universities. Of course, a smaller total research spend does mean less research.

I have thought for a while that we need to improve the way that industrial research happens in the US. Bell Labs created technologies like transistors and lasers and information theory that are the foundation of whole industries, and that have created huge social and economic value around the world. We're largely still living on the work of Bell Labs; no industrial lab has had anything like that impact before or since.

It would be good to figure out how to do that again. Not to go all doom and gloom on you, but this century will need some real innovation. US federal research funding is shrinking and industry is stepping back as well. Hard to see how we create a world that can support us if we are not having great new ideas.

Mike - you hit the nail on the head. I am very concerned about what happens if we don't have enough long-range research. One model that could evolve is that universities do the far-out stuff and industry focuses on the shorter term. It is hard to justify the Bell Labs model in today's world, though no doubt it had tremendous impact.

Matt writes: "Maybe this suggests that the conventional industrial research model is simply broken. The only (important) places left that use this model are Microsoft, IBM, and HP. These companies can afford to set up big labs with lots of PhDs and pay them to do whatever the hell they want with little accountability, but maybe this model is no longer sustainable."

I am a researcher at IBM. Nobody does "whatever the hell they want to do." IBM does not use this model, and hasn't for quite a while. In fact, management would say, the reason IBM Research still exists is precisely *because* IBM does not use this model. IBM Research has endured because it does provide value (as the company understands it). My sense is HP is going this way too.

Microsoft is different because it is a monopoly-supported research lab. The old Bell Labs had a monopoly underneath it to support it: costs AT&T incurred to run it could be charged back to consumers. In this case, the monopoly is Windows/Office, but that will slowly fade. So my belief is that MSR will eventually go down this road; it just may take 10 or 20 years. Some friends of mine there also believe this.

MSR has been impressive in going from non-existent to the premier lab in 15 years. I find it interesting that Google (or Apple, or Facebook, or ...) does not feel any need to build a similar research division.

This is not to say I support this; quite the contrary. I also am saddened and dismayed by the shutdown of the lablets. It conveys a message that corporations see the value of research as declining. Worse, IBM and HP will see this and claim it confirms their outlook. So I am not optimistic about the future of corporate research.

One model that could evolve is that universities do the far-out stuff and industry focuses on the shorter term.

How about a totally different alternative model? Consider the following premises:

1. People do not need to spend half of their lives in formal schooling to start doing cutting-edge work. Evidence: China's BGI has 16- and 17-year-old kids working on Science papers. Skip grad school and even undergrad and put them straight into the deep water.

2. Most academic research outside of the top 5-10 schools in any field is not useful, even by academic standards. Evidence: citation rankings and any quantitative measure of impact show a tremendous power-law distribution.

3. Many of the people capable of contributing at a high level in academia have the ability to start significant companies and create genuine wealth. Evidence: Gates, Brin, Page, Zuckerberg, YCombinator, etc. are all full of people who could easily be top faculty at major CS and Stats departments.

4. Wealth creation is nonzero sum, while social status is zero sum. They call it socioeconomic status for a reason. Academics have dialed down the economic portion and compete on raw social status. It's about Harvard, best paper awards, tenure, prestigious journals, etc. The problem is that said resources are by nature zero sum, or they wouldn't be prestigious. Wealth creation is different: for A to become richer, B does not have to become poorer. Proof: the total wealth of today (cellphones, computers, etc.) is far greater than in 1000 AD.

5. Code is (far) more useful than papers. Academics feel the urge to publish papers because it is about career advancement. But peer review is far less discerning a filter than open source. Few papers have been reviewed -- or built on -- like Linux, Django, Rails, or node.js.

The conclusion is this: at least within the science and engineering fields, those capable of going into academia should do a startup with the intention of making enough to live on comfortably for the rest of their lives. This isn't that hard to do if you set yourself to it singlemindedly. In so doing you will tend to create wealth for the world, and also appreciate just how difficult capitalism is relative to academia.

Once you've provided enough for others and pushed the cart quite a bit, you can sit in the cart. Being able to work on whatever you want is in fact a luxury, and right now other people are paying for it (and not very much at that). It's far, far better if you pay for it yourself.

Some will say that only the greats could achieve early retirement and become gentleman scholars -- the type who discovered pretty much everything in the pre-1945 era. That's not true by a long shot, but even if it were, *pure* science is really only about the greats anyway. Filling in of epsilons is already on its way to applied science and does not require the same sort of early-retirement-focus to develop.

The Cambridge Intel lablet closed some time back; several people moved to lablets in the US (and now have to move again :( ). A local beneficiary was MSR Cambridge, which, as noted, does rather good basic research. Note that MSR (for whom I was on the TAB for 8 years) doesn't do "random" stuff; they deliver a fair amount to dev groups. They just aren't "slaved" to business units, but they do have to justify their existence on an aggregate delivery rate. Somehow, I suspect, being responsible for the Kinect stuff should keep them alive for a while.

The problem for the Intel lablets is that they were created (by Tennenhouse) in a deliberately disruptive way, but also in a way that left few possible paths by which they'd EVER make anything for the bottom line of Intel Corporation directly (only very indirectly).

So I don't think this is any "end of an era" event at all -- just the end of one interesting experiment, just as MSR's and Google's two models are also interesting experiments. If either gets to last half as long as Bell Labs (or Xerox PARC), then congratulations.....

Of course, it's too early to say if Cambridge University is a useful experiment -- we've only been here 800 years.

A wise older colleague, Lotfi Zadeh actually, observed some years ago that "research nirvanas never last forever." In my 35-year career, I have seen the rise (and fall) of Bell Labs, Xerox PARC, and IBM Research, which were once -- and are no longer -- "unfettered research laboratories, unconstrained by the winds of the marketplace." This is not to say that they don't do useful and good work -- just that the nature of the work they do is now constrained by their internal and external patrons. The Intel lablets were an experiment for Intel that had run its course. Ten years is actually a pretty good run these days. I would argue that MSR has a good chance to join this list at the next (first?) regime change in management (when Ballmer is finally booted out -- I can say this as a very disappointed investor in Microsoft, and Intel too). That said, while "pure" industrial research may no longer be much of a career opportunity, the opportunity to have broad impact has improved. A couple of guys contributing to the right open source project -- e.g., Hadoop -- can have amazing impact.

I think the Intel closure is a shame as many will take the wrong lessons from it (many mentioned above). The structure was truly revolutionary and key to putting Intel near the top of CS research almost overnight (in 5 years). The true lesson is this: industrial research labs cannot survive without support at the top level of the corporation (like MSR enjoys). The Intel lablets have not had this for a number of years and a huge portion of the more established R&D org was so jealous of this "new" org that they made Intel Research change their name and eventually killed them outright. It was only a matter of time...

The word on the street is that what Intel is doing (funding the universities directly) is just a stop-gap until it stops funding research completely. It's just a PR move to make people stop bitching in the meantime.

How about a totally different alternative model?

@Anon: I like your post, it made me think. Most professors tend to spend most of their time reminiscing these days. We need more people to think about the current trends and FIND A WAY TO FIX THINGS.

2. Most academic research outside of the top 5-10 schools in any field is not useful, even by academic standards.

I don't think this is true in general. Actually, it's quite demeaning. Albert Einstein worked at the patent office; he wasn't even anywhere close to a university when he did what he did. Not all smart people decide to, or can, go the PhD route. Not all PhDs are smart. But more importantly, this is coupled with the following point:

4. Wealth creation is nonzero sum, while social status is zero sum. Academics have dialed down the economic portion and compete on raw social status. It's about Harvard, best paper awards, tenure, prestigious journals, etc. etc. The problem is that said resources are by nature zero sum, or they wouldn't be prestigious.

What the two points say is that the total amount of social status to be passed around is a constant, and that's ok because only the top few universities do decent research, and that's ok because the amount of social status to be passed around is a constant, and.... It's circular reasoning.

What happens if we break this mentality that we need more and more papers to show we're worth something? What should be replacing papers and conferences?

The conclusion is this: at least within the science and engineering fields, those capable of going into academia should do a startup with the intention of making enough to live on comfortably for the rest of their lives. This isn't that hard to do if you set yourself to it singlemindedly. In so doing you will tend to create wealth for the world, and also appreciate just how difficult capitalism is relative to academia.

The most obvious problems with this idea are:

1) It's not clear that most good researchers will make successful founders.

2) Even for those who have what it takes: if your life goal is to do research, do you want to spend perhaps a decade founding companies before you can start doing it?

3) Given that you probably can't do good systems work alone these days, you will need to pay for equipment and assistance over (say) 30-50 years, plus your own living expenses, and still have money left for retirement. I don't know the actual numbers, but it seems like this is more than what an average startup exit yields (let's say $15-$30M for 2-3 founders).

Anon re: "totally alternative model" - beautiful comment, worth a blog post in its own right, why the heck did you post this anonymously?

This is coming back to a common theme on this blog about the role of academic research in today's world. I don't agree with everything you put forth. One of the important roles that academia plays is training the next generation and I think that it is critically important for research to be part of that -- in part because not enough good people would do that job if it only involved teaching, crappy salaries, and no tenure.

That said it's true that it's not 1945 anymore and we all can't be Richard Feynman.

@crowcroft ask DT about the reasoning behind the SRPs - the IR model wasn't to be "impact-free by design."

That said, I don't know exactly what bottom-line impact Intel's C-level would have accepted from a long-term research organization. The reality is that truly inventive work takes forever to become innovation on a financial scale that a multinational cares about. Sometimes an organization can lay claim to some of the credit for a success like Kinect, and that's enough. But as Landay suggests, "enough" is always and entirely a function of what the C-level wants it to be...today.

About Me

Matt Welsh is a software engineer at Google, where he works on mobile web performance. He was previously a professor of Computer Science at Harvard University. His research interests include distributed systems and networks.