Vulnerability could make Tor, the anonymous network, less anonymous

The Tor network—used by activists, journalists, law enforcement, and yes, criminals—is famous for cloaking web surfers’ identities and locations. And, apparently, it contains a vulnerability that poses a risk to all that protective anonymity, according to researchers at MIT and the Qatar Computing Research Institute (QCRI).

The good (or bad) news—depending on how you view Tor—is that they say they’ve also come up with a fix for the problem, which they will demonstrate at the Usenix Security Symposium next month, according to an MIT News story, “Shoring up Tor.”

An estimated 2.5 million people—including journalists, political activists, terrorists, or just consumers who don’t want to share their browsing histories with Facebook or other commercial entities—use Tor daily. And that is why the network is of keen interest not only to “repressive” regimes like Russia and Iran but to governments a lot closer to home, including our own. Not to put too fine a point on it: one person’s activist can be another person’s terrorist. But I digress.

Tor works by anonymizing the transport of your data. Like an onion, Tor encrypts the data you send through the web in multiple layers. Your data is then “relayed” through other computers. Each relay sheds one layer, and the data finally arrives at its destination in full form. The software bounces users around a network of open connections run by volunteers all over the globe. This prevents people from spying on your Internet connection and discovering the sites you visit. Tor scrambles information that could pinpoint your exact physical location.

Using a Tor-configured browser, the user enters her request; it is automatically swaddled in those encryption layers and sent to the next, randomly chosen machine that runs Tor. This machine, called “the guard,” peels off the first encryption layer and forwards the still-masked request on until it finally reaches a randomly chosen “exit” machine, which strips off the final layer of encryption to reveal the destination.

Only the guard machine knows the sender and only the exit machine knows the requested site; no single computer knows both.
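For the curious, here is a minimal sketch of that peel-one-layer-per-hop idea in Python. It uses the third-party cryptography package for the symmetric encryption, and the three-node path and node names are an invented toy, not Tor’s actual protocol or cell format.

```python
# Toy onion-style layered encryption: each relay peels exactly one layer.
from cryptography.fernet import Fernet

# One symmetric key per relay on a hypothetical three-hop path.
keys = {name: Fernet.generate_key() for name in ("guard", "middle", "exit")}

def wrap(request: bytes, path) -> bytes:
    """Encrypt the request once per relay, innermost layer for the exit node."""
    data = request
    for name in reversed(path):          # exit layer first, guard layer outermost
        data = Fernet(keys[name]).encrypt(data)
    return data

def relay(data: bytes, path) -> bytes:
    """Each relay removes one layer; only after the exit node is the request visible."""
    for name in path:
        data = Fernet(keys[name]).decrypt(data)
    return data

path = ["guard", "middle", "exit"]
onion = wrap(b"GET https://example.org", path)
assert relay(onion, path) == b"GET https://example.org"
```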

The network also offers “hidden services” that enable an activist to aggregate sensitive news reports and make them available to select users, but not the world at large. That is, the archive is not searchable or available on the public Internet.

The creation of those collection points, which involves the building of what Tor calls a “circuit” of machines, offered the researchers a way to snoop on Tor. By connecting a ton of their own machines to the network and then analyzing traffic, they were able to identify likely guard machines.

From the MIT report:

The researchers showed that simply by looking for patterns in the number of packets passing in each direction through a guard, machine-learning algorithms could, with 99 percent accuracy, determine whether the circuit was an ordinary Web-browsing circuit, an introduction-point circuit, or a rendezvous-point circuit. Breaking Tor’s encryption wasn’t necessary.

Furthermore, by using a Tor-enabled computer to connect to a range of different hidden services, they showed that a similar analysis of traffic patterns could identify those services with 88 percent accuracy. That means that an adversary who lucked into the position of guard for a computer hosting a hidden service, could, with 88 percent certainty, identify it as the service’s host.
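To make that attack concrete, here is a hedged sketch of what traffic fingerprinting can look like: a standard classifier trained on coarse, per-circuit packet statistics, with no decryption involved. The feature set, the random placeholder data, and the choice of a random forest are illustrative assumptions, not the researchers’ exact pipeline.

```python
# Classify circuit type from coarse traffic statistics alone (illustrative sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Each row stands in for one observed circuit, e.g.
# [packets sent, packets received, burst count, mean inter-arrival time].
# Real features would come from circuits the attacker controls; these are placeholders.
X = np.random.rand(600, 4)
y = np.random.choice(["web", "introduction", "rendezvous"], size=600)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
print("circuit-type accuracy:", clf.score(X_test, y_test))
```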

The researchers, including Albert Kwon, an MIT graduate student in electrical engineering and computer science, and Mashael AlSabah, assistant professor of computer science at Qatar University, and a QCRI researcher, said the fix lies in obscuring data traffic patterns to and from the guard machines in a way that renders such “traffic fingerprinting” ineffective.

If the network sends around enough dummy packets so that all the data sequences look the same to prying eyes, problem solved, and anonymity remains safe.
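A toy sketch of that padding idea follows, with invented constants rather than Tor’s real cell or burst sizes: pad every burst of real cells with dummy cells so all bursts look identical on the wire.

```python
# Pad each burst to a fixed size so traffic patterns carry no information.
CELL = 512          # bytes per cell (illustrative)
BURST = 100         # target cells per burst after padding (assumption)

def pad_burst(real_cells):
    """Return the burst plus enough dummy cells to reach a fixed length."""
    dummies = [b"\x00" * CELL] * max(0, BURST - len(real_cells))
    return list(real_cells) + dummies

burst = [b"\x01" * CELL] * 37      # 37 real cells
print(len(pad_burst(burst)))       # always 100, regardless of the real count
```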

This 19-year-old founder wants to take on Texas Instruments and other chip giants

Thomas Sohmers, the founder and CEO of Rex Computing, never finished high school. But that didn’t stop him from working at MIT or developing a new chip architecture that he says is more powerful and 25 times more energy efficient than anything the major chip vendors can offer today. He’s now raised $1.25 million from Founders Fund’s FF Science Fund to build his idea, with sample batches of the chip slated for production in the middle of next year.

Sohmers, who at 19 has already established his bona fides in the industry as a Thiel Fellow and as the head of the high-performance computing group at the Open Compute Project (I first met him at a Facebook hackathon back in early 2014), has redesigned his processor’s architecture to optimize the way it handles memory, improving both energy usage and performance. The net result is a processor that can deliver 64 gigaflops of computing per watt. Flops (floating-point operations per second) are the standard measure of supercomputing performance, and the effort matters because the entire chip industry is searching for ways to build the next generation of high-performance computers that won’t require a power plant to operate.

The fundamental breakthrough that Sohmers has come up with is how he designed the memory on his chip, which he calls the Neo processor. Like many of the most recent efforts to design a new architecture, the Neo processor is a massively multicore chip—it has 256 cores. However, most chips manage memory in hardware caches, which define where each core’s data lives and how the core accesses it. The Neo instead gives each core access to a scratchpad memory that is, in effect, software defined. This makes things a bit more complicated on the software side when the code is compiled, but it makes execution faster and offers more flexibility while the processor is working.

So instead of shuffling data around between fixed caches on the chip, the processor cores can throw data into the scratchpad quickly and let the compiler work out placement ahead of time. This is where the energy efficiency is gained: moving data across a chip can take 40 times more energy than computing on that same data, and it slows things down. By making memory less locked down in the hardware, Sohmers is saving energy and boosting speed.
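To illustrate the distinction, here is a toy Python model of a hardware-managed cache versus a software-managed scratchpad. The capacity, access pattern, and energy numbers (including the 40x movement figure mentioned above) are toy values; this is a conceptual illustration, not Rex Computing’s hardware or compiler.

```python
# Toy comparison: hardware cache (implicit data movement) vs. software scratchpad
# (movement planned once, up front).
from collections import deque

ENERGY_COMPUTE = 1      # arbitrary units per arithmetic operation
ENERGY_MOVE = 40        # moving data costs ~40x the compute energy (illustrative)

class Cache:
    """Hardware-managed: placement is automatic; misses silently pay the movement cost."""
    def __init__(self, capacity=4):
        self.lines, self.capacity, self.energy = deque(), capacity, 0
    def load(self, addr):
        if addr not in self.lines:                 # miss: fetch from farther away, maybe evict
            self.energy += ENERGY_MOVE
            self.lines.append(addr)
            if len(self.lines) > self.capacity:
                self.lines.popleft()
        self.energy += ENERGY_COMPUTE

class Scratchpad:
    """Software-managed: the compiler stages exactly the data the core needs, once."""
    def __init__(self):
        self.slots, self.energy = set(), 0
    def stage(self, addrs):                        # explicit, planned data movement
        self.energy += ENERGY_MOVE * len(addrs)
        self.slots.update(addrs)
    def load(self, addr):
        assert addr in self.slots, "the compiler must have staged this address"
        self.energy += ENERGY_COMPUTE

cache, pad = Cache(), Scratchpad()
addrs = list(range(8)) * 4                         # reuse the same 8 values over and over
pad.stage(set(addrs))
for a in addrs:
    cache.load(a)
    pad.load(a)
print(cache.energy, pad.energy)                    # planned movement wins: 1312 vs. 352
```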

If this sounds familiar, it’s because it’s an approach that’s happening throughout the data center. However, any new chip instruction set will require specialized software, which means the market for the Neo will start small. The chip will suit specialized computing jobs where it can deliver huge performance gains. To that end, Sohmers is starting with digital signal processors and the market for cellular base stations. So far, he said his chips are about 25 times more efficient than the current generation products on the market, so Texas Instruments should watch out.

The massively multicore architecture he’s developed works well for these types of number-crunching jobs. After that, machine learning calculations and jobs performed by graphics processors, such as those made by Nvidia and AMD, are the next targets for his new architecture. He said the Neo is about 10 to 15 times more efficient than existing GPUs when performing some of the machine learning calculations required for computer vision.

As for his relative inexperience in the chip industry and life in general, Sohmers believes that’s what helped him see a solution to the energy problem.

“I just fell into this whole thing with computers and eventually into processor design,” he said. “I just wanted to understand how they work and had a bit of this crazy idea that we had some misstep in computer architecture, and not having 20 years of hindsight, not having industry experience, and not being told why we did this was just a big help in seeing this fundamental thing that we could do differently to change how they work.”

MIT researchers have found a way to make rotten software fresh again

It turns out software and fruit have something in common: they both can rot. But while a banana that’s been sitting out too long can’t turn fresh again, old software code that no longer functions like it once did can be revitalized.

A team of researchers from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory has discovered a way to automatically restore old, poorly performing software to fresh code, MIT said this week. This is a big deal for companies that might otherwise spend thousands of dollars on developers to restore old software that no longer works as efficiently as it once did.

Oftentimes, software programs written several years ago no longer function properly on the latest computers and hardware, explained MIT Professor Saman Amarasinghe in an interview with Fortune.

The reason is that software has to be tailored to the computers it runs on. Each time hardware technology advances and the performance of machines increases, a software program needs to be rejiggered to accommodate the changes.

Typically, coders spend many hours going through the program’s code to make changes, add more lines, and add algorithms that ensure the software works efficiently with the hardware specs of the day. However, each time a programmer tweaks the software and adds new code, the program steadily becomes more bloated.

Additionally, the algorithms and extra code used to match the software to current hardware specifications can often become invalidated as the next-generation gear launches into the market. Coders can have a difficult time removing all the extras, which leads to a program that experiences what is known as software rot. Once the code is rotten, the program as a whole can be slow as molasses and even unresponsive when you try to get it up and running.

This is where the MIT researchers come in.

The MIT team developed a software system called Helium that basically scans the rotten software, discovers the most crucial lines of code that the original programmers developed to make it function, and then builds a revised version of the program that works with the hardware specs that the MIT team wants it to match.
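As a rough illustration of what those “crucial lines” tend to be in image-processing code, here is a generic 3x3 box-blur kernel in Python. A tool like Helium would aim to recover logic of roughly this shape from the aging program and re-emit it in a form tuned to current hardware; the kernel below is an invented example, not Helium’s actual input or output.

```python
# A generic image-processing kernel of the sort that forms the "heart" of a filter.
def box_blur(img, h, w):
    """img is a row-major list of h*w grayscale pixels; returns a blurred copy."""
    out = img[:]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = sum(img[(y + dy) * w + (x + dx)]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y * w + x] = acc // 9
    return out

flat = [10] * 25                       # a flat 5x5 gray image stays flat
assert box_blur(flat, 5, 5) == flat
```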

What used to be a months-long process, the MIT researchers claim, can now be cut down to an hour.

Amarasinghe explained that MIT tested out its Helium system with some engineers from Adobe who were looking for ways to make the over-two-decade-old Photoshop program run faster and process images more quickly with the advancements in modern hardware tech.

Whereas Google and Facebook run software that might be only a few years old, Adobe and other companies have programs and applications that haven’t aged as well over the years, Amarasinghe said. In this case, the Adobe engineers wanted a way to make Photoshop run better without having to spend the traditional manpower needed to do the job.

MIT claims that when they used Helium to revise Photoshop, the program’s performance increased by 75 percent. Say goodbye to the slow loading times for your favorite vacation pictures you want to make more colorful.

As of now, Helium is not quite ready for primetime and it could be a few more years before it’s ready for widespread use, Amarasinghe said. Adobe was only testing out Helium and is not going to be using it in its products as of now.

Also, while Helium can restore older code, it currently works only on image-processing software like Photoshop, because getting a program to display an image is an easier task than handling other, more complex software, he said.

Still, if Helium pans out, Amarasinghe said it could have a lot of benefits for businesses.

The healthcare industry, for example, could save a lot of time upgrading the image-processing software in medical devices like X-ray machines as the hardware advances.

How startups can save nuclear tech

A recent, disturbing report predicts that despite a colossal number of new solar panels and wind turbines over the next quarter century, the planet will still face dangerous rising temperatures. Basically, even if these widely embraced clean energy technologies are put into overdrive, we’re still probably screwed.

The report was understandably bearish on big growth in nuclear power. Following the tragic earthquake and accompanying meltdown of the nuclear reactors in Fukushima, Japan in 2011, new nuclear reactor construction has been largely halted in many countries. Public fear over safety, especially following such incidents, has long hampered the industry, and led to it being the sort of black sheep of clean power.

But four years after the infamous accident, environmentalists, nuclear advocates, and researchers are now looking at nuclear tech as an almost necessary way to generate power without carbon emissions that, if used correctly, could be crucial to help the world avoid the worst of global warming. And unlike with solar and wind, nuclear reactors generate power around the clock.

Tapping into this emerging sentiment is a new wave of entrepreneurs and investors, many in Silicon Valley, who are passionate about how tech innovation can lift the industry out of its nuclear stalemate.

The new guard is working within a nuclear industry that is stuck in the regulatory and financing patterns of old. Big conglomerates dominate, and career nuclear execs are the norm. But the lack of interest in new tech and new ideas is in sharp contrast to the high stakes of averting planetary catastrophe.


The Valley

Last month, beneath the high-vaulted ceilings of the sleek offices of Founders Fund, a venture capital firm that backed Facebook, Airbnb, and SpaceX, sat a small group of these passionate nuclear evangelists. They were there to appear on a panel about nuclear energy that I was moderating, but they were far more interesting to listen to just chatting on their own.

There was a spread of wine and cheese in one corner, and around the room were windows big enough to take in the San Francisco Bay in the background. The wooden door into the office is so preposterously large that you can’t help but instinctively feel what the firm seems to be implying: We do big things. At one point, Founders Fund adopted the motto “we wanted flying cars, instead we got 140 characters” as a sort of dismissal of the preoccupation among many Silicon Valley investors with the next hot consumer app.

Last summer, Founders Fund invested a small $2 million seed round in an early-stage nuclear startup called Transatomic Power. Founded in 2011 by MIT nuclear scientists Leslie Dewan and Mark Massie, Transatomic Power is working on a nuclear reactor that uses molten salt and nuclear waste as a power source. While molten salt reactor technology is decades old, Dewan and Massie are using new designs and materials.

Dewan, who looks a bit like actress Amy Adams, is one of the clear leaders of this growing movement of nuclear entrepreneurs. The nuclear policy advocates in the room refer to her as their “secret weapon,” deployed in meetings on Capitol Hill to inspire and make connections. She’s young, well-spoken, and doesn’t fit the profile of the typical nuclear industry exec.

She and the senior statesman in the room, venture capital icon Ray Rothrock, take turns interrupting each other to answer questions about the hurdles that face startups in the nuclear sector. Unlike Dewan, Rothrock fits the physical description of the typical nuclear exec, but he’s actually anything but.

Rothrock’s been a venture capitalist for over two decades. He backed Sun Microsystems early on, launched the Internet investing practice for VC firm Venrock in the early ’90s, and later formed the firm’s energy investing practice. He started his career as a nuclear engineer in the late ’80s and early ’90s before getting the investing itch.

Some of Rothrock’s nuclear ambitions are poured into a stealthy startup, Tri Alpha Energy, that is working on nuclear fusion (nuclear fission is what’s used in today’s reactors). Years ago Venrock backed Tri Alpha Energy, and the company now also has the financial support of the Russian government (through the nanotech company Rusnano), Microsoft co-founder Paul Allen, and Goldman Sachs. Rothrock is Tri Alpha Energy’s chairman.

Nuclear tech renaissance?

Dewan and Rothrock say nuclear tech is undergoing a period of rare entrepreneurial innovation. Rothrock says that compared to when he was in school, there are many more students today working on nuclear tech projects because of the pressing threat of climate change.

They cite figures of 55 nuclear startups with a total of $1.6 billion in funding. Rothrock says people don’t believe him when he says there are that many startups working on nuclear technology. But those numbers are tiny compared with the crowds of on-demand delivery and mobile photo-sharing startups flush with millions of dollars from venture capitalists.

Founders Fund partner Scott Nolan, who made the investment in Transatomic Power, compares the nuclear entrepreneurial tech movement to what’s been happening in aerospace with young disruptor SpaceX and others. Nolan was an early employee at SpaceX and helped develop the propulsion systems used on the Falcon 1 and Falcon 9 rockets and the Dragon spacecraft.

Investors beyond Venrock and Founders Fund are starting to take notice. Last month the head of Silicon Valley accelerator Y Combinator, Sam Altman, said he was joining the boards of two nuclear startups: Helion Energy, which is building a magnet-based fusion reactor, and UPower, which has designed a small modular fission reactor. Unusually for nuclear companies, both went through the Y Combinator program.

Then there’s young nuclear fusion startup General Fusion, which is backed by Canadian venture capitalists Chrysalix Venture Capital, Amazon CEO Jeff Bezos, and the Canadian government. The company hopes to have a working prototype of its fusion reactor this year and an operating reactor in 2020.

Bill Gates has backed a nuclear startup called TerraPower, which was a spinoff from the intellectual property incubator Intellectual Ventures. The company is working on a technology called a traveling wave nuclear reactor, which uses nuclear waste as a power source.

There’s also NuScale Power, which has been working on small, modular water reactors. Instead of the massive 1 gigawatt nuclear power generators that are in use today, NuScale Power makes a smaller 50 megawatt one (1,000 megawatts make 1 gigawatt).

Many of these nuclear startups, like Transatomic Power and TerraPower, are trying to address the issue of nuclear waste with their new reactor designs. That’s because the problem of what to do with the radioactive byproduct left after a reactor has used up its fuel is both complicated and important for the future of the nuclear industry.

Despite the dozens of nuclear startups these days, the sector is far from welcoming to entrepreneurs. Combining the difficulties of the startup world with the difficulties of nuclear technology is a colossal endeavor.

Nuclear technology can take decades to move from the lab into commercialization. Many nuclear startups don’t even give commercialization estimates because it can be so many years away. That makes these companies difficult for investors to back, because they don’t know when to expect a return.

The traditional nuclear industry also requires billions of dollars to get nuclear reactors built. The first nuclear plant under construction in the U.S. in decades, in Georgia, is estimated to cost $14 billion, and is now over budget and behind schedule.

Access to funding will be a persistent problem for all of these nuclear startups. NuScale Power faced a funding crunch and sold a majority stake to Texas energy services company Fluor. Helion Energy is in the process of raising a $21.23 million round and has closed on about half of it.

But the biggest barrier for a nuclear startup could be regulation. It can take years to wade through the regulatory process of the U.S. Nuclear Regulatory Commission.

The NRC was born out of the Atomic Energy Commission of the 1940s, and its history is tied to nuclear arms and the military. It is a closed, enforcement-oriented regulatory body, not one equipped to build relationships with entrepreneurs.

Photo caption: Construction workers surround the passive containment cooling water storage tank of nuclear Unit 1 in Sanmen, one of over two dozen new reactors being built in China today. (Photo: Stefen Chow)

Rothrock has been advocating for an entirely new federal nuclear agency that would operate like the federal Food and Drug Administration. When pharmaceutical companies submit drugs to the FDA for approval, they have to meet certain milestones with their studies and prove that the drugs work. Rothrock thinks an ‘FDA for nuclear’ could create the same kind of predictable milestones for nuclear reactors and enable entrepreneurs and startups to get their new reactor technologies off the ground much more quickly.

If Rothrock and Dewan can help deliver changes in the nuclear tech industry, it could help open up this much-neglected sector to some of the brightest and most passionate young minds. Instead of flocking to Silicon Valley to create mobile apps, maybe the top students will turn to nuclear, like Dewan did.

And if the U.S. nuclear industry doesn’t change, many of these innovations will be commercialized overseas. China’s appetite for any new power generation tech is voracious. Russia is also particularly interested in new nuclear power.

Dewan and Rothrock have an important mission. With climate change looming on the horizon, nuclear technology could be the planet’s best hope.

Researchers create “skin grafts” for buggy software

Some day, businesses creating software plagued with bugs may be able to fix it automatically by stitching it up with clean, unbroken code.

Researchers at the Massachusetts Institute of Technology showed off an automated code-repair system at a computing conference earlier this month that works much the way a surgeon applies a skin graft to damaged tissue. The new system detects bugs, takes healthy code from a publicly available source, and then grafts it onto the sick software.

Presto, problem solved.

The system taps into the growing number of public software libraries like GitHub and BitBucket that anyone can draw upon for free to help with their own software projects. The MIT team’s system takes advantage of the reality that many projects — although different — often share similar code. Part of one project can be fused onto another, at least in theory, to fix a bug. The trick is to get a computer to recognize the problematic code and then fuse the healthy code onto it.

The MIT team tested the system, called CodePhage, on seven open-source programs that they found bugs in. Each software repair took between two and ten minutes.
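Conceptually, the graft works something like the toy Python sketch below: a recipient routine missing an input check borrows a guard that a donor program already has. This is only an illustration of the idea; CodePhage itself analyzes and patches real programs, not snippets like these.

```python
# Toy illustration of "grafting" a donor program's input check onto buggy recipient code.

def recipient_parse(field: str) -> int:
    # Buggy: raises an exception on empty or malformed input.
    return int(field)

def donor_guard(field: str) -> bool:
    # The donor project validates the same kind of input before using it.
    return field.isdigit()

def patched_parse(field: str) -> int:
    # "Grafted" version: the donor's check wrapped around the recipient's logic.
    if not donor_guard(field):
        return 0          # safe default instead of crashing
    return recipient_parse(field)

print(patched_parse("42"), patched_parse(""))   # 42 0 — no crash on bad input
```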

Fixing buggy code is becoming increasingly important as more companies like car manufacturers and airlines create software. It could also be a way for companies to save huge amounts of money on programmers by having computers do the grunt work of finding and fixing bugs.

In early June, an Airbus military plane crashed and killed four people reportedly because of badly installed engine control software. Although the software itself didn’t contain any known bugs, once it was installed it created unforeseen conflicts with other software that led to the aircraft’s engines stalling. The lesson to take away is that while software may not appear to be buggy, once you combine it with other software and systems in a live environment, there could be unforeseen glitches.

The MIT team isn’t the only one working to solve problems with buggy software. For example, Burlington, Mass.-based Veracode, which last year raised $40 million and is prepping for an initial public offering, sells a bug-scanning system. Developers can hook up the tools they use to build software programs to Veracode’s cloud-based service that then scans their apps for potential glitches.



The toolkit was already available as a free download, and the institute, backed by MIT and Harvard University, will continue to support it whether scientists run it in their own facilities, on another cloud, or on the Google cloud. The Cambridge, Mass.-based Broad Institute has aggregated the world’s largest collection of genetic data about diseases.

The advent of massive-scale public clouds has huge potential benefits for scientific and medical researchers who need to run (and re-run) huge data analyses. In the past, such researchers would often have to sign up and wait for time on a supercomputer, which can take weeks or even months.

It’s not so much that cloud computing saves money and time as that it enables number crunching and research that most scientists could not have considered in the past.

“Large-scale genomic information is accelerating scientific progress in cancer, diabetes, psychiatric disorders, and many other diseases,” said Eric Lander, president and director of Broad Institute in a statement. “Storing, analyzing, and managing these data is becoming a critical challenge for biomedical researchers.”

This sort of research is an ideal showcase for the power of cloud computing, which is one reason Google, Amazon, and Microsoft are all eager to show that their clouds can handle the toughest medical, biomedical, and genomic research tasks.

Watch the scariest robot in the world jump over stuff automatically

It’s bad enough that Boston Dynamics has made a robotic cheetah that can run nearly 30 m.p.h. (48 km/h). Now MIT has its own cheetah robot that can autonomously leap tall obstacles in a single bound. The robot uses lasers to see its environment, and the onboard computer uses a three-part algorithm to detect an obstacle, adjust its approach, and then calculate the appropriate jump trajectory. The entire process takes about 100 milliseconds. Right now the cheetah can clear hurdles as high as 18 in. (46 cm) at an average running speed of 5 m.p.h. (8 km/h).
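A hedged sketch of that three-stage pipeline is below, with invented numbers and a simple ballistic model rather than MIT’s actual controller: detect the obstacle in the laser scan, adjust the approach stride, then compute a takeoff velocity.

```python
# Toy three-stage jump planner: detect, adjust approach, plan takeoff.
import math

SPEED = 2.2     # m/s, roughly the 5 m.p.h. running speed mentioned above
G = 9.81        # m/s^2

def detect_obstacle(scan):
    """Return (distance_m, height_m) of the nearest obstacle in a laser scan, or None."""
    hits = [(d, h) for d, h in scan if h > 0.05]
    return min(hits) if hits else None

def adjust_approach(distance, stride=0.5):
    """Stretch or shrink the stride so the last footfall lands just before takeoff."""
    steps = max(1, round(distance / stride))
    return distance / steps

def jump_velocity(height, margin=0.1):
    """Vertical takeoff speed needed to clear the obstacle plus a safety margin."""
    return math.sqrt(2 * G * (height + margin))

scan = [(1.8, 0.0), (2.4, 0.46), (3.0, 0.0)]      # (distance, height) pairs from the laser
obstacle = detect_obstacle(scan)
if obstacle:
    dist, height = obstacle
    print(f"{dist / SPEED:.2f}s until takeoff,",
          f"stride {adjust_approach(dist):.2f} m,",
          f"launch at {jump_velocity(height):.2f} m/s")
```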

MIT researchers are planning to demonstrate their cheetah’s abilities at the DARPA Robotics Challenge in June.

Here are the jobs automation will kill next

When automated factories started erasing jobs at manufacturing companies, most of us shrugged: Great, better products cheaper, was the general line of thinking.

But as automation keeps creeping up the stack, taking over more of what most would call “skilled” positions, well, that’s getting some folks—those who consider themselves skilled professionals—nervous.

Take airplane pilots, for example. That’s now a dead-end job, according to Mary “Missy” Cummings, director of the Humans and Autonomy Lab (HAL) at Duke University (and a former Naval fighter pilot). She said that “in all honesty” she could not recommend that anyone become a commercial airline pilot going forward, given the current state of the art.

“Commercial pilots today touch the stick for three to seven minutes per flight—and that’s on a tough day,” she told an audience at the MIT CIO Symposium on Wednesday.

So, the gist: if you like to fly, make enough money in some other career so you can pursue it as a hobby. The broader problem, Cummings noted, is that humans tend to get jaded when they’re not doing useful things. And that is bad, even dangerous.

“Boredom sets in when you babysit automated systems and if you think this is a problem in aviation, just wait till driverless cars come around,” Cummings said.

Complacent people still have an expectation that if and when something goes wrong at the wheel or the joystick—which it will—a human will intervene at the right time. Which won’t happen, she said.

But it’s not just pilots and drivers who are endangered species. Journalists are also on the block. Automated Insights has programmed the creation of earnings stories and sports stories that show up in newspapers around the world. The company’s computers churn out 3,000 earnings stories per quarter for the Associated Press at an average cost of less than $8 per story.

“The AP did 300 earnings stories per quarter, now they do 3,000,” said Robbie Allen, founder and CEO of the Durham, N.C.-based company, which was purchased in February by Stats.

Allen stressed, perhaps sensing reporters in the room, that his service augments, rather than replaces, reporters’ work. Publications still want to cover Google and Apple earnings, so maybe they’ll use an Automated Insights story as a template, a starting point, and add their own expertise, he said.

The company’s goal is not to create one story for a million people but a million specialized stories for one person each—all driven by data. The ultimate goal is complete personalization. One of the company’s early projects was to take fantasy football data and to create personal stories based on that data for those stats-crazed team managers.
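The underlying mechanics are easier to see in a toy example: one template, many data rows, one “personalized” story per row. The template and figures below are invented and are not Automated Insights’ actual software.

```python
# Toy data-to-text generation: one template, one story per data row.
TEMPLATE = ("{company} reported quarterly earnings of {eps:.2f} per share, "
            "{direction} analysts' estimate of {est:.2f}.")

def earnings_story(row):
    direction = "beating" if row["eps"] > row["est"] else "missing"
    return TEMPLATE.format(direction=direction, **row)

rows = [
    {"company": "Acme Corp", "eps": 1.42, "est": 1.30},
    {"company": "Globex", "eps": 0.88, "est": 0.95},
]
for row in rows:
    print(earnings_story(row))
```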

“I once asked how many reporters would be interested in writing a million fantasy football stories. And I got no takers,” he said.

Coincidentally, NPR ran a story Wednesday morning pitting veteran reporter Scott Horsley against Automated Insights’ WordSmith program in writing an earnings report. It took Horsley about seven minutes and WordSmith about two minutes. And I’ll bet Horsley makes more than $8 for that effort. Yikes.

Professor Tomaso Poggio of MIT’s Center for Biological and Computational Learning ranked the people who are most at risk of professional obsolescence.

Those at what he called the highest level of skill—the engineers and scientists who actually build and fix stuff—and those at the lowest level—plumbers, who also actually build and fix stuff—will be safe. It’s those in the middle, the lawyers, financial advisors, etc., who are at risk.

“The middle jobs will disappear,” he said.

So what should individual humans do to protect themselves from professional dead ends? Unsurprisingly, all four panelists at this session recommended programming skills. The thinking, presumably, is that the robot cannot become your overlord if you’re the one programming the robot.

But getting back to the NPR story on Automated Insights: For what it’s worth, Horsley’s story was much better.

Philips moves R&D center to Cambridge, Mass. and pledges $25 million to MIT

Dutch company Philips is heading to the Massachusetts Institute of Technology (MIT) in Cambridge. The company has signed a $25 million, five-year research alliance with the university and is moving its U.S. R&D headquarters to Cambridge, Mass. from its current headquarters in Westchester County, New York, where it’s been for the last 67 years.

“The old headquarters were founded very much on this older philosophy of R&D where you needed to be in a quiet place for research and then you handed your ideas to the business for commercialization,” said Henk van Houten, executive vice president & general manager, Philips Research.

But now that philosophy has changed and R&D must be more integrated with the business, as well as with startups and other potential partners in big businesses and academia. Given that Philips will focus on lighting and healthcare technology for its R&D, Boston makes a considerable amount of sense, especially on the health side. There are plenty of academics who can parse data as well as research hospitals willing to explore how the combination of sensors, connected technology, and predictive algorithms can come together to help deliver better patient care, especially in the home.

Remote patient monitoring and helping address the management of chronic diseases, such as Philips’ work with managing chronic obstructive pulmonary disease (COPD), are top research priorities. Other projects might include helping develop better machine learning for ultrasound, so it can be used as a diagnostic tool by those who are not doctors, said van Houten. On the connected lighting front, the main research areas would center around smart cities initiatives, where street lighting takes on multiple roles.

How tech can stop the looming food crisis

The world’s population is expected to increase from 7 billion today to 9 or 10 billion by the end of the century, according to the United Nations. We also can expect more pressure on the food supply as people in the developing world adopt middle class lifestyles, which usually involve eating more meat. To satisfy global demand, we will need to roughly double today’s output, which means getting smarter about how we produce and manage food.

The good news is that innovation is coming to the farm. Advanced information technology, improved communications systems, robotics, drones, and other new technologies have the potential to boost agricultural yields and reduce waste while tempering environmental degradation.

In the past, the world fed a growing population largely by cultivating undeveloped land and increasing agricultural inputs, including fertilizer and water. These are not very good options today. Clearing more land for agriculture now often means destroying rainforests and other valuable natural areas. Adding inputs makes sense in some poorer countries, where fertilizer and other resources are underutilized, but in most parts of the world, this strategy will mean heavier nutrient loads in waterways, depleted water supplies, and higher greenhouse gas emissions. Genetically modified crops have helped to boost production in recent years, but it appears that this strategy too may have limits – certainly political and possibly biological.

If our best chance of escaping a future food crisis is innovation, then an obvious place to look for solutions is food waste. No one knows for certain how much food the world wastes, but it seems that somewhere between 30% and 50% of the food we grow around the world goes uneaten. Waste occurs at almost every point in the chain—from farm to truck to warehouse to grocer to restaurant to household kitchen.

On farms in many parts of the world, food spoils because cold storage and transport are inadequate or non-existent. An obvious answer would be to install refrigeration in more farms, trucks, and warehouses, but this can be very costly. Another more economical approach is to equip farmers with better information and communications tools—smartphones are an obvious choice—so that farmers have information about markets at their fingertips and can better plan their harvests and distribution.

At the other end of the supply chain, retailers often find themselves with food that is bruised or otherwise unattractive but still edible. New IT and communications devices could help to connect this food with people who could use it. A pair of MIT Sloan students recently launched Spoiler Alert, a website, smartphone app and online marketplace that finds good uses for spoiled, expiring, and excess food.

Spoiler Alert and similar approaches enable food banks and other poverty-focused organizations to find out about these products and claim them. And if food truly is unfit for human consumption, it can be directed to places that want it for animal feed, fermentation, or biofuels. This year’s MIT Sustainability Summit, on the theme of Farming, Food, and the Future, focused on these “circular economy” approaches to food system strains.

Another way that innovation helps on the farm is in managing agricultural inputs. For the past dozen years or so, farmers have used tractors equipped with GPS and computers to collect data on how much fertilizer, water, and seeds are delivered where. The next generation of devices and systems includes robots that can move along rows of crops and identify where inputs are needed. Drones can gather similar information through high-resolution thermal and visual imagery.

These and other innovations will make agriculture more precise, which will increase yields and reduce inputs. But like earlier agricultural revolutions, this one too will find its limits. For agriculture to be truly sustainable, we will need to take a hard look at deeply ingrained attitudes and behaviors. For example, might Americans re-think a diet that includes meat twice a day every day? As wondrous as innovations in agriculture may prove to be, there are cultural challenges they will not be able to overcome.

Jason Jay is a Senior Lecturer and Director of the Sustainability Initiative at the MIT Sloan School of Management.