The Kyoto Protocol was the first agreement between nations to mandate country-by-country reductions in greenhouse-gas emissions. Kyoto emerged from the UN Framework Convention on Climate Change (UNFCCC), which was signed by nearly all nations at the 1992 mega-meeting popularly known as the Earth Summit. The framework pledges to stabilise greenhouse-gas concentrations "at a level that would prevent dangerous anthropogenic interference with the climate system". To put teeth into that pledge, a new treaty was needed, one with binding targets for greenhouse-gas reductions. That treaty was finalised in Kyoto, Japan, in 1997, after years of negotiations, and it entered into force in 2005. Nearly all nations have now ratified the treaty, with the notable exception of the United States. Developing countries, including China and India, weren't mandated to reduce emissions, given that they'd contributed a relatively small share of the century-plus build-up of CO2 to that point.

Under Kyoto, industrialised nations pledged to cut their yearly emissions of carbon, as measured in six greenhouse gases, by varying amounts averaging 5.2% by 2012, compared with 1990 levels. That equates to a 29% cut in the emissions that would otherwise have occurred. However, the protocol didn't become international law until more than halfway through the 1990–2012 period, by which point global emissions had risen substantially. Some countries and regions, including the European Union, were on track by 2011 to meet or exceed their Kyoto goals, but other large nations were falling woefully short. And the two biggest emitters of all, the United States and China, churned out more than enough extra greenhouse gas to erase all the reductions made by other countries during the Kyoto period. Worldwide, emissions soared by nearly 40% from 1990 to 2009, according to the Netherlands Environmental Assessment Agency.