Climate Change Science Compendium 2009


The UNEP has released a Climate Change Science Compendium 2009 (McMullen and Jabbour 2009) that:

“presents some of the issues and ideas that have emerged since the close of research for consideration by the IPCC Fourth Assessment Report over three years ago. Focusing on work that brings new insights to aspects of Earth System Science at various scales, it discusses findings from the International Polar Year and from new technologies that enhance our abilities to see the Earth’s Systems in new ways. Evidence of unexpected rates of change in Arctic sea ice extent, ocean acidification, and species loss emphasizes the urgency needed to develop management strategies for addressing climate change.”

The UNEP summarises the findings of the report as:

“The pace and scale of climate change may now be outstripping even the most sobering predictions of the last report of the IPCC … many predictions at the upper end of the IPCC’s forecasts are becoming ever more likely.”

One of the most important sections of the report deals with sea-level rise – an area of considerable research debate since the IPCC Fourth Assessment Report was released.

The IPCC Fourth Assessment Report could project only an 18–59 cm rise in global sea level over the 21st century, based largely on thermal expansion of the oceans. Critically, it excluded contributions to sea-level rise from dynamic ice changes, such as melting glaciers and accelerating ice-sheet discharge, because no consensus could be reached from the published literature available at that time.

The new UNEP report concludes based on recent research publications that:

“Introduction of realistic future melt and discharge values … suggests that plausible values of total global average sea-level rise, including all land-ice sources plus thermal expansion, may reach 0.8 to 2.0 metres by 2100, although no preferred value was established within this range …

Immediate implications are already challenging … for every 20 cm of sea-level rise the frequency of any extreme sea-level of a given height increases by a factor of about 10. According to this approach, by 2100, a rise of sea level of 50 cm would produce events every day that now occur once a year and extreme events expected once during the whole of the 20th Century will occur several times every year by the end of the 21st.”
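The rule of thumb in this quote can be put into numbers. Below is a minimal sketch (my own illustration, not code from the report) of the stated relationship, in which every 20 cm of sea-level rise multiplies the frequency of a given extreme sea level by about 10:

```python
def frequency_multiplier(rise_cm: float) -> float:
    """Factor by which a given extreme sea level becomes more frequent,
    using the compendium's rule of thumb: x10 for every 20 cm of rise."""
    return 10 ** (rise_cm / 20.0)

# A 50 cm rise gives a factor of ~316, consistent with the quote's
# once-a-year events becoming (roughly) daily events.
for rise_cm in (20, 50, 80):
    print(f"{rise_cm} cm rise: extremes ~{frequency_multiplier(rise_cm):.0f}x more frequent")
```

A factor of ~316 for 50 cm is close to 365, which is why an annual event becomes an essentially daily one.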

The UNEP report’s reference list provides a helpful compilation of the leading climate change research since 2007.


McMullen, C.P. and Jabbour, J. (2009). Climate Change Science Compendium 2009. United Nations Environment Programme, Nairobi, EarthPrint (Link to PDF)

Al Gore launches Safe Climate Australia


This Monday saw the launch of Safe Climate Australia, a non-government organisation formed by a group of scientists, and community and corporate leaders, who aim to end Australia’s reliance on cheap coal for electricity.

Safe Climate Australia will build on a range of international innovation and transition projects such as Repower America, which target energy efficiency and renewable generation, a modern national smart grid and electrification of transport as key actions in addressing global warming, energy security and peak oil.

Inspired by these projects, the purpose of Safe Climate Australia is to identify and catalyse action on the societal transformations and solutions needed to achieve a safe climate for Australia, and for the planet, at emergency speed. The structural change achieved in the next ten years is crucial.

Safe Climate Australia was launched by the Nobel Prize winner and former US Vice President Al Gore. In an impressive speech, Gore pointed to last year’s Australian bushfires as evidence that the planet had a “fever”:

“It’s difficult to ignore the fact that cyclones are getting stronger, that the fires are getting bigger, that the sea level is rising, that the refugees are beginning to move from places they have long called home,” Gore said.

“The odds have been shifted so heavily that fires that used to be manageable now threaten to spin out of control and wreak damages that are far beyond what was experienced in the past.”(Read More)

For more information on Safe Climate Australia, click here, or follow this link for video excerpts of Al Gore’s speech (which for some reason refuse to embed here, apologies in advance), and see below for a clip from an ABC interview:


Climate change being discussed at the G8 summit

Global climate change is being discussed intensely this week at the G8 summit in Italy. The G8 leaders have just released a “Declaration of the Leaders: the Major Economies Forum on Energy and Climate”, which can be downloaded as a PDF here. Some of the highlights:

Climate change is one of the greatest challenges of our time. As leaders of the world’s major economies, both developed and developing, we intend to respond vigorously to this challenge, being convinced that climate change poses a clear danger requiring an extraordinary global response, that the response should respect the priority of economic and social development of developing countries, that moving to a low-carbon economy is an opportunity to promote continued economic growth and sustainable development, that the need for and deployment of transformational clean energy technologies at lowest possible cost are urgent, and that the response must involve balanced attention to mitigation and adaptation.

Our countries will undertake transparent nationally appropriate mitigation actions, subject to applicable measurement, reporting, and verification, and prepare low-carbon growth plans. Developed countries among us will take the lead by promptly undertaking robust aggregate and individual reductions in the midterm consistent with our respective ambitious long-term objectives and will work together before Copenhagen to achieve a strong result in this regard.

Adaptation to the adverse effects of climate change is essential. Such effects are already taking place. Further, while increased mitigation efforts will reduce climate impacts, even the most aggressive mitigation efforts will not eliminate the need for substantial adaptation, particularly in developing countries which will be disproportionately affected. There is a particular and immediate need to assist the poorest and most vulnerable to adapt to such effects. Not only are they most affected but they have contributed the least to the build up of greenhouse gases in the atmosphere.

We are establishing a Global Partnership to drive transformational low-carbon, climate-friendly technologies. We will dramatically increase and coordinate public sector investments in research, development, and demonstration of these technologies, with a view to doubling such investments by 2015, while recognizing the importance of private investment, public-private partnerships and international cooperation, including regional innovation centers. Drawing on global best practice policies, we undertake to remove barriers, establish incentives, enhance capacity-building, and implement appropriate measures to aggressively accelerate deployment and transfer of key existing and new low-carbon technologies, in accordance with national circumstances. We welcome the leadership of individual countries to spearhead efforts among interested countries to advance actions on technologies such as energy efficiency; solar energy; smart grids; carbon capture, use, and storage; advanced vehicles; high-efficiency and lower-emissions coal technologies; bio-energy; and other clean technologies.

It all sounds great, but there aren’t a lot of specifics on how the goals will be met.

One of the setbacks at the summit was the (unsurprising) refusal by India and China to commit to specific reductions in greenhouse gas emissions.  Peter Baker reports on this in the NYT here:

If he [Obama] cannot ultimately bring along developing countries, no climate deal will be effective.

While the richest countries have produced the bulk of the pollution blamed for climate change, developing countries are producing increasing volumes of gases. But developing countries say their climb out of poverty should not be halted to fix damage done by industrial countries.

As various sides tried to draft an agreement to sign Thursday, those tensions scuttled the specific goals sought by the United States and Europe. The proposed agreement called for worldwide emissions to be cut 50 percent by 2050, with industrial countries cutting theirs by 80 percent. But emerging powers refused to agree because they wanted industrial countries to commit to midterm goals in the next decade and to follow through on promises of financial and technological help for poorer nations.

The declaration also states “We recognize the scientific view that the increase in global average temperature above pre-industrial levels ought not to exceed 2 degrees C.”  But is this really a scientific view?  I think it is a social or political goal, but I don’t think we know exactly how a 2 degree increase will differ from a 3 or 4 degree increase.  I.e., we don’t know where the tipping point is. See the very nice discussion (with expert commentary from Stephen H. Schneider, Kenneth Caldeira and others) on the merits of the 2 degree solution being discussed so much this week at the G8 meeting on Dot Earth here.

Shifting baseline of global temperature anomalies

Seems like the weather watchers are fawning over the latest update of the UAH MSU satellite data:

June 2009 saw another — albeit small — drop in the global average temperature anomaly, from +0.04 deg. C in May to 0.00 deg. C in June, with the coolest anomaly (-0.03 deg. C) in the Southern Hemisphere. The decadal temperature trend for the period December 1978 through June 2009 remains at +0.13 deg. C per decade.

Let’s run through this one more time, using the UAH NSSTC data over at WoodForTrees:


Above is the 1979 – 2009 dataset.


If we ‘pick’ the five-year period from 1993 to 1998, things look like they are getting much warmer!


If we ‘carefully select’ the five-year period from 1988 to 1993, then it’s abundantly clear that global warming doesn’t exist at all, right?

For some incredibly elaborate cherry-picking, take a look at this post by physicist Luboš Motl, making the most of the ‘shifting baseline’ effect (which he chooses to call ‘trends over different intervals‘) by selecting which years to run the analysis over:

Global warming is supposed to exist and to be bad. Sometimes, we hear that global warming causes cooling. In this case, global warming causes global averageness. In all three cases, it is bad news. The three main enemies of environmentalism are warm weather, cool weather, and average weather.

To quote WoodForTrees: “What you find can depend on where (or when) you look!”:

Temperature trends – pick a timescale, any timescale! Temperature trend-lines (linear least-squares regression). I hope this is useful, but I would also like to point out that it can be fairly dangerous…

Depending on your preconceptions, by picking your start and end times carefully, you can now ‘prove’ that:


To summarise, here is the WoodForTrees analysis of ALL datasets (with trendlines, and adjusted anomaly baselines), with trends of 0.13-0.17°C/decade, which projects to between 1.3 and 1.7°C per century.
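The cherry-picking effect is easy to reproduce. The sketch below (synthetic data of my own, not the actual UAH record) fits an ordinary least-squares trend to a noisy series with a built-in +0.13°C/decade warming, first over the full record and then over short hand-picked windows like those above:

```python
import random

def trend_per_decade(years, temps):
    """Linear least-squares slope of temps vs years, in degrees C per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    den = sum((x - mean_x) ** 2 for x in years)
    return 10 * num / den

random.seed(42)
years = list(range(1979, 2010))
# underlying +0.13 C/decade trend plus interannual noise
temps = [0.013 * (y - years[0]) + random.gauss(0.0, 0.1) for y in years]

print(f"Full 1979-2009 trend: {trend_per_decade(years, temps):+.2f} C/decade")
for start in (1988, 1993):  # the five-year windows picked in the post
    window = [(y, t) for y, t in zip(years, temps) if start <= y <= start + 5]
    ys, ts = zip(*window)
    print(f"{start}-{start + 5} trend: {trend_per_decade(ys, ts):+.2f} C/decade")
```

Over the full record the fitted trend sits near the underlying +0.13°C/decade, while the five-year windows can swing far above or below it depending purely on the noise they happen to sample.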


Whether this continues to increase at the same rate remains to be seen, but hawkishly watching the latest month of data and crying ‘it’s colder!’, or cherry-picking the data to your own ends, isn’t going to ‘disprove global warming’. As John blogged the other day, short-term declines in global temperature (as illustrated above) are actually predicted by global climate models.

Climate Models Get Biological Makeover

Quick preface: this great article on global climate modelling (published in Miller-McCune magazine) was written by Nicholas Jachowski, a Stanford student who conducted studies on coral physiology at Heron Island Research Station as part of the Stanford Overseas Program.

While the ultimate concern over climate change centers on how it affects living things, in the past, modelers have focused on the physics and chemistry of climate change. Now they are including biology.


It’s springtime in Silicon Valley and a timeless tale is being retold. Kevin Arrigo, an oceanography professor at Stanford University, stands in the front of a classroom of students explaining how life works. He’s not talking about any old life though, but life in the ocean — where life began. And it’s not the fishes and the whales, either; as Arrigo puts it, “If it’s big enough to see, it’s probably not important.”

Arrigo is talking about the tiny plants that make up the base of the oceanic food pyramid — the phytoplankton. Like all plants, microscopic phytoplankton take light from the sun and carbon dioxide from the atmosphere to make food and oxygen in the process known as photosynthesis. But in much the same way that Arrigo dismisses the ecological primacy of the oceans’ larger denizens, climate scientists have for the most part dismissed the role of marine life in their climate models.

No longer.

For the first time, researchers at the premier climate-modeling institute in the United States are explicitly incorporating the complexities of marine life into their computer simulations. The first of these next-generation models was initiated last month, and while final data won’t be available until next year, their approach is already promising the most accurate climate simulations ever. More accurate climate models will help to inform and guide world leaders, policy makers and everyday people who seek to avoid potentially irreversible harm to the planet due to climate change caused by mankind. Understanding why biology is now being incorporated into climate models, and why it took so long, means taking a closer look not just at the computers but at the microscopic life of the oceans.

Phytoplankton grow quickly as long as they get sunlight from above and nutrient-rich water upwelling from the depths. The tiny plants are in turn eaten by zooplankton such as krill and copepods, which in turn are eaten by fish, which are eaten by bigger fish, and on upwards to seals and dolphins, and those other “unimportant” things we can see.

It was the evolution of these tiny plants in the ocean that allowed more complex organisms like humans to evolve. If man were around 3 billion years ago during the advent of the first phytoplankton, he would suffocate from lack of oxygen. By the process of photosynthesis, phytoplankton drastically changed the Earth’s atmosphere from having almost no oxygen to the 20 percent oxygen levels of today.

Changes are occurring in the atmosphere again, but not because of phytoplankton. This time humans are the cause. As scientists try to predict the changes man’s atmospheric tampering will have on the Earth, they are beginning to look to phytoplankton to see what role they might play in keeping Earth’s atmosphere in balance.

Last month, scientists working on the next Intergovernmental Panel on Climate Change report began experiments on the newest climate model, which, for the first time, includes phytoplankton.

According to IPCC, a scientific body charged with evaluating the risk of climate change associated with human activity, the Earth’s temperature could rise between 2.0 degrees Fahrenheit and 11.5 degrees during the 21st century. The main contributor to the warming is the increase of heat-trapping greenhouse gases in the atmosphere due to human activities such as deforestation and the burning of fossil fuel. One of the most significant greenhouse gases is carbon dioxide, a naturally occurring gas that is pumped out in unnatural quantities as a byproduct of burning those fossil fuels. Carbon dioxide levels in the atmosphere have increased 38 percent since the mid-1700s.

Every five to seven years since 1990, the IPCC has put out assessment reports that both summarize the scientific literature on climate change published since the last report and make projections. Key to making projections about the future climate are “global climate models,” or GCMs, which are computer codes used for simulating a dynamic Earth. The Fifth Assessment Report is due in 2014, and computer programmers and scientists are already hard at work on the next generation of GCMs.

According to Arrigo, biology — or to be specific, biogeochemistry, the chemical cycles caused by biology — was not thought to be important enough to include in GCMs until now. “There was no ocean biogeochemistry in the old IPCC models,” said Ron Stouffer, a meteorologist and climate modeler at Princeton University’s Geophysical Fluid Dynamics Laboratory, an arm of the National Oceanic and Atmospheric Administration. “Now everyone is trying to include terrestrial and ocean biogeochemistry.”

Arrigo says biogeochemical processes were not modeled because scientists thought that the physical and chemical processes relating to increasing greenhouse gases, such as carbon dioxide trapping heat in the atmosphere, ocean circulation transporting heat poleward, clouds reflecting sunlight and sea-ice melting, were more important. Such processes might be more important, but nobody knows for sure because no one has extensively modeled biogeochemistry in GCMs before.

Another reason for not including biogeochemical cycles in GCMs is the extra layer of complexity they add “in a model you didn’t trust very much to begin with,” said professor Stephen Schneider, referring to the uncertainty inherent in modeling future climates. Schneider, a Stanford climatologist who has been involved with the IPCC since 1988, thinks the biggest thing holding back climate modeling is the lack of computer time.

According to Stouffer, it can take up to six months to run just one GCM experiment, and that’s on “one of the bigger (computers) on the planet,” he said. Stouffer noted that with biology in the models, run times could be twice as long — up to a year.

As computers become faster and more computing time is available, Schneider offered three strategies for modelers: Add more processes such as biogeochemistry, add more predictions of future greenhouse gas levels or increase the resolution of the model. Each option has its merits, and “none of it’s wrong,” he said. The decision likely will come down to scientists’ individual preferences.

Oceanographer Anand Gnanadesikan, also at the Geophysical Fluid Dynamics Laboratory, is one scientist who has decided to add biogeochemistry to the models. Gnanadesikan, who headed the ocean model development team for the IPCC’s Fourth Assessment Report, said, “I’m interested in how ocean circulation determines plant growth and how plant growth potentially influences ocean circulation.” The ocean model is coupled with an atmosphere model to make a global climate model.

Oceans are important for GCMs because water circulation is responsible for much of the heat distribution around the world, and the oceans remove carbon dioxide from the atmosphere. The “ocean is more important than the land” when it comes to the climate, Arrigo said — it’s four times more potent than the land at pulling carbon dioxide out of the atmosphere.

But as carbon dioxide in the atmosphere increases, it also increases in the oceans — with sometimes unexpected results. Carbon dioxide combines with seawater to make carbonic acid, which is acidifying the oceans and making it harder for marine organisms, including some phytoplankton, to make shells. The continued addition of carbon dioxide to the atmosphere and its subsequent absorption into the ocean threaten the future of these species.

Ocean biogeochemistry is nothing if not complex. It’s no wonder the first generations of climate models left it out. But following the details is potentially crucial for predicting climate changes. In the case of shelled animals in an acidified ocean, the chemical process that creates shells actually releases a molecule of carbon dioxide. So, decreasing the amount of shell means less carbon dioxide will be in the oceans — which means more carbon dioxide could leave the atmosphere and be absorbed into the water. This “negative feedback” could decrease the amount of carbon dioxide in the atmosphere — cooling the climate — if it happens on a broad enough scale. The question is: Will it be strong enough to counteract global warming? Modeling may be the only way to find out.

According to Arrigo, most of the potential biogeochemical feedback loops caused by increasing carbon dioxide and global warming are negative feedbacks. Most physical feedbacks, by contrast, tend to be positive: for example, increasing temperatures will put more water vapor into the atmosphere via evaporation, further increasing the Earth’s temperature.

What’s unclear, Arrigo said, is whether first-order effects, like greenhouse warming, or feedback loops, like the demise of shells, are more important in climate modeling. Fortunately, we may know the answer to that question very soon. “We started running the model a couple days ago,” Stouffer said by phone last month, referring to the model he, Gnanadesikan, and about 80 other scientists at Geophysical Fluid Dynamics Laboratory have been working on for the past three years.

John Dunne, another climate modeler at Geophysical Fluid Dynamics Laboratory, says this latest model contains 30 biogeochemical variables used to model the impacts of biology on the climate, which he describes as “fairly sophisticated.” The model even contains three phytoplankton groups. This is light-years ahead of the biogeochemistry in the old IPCC models, in which the biology consisted of assuming the ocean to be “off-green everywhere” to account for phytoplankton absorption of light, says Gnanadesikan.

The GFDL climate modelers are taking their time to produce the best global climate model they can with the limited computational power and knowledge of oceanic biogeochemical cycles available. The time has come for biology in the models, but it’ll take years to work out the kinks. The data from models they’re running now will be publicly available in a year and a quarter, said Stouffer. But he added, “There’s too much uncertainty, there’s not enough observation, and there’s not enough understanding.” The best we can hope for by the next IPCC report in 2014 “is to start to get a handle on the uncertainties.”

That means focusing, for the first time, on Arrigo’s favorite marine creatures, the phytoplankton. The needs of global climate science might mean that these tiniest of plants — and the people who study them — will finally get their turn in the big time.

Climate Change Accounting Goes Public in a Big Way

Solve Climate reports on a massive electronic billboard displaying the real-time stock of greenhouse gases in the atmosphere, unveiled on 18 June outside New York City’s Penn Station.

The world’s first “Carbon Counter”, launched by Deutsche Bank, will be seen daily by half a million people and millions more can do so online at

The basis for the number displayed on the Carbon Counter – over 3.6 trillion tons and rising by 800 tons per second – is not immediately clear. Deutsche Bank explains the calculation of the figure on its website:

Greenhouse gas concentrations are frequently expressed as an equivalent amount of Carbon Dioxide (CO2). This CO2-equivalent concentration in parts per million (ppm) can then be expressed in terms of metric ton of CO2, a standard of measurement, which as a stock of gases in the atmosphere is readily understood.

According to the IPCC AR4 Synthesis Report, atmospheric CO2 concentrations were 379 ppm in 2005. The estimate of total CO2-eq concentration in 2005 for all long-lived GHGs is about 455ppm.

On June 18th as the counter started, long-lived GHGs in the atmosphere were estimated to be 3.64 trillion metric tons, growing at 2 billion metric tons per month, or 467 ppm, of which CO2 was 385 ppm.

The Carbon Counter, therefore, displays in metric tons the absolute amount of all greenhouse gases in the atmosphere (as opposed to the concentration) but excludes the cooling effect of aerosols.
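The stock figure can be roughly reproduced from the concentrations quoted. A standard conversion, assumed here rather than stated by Deutsche Bank, is that 1 ppm of atmospheric CO2 corresponds to about 7.81 billion metric tons of CO2 (2.13 GtC × 44/12):

```python
GT_CO2_PER_PPM = 7.81  # billion metric tons of CO2 per ppm (standard conversion)

def ppm_to_tonnes(ppm_co2_eq: float) -> float:
    """Convert a CO2-equivalent concentration (ppm) to metric tons of CO2."""
    return ppm_co2_eq * GT_CO2_PER_PPM * 1e9

stock = ppm_to_tonnes(467)  # 467 ppm CO2-eq, the figure quoted above
print(f"~{stock / 1e12:.2f} trillion metric tons")
```

This lands within about 1% of the 3.64 trillion tons shown on the counter, which suggests a conversion along these lines underlies the displayed number.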

The use of the absolute amount of greenhouse gases in the atmosphere yields a big number that is rapidly increasing, but it is questionable whether this muddies the already confusing array of units used to explain the rising pressure of greenhouse gases on the atmosphere.

Atmospheric concentration of carbon dioxide is a simpler and much more widely used measure of the rising pressure of greenhouse gases on the atmosphere, though less dramatic for a real-time billboard aiming to capture the attention of passing commuters.

CO2 Now suggests that atmospheric concentrations of carbon dioxide reached 390.18 ppm in May 2009, up nearly 2 ppm from 388.50 ppm in May 2008 – the highest level in at least the past 800,000 years.

Related posts:

·         Avoiding confusion for stabilisation targets for climate change and ocean acidification.

Biogeochemists Map Out Carbon Dioxide Emissions In The U.S.


I stumbled across this great mapping system of CO2 emissions over at Science Daily. Whilst previous estimates of US CO2 emissions have been calculated per capita, a new map called ‘Vulcan’, created by biogeochemists at Purdue University, shows the top local and regional carbon dioxide producers in high resolution.

In the past, CO2 levels have been calculated based on population, putting the Northeast at the top of the list. Now, a new map called Vulcan reveals for the first time where the top carbon dioxide producers are in the country. The answer surprised Kevin Gurney, Ph.D., a biogeochemist at Purdue University in West Lafayette, Ind.

“There are a lot more emissions in the Southeast than we previously thought, and a lot of that is because it’s not necessarily associated with where people live directly, but actually where industry and activities are,” said Dr. Gurney.

The high-resolution map shows 100 times more detail than ever before and zooms in to show greenhouse gas sources right down to factories, power plants and even roadways. An animated version of Vulcan reveals huge amounts of greenhouse gas gets blown toward the North Atlantic region.

“We’ve never had a map with this much detail and accuracy that everyone can view online,” Dr. Gurney said. (Read more @ Science Daily)

The official website (“The Vulcan Project“) has an amazing Google Earth interface, where you can map the emissions from US power producers, and residential and commercial CO2 emissions, at a 100 km2 local-scale resolution. Perhaps the most interesting contrast is between the maps of residential CO2 emissions in Republican vs Democrat districts. Given the difference in population density between the US and Australia, it’d be interesting to see someone replicate this effort for Australia, allowing regional comparisons and perspectives on global carbon budgets.

Climate ‘whitewash’ to the rescue


You might think that this idea sounds a little crazy but I urge you to read on.   Here’s the idea:  Everyone paints the roof of their house white and the rate of global warming will be radically reduced!

Insane?  Well, this idea does have some sense and logic to it.  I recently discussed this with my friend Ken Caldeira at Stanford University.  Over some monstrously huge American sandwiches, our discussions eventually came round to the amazing impact that losing the reflectivity of Arctic summer ice would have on the rate of global warming.  A few back-of-the-envelope calculations by Ken soon convinced the people around the lunch table that changing the reflectivity of even 1% of the Earth’s surface could have a major impact on the amount of trapped solar radiation.  The flip side?  Essentially, losing the albedo of the Arctic sea ice is like piling massive amounts of CO2 into the atmosphere.
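A version of that lunch-table calculation can be sketched in a few lines (illustrative numbers of my own, ignoring clouds and atmospheric absorption): raising the albedo of a fraction of the Earth’s surface reduces the globally averaged absorbed solar flux in proportion.

```python
S0 = 1361.0                 # solar constant, W/m^2
MEAN_INSOLATION = S0 / 4.0  # spherical (day/night, pole-to-equator) average

def forcing_offset(surface_fraction: float, delta_albedo: float) -> float:
    """Very rough global-mean reduction in absorbed solar flux, in W/m^2."""
    return MEAN_INSOLATION * surface_fraction * delta_albedo

# e.g. brightening 0.5% of the surface by 0.25 in albedo (both numbers
# hypothetical) offsets a few tenths of a W/m^2:
print(f"{forcing_offset(0.005, 0.25):.2f} W/m^2")
```

That is a sizeable fraction of the roughly 1.7 W/m2 of forcing from the CO2 added since pre-industrial times, which is why even percent-scale albedo changes matter.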

Back at Ken’s lab, discussions with Long Cao (one of Ken’s postdoctoral fellows) turned to whether or not one could influence this effect by manipulating the reflectivity of the Earth through other means. What about doing a Christo? What about covering large areas with white chalk or mirrors?  What about inventing a highly reflective plant that would spread out across arid areas and change the reflectivity of desert regions?   Interestingly, a quick search of the literature revealed that this latter idea is being actively explored by scientists:

Andy Ridgwell and colleagues at the University of Bristol in England have another idea, one they call bio-geoengineering. Rather than developing infrastructure to help cool the planet, they propose using an existing one: agriculture. Their calculations, published in Current Biology, suggest that by planting crop varieties that reflect more sunlight, summertime cooling of about 2 degrees Fahrenheit could be obtained across central North America and a wide band of Europe and Asia. Plants reflect slightly different amounts of light depending on factors like how waxy the leaves are. Even differences in growth patterns between two varieties of a crop — the way leaves are arranged — can affect reflectivity.  (Read More)

Now, Steven Chu, the Nobel prize-winning physicist appointed by President Obama as Energy Secretary, has proposed that we seriously explore this idea.  Rather than pursue the rather challenging (from all aspects!) idea of engineering plants, Chu would like to whitewash the world: a global initiative to change the colour of roofs, roads and pavements so that they reflect more sunlight.

As a weapon against global warming, it sounds so simple and low-tech that it could not possibly work. But the idea of using millions of buckets of whitewash to avert climate catastrophe has won the backing of one of the world’s most influential scientists.

Steven Chu, the Nobel prize-winning physicist appointed by President Obama as Energy Secretary, wants to paint the world white. A global initiative to change the colour of roofs, roads and pavements so that they reflect more sunlight and heat could play a big part in containing global warming, he said yesterday. By lightening paved surfaces and roofs to the colour of cement, it would be possible to cut carbon emissions by as much as taking all the world’s cars off the roads for 11 years, he said. (Read More)

Not a bad idea at all – and one that would be achievable in a short period of time. It would work like this: governments would institute the painting of the tops of roofs and buildings (traditionally black tar, slate, or grey colours) white. This would alter the albedo (the surface’s reflectivity to the sun’s radiation), effectively reducing the amount of heat trapped at the Earth’s surface and offsetting part of the projected increases from global warming.

As a weapon against global warming, this idea sounds so simple and low-tech that it just might work.  It could be used to reduce warming-associated risks and buy some important time as we struggle to bring emissions down. As with all of these ideas, however, we must also be cautious not to use them as an excuse for not dealing with the problem of rising atmospheric CO2.  These measures will only offset part of the problem and will certainly be exceeded in time. And, clearly, reducing global temperature in this way will do nothing for problems like those associated with ocean acidification.

US ‘global warming bill’ one step closer?


“House Panel Passes Limit on Greenhouse-Gas Emissions” – Washington Post, 22nd May 2009
A bill to create the first national limit on greenhouse-gas emissions was approved by a House committee yesterday after a week of late-night debates that cemented the shift of climate change from rhetorical jousting to a subject of serious, if messy, Washington policymaking.

The legislation would create a cap-and-trade system: Over the next decades, power plants, oil refineries and manufacturers would be required to obtain allowances for the pollution they emit. Those who need more or less could turn to a Wall-Street-like market in the allowances. The 33 to 25 vote was a major victory for House Democrats, who had softened and jury-rigged the bill to reassure manufacturers and utilities — and members of their own party from the South and Midwest — that they would not suffer greatly.

The vote gives this bill more momentum than any previous legislation to reduce greenhouse gases, but it faces hurdles. In the House, Rep. Collin C. Peterson (D-Minn.) has said he wants to take up the bill in his Agriculture Committee, seeking to change rules for those who raise corn for ethanol. The Senate has shot down previous cap-and-trade plans.

President Obama supports the bill, an aide said yesterday, though some provisions are weaker than what he advocated during the presidential campaign. In particular, Obama called for all pollution credits to be auctioned off by the government, but the House bill would give away about 85 percent of them.
(Read more at Washington Post)

More than 50% fossil fuel reductions needed by 2050 to meet 2°C climate target


Less than a quarter of the proven fossil fuel reserves can be burnt and emitted between now and 2050, if global warming is to be limited to two degrees Celsius (2°C), says a new study published in the journal Nature today.

The study has, for the first time, calculated how much greenhouse gas we can emit into the atmosphere between now and 2050 to have a reasonable chance of keeping warming below 2°C (above pre-industrial levels) – a goal supported by more than 100 countries. We can only emit 1,000 billion tonnes of carbon dioxide (CO2) between the years 2000 and 2050, and the world has already emitted one third of that in just nine years.

“If we continue burning fossil fuels as we do, we will have exhausted the carbon budget in merely 20 years, and global warming will go well beyond two degrees,” says Malte Meinshausen, lead author of the study and climate researcher at the Potsdam Institute for Climate Impact Research. The three-year research project involved scientists from Germany, the United Kingdom and Switzerland.
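The budget arithmetic behind these numbers is straightforward and can be sketched as follows (round numbers assumed; real emissions were rising rather than constant, which shortens the timescale):

```python
BUDGET_GT_CO2 = 1000.0                # CO2 budget for 2000-2050 (25% risk of >2 C)
emitted_so_far = BUDGET_GT_CO2 / 3.0  # "one third of that in just nine years"
annual_rate = emitted_so_far / 9.0    # ~37 Gt CO2 per year if held constant

years_left = (BUDGET_GT_CO2 - emitted_so_far) / annual_rate
print(f"Remaining budget lasts about {years_left:.0f} more years at a constant rate")
```

A constant 2000–2008 emissions rate exhausts the remaining budget in about 18 years, in the same ballpark as the “merely 20 years” quoted above.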

The study concluded that greenhouse gas emissions must be cut by more than 50 percent by 2050 relative to 1990 levels, if the risk of exceeding 2°C is to be limited to 25 percent.

“Only a fast switch away from fossil fuels will give us a reasonable chance to avoid considerable warming. We shouldn’t forget that a 2°C global mean warming would take us far beyond the natural temperature variations that life on Earth has experienced since we humans have been around,” says Malte Meinshausen. (Link to full story @ Potsdam Institute for Climate Impact Research)