At Mauna Loa Observatory in Hawaii, scientists are charting our passage of a milestone that, if ignored, heralds a tragic and chaotic future for civilization.
For hundreds of thousands of years before the industrial revolution, the concentration of carbon dioxide, the principal greenhouse gas that traps heat in our atmosphere, never rose above 300 parts per million (ppm). We are now hovering at 400 ppm. The last time Earth's CO2 level reached 400 ppm was during the Pliocene Epoch, some 3 million years ago, when sea levels were 49 to 82 feet higher than they are today and humans did not exist.
Because of these higher greenhouse gas levels, average global temperatures have risen 1.4 degrees Fahrenheit over the past century. That seemingly small increase is already showing up in the form of more prolonged droughts that reduce crop yields, wildfires intensified by drier conditions, and storms like Sandy that are more frequent and destructive.
The effects we’re seeing now are a small taste of what’s in store if we continue along our current trajectory and allow temperatures to climb 7 degrees Fahrenheit or more by the end of the century. Such a scenario would cause food shortages severe enough to result in mass starvation, raise the seas to levels that displace hundreds of millions of people living in coastal areas, and make large swaths of populated land literally too hot for humans to tolerate.
But there’s another number just as important as 400 ppm. Actually, it’s a ratio: 1,000 to 1.
That’s the degree of consensus, among scientists who publish peer-reviewed research, that climate change is happening and that human activity is its primary driver.
Jim Powell, a science author who served 12 years on the National Science Board under appointments from Presidents Reagan and George H.W. Bush, conducted a study of 13,950 peer-reviewed climate articles published between 1991 and 2012. Only 24 of those articles “clearly reject global warming or endorse a cause other than CO2 emissions for observed warming.”
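For scale, a quick check of the arithmetic using the counts above; the 1,000-to-1 figure matches Powell's author-level tally (roughly 33,690 authors in total versus 34 authors of the rejecting papers), a detail from his study that is assumed here rather than stated in this piece:

\[
\frac{24}{13{,}950} \approx 0.17\% \ \text{rejecting},
\qquad
\frac{13{,}950 - 24}{24} = \frac{13{,}926}{24} \approx 580:1 \ \text{per article},
\qquad
\frac{33{,}690}{34} \approx 990:1 \approx 1{,}000:1 \ \text{per author}.
\]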