Understanding the Science: Climate Change Myths, Debunked and Explained

By Skye Hawthorne ‘22

On January 30th, 2019, temperatures in Chicago were almost forty degrees Fahrenheit colder than at McMurdo Station in Antarctica. These temperatures – record-breaking even in a city notorious for its cold – rivaled temperatures at the North Pole, as well as some locations on Mars.

The cause of this cold snap? A splitting of the “polar vortex,” one of the most poorly understood atmospheric phenomena in terms of its relationship to weather and climate. Essentially, a lack of sunlight near the Arctic Circle in the fall and winter months causes the temperature differential between polar and mid-latitude regions to be markedly larger than it is during the summer, when the days are long and Arctic regions experience a “midnight sun.” This differential causes a jet of rapidly flowing air to encircle the pole, extend miles into the stratosphere, and trap the coldest Arctic air where it belongs – in the Arctic.

However, the Arctic Circle is warming significantly faster than sub-polar areas, in part due to a decrease in albedo (the reflectivity of ice) caused by rapidly melting polar ice. A warmer Arctic weakens the polar jet, making it more susceptible to bending and, as was the case in January, a total split. (Berwyn, Bob) When polar air is no longer trapped over the Pole, many scientists believe, the U.S. experiences more cold snaps. Arctic warming is likely why the polar vortex, which didn’t split once between 1989 and 1998, has split almost every year since 2014. (Kretschmer, Marlene)

That’s the most widely accepted scientific explanation. 

Another explanation, posted to Twitter by our Commander-in-Chief on January 29th, goes, “People can’t last outside even for minutes. What the hell is going on with Global Waming [sic]? Please come back fast, we need you!”

It doesn’t take a scientist, or an expert of any kind, to see that Trump is neither. But, spelling aside, his tweet is a prominent example of a widely circulated myth: that cold snaps are evidence against climate change.

In this article, we’ll take a look at just a few of the myths that surround the science of climate change, as well as the data points that contextualize, or even disprove, them. 

It’s cold out, so…

This is potentially the most common myth about climate change, and one that has been trotted out by our President more than a hundred times: that a cold day, week, or year disproves the idea that the Earth is warming. (Matthews, Dylan) But it’s also one of the easiest to debunk. The misunderstanding at its heart stems from a simple confusion between weather (the day-to-day fluctuation in temperature, precipitation, and atmospheric conditions in a specific location) and climate (the long-term, repeating patterns in those conditions).

When scientists talk about “global warming,” they are talking about an increase in the global average temperature. This is not a conjecture or hypothesis; it is a quantifiable, observable upward trend in global temperatures since the start of the Industrial Revolution. It was understood by many scientists as early as the 1950s, and since the advent of weather satellites in the 1960s and ’70s, doubt within the scientific community about the multi-decadal increase in global temperature has been virtually non-existent.

But didn’t global warming stop in 1998?

This is another one of the most pervasive myths about climate change, and it’s one that many conservative politicians, most notably Ted Cruz, have used repeatedly to dismiss anthropogenic global warming as a “hoax” (Elliott, Philip). But again, the answer is quite simple: no, it didn’t. According to the instrumental temperature record, the five hottest years on record are 2016, 2015, 2017, 2018, and 2014, respectively. 1998, an anomalously warm year for its decade on account of a strong El Niño, only manages to come in at number 10 on that list. And even the satellite data, cited by climate deniers such as Cruz despite being widely considered less accurate than the instrumental record, shows 2016 to be hotter than 1998.
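To see why anchoring a trend to a spike year misleads, consider a toy example. The short Python sketch below uses invented numbers, not real temperature records: a steady warming trend with one anomalously hot “El Niño” year near the start, standing in for 1998.

    # Toy illustration of cherry-picking a hot year as a baseline.
    # The values below are invented, NOT real temperature data.
    years = list(range(1990, 2019))
    anomalies = [0.02 * (y - 1990) for y in years]  # steady warming trend
    anomalies[years.index(1998)] += 0.4             # one El Nino-like spike

    # Measured from the spike, warming can look paused for years...
    print("1998:", round(anomalies[years.index(1998)], 2),
          "vs 2008:", round(anomalies[years.index(2008)], 2))

    # ...but the long-term average keeps climbing regardless.
    first_decade = sum(anomalies[:10]) / 10
    last_decade = sum(anomalies[-10:]) / 10
    print("1990s mean:", round(first_decade, 2),
          "2009-2018 mean:", round(last_decade, 2))

Measured from the spike, the next decade looks flat or even cooling; measured as decade averages, the warming never paused. That, in miniature, is what happened with 1998.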

The climate has changed before. Why are we conceited enough to think we’re changing it this time?

This is another argument commonly used by President Trump to discredit perceived “alarmism” on the issue of global warming (All Things Considered). Earth’s climate is dynamic; it is always changing, usually over thousands of years. Sedimentary evidence of dozens of ice ages and interglacial periods is preserved in Earth’s stratigraphic record. And thanks to ice core research, we have extremely reliable data for global temperature and greenhouse gas concentrations going back 800,000 years.

It doesn’t take an expert to interpret the ice core data. Temperature increases in lockstep with greenhouse gases, in particular CO2, a gas whose concentration varies naturally due to volcanic activity, weathering of feldspar and other silicate rocks, and other natural processes. In fact, CO2 is such a powerful forcing agent for climate that it’s often called “Earth’s thermostat.” This greenhouse effect has been understood since the end of the 19th century. Atmospheric concentrations of CO2 are now well above what they ever were during the 800,000 years prior to industrialization. If CO2 is Earth’s thermostat, then we’ve dialed it to its highest setting and ripped the knob clean off. And we’re starting to feel the consequences.
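For readers who want a feel for the numbers, the strength of that thermostat can be sketched with a back-of-the-envelope calculation. The simplified expression below comes from Myhre et al. (1998) and is widely used in climate science; the concentration figures are approximate, and the Python sketch is illustrative rather than a full climate model.

    import math

    # Simplified radiative forcing expression (Myhre et al., 1998):
    # extra forcing = 5.35 * ln(C / C0), in watts per square meter.
    C0 = 280.0  # approximate pre-industrial CO2 concentration, in ppm
    C = 410.0   # approximate present-day CO2 concentration, in ppm

    forcing = 5.35 * math.log(C / C0)  # math.log is the natural log
    print(f"Extra forcing from CO2 alone: {forcing:.2f} W/m^2")
    # Roughly 2 extra watts trapped per square meter of Earth's surface.

Two watts per square meter may sound trivial, but summed over Earth’s roughly 510 million square kilometers of surface, it amounts to about a petawatt of continuously trapped energy.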

But aren’t we talking, like, a degree of warming? Why should I care? 

This is a common argument made by skeptics who claim to be informed on the science, such as British journalist Matt Ridley. Factually, the premise is true: the average temperature of the last five years was 1.1°C (or around 2°F) warmer than the pre-industrial average. This might not seem like much. After all, can anybody really tell the difference between a 50-degree day and a 52-degree day? And if they can, is such an increase really so bad?

Again, we must remember that when scientists talk about such warming, they are talking about a planetary average. It doesn’t take a huge amount of warming to dramatically affect climate patterns. For instance, Long Island as we now know it was formed from glacial debris that, during the last ice age, was deposited at the southern terminus of a mile-thick ice sheet which covered much of what is now Canada and the northern United States. (Jean-Michel, Didier) During this ice age, which ended around 10-12 thousand years ago, global temperatures were, on average, only about 5°C colder than they were right before the start of the Industrial Revolution. And not only did a mile-thick glacier cover New England; sea levels were – and yes, you’re reading this right – about 120 meters, or 400 feet, lower than they are today.

The implications of even an additional half degree of planetary warming will be staggering.

But how can we be so sure global warming will continue? They can’t even predict the weather a few days out! 

This goes back to the difference between weather and climate. Consider the following hypothetical: on New Year’s Day, I ask you to tell me whether, on February 1st, it will be colder or hotter than it is today. Chances are, you wouldn’t be able to say: February and January have very similar average temperatures in most of the world, and guessing whether February 1st would be warmer or cooler than January 1st would be just that – guesswork.

But if I ask you whether it would be colder or hotter on June 1st, you could answer instantly: provided you’re in the Northern Hemisphere, a day in June will almost certainly be warmer than a day in January. This hypothetical might seem silly, but it represents, in a nutshell, the difference between weather forecasting (a field plagued by so much randomness that it becomes essentially useless more than two weeks out) and climate modeling (which contains some of the same randomness but responds with much more regularity to factors such as seasons, ocean temperatures, and multi-decadal patterns such as the Pacific Decadal Oscillation).
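A toy simulation makes the distinction concrete. In the Python sketch below, every number is invented for illustration: each day’s temperature is a predictable seasonal cycle (the “climate”) plus unpredictable daily noise (the “weather”).

    import math
    import random

    # Toy model: temperature = predictable seasonal cycle + random daily noise.
    # All parameters are invented for illustration.
    def daily_temp(day_of_year):
        seasonal = 10 - 15 * math.cos(2 * math.pi * day_of_year / 365)  # climate
        noise = random.gauss(0, 8)                                      # weather
        return seasonal + noise

    # Comparing two individual winter days is near-guesswork...
    print("Jan 1:", round(daily_temp(1), 1), "vs Feb 1:", round(daily_temp(32), 1))

    # ...but averaging over whole months reveals the predictable signal.
    january = sum(daily_temp(d) for d in range(1, 32)) / 31
    june = sum(daily_temp(d) for d in range(152, 182)) / 30
    print("January average:", round(january, 1), "June average:", round(june, 1))

Run it a few times: the January-versus-February comparison flips back and forth at random, while June reliably comes out nearly 30 degrees warmer than January – noise on top of a signal, which is exactly the kind of regularity climate models exploit.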

As such, you should have at least some confidence in the forecasts of organizations like the IPCC even after you get stranded in a snowstorm your TV meteorologist failed to warn you about. 

-----
There are plenty of other myths about climate change, and if I tried to debunk them all, this article would become a tome. No, Antarctica is not gaining ice, not according to most scientists who study the continent. (IMBIE) Yes, certain meteorological events can be attributed, at least in part, to climate change, such as warmer waters fueling devastating hurricanes in locations that have never seen them in recorded history. No, there wasn’t a unified consensus of scientists predicting global cooling in the 1970s – in fact, many scientists were already sounding alarm bells about warming by then. (Hall, Shannon) The point is that the science surrounding climate change becomes clearer with every research endeavor and every peer-reviewed publication, and yet pervasive myths remain. But the more we understand these myths and how they arise, the more we can do to bust them; an informed public is a key piece of effecting political change, and understanding where disinformation on climate change comes from is an important first step in that process.