Why I Remain Uncertain about Global Warming and Climate Change — Part 4 of 4

This is Part 4 of 4 parts, continuing and concluding my elaboration of why I remain uncertain about the science behind the theory of global warming and climate change.

Where are the Error bars?

One of the things that I do know about data, any kind of data, is the concept of an error bar — how precise is the data.

Also known as a margin of error, confidence interval, or standard error. Technically there are differences, but the concept is the same.

My big beef is that it is far too common to see a graph or a number quoted without providing an error bar or specifying the margin of error.

It does make a difference. Especially to me, but not just to me.

Technically, an error bar is a visual representation of the margin of error range that is visually superimposed on a graph of a time series of data, so that you can visually tell when a difference between two or more data values is statistically significant. The margin of error is a numeric value or range rather than the visual representation of that range. That said, I personally will commonly use the two terms interchangeably even though I do realize that there is a distinction.

To be fair, the margin of error is frequently given somewhere, buried down in the details, just not where a mere mortal would see it without searching for it diligently. And all too frequently it is not given.

That’s the beauty of a visual error bar — that it’s right there, in the visual representation of the data, not somewhere that you have to go looking for.

As an example of where the margin of error is needed, NOAA reports the global temperature anomalies for 2014, 2015 and 2016 as 0.74 C, 0.90 C, and 0.94 C respectively. Certainly seems like a clear trend, at least superficially. That data is here:

But, there is no margin of error given for any of the years. Ouch.

And no error bars are displayed on the graph for that time series. Ouch, again.

To be fair, NOAA does give the margin of error on each annual report, but only if you scroll down to the details. They don’t show it on the graphs or in the headline numbers.




Interesting how the margin of error almost doubled in 2016.

What these margins of error tell us is that the temperature in 2015 could have been higher, the same, or lower than 2016. In other words, the trend between those years is not clear or certain.

The gap between 2014 and 2015 is large enough that it would persist no matter where the actual values fell within their margins of error. Even so, 2014 could have been as high as 0.76 C (0.69 C plus 0.07 C) and 2015 could have been as low as 0.82 C (0.90 C minus 0.08 C), a difference of only 0.06 C, which doesn’t seem very much at all to me.

Also note that the temperature anomaly from the 2014 report is 0.69 C, but in the full time series report from 2017 it is listed as 0.74 C. In other words, it was upwardly revised by 0.05 C, which is still within the margin of error of 0.07 C.
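To make the overlap reasoning concrete, here’s a minimal Python sketch of the interval check, using the anomaly figures quoted above. The ±0.15 C margin for 2016 is my placeholder to reflect the roughly doubled margin, not NOAA’s published figure.

```python
def interval(value, margin):
    """Return the (low, high) range implied by a margin of error."""
    return (value - margin, value + margin)

def overlaps(a, b):
    """True if two (low, high) intervals share any values."""
    return a[0] <= b[1] and b[0] <= a[1]

# Anomalies in degrees C from the NOAA reports discussed above.
# The 2016 margin of +/-0.15 C is a placeholder, not NOAA's figure.
y2014 = interval(0.69, 0.07)
y2015 = interval(0.90, 0.08)
y2016 = interval(0.94, 0.15)

print(overlaps(y2014, y2015))  # False: the 2014-2015 gap survives the margins
print(overlaps(y2015, y2016))  # True: the 2015 vs 2016 ordering is uncertain
```

If two intervals overlap, the ordering of the two years can’t be called statistically significant on those numbers alone.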

That’s all for the numeric values of the data, but the error bars themselves, the graphical representation of those margins, should also be superimposed on the graphical representation of the time series.

The bottom line is that if you don’t give me error bars and margins of error with your data, you won’t earn my confidence.

What is global temperature given half the Earth is night and half is day?

Given that by definition half the planet is in darkness while half is in sunlight, how exactly do we define global temperature?

Is there a global maximum and a global minimum and we should just average the two?

Should temperature be measured continuously, or at frequent intervals during the day and night, so that a full average or mean can be calculated from all of the discrete measurements rather than merely from the minimum and maximum?

And given seasonality and distance from the equator, how should length of daylight be taken into account for temperature, both in terms of how it is measured and calculated as well as how we define it?

Given that the planet is continuously rotating on its axis, what consequence is there to the fact that at any given moment there is a different set of points or locations that are in light or darkness?

Is there any specific, solid science that addresses these concerns, or is it simply arbitrary and a matter of opinion or at best judgment? Again, to be clear, I’m looking for science, solid science.

Is there a difference between daytime temperature anomaly and nighttime anomaly?

Even if we do strictly focus on temperature anomalies rather than absolute temperatures and differences between daytime and nighttime temperatures, that simply raises the question of whether there might be a distinct difference between the daytime anomaly and the nighttime anomaly.

Are the two identical (equal), in general?

Are the two clearly distinct, in general?

Are they simply averaged?

Are the two symmetric enough that it can be an equal-weighted average of the two? Or does each need to be weighted by the ratio of hours of daylight and nighttime for each particular day?
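A daylight-weighted average of the kind suggested above could be sketched like this; the anomaly values are entirely hypothetical, chosen only to show how the weighting shifts the result.

```python
def weighted_day_night_mean(day_anom, night_anom, daylight_hours):
    """Weight the day and night anomalies by their share of the 24-hour day."""
    w_day = daylight_hours / 24.0
    return w_day * day_anom + (1.0 - w_day) * night_anom

# Hypothetical anomalies (degrees C) for a single station on a single day.
day_anom, night_anom = 0.8, 1.2

print(round(weighted_day_night_mean(day_anom, night_anom, 12.0), 3))  # 1.0
print(round(weighted_day_night_mean(day_anom, night_anom, 18.0), 3))  # 0.9
```

With a 12-hour day the weighted mean is just the simple average; an 18-hour summer day pulls the mean toward the daytime anomaly.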

Might there be some value in examining the two separately?

Does the cooling of the transition from day to night occur at a comparable rate to the warming of the transition from night to day?

Are there any anomalous effects that occur when a location or area is near-night or near-day?

Are there any anomalous effects near the northern and southern extremes of the day/night transition?

Are there any anomalous effects when the day or night of a location or area becomes a significant majority or even totality of the 24-hour period?

Whether any of these concerns matter or not, who’s to say. What I need to know is whether there is specific, solid science that provides the answers.

How strong or weak is the science?

Can’t Al Gore’s satellite measure the temperature of the whole planet?

Back in 1998, then-Vice President Al Gore proposed a satellite, named Triana, that would orbit the Earth from a million miles away, at the L-1 Lagrange point, so that it could beam images of the full sunny side of the Earth to be displayed on the Internet, 24-hours a day. He announced it in a speech at MIT:

Gore calls for new imaging satellite

Deborah Halber, News Office

March 18, 1998

Vice President Al Gore, in a speech at MIT last Friday, challenged NASA to build a new satellite that would provide continuous, live images of the Earth from space.

The 330-pound, 175-watt small satellite (“Smallsat”) to be developed and launched in two years would provide a view of the planet that would be available, via the Internet, to students, teachers, weather experts and the general public, Mr. Gore said.

“The first sailor on Christopher Columbus’ ship, the Pinta, was named Triana. This satellite would be named for him,” Mr. Gore told about 150 leaders of industry, government, labor and higher education gathered in the Tang Center for the first National Innovation Summit.

Mr. Gore delivered the keynote address at the summit, held at MIT March 12–13 and sponsored by the Council on Competitiveness, a coalition of industry, academia and government representatives aimed at ensuring U.S. competitiveness in a global marketplace.

By 2000, the satellite would continuously transmit a sun-lit view of the entire planet to three ground stations at universities around the globe.

Providing “a clearer view of our own world,” the satellite would bring us to the digital age from the famous “Blue Marble” picture taken in 1972 from Apollo 17, which Mr. Gore said launched a new era of global awareness of the fragility of the planet.

The satellite would orbit from a point where the gravitational pull of the Earth and the Sun cancel each other out, he said. The satellite’s pictures would provide views of changing clouds, developing hurricanes, forest fires and other phenomena as they occur.

Curiously, he made no mention of measuring temperature. Just video.

Initially named Triana, the satellite had a tortured history before finally being launched in 2015 as the Deep Space Climate Observatory (DSCOVR).

Unlike the original mission focused on 24-hour Earth video, the ultimate DSCOVR focus became solar observation and early warning, with Earth imaging considered secondary. As per NOAA:

About the Mission

The Deep Space Climate Observatory, or DSCOVR, will maintain the nation’s real-time solar wind monitoring capabilities which are critical to the accuracy and lead time of NOAA’s space weather alerts and forecasts. Without timely and accurate warnings, space weather events like the geomagnetic storms caused by changes in solar wind have the potential to disrupt nearly every major public infrastructure system, including power grids, telecommunications, aviation and GPS.

DSCOVR will succeed NASA’s Advanced Composition Explorer’s (ACE) role in supporting solar wind alerts and warnings from the L1 orbit, the neutral gravity point between the Earth and sun approximately one million miles from Earth. L1 is a good position from which to monitor the sun, because the constant stream of particles from the sun (the solar wind) reaches L1 about an hour before reaching Earth.

From this position, DSCOVR will typically be able to provide 15 to 60 minute warning time before the surge of particles and magnetic field, known as a coronal mass ejection (or CME), associated with a geomagnetic storm reaches Earth. DSCOVR data will also be used to improve predictions of geomagnetic storm impact locations. Our national security and economic well-being, which depend on advanced technologies, are at risk without these advanced warnings.

There was no mention of Earth imaging there. But the more detailed program overview information sheet notes two instruments focused on Earth:

  1. National Institute of Standards and Technology Advanced Radiometer (NISTAR). Measures whole absolute irradiance integrated over the sunlit face of Earth for climate science applications.
  2. Earth Polychromatic Imaging Camera (EPIC). Provides images of the sunlit side of Earth for science applications such as ozone, aerosols and clouds.

EPIC essentially delivers on Gore’s original promise and vision.

I’m not positive, but I think in theory the NISTAR radiometer could provide at least a surrogate measure of the temperature of the sunlit side of the Earth, based on measuring radiance. But neither NOAA nor NASA offers such a capability at this time.

EPIC images ultraviolet in addition to infrared light, so deducing temperature from radiance may be possible.

From the NASA DSCOVR NISTAR web page:

NASA is flying two Earth science instruments aboard NOAA’s DSCOVR spacecraft. One of them is called the National Institute of Standards and Technology Advanced Radiometer or NISTAR. Basically, NISTAR measures the absolute irradiance over a broad spectrum of the entire sunlit face of Earth. That will tell the instrument Earth’s radiation budget, i.e. if the Earth’s atmosphere is retaining more or less solar energy than it radiates back to space.

If Earth is keeping in more solar energy than it expels, then the Earth will warm. If the Earth and the Earth-system radiates more energy to space than it receives from the sun, the Earth will cool. Absorbed sunlight raises the Earth’s temperature. Emitted radiation or heat lowers the temperature. When absorbed sunlight and emitted heat balance each other, the Earth’s temperature doesn’t change — the radiation budget is in balance.

NISTAR is an active cavity radiometer designed to measure the energy reflected and emitted from the entire sunlit face of the Earth from its orbit around the Lagrangian point 1 (L1). L1 is a neutral gravity point between Earth and the sun. This position offers a unique continuous view of the Earth from sunrise to sunset.

This measurement will improve our understanding of the effects of changes to Earth’s reflected and emitted radiation (radiance) caused by human activities and natural phenomena. This information can be used for climate science applications.
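The balance NASA describes is the textbook radiation-budget calculation. As a sketch, here is the standard effective-temperature estimate for Earth; this is my own illustration, not anything NISTAR computes, and the albedo of 0.3 is a common round figure rather than a measured value.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0          # solar irradiance at Earth, W m^-2
ALBEDO = 0.3            # fraction of sunlight reflected; a common round figure

# Absorbed sunlight averaged over the whole sphere (the factor of 4 is
# the ratio of Earth's surface area to its sunlit cross-section).
absorbed = SOLAR * (1.0 - ALBEDO) / 4.0

# In balance, emitted = SIGMA * T**4, so the effective temperature is:
t_eff = (absorbed / SIGMA) ** 0.25
print(round(t_eff, 1))  # about 254.6 K, roughly -18 C
```

If absorbed exceeds emitted, the temperature must rise until the two balance again, which is exactly the bookkeeping NISTAR is meant to support.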

I still haven’t found any public access to this data, especially temperature, energy, or radiance, other than the EPIC images:

So, the short answer to the headline question here is that DSCOVR doesn’t currently provide a publicly accessible indication of global warming, such as a real-time temperature anomaly. My belief is that technically and theoretically it could, but practically the answer remains no.

Personally, I find it rather shocking that Gore did not include temperature in his original vision and that NOAA and NASA did not include it in the final mission which seems to clearly have enough instrumentation that they could have included it. This leaves me a bit befuddled as to how seriously NOAA and NASA are taking climate science.

Or, maybe this episode simply confirms that science and politics really don’t mix. No surprise there, at least for me.

Even if DSCOVR does have such capabilities or did have them, we would only have a data series barely two and a half years old. Personally, I wouldn’t put much faith in such a short duration of data.

I wonder if they are recording all of the detailed EPIC infrared and NISTAR data so that it could be reevaluated for Earth temperature and energy at some later date if NOAA and NASA should decide for DSCOVR to take on that mission.

Incidentally, the current administration has tried to zero out the budget for the Earth-viewing portions of the DSCOVR mission, but Congress restored funding. Future funding remains uncertain.

Even if all of the theoretical capabilities of DSCOVR are fully exploited, I have some concerns:

  1. The sunlit side of Earth as seen from L1 is not strictly North Pole to South Pole; the view tilts with the seasons due to the tilt of Earth’s axis. How this would affect any measure of total global temperature is unclear.
  2. Since only half the planet is being viewed at any given moment, what does that mean for modeling the full temperature of the Earth?
  3. Can DSCOVR accurately measure incoming solar energy as well so that it can be compared to what is reflected or radiated from the Earth?

Polar regions

Ice and snow in the polar regions is a special concern for climate monitoring. This includes:

  • Temperature — air and ocean
  • Sea ice extent (area)
  • Sea ice concentration (density of ice floes)
  • Sea ice thickness
  • Sea ice age
  • Glaciers
  • Ice shelves

And we have two polar regions, Arctic and Antarctic.


The climate guys refer to the polar regions as the cryosphere:

The Third Pole

The mountain region of the Himalayas is now commonly referred to as The Third Pole since it has so much snow and ice.

As per the International Centre for Integrated Mountain Development (ICIMOD):

The Hindu Kush-Himalayan region spans an area of more than 4.3 million square kilometres in Afghanistan, Bangladesh, Bhutan, China, India, Myanmar, Nepal, and Pakistan. The region stores more snow and ice than anywhere else in the world outside the polar regions, giving its name: ‘The Third Pole’. The Third Pole contains the world’s highest mountains, including all 14 peaks above 8,000 metres, is the source of 10 major rivers, and forms a formidable global ecological buffer.

Some other resources:

I include this information here for convenient reference, completeness, and to indicate that I am aware of it.

National Snow and Ice Data Center (NSIDC)

Want data, analysis, and narrative about anything related to snow and ice? The National Snow and Ice Data Center is the place to go. I’ve been following them since 2007.

They date back to the 1957–58 International Geophysical Year (IGY). Their history:

NSIDC has served at the forefront of cryospheric data management practices since 1976; today, we are part of the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado at Boulder.

NSIDC’s roots go back to 1957 when a World Data Center (WDC) for Glaciology was established to archive all available glaciological information. The American Geographical Society established the WDC under Director William O. Field. The United States Geological Survey (USGS) Glaciology Project Office operated the WDC between 1971 and 1976 under the direction of Mark F. Meier.

The organization that we today call NSIDC began in 1976, when the USGS transferred responsibility for the WDC to the National Oceanic and Atmospheric Administration (NOAA) Data and Information Service, and the center moved to the University of Colorado in Boulder under the direction of Professor Roger G. Barry. In 1982, NOAA created the National Snow and Ice Data Center (NSIDC) as a means to expand the WDC holdings and as a place to archive data from some NOAA programs. In the 1980s and 1990s, support to NSIDC widened with NASA funding for the National Snow and Ice Data Center Distributed Active Archive Center (DAAC) and National Science Foundation (NSF) funding to manage selected Arctic and Antarctic data and metadata.

They aren’t technically part of the government, being run by the University of Colorado in Boulder, but work very closely with NOAA. As their sponsorship web page says:

NSIDC’s research and scientific data management activities are supported by NASA, the National Science Foundation (NSF), the National Oceanic and Atmospheric Administration (NOAA), and other federal agencies, through competitive grants and contracts.

NSIDC is part of the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder.

I used to walk past their building in Boulder years ago when I lived there.

Generally, I am very supportive of their work and I value their data.

Where I have trouble is how others interpret and use (or misuse) their data.

Cryosphere glossary

NSIDC provides a glossary of significant terms relevant to snow and ice in the polar regions:

Arctic and Antarctic are very different

Despite similarities, the Arctic and Antarctic have significant differences.

They have very different geographies and geologies. And very different hydrospheres as well.

The Arctic is a semi-enclosed ocean, almost completely surrounded by land.

But Antarctica is a land mass surrounded by an ocean.

Can’t get much more opposite than that.

Temperature is very different as well. In summer, it gets above freezing over the entire Arctic, causing dramatic melting of ice, both sea ice and glaciers (Greenland).

It does get above freezing along the coast of Antarctica and up on the Antarctic Peninsula, but the core of the continent stays below freezing all year long. In fact, the highest temperature ever recorded at the South Pole was -12.3 C or 9.9 F, well below freezing.

Arctic temperature

I’ve been following arctic temperature for some years now. The Danish Meteorological Institute provides updated data on a daily basis:

They’ve been monitoring Arctic temperature since the International Geophysical Year (IGY) in 1958.

The curious thing about Arctic temperature is that although it is clearly significantly warmer in the winter, there has been no significant increase in temperature during the spring and summer melt season. The average for the full year is warmer, but the spring and summer are not warmer.

In other words, less sea ice seems to be the result of less freezing in the winter rather than more melting in the summer.

Further confusing the matter is the lack of clarity as to the impact of ocean heat on slowed freezing, as opposed to surface temperature. And whether it is air temperature or sea surface temperature that is more significant or even significant at all.

I am unimpressed by IPCC coverage of these issues. Or coverage by any science organization. I won’t be able to have any significant confidence in the theory of global warming and climate change until all such concerns are fully addressed to my satisfaction.

Arctic sea ice extent

Polar sea ice extent was one of the first areas of actual climate data that I started following after reading Michael Crichton’s State of Fear. He focused on Antarctic sea ice extent, but I focused on both.

Note that sea ice is by definition ice that is frozen seawater, floating over water, and does not include glaciers which are over land. As such, the ice sheet covering Greenland is not included in measurements or estimates of Arctic sea ice. Similarly, the ice sheet covering the continent of Antarctica is glacial ice, so it is not counted as Antarctic sea ice.

I started focusing more heavily on Arctic sea ice extent in 2007 since that was such a bad year from a global warming and climate change perspective. Sea ice melted much more dramatically in 2007 than ever before.

In fact, in the fall of 2007, even scientists were predicting that the Arctic would be ice-free by the summer of 2013. After 2007, nobody doubted them. Not even me.

But, they (scientists!) were wrong. The annual minimum Arctic sea ice extent in 2008 was well above the level in 2007. Get that? Well above. Sure, a new low, below 2007, was set in 2012, but 2013 was not even close to ice-free. And, no new low has been set since 2012 — that’s five years now without a new low. And the ice is now greater than back in 2007.

Talk about hysteria!

How could they get that prediction so wrong? Welcome to the world of climate science!

And 2009 had even greater sea ice than in 2008. In fact, 2009 didn’t even make the top 10 list for minimum Arctic sea ice extent. See the list below.

I look to the National Snow and Ice Data Center (NSIDC) Arctic Sea Ice News & Analysis web page for data and discussion of Arctic sea ice extent:

Also the NSIDC Sea Ice Index:

The NSIDC September 2017 analysis for sea ice extent gives the ten years with the lowest Arctic sea ice extent, which occurs in September, usually somewhere in the second or third week of the month:

  1. 2012 3.39 million square kilometers
  2. 2016 4.14
  3. 2007 4.15
  4. 2011 4.34
  5. 2015 4.43
  6. 2008 4.59
  7. 2010 4.62
  8. 2017 4.64
  9. 2014 5.03
  10. 2013 5.05

As you can see, 2017 was only the 8th lowest; of the ten years on the list, only 2014 and 2013 had more sea ice.

By the way, 2009 came in at 5.10 million square kilometers.

Sorry scientists, but I don’t consider this a clear and obvious trend. There was a clear and obvious trend downwards until 2006, then a sudden plunge in 2007, but then… no clear trend.

Granted, I concede that Arctic sea ice hasn’t recovered to the level of 2006 or earlier, but the trend since 2007, over ten years now, is not distinctly downwards at all.
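As a sanity check on the trend question, here is a minimal least-squares fit over the 2007–2017 minima quoted above (including the 5.10 figure for 2009). The slope comes out essentially flat, consistent with the point being made; this is my own back-of-envelope calculation, not an NSIDC analysis.

```python
# September minimum extents (million square kilometers) from the NSIDC
# top-ten list above, plus the 5.10 figure quoted for 2009.
minima = {2007: 4.15, 2008: 4.59, 2009: 5.10, 2010: 4.62, 2011: 4.34,
          2012: 3.39, 2013: 5.05, 2014: 5.03, 2015: 4.43, 2016: 4.14,
          2017: 4.64}

def least_squares_slope(points):
    """Ordinary least-squares slope of y against x."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in points)
    den = sum((x - mean_x) ** 2 for x, _ in points)
    return num / den

slope = least_squares_slope(sorted(minima.items()))
print(round(slope, 4))  # about 0.0015 million km^2 per year, essentially flat
```

Of course, eleven noisy data points give a fit with huge uncertainty, which is itself the error-bar complaint from earlier in this piece.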

Yes, the maximum Arctic sea ice extent for 2017 this past winter did indeed set a new record low. It is curious how we could start the spring melt season with a record-low maximum and yet not set a record-minimum sea ice extent in that same year. But that just shows how unpredictable climate really is, despite scientists being so bloody certain that they really understand what is actually going on.

Needless to say, I am not impressed.

But, again, I otherwise have great confidence in the data of the National Snow and Ice Data Center.

Impact of storms on 2012 record low Arctic sea ice extent

In case you might be wondering why a record low of Arctic sea ice extent was reached in 2012 and no new record low has been set since, I think I have an explanation: a pair of unusual storms that year that may have disrupted ice formation and then exacerbated ice destruction.

In November 2011, about two months into the freezing season, there was an unusual Arctic hurricane, which I think managed to break up a lot of ice that then needed to refreeze.

Then, in August of 2012 there was another storm, that broke up a lot of ice prematurely, making it easier to melt more ice that season.

That’s my conjecture.

The scientists discussed the details of these two storms, but have never connected them to the specific record minimum sea ice extent of 2012.

I would appreciate seeing some scientific discussion of the coincidence of these two storms in a single freeze/melt annual cycle. And whether that odd coincidence should warrant treating the 2012 record low as being statistically significant or an anomaly.

Arctic sea ice concentration

I do briefly note Arctic sea ice concentration on a fairly regular basis, but I don’t focus much attention on it. It is included on the main NSIDC daily news page:

Arctic sea ice extent doesn’t measure only a solid sheet of ice. As chunks of ice break off, forming ice floes, NSIDC counts any area with a sea ice concentration of at least 15% as part of the Arctic sea ice extent.

A concentration of 50% means that half of a given area was open water. 20% means 80% open water.

They have a nice graphic that shows the concentration of sea ice as a gradient of blue that goes up to white for 100% (unbroken) sea ice and darker blue for 100% open water.
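The 15% threshold makes the extent calculation easy to sketch, along with the related "area" measure that weights each cell by its concentration. The grid cells below are hypothetical, though 625 km² matches the 25 km cells of the passive-microwave products.

```python
def sea_ice_extent(cells, threshold=0.15):
    """Extent: total area of every grid cell at or above the 15% threshold."""
    return sum(area for area, conc in cells if conc >= threshold)

def sea_ice_area(cells):
    """Area: each cell's area weighted by its ice concentration."""
    return sum(area * conc for area, conc in cells)

# Hypothetical grid of (cell area in km^2, ice concentration from 0 to 1).
grid = [(625.0, 1.00), (625.0, 0.50), (625.0, 0.20), (625.0, 0.10)]

print(sea_ice_extent(grid))  # 1875.0: three cells clear the 15% threshold
print(sea_ice_area(grid))    # 1125.0: open water within the floes excluded
```

Note how a cell at 20% concentration counts fully toward extent but only one-fifth toward area, which is why the two measures can diverge.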


Polynyas

These are areas of open water surrounded by solid ice.

Whether they have any scientific significance beyond simply counting as open water rather than sea ice is unclear, to me.

I include this information here for convenient reference, completeness, and to indicate that I am aware of it.

Melt ponds

These are pools of water from melted snow sitting on top of solid ice.

A few years ago activists got really excited when a web camera near the North Pole showed nothing but open water, but this was in fact simply a melt pond. A few days later the ice cracked, all the melt pond water drained out, and then the web camera showed solid ice again.

Whether they have any scientific significance in the theory of global warming and climate change is unclear, to me.

I include this information here for convenient reference, completeness, and to indicate that I am aware of it.

Arctic sea ice thickness

Extent is only one measure of Arctic sea ice, although it is the most common and popular, being the most directly visible. Thickness is an important measure as well.

Unfortunately, there doesn’t appear to be any prominent, comprehensive, accessible record for Arctic sea ice thickness.

I see occasional reports, but no clearly identified data series as NSIDC provides for extent.

I did find some interesting data on CryoSat satellite-based radar measurement of Arctic sea ice thickness from the Center for Polar Observation and Modelling (CPOM) at the Mullard Space Science Laboratory of the University College London in the UK:

The National Snow and Ice Data Center (NSIDC) began reporting more earnestly on ice thickness in 2015:

But two years is not enough of a record to grant the data any statistical significance or to make any firm judgment about any trend, emerging or historic.

CPOM has a nice map of Arctic sea ice thickness for the winter season of 2016/2017.

They also have a graph showing monthly sea ice thickness. I didn’t sense a strong, clear, and obvious trend.

How is Sea Ice Thickness derived from CryoSat? As per CPOM:

CryoSat uses a synthetic aperture radar to sample a much smaller footprint than previous satellite radar altimeters. This allows us to discriminate between sea ice floes and ocean leads with confidence which, in turn, means we can measure their shape.

From this, we can compute sea ice thickness and volume using independent measurements of the ice age, density, and snow loading.

This measurement technique works in autumn, winter and spring. In summer, melt ponds prevent us from estimating sea ice thickness.

As far as I can tell, their data begins in the fall of 2010, so their record is barely eight years long. To be sure, they are providing interesting data, but the record is still far too short to be drawing any firm conclusions concerning global warming and climate change or trends in general.

Prior to CPOM’s robust Cryosat efforts, ice thickness was a little bit too anecdotal for my taste. NSIDC had this to say about previous efforts in 2015, when they introduced CPOM’s efforts:

Data from new sensors, combined with older sources, are providing a more complete picture of ice thickness changes across the Arctic. In a recently published paper, R. Lindsay and A. Schweiger provide a longer-term view of ice thickness, compiling a variety of subsurface, aircraft, and satellite observations. They found that ice thickness over the central Arctic Ocean has declined from an average of 3.59 meters (11.78 feet) to only 1.25 meters (4.10 feet), a reduction of 65% over the period 1975 to 2012.

Sorry, but “a variety of subsurface, aircraft, and satellite observations” just doesn’t cut it for me. I want to see a very clear, comprehensive, consistent, and long-term data series.

Concerning CPOM’s efforts, that same NSIDC announcement had this to say:

In addition, near-real-time thickness data from the European Space Agency’s CryoSat-2 satellite are now available from the Centre for Polar Observation and Modelling at the University College London. The spatial pattern of ice thickness in spring is a key factor in the evolution of sea ice through the Arctic summer, and CryoSat-2 data bring the promise of regular sea ice thickness monitoring over most of the Arctic Ocean.

The data indicate that Arctic sea ice thickness in the spring of 2015 is about 25 centimeters (10 inches) thicker than in 2013. Ice more than 3.5 meters (11.5 feet) thick is found off the coast of Greenland and the Canadian Archipelago, and scattered regions of 3-meter (10 feet) thick ice extend across the Beaufort and Chukchi seas. Elsewhere, most of the ice is 1.5 to 2.0 meters (4.9 to 6.6 feet) thick, typical for first-year ice at the end of winter.

Unfortunately, NSIDC hasn’t reported sea ice thickness regularly, which is what I expect to see for a data series to determine trends. It seems to me that the problem is that ice thickness is too new an area of research and not quite ready for prime time operational climate monitoring.

Arctic sea ice thickness is clearly in a stay tuned category, and clearly not a settled science category. So, I have to remain in the not yet convinced category until there is a more substantial record available for review. How long? Personally, I’d insist on at least 15–20 years, although I’d be content if we insisted on having over 30 years to have some sense that we are looking at a true climate record rather than short-term weather-related variability.

Arctic sea ice movement

The ocean circulates in the Arctic, so Arctic sea ice is constantly in motion, especially in the summer months.

This means that there is no stability for measurements of ice in a given location from year to year.

This makes it difficult to track characteristics such as ice thickness and ice age.

So, by definition there is no North Pole ice from year to year. The pole stays where it is, of course, but each year some different patch of ice will transit over the pole.

I haven’t seen any science from NOAA, NASA, IPCC, or any other scientific organization concerning the scientific significance of Arctic ice motion.

Arctic sea ice age

Note: As used in this section, ice age means age of ice, not an ice age.

Ice age is measured in years.

A significant amount of Arctic sea ice melts every year.

When the ice refreezes in the winter, new ice that forms over open water is called first-year ice, which tends to be only one to two meters thick, or even less.

There are two stages of ice on the path to first-year ice:

  1. Nilas — thin ice, four inches or less, that appears dark when thin.
  2. Young ice — four inches to one foot, thicker ice that hasn’t yet had enough time to freeze more fully.

Very commonly this first-year ice tends to be the first to melt in the next melt season, though not all first-year ice actually melts.

Less commonly, first-year ice that doesn’t fully melt accumulates additional thickness in its second and subsequent years.

Sea ice thickness is generally used as a surrogate for ice age. The Center for Polar Observation and Modelling (CPOM) at the Mullard Space Science Laboratory of the University College London in the UK provides data on Arctic sea ice thickness:

Unfortunately, for whatever reasons, NSIDC doesn’t provide a regular report of Arctic sea ice age. Here’s a report from March of 2012, before the melt season got underway:

NSIDC has this interactive graph, but it only goes from 1985 through 2014:

It shows that only a tiny fraction of Arctic sea ice is over five years old and that about 25% is 2–4 years old, so almost three-quarters of the ice is first-year ice.

Curiously, NSIDC provided more sea ice age coverage in past years, such as this report from April 2008, which includes a significant amount of 4, 5, and 6-year ice:

So, scientists have some very interesting data, but not enough to satisfy my interests in a long-term, full, and consistent data series.

I need to see Arctic sea ice age data given more substantial and regular coverage.

Antarctic sea ice extent

Shortly after reading Michael Crichton’s book State of Fear I did a little research of my own and located the National Snow and Ice Data Center (NSIDC) and their regular reports on Antarctic sea ice extent:

Scroll down past the Arctic narrative to read about Antarctic sea ice. Or click on the Antarctic daily images button to see the extent and concentration maps as well as the extent graph for the year.

Also see the NSIDC Sea Ice Index:

Back when I was following this data series very closely, as well as the data included in Crichton’s book, Antarctic sea ice extent was setting new record maximums — not minimums — every year. It got to the point where I stopped monitoring this data series since it doesn’t seem to show much evidence of global warming. Instead, I focused my energy on Arctic sea ice extent, which had a lot more volatility and at least the appearance of impact of global warming.

I probably stopped monitoring Antarctic sea ice extent regularly in around 2012, about five years ago.

In fact, 2012, 2013, and 2014 each set new record highs for maximum Antarctic sea ice extent, as noted in this press release from NASA:

But, as that report notes, in 2015 Antarctic sea ice reverted to average levels.

And this year, in 2017, the Antarctic sea ice extent maximum is at its lowest in the satellite record (since 1979).

Why Antarctic Sea Ice Crashed in 2017

After Antarctic sea ice grew so fast for so many years, people are now confused about why it has declined so sharply over the past three years.

This coverage from Gizmodo:

This Is Why Antarctic Sea Ice Crashed This Year

July 2017

…the sea ice crash coincided with a series of remarkable weather anomalies and storms — beginning, in September, with an extreme low pressure center in the Amundsen sea off the coast of West Antarctica. In October, strong atmospheric Rossby waves brought additional heat toward the south pole, triggering ice loss in the Ross Sea and Indian Ocean. By November, the Weddell Sea was shedding 30,000 square miles of ice — roughly the area of South Carolina — each day.

“There’s no indication this is anything but just natural variability,” lead study author John Turner said in a statement. “It highlights the fact that the climate of the Antarctic is incredibly variable.”

This coverage from Nature:

Solve Antarctica’s sea-ice puzzle

19 July 2017

John Turner and Josefino Comiso call for a coordinated push to crack the baffling rise and fall of sea ice around Antarctica.

Different stories are unfolding at the two poles of our planet. In the Arctic, more than half of the summer sea ice has disappeared since the late 1970s. The steady decline is what global climate models predict for a warming world. Meanwhile, in Antarctic waters, sea-ice cover has been stable, and even increasing, for decades. Record maxima were recorded in 2012, 2013 and 2014.

So it came as a surprise to scientists when on 1 March 2017, Antarctic sea-ice cover shrank to a historic low. Its extent was the smallest observed since satellite monitoring began in 1978 — at about 2 million square kilometres, or 27% below the mean annual minimum.

Researchers are struggling to understand these stark differences. Why do Antarctica’s marked regional and seasonal patterns of sea-ice change differ from the more uniform decline seen around most of the Arctic? Why has Antarctica managed to keep its sea ice until now? Is the 2017 Antarctic decline a brief anomaly or the start of a longer-term shift? Is sea-ice cover more variable than we thought? Pressingly, why do even the most highly-rated climate models have Antarctic sea ice decreasing rather than increasing in recent decades? We need to know whether crucial interactions and feedbacks between the atmosphere, ocean and sea ice are missing from the models, and to what extent human influences are implicated.

Better representations of the Southern Ocean and its sea ice must now be a priority for modelling centres, which have been focused on simulating the loss of Arctic sea ice. Such models will be crucial to the next assessment of the Intergovernmental Panel on Climate Change, which is due around 2020–21. A good example of the collaborative projects needed is the Great Antarctic Climate Hack (see go.nature.com/2ttpzcd). This brings together diverse communities with an interest in Antarctic climate to assess the performance of models.

In short, the much-vaunted climate scientists actually don’t have a clue.

Okay, to be fair, they do have plenty of clues and even plenty of theories, but they just don’t have solid, settled science.

So much for their models when I read statements such as:

Pressingly, why do even the most highly-rated climate models have Antarctic sea ice decreasing rather than increasing in recent decades?

Curious phrase there: “the most highly-rated climate models.”

And people wonder why I lack confidence in climate science!

The essential problem here as illustrated by this particular episode is that so much of the theory of global warming is far too regular and mechanistic, without acknowledging the combination of natural variability and that the climate is a complex adaptive system (CAS). For whatever reason, a fair fraction of scientists are simply unable to accept those two realities.

The other problem here is this bizarre penchant for some scientists and plenty of environmentalists and their friends in the media to be all too overeager to leap to outrageous, over-the-top, exaggerated narrative for what may simply be statistical aberrations or outlier events. They want and desperately need the event to mean something a lot more socially significant than being merely a data point in a data series.

Antarctic geography

Before we get to discussing ice shelves in Antarctica, we need to briefly review the geography of the continent.

First, the directions get really weird in Antarctica. Most countries or even continents have clear northern, southern, eastern, and western regions. But being at the South Pole, Antarctica is completely different.

In Antarctica, south can mean only the South Pole itself, in the middle of the continent. No side or coast of the continent is south.

In Antarctica, every point on the coast is by definition north of the middle of the continent. Got it? The entire coastline is north!

Nominally, even east and west are meaningless in Antarctica. About all you can technically say is that if you are facing inland towards the south, with your back to the sea (north), east is to your left and west is to your right, and that is true for all points along the coast. An exception is the arm of the continent, the Antarctic Peninsula, which we’ll come to shortly.

The good news is that there is a rough but arbitrary east vs. west distinction, based on longitude and roughly lining up with South America: the area east of a line drawn from Cape Horn, at the tip of South America, through the South Pole is considered East Antarctica. West of that line is considered West Antarctica.

East Antarctica borders the South Atlantic Ocean, or at least the portion of the Southern Ocean that abuts the South Atlantic Ocean.

Technically, the Atlantic, Pacific, and Indian oceans don’t extend to the coast of Antarctica. South of 60 degrees south latitude, each of the three oceans becomes the Southern Ocean, also known as the Antarctic Ocean.

West Antarctica borders the South Pacific Ocean, or at least the portion of the Southern Ocean that abuts the South Pacific Ocean.

The opposite coast of Antarctica from South America borders the Indian Ocean.

The main body of Antarctica is roughly circular, but a little more pear-shaped, with the greater part on the Indian Ocean side.

But the really interesting aspect of the geography of Antarctica is that the top portion of the pear has a stem or peninsula, the Antarctic Peninsula, which extends well north from the main body of the continent, so that its tip is much closer to the southern tip of South America than to the South Pole.

This peninsula has several consequences:

  1. Its weather is a bit warmer than the rest of the continent.
  2. Its coastal areas are more exposed to warmer ocean currents.
  3. This is where a lot of the major ice shelves are, notably those that appear heavily in the news related to global warming, such as Larsen.

When you read about record high temperatures in Antarctica, they are typically at the northern end of the Antarctic Peninsula.

Ditto for temperatures above freezing — on the main body of the continent they never occur far from the coast, but they occur fairly frequently on the northern reaches of the peninsula.

I include this information here for convenient reference, completeness, and to indicate that I am aware of it.

Antarctic temperature

Unlike the Arctic, which has a nice long temperature data series dating back to 1958, I haven’t been able to identify a comparable data series for Antarctica. There is data, but much of it is anecdotal or for specific locations, without any model for combining it into a single data series. Or maybe there is, but I simply haven’t found it.

There is a Wikipedia page for climate of Antarctica:

The main takeaway from the discussion there is that there is a lot of difference in temperature based on what part of Antarctica you are talking about. The coasts and the Antarctic Peninsula are significantly warmer than the interior of the continent.

I did find some temperature data on the website for the Scientific Committee on Antarctic Research (SCAR), but it was for individual stations rather than an integrated model:

For example, temperature data for Vostok station:

They have temperature data back to 1958, the famed International Geophysical Year.

I include this information here for convenient reference, completeness, and to indicate that I am aware of it.

The Antarctic ice shelf

I see quite a few media stories that refer to the Antarctic ice shelf, as if there were just one or it were all one thing. But there are a number of distinct ice shelves around Antarctica, each with its own distinct character, so it is neither possible nor reasonable to paint all of them with one brush.

There is also sea ice all around the coast of Antarctica, but sea ice and ice shelf are not the same thing.

Sea ice forms from the freezing of seawater, while an ice shelf is simply the ice tongue of a glacier that has moved to the coastline and is now moving out over the water.

If you do see a reference to the Antarctic ice shelf, it will typically mean either:

  1. All of the Antarctic ice shelves collectively.
  2. The Antarctic ice sheet. See below.
  3. The Antarctic ice shelves plus sea ice. Any ice that is over water along the coast.
  4. All three of ice shelves, sea ice, and ice sheet.

Yes, it can definitely be confusing.

The Antarctic ice sheet

In contrast to ice shelves, an ice sheet is simply a very large area of land that is covered with glacial ice and snow. Greenland and Antarctica have the only two ice sheets in the modern, non-ice-age world.

The NSIDC has a nice glossary of terms related to glaciers:

Antarctic ice shelves

Wikipedia gives a list of the major Antarctic ice shelves (ice tongues of glaciers that are moving out over the water). It is a lengthy list (44 in all), but the major ones are:

  1. Ross (472,960 sq km)
  2. Filchner-Ronne (422,420 sq km)
  3. Amery (62,620 sq km)
  4. Larsen (48,600 sq km)
  5. Riiser-Larsen (48,180 sq km)
  6. Fimbul (41,060 sq km)
  7. Shackleton (33,820 sq km)
  8. George VI (23,880 sq km)
  9. West (16,370 sq km)
  10. Wilkins (13,680 sq km)

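For a quick sense of proportion, here is a small Python sketch that uses only the Wikipedia areas quoted above to total the list and compute each shelf’s share. The names and figures come straight from the list; nothing else is assumed.

```python
# Areas of the ten major Antarctic ice shelves, in square kilometres,
# taken directly from the Wikipedia figures listed above.
shelves = {
    "Ross": 472_960,
    "Filchner-Ronne": 422_420,
    "Amery": 62_620,
    "Larsen": 48_600,
    "Riiser-Larsen": 48_180,
    "Fimbul": 41_060,
    "Shackleton": 33_820,
    "George VI": 23_880,
    "West": 16_370,
    "Wilkins": 13_680,
}

total = sum(shelves.values())  # total area of these ten shelves
for name, area in shelves.items():
    print(f"{name:15s} {area:>8,} sq km  {100 * area / total:5.1f}%")
print(f"{'Total':15s} {total:>8,} sq km")
```

Ross and Filchner-Ronne alone account for roughly three-quarters of the listed area, which underscores the point that these shelves are wildly unequal and cannot be painted with one brush.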
The Larsen ice shelf, which has been much in the news lately, is actually seven distinct shelves or embayments or segments, referred to as A, B, C, D, E, F, and G. The recent news concerned Larsen C.

Antarctic glaciers

Wikipedia has a list of the many glaciers in Antarctica. It’s quite a long list, 538 by my count.

By definition, glaciers are ice over land. Although in some cases the glacial ice has moved over frozen lakes, called subglacial lakes, such as Lake Vostok:

Melting of the glaciers in Antarctica?

Despite the impression given in the media, the main bodies of the glaciers of Antarctica are not melting.

Any melting is occurring on the ice shelves or immediate coastal regions rather than the main inland bodies of the glaciers.

To be crystal clear, all of this melting is occurring in only the coastal regions, where the glaciers meet the ocean, no more than about 75 miles inland from the coast.

Here’s a typical media account that uses provocative language:

New Discovery in Antarctica Suggests Ice Sheets Could Disappear Way Faster Than Previously Thought

Justin Worland

Apr 19, 2017

Antarctica’s ice may melt faster than previously thought as result of a newly discovered network of lakes and streams that destabilize the continent’s ice shelves, according to new research — making them more vulnerable to collapse.

Scientists have long understood that water from melted ice harm ice sheets by flowing into cracks and refreezing, but that phenomenon was thought to be limited to a small part of the continent. Researchers behind a new study published in the journal Nature this week found that the process has been ongoing for decades and actually occurs across the continent including in places where scientists did not think liquid water was commonly found. The pace of the damage will increase as temperatures continue to rise as a result of man-made global warming.

If you check the actual research paper itself, it makes clear reference to ice shelves, not the main ice sheet that covers the continent itself. As discussed above, ice shelves are not the same as the ice sheet.

The paper in Nature:

Widespread movement of meltwater onto and across Antarctic ice shelves

Jonathan Kingslake, Jeremy C. Ely, Indrani Das & Robin E. Bell

Nature 544, 349–352 (20 April 2017) doi:10.1038/nature22049

Received 02 December 2016 Accepted 08 March 2017 Published online 19 April 2017

Surface meltwater drains across ice sheets, forming melt ponds that can trigger ice-shelf collapse, acceleration of grounded ice flow and increased sea-level rise. Numerical models of the Antarctic Ice Sheet that incorporate meltwater’s impact on ice shelves, but ignore the movement of water across the ice surface, predict a metre of global sea-level rise this century in response to atmospheric warming. To understand the impact of water moving across the ice surface a broad quantification of surface meltwater and its drainage is needed. Yet, despite extensive research in Greenland and observations of individual drainage systems in Antarctica, we have little understanding of Antarctic-wide surface hydrology or how it will evolve. Here we show widespread drainage of meltwater across the surface of the ice sheet through surface streams and ponds (hereafter ‘surface drainage’) as far south as 85° S and as high as 1,300 metres above sea level. Our findings are based on satellite imagery from 1973 onwards and aerial photography from 1947 onwards. Surface drainage has persisted for decades, transporting water up to 120 kilometres from grounded ice onto and across ice shelves, feeding vast melt ponds up to 80 kilometres long. Large-scale surface drainage could deliver water to areas of ice shelves vulnerable to collapse, as melt rates increase this century. While Antarctic surface melt ponds are relatively well documented on some ice shelves, we have discovered that ponds often form part of widespread, large-scale surface drainage systems. In a warming climate, enhanced surface drainage could accelerate future ice-mass loss from Antarctic, potentially via positive feedbacks between the extent of exposed rock, melting and thinning of the ice sheet.

I can’t check the full paper or the fine detail of its graphics since it is hidden behind a paywall, but from the detail I can view, the study is actually limited to the coastal regions, not the deep interior of the continent. So, the references to melting “across the ice sheet” are very misleading.

It appears that the paper authors are using the term ice sheet a bit differently than the NSIDC glossary does. The authors are using it to refer to the ice shelves, rather than the main body of the true Antarctic ice sheet, which lies mostly well in from the coast.

The paper notes water flows “transporting water up to 120 kilometres from grounded ice onto and across ice shelves”, which tells us that the study was limited to no more than about 75 miles from the coast. And the paper is explicit there, referring to ice shelves, not the main ice sheet.

The paper states “Here we show widespread drainage of meltwater across the surface of the ice sheet”, but once again I will ding the authors for referring to “the ice sheet”, when they are primarily referring to either the shelves or at best the fringe coastal portion of the sheet rather than the vast bulk of the main body of the continental ice sheet.

I also have to ding the authors for their use of the term “widespread” since once again their study did not involve the vast bulk of the continental ice sheet.

So, even if there is melting out on the ice shelves, that does not mean that the main bodies of the glaciers themselves or the massive continental ice sheet (beyond the narrow coastal regions) are melting.

As even I will concede, there is melting of grounded ice (glaciers) within the coastal region (up to 75 miles from the coast). But it remains too cold for melting of glaciers further inland than that.

So, a much clearer statement would be to say that the fringes or coastal glacial ice is melting. But, going beyond that is very misleading.

So much for science communicators!

My main point here is not that the climate science is wrong or that I am disproving the theory of global warming and climate change, but simply that the combination of weaknesses or limitations in the science combined with exaggeration by the media make it difficult for me to accept the theory, as offered. Who knows — as the science improves and the media gets real, maybe someday I will find the theory credible, but we’re not there yet.

Antarctic glacial movement and ice shelf instability

Glaciers, by their nature, are constantly moving. The continued accumulation of snow increases their mass and then the force of gravity pushes them forwards and downwards.

When glaciers front on a body of water, that constant movement pushes the ice out over the water, in something called an ice tongue which in Antarctica takes the form of an ice shelf.

If the temperature is warm enough, the front edge of the glacier will break off, calving into icebergs.

At some point the ice shelf will extend so far out into the water that it can become unstable and even crack and break free, calving a massive iceberg, as happened with the Larsen C ice shelf in July 2017.

But just because an ice shelf does calve does not automatically mean that this instability is somehow unnatural. It doesn’t somehow automatically prove that global warming is the cause or even that human activity is the cause.

Sure, seawater can also melt the ice shelf from below, but that is always the case, regardless of global warming. Note that the temperature at the upper portion of the Antarctic Peninsula can get above freezing, which can cause melting, regardless of any impact from global warming.

Was the Larsen C iceberg of 2017 evidence of global warming?

The jury is out whether the calving of the massive iceberg from the Larsen C ice shelf in 2017 is evidence of global warming.

Such calving is not necessarily caused by or evidence of global warming. In truth, even the best scientists do not know either way with any significant degree of certainty.

What’s more, it’s far from clear that this event is primarily the product of climate change. Icebergs have been breaking off of ice shelves for millions of years. Scientists broadly accept that climate change played a role in the disintegration of Larsen A and B. But natural variability may be sufficient to explain the mere splitting off of a large chunk of Larsen C. Adrian Luckman, professor of glaciology at Swansea University, told The Guardian that recent data suggests most of the shelf’s ice has actually been thickening.


Why is part of the ice shelf going to break off and collapse into the ocean? Since large calving events are so rare, and since our measurements in and around ice shelves don’t go back in time far enough, it’s hard to say whether this is a natural progression, variability, or a result of human activity (or more likely a mixture of both). One reason may be human-caused warming, which has led to melting from both above and below in nearby areas and is widely accepted to have contributed to the disintegration of nearby Larsen A and Larsen B. The Western Antarctic (the parts south of the US) is warming quite quickly, faster than most of the planet. In addition, warmer waters can reach underneath the ice shelf and can melt it from below.

That being said, there are vigorous discussions within the scientific community about how much, if any of this can be attributed to humans. Some scientists think there is strong connection; others are much less sure and see little or no evidence that humans are the cause. From my vantage point, part of this relates to our limited ability to measure what’s going on, and part of this is a common sticking point of whether an absence of evidence is evidence of an absence.

That’s a great summary of the state of climate science as of 2017. Opinions are quite strong, but the underlying science is not as strong.

Katabatic wind keeps the Antarctic frozen solid

Incidentally, the main reason it doesn’t get warm enough very far from the coast for the ice sheet to melt in Antarctica is the concept of katabatic wind — the cold air in the center of the continent is denser and flows outwards (northward) towards the coast due to gravity, pushing out any warmer air from the ocean.

Greenland

I’ll admit that Greenland is well worth studying from the perspective of global warming and climate change, but I simply haven’t had the extra time needed to delve into the actual science yet. Yes, I’ve seen many media reports and anecdotal evidence, but that doesn’t constitute science. I’ve been too busy on the Arctic overall. Greenland, with its massive ice sheet, is very different from the oceanic portions of the Arctic, from Antarctica, and from climate science in general.

Some day I’ll get around to studying Greenland, but it hasn’t happened yet.

Media coverage? Yes, I’ve seen all the media coverage and general, rhetorical statements about Greenland, but I myself can’t make any meaningful statements about Greenland until I am personally able to delve into the actual science.

Science communicators? They have no credibility per se. The best they can do is draw my attention to a subject; only after I delve into the science myself can I say or believe anything either specifically or generally.

Role of black carbon

To be honest, I haven’t looked very deeply into the role of black carbon in global warming and climate change.

One of the main reasons I haven’t given it attention is that the media never gave it any real attention. And Gore’s movie didn’t even mention it. So, it’s a relatively new concern, unless you are a scientist on the inside.

I have seen a number of anecdotal media reports over the past two years, but nothing that has led me to believe that black carbon is settled science.

There are two areas of interest for black carbon:

  1. Its role in global warming itself, as a causal agent.
  2. Its role in climate change, as a causal agent.

One of the areas I have seen cursory coverage of is black carbon from shipping in the Arctic being deposited on snow and ice, causing heating and melting from absorption of sunlight.

Black carbon is a particulate, commonly known as soot:

Given my own lack of knowledge in this area, I cannot raise much in the way of questions or concerns at this time. But, my lack of knowledge also means that I cannot express confidence that the theory of global warming and climate change adequately takes black carbon into account.

I’ve also seen some mention of black carbon in the IPCC assessment reports, such as the claim that black carbon on snow and ice can reduce the albedo of the surface, so that more solar energy is absorbed rather than being reflected.

All of that said, there is also very real concern over black carbon from a health perspective. It is definitely not good for your lungs. As such, my proposed energy policy would seek to reduce consumption of fuels that cause health effects due to poor local air quality.

Geothermal heat

One of the big question marks in my head is the role of geothermal heat on the temperature of the surface, especially the oceans, and its role in climate. I just don’t find enough discussion or research on the matter to convince me that the climate scientists have considered the matter comprehensively enough.

The deep interior of the earth certainly causes at least some amount of heat to rise to the Earth’s crust.

If not for the atmosphere and the greenhouse effect, all of this heat would be radiated out into space.

But we do have an atmosphere, we do have oceans, and we do have a natural greenhouse effect even before any global warming is taken into account.

So, what exactly happens to all of this rising geothermal heat (energy)?

What are the pathways for energy flow?

What is the math for how the energy flows accumulate, balance out, or dissipate?

Again, I am not asserting that my questions disprove the theory of global warming and climate change, but simply that with so many unanswered questions, I cannot accept the existing theory as credible. The theory may be true, but the uncertainty means that it isn’t necessarily true.

Emissions vs. measurements of atmosphere

I’m still not completely sure why climate scientists are so obsessively focused on emissions of greenhouse gases rather than simply measuring the actual amounts of the various greenhouse gases present in the atmosphere.

My suspicion is that they are simply trying to develop a model for forecasting future emissions to predict future warming, but who knows for sure.

First, they are making estimates of emissions, not actually measuring emissions. This seems like the opposite of what a scientist should want to do.

They have absolutely no sense of certainty as to what emissions really are.

And they have absolutely no way to empirically validate their estimates. Sure, they can measure the methane in the atmosphere, but that just gives the net amount retained; it doesn’t directly reveal the amount emitted or the amount removed from the atmosphere.
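
To make that point concrete, here is a minimal sketch — a hypothetical one-box mass balance, not any agency’s actual model — showing why a measured atmospheric change alone cannot pin down emissions:

```python
# Hypothetical one-box mass balance for an atmospheric gas.
# A concentration measurement only records the NET flux:
#     change in burden = emissions - removals
# so very different emission/removal pairs yield the same measurement.

def net_change(emissions_tg, removals_tg):
    """Net annual change in atmospheric burden, in teragrams (Tg)."""
    return emissions_tg - removals_tg

# Two very different worlds, same observed atmospheric change:
scenario_a = net_change(emissions_tg=600.0, removals_tg=590.0)  # modest fluxes
scenario_b = net_change(emissions_tg=900.0, removals_tg=890.0)  # large fluxes

print(scenario_a, scenario_b)  # both 10.0
```

Both scenarios produce the same 10 Tg/yr increase that an instrument would see, which is exactly why a concentration measurement alone cannot validate an emissions estimate.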

It seems to me that it is policy people, like government officials, who are more interested in emissions, but I would like to see the underlying science get settled rather than have scientists get distracted by mere policy issues.

It seems like every year or even every few months there is yet another report about dramatic changes to estimates of greenhouse gas emissions. Not that actual emissions are changing but the theories and hypotheses of how and where emissions are occurring are evolving. Such as this report from October 2016:

Study finds fossil fuel methane emissions greater than previously estimated

But energy development is not responsible for global methane uptick

October 5, 2016 — Methane emissions from fossil fuel development around the world are up to 60 percent greater than estimated by previous studies, according to new research led by scientists from NOAA and CIRES. The study found that fossil fuel activities contribute between 132 million and 165 million tons of the 623 million tons of methane emitted by all sources every year. That’s about 20 to 25 percent of total global methane emissions, and 20 to 60 percent more than previous studies estimated.

Or this paper from September 2017:

Revised methane emissions factors and spatially distributed annual carbon fluxes for global livestock

Julie Wolf, Ghassem R. Asrar and Tristram O. West

Livestock play an important role in carbon cycling through consumption of biomass and emissions of methane. Recent research suggests that existing bottom-up inventories of livestock methane emissions in the US, such as those made using 2006 IPCC Tier 1 livestock emissions factors, are too low. This may be due to outdated information used to develop these emissions factors. In this study, we update information for cattle and swine by region, based on reported recent changes in animal body mass, feed quality and quantity, milk productivity, and management of animals and manure. We then use this updated information to calculate new livestock methane emissions factors for enteric fermentation in cattle, and for manure management in cattle and swine.

Using the new emissions factors, we estimate global livestock emissions of 119.1 ± 18.2 Tg methane in 2011; this quantity is 11% greater than that obtained using the IPCC 2006 emissions factors, encompassing an 8.4% increase in enteric fermentation methane, a 36.7% increase in manure management methane, and notable variability among regions and sources. For example, revised manure management methane emissions for 2011 in the US increased by 71.8%.

So much for settled science.

I’ll stick with measured gases in the atmosphere.

NASA Orbiting Carbon Observatory-2 (OCO-2) satellite

NASA launched the Orbiting Carbon Observatory-2 (OCO-2) satellite in 2014. It measures carbon dioxide concentrations and distributions in the atmosphere.

How does it work? As per the Wikipedia:

The OCO-2 satellite was built by Orbital Sciences Corporation, based around the LEOStar-2 bus.[4] The spacecraft is being used to study carbon dioxide concentrations and distributions in the atmosphere.

Rather than directly measuring concentrations of carbon dioxide in the atmosphere, OCO-2 records how much sunlight is reflected off the CO2 molecules in an air column. OCO-2 makes measurements in three different spectral bands over four to eight different footprints of approximately 1.29 km × 2.25 km (0.80 mi × 1.40 mi) each. About 24 soundings are collected per second while in sunlight and over 10% of these are sufficiently cloud free for further analysis. One spectral band is used for column measurements of oxygen (A-band 0.765 microns), and two are used for column measurements of carbon dioxide (weak band 1.61 microns, strong band 2.06 microns).

In the retrieval algorithm measurements from the three bands are combined to yield column-averaged dry-air mole fractions of carbon dioxide. Because these are dry-air mole fractions, these measurements do not change with water content or surface pressure. Because the molecular oxygen content of the atmosphere (i.e. excluding the oxygen in water vapour) is well known to be 20.95%, oxygen is used as a measure of the total dry air column. To ensure these measurements are traceable to the World Meteorological Organization, OCO-2 measurements are carefully compared with measurements by the Total Carbon Column Observing Network (TCCON).
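
The dry-air mole fraction idea in that passage can be sketched in a few lines of Python. This illustrates the arithmetic only, with invented column values; the 20.95% oxygen fraction is the figure quoted above, and none of this reflects the actual OCO-2 retrieval code:

```python
# Sketch of a column-averaged dry-air mole fraction (XCO2), illustrative only.
# The O2 column acts as a proxy for the total dry-air column, which is why
# the result is insensitive to water vapour and surface pressure.

O2_DRY_AIR_FRACTION = 0.2095  # molecular oxygen share of dry air (from the text)

def xco2_ppm(co2_column, o2_column):
    """Column-averaged dry-air CO2 mole fraction, in parts per million."""
    dry_air_column = o2_column / O2_DRY_AIR_FRACTION
    return 1e6 * co2_column / dry_air_column

# Invented column amounts (molecules per square centimetre):
print(f"{xco2_ppm(8.6e21, 4.5e24):.1f} ppm")  # about 400 ppm
```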

I haven’t yet delved deeply into OCO-2 and its capabilities, limitations, and actual data, so I am not in a position to critique it in any detail. Until I do, it won’t be able to improve my level of confidence in the theory of global warming and climate change.

This is a very interesting project, but it has not been in operation long enough to produce a data series of sufficient length to be considered long-term monitoring. It’s a good start, but that’s the problem: it’s just a start. I’d want to see at least a decade, or even fifteen years, of data before paying too much attention to it.

One interesting question that this satellite mission does raise is how well-mixed carbon dioxide really is in the atmosphere. Carbon dioxide has traditionally been considered a well-mixed greenhouse gas (WMGHG), but if the data from this satellite is showing dramatic spikes in significant areas of the world, and variable concentration, then well-mixing is called into question. The simple fact that the whole point of this satellite is to “study carbon dioxide concentrations and distributions in the atmosphere” suggests that mixing is not as uniform as even the IPCC has asserted.

A recent report on spikes of carbon dioxide in specific areas:

NASA’s latest carbon dioxide-mapping satellite has detected a dramatic spike in the amount of the greenhouse gas in the atmosphere, measuring the largest annual increase Earth has seen in at least 2,000 years. The cause? Overheating of three major tropical forest regions across the globe.

NASA’s Orbiting Carbon Observatory (OCO-2), is one of several satellites that collect greenhouse gas emissions data, and researcher Junjie Liu of NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California, used this probe’s data to uncover how much — or in this case, how little — carbon dioxide (CO2) was absorbed out of the atmosphere by Earth’s tropical forests.

OCO-2 has given scientists a “revolutionary” new way to understand the effects of droughts and heat on tropical rainforests, Freilich said in the briefing. The remoteness of these regions, their lack of field stations and the distorting effect of thunderstorms on land-based measurements of CO2 make the OCO-2 satellite an important and unique tool for monitoring the movement and increase of this greenhouse gas, he added. NASA launched OCO-2 in 2014 just in time for the record spike of global atmospheric CO2 that occurred between 2015 and 2016.

“In both 2015 and 2016, OCO-2 and the National Oceanic and Atmospheric Administration (NOAA) measured the largest annual increases in atmospheric carbon dioxide in at least 2,000 years,” Eldering said during the briefing. Using OCO-2 data, Liu quantified that “in total, the three tropical land regions released at least 2.5 gigatons more of carbon into the atmosphere than they did in 2011,” or about a 50 percent increase, he said during the briefing.

As interesting as it is, this particular report would have to be classified as more of an anecdote than an analysis of a long-term trend of an established data series. As I said, it’s a start, but just a start.

We should directly measure greenhouse gas emissions from large facilities and vehicles

We do some measurements of greenhouse gases in the atmosphere itself, such as carbon dioxide, and we do estimates of emissions from facilities and vehicles, but I am unaware of efforts to directly measure emissions from large facilities and vehicles, including power plants and industrial plants.

We should measure carbon dioxide emissions from exhaust stacks and tailpipes.

At least do it on a trial basis for a small number of facilities and vehicles, to determine the feasibility and usefulness of such measurements.

It might also turn out that, through a significant sampling program, a small amount of direct measurement could be extrapolated and correlated with energy inputs (tons of coal, barrels of oil, cubic feet of gas, gallons of fuel), so that much more accurate estimates of greenhouse gas emissions could be modeled and calculated.

Carbon sinks and carbon sequestration

I don’t intend to delve deeply into the topic in this paper, but carbon sinks and carbon sequestration are an important topic in climate science. They have more to do with mitigation of global warming than the science of causes of global warming.

A carbon sink is a natural or artificial reservoir that accumulates and stores some carbon-containing chemical compound for an indefinite period. The process by which carbon sinks remove carbon dioxide (CO2) from the atmosphere is known as carbon sequestration.

The main natural carbon sinks are:

  1. Plants.
  2. Oceans.
  3. Soil.

Carbon sequestration can be natural, as above, or artificial as in geoengineering.

I only mention the topic here for convenient reference, completeness, and to indicate that I am aware of the topic.

Carbon dioxide from ice cores

I am aware that scientists are examining the chemical composition of air bubbles trapped in ice cores to deduce the level of carbon dioxide in the atmosphere from many thousands of years ago.

I personally haven’t delved deeply into this aspect of climate science yet, so I don’t have a strong view one way or the other as to the validity and utility of these efforts.

I won’t be able to form a sound view on the value and impact of such efforts until I do eventually delve deeply into this area — and have any questions and concerns that my future efforts might uncover answered to my satisfaction.

I only mention the topic here for convenient reference, completeness, and to indicate that I am aware of the topic.

Why use 30 years as threshold for climate change?

As a general proposition, 30 years is the accepted threshold of elapsed time for judging changes to climate.

Changes in weather patterns or even trends lasting less than 30 years may simply be variability.

But patterns or trends that last 30 years are considered significant changes to climate.

Says who?

As per the World Meteorological Organization (WMO):

What is Climate?

Climate, sometimes understood as the “average weather,” is defined as the measurement of the mean and variability of relevant quantities of certain variables (such as temperature, precipitation or wind) over a period of time, ranging from months to thousands or millions of years.

The classical period is 30 years, as defined by the World Meteorological Organization (WMO). Climate in a wider sense is the state, including a statistical description, of the climate system.

NOAA also uses 30 years for their concept of a climate normal:

What are Climate Normals?

In the strictest sense, a “normal” of a particular variable (e.g., temperature) is defined as the 30-year average. For example, the minimum temperature normal in January for a station in Chicago, Illinois, would be computed by taking the average of the 30 January values of monthly averaged minimum temperatures from 1981 to 2010. Each of the 30 monthly values was in turn derived from averaging the daily observations of minimum temperature for the station. In practice, however, much more goes into NCEI’s Climate Normals product than simple 30-year averages. Procedures are put in place to deal with missing and suspect data values. In addition, Climate Normals include quantities other than averages such as degree days, probabilities, standard deviations, etc. Climate Normals are a large suite of data products that provide users with many tools to understand typical climate conditions for thousands of locations across the United States.

NOAA also uses the 30-year threshold for considering weather station data significant:

For a station to be considered for any parameter, it must have a minimum of 30 years of data with more than 182 days complete each year.
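
The 30-year averaging that NOAA describes is simple at its core. As a toy illustration, here is the computation of a January minimum-temperature normal from 30 monthly values; the temperatures are randomly generated stand-ins, not real station data, and the real NCEI product adds quality control, gap handling, and many derived quantities:

```python
# Toy illustration of a 30-year "climate normal" as NOAA describes it:
# average the 30 monthly-averaged values of a variable for one month
# at one station (e.g. January minimum temperature, 1981-2010).
import random

random.seed(42)
# Hypothetical January mean minimum temperatures (deg F), one per year:
jan_min_temps = [random.uniform(12.0, 22.0) for _ in range(30)]

jan_min_normal = sum(jan_min_temps) / len(jan_min_temps)
print(f"January minimum temperature normal: {jan_min_normal:.1f} F")
```

Each of those 30 monthly values would itself be an average of roughly 31 daily observations, so a single normal already summarizes on the order of a thousand raw measurements.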

That said, the IPCC doesn’t appear to use this 30-year threshold. Interesting. They don’t have any alternative threshold either.

What was so special about 1958?

When diving into climate-related data the year 1958 keeps popping up. Why is that?

Superficially, a lot of people may recall 1958 as the year the U.S. launched its first satellite, Explorer 1, after the Soviet Union launched Sputnik in 1957. It is also known for being the year NASA came into existence.

What many people don’t know, is that the launch of Explorer 1 was part of a much larger, global, international scientific effort known as the International Geophysical Year (IGY), which was intended to kick-start a wide range of scientific efforts to better understand the planet. The IGY lasted from the middle of 1957 through the end of 1958, a year and a half, rather than precisely a year as in its name.

The Soviet Union launched its Sputnik satellite first, in 1957, which helped prompt the creation of NASA. Sputnik was part of the IGY as well. People think of it in terms of the space race and geopolitical competition between the U.S. and the Soviet Union, but it too was essentially a science mission at heart, part of the IGY.

Beyond the satellites and other space efforts of the IGY, the IGY focused a lot of attention on measuring and monitoring the surface and atmosphere of the planet.

Two efforts that I am personally aware of from 1958 are of particular note:

  1. The Scripps Institution of Oceanography began monitoring atmospheric carbon dioxide from the top of the Mauna Loa volcano in Hawaii in 1958. That effort continues today. View here.
  2. The Danish Meteorological Institute began monitoring Arctic air temperatures. That effort continues today. View here.

The Scientific Committee on Antarctic Research (SCAR) was also created as part of the IGY in 1958:

The Scientific Committee on Antarctic Research (SCAR) is an inter-disciplinary committee of the International Council for Science (ICSU), and was created in 1958. SCAR is charged with initiating, developing and coordinating high quality international scientific research in the Antarctic region (including the Southern Ocean), and on the role of the Antarctic region in the Earth system. SCAR provides objective and independent scientific advice to the Antarctic Treaty Consultative Meetings and other organizations such as the UNFCCC and IPCC on issues of science and conservation affecting the management of Antarctica and the Southern Ocean and on the role of the Antarctic region in the Earth system.

The Wikipedia notes that SCAR:

…was established in February 1958 to continue the international coordination of Antarctic scientific activities that had begun during the International Geophysical Year of 1957–58. SCAR is charged with the initiating, developing and coordinating of scientific research in the Antarctic region.

In short, 1958 was the year scientists and governments got really serious about the science of climate and the atmosphere on a global basis.

That’s a good thing, but it also implies that we didn’t have a critical mass of efforts to monitor and track the global climate and atmosphere prior to 1958.

It’s not that there weren’t efforts before 1958 or that data from prior to 1958 cannot be deduced indirectly (such as tree rings, ice cores), but that the records prior to 1958 are simply not as detailed and robust as from 1958 onwards.

This is why I make frequent reference in this paper to how we did things before 1958 and after 1958.

Why was there no warming from 1940 to 1970?

I still haven’t found any really solid explanation as to why there was no significant global warming from 1940 through 1970, even though carbon dioxide was rising steadily during that period.

Yes, I’ve seen the hand-waving arguments, but as comforting as they may sound, that’s not the same as hard-core scientific proof.

A reasonable-sounding account comes from NASA scientist Dr. James Hansen:

If greenhouse gases are to blame then why did Earth’s average temperature cool from 1940–1970? And why has the rate of global warming accelerated since 1978? Hansen’s answers to these questions brought him full circle to where he began his investigation more than 40 years ago.

“I think the cooling that Earth experienced through the middle of the twentieth century was due in part to natural variability,” he said. “But there’s another factor made by humans which probably contributed, and could even be the dominant cause: aerosols.”

In addition to greenhouse gas emissions, human emissions of particulate matter are another significant influence on global temperature. But whereas greenhouse gases force the climate system in the warming direction, aerosols force the system in the cooling direction because the airborne particles scatter and absorb incoming sunlight. “Both greenhouse gases and aerosols are created by burning fossil fuels,” Hansen said, “but the aerosol effect is complicated because aerosols are distributed inhomogeneously [unevenly] while greenhouse gases are almost uniformly spaced. So you can measure greenhouse gas abundance at one place, but aerosols require measurements at many places to understand their abundance.”

After World War II, the industrial economies of Europe and the United States were revving up to a level of productivity the world had never seen before. To power this large-scale expansion of industry, Europeans and Americans burned an enormous quantity of fossil fuels (coal, oil, and natural gas). In addition to carbon dioxide, burning fossil fuel produces particulate matter — including soot and light-colored sulfate aerosols. Hansen suspects the relatively sudden, massive output of aerosols from industries and power plants contributed to the global cooling trend from 1940–1970.

There you have it, in a single word: aerosols.

I actually find that reasonably credible, although it is still a speculative theory, without empirical validation. As such, I have to withhold my full confidence.

Why did temperature take off so suddenly in the late 1970’s?

I haven’t found any great, truly robust scientific explanation for why global temperature started to rise so dramatically in the late 1970’s.

Carbon dioxide had been rising at a reasonably steady rate since it was first reliably measured in 1958, with no change in that rate as temperature started rising dramatically.

I do have my own theory…

The 1970’s was when air pollution controls were instituted on a massive scale, for both large industrial plants and motor vehicles. The air started to become cleaner and clearer. Specifically, far fewer particulates were pumped into the air.

From 1940 to 1970, a massive buildup of vast clouds of particulates was in theory responsible for the lack of global warming during that period.

The 1940’s included the wartime industrialization.

The 1950’s included the postwar efforts to rebuild the world, with heavy industrialization and massive scale production and use of vehicles consuming vast amounts of fossil fuels. All without any air pollution controls.

Clouds and aerosols (particulates) are known by scientists to force a cooling effect.

But from the mid to late 1970’s, that cooling effect was reversed as the vast clouds of particulates dissipated.

But is this account true?

It sounds good, and I believe it could be true, but that’s not real science.

The bottom line is that for all their data, papers, and theories, scientists still don’t have a firm grasp on all the complexities of our climate.

I really want to see a much better joint account of the lack of warming during the 1940-to-1970 period and the accelerated warming from the 1970’s onward. Without a clear account of these two conflicting periods, I won’t be able to have significant confidence in the theory of global warming and climate science.

Why hasn’t there been a Nobel science prize for climate science of global warming?

If the science behind global warming is so significant and spectacular, why hasn’t anyone been awarded a Nobel science prize for it?

No, Al Gore was not awarded a Nobel science prize. Nor was the IPCC, and they are scientists.

Rather, a Nobel peace prize was awarded to Gore and the entire team at the Intergovernmental Panel on Climate Change (IPCC) in 2007.

As the Nobel folks note:

The Nobel Peace Prize 2007 was awarded jointly to Intergovernmental Panel on Climate Change (IPCC) and Albert Arnold (Al) Gore Jr. “for their efforts to build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change.”

But why no science prize — even for the hundreds of scientists who have labored so feverishly on the IPCC assessments?

As a minor point, I would note that the Nobel citation is strictly for climate change, not even mentioning global warming. But as I have previously noted, a lot of people treat the two terms as synonyms. Still, I would have expected the relatively sophisticated folks at the Nobel Foundation to use reasonably precise language. Who knows, maybe they really did mean to refer to climate change without referring to global warming.

My concern is that the science behind the theory of global warming and climate change just isn’t completely there yet. I mean, if it was all there, a Nobel science prize would have been warranted, right?

Another possibility is that the science is too trivial, simple, or obvious to warrant such a prestigious science award. I personally couldn’t say for sure.

But there is a technicality that may answer my question, namely that there is no Nobel prize in earth science, climate science, climatology, or anything similar — Nobel only awards prizes for physics, chemistry, physiology or medicine, literature, peace, and economic sciences.

Curiously, there are no Nobel prizes for astronomy, geology, oceanography, or mathematics.

There have certainly been Nobel prizes awarded for what appears to be astronomy, but they are really physics prizes.

In fact, for all of the poking around I have done, if there is any fundamental scientific basis for global warming, it should be a matter of physics, or maybe chemistry. Physics covers the behavior of gases and energy, so if the science of global warming has any significant merit, it should warrant a Nobel physics prize.

This is a fascinating mystery to me.

Non-Nobel prizes for Earth science

There are in fact a lot of other prizes besides those awarded by Nobel. The Fields Medal and the Abel Prize are awarded in mathematics, for example.

There is also the Vetlesen Prize in earth sciences:

The Vetlesen Prize was established in 1959 by the New York-based G. Unger Vetlesen Foundation. Awarded for “scientific achievement resulting in a clearer understanding of the Earth, its history, or its relation to the universe”, the prize was designed to be the Nobel Prize of the earth sciences. The prize is administered by Lamont-Doherty Earth Observatory, one of the world’s leading earth-science research institutions, which convenes a committee from both its own ranks and those of other major institutions to judge nominations.

In fact, in 1987 the Vetlesen Prize was awarded to Professor Wallace Smith Broecker, sometimes referred to as the Godfather of Global Warming (although James Hansen and others are also sometimes given that appellation). As per his Wikipedia entry:

Wallace Smith Broecker (born November 29, 1931 in Chicago) is the Newberry Professor in the Department of Earth and Environmental Sciences at Columbia University, a scientist at Columbia’s Lamont-Doherty Earth Observatory and a sustainability fellow at Arizona State University.

In 1975, Broecker coined the phrase global warming when he published a paper titled: “Climate Change: Are we on the Brink of a Pronounced Global Warming?” He has recently co-written an account of climate science with the science journalist, Robert Kunzig. This includes a discussion of the work of Broecker’s Columbia colleague Klaus Lackner in capturing CO2 from the atmosphere — which Broecker believes must play a vital role in reducing emissions and countering global warming. Broecker has been described in the New York Times as a geoengineering pioneer.

The list of Past Vetlesen Laureates shows more than a few with interests and accomplishments that intersect with climate science.

Fine. Still, it bugs me that none of that seems to rise to the level of warranting a Nobel physics or chemistry prize.

Climate engineering (geoengineering)?

I am very skeptical of so-called climate engineering, also known as geoengineering, not because it could not achieve targeted results, but because of unexpected or unintended consequences that could be disastrous or even catastrophic, and because there is no robust sense of what the most effective path would be.

And since the climate is a complex adaptive system, it is quite dubious that any mere mortal could anticipate the true consequences with any warranted confidence. In other words, it would be a real crap shoot.

And then there are the twin points made earlier in the discussion of the greenhouse effect:

  1. The greenhouse effect, and the greenhouse gases that cause it, is natural and necessary for life on Earth.
  2. The absence of the non-water vapor greenhouse gases in the atmosphere would cause water vapor to precipitate out, destroying the greenhouse effect, sending the Earth into a deep freeze, which would be very bad for human life on Earth.

How many people believe that government experts and bureaucrats would be knowledgeable and competent enough to carefully monitor and control greenhouse gases in the atmosphere so that the beneficial aspects of the greenhouse effect itself are fully protected?

I sure don’t.

As long as people are seriously proposing such speculative geoengineering as a necessary consequence of the theory of global warming and climate change, I will be very, very reluctant to have any significant confidence in any such theory of global warming and climate change.

Solar Radiation Management (SRM)

Solar Radiation Management (SRM) is a form of climate engineering (geoengineering).

Solar radiation management (SRM) projects are a type of climate engineering which seek to reflect sunlight and thus reduce global warming. Proposed methods include increasing the planetary albedo, for example using stratospheric sulfate aerosols. Their principal advantages as an approach to climate engineering is the speed with which they can be deployed and become fully active, their potential low financial cost, and the reversibility of their direct climatic effects.

Solar radiation management projects could serve as a temporary response while levels of greenhouse gases can be brought under control by mitigation and greenhouse gas removal techniques. They would not reduce greenhouse gas concentrations in the atmosphere, and thus do not address problems such as ocean acidification caused by excess carbon dioxide (CO2).

I won’t delve deeply into this topic here.

I’ll simply say that I am skeptical. Not necessarily because of the science and technology per se, but how practical and risky it might be.

I mention it here only for convenient reference, completeness, and to indicate that I am aware of the topic.

Gimmicks

Gimmicks can make weak proposals more attractive or at least palatable, but ultimately they will tend to backfire and prove to be unsustainable, or have undesirable unintended consequences.

What do I mean by a gimmick? A gimmick is any partial or pseudo-solution or policy that doesn’t amount to a true, long-term, sustainable, and economically viable approach.

Some examples of gimmicks:

  • Subsidies, such as for solar panels and electric cars.
  • Geoengineering or climate engineering. Not a true, long-term solution, with unintended consequences that could be catastrophic.
  • Carbon credit trading. Again, not a true, long-term solution.
  • Carbon sequestration. Lots of questions, issues, and concerns.
  • Vehicle fuel economy standards. Again, not a true, long-term solution.

Part of the problem is that many, if not all, of these gimmicks are driven out of desperation by adherents to the theory of global warming and climate change rather than by basic economics.

Adaptation to climate change

Regardless of any theory of causes of global warming and climate change, the climate is what it is, including frequency and intensity of serious storms, droughts, floods, fires. Adaptation to climate is simply practical, common sense — no special, controversial, or debatable science is needed. Storm surge is storm surge; if one spot on the coast experiences it, every spot on the coast needs to be prepared for it.

I personally think we should ban permanent development of coastal areas below 10 or 15, if not 25, feet of elevation above coastal waters or major waterways, and restore coastal and riverine wetlands to cope with floods and storm surges, but that should be done regardless of any beliefs about global warming or climate change. There were plenty of extreme weather events well before the 1970’s.

How or whether to adapt residential construction practices for Category 4 hurricanes or tornado winds is problematic. I mean, sure, yes, you can build each house or apartment like a fortress, but that’s not exactly practical. Ditto for smaller-scale commercial structures.

Larger-scale commercial structures, including hospitals and tall office buildings, should definitely have such fortress-like construction so that they can withstand Category 5 hurricanes and tornadoes. Ditto for public facilities such as schools and emergency responder facilities, particularly since shelter and emergency response operations are needed most during extreme weather events.

In my mind, above-ground utility poles are a historical anachronism. All utilities should be buried below ground.

This is all common sense and completely independent of what impact greenhouse gases have on temperature or climate.

Carbon dioxide emissions per BTU for various fuels

How much worse is coal than natural gas in terms of carbon dioxide emissions?

The Department of Energy’s Energy Information Administration (DOE/EIA) has a table and an FAQ detailing how much carbon dioxide is emitted when a given amount of a particular fuel is consumed, as well as the amount of carbon dioxide emitted for each unit of energy generated from that combustion (BTUs, British Thermal Units; actually millions of BTUs).

I’ll provide just a few examples here, giving pounds of carbon dioxide per million BTUs:

  • Coal (anthracite) — 228.6
  • Coal (all types) — 210.2
  • Coal (bituminous) — 205.7
  • Diesel fuel and heating oil — 161.3
  • Kerosene — 159.4
  • Gasoline (without ethanol) — 157.2
  • Jet fuel — 156.3
  • Propane — 139.0
  • Natural gas — 117.0

So, coal is about 80% worse than natural gas.
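
As a quick sanity check on that comparison, here is a short sketch, using the rounded figures quoted above, of each fuel’s emissions per million BTU relative to natural gas:

```python
# CO2 emitted per unit of energy (pounds per million BTU),
# from the EIA-style figures quoted above.
lbs_co2_per_mmbtu = {
    "coal (all types)": 210.2,
    "diesel/heating oil": 161.3,
    "gasoline (no ethanol)": 157.2,
    "propane": 139.0,
    "natural gas": 117.0,
}

gas = lbs_co2_per_mmbtu["natural gas"]
for fuel, lbs in lbs_co2_per_mmbtu.items():
    pct_worse = (lbs / gas - 1) * 100  # percent more CO2 than gas
    print(f"{fuel}: {pct_worse:+.0f}% vs. natural gas")
# coal (all types) comes out at about +80% vs. natural gas
```

Note that this compares only combustion emissions per unit of heat energy; it does not account for differences in plant efficiency or upstream emissions from extraction and transport.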

I couldn’t immediately find the comparable number for biofuel.

My proposal for an energy policy that helps address global warming and climate change

Whether you believe in global warming and climate change or not, I have a proposal for an energy policy that works for either prospect, and it has only three simple elements. It is a guaranteed win, either way.

The good news is that we have no urgent need to know the truth about global warming and climate change one way or the other.

Why?

Because… my proposal for a national (and global) energy policy works for either prospect. Not only will it work, but this energy policy will offer dramatic benefits either way.

A quick preview of my proposed energy policy, which has only three elements:

  • The relentless march of technology — especially the shift towards a digital and electric world.
  • Efficiency. Do more with the same energy, and even do more with less energy. Saving money is a huge motivator.
  • Local air quality. Focus on health and quality of life.

Seriously, that’s it.

Read about it here.

Hidden science?

I can only judge scientists and their theories by data and research that is publicly available and accessible to me. If I can’t find it or get at it, then effectively it doesn’t exist.

Who knows, maybe scientists have some hidden, secret data or research that expresses the real truth about global warming and climate change, and maybe my views would change if I could gain access to that hidden, secret data and research, but… that’s just speculation on my part.

I certainly am not assuming that any such hidden, secret data or research exists, but I can’t fully discount it either.

The bottom line is that my views and beliefs expressed here are based solely on readily and freely accessible public data and research reports and papers.

My main point here is simply that I can’t be positive that scientists don’t have convincing evidence that the theory of global warming and climate change is true and valid, but if they have such a firm basis then it must be hidden where I can’t find it.

Scientific papers hidden behind paywalls

The good news is that the recent trend is to make many scientific papers publicly accessible on web sites. Kudos for that.

Unfortunately, many papers, especially older ones, remain hidden behind paywalls, requiring membership, subscription, or other forms of payment to access them.

This simply doesn’t work for me. My budget is… zero.

If I cannot access a paper or report for free, then as far as I am concerned it doesn’t exist. It is effectively hidden and secret.

What is the role of the EPA in developing climate science?

Is the Environmental Protection Agency (EPA) limited to using climate science in the process of creating and enforcing regulation, or are they involved in the research stage as well?

It appears they are at least somewhat involved in climate-related research:

Their research grant page suggests they are focused on impact and mitigation of climate change, as opposed to basic research on causes of climate change:

Climate Change Research Grants

EPA funds climate change research grants to improve knowledge of the health and environment effects of climate change, and provide sustainable solutions for communities to effectively manage and reduce the impacts of a changing climate.

Are they involved in the IPCC assessment report development?

I did a quick search on the IPCC site under contributors, but found no instances of the Environmental Protection Agency.

I was a little surprised that EPA did not even have contributors for the impact and mitigation working groups (WG2, WG3).

Is carbon dioxide air pollution?

Personally, I don’t accept carbon dioxide as a form of air pollution.

I don’t see it as being directly harmful to human health.

If it makes you choke, sneeze, gasp, have difficulty breathing, or develop some respiratory health condition, then it’s air pollution. Carbon dioxide (down at levels of around 400 parts per million) does none of those.

Yes, I am well-aware that carbon dioxide has now been defined to be air pollution due to its claimed role in anthropogenic global warming, and I know you can’t fight city hall, but I still don’t accept it.

How many climate scientists are there in the world?

How many scientists are we talking about when we refer to climate scientists?

How many of them are truly expert in the physics of global warming, as opposed to other, non-physics areas such as the impact of temperature on animal habitats and ecosystems, or the mitigation of climate change? Not that the latter are unimportant in their own right, but those experts won’t have the expertise to weigh in on matters of the physics of global warming.

How many scientists are expert in the atmospheric physics of greenhouse gases?

How many scientists are expert in atmospheric physics in general?

How long has each of these climate scientists been active in the field of climate science?

How many years should a scientist be active in the field of climate science to merit being referred to as an expert?

Answers to these questions would give me a better perspective on how much we really know about climate science and how well we know it. Confidence in the science requires such knowledge.

What are the specific fields that are relevant to climate science?

How many sub-fields are there under the broad umbrella of climate science?

How many climate scientists are there in each of those sub-fields?

What degrees of specialty are required for a scientist to work in each of these areas and merit being referred to as an expert?

How many years should a scientist be active in a particular sub-field of climate science to merit being referred to as an expert in that sub-field?

Again, answers to these questions would sharpen my perspective on how much we really know about climate science and how well we know it.

Research needed

If climate science is as settled as public rhetoric asserts, then why would any additional research be needed? That’s a rhetorical question since I don’t consider climate science settled at all.

More research is probably needed across the board. I’m not sure the scientists in any of the sub-fields of climate science would disagree with me there.

There are four areas that I personally think deserve special research attention:

  1. Climate as a complex adaptive system (CAS). What does that mean, what are the implications, and how certain can we be about anything in a CAS?
  2. Global temperature modeling. How a limited number of temperature sensors can be combined to model the entire surface of the planet. Scientists are using models today, but how good are they really?
  3. Empirical validation of climate theories and modeled data series. Various theories and models produce results, but they have no empirical validation to prove that they are valid.
  4. GoreSat 2.0. A next generation of satellite, maybe a cluster of them, at the L1 Lagrange point, to provide credible temperature measures of the entire planet. And measures of solar output as well.
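As an illustration of the modeling question in item 2, here is a minimal sketch, my own toy example rather than any agency's actual method, of how a handful of station readings might be interpolated to estimate a value at an unsampled location. It uses simple inverse-distance weighting with hypothetical stations and readings; real gridded products use far more sophisticated techniques, which is exactly why I ask how good they really are.

```python
import math

def idw_estimate(stations, point, power=2.0):
    """Estimate a value at `point` (lat, lon) from sparse station
    readings via inverse-distance weighting. `stations` is a list of
    ((lat, lon), reading) pairs. A toy illustration only."""
    num = 0.0
    den = 0.0
    for (lat, lon), value in stations:
        # Crude planar distance; a real model would use great-circle distance
        d = math.hypot(lat - point[0], lon - point[1])
        if d == 0:
            return value  # point coincides with a station
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Three hypothetical stations reporting temperature anomalies (in degrees C)
stations = [((40.0, -105.0), 0.74),
            ((42.0, -100.0), 0.90),
            ((38.0, -102.0), 0.94)]
print(idw_estimate(stations, (40.0, -102.0)))
```

Note that the estimate is just a weighted average, so it always lies between the station extremes. The unanswered question, for me, is how large the error bars on such interpolated values really are in regions where stations are sparse.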

Science vs. public policy

There is no clear correlation or causal connection between science and public policy.

We certainly want public policy to be informed by science, but I don’t think that any sane, practical citizen would tolerate a public policy that is controlled by science.

If nothing else, public policy also needs to be informed by human considerations, a sense of humanity.

Inevitably, public policy is heavily informed by politics and political considerations.

Some might prefer that public policy not be controlled by politics, but from a practical perspective, political control will be more the norm than not.

Some might prefer that elites, especially expert elites, be in control, but many prefer a democracy in which the citizens get to vote for those they wish to be in control and responsible for setting public policy, and in which elected officials are clearly accountable to the voters.

The proper role for science and scientists in the making of public policy remains an open question and matter of endless debate.

Me, I’m content with science being given an advisory role, both to provide advice to policymakers and to communicate with the public in as plain language as possible. After they have communicated to the best of their ability, scientists must then accept that policymakers and the people will make their own decisions about public policy.

What I am opposed to is politicized science or politically activated scientists. What we don’t want to see are two things:

  1. Scientists selected for advisory roles based on how politicians perceive the science to be compatible with political ideology.
  2. Scientists accepting advisory roles based on their perceptions of the scientific or political ideologies of the sponsoring politicians.

Public policy should never be based upon speculation

Whenever possible, public policy of government should be based on evidence and sound science rather than speculation.

The theory of anthropogenic global warming and climate change simply feels too speculative to me at this time to justify any drastic public policy initiatives.

There is still too much hand-waving.

Too few people actually understand the many nuances of the underlying science.

There is too much reliance on passion.

There is too much moralistic preaching.

Too much of the core science continues to evolve, as we speak.

Besides, my proposed energy policy would provide most of the benefits that the drastic policy proposals promise anyway, may be more likely to actually deliver those benefits, and carries none of the dramatic risks.

My interest in science

I’ve already covered some aspects of my long interest in science, but just to complete that discussion:

  1. As a child I was extremely interested in science, nature, and how things work.
  2. I remained interested in science in high school, but in the 10th grade I became enamored with computers, which were a real novelty in 1970, well before the advent of personal computers.
  3. I took a few science courses in college, but by the middle of my sophomore year I was firmly on the path towards computer science, leaving traditional science in my past.
  4. I always retained an interest in science, although strictly as a hobby and background interest.
  5. These days I’m diving as deeply as I can into physics, especially quantum mechanics.
  6. But it was the combination of Michael Crichton’s State of Fear book and Al Gore’s An Inconvenient Truth movie that rekindled my interest in digging deeper into science, and into data in particular.
  7. Now, I periodically check Science News and phys.org. I’m especially interested in anything new dealing with photons, neutrinos, beta decay, electron-positron pair production and annihilation, and high-energy physics in general.
  8. And I have a special interest in understanding the real science underlying the theory of global warming and climate change.

I regularly monitor at least some of the climate science data, particularly:

  1. Arctic Air temperature — http://ocean.dmi.dk/arctic/meant80n.uk.php
  2. Arctic sea ice extent — http://nsidc.org/arcticseaicenews/
  3. NOAA State of the Climate global temperature — https://www.ncdc.noaa.gov/sotc/
  4. Mauna Loa CO2 level — https://www.esrl.noaa.gov/gmd/ccgg/trends/

Why am I not a scientist?

With all of this interest in science, you might ask the question of why I am not a scientist. My reasons:

  1. I’m not interested in the formality of formal papers. This informal paper is about as formal as I can tolerate.
  2. I’m not interested in the hands-on details of science, experiments, and all of that.
  3. I’m more of an idea guy, focused on working with ideas.
  4. I’m not interested in the detail needed to carry out experiments. I’m more interested in the higher-level, more abstract concepts.
  5. But I don’t have enough of a genius brain to succeed as a pure theoretician. I simply don’t have the level of superior intellect and raw mental ability needed to be a scientist, just the interest.
  6. I don’t have the raw mental ability to memorize enough detail to work rapidly enough.
  7. I don’t possess the rote memory ability needed to memorize all of the detail needed to master any science of any great significance.
  8. I actually hate working with numbers. I can do it if I have to, but I have no passion for numbers.
  9. I’m not interested in teaching, dealing with students, classes and all of that.

Other than that, I’d love to be a scientist!

Climate Science Narratives & Presentations

The American Chemical Society provides online resources for communicating about climate science:

Communicating Climate Science

Responses to common climate science questions and arguments. Adopt and adapt these narratives for use in engaging in deliberative discourse with all audiences including non-scientists.

I include this material here for reference, completeness, and to indicate that I am aware of the material, but without endorsing or denying any information contained therein.

What is the role of low-altitude, low-density particulates or aerosols in global warming?

I am personally wondering if low-altitude, low-density particulate or aerosols may be having a significant impact on global warming that is not fully reflected in current theories and models.

Particulates and aerosols are roughly synonyms. The only distinction that I would draw is that a particulate is generally a solid particle, while an aerosol could be either a solid or a liquid.

I am specifically differentiating the effects of high-altitude particulates and aerosols from those at low altitudes.

High-altitude particulates and aerosols would include the ash clouds from volcanoes, while low-altitude particulates and aerosols would include air pollution from vehicles, which stays near the surface, maybe no more than a few thousand feet above the ground, as opposed to many thousands of feet up, literally in the clouds or at least at their level.

Particulates include black carbon, commonly known as soot:

I am conjecturing that at a low altitude they act as a warming blanket, keeping heat from escaping to space, even as at a high altitude they act as a barrier keeping solar radiation out, exerting a cooling effect rather than a warming effect.

I am also differentiating my relatively low-density, low-altitude particulates and aerosols from dense clouds, such as visible clouds, fog, smoke from fires, and dust storms. I presume that those dense forms act as a barrier to incoming solar radiation and hence have a cooling effect, at least to some extent, though I am receptive to the notion that the effects are complicated rather than a simple, black-and-white distinction between cooling and warming.

What the truth is I do not know. What I do know is that this distinction hasn’t been deeply explored or explained in the publicly available scientific literature or even by the IPCC.

I am well aware of the existing treatment of aerosols by the IPCC, but I don’t believe they considered low-altitude, low-density aerosols in particular as I mention here.

I would need a fuller account of the impact of low-altitude, low-density particulates and aerosols on warming before I could sign on with greater confidence to the theory of global warming and climate change.

Glossary
The IPCC assessment reports contain a reasonably decent glossary for global warming and climate change:

They have a separate glossary for acronyms:

Conclusion
To make a very long story very short, maybe climate scientists have a lot of good points, but there are so many question marks in my mind about their theories, data, and methodologies that I am unable to unequivocally express any great confidence in the science behind the theory of global warming and climate change.

In short, maybe the theory of anthropogenic global warming and climate change is valid, or maybe it isn’t, but I personally can’t tell.

Science is not a matter of faith for me. My loyalty is to science (physical science) and reason, but neither seems to adequately support the extreme statements being made in public discourse in support of the theory of anthropogenic global warming and climate change.

I am open-minded, but only to real, hard science (physical science) and real reason, not bluster, bullying, rhetoric, politically-motivated rhetoric, or political, emotional, or moralistic arguments.

If you’ve got some fresh science, I’d be glad to take a look at it.

Polls? Sorry, but science has never been done by polling or surveys. If citing polls is the best you can do, and your first argument, then, as they say on the streets in New York City, you got nuthin’.

The bottom line is that even if my life depended on it, I would be unable to definitively prove that the theory of anthropogenic global warming and climate change is definitively true — or definitively false for that matter. Sure, I could wave my hands with the best of them and even invoke Pascal’s Wager, or bandy my long list of very valid concerns, but that would all fall very far short of scientific proof.

Matters of science are not supposed to be this way.

That’s why I have tentatively concluded that global warming and climate change is primarily a sociopolitical matter rather than fundamentally about science.

Maybe some day the science will finally be definitive, fully addressing every one of my concerns expressed in this paper to my satisfaction, but we are not there yet, not even close.
