The Truth of Global Warming: Part 2 - History of Research

Published: Feb. 1, 2024, 7:56 a.m. (UTC) / Updated: Feb. 26, 2024, 1:52 p.m. (UTC)

History of Research

In fact, research on global warming has been ongoing for about two centuries. This overview traces how new insights were established over the years, driven by advances in technology and theory, and follows the development of the discourse on global warming alongside the accumulation of human knowledge and scientific progress.

Early 19th Century – Rise of the Industrial Revolution

The Sun's rays are what warm the Earth, so it is crucial first to understand sunlight well, which proves to be quite challenging. A key instrument in the study of light is the prism, whose use in analyzing sunlight Newton described in his 1704 work "Opticks." A century later, in 1802, William Hyde Wollaston first reported that dispersing sunlight with a prism reveals dark lines in its spectrum; these were later named Fraunhofer lines after Joseph von Fraunhofer, who catalogued them systematically in 1814.

The Fraunhofer lines are dark lines in the prism spectrum of sunlight, marking wavelengths where the light is dimmed. Although sunlight should produce an essentially continuous spectrum when dispersed through a prism, there are narrow regions where the light drops off, forming lines. At the time, no one suspected any connection between these lines and the understanding of global warming.


The prediction that the atmosphere has a greenhouse effect is attributed to the mathematical physicist Fourier, renowned for the Fourier transform. This was in the 1820s, almost two centuries ago. Fourier hypothesized that the atmosphere is largely transparent to incoming solar energy, letting it pass through without being heated, but captures the energy radiated back from the Earth's surface (infrared radiation); in other words, he speculated that the atmosphere acts asymmetrically on the two flows. For the outgoing infrared radiation to grow strong enough to balance the incoming sunlight, Fourier reasoned, the surface temperature must rise. Fourier thus correctly grasped the essence of the greenhouse effect: energy balance, and the asymmetric effect of the atmosphere on incident light and outgoing infrared radiation. However, he did not identify the specific gases responsible, nor did he fully understand the underlying mechanism; the tools needed for a quantitative discussion of global warming did not yet exist.


In 1837, the term "Industrial Revolution" was coined by the economist Jerome-Adolphe Blanqui.
In 1844, the term gained currency through Friedrich Engels, and it later became established academically when Arnold Toynbee used it in his lectures.
The Industrial Revolution itself is considered to have begun in the late 18th century in Britain. When discussing global warming, the Industrial Revolution serves as a crucial turning point, often used as a benchmark to compare the current atmospheric $CO_2$ levels with those before the revolution.


The Second Half of the 19th Century – Spectroscopy and the First Quantitative Estimates

In 1859, John Tyndall identified which atmospheric gases exhibit the greenhouse effect. He found that the major components, nitrogen and oxygen, do not absorb long-wave radiation, while gases such as water vapor, carbon dioxide, methane, nitrous oxide, and ozone do absorb it and therefore contribute to the greenhouse effect.

In the same year, 1859, Kirchhoff and the chemist Bunsen demonstrated in their paper on the Fraunhofer lines that the dark lines occur at wavelengths absorbed by chemical elements in the Sun's outer layers and, for some lines, by gases in the Earth's atmosphere. This marked an important step toward understanding the mechanism of global warming.

Also in 1859, Kirchhoff presented Kirchhoff's law in his paper "On the Relation between the Emission and Absorption of Light and Heat." The law states that at a given temperature and wavelength, the ratio of a body's emissive power to its absorptivity is the same for every substance. In the course of this research, the fundamental concept of "blackbody radiation" was proposed, which remains crucial today for explaining how the Earth's surface absorbs solar energy and re-emits it as infrared radiation, which the atmosphere then absorbs, leading to warming. The blackbody problem can be described as the task of finding a formula that reproduces the measured spectrum of the light inside a furnace held at a given temperature. Research into the origin of this blackbody spectrum later served as a strong motivation for the development of quantum mechanics.
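In modern notation (the symbols here are illustrative, not Kirchhoff's own), the law can be written as

$$\frac{E_\lambda(T)}{\alpha_\lambda(T)} = B_\lambda(T),$$

where $E_\lambda$ is a body's emissive power at wavelength $\lambda$, $\alpha_\lambda$ its absorptivity, and $B_\lambda(T)$ a universal function of wavelength and temperature only: the blackbody spectrum, whose explicit form would not be found until Planck.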


In 1884, Boltzmann provided a theoretical proof of the Stefan-Boltzmann law, which states that the energy emitted from a blackbody by thermal radiation is proportional to the fourth power of its absolute temperature. Using this law, the surface temperature of the Sun was estimated at approximately $6000\ \mathrm{K}$. In other words, it became possible to estimate the amount of energy radiated from the Sun to the Earth.
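The estimate can be reproduced with a short calculation (the inputs below are rounded modern values; 19th-century measurements were cruder, which is why the historical figure is closer to $6000\ \mathrm{K}$). Taking the solar constant $S \approx 1.4\ \mathrm{kW/m^2}$ measured at Earth's distance $d \approx 215\,R_\odot$, the flux at the Sun's surface is

$$F_\odot = S\left(\frac{d}{R_\odot}\right)^2 \approx 1.4\times 10^{3} \times 215^2 \approx 6.5\times 10^{7}\ \mathrm{W/m^2},$$

and inverting $F = \sigma T^4$ with $\sigma = 5.67\times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}$ gives

$$T = \left(\frac{F_\odot}{\sigma}\right)^{1/4} \approx \left(\frac{6.5\times 10^{7}}{5.67\times 10^{-8}}\right)^{1/4} \approx 5800\ \mathrm{K}.$$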

However, while this law relates the total emitted energy to temperature, the spectral distribution of the radiation (which wavelengths are emitted, and in what amounts, at a given temperature) remained unknown. Kirchhoff's blackbody problem was still unsolved, so the specific role of $CO_2$ in global warming could not yet be identified.

In the 1890s, Svante Arrhenius presented quantitative results suggesting that a doubling or tripling of the concentration of certain gases in the atmosphere could induce climate changes comparable to the difference in average global surface temperatures between ice ages and interglacial periods (Arrhenius, 1896). Arrhenius was interested in the role of greenhouse gases in producing the temperature difference between ice ages and interglacial periods. His quantitative model was quite complex, an ambitious attempt to capture the real dynamics of the Earth. The factors he considered included:

  • Cooling by the emission of long-wave radiation and heating by its absorption
  • Heating by absorption of solar radiation
  • Heating by net upward heat transport from the Earth's surface into the atmosphere
  • Heating and cooling by the large-scale atmospheric circulation, which transports heat in the north-south direction
  • Of these, the first three factors are assumed to balance at the Earth's surface, with a net effect of zero

Svante Arrhenius applied this mathematical model to various latitudes and seasons, obtaining the model's parameters. He concluded that if the concentration of $CO_2$ in the atmosphere doubled, the Earth's average temperature would rise by $5$ to $6$ degrees. While this value is high compared with modern estimates, as the first quantitative analysis and an early statement of the problem it is highly significant. Incidentally, Arrhenius speculated that ice ages might be caused by changes in $CO_2$ concentration.
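Arrhenius's key insight, that warming scales roughly with the logarithm of $CO_2$ concentration, still underlies back-of-envelope estimates today. A minimal sketch using the widely cited simplified forcing expression $\Delta F = 5.35\,\ln(C/C_0)\ \mathrm{W/m^2}$; the climate-sensitivity parameter below is an assumed round value for illustration, not Arrhenius's own:

```python
import math

def co2_forcing(c, c0=280.0):
    """Radiative forcing (W/m^2) from a CO2 change, simplified log formula."""
    return 5.35 * math.log(c / c0)

def warming(c, c0=280.0, sensitivity=0.8):
    """Equilibrium warming (K); sensitivity in K per W/m^2 is an assumed value."""
    return sensitivity * co2_forcing(c, c0)

# Warming for a doubling from a pre-industrial 280 ppm, close to 3 K
# with these assumed parameters (far below Arrhenius's 5-6 degrees).
print(round(warming(560.0), 2))
```

The logarithmic form explains why each successive doubling of $CO_2$ adds roughly the same warming, rather than the response growing linearly with concentration.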


Early 20th Century

The 20th century commenced with the dawn of quantum mechanics.
On December 14, 1900, Max Planck gave a presentation at the German Physical Society, later regarded as the "birthday of quantum mechanics." Planck introduced the concept of quantization to give a microscopic physical explanation of Kirchhoff's blackbody radiation. By successfully explaining blackbody radiation theoretically, this achievement completed the theoretical description of the solar radiation relevant to global warming.
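Planck's law makes the blackbody spectrum directly computable. A minimal sketch (the temperature and wavelength grid are chosen here for illustration) that evaluates the Planck spectral radiance and numerically locates the emission peak of a Sun-like blackbody:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck(wavelength, temp):
    """Spectral radiance B(lambda, T) of a blackbody, W / (m^2 sr m)."""
    a = 2.0 * H * C**2 / wavelength**5
    return a / math.expm1(H * C / (wavelength * K * temp))

# Numerically locate the peak wavelength for a Sun-like 5800 K blackbody.
T = 5800.0
wavelengths = [i * 1e-9 for i in range(200, 2001)]  # 200-2000 nm, 1 nm steps
peak = max(wavelengths, key=lambda w: planck(w, T))
print(round(peak * 1e9))  # peak in nm, near visible green light (~500 nm)
```

That the peak of a ~5800 K blackbody falls in the visible range, while the much cooler Earth re-emits at far longer infrared wavelengths, is exactly the asymmetry Fourier had guessed at a century earlier.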


"Quantum mechanics" was developed in the mid-1920s at the University of Göttingen by a group of physicists including Max Born, Werner Heisenberg, and Wolfgang Pauli; Max Born's 1924 paper "Über Quantenmechanik" introduced the term. Over the following years, this theoretical framework was gradually applied to chemical structure, reactivity, and bonding. As a result, tools finally became available to answer the simple question that even a layperson could pose: why do nitrogen ($N_2$) and oxygen ($O_2$), the major components of the atmosphere, show no greenhouse effect, while carbon dioxide ($CO_2$), which makes up only about 0.04% of the atmosphere, exerts a strong one? (In short: the vibrations of symmetric diatomic molecules such as $N_2$ and $O_2$ produce no change in electric dipole moment and therefore cannot absorb infrared radiation, whereas the bending and stretching vibrations of $CO_2$ can.)


In the 1920s and 1930s, the Serbian geophysicist Milutin Milanković identified the "Milankovitch cycles," periodic variations in the solar radiation reaching Earth caused by three factors: 1. periodic changes in the eccentricity of Earth's orbit, 2. periodic changes in the tilt of its rotational axis, and 3. precession of the rotational axis. These cycles made it theoretically possible to estimate when the polar ice caps grow or shrink, that is, when ice ages and interglacial periods occur. In an era before computers, however, the calculations were demanding and the cycles were not widely used. It was later shown that the Milankovitch cycles drive the alternation between glacial and interglacial periods on time scales of tens of thousands of years.


In 1931, Hulburt developed a one-dimensional vertical model of an atmospheric column that accounted for the mechanisms of the troposphere and stratosphere. According to this model, doubling the $CO_2$ concentration would raise the surface temperature by about $4$ degrees. On obtaining these results, Hulburt, like Arrhenius before him, concluded that changes in $CO_2$ concentration could be the cause of ice ages.


In 1938, Callendar began explicitly asserting that "climate is being influenced by human activities." Callendar focused on the energy balance of the Earth's surface, using a simple differential equation to derive the relationship between changes in $CO_2$ concentration and surface temperature. As a result, he found that if $CO_2$ doubled, the surface temperature would increase by $2$ degrees. Remarkably, scientists almost a century ago were already sounding the alarm about global warming. Unfortunately, at that time, people did not take it seriously.

The Second Half of the 20th Century

In 1960, Kaplan, taking the influence of clouds into account, found that doubling $CO_2$ would raise the surface temperature by at most $1.5$ degrees. In 1963, Möller obtained a value of $1$ degree. In this era each new update announced a lower value, but these estimates of the response to changing $CO_2$ did not adequately account for the additional warming from the accompanying increase in water vapor.

In 1964, Manabe and Strickler devised a more sophisticated model, the Radiative Convective Model, consisting of the following four processes:

  1. Absorption of solar radiation
  2. Emission and absorption of longwave radiation
  3. Upward heat transport from the Earth's surface to the tropospheric atmosphere
  4. Upward heat transport due to convection within the tropospheric atmosphere

In this model, the atmosphere is divided into 18 vertical layers and the temperature is calculated for each layer, making heavy use of the computers available at the time. The model also accounted for the warming effect of water vapor. The result: if the $CO_2$ concentration doubled from $300\ \mathrm{ppm}$, the surface temperature would rise by $1.3$ degrees without the water-vapor effect, but by approximately $2.3$ to $2.4$ degrees with it.
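The jump from $1.3$ degrees to roughly $2.3$ degrees can be illustrated with the standard linear feedback relation $\Delta T = \Delta T_0/(1-f)$. The feedback fraction used below is back-solved from the numbers above for illustration; it is not a value taken from the paper:

```python
def amplified_warming(delta_t0, feedback):
    """Standard linear feedback relation: dT = dT0 / (1 - f)."""
    return delta_t0 / (1.0 - feedback)

# No-feedback CO2-doubling response of 1.3 K; a feedback fraction of
# ~0.43 (assumed, back-solved) amplifies it to near the ~2.3 K figure.
print(round(amplified_warming(1.3, 0.43), 2))
```

The relation shows why feedbacks matter so much: as $f$ approaches 1, a small direct forcing produces an ever larger equilibrium response.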




In 1965, "Moore's Law" was announced, predicting that the number of components per integrated circuit would double every year. From this era on, atmospheric simulation became a race with computing power: to simulate more complex phenomena, it became common to develop the theory and programs first and wait for computing power to catch up. Research into practical computer-based weather forecasting also began in this period. Meanwhile, as models explaining the mechanisms of the modern climate became broadly established, efforts to extend their applications commenced. If the latest models truly had general power, could they explain the climatic variations of glacial and interglacial periods? Research also turned in this direction, deepening our understanding of Earth's climate.


In 1971, Imbrie and Kipp opened the way to quantitative analysis of the climate at glacial maxima by elucidating the relationship between plankton assemblages in deep-sea sediments and the sea-surface temperatures of their time. Building on such achievements, the 1970s and 80s saw efforts to put paleoclimatology on a quantitative, theoretical footing, gradually revealing the atmospheric conditions of the glacial periods. What was clarified at this stage, however, was the state of the glacial atmosphere, not the mechanism behind the repeated alternation of glacial and interglacial periods.


In 1988, the Intergovernmental Panel on Climate Change (IPCC) was established, initiating efforts to synthesize and assess the latest scientific knowledge on climate change. The IPCC advises governments worldwide and has established global warming as a shared global concern.


By the way, while atmospheric theory was largely in place, this raises the question of who actually collects measurements of the Earth's atmosphere and surface, and how. In the past, when a model needed parameters, researchers would search the empirical literature for reference values and incorporate them, a somewhat arbitrary method. Comprehensive measurement, on the other hand, requires actually sampling the atmosphere, for example by launching balloons at many points around the globe, or by using satellites to capture the infrared radiation the Earth actually emits into space. Such work demands large budgets and long time commitments. Typically, the following approaches are taken.

  1. Meteorological Observation Networks: Each country has a network of meteorological observation stations, utilizing ground-based weather stations, weather balloons, weather radar, meteorological satellites, etc., to measure various parameters of the atmosphere. This includes observations of temperature, humidity, atmospheric pressure, wind speed, precipitation, and more.
  2. Balloon Observations: Balloons are employed to obtain vertical profiles of the atmosphere. Instruments for meteorological observations are attached to balloons, collecting meteorological data during ascent.
  3. Satellite Observations: Artificial satellites observe the entire Earth, measuring different characteristics of the atmosphere and Earth's surface. This enables monitoring of widespread weather conditions and climate changes.
  4. Meteorological Aircraft: Specially equipped aircraft for meteorological research are used to directly measure concentrations of tiny particles and gases in the atmosphere.

These efforts, driven by advanced countries with substantial national budgets, are justified by practical purposes such as improving weather forecasts. In addition, consolidating the scattered results of research conducted worldwide into a form that supports estimates of global warming is itself a crucial task, and forms a vital foundation for subsequent studies.


In 1997, Kiehl and Trenberth of the U.S. National Center for Atmospheric Research published a paper titled "Earth's Annual Global Mean Energy Budget". In this paper, they comprehensively examined and disclosed the actual values of various elements of Earth's energy budget, building on diverse research results to date. The details are explained in the chapter on the solar energy budget in this article. Additionally, in the chapter on the mechanism of global warming, a simple model utilizing the results of this paper is proposed.
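The headline numbers of such an energy budget can be approximated with a textbook zero-dimensional energy balance. The constants below are rounded modern values, and the one-layer greenhouse step is a standard classroom idealization, not the model of the paper:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)
S0 = 1368.0      # solar constant at Earth's distance, W/m^2 (rounded)
ALBEDO = 0.31    # planetary albedo (rounded)

# Solar energy absorbed per square meter, averaged over the whole sphere.
absorbed = S0 / 4.0 * (1.0 - ALBEDO)

# Effective emission temperature: a bare blackbody Earth, no greenhouse.
t_effective = (absorbed / SIGMA) ** 0.25

# One fully absorbing atmospheric layer raises the surface temperature
# by a factor of 2**(1/4) in this idealization.
t_surface = 2.0 ** 0.25 * t_effective

print(round(absorbed), round(t_effective), round(t_surface))
```

With these inputs the script gives roughly $236\ \mathrm{W/m^2}$ absorbed, an emission temperature near $254\ \mathrm{K}$, and an idealized surface temperature near $302\ \mathrm{K}$, bracketing the observed global mean of about $288\ \mathrm{K}$.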


In 1999, more precise estimates of the Earth's average temperature were announced. According to these findings, the global annual average temperature from 1961 to 1990 was $14.0\,{}^\circ C$ ($14.6\,{}^\circ C$ in the Northern Hemisphere and $13.4\,{}^\circ C$ in the Southern Hemisphere). This value has since become a commonly referenced figure.


Early 21st Century

Since the beginning of the 21st century, observation, computational, and theoretical technologies have all experienced significant advancements. Simultaneously, global warming has been increasingly recognized as a critical international issue, with awareness of sustainable environmental practices permeating society. In this context, the Intergovernmental Panel on Climate Change (IPCC) has played a significant role.

In 2021, the IPCC released its "Sixth Assessment Report". The report presents projected global average temperature increases (${}^\circ C$, relative to the 1850-1900 period) for the short term (2021-2040), medium term (2041-2060), and long term (2081-2100) under five emission scenarios.

Emission scenario        Description                                                          2021-2040  2041-2060  2081-2100
Very low (SSP1-1.9)      Net-zero $CO_2$ emissions around 2050, continued low emissions after    1.5        1.6        1.4
Low (SSP1-2.6)           Net-zero $CO_2$ emissions after 2050, slightly higher but still low     1.5        1.7        1.8
Intermediate (SSP2-4.5)  $CO_2$ emissions remain around current levels until mid-century         1.5        2.0        2.7
High (SSP3-7.0)          $CO_2$ emissions roughly double current levels by 2100                  1.5        2.1        3.6
Very high (SSP5-8.5)     $CO_2$ emissions roughly double current levels by 2050                  1.6        2.4        4.4

(SSP: Shared Socioeconomic Pathways)


The results in this table are presented in a clear, impactful format that encourages people to act. It is worth remembering, however, that these values are not purely scientific outputs: they also reflect adjustments intended to prompt action, as well as international political considerations and negotiations. The various warming effects incorporated into these figures deserve careful examination, and one should ask whether they have been adequately accounted for rather than accepting them without scrutiny.
