April 24, 2008
from DailyGalaxy Website
Evidence has mounted that global warming began in the last century and that humans are, at least in part, responsible. The concern is that the warming of our climate will greatly affect its habitability for many species, including humans.
Both the Intergovernmental Panel on
Climate Change (IPCC) and the U.S. National Academy of
Sciences concur that this is the case. But some argue that this
thinking is too limited. They say that too many scientists are
either ignoring, or don’t understand, the well-established fact that
Earth’s climate has changed rapidly in the past and could change
rapidly in the future—in either direction.
One such event began and ended rather abruptly, and for its entire 1,000-year duration the North Atlantic region was about 5°C colder.
Could something like this happen again? It sure could, and because such changes can unfold within a single decade, we might not even see it coming.
Could we be preparing for the wrong kind of climate change?
Chapman believes the current lack of sunspots is the reason the world cooled rapidly between January last year and January this year, by about 0.7°C.
However, scientists from the US National Center for Atmospheric Research published a report in 2006 concluding that the Sun likely has a negligible effect on recent climate change.
Another study, recently published in the Institute of Physics' Environmental Research Letters by researchers from Lancaster and Durham Universities, found no strong correlation between cosmic rays and the production of low cloud cover. If that is correct, the lack of sunspots is not necessarily an indicator of higher cloud cover and subsequent future cooling.
On the other hand, since widespread temperature records have been kept for only a relatively short period of Earth's history, it's hard to know exactly what these temperature increases mean from a long-term perspective.
Chapman proposes preventive measures to slow any potential cooling, such as bulldozing Siberian and Canadian snow to make it dirty and less reflective.
Canadian scientist Kenneth Tapping of the National Research Council has also noted that solar activity has entered an unusually inactive phase, but what that means, if anything, is still anyone's guess.
Oleg Sorokhtin, a fellow of the Russian Academy of Natural Sciences, agrees with Chapman. Sorokhtin believes that, in spite of the results of certain recent studies, the lack of sunspots does indicate a coming cooling period. In fact, he calls manmade climate change "a drop in the bucket" compared to the cold brought on by inactive solar phases.
Reichler is probably right, but it wouldn't be the first time a fringe opinion turned out to be onto something.
But from a broader perspective, does it really matter who’s “right” as far as preparations go?
Whether the climate gets cooler, gets warmer, or does nothing at all, people will still need massive amounts of energy. Even if we took the reverse approach and intentionally increased greenhouse gases in the atmosphere to stave off cooling, the likely result would simply be more pollution from conventional energy consumption's many toxic byproducts.
But what can't be disputed is that humans are polluting the planet. Current and future weather conditions do not change the fact that using oil and coal for energy isn't a good long-term idea. The need for cleaner energy, cleaner air, and cleaner water has never been greater. The widespread call for better resource management and habitat protection doesn't change with the thermometer.
If we prepare for global warming in ways that help protect the environment, we'll still be a lot better off, even on the off chance that we end up with a mini Ice Age instead.