The basic reality of the situation is that global warming is probably real, but human activity has little or nothing to do with it.
Present accepted scientific paradigms are a big part of the problem. The standard paradigm, which holds that stars are nuclear fusion engines, does not allow for a star itself to go through hotter and colder cycles, or for anybody to use such cycles as a plausible explanation for global warming.
There is another view of the situation, and it goes like this: in real life, stars are electrical engines and not thermonuclear engines. A star is a focal point of a cosmic electrical discharge, and the fires and light you see on the surface of a star are the same kinds of fires and light you see from an arc welder.
Our own sun in fact behaves entirely like an electrical phenomenon and not a thermonuclear one. A thermonuclear star would simply keep getting cooler from the center of the star outwards and on into space. In reality, the sun's environment gets enormously hotter as you move outward from the photosphere (the visible surface) through the chromosphere and into the corona.
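To put "enormously hotter" into rough numbers, here is a minimal sketch using commonly cited, approximate temperatures for those layers; the specific figures are my own illustrative additions, not taken from the sites listed below, and they are order-of-magnitude values only.

    # Commonly cited approximate temperatures for layers of the sun's atmosphere.
    # These figures are illustrative assumptions, not measurements from this post.
    layer_temps_kelvin = {
        "photosphere (visible surface)": 5_800,
        "chromosphere (upper)": 20_000,
        "corona": 2_000_000,
    }

    surface = layer_temps_kelvin["photosphere (visible surface)"]
    for layer, temp in layer_temps_kelvin.items():
        # Show each layer's temperature and how many times hotter it is than the surface.
        print(f"{layer}: ~{temp:,} K ({temp / surface:.0f}x the photosphere)")

Running this prints the ratios, which is the point of the comparison: by these commonly quoted figures the corona comes out hundreds of times hotter than the visible surface rather than cooler.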
As a star moves through regions of space with greater or lesser electrical potential difference, it goes through hotter and colder periods; hence the "mini ice age" of the 1600s and the warming period of today.
There are a number of sites on the net which deal with this sort of thing, e.g.:
http://electric-cosmos.org/
http://holoscience.com/
http://kronia.com/
http://thunderbolts.info/
http://grazian-archive.com/Quantavol.htm
In particular, doing what Al Gore and others would have us do, going back to the age of horses and buggies on the wrongheaded assumption that man is responsible for "global warming", would be idiotic as well as counterproductive.
Or, as Rush Limbaugh puts it, if WW-II didn't cause the great man-made ecological disaster, it's not gonna happen.