When I was in college, my experiences with liberal professors were similar to those that this author describes. I don't think my professors were as bad as this guy's, but maybe I just wasn't paying close enough attention. Anyway, the article is pretty good. I'd be interested to know if anyone else had similar experiences or if this sort of thing is unusual.
The title is: "Lies My Teachers Told Me"
Link to the article.
Quote:
"Just over a year ago, I received a degree in history from the largest state college in Massachusetts. Having enrolled as a history major with the hope of acquiring a well-rounded historical perspective, I assumed that my professors would encourage my disinterested ambition. I was wrong.
With just a few exceptions, my professors were leftists who infused their lectures with political opinion and assigned reading material that reinforced their leftwing views. For information that would help me develop a balanced perspective on history, particularly the history of the West, I was forced to look beyond the confines of my formal education."