I'm looking for people's thoughts on the statement that feminists often make: "Teach men not to rape!"
How is this statement socially accepted instead of being decried for the bigotry that it is? Its premise implies that men are all rapists by their very nature who need to be trained otherwise.
How is this statement any different from:
~Teach black people not to steal!
~Teach Muslims not to be terrorists!
~Teach Jews not to be cheap!
~Teach women how to drive better! (That one actually makes me laugh a bit.)
Why is bigotry aimed towards men so openly and unquestionably accepted by society?