I wasn't talking about you personally, but don't you think that feminism has degraded men? Men are legally required to sign up to go to war if our country calls us; women have no such duty. In fact, women have no legal duty to the country at all, yet it has always been understood that women have a duty to men. Women have used feminism to shirk that duty and to spread false stories about men systematically degrading women. Men are made to look like idiots almost everywhere in the media, especially in commercials. I could go on, but I guess you won't want to discuss the issue.