A common complaint among conservatives is that, in general, our institutions of higher learning in this country (colleges, universities, and so on) are "liberal": they teach "liberal concepts," they cater to liberals, all of the professors are liberals, and so forth. Let's assume, for the sake of argument, that this is correct, and that our colleges and universities are, for the most part, liberal. In your opinion, what is it about our colleges and universities that makes them liberal, and why do you think they came to lean that way rather than toward conservative thought?