According to Pew research, by a steep 22-point margin, Republicans now believe American colleges and universities have a net negative effect on the country. In 2010 …
Most colleges and universities are nothing more than Liberal indoctrination mills, so of course they're detrimental to America. With their preponderance of Liberal/Socialist/Marxist professors and instructors, they cannot be anything else.
From '95 to '98 I attended a Liberal Arts (heavy on the "Liberal") college. Out of all of my professors, only one PhD holder kept her political affiliation and beliefs to herself. She allowed divergent opinions to be discussed and debated in her classroom without retribution or retaliation.
I was fortunate to graduate in 2016 from a university that frowned on professors teaching their political beliefs. I didn't have a single professor who attempted to influence students in any way, shape, or form to accept or reject any political party. All debate was strictly moderated so that ideas could be shared without rancor. As a matter of fact, a couple of professors were released because they failed to abide by those standards.
So yes, there are some who actually try to teach instead of steering and molding students toward a particular political ideology, be it Left or Right.