Colleges Aren’t Indoctrinating Anyone

Patricia Hughes
4 min read · Apr 2, 2024
Photo by Tim Gouw on Unsplash

How Education Influences Political & Ideological Beliefs

“There is no wealth like knowledge, and no poverty like ignorance.”

~ Ali ibn Abi Talib

Former Secretary of Education Betsy DeVos and President Trump drew attention for accusing college faculty of indoctrinating students. In Florida, Governor Ron DeSantis signed a law requiring public colleges to survey faculty, staff, and students about their political beliefs. These accusations aren’t original; higher education has been charged with indoctrinating students many times before. We hear plenty of accusations, but little evidence to support them.

Forcing college students, faculty, and staff to divulge their political beliefs is most likely a violation of their First Amendment rights. Behind the attacks on college professors is the insistence that graduates tend to be more liberal because they were indoctrinated in college. Politicians like Trump, DeSantis, and many others repeat this claim over and over, but where is the evidence? Is college really indoctrinating our children? Thanks to those politicians, many conservative parents think so.

Solving Problems That Don’t Exist

