Colleges have become left-wing indoctrination centers, and an increasing number of Americans see it.
A recent survey found that a growing share of Republicans and Republican-leaning independents view colleges as bad for the country.
From the Pew Research Center:
The Growing Partisan Divide in Views of Higher Education
Americans see value in higher education – whether they graduated from college or not. Most say a college degree is important, if not essential, in helping a young person succeed in the world, and college graduates themselves say their degree helped them grow and develop the skills they needed for the workplace. While fewer than half of today’s young adults are enrolled in a two-year or four-year college, the share has risen steadily over the past several decades. And the economic advantages college graduates have over those without a degree are clear and growing.