When did it become necessary to have a college degree?

John

New member
A lot of jobs these days require a bachelor's degree, and some teaching positions require a master's degree. Schools encourage students to go to college after they finish high school. When did it become necessary to have a college degree? Wasn't it sometime in the 1960s or 1970s?