Why is it all about the college degree?
We know that people are becoming more educated, so why are companies still fixated on the college degree? More and more people have degrees but barely passed their classes, while plenty of people like me, who were careful enough to avoid the web of debt, don't have one.
I didn't want to end up $30,000 in debt for a degree in a subject that is no longer in demand. I made that choice, but the opportunities that were once available to someone in my shoes are no longer there. Now all of those opportunities go to someone with a degree, 15 years of professional work experience or not.
What do you think the degree actually demonstrates to a company?