The American Dream is dead. I grew up being told that if I worked hard I could be whatever I wanted to be, that I could build a better life for myself, and that higher education was of the utmost importance.
The cost of living rises, wages stagnate, higher education no longer guarantees a job and instead shackles you with crippling debt, and worst of all, I feel like my government, my country, has turned its back on me.
Sorry for the depressing opinion, I just felt the need to shout into the void. I know things are probably better than I think, it's just hard to see that when I'm struggling just to live, amirite?