For those who believe the United States was founded as a Christian nation: what do you think should change in our country as a result? I'm wondering if, and how, this might affect us as current citizens.

I'm curious what being a "Christian Nation" would actually entail.