Of course, we'd ALL love to make America strong again. But are we dependent on politics for that to happen?

Is it only through politics that America can become great again? Are there other forces at work as well? (Some countries think America is pretty great anyway...) And to NOT let refugees in...is that going to make America great again?