Is America a “Christian Nation”?
A while back, the President made headlines (and, it would seem, made a lot of people irate) when he said words to the effect that America is not a “Christian nation”. This really cheesed off a lot of people, not all of whom are even particularly devout.
I ask, “What’s all the fuss about?”
- Can anyone even define what it means for a nation to be a “Christian nation”?
- What practical difference does it make whether or not we consider ourselves such?
- Are people more likely to become committed followers of Jesus if we can somehow prove that we are, in fact, a “Christian nation”?
- Will we somehow be more moral, or what-have-you, if we can prove this?
- If not, what other tangible benefits would derive from such a designation?
- If we can demonstrate none, why should we care what some politician thinks?
Talk amongst yourselves…