While the United States is not a Christian nation constitutionally or by legislation (which the First Amendment would prohibit), its culture is Christian, shaped by its traditions and by the dominance of Christianity among the populace. This is the reason the U.S. has become great, so make no mistake: the elimination of Christianity from the public square would spell doom for the U.S.