I've had a discussion on this topic held in one of my classes before and it sparked an interesting debate.
As y'all know the Pledge of Allegiance mentions God:
Quote:
I pledge allegiance to the Flag of the United States of America, and to the Republic for which it stands, one Nation under God, indivisible, with liberty and justice for all.
The phrase "In God we trust" was also adopted as the official motto of the United States in 1956, and it began appearing on currency as early as 1864.
Some people argue that because Christianity was a major religion in our country back in the day, the mention of "God" should remain ingrained in our culture. Others argue that we have a separation of church and state and the nation is home to many beliefs besides Christianity, which is why they believe the word "God" should be omitted.
What are your thoughts, ATRL? Should the word "God" remain in our American culture, or should that change?