The United States As A Christian Nation

Was the United States founded as a Christian nation? Have you ever wondered why people in American society seem to think almost any kind of behavior is acceptable? The Christian views once instilled in United States citizens have grown sparse compared to what they once were, and many people's opinions on a range of issues have changed. Many founding views were centered on Godly principles, as reflected in both the U.S. Constitution and the Declaration of Independence. As society continues to argue over the place of religion in politics, its members feel the effects in many areas, particularly in the education system. I strongly agree that the United States was founded as a Christian nation.

Religion has been a large part of this country since some of its earliest settlers, the Pilgrims. Breaking away from the Church of England, the Pilgrims settled in what is now Massachusetts in 1620 in order to exercise religious freedom. They wanted to practice Christian beliefs similar to those practiced by earlier Christians. Based on the views of these founding settlers, many would cite this as support that America was founded as a Christian nation.

Fast forward to the late 1700s, when both the Declaration of Independence and the United States Constitution were written. Both documents show signs of a Christian foundation, with multiple references to God. As stated in the Declaration, “[They] hold