The Role Of Native American Culture In The 17th Century

The 17th century, known as the colonial era, reshaped what would become the United States and brought many changes to Native American culture. The whites who overtook the native lands believed they were transforming the culture, when in reality they were not. Property was taken from the natives, so “the term New World” does not fit (Rury 27). Several cultural traditions and religious values were adapted to life in the Americas and became the predominant culture among the people. Anyone who resisted the new lifestyle and concepts was left behind as the civilization grew and progressed. Since religion played an essential role during the 1600s, a majority of the colonies adopted some form of it. Different colonies had