In public high school, I was always taught that the concept of owning land was completely foreign to Native Americans when European settlers arrived, and that this was a major reason settlers were able to convince Natives to sign treaties ceding land. Is this true?
Upvotes: 94