In recent times, some Christians, particularly those of a more Evangelical orientation, have claimed that "Christianity is not a religion." How far back does this claim go? Is it unique to the United States and to people in other countries who have joined US-dominated denominations?
by /u/JJVMT in /r/AskHistorians
Upvotes: 4