/r/Christianity
So why are Western Christians so embarrassing that they act as if Christianity is a Western religion and not a religion that emerged from the Middle East?