Ancient Rome is referenced and revered a lot in the West, particularly the Roman Republic. Was the Roman Republic always the focus, or is that emphasis a result of democracy rising in America and elsewhere? For example, did Imperial Europe emphasize the Roman Empire rather than the Republic?
Upvotes: 20