It's interesting how France went from attempting to wrest the American colonies from the British in the French and Indian War and place them under French rule, to later helping those same colonies win their independence from England. Enlightenment ideas probably played a role in that shift, but it is ironic nonetheless.
Enlightenment, yes, but I'd also bet on the history they had ^^