
History of the United States

The United States originated as British colonies along the eastern coast of North America.
After the French and Indian War between France and Britain, Britain began taxing the colonies to cover its war debts.
The colonies demanded self-government, so they united and fought a war of independence against Britain.
France helped them win the war, and the new nation then expanded westward across the continent.

Expansion: https://www.quora.com/Why-didnt-the-US-colonize-other-countries-like-Britain-did/answer/Dan-Bradbury-22
