Definition of expansionism:
The doctrine of expanding the territory or the economic influence of a country.
Throughout the late 19th century, the US expanded its territory, especially by acquiring land and islands in the Pacific Ocean. More detailed information on this will follow later.