Imperialism refers to the practice by which powerful Western nations expanded their political and economic control over weaker nations and territories, a process that unfolded roughly between 1700 and 1950 across Africa, Asia, and other parts of the world. European countries and the United States aggressively colonized these regions and asserted their authority through military force. During this era, much of Asia and Africa came under the direct or indirect domination of Western colonial governments.