Between 1890 and 1914, the United States began to act as an imperial power, even though it had previously lacked the overseas colonies held by many European nations. Several factors contributed to American imperialism: the desire for national prestige in competing with European empires for colonies, the search for new markets for American goods, and beliefs in racial superiority expressed in the idea of the "white man's burden." Through war and negotiation, the U.S. acquired Hawaii, Puerto Rico, Guam, and the Philippines as colonial possessions, establishing itself as an emerging world power with interests across Asia and Latin America.