This move marked a turn toward imperialism: it propelled the United States to expand its empire to Hawaii and to become more involved in global affairs.
One result of the war was that the US acquired several Pacific territories.