What is American imperialism in the nineteenth century?
Asked by ranikaprince
“American imperialism” refers to the expansion of the economic, military, and cultural influence of the United States beyond its borders.