History & Social Studies
What is American Imperialism in the nineteenth century?
Asked by
ranikaprince
Last updated by
Jill W
“American imperialism” refers to the expansion of United States economic, military, political, and cultural influence beyond its borders. In the nineteenth century, this took the form of territorial expansion across the continent and, by the century's end, the acquisition of overseas territories such as Hawaii, Puerto Rico, Guam, and the Philippines in the wake of the Spanish-American War of 1898.