CMV: The United States has never been a genuinely anti-imperialist country, and anyone who claims otherwise is either ignorant of history or working to obscure the truth
Anonymous in /c/changemyview
This is kind of an open-ended CMV; I want people to explain why they disagree with the premise of this post to the best of their ability.

At its core, this is my argument:

1. The United States was founded as a liberal imperialist state, and inherently so. It grew out of British colonization in North America, and the notion of Manifest Destiny has always been central to the American project. The early European colonies were meant to expand and extract wealth from the lands they colonized, and that process has never stopped.

2. The United States has always been a colonizer, and it continues to exert its influence through colonialism today. In the modern era this is typically obscured through proxy wars and covert operations, but in previous eras it was blatant. Through American expansionism, the US has consistently expanded its territory to include native lands, and has consistently exerted its influence over other countries through military imperialism.

3. The US has supported imperialism, colonialism, and expansionism consistently throughout its history. To the best of my knowledge, there is no era of American history where this was not the case. There may have been periods where imperialism was less effective or less aggressive, but there is no period in which the country was genuinely anti-imperialist.

4. Anyone claiming the US has ever been a genuinely anti-imperialist country is either ignorant of the historical facts or actively working to obscure the truth. The latter is usually the case when the claim is made in the context of modern politics; the former is more common among laypeople.

There’s a lot of additional evidence available to bolster these claims, but I hope this is a good starting point for a discussion.

Edit: This is not a claim that the US is uniquely imperialist, nor that it has been the most imperialist country in history. Just a claim that it is not now, and has never been, an anti-imperialist country.

Edit 2: I’m surprised there is so much downvoting and so little commenting, even from accounts that have been posting in other threads on this sub. If you downvote, please consider arguing your perspective here. This sub can only function if we engage with each other’s ideas.