America has been doing just fine in combat since World War II, hand-wringing notwithstanding.
I'm seeing frustration that post-World War II America isn't allowed to win wars.
I’ve been saying this for some time now: post-WWII, Western militaries are no longer allowed to win wars.
Every civilian casualty on the opposing side is branded a war crime without any factual basis. Every targeted strike is deemed an “escalation,” even if it is a response to a planned attack from the enemy (as is the case here). Every siege is treated as de facto unlawful (even though sieges are perfectly legal under international law). No international order can function like this, where the bad guys routinely survive and re-offend because total victory is simply not permitted.
The point about unfair war crimes accusations and uneven judgment is spot on. It is related to what I think are double standards in judging whether America wins on the battlefield.
But recall the observation that North Vietnam lost all the battles against American troops yet won the Vietnam War. That isn't strictly accurate, but it's close enough to make a good point.
Now apply that observation to post-World War II America. Maintaining the post-World War II world that America built is the war. The "wars" we have fought since World War II are really campaigns within that larger war. That should adjust our judgment of them, no? America won the Cold War despite our alleged failures in the campaigns that made it up.
We continue to win that war to remain king of the hill. We can hold our heads up a little higher, eh?
NOTE: TDR Winter War of 2022 coverage continues here.
NOTE: You may also like to read my posts on Substack, at The Dignified Rant: Evolved. Go ahead and subscribe to it.
NOTE: I made the image with Bing.