Before the invasion of Iraq, the US government would have everyone believe that Saddam Hussein posed an imminent threat to the West, was in some way responsible for 9/11, and had his hands on weapons of mass destruction. One epic war and trillions of dollars later, we know that virtually everything the Bush Administration told us was, in fact, a pack of lies.
One major lesson that should have been learned is that we shouldn't always believe the official government narrative; a healthy dose of skepticism is always a good thing, particularly in the age of media propaganda. So why is it that many Americans (particularly men) believe the US 'won' the war in Iraq? The official narrative, as dictated by the White House, is that all military objectives were met and American troops came home after making Iraq a safer place.
But what is the truth? Noam Chomsky provides some harsh realities:
The United States was seriously defeated in Iraq by Iraqi nationalism -- mostly by nonviolent resistance. The United States could kill the insurgents, but they couldn't deal with half a million people demonstrating in the streets. Step by step, Iraq was able to dismantle the controls put in place by the occupying forces. By November 2007, it was becoming pretty clear that it was going to be very hard to reach U.S. goals. And at that point, interestingly, those goals were explicitly stated. So in November 2007 the Bush II administration came out with an official declaration about what any future arrangement with Iraq would have to be. It had two major requirements: one, that the United States must be free to carry out combat operations from its military bases, which it will retain; and two, "encouraging the flow of foreign investments to Iraq, especially American investments." In January 2008, Bush made this clear in one of his signing statements. A couple of months later, in the face of Iraqi resistance, the United States had to give that up. Control of Iraq is now disappearing before their eyes.
Politically, it is nearly impossible for governments to admit they have lost a war. The war in Vietnam continued for years after it was clear the US was never going to achieve its initial objectives, in large part because of the political ramifications presidents would have faced had they accepted defeat. The US government prides itself on commanding the greatest military force on earth, and losing to an impoverished nation with a barely functioning military isn't exactly part of America's carefully constructed image.
When 'winning' a war means dropping $2 trillion, elevating Iran's prestige in the region, increasing the threat of terrorism worldwide, and watching most of the newly opened access to Iraqi oil benefit China, what exactly does losing look like?