http://mcnamarafallacy.com/

The McNamara Fallacy
measurement is not understanding

When Metrics Mislead

In the 1960s and 1970s, the United States was mired in a war that didn't make any sense. Huge numbers of American soldiers were flown over the Pacific Ocean to fight on behalf of South Vietnam against the Viet Cong and the Communists of North Vietnam. 849,018 of America's enemies were killed during direct US involvement in the war, while only 58,318 American soldiers were killed. Nonetheless, the United States lost the Vietnam War.

How could that have happened? The US military walked straight into the McNamara Fallacy.

Quantification at War

The McNamara Fallacy is named after Robert McNamara. As U.S. Secretary of Defense from 1961 to 1968, McNamara was responsible for organizing American strategy in the Vietnam War. As a business executive, McNamara had learned to put a priority on quantitative metrics. Following the professional culture of scientific measurement established under Frederick Taylor, McNamara decided that he could win the Vietnam War by quantifying it.

McNamara tracked the progress of the war by focusing on the ratio of enemy fatalities to American fatalities. As long as there were more enemy deaths than American deaths, McNamara concluded that the military was on the path to victory.

What McNamara didn't keep track of was the narrative of the war: the meaning it had both within the military forces of each side and within the civilian populations of the nations involved. Instead, he applied the business aphorism that you can't manage what you can't measure, and treated his metrics as if they were in themselves the definition of success. McNamara insisted that if factors could not be quantified, they were not relevant to the management of the Vietnam War.

The consequences are well known. Despite achieving his metrics, McNamara lost the war.

The Quantitative Fallacy

The McNamara Fallacy is to presume that (A) quantitative models of reality are always more accurate than other models; (B) the quantitative measurements that can be made most easily must be the most relevant; and (C) factors other than those currently being used in quantitative metrics must either not exist or not have a significant influence on success. This flawed approach to reasoning is also known as the quantitative fallacy.

This isn't to say that quantification is always a mistake. The mistake of the McNamara Fallacy is to presume that quantitative analysis is always the most effective option. Careful quantitative thinkers recognize the limitations of the metrics they employ. Statistician W. Edwards Deming, a pioneer of the quality management movement, warned his followers that, "Nothing becomes more important just because you can measure it. It becomes more measurable, that's all."

The Fallacy in Afghanistan

One might presume that the U.S. military had learned its lesson with the loss of the Vietnam War, and adopted measures to avoid chasing metrics to the detriment of comprehensive models of conflict. This presumption would be mistaken. In 2019, after 18 years of war against the Taliban in Afghanistan, The Washington Post used the Freedom of Information Act to obtain a collection of US government documents showing that US military and civilian Pentagon leaders had systematically provided false information about the progress of the war, in order to meet metrics of success.
For instance, the US military regularly reported that it had trained large numbers of soldiers in the central Afghan army, when in fact the soldiers who had supposedly been trained did not exist. In another case of military metrics gone wrong, the US military reported success in undermining Taliban financing after it paid Afghan farmers to destroy their crops of opium poppies. What went unreported, however, was that the farmers planted larger fields of opium poppies in response, in the hope that they might be paid by the US military to destroy the crops again. When US payments didn't come through, the opium was harvested and entered the international drug trade. Much of the profit went to support the Taliban's anti-American military operations.

As in the Vietnam War, quantitative metrics were used to present a false picture of progress, while other sources of information that indicated problems in the war in Afghanistan were ignored. Colonel Bob Crowley, a senior US counter-insurgency advisor, reported that, "Every data point was altered to present the best picture possible. Surveys, for instance, were totally unreliable but reinforced that everything we were doing was right and we became a self-licking ice cream cone."

In the end, reliance on quantitative measurements of progress brought failure, and the US negotiated a retreat from Afghanistan. After a generation of war, there would be no victory.

The Vulnerability of Data-Driven Business

The McNamara Fallacy isn't just a problem for military strategists. It looms over any field of human activity in which people are tempted by the apparent ease of management by the numbers. In our time, digital technology is the greatest source of this temptation. With skills of misdirection worthy of a practiced magician, digital businesses urge us to focus on the quantitative data delivered by online analytics systems while ignoring cues from the offline world, because those cues don't come in the form of a cute, concise, glowing dashboard. All too often, the data-driven approach puts consumers and corporations alike in the position of Secretary McNamara, discovering only when it's too late that an obsession with measurement makes management impossible.

In the early years of the 21st century, consumers fell en masse for the McNamara Fallacy of social media, even as the business world was besieged by hordes of self-appointed social media marketing experts. The saturation of affirmative messages about the power of quantification made the allure of the social media "like" difficult to deny. Many had doubts, but were reluctant to be the first person to point out that the digital emperor was not wearing any clothes. So, few people spoke out in objection to the large-scale conversion of social networks into digital form.

For a while, the promises of social media marketers seemed to be coming true. It seemed that we were all becoming "influencers", and that the financial rewards we deserved would soon be on their way. It's difficult to deny that progress is being made when the numbers keep going up, after all. The question of whether there was any connection between our on-screen numbers and our off-screen needs faded as we were distracted by the emotional boost we received every time we looked at the metrics of our followers, our likes, and our retweets. We became measurement junkies, hooked on the dopamine shots of reinforcement that accompanied every digital notification.
The more invested we became in the Quantified Self, the less capable we became of perceiving factors outside of the digital frame. The result has been social isolation, economic inequality, and declining physical health.

Avoiding the McNamara Fallacy

Those who are concerned about falling into the trap of the McNamara Fallacy shouldn't abandon quantitative measurements and metrics. Quantification is a valuable analytic tool when it's applied properly. The key is to broaden the analysis, to consider progress and challenges from many perspectives, both quantitative and qualitative.

It's especially important for organizations to do more than make mere gestures toward qualitative inquiry with quick and superficial methods such as focus groups or surveys. Genuine in-depth interviews, analytically deep rather than merely long, are an important tool. So is ethnography that's culturally immersive, more involved than the brief observational visits typically done by design thinkers.

Above all else, it's vital to keep research human. Reality is multidimensional, and people are skilled at looking at issues from multiple angles, following their curiosity even when it doesn't seem at first to have a direct, linear connection to the bigger situation. Instead of relying solely on narrow algorithmic analysis, organizational leaders need to learn to trust the special ability of human minds, honed over millions of years of natural selection, to pick up subtle clues in complex situations.

Copyright (c) 2022 Jonathan Cook. All rights reserved.