Decisions, decisions. We may dread having to make them, but we know we'll feel a sense of relief once they're made...unless we don't, when doubts (aka "buyer's remorse") intrude. We'll feel satisfaction when our decisions turn out to have been good ones, and we'll grind our teeth when they don't, but how much do we learn from either outcome? Some of us -- who tend to be leaders -- almost enjoy making decisions, but are such people better decision makers than others who are more deliberate? Not necessarily. Decisiveness is generally considered a virtue, but being decisive for its own sake can be as dangerous as protracted vacillation.
Some decisions are made in an information-rich environment, where the quality of a decision is directly related to our ability to process that information. The game of chess is the ultimate example; once computers got powerful enough, they could blow away grand masters. But few real-world decisions of consequence are like that. Most are more like poker, where there is a limited amount of hard information available, and success depends on making inferences from softer information like others' betting and squishy factors like reading your opponents (are they for real or bluffing?) while making yourself as unreadable as possible. Decisions in business -- at least those that are not purely technical -- share poker's need for going well beyond cool, rational analysis of hard data. Decisions in government invariably do, especially in a democracy.
GOOD DECISIONS: A MIX OF SCIENCE AND ART (WITH A TOUCH OF ALCHEMY)
The objective of any decision is to achieve the best possible outcome based on what we can reasonably be expected to have known and understood at the time we made the decision. What do we need in order to decide? Beginning with the science part first, we need:
- Facts, obviously, but they must be carefully distinguished from opinions, beliefs, assumptions, conventional wisdom, and hopes. Not that those aren't sometimes useful, but we need to recognize them for what they are and treat them accordingly. Specific facts we need include details of the situation, our options, risks, rewards, and constraints, as well as the urgency of the situation and the consequences of inaction.
- Insights developed from personal experience and comparison with historical analogs and precedents.
- Logic and reasoning are essential in making sense of how the things we know, believe, and assume interrelate and apply to the situation at hand. Game theory, a branch of mathematics, can be helpful in cases where a number of options are available and you want, for example, to minimize how bad the worst case would be, which is not necessarily the same as maximizing how good the best case could be.
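To make that last distinction concrete, here is a minimal sketch in Python, using hypothetical payoff numbers of my own choosing (nothing here comes from the article). It compares a maximin rule, which picks the option with the least-bad worst case, with a maximax rule, which chases the best possible best case; the two can point to different choices from the same facts.

```python
# A minimal sketch with hypothetical payoffs: three options evaluated
# under three possible scenarios (say, demand falls, holds, or grows).
payoffs = {
    "conservative": [-1, 2, 4],
    "balanced":     [-3, 3, 8],
    "aggressive":   [-9, 1, 15],
}

# Maximin: pick the option whose worst case is least bad.
maximin_choice = max(payoffs, key=lambda opt: min(payoffs[opt]))

# Maximax: pick the option whose best case is best.
maximax_choice = max(payoffs, key=lambda opt: max(payoffs[opt]))

print("Best worst case (maximin):", maximin_choice)  # -> conservative
print("Best best case (maximax): ", maximax_choice)  # -> aggressive
```

Under these assumed payoffs, the cautious rule selects the conservative option while the optimistic rule selects the aggressive one; deciding which rule fits the situation is itself a judgment call, not a calculation.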
These ingredients are necessary but not sufficient for good decision making, and this is where art gets into the mix. Emotions matter, our own and those of others. There's a reason Captain Kirk was in charge of the Enterprise rather than Mr. Spock.
We need to know ourselves -- our tendencies, style, biases, self-image, and the image we want to project -- plus our own culture's beliefs, norms, and expectations and how those factors may affect our decisions for good or ill.
We need to understand our environment -- the people whom our decision will affect, the cultures in which those people live and work, and the politics (i.e., stakeholders who may or may not be willing to influence people and nudge culture).
Judgment is what good decision makers use to bring all these ingredients (calling them "tools" suggests they're more mechanistic than they actually are) together in the right proportion. The critical element is pragmatism, which comprises both science and art. Inquisitiveness and healthy skepticism are essential in making judgments. Like art, which it is, good decision making is hard to teach. Some fortunate people are naturals at it. Others can learn from mentors, examples, and mistakes. Still others never quite get the knack.
Unfortunately, an optimally made decision is not enough to guarantee a good outcome. There's also luck. Some well-made decisions inevitably prove wrong for reasons that could not have been anticipated, and some badly made ones back their way into fortuitous success. Both present "teachable moments" if we let them. Too often we punish those responsible for well-made decisions that don't pan out and reward the alchemists lucky enough to have gold paint spill on their lead. (Clairvoyance can be safely dismissed as an explanation for success!)
Adaptation and Flexibility
A decision is made at a given time, but few decisions once made are truly irrevocable in the face of changing circumstances and new information. While a U-turn without a compelling reason can be politically difficult, there is usually wiggle room for modifying goals and milestones and even approaches. Charging ahead regardless may seem heroic, but it is usually costly and futile. When things are not going as planned, the immediate question is what to do differently. For example, should we change the relative attention paid to the various ingredients? Do we need more rigor or formality in the analyses?
Course corrections are also teachable moments. Could we reasonably be expected to have anticipated the new circumstances, and if we could have, why didn't we? Is there a pattern of analysis or behavior that could be improved for the future?
THE ROLE OF IT
People made decisions for many millennia without the benefit of IT, and it's not self-evident that we make our really big decisions in the computer age consistently better than before. Smaller decisions, in relatively information-rich situations, are another matter. But IT, properly used, has become and will continue to be important to decision makers in critical ways:
- IT is really good at collecting, storing, retrieving, and analyzing facts and making them instantly available everywhere. The more heavily a good decision requires and relies on facts and rigorous analysis versus the other ingredients noted above, the more helpful IT can be.
- By shortening the time between when decisions are made and when their results can be seen, analyzed, and acted upon, IT allows decisions to become much smaller in scope, thus limiting their risk. Many more such decisions are then needed, but the sheer volume of data collected can help decision makers improve their rules and guidelines. (See sidebar "Zara: Shortening the Decision Cycle.")
- IT can help identify in a timely way when decisions are needed, for example, by early detection of problems and trends through dashboards, executive support systems, and business intelligence.
It should be obvious, but it bears repeating in view of the recent scandals that have rocked US veterans' hospitals: the information upon which decisions are based must be accurate! There's no room for fudges, and a culture that tolerates them (or even encourages them with a nudge and a wink) has to change. So the more information that can be gathered and transmitted directly, without the opportunity for creative massaging along the way, the better. (Zara has done this.)
HOW DECISION MAKING GOES WRONG
It is hard to generalize about what makes someone a good decision maker, other than having the intelligence, wisdom, and temperament to avoid common pitfalls. Some pitfalls are not IT-specific, although they apply to IT management as much as to anything else. Others apply broadly but particularly bedevil the work of IT people. Most important are the pitfalls that the use of IT itself can mitigate or amplify.
General Pitfalls That Also Apply to IT Management
- Failure to verify supposedly factual information. If you put garbage into your decision making, you'll get garbage decisions out. Managers in some US veterans' hospitals concealed critical information about their performance from executives in Washington, who never knew there was a problem until they were blindsided by its revelation. And just because IT can make information look highly authoritative, replete with clever graphics, that doesn't make the information any more intrinsically reliable than a scribble on the back of an envelope. As US President Ronald Reagan famously said, "Trust but verify."
- Failure to challenge received opinions, assumptions, and beliefs, whether our own or those of others. Examples of this failure could fill several books -- and have -- but here are a few specific cases: the infamous weapons of mass destruction that weren't in Iraq, Microsoft's failure to recognize the central importance of the Internet even as late as 1995 [1], and intelligence fiascos like the Bay of Pigs [2]. Another example is the failure of America's big three automakers to even try to comprehend why drivers began flocking to imports. In the mid-1980s, I interviewed a number of US auto industry executives. While they knew their respective companies had serious problems, they professed (and probably even believed) that their products were the best in the world, and that their declining market share was due to those devious foreigners planting doubts in American minds.
- Confirmation bias, which occurs when we subconsciously screen out information that doesn't agree with what we believe to be true. This is a particularly insidious form of the previous pitfall, because we're typically not even aware that we're doing it. Nobody is immune; the only countermeasure is constant examination and reflection to sort what we really know from what we or others think is true.
- Closing off options by deciding prematurely (or tardily). Decisive people fear tardiness much more than the opposite, making prematurity their more likely pitfall. We've all done things that have gone wrong and then said, "If only I'd known." Too often we could have known but did not want to take the time to learn more, or perhaps at some level we didn't want to learn something that would dissuade us from making the decision we wanted to make -- a conscious form of confirmation bias. Again, the US invasion of Iraq comes to mind, where the haste to have the war over before the brutal Iraqi summer arrived caused the premature cessation of the UN inspection, which had up to then revealed no weapons of mass destruction (as in reality none existed).
In 1993, a client of mine had to choose between Windows and IBM's OS/2, when it was still unclear which one would dominate desktops. Since a great deal of reengineering and business analysis for this project needed to be thought through, the company could have postponed the operating system decision by a few months, which would have revealed a lot. But it chose OS/2, just to get that particular decision behind it.
In contrast, the phrase "overtaken by events" describes the situation where options fall off the table before a decision is made, sometimes to the point where the decision is made for you. If it's not the decision you would have wanted, well...
- Going too countercultural. If the best decision we can conjure up is one that the culture will actively resist or passively undermine, it's not the right decision. For example, even many proponents of abortion rights now admit that the US Supreme Court's 1973 Roe v. Wade decision may have been too far ahead of its time, occurring before the political process could gain traction. It provoked a furious reaction across the US that persists to this day. By contrast, most other Western countries, even ones at least nominally Catholic, moved later on legalizing abortion, and protests there have been insignificant. An IT example would be installing a production and inventory control system that depends on timely, accurate input in a fudge-tolerant culture without first having implemented a serious change management program.
- Failure to learn from history. "This time it's different," we're often told. Few endeavors suffer as much as IT from optimism uninformed by bitter experience. "Yes, we've had our problems with previous initiatives, but this time we're using Agile techniques [3], so not to worry."
- Overlearning from history with flawed analogies. How often do we hear that "we tried that back in '03, and it didn't work"? Nuances matter. Sunday morning talking heads are quick to invoke Munich [4] or Vietnam as facile shorthand according to whether they are, respectively, for or against US action in some international trouble spot. It's a convenient way to avoid having to think through the nuances.
- Getting caught up in events and losing sight -- until it's too late -- of options to go our own way. A sterling example is the way Europe's great powers lurched into the bloodbath of World War I. (See sidebar "Exactly a Century Ago.") As things start to go wrong in an IT project, the project can go into a death spiral when nobody stops to look at the bigger picture instead of just "staying the course."
- Machismo, or making decisions quickly to demonstrate strength and cojones. Bullying and browbeating are often involved, as the decision maker exercises political clout to force his or her will, declaring that "failure is not an option."
- Groupthink. Participants get caught up in mutually reinforcing enthusiasm, drowning out questions and voices of caution.
- Doubling down. There's a saying that when you find yourself at the bottom of a hole, stop digging. Too often we switch to a bigger shovel (e.g., responding to a troubled project by adding staff who will only trip over one another) rather than understanding and adapting to the new reality.
- Freezing up in the face of seemingly overwhelming disaster. The hyper-urgent actions taken to keep the financial system from imploding in 2008 [5] were nothing the principals would want to do, or ever dreamt of doing, but they emerged as the least awful alternatives. Near equivalents in IT are events like natural disasters that take down vital infrastructure or cyber attacks that steal data, where organizations don't have the luxury of time for figuring out the ideal solution.
- Overly rigid adherence to abstract principles or standard procedures. The words "always" and "never" can be dangerous when taken too literally. This is particularly true in dire situations, where "I did it by the book" is no defense for letting a disaster get worse. As boxer Mike Tyson said, "Everybody has a plan until they get punched in the face."
- Making decisions too close to the vest even when secrecy is not critical, thus losing out on potential sources of knowledge and insight as well as reducing the scope of ownership. In IT projects, it's almost always a mistake to try to work around a difficult person who wields political clout that could affect the project's success.
- Death by a thousand cuts is where there is no one disastrous big decision, but rather a pattern of badly made small ones reflecting a dysfunctional culture. General Motors' recently revealed ignition switch problems, and the cavalier treatment this deadly defect received for years, are an example. In the IT world, it could be a culture of tacitly allowing scope creep, missed deadlines, empty promises, or shading of the truth when reporting progress.
- Letting long-term success blind us to changing realities so that we don't see the need for decisions. In IT, this could apply to security. Just because we've fended off cyber attacks and hackers for years doesn't mean there aren't some very talented criminals out there on the trail of our inevitable vulnerabilities.
Pitfalls That Apply Particularly to IT Management
- Letting perfection get in the way of improvement. There is a natural tendency, particularly in IT projects, to load on features, which results in bloat and budget- and schedule-busting complexity.
- Failure to recognize the probabilistic (rather than deterministic) nature of events, which leads to inadequate risk analyses and failure to develop contingency plans. No IT function lacks horror stories about projects derailed because tasks proved more difficult than anyone could have imagined, products did not meet expectations, vendors and contractors failed to deliver, or intended users did not effectively communicate their needs. All the promises may have been made in good faith, with nobody's head belonging on a pike, except possibly that of the decision maker who assumed the suspension of Murphy's Law. One of my clients tried to portray reality by showing a "worst case" scenario that went 20% over budget. I shouldn't have needed to point out that IT projects that come in "only" 20% over budget are usually occasions for high-fives. (The simulation sketch after this list illustrates the point.)
- Not anticipating the reactions of those who are affected by a decision but were not a party to it, and thus feel no ownership. IT functions have rich experience in this as well. So many IT-based innovations have fallen short or failed outright due to insufficient consultation with the intended beneficiaries, an oversight that is often compounded by insufficient preparation and training. This is mostly not IT people's fault, except insofar as they failed to insist on more rigorous change management and introduction processes. (Of course, that would not be politically easy in most environments.)
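To see why a 20% overrun is a poor stand-in for a worst case, here is a minimal Monte Carlo sketch in Python. The task estimates and the skewed cost distribution are hypothetical assumptions of mine, not figures from the article; the point is only that when individual task outcomes are uncertain and skewed toward overruns, the project total behaves probabilistically, not deterministically.

```python
import random

# Hypothetical sketch: each task has a planned cost, but real outcomes are
# skewed -- overruns are larger and more common than underruns. We simulate
# many possible projects and look at the spread of total cost.

TASK_ESTIMATES = [100, 80, 120, 60, 140]  # planned cost per task (arbitrary units)

def simulate_project():
    total = 0.0
    for estimate in TASK_ESTIMATES:
        # Assumed skewed outcome: usually near the estimate, sometimes far over.
        multiplier = random.triangular(0.9, 2.0, 1.0)  # (low, high, mode)
        total += estimate * multiplier
    return total

random.seed(42)
runs = sorted(simulate_project() for _ in range(10_000))
planned = sum(TASK_ESTIMATES)

median = runs[len(runs) // 2]
p90 = runs[int(len(runs) * 0.9)]
over_20pct = sum(1 for r in runs if r > planned * 1.2) / len(runs)

print(f"Planned cost:           {planned:.0f}")
print(f"Median simulated cost:  {median:.0f}")
print(f"90th percentile cost:   {p90:.0f}")
print(f"Chance of >20% overrun: {over_20pct:.0%}")
```

Under these assumed distributions, the median simulated cost already exceeds the plan, and overruns beyond 20% are routine rather than extreme. Whatever the real numbers, contingency plans should be built against the distribution of outcomes, not against a single hoped-for figure.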
Pitfalls in the Use of IT for Decision Making
- Letting quantitative analyses crowd out the qualitative. The use of surrogate or indirect measures is appropriate when direct measures are impossible or impractically expensive. The problem comes from using them as the only measures, especially for high-stakes decisions. It's even more problematic when the surrogate measures can be gamed by people who will be impacted by them. (See sidebar "Measurement Gone Wrong.")
- Insufficient skepticism about numbers. Numbers convey a level of exactness that can become an intellectual shortcut, obscuring the need for more in-depth understanding of just how they are derived and what they do and don't tell us. (As Mark Twain said, "There are lies, damned lies, and statistics.") Why should we believe they capture the essentials we need to support good decisions? What possibly essential information is not captured? Measurement of social systems is at least as much art as science. How we phrase the question or describe what we want measured can profoundly influence the quality and usefulness of the information we receive. Just ask any political pollster about this.
- IT's great strength can be its weakness. Because IT is so good at manipulating and presenting information in a way that just seems so credible, it amplifies our tendency to rely on that information too heavily.
- Big data can easily become big misinformation. "Data scientist" is an impressive new job title, but like science practiced at a laboratory bench, there's a lot of janitorial work involved. As reported recently in the New York Times, "Data scientists, according to interviews and expert estimates, spend from 50% to 80% of their time mired in this more mundane labor of collecting and preparing unruly digital data, before it can be explored for useful nuggets." [6] As anyone who has ever tried to merge different coding schemes or scrub a mailing list knows, this kind of work is painstaking, frustrating, and a bit boring (see the sketch after this list). Since "done" is not unambiguously definable, it's easy to take shortcuts in response to business pressure. Sometimes that's necessary and appropriate, but never without caveats.
- Analysis paralysis is aided and abetted by IT's ability to produce endless scenarios for us to analyze and compare.
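As a small, hypothetical illustration of that janitorial work (the records, field names, and mapping rules below are invented for the example, not taken from the article), here is the kind of normalization and deduplication a mailing-list scrub involves, sketched in Python:

```python
# Hypothetical sketch of mailing-list "janitorial" work: normalizing fields
# and collapsing near-duplicate records before any analysis can begin.

records = [
    {"name": "J. Smith",   "email": "J.Smith@Example.COM ", "state": "Massachusetts"},
    {"name": "John Smith", "email": "j.smith@example.com",  "state": "MA"},
    {"name": "Ann Lee",    "email": "ann.lee@example.org",  "state": "ma"},
]

# Two coding schemes for the same field (full names vs. abbreviations)
# have to be reconciled with a hand-built map.
STATE_CODES = {"massachusetts": "MA", "ma": "MA"}

def clean(record):
    return {
        "name": " ".join(record["name"].split()),
        "email": record["email"].strip().lower(),
        "state": STATE_CODES.get(record["state"].strip().lower(), "UNKNOWN"),
    }

# Deduplicate on the normalized e-mail address, keeping the first occurrence.
seen, deduped = set(), []
for rec in map(clean, records):
    if rec["email"] not in seen:
        seen.add(rec["email"])
        deduped.append(rec)

print(deduped)  # two records remain; whether that is "done" is a judgment call
```

Even in this toy case, deciding whether two records are "the same" and which coding scheme wins is a judgment call, which is why "done" resists a crisp definition.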
TACKLING A QUINTESSENTIALLY HUMAN CHALLENGE
Decisions can affect the next five seconds or the next five centuries. Decision-making techniques, using that term loosely, range from carefully crafted algorithms to "seat of the pants"; each approach has its role. Since the ability to decide is what makes us human, decision making showcases every human foible. Our lives and the world we live in have been shaped for good or ill by an infinity of decisions. As long as humankind exists, people will analyze and try to improve how we make decisions. And while some of us -- those in powerful positions, we hope -- should gradually get better at it, progress won't be monotonic, and there is no endpoint. Nevertheless, we must chip away, trying to learn from past mistakes -- our own and those of others. We can do that by reading history, psychology, and classic stories. (See sidebar "A Rich Subject.")
Here are four guidelines that apply to decision makers at all levels:
- Be inquisitive. Ensure you really understand the nature of the decision and its ramifications. Asking lots of questions is a sign of wisdom, not a confession of ignorance.
- Be skeptical. Don't just accept answers at face value.
- Be diligent. Making good decisions is not easy; tools and techniques can help, but only if their limitations are understood. If not, they're dangerous.
- Be humble. Nobody is immune to mistakes, and surrounding oneself with yes-people greatly increases the likelihood of going wrong. The destruction caused by hubris over the millennia is incalculable.
ENDNOTES
[1] Bill Gates' book The Road Ahead, published that year, barely mentioned it.
[2] In 1961, the CIA was sure that a small-scale invasion of Cuba would topple Fidel Castro. The force, such as it was, landed at the Bay of Pigs and was ignominiously defeated. Allen Dulles, the long-time CIA director Kennedy inherited, was consequently ousted.
[3] This is not to single out Agile. There is a long history of promising techniques that didn't deliver all they promised.
[4] In 1938, Britain, France, and Nazi Germany made an agreement in Munich to let Germany annex predominantly German-speaking parts of Czechoslovakia. Prime Minister Chamberlain said that the pact would assure "peace for our time."
[5] Geithner, Timothy F. Stress Test: Reflections on Financial Crises. Crown Publishers, 2014.
[6] Lohr, Steve. "For Big-Data Scientists, 'Janitor Work' Is Key Hurdle to Insights." The New York Times, 17 August 2014.
[7] McAfee, Andrew, Anders Sjoman, and Vincent Dessain. "Zara: IT for Fast Fashion." HBS Case Study No. 604081-PDF-ENG. Harvard Business School, 25 June 2004.
[8] Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
[9] Halberstam, David. The Best and the Brightest. Random House, 1972.
[10] Geithner, Timothy F. Stress Test: Reflections on Financial Crises. Crown Publishers, 2014.
[11] Tuchman, Barbara W. The March of Folly: From Troy to Vietnam. Knopf, 1984.