
We've been working on business intelligence (BI) for going on 20 years now. Yet many executives and managers still can't get what they need from their data analysis tools. They wish they had better prognostic capabilities. They wish they had predictive models that would notify them about potential problems or risks before they occur. They wish, in short, for intelligent business intelligence. In this issue, we'll debate how best to give it to them. You'll discover how you can stand out from the BI crowd by putting meaning and interest ahead of available data and consistency. You'll learn why BI technology will get you nowhere without a clear data strategy -- and how you can craft yours. Be sure to join us this month for advice on increasing your organization's business intelligence quotient.

"Despite all the data warehousing efforts, BI technology advancements, and the deepening expertise and experience of the BI community, the fact remains that many difficult challenges stand between the business objective and the actionable outcome."

-- Ken Collier, Guest Editor

Power to the Data Wonks!

Data quality, data stewardship, data understanding, and simpler analytics are the keys to effective business intelligence. Get a grip on your data and analysis methods, and you will gain the business insights you have been looking for.

Power to the (Creative, Connected) People!

Those data wonks have it all wrong. Communities, social networks, and tacit knowledge are the real keys to better business intelligence. Fancy-pants technologies and gee-whiz analytical methods are the wrong focus.

Opening Statement

As a longtime business intelligence (BI) and data warehousing practitioner, I've been curious about when we would begin to see big changes in the ways companies use their data assets to improve their business. Yes, I know, most large companies today have data warehouses or marts that provide users with some means of analyzing their data. They use business performance management and balanced scorecards to measure and improve. But considering the ability of today's companies to collect and analyze huge amounts of data about every aspect of their business, I expected corporate leaders to gain all sorts of wonderful new insights. Why hasn't this happened?

I will concede some areas of growth. BI tools have matured in capability and usability. Data mining is now embedded into domain-specific business solutions such as customer relationship management (CRM). BI dashboards and portals have widened the user base. And there has been a lot of positive development in the automation of corporate scorecards, business process management, business rules management, and the like.

However, almost all of the executives and managers in companies that I work with have the same complaints they had 10 years ago. They can't get what they need from their data analysis tools. They don't always trust their data. They wish they had better prognostic capabilities. They wish they had predictive models that would notify them about potential problems or risks before they occur. While data warehouses have boosted diagnostic capability, what happened to the promise of better prognostic capability through data mining, statistical analysis, and other advanced methods?

When Cutter Consortium invited me to be Guest Editor of this issue of Cutter IT Journal, I took the opportunity to find out what other experts think about the surprising lack of advanced BI as an integral part of the executive decision support toolkit. This issue contains some very insightful points of view from some really smart BI experts.

REMEMBER OUR ROOTS

What has been will be again, what has been done will be done again; there is nothing new under the sun.

-- Ecclesiastes 1:9

As in life, this maxim appears to hold true in business intelligence. Although the term "data mining" wasn't coined until the early 1990s, Thomas Redman, author of our first article, was doing the equivalent data analyses at Bell Labs well before that time. Maybe he wasn't using neural networks and Bayesian nets back then, but the issues and challenges were much the same as the ones we face today. Those early data miners were forced to use rudimentary tools, hand-coded models, and pencil-and-paper data visualizations. In doing so, they got much closer to the data than practitioners using today's more robust tools are required to do. This "intimacy" with the data has been lost, Redman argues, to the detriment of current data mining projects. Redman concludes with three prescriptions for what ails data mining today: manage data mining as an end-to-end process, ensure data quality at the sources of data, and get intimate with the data through hands-on work with a sample.

REMEMBER OCCAM'S RAZOR

It is vain to do with more what can be done with fewer.

-- William of Occam, 14th century

Bob Daugherty is an experienced statistician and business intelligence expert. As the director of the Center for Data Insight (CDI) at Northern Arizona University, Daugherty has worked with top-level decision makers from many different companies who are seeking to gain new business insights from their data. In spite of all of the advancements in machine learning and analytical modeling methods, Daugherty continues to discover that the most powerful revelations tend to come from the more basic analytical methods. In his article, Daugherty points out that the greatest "aha!" moments for many executives come not from clustering algorithms or neural network analyses but from simple sums, percentages, and averages. While these results are easier to understand and act upon, Daugherty cautions that they are not necessarily trivial to produce, since the data must be assembled and prepared for analysis. One positive side effect is that if and when a company determines that it would benefit from using more sophisticated analytical methods, these data preparations will provide the necessary inputs to more advanced data mining models.
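To make this concrete, here is a minimal sketch of the kind of "simple analytics" Daugherty has in mind. The example is mine, not his; the file name, column names, and figures are hypothetical, and the only assumption is that the data has already been assembled into a clean extract:

    import csv
    from collections import defaultdict

    # Totals and counts per region -- plain sums and averages, nothing fancier.
    totals = defaultdict(float)
    counts = defaultdict(int)

    # Assumed pre-cleaned extract; assembling it is where most of the work goes.
    with open("orders_prepared.csv", newline="") as f:
        for row in csv.DictReader(f):
            totals[row["region"]] += float(row["revenue"])
            counts[row["region"]] += 1

    grand_total = sum(totals.values())
    for region in sorted(totals):
        share = 100.0 * totals[region] / grand_total   # percentage of total revenue
        avg_order = totals[region] / counts[region]    # average order value
        print(f"{region}: total={totals[region]:,.0f}  "
              f"share={share:.1f}%  avg_order={avg_order:,.2f}")

Nothing here is beyond a first programming course, which is exactly Daugherty's point; the hard-won value lies in the preparation that makes such a simple query trustworthy.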

IT'S PEOPLE!

Victor Rosenberg and Cutter Senior Consultant Donna Fitzgerald fire up the debate by suggesting that real business intelligence is not produced by hotshot data geeks wielding esoteric technologies. Rather, truly meaningful business intelligence stems from social networks and communities of experts within the organization. True insights require substantial expertise: tacit knowledge about what the data means, a sense of which questions to ask, and interactions with other business experts across the enterprise. Rosenberg and Fitzgerald claim that BI techniques and technologies are simply "subactivities" in the organization, not at the core of how senior decision makers really operate. They support this point of view with several case examples and observations from years of experience working with and around executives in a variety of large companies. They contrast the "traditional intelligence community" with what they consider to be the wrong-headed "IT-BI" perspective.

I have to admit to being a little taken aback by the "them versus us" nature of this article. However, after giving their article a second read, I found that my observations are much the same as Rosenberg's and Fitzgerald's. Since I'm a data wonk to the core, I can't completely discount the IT-BI point of view. However, anyone who has worked with CEOs, CFOs, and CMOs to provide better BI knows that these leaders aren't interested in hearing about technologies. They have deep and penetrating business questions, hypotheses, and theories. They have extensive domain expertise, and they know who within the organization can help them get the answers they need from the organizational data assets.

The problem I have with the authors' model is that it doesn't scale across the organization particularly well, and it is difficult to effectively deploy this sort of intelligence to mid-level management. Nevertheless, if you are an IT executive or manager or consider yourself a data geek, you should definitely read this article. It will give you a healthy alternative point of view to consider for your own organization.

GARBAGE IN, GARBAGE OUT

My good friend and colleague Luke Hohmann likes to point out that business intelligence is like shining a very bright light into the dark attic of your data assets. It highlights the flaws and quality problems with the data that nobody has discovered before. He's right. I've been involved in many BI projects in which users are surprised by what they see. Their reactions are interesting. First there is denial -- users prefer to believe that there are flaws in the BI system rather than in the data itself. Next there is dismay -- why does the data have these problems, and how long have they existed without anyone knowing? Then comes distrust -- they cannot base any decisions on data that is flawed; after all, they don't know what other problems exist.

It's no wonder that business intelligence doesn't often drive powerful new insights with false starts like these. We in the BI community have long known that data cleansing and data preparation are the bulk of the effort required in an end-to-end data analysis project. In our next article, Cutter Senior Consultant Larissa Moss pays careful attention to all of the essential elements of effective data hygiene and stewardship. In spite of all those corporate dollars being spent on data warehousing and BI solutions, I rarely find organizations willing to fund sound data management strategies. However, one does not have to look far to find IT personnel who can outline the ever-increasing cost of allowing data quality problems to persist. Moss outlines and summarizes a cohesive and comprehensive data management strategy, then describes the roles and responsibilities needed to support such a strategy.

LEARNING FROM OTHERS

We can often learn a lot about our own issues and challenges by examining the methods of unrelated domains. Certainly, there is much to be learned from studying the BI methods of companies in different industries, but let's take this even a step further. Can we learn something valuable by examining the BI methods and solutions of organizations that are entirely outside of the typical vertical industry structure? Can they learn something from us?

James Slebodnick is an experienced BI practitioner and data warehouse developer. While he has worked with companies in many different industries, he has also had the experience of developing a BI solution for environmental monitoring agencies whose end users are a mix of scientists and researchers involved in studying Grand Canyon (USA) geology, topography, flora, fauna, and water quality. Many companies I work with face challenges in providing BI that supports disparate source systems, legacy databases, and users with widely varying business requirements. Well, these challenges have nothing on the degree of disparity covered in Slebodnick's article, "Bridging the Canyon: Introducing Business-Oriented Practices to an Environmental Data Project." Did I mention that the monitoring data had to be made selectively available (according to three "release status" designations) to environmental policy makers, local Native American tribes, and the general public?

While users and source systems in a company may seem plenty varied, enterprises benefit from the fact that users and systems are still focused on the core mission and objectives of the organization. Slebodnick's case study involves entirely independent organizations, each with a different mission, and many with ad hoc source databases. Any business organization can learn much from this environmental case study about difficult data merges and integration. The clever use of metadata repositories, a reference database, and complex merge logic provides the data geeks among us (myself included!) with some interesting perspectives on merging difficult and highly disparate source system data. This article further highlights the challenges of fundamental data management that most organizations continue to wrestle with as a precursor to powerful data analysis.
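To give a flavor of what that looks like in practice, here is a small sketch of my own -- not Slebodnick's actual design -- in which a shared reference table reconciles site codes from independent sources and a release-status check governs what each audience may see. The statuses, audiences, and field names are illustrative only:

    # A shared reference table maps each source's local site code to a canonical id.
    REFERENCE = {
        ("agency_a", "GC-01"): "SITE_0001",
        ("agency_b", "grand_canyon_n_rim"): "SITE_0001",
    }

    # Which release statuses each audience may see (an assumed policy, for illustration).
    VISIBILITY = {
        "policy_makers": {"public", "restricted", "internal"},
        "tribal_partners": {"public", "restricted"},
        "general_public": {"public"},
    }

    def merge_and_filter(records, audience):
        """Attach canonical site ids and drop records the audience may not see."""
        allowed = VISIBILITY[audience]
        merged = []
        for rec in records:
            key = (rec["source"], rec["local_site_code"])
            if key not in REFERENCE:
                continue  # unmatched keys would go to a review queue in practice
            if rec["release_status"] not in allowed:
                continue
            merged.append({**rec, "site_id": REFERENCE[key]})
        return merged

    sample = [{"source": "agency_a", "local_site_code": "GC-01",
               "release_status": "restricted", "turbidity_ntu": 4.2}]
    print(merge_and_filter(sample, "general_public"))  # [] -- the restricted record is hidden
    print(merge_and_filter(sample, "policy_makers"))   # visible, carrying canonical site_id

The real project is, of course, far more involved, but the pattern is the same one most enterprises face: reconcile keys across systems you don't control, then decide who gets to see what.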

IF BI WERE EASY, EVERYBODY WOULD DO IT

It is evident that business intelligence is not for the faint of heart. In spite of all the advances in technology, BI still requires a unique blend of technical skills, including data management, data integration, data quality and cleansing, database development, statistical analysis, data mining, visualization, and so on. And that's just the technical skills.

A virtuous BI cycle is one that is driven by a clearly defined business objective and results in actionable and deployable outcomes. Despite all the data warehousing efforts, BI technology advancements, and the deepening expertise and experience of the BI community, the fact remains that many difficult challenges stand between the business objective and the actionable outcome. I hope this issue of Cutter IT Journal will help your organization identify the big hurdles and provide you with guidelines for:

  • Managing data mining as an end-to-end process

  • Simplifying your analytical modeling

  • Recognizing the value of social networks and communities as intelligence providers and consumers

  • Implementing a sound data strategy to improve and manage your data assets

  • Learning valuable lessons from outside your domain and industry

ABOUT THE AUTHOR