Better BI

Chris Gerrard – Exploring the world of Better Business Intelligence

Archive for the ‘Enterprise BI’ Category

Note on the IBM Cognos – Solution Implementation Method


A recent discussion in the LinkedIn Business Intelligence group (here) prompted this note.

The post that prompted the discussion is favorable toward the IBM Cognos Solution Implementation Method (C-SIM). This note disagrees, arguing that C-SIM is in fact detrimental to the delivery of effective, high quality, valuable BI that genuinely helps the business.

The overarching problem with C-SIM is its assumption that a BI “solution” is a discrete, finite thing amenable to the traditional Analyze–Design–Configure&Build–Deploy–Operate approach to building fixed-function software systems whose functionality can be fully determined before construction begins.

C-SIM is a very poor model for delivering BI solutions. Its history is littered with embarrassingly low rates of delivering even reasonable levels of business value in the form of meaningful, timely, valuable data-based business information and insights.

You may ask yourself: “That’s a pretty bold claim. How can I evaluate it?”
I’m glad you asked.

Here’s a threshold test for any potential BI vendor partner: “How quickly will I or one of the other business stakeholders get valuable information from my business data?”
In the case of C-SIM, a reasonable follow-on is: “Show me in your method and project plans the earliest point at which this happens.”

If your C-SIM vendor partner cannot or will not answer, or stumbles, fumbles, hems, or haws, there’s a very large problem on your near horizon should you choose to go down that path.

There’s a bright spot: C-SIM does have real value.

It’s attractive for large BI resource vendors because it is road-mapped well into the future and provides a firm framework for dedicating resources, both people and technology, to the elaborate project task structure, guaranteeing predictable revenue streams. It’s hard to argue with the Big BI resource vendors that their model isn’t working when it generates billions of dollars in revenue; their position boils down to: “Of course it’s working. Look at all the money our clients are paying us.” (A cynic might say that the method’s real value lies in providing the revenues on which the Big BI consulting principals’ fortunes are built, revenues they keep whether or not the business clients get the information and insights they’ve paid for.)

Still, with increasing media awareness that Big BI initiatives are largely not delivering on their promises, the emergence of agility in the practice of BI, and five years of experience with tools like Tableau, Spotfire, QlikView, and their cousins, the tide may be changing.

Written by Chris Gerrard

February 10, 2012 at 8:09 am

Posted in Bad BI, Big BI, Enterprise BI


BI RAD – Business Intelligence Rapid Analytics Delivery


A fundamental virtue in BI is the delivery of value early, often and constantly.

Business Intelligence Rapid Analytics Delivery (BI RAD) is the practice of creating high quality analytics as quickly, efficiently, and effectively as possible. Done properly, it allows analytics to be developed hand-in-hand with the business decision makers in a matter of hours.

BI RAD requires tools that permit easy access to the business data, that are highly effective and usable in the exploratory model of analysis (browsing through the data looking for interesting patterns and anomalies), and that can persist meaningful analyses as reusable analytics.
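To make the exploratory model concrete, here is a minimal sketch of such a browsing pass in Python with pandas, standing in for the kind of direct data access these tools provide; the orders.csv file and its columns are hypothetical.

```python
# A sketch of the exploratory pass: load the data, profile it quickly,
# then browse for patterns and anomalies worth discussing with the
# business stakeholder. The file name and columns are hypothetical.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Quick profile: summary statistics and missing values.
print(orders.describe(include="all"))
print(orders.isna().sum())

# Browse for patterns: weekly revenue often surfaces dips and spikes.
weekly = orders.set_index("order_date").resample("W")["amount"].sum()
print(weekly.tail(12))

# Flag anomalies: weeks more than two standard deviations from the mean.
outliers = weekly[(weekly - weekly.mean()).abs() > 2 * weekly.std()]
print(outliers)
```

The point is not the particular statistics but the cycle time: each question-and-answer round takes seconds, so the conversation with the stakeholder never stalls.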

BI RAD emerged as a viable practice over twenty years ago with the first generations of specialized reporting tools, which were much more nimble and effective than the COBOL report generation coming from data processing (DP) shops. I worked extensively with FOCUS, the original 4GL, first at JCPenney, then as a consultant and product manager with Information Builders, Inc., the vendor of FOCUS. People were surprised, sometimes shocked, at how rapidly useful reports could be generated for them; once up to speed, new reports could generally be available within a day, sometimes within a couple of hours.

In today’s world of Big BI, where enterprise BI tools and technologies are the norm, business decision makers have become accustomed to long waits for their analytics. In a sense, Big BI has become the modern DP: the technology has grown so large and complex that getting it installed and operational, with everything designed, implemented, cleansed, ETL’d, and so on before any analytics get created, consumes all the time, energy, effort, and resources available. The result is failure and disappointment, with very little value delivered in the form of information reaching the minds of the real human beings who need it.

Yet BI RAD lives. An entire modern generation of direct-access data discovery tools has emerged, providing the ability to establish an intimate connection between business data and the people who need insights into the data in order to make informed business decisions.

Tableau, from Tableau Software, is a near-perfect tool for BI RAD. It connects to the great majority of business data sources almost trivially, and it is unsurpassed in usability and quality of visualization for most business analytics needs.

Using Tableau, it’s possible to sit down with a business stakeholder, connect to their data, and jump right into exploring it. Any interesting and valuable analyses can be saved and kept at hand for immediate use.
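In the same spirit, here is a sketch of what persisting a meaningful analysis amounts to: capturing the finding from the exploratory session as a small, rerunnable artifact. This is an illustration of the idea in Python, not Tableau’s own mechanism (Tableau saves such analyses as workbooks); the file and column names are hypothetical.

```python
# A sketch of persisting a meaningful analysis: the finding from the
# exploratory session becomes a small, rerunnable function whose output
# is kept at hand. File and column names are hypothetical.
import pandas as pd

def revenue_by_region_by_month(path: str = "orders.csv") -> pd.DataFrame:
    orders = pd.read_csv(path, parse_dates=["order_date"])
    return pd.pivot_table(
        orders,
        values="amount",
        index=orders["order_date"].dt.to_period("M"),
        columns="region",
        aggfunc="sum",
    )

# Rerun on demand; keep the result where the stakeholder can reach it.
revenue_by_region_by_month().to_csv("revenue_by_region_by_month.csv")
```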

Experience in multiple client environments has shown that these analyses deliver multiple overlapping and reinforcing benefits to the organization. The immediate and obvious benefit is that business decision makers get the data-based insights they require to make high quality business decisions. Beyond that, the analyses created are the best possible requirements for Enterprise analytics.

Who could ask for anything more?

Written by Chris Gerrard

April 5, 2011 at 11:32 pm

Posted in BI RAD, Enterprise BI


How Big BI Fails To Deliver Business Value


Business Intelligence, the noun, is the information examined by a business person for the purpose of understanding their business, used as the basis of their business decisions.

Business Intelligence, the verb, is the practice of providing the business person with the Business Intelligence they require.

Business Intelligence is conceptually based on a valuable proposition: the delivery of actionable, timely, high quality information gives business decision makers data-based evidence they can use in making business decisions.

Timely, high quality information is extremely valuable. It’s also somewhat perishable. Deriving the maximum business value from Business Intelligence is the expressed value proposition of all BI projects. Or it should be.

Enterprise Business Intelligence is the entire complex of tools, technologies, infrastructure, data sources and sinks, designs, implementations and personnel involved in collecting business data, combining it into collective stores, and creating analyses for access by business people.

Enterprise BI projects are very strongly biased towards complexity. They’re based on the paradigm of using large complex products, platforms, and technologies to design and build out data management infrastructures that underpin the development of analytics—reports, dashboards, charts, graphs, etc., that are made available for consumption.

Big BI is what happens when Enterprise Business Intelligence grows bigger than is absolutely necessary to deliver the BI value proposition. Unfortunately, Enterprise BI tends to mutate into Big BI, and the nature of Big BI almost invariably impedes the realization of the BI business value proposition.

Big BI is by its very nature overly large, complex and complicated. There are many moving parts, complicated and expensive products, tools and technologies that need to be installed, configured, fed, and cared for. All before any information actually gets delivered to the business decision makers.

Big BI is today’s Data Processing. In the old days of mainframes, COBOL, batch jobs, terminals, and line printers business people who wanted reports had to make the supplicant journey to their Data Processing department and ask (or beg) for a report to get created, and scheduled, and delivered. Data Processing became synonymous with “we can do that (maybe) but it’ll take a really long time, if it happens at all.” Today’s Big BI is similarly slow-moving.

There are multiple reasons for this sad state of affairs. In the prevalent paradigm, new BI programs need to acquire the requisite personnel, infrastructure, tools, and technologies, all of which need to be installed and operational, which can take a very long time. Data needs to be analyzed, information models need to be created, reporting databases designed, ETL transformations designed and implemented, reporting tool semantic layers (e.g., Business Objects Universes or Cognos Frameworks) constructed, reports created, and, finally, the reports made available.
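To give a feel for how much design and build even one small link in this chain entails, here is a deliberately tiny sketch of a single extract-transform-load step in Python against an in-memory SQLite database; the table names, column names, and values are hypothetical, and a real pipeline multiplies this pattern across dozens of sources and hundreds of transformations.

```python
# A toy sketch of a single ETL step: extract raw rows from a staging
# table, transform them into a clean, conformed shape, and load them
# into the reporting table the semantic layer sits on top of.
# Tables, columns, and values are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_sales (region TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO staging_sales VALUES (?, ?)",
    [("NorthEast ", "1200.50"), ("north east", "980"), ("West", "2100.75")],
)

# Extract.
raw = conn.execute("SELECT region, amount FROM staging_sales").fetchall()

# Transform: conform region name variants and coerce amounts to numbers.
def clean_region(name: str) -> str:
    return name.strip().lower().replace(" ", "")

rows = [(clean_region(region), float(amount)) for region, amount in raw]

# Load.
conn.execute("CREATE TABLE fact_sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
conn.commit()

print(conn.execute(
    "SELECT region, SUM(amount) FROM fact_sales GROUP BY region"
).fetchall())
```

Every one of these steps, trivial here, must be specified, reviewed, built, and maintained in a Big BI program before a single report reaches a decision maker.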

Only then do the business decision makers (remember them?) benefit.

This entire process can take a very long time. All too frequently it takes months. Not just because of the complexity of the tools and process, but also because of the friction inherent in coordinating the many moving parts and involved parties, each with their own bailiwick and gateways to protect.

The tragic part in many Big BI projects is that the reports delivered to the business people usually fall far short of delivering the business value they should provide.

The reports are out of date, or incomplete, or no longer relevant, or poorly designed and executed, or simply wrong because the report production team “did something” without having a clear and unambiguous understanding of the information needs of the business decision maker.

In far too many instances there’s a vast gulf between the business information needs and the reports that get developed and delivered. This situation occurs because there’s too much distance, in too many different dimensions, between the business and the report creators.

Here is a map of the structure of a typical Big BI project: the processes and artifacts, and the linkages between them. It’s worth studying to see how far apart the two ends of the spectrum are. At one end is the business person, with the business data they need to analyze. At the other end, behind all of the technology and separated from the business by multiple barriers, are the report generators.

A typical scenario in this environment is that somebody, with any luck a competent business analyst, sometimes a project manager, all too frequently a technical resource, is tasked with interviewing the business and writing up some report specs—perhaps some wireframes and/or Word documents, maybe Use Cases. The multiple problems with these means of capturing analytics requirements are too extensive to discuss here. One fatal flaw that occurs all too frequently must be mentioned, however: the creator of the report specs lacks the professional skills required to elicit, refine, and communicate appropriate, high quality, effective business analytics requirements. As a result the specifications are inadequate for their real purpose, and their flaws flow downstream.

These preliminary specs are then used as the inputs into the entire BI implementation project, which then goes about its business of creating and implementing all of the technological infrastructure necessary to crank out some analyses of the data. At the end of the “real work” a tool specialist renders the analytics, which then get passed back to the business.

The likelihood of this approach achieving anything near the real business value obtainable from the data is extremely low. There’s simply too much distance, and too many barriers, between the business decision makers and the analyses of their data. Big BI has become too big, too complex, with too much mass and inertia, all of which get between the business and the insights into their data that are essential to making high quality business decisions.

There is, however, a bright horizon.

Business Intelligence need not be Big BI. Even in those circumstances where Enterprise BI installations are required, and there are good reasons for them, they need not be the monolithic, voracious, all-consuming resource gobblers they’ve become. Done properly, Enterprise BI can be agile, nimble, and highly responsive to the ever-evolving business needs for information.

Future postings will explain how this can be your Enterprise BI reality.


Written by Chris Gerrard

March 30, 2011 at 9:00 am

Posted in Big BI, Enterprise BI
