Better BI

Chris Gerrard – Exploring the world of Better Business Intelligence

Archive for the ‘Big BI’ Category

Note on the IBM Cognos – Solution Implementation Method


A recent discussion in the LinkedIn Business Intelligence group (here) prompted this note.

The original post is favorable towards the IBM Cognos – Solution Implementation Method (C-SIM). This note disagrees and argues that C-SIM is in fact detrimental to the delivery of effective, high quality, valuable BI that genuinely helps the business.

The overarching problem with C-SIM is its assumption that a BI “solution” is a discrete, finite thing amenable to the traditional Analyze–Design–Configure&Build–Deploy–Operate approach to building fixed-function software systems whose functionality is determinable prior to its construction.

C-SIM is a very poor model for delivering BI solutions. Its track record is littered with projects that deliver embarrassingly little business value in the form of meaningful, timely, data-based business information and insights.

You may ask yourself: “That’s a pretty bold claim. How can I evaluate it?”
I’m glad you asked.

Here’s a threshold question to put to any potential BI vendor partner: “How quickly will I, or one of the other business stakeholders, get valuable information from my business data?”
In the case of C-SIM, a reasonable follow-on is: “Show me, in your method and project plans, the earliest point at which this happens.”

If your C-SIM vendor partner cannot or will not answer, stumbles, fumbles, hems, or haws, there’s a very large problem on your near horizon should you choose to go down that path.
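For calibration, here’s a minimal sketch of what “quickly” can look like when the data is simply put in front of an analyst with a lightweight tool. The file name, column names, and the question being asked are all hypothetical; the point is only that a first, rough answer from real business data can be minutes away rather than months:

```python
# A hypothetical first pass at a business question: which products drove
# revenue in the most recent quarter? Assumes a simple CSV export
# (sales.csv) with columns: order_date, product, quantity, unit_price.
import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Keep only the latest quarter present in the data.
quarters = sales["order_date"].dt.to_period("Q")
recent = sales[quarters == quarters.max()]

# Revenue by product, highest first.
revenue = (
    recent.assign(revenue=recent["quantity"] * recent["unit_price"])
    .groupby("product")["revenue"]
    .sum()
    .sort_values(ascending=False)
)

print(revenue.head(10))
```

No vendor should be graded against a toy script, of course, but it sets a useful baseline for the threshold question above.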

There’s a bright spot: C-SIM does have real value.

It’s attractive for large BI resource vendors because it’s road-mapped well into the future and provides a firm framework for dedicating resources—people and technology—to an elaborate project task structure, guaranteeing predictable revenue streams. It’s hard to argue with the Big BI resource vendors that their model isn’t working when it’s generating billions of dollars in revenue; their position boils down to: “Of course it’s working. Look at all the money our clients are paying us.” (A cynic might say that the method’s real value is the revenue stream it provides to the Big BI consulting principals, revenue they keep whether or not the business clients ever get the information and insights they’ve paid for.)

With increasing media awareness that Big BI initiatives are largely failing to deliver on their promises, the emergence of Agility in the practice of BI, and five years of experience with tools like Tableau, Spotfire, QlikView, and their cousins, the tide may be changing.

Written by Chris Gerrard

February 10, 2012 at 8:09 am

Posted in Bad BI, Big BI, Enterprise BI


How Big BI Fails To Deliver Business Value


Business Intelligence, the noun, is the information a business person examines in order to understand their business and to use as the basis for their business decisions.

Business Intelligence, the verb, is the practice of providing the business person with the Business Intelligence they require.

Business Intelligence is conceptually based on a valuable proposition: delivering actionable, timely, high quality information to business decision makers gives them data-based evidence they can use in making business decisions.

Timely, high quality information is extremely valuable. It’s also somewhat perishable. Deriving the maximum business value from Business Intelligence is the expressed value proposition of all BI projects. Or it should be.

Enterprise Business Intelligence is the entire complex of tools, technologies, infrastructure, data sources and sinks, designs, implementations and personnel involved in collecting business data, combining it into collective stores, and creating analyses for access by business people.

Enterprise BI projects are very strongly biased towards complexity. They’re based on the paradigm of using large complex products, platforms, and technologies to design and build out data management infrastructures that underpin the development of analytics—reports, dashboards, charts, graphs, etc., that are made available for consumption.

Big BI is what happens when Enterprise Business Intelligence grows bigger than is absolutely necessary to deliver the BI value proposition. Unfortunately, Enterprise BI tends to mutate into Big BI, and the nature of Big BI almost invariably impedes the realization of the BI business value proposition.

Big BI is by its very nature overly large, complex and complicated. There are many moving parts, complicated and expensive products, tools and technologies that need to be installed, configured, fed, and cared for. All before any information actually gets delivered to the business decision makers.

Big BI is today’s Data Processing. In the old days of mainframes, COBOL, batch jobs, terminals, and line printers, business people who wanted reports had to make the supplicant journey to their Data Processing department and ask (or beg) for a report to get created, and scheduled, and delivered. Data Processing became synonymous with “we can do that (maybe) but it’ll take a really long time, if it happens at all.” Today’s Big BI is similarly slow-moving.

There are multiple reasons for this sad state of affairs. In the prevalent paradigm, new BI programs need to acquire the requisite personnel, infrastructure, tools, and technologies, all of which need to be installed and operational, which can take a very long time. Data needs to be analyzed, information models need to be created, reporting databases designed, ETL transformations designed and implemented, reporting tool semantic layers (e.g. Business Objects Universes, Cognos Frameworks) constructed, reports created, and, finally, the reports made available.
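To make the length of that chain concrete, here it is in deliberately tiny, hypothetical miniature: stage the raw data, transform it, load it into a reporting table, and only then run the query the business person actually cares about. Every name below (the table, the columns, the rows) is invented for illustration; in a Big BI program each of these steps is a separate product, team, and hand-off.

```python
# Hypothetical miniature of the Big BI chain: extract -> transform -> load -> report.
# In a real program each step below is a separate tool, team, and timeline.
import sqlite3

# "Extract": raw operational records standing in for a source-system feed
# (order_id, region, amount-as-text).
raw_orders = [
    (1, "east", "1200.50"),
    (2, "west", "980.00"),
    (3, "east", "310.25"),
]

# "Transform": apply the typing and cleansing rules the ETL layer would own.
clean_orders = [(oid, region.upper(), float(amount)) for oid, region, amount in raw_orders]

# "Load": land the rows in a reporting database (an in-memory SQLite stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_orders (order_id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean_orders)

# "Report": the question the business person wanted answered all along.
query = "SELECT region, SUM(amount) AS total FROM fact_orders GROUP BY region ORDER BY total DESC"
for region, total in conn.execute(query):
    print(f"{region}: {total:,.2f}")

conn.close()
```

The miniature takes seconds to run; the full-scale version of the same chain is where the months go.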

Only then do the business decision makers (remember them?) benefit.

This entire process can take a very long time. All too frequently it takes months. Not just because of the complexity of the tools and process, but also because of the friction inherent in coordinating the many moving parts and involved parties, each with their own bailiwick and gateways to protect.

The tragic part of many Big BI projects is that the reports delivered to the business people usually fall far short of delivering the business value they should provide.

The reports are out of date, or incomplete, or no longer relevant, or poorly designed and executed, or simply wrong because the report production team “did something” without having a clear and unambiguous understanding of the information needs of the business decision maker.

In far too many instances there’s a vast gulf between the business information needs and the reports that get developed and delivered. This situation occurs because there’s too much distance, in too many different dimensions, between the business and the report creators.

Here is a map of the structure—the processes and artifacts, and the linkages between them—of a typical Big BI project. It’s worth studying to see how far apart the two ends of the spectrum are. At one end is the business person, with their business data that they need to analyze. At the far end, at the end of all of the technology, separated by multiple barriers from the business, are the report generators.

A typical scenario in this environment is that somebody, with any luck a competent business analyst, sometimes a project manager, all too frequently a technical resource, is tasked with interviewing the business and writing up some report specs—perhaps some wireframes and/or Word documents, maybe Use Cases. The multiple problems with these means of capturing analytics requirements are too extensive to discuss here. One fatal flaw that occurs all too frequently must be mentioned, however: the creator of the report specs lacks the professional skills required to elicit, refine, and communicate appropriate, high quality, effective business analytics requirements. As a result, the specifications are inadequate for their real purpose, and their flaws flow downstream.

These preliminary specs are then used as the inputs to the entire BI implementation project, which then goes about its business of creating and implementing all of the technological infrastructure necessary to crank out some analyses of the data. At the end of the “real work,” a tool specialist renders the analytics, which then get passed back to the business.

The likelihood of this approach achieving anything near the real business value obtainable from the data is extremely low. There’s simply too much distance, and too many barriers, between the business decision makers and the analyses of their data. Big BI has become too big, too complex, with too much mass and inertia, all of which get between the business and the insights into their data that are essential to making high quality business decisions.

There is, however, a bright horizon.

Business Intelligence need not be Big BI. Even in those circumstances where Enterprise BI installations are required, and there are good reasons for them, they need not be the monolithic voracious all-consuming resource gobblers they’ve become. Done properly, Enterprise BI can be agile, nimble, and highly responsive to the ever-evolving business needs for information.

Future postings will explain how this can be your Enterprise BI reality.

Written by Chris Gerrard

March 30, 2011 at 9:00 am

Posted in Big BI, Enterprise BI


Beware BDWUF – Big Data Warehouse Up Front


Data warehouses are good, useful, and tremendously valuable in their contribution to providing the data-based information essential to making timely, high quality business decisions. Properly designed and implemented, fronted with useful and meaningful reports, user-operated analytical frameworks, dashboards, scorecards, and the like, data warehouses effectively fulfill their fundamental purpose.

There are, however, far too many examples of data warehouses that have consumed large, even enormous, amounts of time, energy, effort, money, executive attention, and other resources, without delivering a reasonable return on these investments. Why has this come to be?

One recurring theme is the phenomenon of BDWUF – the Big Data Warehouse Up Front.

BDWUF (pronounced bee-dee-woof) refers to the practice of investing tremendous amounts of effort into deep analysis of the business along with designing and implementing a comprehensive enterprise-wide data warehouse that completely addresses the wide variety of considerations and constraints involved in delivering the “one version of the truth” to business decision makers.

The emphasis on creating the whole enterprise-wide data warehouse-based environment and systems has the overall effect of shifting the focus away from delivering high quality, timely, effective information to the activities required to design and construct the technological artifacts. There is a very real danger, and a high likelihood, that building the technology, and maintaining the technology-constructing bureaucracy, becomes the whole point, and that the outputs finally delivered to the business decision makers will be of relatively little real value.

If not BDWUF, then what?

Assuming that BDWUF is a less-than-optimal approach to delivering business intelligence to business decision makers, is there an effective alternative?

The answer is yes. Better BI is an approach to business intelligence that stresses the importance of delivering timely, high quality business information to business decision makers. Better BI is a superset of data warehouse-based business intelligence that recognizes the importance of working intimately with business stakeholders throughout the entire process of developing and delivering the information they require.

Learn more about Better BI here. The Better BI manifesto is here. Better BI’s home is here.

Written by Chris Gerrard

May 22, 2008 at 2:47 pm

Posted in Big BI, Uncategorized