Better BI

Chris Gerrard – Exploring the world of Better Business Intelligence

Posts Tagged ‘Big BI’

Note on the IBM Cognos – Solution Implementation Method


A recent discussion in the LinkedIn Business Intelligence group (here) prompted this note.

The original blog post is favorable towards the IBM Cognos – Solution Implementation Method (C-SIM). This note disagrees and argues that C-SIM is in fact detrimental to the delivery of effective, high-quality, valuable BI that really does help the business.

The overarching problem with C-SIM is its assumption that a BI “solution” is a discrete, finite thing amenable to the traditional Analyze–Design–Configure&Build–Deploy–Operate approach to building fixed-function software systems whose functionality is determinable prior to their construction.

C-SIM is a very poor model for delivering BI solutions. Its history is littered with embarrassingly low rates of delivering even reasonable levels of business value in the form of meaningful, timely, valuable data-based business information and insights.

You may ask yourself: “That’s a pretty bold claim. How can I evaluate it?”
I’m glad you asked.

Here’s a threshold test one should ask of any potential BI vendor partner: “How quickly will I or one of the other business stakeholders get valuable information from my business data?”
In the case of C-SIM a reasonable follow-on is: “Show me in your method/project plans the earliest point at which this happens.”

If your C-SIM vendor partner cannot or will not answer, stumbles, fumbles, hems, or haws, there’s a very large problem on your near horizon, should you choose to go down that path.

There’s a bright spot: C-SIM does have real value.

It’s attractive for large BI resource vendors because it’s road-mapped well into the future and provides a firm framework for dedicating resources—people and technology—to an elaborate project task structure, guaranteeing predictable revenue streams. It’s hard to argue with the Big BI resource vendors that their model isn’t working when it’s generating $billions in revenue; their position boils down to: “Of course it’s working. Look at all the money our clients are paying us.” (A cynic might say that the method is only valuable for providing the revenues that the Big BI consulting company principals’ fortunes are built upon, revenues they get to keep whether or not the business clients get the information and insights they’ve paid for.)

Still, with increasing media awareness that Big BI initiatives are largely not delivering on their promises, the emergence of agility in the practice of BI, and five years of experience with tools like Tableau, Spotfire, QlikView, and their cousins, the tide may be changing.


Written by Chris Gerrard

February 10, 2012 at 8:09 am

Posted in Bad BI, Big BI, Enterprise BI


Big BI is dead. But it’ll twitch for a while.


The end came last week in Las Vegas at the annual Tableau Customer Conference.

Big BI put up a valiant struggle, but it’s been unwell for some time, sputtering along, living on past glories real and imagined.

Its passing was inevitable. Big, bloated, and overripe, it never lived up to its promise of being the path to data enlightenment. Although its nominal goals were good and proper, in practice it failed to deliver. Much has been written on Big BI’s problems, including here, here, and here.

Big BI flourished, and then held on in spite of its flaws. Partly through inertia—a lot of money and time had been spent over its two decades of dominance. Partly through pervasiveness—Big BI’s proponents have been hugely successful at promoting it as the one true way. Partly through the absence of the disruptive technology that would upend the BI universe.

Big BI is brain-dead, but support systems are keeping the corpse warm.

As with all empires on the wane, its inhabitants and sponsors haven’t realized it yet. Or, cynically, those who have are milking it for what they can get while the getting is still good. Like many empires, its demise comes not with a big bang, but with a nearly silent revolution that upends the established order: even as the Big BI promoters and beneficiaries remain flush, fat, and happy, their base of influence and position, wealth and power has eroded away, leaving them dining on memories.

Big BI’s fundamental premise was always deeply flawed, erroneous when it wasn’t disingenuous or worse. The paradigm held that the only approach to achieving Business Intelligence within an organization was through the consolidation of the enterprise’s business data into data warehouses from which a comprehensive, all-encompassing single version of the truth could be achieved.

The only points of differentiation and discussion in the Big BI universe were squabbles about ultimately minor aspects of the core reality. Data Warehouses vs Data Marts. Inmon vs Kimball (Google “Inmon Kimball”). Dimensionally modeled analytical databases are relational vs no they’re not. And so on and so forth and such like.

The baseline concept remained the same: Business Intelligence is achieved by collecting, cleansing, transforming, and loading business information into the integrated, homogenized, consolidated data store (Mart or Warehouse). Only then can it be fronted by a large, complex, complicated “Enterprise BI Platform” that provides a business semantic facade for the dimensional data and serves as the only channel for coding up, creating, and delivering reports, graphs, charts, dashboards, strategy maps, and the like to the business people who need to understand the state of their area of responsibility and make data-based decisions.
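
To see why this pipeline is so heavy, it helps to look at even a toy version of the consolidate-then-report pattern. The sketch below is purely illustrative: the source file, table names, and columns are hypothetical assumptions, and a real Big BI implementation would multiply each step across dozens of sources, formal data models, and a semantic layer before any report reached a decision maker.

    # A toy, hypothetical sketch of the consolidate-first, report-later pattern.
    # Source file, table names, and columns are illustrative assumptions.
    import sqlite3
    import pandas as pd

    # Extract: pull raw transactions from a line-of-business export.
    orders = pd.read_csv("orders_export.csv", parse_dates=["order_date"])

    # Transform: cleanse and conform into a simple star schema.
    orders["region"] = orders["region"].str.strip().str.title()
    dim_customer = (orders[["customer_id", "customer_name", "region"]]
                    .drop_duplicates(subset="customer_id"))
    fact_sales = orders[["order_id", "customer_id", "order_date", "amount"]]

    # Load: land the dimensional tables in the consolidated store.
    warehouse = sqlite3.connect("warehouse.db")
    dim_customer.to_sql("dim_customer", warehouse, if_exists="replace", index=False)
    fact_sales.to_sql("fact_sales", warehouse, if_exists="replace", index=False)

    # Only now, at the very end of the pipeline, can a report be produced.
    report = pd.read_sql(
        """SELECT c.region, SUM(f.amount) AS revenue
           FROM fact_sales f JOIN dim_customer c USING (customer_id)
           GROUP BY c.region""",
        warehouse,
    )
    print(report)

Every one of those steps exists, at enterprise scale and with far more ceremony, before the first chart appears in front of a business person.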

The overarching goal is completely correct: delivering information (intelligence) to the real human people who need it. But the reality is that Big BI has abjectly failed to deliver. With an eye to history, and another to the evolution of technology, the possible end of Big BI has been in sight for some time. The history of BI is deep and rich, encompassing much more than Big BI. A brief history (mine) is here.

What happened? Why now? Why Tableau?

A number of years ago novel products appeared, sharing the concept that data access and analysis should be easy and straightforward, that people should be able to conduct meaningful, highly valuable investigations into their data with a minimum of fuss and bother.

Tableau was the best of these products, squarely aimed at making it as simple and straightforward as possible to visualize data. This simple principle is the lever that has ultimately toppled Big BI: it removes the barriers and friction that other technologies impose between real human people and their data, barriers that make data access and analysis a chore instead of an invigorating and rewarding experience.

But Big BI was well established. There were institutes, academics, huge vendors to sell you their databases and Enterprise BI platforms, and huge consultancies to help you wrangle the technology.

And there was a whole generation indoctrinated in the One True Way that held as articles of faith that there is only One Version Of The Truth, that only Enterprise-consolidated data carries real business value, that business data is too important to be left to its owners: like Dickensian orphans it needs to be institutionalized, homogenized, and cleaned up before it can be seen in public.

Tableau has a different philosophy. Data is in and of itself valuable. Data analysis is the right and privilege of its owners. Data analysis should be fast, easy, straightforward, and rewarding. There is truth in all data, and all those truths are valuable.

Still, the Big BI advocates found ways to block the radical upstarts, the data democratizers. “But what about consistency across units?” “How can we trust that two (or more) people’s analyses are equivalent?”

And the most damning of all: “Pretty pictures are nice toys, but we need the big, brawny, he-man industrial controls to ensure that we at the top know that we’re getting the straight poop.” There’s so much wrong with this last one that it will take several essays to unwind it. (to be continued)

Distilled to their essence, the objections of the Big BI proponents to using Tableau as a valuable, meaningful, essential tool for getting the essential information out of business data and into the minds of those who need it, as fast and as well as possible, are these:

Point: There must be single-point, trusted sources of the data that’s used to make critical business decisions.

Subtext: Local data analysis is all fine and good but until we can have point control over our data we’re not going to believe anything.

Subsubtext: This is an erroneous perspective, and ultimately harmful to an organization, but that’s another story. The reality is that, misguided as it is in the larger context, there is a need for single-source authoritative data.

Tableau’s response: The upcoming Tableau version 7 provides the ability to publish managed, authoritative data sources to Tableau Server, available for use by all Tableau products. This feature provides the single trusted data source capability required for enterprise data confidence.

Point: There must be absolute confidence that similarly named analyses, e.g. Profit, are in fact comparable.

Subtext: As long as people have the opportunity to conduct their own data analysis we suspect that there will be shenanigans going on and we need a way to make sure that things are what they claim to be.

The Tableau reality: Tableau’s data manipulations are, if not transparent, not deliberately obfuscated. Every transformation, including calculations, e.g. Profit, is visible within the Tableau workbook it’s part of. There are two ways to ensure that multiple analyses are conveying the same data in the same way: the workbooks containing the analyses can be manually inspected, or the workbooks can be inventoried with a tool designed for the purpose, the result of which is a database of the elements in the workbooks, through to and including field calculations, making the cross-comparisons a simple matter of data analysis with Tableau.

Tableau does not itself possess this self-inventorying ability. There is, however, such a tool available: the Tableau Workbook Inventory System (TWIS), available from Better BI; details are available here.
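
To make the inventory idea concrete, here is a minimal sketch of what such an inventory pass could look like. It is not TWIS, and the element and attribute names are assumptions based on the fact that a .twb workbook is an XML document; it simply walks each workbook’s column definitions and collects any calculation formulas into rows that could be loaded into a database and cross-compared.

    # A hypothetical, minimal workbook-inventory sketch (not TWIS itself).
    # Assumes .twb workbooks, which are XML, and collects calculated fields.
    import csv
    import glob
    import xml.etree.ElementTree as ET

    rows = []
    for path in glob.glob("workbooks/*.twb"):  # illustrative location
        tree = ET.parse(path)
        for datasource in tree.getroot().iter("datasource"):
            ds_name = datasource.get("caption") or datasource.get("name", "")
            for column in datasource.iter("column"):
                calc = column.find("calculation")
                if calc is not None and calc.get("formula"):
                    rows.append({
                        "workbook": path,
                        "datasource": ds_name,
                        "field": column.get("caption") or column.get("name", ""),
                        "formula": calc.get("formula"),
                    })

    # Write the inventory out; the resulting table can itself be analyzed in
    # Tableau to check whether fields named, say, Profit share the same formula.
    with open("workbook_inventory.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=["workbook", "datasource", "field", "formula"])
        writer.writeheader()
        writer.writerows(rows)

With the calculation formulas laid out side by side, spotting two workbooks whose Profit fields disagree becomes an ordinary filtering exercise rather than a manual inspection project.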

So Big BI’s day is done. The interesting part will be watching how long it will take before its grip on business data analysis—Business Intelligence—loosens and enterprises of all types and sizes really begin to garner the benefits of being able to hear the stories their data has to tell.

Written by Chris Gerrard

October 25, 2011 at 10:44 am

Posted in Uncategorized


How Big BI Fails To Deliver Business Value


Business Intelligence, the noun, is the information examined by a business person for the purpose of understanding their business, and used as the basis for their business decisions.

Business Intelligence, the verb, is the practice of providing the business person with the Business Intelligence they require.

Business Intelligence is conceptually based on a valuable proposition: delivery of actionable, timely, high-quality information to business decision makers provides data-based evidence that they can use in making business decisions.

Timely, high quality information is extremely valuable. It’s also somewhat perishable. Deriving the maximum business value from Business Intelligence is the expressed value proposition of all BI projects. Or it should be.

Enterprise Business Intelligence is the entire complex of tools, technologies, infrastructure, data sources and sinks, designs, implementations and personnel involved in collecting business data, combining it into collective stores, and creating analyses for access by business people.

Enterprise BI projects are very strongly biased towards complexity. They’re based on the paradigm of using large, complex products, platforms, and technologies to design and build out data management infrastructures that underpin the development of analytics (reports, dashboards, charts, graphs, etc.) that are made available for consumption.

Big BI is what happens when Enterprise Business Intelligence grows bigger than is absolutely necessary to deliver the BI value proposition. Unfortunately, Enterprise BI tends to mutate into Big BI, and the nature of Big BI almost invariably impedes the realization of the BI business value proposition.

Big BI is by its very nature overly large, complex and complicated. There are many moving parts, complicated and expensive products, tools and technologies that need to be installed, configured, fed, and cared for. All before any information actually gets delivered to the business decision makers.

Big BI is today’s Data Processing. In the old days of mainframes, COBOL, batch jobs, terminals, and line printers, business people who wanted reports had to make the supplicant journey to their Data Processing department and ask (or beg) for a report to get created, and scheduled, and delivered. Data Processing became synonymous with “we can do that (maybe) but it’ll take a really long time, if it happens at all.” Today’s Big BI is similarly slow-moving.

There are multiple reasons for this sad state of affairs. In the prevalent paradigm, new BI programs need to acquire the requisite personnel, infrastructure, tools, and technologies, all of which need to be installed and operational, which can take a very long time. Data needs to be analyzed, information models created, reporting databases designed, ETL transformations designed and implemented, reporting tool semantic layers (e.g. Business Objects Universes, Cognos Frameworks) constructed, reports created, and, finally, the reports made available.

Only then do the business decision makers (remember them?) benefit.

This entire process can take a very long time. All too frequently it takes months. Not just because of the complexity of the tools and process, but also because of the friction inherent in coordinating the many moving parts and involved parties, each with their own bailiwick and gateways to protect.

The tragic part in many Big BI projects is that the reports delivered to the business people usually fall far short of delivering the business value they should provide.

The reports are out of date, or incomplete, or no longer relevant, or poorly designed and executed, or simply wrong because the report production team “did something” without having a clear and unambiguous understanding of the information needs of the business decision maker.

In far too many instances there’s a vast gulf between the business information needs and the reports that get developed and delivered. This situation occurs because there’s too much distance, in too many different dimensions, between the business and the report creators.

Here is a map of the structure—the processes and artifacts, and the linkages between them—of a typical Big BI project. It’s worth studying to see how far apart the two ends of the spectrum are. At one end is the business person, with their business data that they need to analyze. At the far end, at the end of all of the technology, separated by multiple barriers from the business, are the report generators.

A typical scenario in this environment is that somebody, with any luck a competent business analyst, sometimes a project manager, all too frequently a technical resource, is tasked with interviewing the business and writing up some report specs—perhaps some wireframes and/or Word documents, maybe Use Cases. The multiple problems with these means of capturing analytics requirements are too extensive to discuss here. One fatal flaw that occurs all too frequently must be mentioned, however: the creator of the report specs lacks the professional skills required to elicit, refine, and communicate appropriate, high quality, effective business analytics requirements. As a result the specifications are inadequate for their real purpose, and their flaws flow downstream.

These preliminary specs are then used as the inputs into the entire BI implementation project, which then goes about its business of creating and implementing all of the technological infrastructure necessary to crank out some analyses of the data. At the end of the “real work” a tool specialist renders the analytics that then get passed back to the business.

The likelihood of this approach achieving anything near the real business value obtainable from the data is extremely low. There’s simply too much distance, and too many barriers, between the business decision makers and the analyses of their data. Big BI has become too big, too complex, with too much mass and inertia, all of which get between the business and the insights into their data that are essential to making high quality business decisions.

There is, however, a bright horizon.

Business Intelligence need not be Big BI. Even in those circumstances where Enterprise BI installations are required, and there are good reasons for them, they need not be the monolithic voracious all-consuming resource gobblers they’ve become. Done properly, Enterprise BI can be agile, nimble, and highly responsive to the ever-evolving business needs for information.

Future postings will explain how this can be your Enterprise BI reality.


Written by Chris Gerrard

March 30, 2011 at 9:00 am

Posted in Big BI, Enterprise BI
