Better BI

Chris Gerrard – Exploring the world of Better Business Intelligence

Big BI is dead. But it’ll twitch for a while.


The end came last week in Las Vegas at the annual Tableau Customer Conference.

Big BI put up a valiant struggle, but it’s been unwell for some time, sputtering along, living on past glories real and imagined.

Its passing was inevitable. Big, bloated, and overripe, it never lived up to its promise of being the path to data enlightenment. Although its nominal goals were good and proper, in practice it failed to deliver. Much has been written on Big BI’s problems, including here, here, and here.

Big BI flourished, and then held on in spite of its flaws. Partly through inertia—a lot of money and time had been spent over its two decades of dominance. Partly through pervasiveness—Big BI’s proponents have been hugely successful at promoting it as the one true way. Partly through the absence of the disruptive technology that would upend the BI universe.

Big BI is brain-dead, but support systems are keeping the corpse warm.

Like all empires on the wane, its inhabitants and sponsors haven’t realized it yet. Or, cynically, those who have are milking it for what they can get while the getting is still good. Like many empires, its demise comes not with a big bang, but with a nearly silent revolution that upends the established order. Even as the Big BI promoters and beneficiaries are flush, fat, and happy, their base of influence and position, wealth and power, has eroded away, leaving them dining on memories.

Big BI’s fundamental premise was always deeply flawed, erroneous when it wasn’t disingenuous or worse. The paradigm held that the only approach to achieving Business Intelligence within an organization was through the consolidation of the enterprise’s business data into data warehouses, from which a comprehensive, all-encompassing single version of the truth could be derived.

The only points of differentiation and discussion in the Big BI universe were squabbles about ultimately minor aspects of the core reality. Data Warehouses vs Data Marts. Inmon vs Kimball (Google “Inmon Kimball”). Dimensionally modeled analytical databases are relational vs no they’re not. And so on and so forth and such like.

The baseline concept remained the same: Business Intelligence is achieved by collecting, cleansing, transforming, and loading business information into an integrated, homogenized, consolidated data store (Mart or Warehouse). Only then can that store be fronted by a large, complex “Enterprise BI Platform” that provides a business semantic facade for the dimensional data and serves as the only channel for coding up, creating, and delivering the reports, graphs, charts, dashboards, strategy maps, and the like to the business people who need to understand the state of their area of responsibility and make data-based decisions.

The overarching goal is completely correct: delivering information (intelligence) to the real human people who need it. But the reality is that Big BI has abjectly failed to deliver. With an eye to history, and another to the evolution of technology, the possible end of Big BI has been in sight for some time. The history of BI is deep and rich, encompassing much more than Big BI. A brief history (mine) is here.

What happened? Why now? Why Tableau?

A number of years ago novel products appeared, sharing the concept that data access and analysis should be easy and straightforward, that people should be able to conduct meaningful, highly valuable investigations into their data with a minimum of fuss and bother.

Tableau was the best of these products, squarely aimed at making it as simple and straightforward as possible to visualize data. This simple principle is the lever that has ultimately toppled Big BI: it removes the barriers and friction that other technologies impose between real human people and their data, friction that turns data access and analysis into a chore instead of an invigorating and rewarding experience.

But Big BI was well established. There were institutes, academics, huge vendors to sell you their databases and Enterprise BI platforms, and huge consultancies to help you wrangle the technology.

And there was a whole generation indoctrinated in the One True Way, holding as articles of faith that there is only One Version Of The Truth, that only Enterprise-consolidated data carries real business value, and that business data is too important to be left to its owners: like Dickensian orphans, it needs to be institutionalized, homogenized, and cleaned up before it can be seen in public.

Tableau has a different philosophy. Data is in and of itself valuable. Data analysis is the right and privilege of its owners. Data analysis should be fast, easy, straightforward, and rewarding. There is truth in all data, and all those truths are valuable.

Still, the Big BI advocates found ways to block the radical upstarts, the data democratizers. “But what about consistency across units?” “How can we trust that two (or more) people’s analyses are equivalent?”

And the most damning of all: “Pretty pictures are nice toys, but we need the big, brawny, he-man industrial controls to ensure that we at the top know that we’re getting the straight poop.” There’s so much wrong with this last one that it will take several essays to unwind it. (to be continued)

Distilled to their essence, the Big BI proponents’ objections to using Tableau as a valuable, meaningful tool for the real goal, getting the essential information out of business data and into the minds of those who need it as fast and as well as possible, are these:

Point: There must be single-point, trusted sources of the data that’s used to make critical business decisions.

Subtext: Local data analysis is all fine and good, but until we have single-point control over our data we’re not going to believe anything.

Subsubtext: This is an erroneous perspective, and ultimately harmful to an organization, but that’s another story. The reality is that, misguided as it is in the broader context, there is a genuine need for single-source, authoritative data.

Tableau’s response: the upcoming Tableau version 7 provides the ability to publish managed, authoritative data sources to the Tableau Server, available for use by all Tableau products. This feature provides the single trusted data source capability required for Enterprise data confidence.

Point: There must be absolute confidence that similarly named analyses, e.g. Profit, are in fact comparable.

Subtext: As long as people have the opportunity to conduct their own data analysis, we suspect there will be shenanigans going on, and we need a way to make sure that things are what they claim to be.

The Tableau reality: Tableau’s data manipulations are, if not fully transparent, at least not deliberately obfuscated. Every transformation, including calculations such as Profit, is visible within the Tableau workbook it belongs to. There are two ways to ensure that multiple analyses convey the same data in the same way: the workbooks containing the analyses can be inspected manually, or they can be inventoried with a tool designed for the purpose, producing a database of the workbooks’ elements, down to and including field calculations, which makes cross-comparison a simple matter of data analysis with Tableau.

Tableau does not itself possess this self-inventorying ability. There is, however, such a tool available: the Tableau Workbook Inventory System (TWIS), available from Better BI; details are available here.
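For readers curious about what such an inventory involves at the simplest level, here is a minimal sketch (not TWIS itself, which goes much further). It relies on the fact that a Tableau workbook (.twb) is plain XML, so calculated fields can be harvested by walking its column and calculation elements; packaged workbooks (.twbx) would first need to be unzipped. The directory scan and the calculation_inventory.csv output name are just illustrative choices.

```python
# Minimal sketch: inventory the calculated fields in a set of Tableau .twb
# workbooks by reading their XML. Not TWIS, just an illustration that the
# workbook contents are open to inspection and cross-comparison.
import csv
import glob
import xml.etree.ElementTree as ET

def inventory_calculations(twb_path):
    """Yield (workbook, field name, formula) for each calculated field."""
    tree = ET.parse(twb_path)
    for column in tree.getroot().iter("column"):
        calc = column.find("calculation")
        if calc is not None and calc.get("formula"):
            name = column.get("caption") or column.get("name")
            yield twb_path, name, calc.get("formula")

if __name__ == "__main__":
    # Write a simple inventory that can itself be analyzed in Tableau,
    # e.g. to check that every field named "Profit" uses the same formula.
    with open("calculation_inventory.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["workbook", "field", "formula"])
        for twb in sorted(glob.glob("*.twb")):
            writer.writerows(inventory_calculations(twb))
```

The resulting CSV can be opened in Tableau itself, so comparing similarly named calculations across workbooks becomes exactly the kind of quick data analysis the tool is built for.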

So Big BI’s day is done. The interesting part will be watching how long it takes before its grip on business data analysis (Business Intelligence) loosens and enterprises of all types and sizes really begin to garner the benefits of being able to hear the stories their data has to tell.


Written by Chris Gerrard

October 25, 2011 at 10:44 am

Posted in Uncategorized


2 Responses


  1. Agreed. Tableau Software is a major revolutionary and disruptive technology. Tableautastic!!! However, it’s extraordinarily difficult to wrestle the power of influence from the big boys who are heavily entrenched in contract agreements which yield huge financial profits. The existing DW/DM culture controls every industry. It neither rewards nor supports innovation. People have for 10-20 years grown up with traditional data management concepts and they still fight the standard philosophical battles – too entrenched to conceive that a new and better paradigm exists. However, history repeats itself. Bleeding-edge early adopters saw the same transformation in the late ’80s when Mainframe Computing was rocked on its heels by Client/Server RDBMS. The latter technology won and unleashed a world of possibilities which empowered two generations of knowledge workers to grow the world-wide working middle class, thereby fueling world economies. Nevertheless, the KISS principle is not embraced by everyone. Leadership takes courage and vision to embrace a bold new path towards truth and enlightenment. It’s time to follow A Visual Intelligence Beacon…

    Eric Paul McNeil, Sr.

    October 27, 2011 at 8:55 am

  2. […] to Eric Paul McNeil, Sr’s comment on my earlier “Big BI is dead…” post, here, in which he notes that history repeats itself, relating the emergence of Tableau to Client/Server […]

