Better BI

Chris Gerrard – Exploring the world of Better Business Intelligence

Posts Tagged ‘Tableau’

BI as the path to art.


Two months on the Panamanian Pacific coast. The beach is right below us, the surf a constant presence. The tides here are quite high, averaging almost 15 feet.

Living right on the ocean has its great benefits. The beaches here are strikingly beautiful; the surf intermingles black volcanic and white sand into fantastic patterns visible from space. Or at least in Google Earth and Maps. The ocean has shaped the beaches into a steeply sloped upper section and a nearly flat lower zone wonderful for all types of activities. Here’s a time-lapse video of high tide in the early morning (it’s receding):
As you can see in the video, there’s not much room on the beach at high tide, and plenty at low tide.

It’s hot here. Not absolutely blisteringly hot, but hot enough to avoid strenuous outdoor activities, walking or running on the beach, say, during the middle of the day.

It’s important to know the tides so that you can plan your beach activities. Fortunately, the local community has posted the tides online in a series of web pages, one for each month; March, 2012’s page is here. Unfortunately, the tides are presented in a table that, although it contains the data, is minimally useful.

But there’s good news: it’s pretty easy to scrape the data from the site’s HTML tables and build some Tableau dashboards to present it more effectively. I published several versions to Tableau Public, one for standard browsers here, one sized for iPads here, even one suitable for blogs (although WordPress doesn’t support Tableau’s JavaScript embedding).
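For the curious, the scraping step is simple enough to sketch. The tide pages aren’t mine, so the column layout below (date, time, high/low, height) is an assumption for illustration only; a small parser built on Python’s standard library does the table extraction:

```python
from html.parser import HTMLParser

class TideTableParser(HTMLParser):
    """Collect the text of each <td> cell, grouped into rows."""
    def __init__(self):
        super().__init__()
        self.rows = []      # one list of cell strings per table row
        self._row = []
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True
            self._row.append("")

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row[-1] += data.strip()

# Hypothetical fragment shaped like one row of the tide table
sample = ("<table><tr><td>Mar 24</td><td>05:12</td>"
          "<td>High</td><td>14.8</td></tr></table>")
parser = TideTableParser()
parser.feed(sample)
print(parser.rows)  # each row becomes a list of cell strings
```

From there, writing the rows out as CSV gives Tableau something it can connect to directly.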

Written by Chris Gerrard

March 24, 2012 at 7:45 pm

Posted in Uncategorized


Big BI is dead. But it’ll twitch for a while.


The end came last week in Las Vegas at the annual Tableau Customer Conference.

Big BI put up a valiant struggle, but it’s been unwell for some time, sputtering along, living on past glories real and imagined.

Its passing was inevitable. Big, bloated, and overripe, it never lived up to its promise of being the path to data enlightenment. Although its nominal goals were good and proper, in practice it failed to deliver. Much has been written on Big BI’s problems, including here, here, and here.

Big BI flourished, and then held on in spite of its flaws. Partly through inertia—a lot of money and time had been spent over its two decades of dominance. Partly through pervasiveness—Big BI’s proponents have been hugely successful at promoting it as the one true way. Partly through the absence of the disruptive technology that would upend the BI universe.

Big BI is brain-dead, but support systems are keeping the corpse warm.

Like all empires on the wane, its inhabitants and sponsors haven’t realized it yet. Or, cynically, those who have are milking it for what they can get while the getting is still good. Like many empires, its demise comes not with a big bang, but with a nearly silent revolution that upends the established order—even as the Big BI promoters and beneficiaries are flush, fat, and happy, their base of influence and position, wealth and power has eroded away, leaving them dining on memories.

Big BI’s fundamental premise was always deeply flawed, erroneous when it wasn’t disingenuous or worse. The paradigm held that the only approach to achieving Business Intelligence within an organization was through the consolidation of the enterprise’s business data into data warehouses from which a comprehensive, all-encompassing single version of the truth could be achieved.

The only points of differentiation and discussion in the Big BI universe were squabbles about ultimately minor aspects of the core reality. Data Warehouses vs Data Marts. Inmon vs Kimball (Google “Inmon Kimball”). Dimensionally modeled analytical databases are relational vs no they’re not. And so on and so forth and such like.

The baseline concept remained the same: Business Intelligence is achieved by collecting, cleansing, transforming, and loading business information into the integrated, homogenized, consolidated data store (Mart or Warehouse). Then, and only then, can it be fronted by a large, complex, complicated “Enterprise BI Platform” that provides a business semantic facade for the dimensional data and is the only channel that can be used to code up, create, and deliver reports, graphs, charts, dashboards, strategy maps, and the like to the business people who need to understand the state of their area of responsibility and make data-based decisions.

The overarching goal is completely correct: delivering information (intelligence) to the real human people who need it. But the reality is that Big BI has abjectly failed to deliver. With an eye to history, and another to the evolution of technology, the possible end of Big BI has been in sight for some time. The history of BI is deep and rich, encompassing much more than Big BI. A brief history (mine) is here.

What happened? Why now? Why Tableau?

A number of years ago novel products appeared, sharing the concept that data access and analysis should be easy and straightforward, that people should be able to conduct meaningful, highly valuable investigations into their data with a minimum of fuss and bother.

Tableau was the best of these products, squarely aimed at making it as simple and straightforward as possible to visualize data. This simple principle is the lever that has ultimately toppled Big BI: it removes the barriers and friction other technologies impose between real human people and their data, barriers that make data access and analysis a chore instead of an invigorating and rewarding experience.

But Big BI was well established. There were institutes, academics, huge vendors to sell you their databases and Enterprise BI platforms, and huge consultancies to help you wrangle the technology.

And there was a whole generation indoctrinated in the One True Way that held as articles of faith that there is only One Version Of The Truth, that only Enterprise-consolidated data carries real business value, that business data is too important to be left to its owners: like Dickensian orphans it needs to be institutionalized, homogenized, and cleaned up before it can be seen in public.

Tableau has a different philosophy. Data is in and of itself valuable. Data analysis is the right and privilege of its owners. Data analysis should be fast, easy, straightforward, and rewarding. There is truth in all data, and all those truths are valuable.

Still, the Big BI advocates found ways to block the radical upstarts, the data democratizers. “But what about consistency across units?” “How can we trust that two (or more) people’s analyses are equivalent?”

And the most damning of all: “Pretty pictures are nice toys, but we need the big, brawny, he-man industrial controls to ensure that we at the top know that we’re getting the straight poop.” There’s so much wrong with this last one that it will take several essays to unwind it. (to be continued)

Distilled to their essence, the Big BI proponents’ objections to Tableau as a valuable, meaningful, essential tool for helping achieve the real goal, getting the essential information out of business data and into the minds of those who need it as fast and as well as possible, are these:

Point: There must be single-point, trusted sources of the data that’s used to make critical business decisions.

Subtext: Local data analysis is all fine and good but until we can have point control over our data we’re not going to believe anything.

Subsubtext: This is an erroneous perspective, and ultimately harmful to an organization, but that’s another story. The reality is that, misguided as it is in the entire context, there is a need to have single-source authoritative data.

Tableau’s response: the upcoming Tableau version 7 provides the ability to publish managed, authoritative data sources to the Tableau Server, available for use by all Tableau products. This feature provides the single trusted data source capability required for Enterprise data confidence.

Point: There must be absolute confidence that similarly named analyses, e.g. Profit, are in fact comparable.

Subtext: As long as people have the opportunity to conduct their own data analysis we suspect that there will be shenanigans going on and we need a way to make sure that things are what they claim to be.

The Tableau reality: Tableau’s data manipulations are, if not transparent, at least not deliberately obfuscated. Every transformation, including calculations, e.g. Profit, is visible within the Tableau workbook that it’s part of. There are two ways to ensure that multiple analyses are conveying the same data in the same way: the workbooks containing the analyses can be manually inspected; or the workbooks can be inventoried with a tool designed for the purpose, the result of which is a database of the elements in the workbooks, down to and including field calculations, making the cross-comparisons a simple matter of data analysis with Tableau.
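To make that concrete: a Tableau workbook (.twb) is XML, so cross-checking calculations is mostly a matter of walking the XML. The element and attribute names below (a column element whose caption wraps a calculation element carrying a formula) reflect my reading of the format and should be treated as an assumption, and the two workbook fragments are hypothetical:

```python
import xml.etree.ElementTree as ET

def calculated_fields(twb_xml):
    """Return {field name: formula} for every calculated field found.

    Assumes the .twb layout where a calculated field is a <column>
    wrapping a <calculation> element with a formula attribute.
    """
    root = ET.fromstring(twb_xml)
    fields = {}
    for column in root.iter("column"):
        calc = column.find("calculation")
        if calc is not None and "formula" in calc.attrib:
            name = column.get("caption", column.get("name"))
            fields[name] = calc.get("formula")
    return fields

# Two hypothetical workbooks, each defining a field named Profit
wb_a = ("<workbook><column caption='Profit'>"
        "<calculation formula='[Sales]-[Cost]'/></column></workbook>")
wb_b = ("<workbook><column caption='Profit'>"
        "<calculation formula='[Sales]-[Cost]-[Overhead]'/></column></workbook>")

a, b = calculated_fields(wb_a), calculated_fields(wb_b)
print(a["Profit"] == b["Profit"])  # same name, different calculation
```

Run across a directory of workbooks, the resulting name-to-formula pairs are exactly the kind of data Tableau itself can then analyze for mismatches.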

Tableau does not itself possess this self-inventorying ability. There is, however, such a tool available: the Tableau Workbook Inventory System (TWIS), available from Better BI; details are available here.

So Big BI’s day is done. The interesting part will be watching how long it takes before its grip on business data analysis—Business Intelligence—loosens and enterprises of all types and sizes really begin to garner the benefits of being able to hear the stories their data has to tell.

Written by Chris Gerrard

October 25, 2011 at 10:44 am

Posted in Uncategorized


BI RAD – Business Intelligence Rapid Analytics Delivery


A fundamental virtue in BI is the delivery of value early, often and constantly.

Business Intelligence—Rapid Analytics Delivery (BI RAD) is the practice of creating high quality analytics as quickly, efficiently, and effectively as possible. Done properly, it’s possible to develop the analytics hand-in-hand with the business decision makers in hours.

BI RAD requires tools that permit easy access to the business data, that are highly effective and usable in the exploratory mode of analysis (browsing through the data looking for interesting patterns and anomalies), and that can then persist meaningful analyses as analytics.

BI RAD emerged as a viable practice over twenty years ago with the first generations of specialized reporting tools that were much more nimble and effective than the COBOL report generation coming from DP shops. I worked extensively with FOCUS, the original 4GL, first at JCPenney, then as a consultant and product manager with Information Builders, Inc., FOCUS’ vendor. People were surprised, sometimes shocked, at how rapidly useful reports could be generated for them—once up to speed new reports could generally be available within a day, sometimes a couple of hours.

In today’s world of Big BI, where enterprise BI tools and technologies are the norm, business decision makers have become accustomed to long waits for their BI analytics. In a sense, Big BI has become the modern DP: the technology has become so large and complex that getting it installed and operational, with everything designed, implemented, cleansed, ETL’d, and so on before any analytics get created, consumes all the time, energy, effort, and resources available. The result is failure and disappointment, with very little value delivered in the form of information making it into the minds of the real human people who need it.

Yet BI RAD lives. An entire modern generation of direct-access data discovery tools has emerged, providing the ability to establish an intimate connection between business data and the people who need insights into the data in order to make informed business decisions.

Tableau, from Tableau Software, is a near-perfect tool for BI RAD. It connects to the great majority of business data sources almost trivially easily, and is unsurpassed in terms of usability and quality of visualizations for the majority of business analytics needs.

Using Tableau, it’s possible to sit down with a business stakeholder, connect to their data, and jump right into exploring their data. Any interesting and valuable analyses can be saved and kept at hand for immediate use.

Experience in multiple client environments has shown that these analyses have multiple overlapping and reinforcing benefits to the organization. The immediate and obvious benefit is that business decision makers are provided the data-based insights they require to make high quality business decisions. Beyond that, the analyses created are the best possible requirements for Enterprise analytics.

Who could ask for anything more?

Written by Chris Gerrard

April 5, 2011 at 11:32 pm

Posted in BI RAD, Enterprise BI


High Level Enterprise BI Project Activities Diagrams


This PDF document–Data Warehouse Typical Project–maps the normal set of high level activities involved in Enterprise BI projects.

Typical Enterprise BI projects are complex, complicated affairs that have difficulty delivering Business Intelligence quickly, due to multiple factors:

  • There are many complex, discrete, interconnected activities with stringent analytical interconnections and dependencies
  • There is generally a lack of analytical expertise brought to bear on the data, meta-data, and meta-meta-data that need to be transported and communicated between the various parties
  • High quality requirements are extremely difficult to achieve, primarily because an enormous amount of setup work must occur before an Enterprise BI tool can be connected to real data and used to develop preliminary analytics
  • Absent live reports, any end-user analytical requirements are usually low quality, low fidelity best guesses, made in the vacuum of the real feedback essential for arriving at truly useful Business Intelligence

This PDF–Data Warehouse – Tableau Augmented Project Processes–highlights those areas where Tableau can be profitably employed to dramatically improve the velocity and quality of the Business Intelligence delivered to business decision makers, and in significantly streamlining the entire process stream by introducing the practices of data analysis into the Enterprise BI project activities.
In practice, this approach has been shown to provide business value early and often, and to result in better Enterprise BI outcomes much sooner and at lower cost. That leaves more resources available for continuing to expand the scope, sophistication, velocity, and quality of the Business Intelligence provided, and therefore delivers much higher business value.

Written by Chris Gerrard

January 28, 2010 at 12:02 am

Posted in Uncategorized


Common Problems Saving Tableau Packaged Workbooks


Tableau’s packaged workbooks are tremendously useful. Bundling data with the workbook allows anyone to peruse the data using the Workbook without having access to the original source data. I use them frequently in large BI projects as a way of providing Reports to end users, analyses of data all along the project process chain, even in providing the database schema to downstream technical teams when the “normal” processes take too long.
Packaged Workbooks can be opened with the Desktop Application or the Tableau Reader. Published to the Tableau Server, they’re available just like normal Workbooks.
Creating a Packaged Workbook is really pretty straightforward: create an extract of the data (for every data source used in the Workbook); save or export the Workbook in its packaged form.
There are a couple of reasonably common problems I’ve run into again recently; this post covers them.

Problem—SQL Parsing error creating the extract

I’ve seen this more than once: when Tableau tries to create an extract it fails with a fairly obscure error along the lines of “Data format string terminated prematurely”, which seems to indicate that there’s been a problem parsing a date value using whatever internal format it’s employing. There are no calculated fields or data calculations, so it’s really puzzling and Tableau doesn’t really provide any diagnostics.
There’s also the matter that this problem doesn’t surface until the extract is under preparation, implying that it doesn’t involve any of the fields referenced in the Worksheets, which leads us to the

Solution—Hide the unused fields and try to create the extract

Almost too easy, isn’t it? Hiding the unused fields also reduces the size of the extract, which in some cases makes a big difference. On the other hand, the unused fields aren’t available for use in the extract, and therefore in the Packaged Workbook; this isn’t a problem for Tableau Reader users, but limits those Desktop Application and Server users who otherwise could extend the Workbook’s analytics.

Problem—creating the Packaged Workbook generates an “unconnectable data source” message

[insert message here]

Solution—find and close any Data Connections that aren’t being used

Orphaned Data Connections can have a number of causes, but the usual one is that the last Worksheet using the Data Connection gets deleted or pointed to another Data Connection.
Finding and closing unused Data Connections from within the Workbook can be a bit of a hunting expedition–this will be the topic of another post. But very soon the Tableau Inventory will identify orphaned Data Connections.

Written by Chris Gerrard

January 27, 2010 at 11:28 pm

Posted in Uncategorized


Inventory Your Tableau Workbooks (or…)


“Are You Using That Field?”

It always happens: you’ve put together a nice set of Workbooks, produced a bunch of really valuable analytics, and now you need to figure out what’s where so you can:

  • accommodate the inevitable changes to the database
  • enumerate the reports
  • identify the calculated fields and their calculations
  • identify which Dashboards and Worksheets are in which Workbooks
  • map the relationships between Dashboards and Worksheets
  • and so on and so forth and such like

You COULD manually browse through the Workbooks, Dashboards, and Worksheets and dutifully record everything.
(good luck with that, and with keeping up with changes)

Or you could automate the process by processing the Workbooks and teasing out the information about Dashboards, Worksheets, Rows, Columns, Filters, Fields, etc. into data that Tableau can read and then prepare a Tableau workbook that provides the essential information.
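As a sketch of that automation (not how the Tableau Inventory itself works, just an illustration): a .twb file is XML, so the worksheet and dashboard names can be teased out with Python’s standard library. The tag names assumed below, worksheet and dashboard elements carrying name attributes, match my reading of the format but are an assumption, not a spec, and the trimmed workbook is hypothetical:

```python
import xml.etree.ElementTree as ET

def inventory(twb_xml):
    """Return the worksheet and dashboard names found in one workbook.

    Assumes the .twb layout with <worksheet name='...'> and
    <dashboard name='...'> elements.
    """
    root = ET.fromstring(twb_xml)
    return {
        "worksheets": [w.get("name") for w in root.iter("worksheet")],
        "dashboards": [d.get("name") for d in root.iter("dashboard")],
    }

# A hypothetical, heavily trimmed workbook
wb = """<workbook>
  <worksheets>
    <worksheet name='Tides by Day'/>
    <worksheet name='Tide Heights'/>
  </worksheets>
  <dashboards>
    <dashboard name='Tide Planner'/>
  </dashboards>
</workbook>"""
print(inventory(wb))
```

Writing these results out per workbook yields a dataset that Tableau can read, closing the loop: an inventory of Tableau, analyzed in Tableau.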

Or you could use the Tableau Inventory application that I’ve built to do the inventorying for you, and the TableauReportsInventory.twb to see the inventory.

There’s a Tableau Reports Inventory – Sample Workbooks PDF attached to this post, showing the output of TableauReportsInventory.twb connected to an inventory of the Tableau Sample Workbooks. (I can’t attach Tableau Workbooks.)
If you think it’s useful I’d sure like to hear about it.

I’m preparing to release the Tableau Inventory as an Open Source project, and welcome anyone who wants to participate.

I can be reached at – please put “Tableau Inventory” in the subject line.
Or comment here.

Written by Chris Gerrard

January 27, 2010 at 9:08 am