This morning, I read a blog post from someone in the “business intelligence” industry complaining that, after years of innovation, “BI” is not really having the envisioned effects. The author wondered whether it was because BI, like most other business applications, is not “fun.” The author suggested that BI applications should consider borrowing from fun social applications with features like “friends” and “like.”
I share the author’s disappointment in how little visible impact we’ve achieved, at least in the health care industry, from years of investment in “business intelligence,” “management information systems,” “data warehouses,” “data mining,” “dashboards,” “report cards” and other systems intended to put more data and reports at our fingertips.
But I don’t share the view that social networking features are the missing ingredient. Rather, I think the key to making analytics more engaging and useful is to focus on telling interesting, truthful, actionable stories, backed up by data. People are natural story-tellers and enjoy listening to other people’s stories, particularly stories with a strong “line”: a beginning that sets the scene, a challenging problem or crisis, and some resolution. The problem with “BI” (and the other data-at-fingertips technologies) is that they are devoid of a story line.
When I have had the opportunity to work with young analysts, this is the most important concept that I try to get across. Fresh out of grad school, the young analysts who have an IT background tend to start with a view that computers can do everything. They just need to put the power of computers and data in the hands of the “people.” The young analysts coming out of training in statistics, epidemiology, economics and other research disciplines tend to start with a world view that advanced mathematical methods can do everything. In developing such young analysts, “job one” is getting them to appreciate the importance of story-telling. To be effective in changing the world, they have to learn to change minds. To change minds, they have to keep the attention of their audience — not with outlandish, shocking conclusions or colorful “eye candy” graphics — but with an interesting plot-line that makes their audience eager to see how the story ends. Once they learn how to tell a story, they can apply that knowledge to the telling of actionable stories, where the ending is a call to action.
The first step in effective story-telling is to develop a good outline. This is a lesson we all learned from our high school English teachers (or whichever teachers taught your first language!). Back in the 1990s, I worked with Dr. Bruce McCarthy at the Henry Ford Health System. He used to talk about the “logic train,” and we would compose our abstract presentations by laying our slides in a line on the floor. We would joke around by doing a little “chug chug” dance along the line of slides to see if the story had a good high-level flow. I repeated this exercise with my 7-year-old daughter in the kitchen last month when she was preparing a class speech. She loved the chug chug dance. I challenge you to try it at your next meeting with analysts.
After the outline is clarified, the second step in effective story-telling is to design each slide to communicate its message clearly and simply. This requires attention to visual design. In my opinion, a good designer avoids clutter, demanding that every bit of “ink” serve a purpose beyond mere decoration. Effective communication takes advantage of the conventions of graphical language unless there is a good reason to violate them. For example, if you are telling a story about change over time, people generally expect time to run along a horizontal axis from left to right. They generally assume “up” means “more.” They expect an arrow to convey sequence or the direction of flow. They expect thicker lines, bigger fonts, and bolder colors to indicate importance. Experienced designers of quantitative visuals also know that a given format may work well or poorly depending on the data itself, which makes it difficult to design a fixed report that remains effective as the underlying data changes over time.
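As a concrete sketch of these conventions, here is what they might look like in a simple charting script. The clinic names and numbers are invented for illustration only, and matplotlib is assumed to be available; this is one possible rendering, not a prescribed template.

```python
# A minimal sketch of a time-series chart that follows the conventions above:
# time on the horizontal axis running left to right, "up" meaning "more", and
# visual weight (line thickness, color) reserved for the series we want to
# emphasize. All data below is hypothetical.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

years = [2006, 2007, 2008, 2009, 2010]
clinic_a = [62, 64, 69, 73, 78]   # hypothetical screening rates (%)
clinic_b = [61, 62, 63, 62, 64]

fig, ax = plt.subplots()
# The story's protagonist gets the thicker, bolder line...
ax.plot(years, clinic_a, linewidth=3, color="black", label="Clinic A")
# ...while the comparison series stays visually quiet.
ax.plot(years, clinic_b, linewidth=1, color="gray", label="Clinic B")

ax.set_xlabel("Year")                 # time on the horizontal axis
ax.set_ylabel("Screening rate (%)")   # "up" means "more"
ax.spines["top"].set_visible(False)   # remove non-data "ink"
ax.spines["right"].set_visible(False)
ax.legend(frameon=False)
fig.savefig("screening_trend.png")
```

The design choice worth noting is what is left out: no gridlines, borders, or decoration competing with the two lines that carry the story.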
This brings me to the most important conclusion. Despite the huge attention that has been paid to the need for more data and more sophisticated analytic software, the rate-limiting step in our efforts to bring the health care field into the knowledge age is neither data nor technology. Except for the most mundane monitoring purposes, useful analytics cannot be completely automated. Effective analytics requires effective analysts.