L. Miguel Encarnação, SVP, Enterprise Data and Analytics, Head of Data Visualization, Regions Financial Corporation
A C-level executive, let’s call him Chuck, needs to better understand the market performance of his business and poses a question to his team, seeking answers based on the abundance of data collected and regularly reported. The team fans out and spends the next couple of weeks integrating, analyzing, and massaging data from different sources using a variety of analytical tools. A meeting is scheduled to discuss their findings. In preparation, a PowerPoint deck is created by compiling every insight the team identified, supporting data, additional contextual information, as well as data the team anticipates might also be useful for the meeting, just to be on the safe side. To increase the credibility and comprehensibility of the presented results, snapshots from the different analytical tools are added to the narrative of the slides: graphs and charts for easier consumption, as well as tables for their familiarity and focus on detail. The PowerPoint report, which has by now grown to an impressive 65 pages, is distributed in advance so that attendees can prepare appropriately.
At the meeting, the presenter makes it to slide 5 before Chuck interjects. He grows impatient because the process has already taken so long, and he hasn’t gotten an answer to his initial question yet. Another high-level executive is wondering whether a certain KPI on slide 37 is accurate, and yet another one wants to take the opportunity of the meeting to discuss another high-priority question that has come up during the weeks that the analytical team has spent preparing the presentation. The meeting adjourns without much progress but with a new, additional request to the analytical team to answer more questions.
Despite the fact that enterprises around the world continue to invest heavily in infrastructures and resources for data analysis and AI, effective decision making based on such data is still hampered by barriers to translating the resulting data and information into actionable, just-in-time insights. This is especially true the more strategic the need for insights is. In fact, a 2021 NewVantage Partners survey of large U.S. firms found that while 99 percent of the surveyed corporations reported investments in data and AI, only 27 percent of their executives felt that their companies were data driven, and only 23 percent of their executives reported they were themselves comfortable accessing or using data from their tools and resources. To make matters worse, the same survey reported 37 percent in 2017 and 31 percent in 2019 for the last metric, highlighting that the confidence of executives in their own data-driven decision-making abilities has apparently been decreasing over the past four years despite increased investments in corporate data, analytics, and AI capabilities. Considering that the ultimate purpose of data and analytics is, in fact, improved decision making, this is an eye-opening statistic. While operational functions seem to increasingly benefit from data- and AI-driven tools, it is the strategic decision-making suite that seems to be left out.
So, what went wrong?
The answer is multi-faceted but ultimately comes down to making the data, information, and insight not just accessible but consumable, tangible, and actionable to the involved stakeholders.
Let’s revisit the introductory story:
The analytics team took full advantage of the available data, analytics, and AI capabilities to pursue the insight requested. However, in communicating the results, they took the shortcut of including and presenting everything they knew, rather than translating it into what the target recipients (Chuck, the higher-level decision maker, and other members of his executive team) needed to know to make their decision. The analytics team designed for their own comfort rather than their audience’s benefit. This resulted in information overload caused by an abundance of available, possibly relevant, yet likely not necessary information, distracting from the main insights.
From a data visualization perspective, the analytical team repurposed the graphs, charts, and tables from their analytical tools and workflows, possibly to gain efficiencies. They did, however, overlook that data depictions for analysis must be designed very differently from data presentations for communication. Analytics-focused data visualizations are aimed at maximizing the discovery and recognition of insight. Visual communication, in contrast, must be aimed at maximizing consumability, knowledge transfer, and trust. While for analytical purposes perceptual and cognitive aspects dominate, for communication it is data-driven storytelling that must be emphasized.
Similarly, PowerPoint is not a reporting tool but a presentation tool, which assumes that a presenter narrates the slides. Far too often, PowerPoint decks and similar presentation formats are overloaded with information, which reduces the effectiveness of the communication and of the intended knowledge transfer.
In preparing the presentation, the analytics team also ignored human limitations when it comes to information processing and retention. Study results have varied over the years, but there is consensus that the human attention span ranges from as low as 7 minutes up to about 12 minutes, depending on the number of distractions in the environment. There is further scientific evidence that our attention span has in fact been decreasing in recent years with the explosion of digital distractions in our work environments. What does this mean? Using the rule of thumb of two minutes per slide for effective presentation, a presentation should not have more than six informational slides. Or, to put it differently, after six slides you run the risk that your audience is losing its attention, so you should ensure that the most important messages and insights appear within those first slides. This assumes that slides are not too densely populated: from cognitive psychology we know, for example, that humans cannot keep more than 7 +/- 2 chunks of information in their working memory at any point in time. Keep that in mind when you add information to your slides.
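As a back-of-the-envelope sketch, these rules of thumb can be turned into a quick sanity check when planning a deck. The function names, defaults, and thresholds below are illustrative only, derived from the figures cited above, not from any established tool or standard:

```python
# Illustrative planning helpers based on the attention-span and
# working-memory rules of thumb discussed above.

def max_informational_slides(attention_span_min=12, minutes_per_slide=2):
    """Upper bound on slides before the audience's attention likely fades."""
    return attention_span_min // minutes_per_slide

def slide_density_ok(chunks_on_slide, capacity=7, tolerance=2):
    """Check one slide against the 7 +/- 2 working-memory guideline."""
    return chunks_on_slide <= capacity + tolerance

# With a 12-minute attention span: at most 6 informational slides.
print(max_informational_slides())
# With a 7-minute attention span in a distracting environment: at most 3.
print(max_informational_slides(attention_span_min=7))
# A slide carrying 12 distinct chunks of information exceeds the guideline.
print(slide_density_ok(12))
```

The point is not the arithmetic itself but the habit it encodes: budget slides against attention, and chunks against working memory, before the deck is built.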
Finally, the fact that the presentation of findings was static and “pre-canned” limited the audience’s ability to ask farther-reaching questions and to engage in a closer discourse with the presenter(s), forcing the decision-making process into a time-consuming loop of recurrent, multi-week question, analysis, and presentation cycles.
While the shortcomings showcased by our story of a typical decision-making process seem common and habitual, we do have the knowledge, best practices, and technologies to work towards eliminating them. Taking full advantage of those resources will, however, require a corresponding change in team and decision-making habits, and potentially in workflows. Here are a few recommendations along the way:
It is important to remember that even the distinction between data representations for communication and for analysis can and should be further delineated:
For communication, beyond narrated presentations and storytelling, there are different design affordances: e.g., creating awareness through static info charts for general audiences, versus creating dashboards for monitoring data with an emphasis on quickly detecting deviations from established baselines, versus creating static reports for auditing and review by a very knowledgeable audience.
For analysis, data visualization evangelist Stephen Few uses the term faceted analytical displays for visual analysis tools, to contrast them with dashboards, which predominantly suit monitoring and reporting purposes. (After all, how analytical can you get with the dashboard in your car, which is where the term stems from?) Even within an analytical context, one can observe different design requirements with increasing levels of interaction affordances for:
inquiry-based analysis, i.e. the problem space is well understood, you know what you are looking for and are seeking answers to concrete questions,
investigative analysis, i.e. you try to understand your problem space, often to answer the why question, and
exploratory analysis, i.e. you try to define the problem space to avoid blind spots or to identify white spaces.
In general, using the right tool based on data, analytical task, and end user is key to gaining efficiencies and mitigating information overload, whether in analytical or communication contexts. Reporting tools like Power BI are appropriate for creating monitoring dashboards and relatively fixed reports with limited interactivity, while highly interactive tools like Tableau or Qlik Sense are more suitable for interactive visual analysis towards investigation and exploration. Tools like PowerPoint are adequate for the narrated presentation of results when no further discourse or inquiry is needed. If, however, a livelier collaborative discussion is expected, interactive storytelling driven by live data might be the way to go and is readily supported by many modern BI tools’ storyboarding capabilities. For purely inquiry-based analytics, such as the curiosity-driven requests that high-level decision makers like Chuck often make, self-service data queries, often paired with natural language interfaces, are supported by tools like ThoughtSpot, Cloudera DataViz, and Tableau. Furthermore, separating your presentation from auxiliary informational materials will additionally help focus the presentation.
Making the users of your analytical and communication tools proficient in their use is crucial. No tool can be used effectively and to its fullest extent if its users have not received corresponding in-depth formal training. I am always flabbergasted when I hear that team members became their team’s dashboard developers of choice by learning a tool as they went. That ultimately limits the extent to which such a tool can be effectively employed. Especially in the case of data-driven decision making, such a lack of BI tool proficiency can do more harm than good: if statistics can lie, it is even easier to be misled by visualized statistics, whether intentionally or simply due to poor, uninformed design.
The same is true for data-driven communication: one cannot hope for effective decision making if insights and findings are not communicated in ways that are clear, concise, and focused, and that leave no room for ambiguity or misinterpretation. Such effective data-driven communication and storytelling requires training and practice; it cannot simply be assumed, nor learned just by doing.
While the stated recommendations for effective data-driven decision making might at first seem overwhelming, their value extends beyond corporate data-related tasks. Digital citizenship and data fluency rely equally on the intentional and well-informed use of any technology we employ. Moreover, effective communication is a crucial skill and sought-after behavior for effective corporate cultures, if not at the core of our society and, by extension, humanity. In an increasingly digital world, it should therefore be a priority to focus digital transformation, automation, and data analytics efforts not just on technical and technological components but also on the prerequisite development of digital skills and competencies across the entire workforce.