Wednesday, 18 May 2011

Making the results of the analysis accessible

We have all been there. You have just completed an analysis to be proud of. You cleverly collected data from many information sources. Then, through nifty data management that really shows off your SAS skills, you enriched the data mart with meaningful aggregations, transformations and imputations. And to cap it all, you performed some brilliant statistical modelling, pushing your personal boundaries. But when you try to communicate this you encounter glazed looks and you feel your effort is not appreciated. Even worse, you learn that the results of your model (e.g. a segmentation or a scoring) are not really bought into by management and the business.
When I was a youngling I taught introduction to statistics courses at the Open University, along with similar courses that were compulsory as part of a degree in psychology. That was invaluable experience in honing my ability to explain and discuss statistics. However, explaining what a regression is all about is rather different from telling people about an analysis and discussing its results. When I worked in mainland Europe I developed and adapted a way of communicating results that was lightly peppered with ‘statistical jargon’. The people I worked with, such as marketing managers and back-office managers, had some statistical training in their past. Not only were they comfortable with box plots, lift charts and stepwise variable selection, they expected to hear about them. Moreover, there was an appetite to explore innovative statistical techniques, as this was perceived as essential to the business’s survival in the market.
When I started working in the UK I had to change the way I talked and presented. The people I worked with did not want to go beyond discussing basic averages. They still wanted sophisticated and advanced analysis and solutions, but they challenged me to communicate them at the ‘shop floor’ level. To an extent that is because some of the clients grew from the shop floor, so to speak. Moreover, there is a tendency to share the results with the field, which is great. The ultimate challenge was supporting a team preparing for tough negotiations with the trade union, where the spirit was to share the facts and the analysis so the discussions could focus on strategy and planning.
The current client I am working for is exploring how to improve the way the analytics team communicates and presents analysis, findings and recommendations. The analysts are asking themselves how to up their game and talk statistics with the business without seeing that glazed look. The challenge is not just the communication to the decision makers but how to gain buy-in from the field. The consensus is that we should not fall into the trap of telling everyone how great the analysis is. Instead the approach should be: “you should trust us to do a good job, now let’s tell you what we found.” The trust in the team’s skills and abilities should be acquired through the daily interactions with the business. A presentation of results should focus on the business and address its pain. It should not be a navel-gazing exercise.
Taking a step back to basics, the key question is “What do they really want to know?”
·         What information sources did we explore? If we covered the data they expected, and then some, this is an important first step in gaining their confidence in you.
·         What are the main findings? They do not want to hear about coefficients and correlations. They would like a high-level summary such as “The number of face-to-face sales visits does not seem to be predictive once we account for X”. Even if they do not like the message, at least they understand it and they know that they should concentrate on X. They might ask for evidence, and you should have the response ready in a format that is appropriate for the audience.
It is a misconception that sleek communication of results flows from the dichotomy between “academics in their ivory towers” and “in-tune consultants”. I witnessed a reputable consulting firm prepare a “deck” of about 300 backup slides for an hour-long presentation. Admittedly, it was their way of getting the team to address questions and document the thought process alongside the findings. However, after weeks of sleepless nights, the result was that no one in the team could remember what it was all about, let alone reproduce the numbers. Moreover, instead of creating more confidence in the analysis it achieved the opposite. Each graph and table needed some time to digest and understand. Many of them simply showed no effect, or statistically significant differences that were not practically significant. The longer this went on during the meeting, the more the feeling was “These clever guys might understand this, but I do not have the time – or are they pulling one over?”
We have just finished a few high-profile modelling and targeting projects for keystone products. Our findings dispelled common beliefs and suggested a new strategy. The first presentation that we prepared was the bog-standard ‘Tell them everything – a graph is worth more than a thousand words’. It did not work. I spent a week reworking the presentation and ended up with 7 slides with mainly bullets and only two killer graphs that together brought the message home. It worked really well. When designing the presentation and the graphs, I harked back to my student days (the first fish were starting to climb out of the sea just about then), when Professor Benjamini (http://www.math.tau.ac.il/~ybenja/) introduced us to Tufte’s work (http://www.edwardtufte.com/tufte/) and discussed other research about the cognitive perception of graphical information (see a high-level overview at http://www.perceptualedge.com/files/GraphDesignIQ.html).
There are a few challenges to keep in mind:
·         Do the correct modelling – an elegant and simple presentation should not be an excuse to discount statistical rigour.
·         It is important to communicate the quality of the modelling. It is not easy for analysts to avoid mentioning correlations, p-values, PPVs, sensitivity, specificity and lift values. However, is it worth spending the time educating the uninterested? It is better to translate into terms they know: “If we use this model we are likely to visit 50% more GPs that will respond positively. Had we applied this last year we probably would have seen an increase in sales of about X pounds.” – Now that is a challenge that merits a paper of its own; a rough sketch of such a translation appears after this list.
·         Communicate the margins of error – managers understand worst-case, expected-case and best-case scenarios.
·         Communicate innovation – that is not easy – keep trying until you get it right for your audience.
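To make the second and third bullets a little more concrete, here is a minimal SAS sketch of the kind of translation I mean. Every figure in it (number of visits, baseline response rate, the lift values, the value of a responder) is an illustrative assumption, not a number from the projects above; the point is only the shape of the calculation: lift at a chosen targeting depth becomes incremental responders, which becomes an approximate sales uplift, reported as worst, expected and best cases.

data lift_in_business_terms;
    length scenario $ 8;
    n_visits       = 10000;   /* planned face-to-face visits (assumed)          */
    base_rate      = 0.04;    /* response rate of untargeted visiting (assumed) */
    value_per_resp = 800;     /* assumed value, in pounds, of one responder     */

    /* worst / expected / best case lift values – illustrative only */
    do scenario = 'Worst', 'Expected', 'Best';
        if      scenario = 'Worst'    then lift = 1.3;
        else if scenario = 'Expected' then lift = 1.5;
        else                               lift = 1.7;

        extra_responders = n_visits * base_rate * (lift - 1);
        extra_sales_gbp  = extra_responders * value_per_resp;
        output;
    end;
run;

proc print data=lift_in_business_terms noobs;
    var scenario lift extra_responders extra_sales_gbp;
run;

With the assumed expected-case figures this works out at 10,000 × 0.04 × 0.5 = 200 extra responders, or roughly £160,000 – which is the kind of sentence a manager can act on, with the worst and best cases doing the job of the confidence interval.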
A good consultant should enjoy the same confidence the public give to their doctors. They are trusted to know and to be experienced. What we want from them is a diagnosis and a solution.
What is the right balance? To bullet or to graph?
The answer to that, I believe, is that the client is always right.