Real Life Examples of Miscommunication Between Business and Analytics Teams
Jul 09, 2022
Every day in corporate America...
Business leader: Wait, what is this?
Data scientist: Um, the results you asked for.
Business leader: I wanted a comparison of program performance.
Data scientist: Yes, this compares our performance to other programs.
Business leader: I needed a comparison of performance now to before we had it…
Data scientist: Oh, that isn’t how we looked at it. We’ll go back.
(Walking away) Business leader: Why don’t I ever get what I need?
(Walking away) Data scientist: Why can’t he ever make up his mind?
Now multiply that by thousands of companies. Every day. All year. Every year.
Any one conversation where we answered the wrong question may seem relatively inconsequential. After all, mistakes happen. Right?
Well, that might be true if it really was the exception. It’s not. Rework is the norm. As we have discussed before, over 80% of analytic projects in companies produce No. Measurable. Value.
And the consequences of these simple—but repeated—misunderstandings are significant. Besides the wasted time, effort and money, repeated miscommunication often leads to antagonistic relationships between the teams.
Today, in thousands of companies, we have highly paid, talented professional teams who barely get along, each believing that the other group is making their lives difficult on purpose. It doesn’t take much imagination to realize the scope of this dysfunction and its immense impact on how businesses operate.
Some causes of miscommunication that we can avoid
Let’s examine some of the typical ways that miscommunication occurs.
These occur at the “hand-offs” between business and analytics: initial request, updates, and delivering results.
You will notice that these examples of poor communication not only cause problems during a specific project; experienced repeatedly, they also erode trust and damage the relationship over the long run.
Fly-bys and fire drills
I know a CEO who is notorious for sending cryptic requests with no context or background.
He expects analysts to drop everything else and provide detailed results, causing the team to guess what the request is about and scramble to answer the presumed question.
Oh, and he doesn’t encourage dialog for clarification.
Analysts work late and on weekends to provide the answer. As often as not, the answer is not what the CEO had in mind, which puts everyone on edge.
In one case, he urgently needed information about turnover rates among front-line workers. The analytic team hustled to gather details about turnover trends across locations and job types, hoping this was what the CEO needed. Other deadlines were put on the back burner.
After delivering the results, the team heard nothing. A few weeks later, the analytic team leader asked the head of HR about the project. “Oh, that. Someone made a comment to him about turnover being out of control at the Huntsville site and he freaked out. He found out it wasn’t really a problem before he even got your report.”
I know of another CEO who would ask two different analytic teams to answer the same question in order to see whether one would give her a preferable answer.
After being pitted against one another one too many times, the teams figured out how to collaborate to protect each other.
But, behind the scenes, they did twice the work unnecessarily. And resented it.
These are only a few examples where business leaders do a “fly-by” without sufficient information or cause disruption with unnecessary fire drills.
Many are less severe than these. But any time we toss out a request without context, it has the potential to waste effort and go sideways.
Tossing grenades

While business professionals can generate a frenzy of activity with unclear requests, data science professionals initiate a parallel problem: tossing grenades.
In these instances, analysts mention or deliver a potentially-alarming result without sufficient explanation.
Take this example. An analytic team spent many weeks evaluating the impact of a health promotion program. The company wanted to position these results in a marketing campaign and leaders were anxious to see the findings.
In a meeting to present initial results, the lead analyst began with a summary that was technically correct: across everyone eligible for the program, there had been no measurable positive impact. As he shared the graphic, you could have heard a pin drop. And then the panic set in. The group erupted with questions, and marketing began to consider damage control.
This was a crisis. A failure!
But really it wasn’t.
Eventually, another analyst stepped in to bring order to the meeting.
After everyone calmed down, she explained that those who did participate in the program experienced notable improvements in health.
However, the participation rate was lower than hoped.
This (more complete) set of findings helped the company focus on what was working (the program) and what needed to improve (enrollment).
However, the initial shock remained, creating a lingering distrust between the teams.
While the initial result was not incorrect, it was not the full picture and it generated needless concern.
When this happens, it indicates that someone is either unaware of or (worse) doesn’t care about what is most important to the company.
Hurry up and read my mind
In my experience, the most common communication failures result from things happening too quickly.
Rather than taking a few minutes to better define the request, or understand the purpose of the request, management states a need and the analytic team says, “sure!”
This isn’t a case where information is being withheld intentionally; it simply doesn’t occur to either team (or they don’t exactly know how) to elicit more detail.
For example, the business leader says: I need a comparison (like the first example above); or we need an ROI; or I need to know if our product is better; or how long does it take for our service to have an impact? And instead of slowing down to ask more about what the leader is trying to accomplish and how the information will be used, the analyst begins working.
I have seen all of these terms—comparison, ROI, better, duration, impact—interpreted differently by the businessperson and the analyst. As a result, the first answer wasn’t what the requester wanted.
In these cases, everyone had good intentions, but the initial request wasn’t clear.
As we’ve talked about before, this is often because the business leader hasn’t thought it through. Or the words they use don’t mean the same thing to the analyst.
Either way, a short discussion using a sequence of big picture and confirmation questions can dramatically increase the likelihood of mutual understanding.
Lost in jargon

Every profession has its own terminology. The more advanced and specialized the profession, the more unique its terminology is likely to be. It makes sense.
Speaking with each other, pilots need to use exact words about their position and equipment. Neurosurgeons need to describe the specific anatomy of the brain and type of technique they will use. Finance, Sales, Marketing and Analytics all have their own insider jargon.
Inside a profession, such language is essential.
Between professions, however, insider communication can have the opposite effect, by confusing or alienating outsiders.
And, while I’ve seen people of every profession use jargon more than necessary, data scientists and academics seem especially prone to using fancy terminology.
I’ve had colleagues who seemed to delight in using words like trichotomize (split into 3 groups), heteroskedastic (inconsistent variability across other factors), or multicollinearity (where things are closely related to each other) despite having non-researchers in the room. My sense was that they thought using those words made them look smart (note: it doesn’t).
Other analysts use technical terms because they were trained that way.
Rather than converting to plain ol’ English, they describe their methods in detail, and deliver content that emphasizes coefficients and P-values.
As a result, their non-data colleagues fail to fully comprehend the results.
If our goal is to find common ground and enhance understanding, it’s important to choose language carefully. We can describe the meaning of an analytic result without describing the methods by name, in technical detail. On the business side, we can describe the goal of a project without using business acronyms.
Example: Some of my training was in epidemiology. There is an abbreviation in public health, STD. It stands for Sexually Transmitted Diseases. When I had my first position in corporate health, I listened to the medical director describe how almost 12% of employees were out with STDs and that the average absence was 40 days. Although I remained silent, I was shocked to learn that so many people had such serious STDs! Who knew?
It was only in my second meeting that I learned that STD meant Short Term Disability, insurance that covers salary during any serious illness. I felt silly. But it’s just one (drastic) example of jargon that we fail to explain to each other.
Everyone can do better
Communication failures happen because we don’t dedicate the time, energy and attention to understand or to be understood.
Perhaps we feel like our time is more important than someone else’s. Or we feel that they should be able to “get it” without us going into more detail. Or we make a request before the idea is fully formed.
Let’s all admit that everyone’s time is valuable.
As colleagues, we can learn skills to help unpack what another person means. We can be patient and solicit more input before jumping into action. We can take care to deliver information and use language that helps positive communication, instead of causing confusion or alarm.
Analytic translators focus on making sure teams get and give information in constructive ways, avoiding poor communication habits. That’s why they are so important.
Think about hiring or becoming an analytic translator to make your teams successful.