Data Science in the Real World
Finding Business Value in Simple Models
How a general linear model succeeded where complex models failed
Elliott Stam · May 22

To this day, the most useful predictive model I’ve seen in the wild was a general linear model.
It wasn’t the most powerful model, it didn’t make the most accurate or precise predictions, and the engineers in the room didn’t get excited talking about it.
However, business users trusted and understood its outputs, and the model was absorbed into their decision-making processes.
I believe there’s a good explanation for why this model was embraced by the non-technical teams, succeeding where numerous complex models failed.
The model accomplished something more valuable than regurgitating recommendations or labels — it facilitated a conversation between teams who needed to make important product decisions.
As simple as the model was, its impact rippled through the business.
I see a few valuable lessons that can be extracted from this outcome, and throughout this article my goal is to illustrate those lessons.
Lesson 1: Give non-technical teams a reason to care

There is often a wall that stands between technical teams and business teams.
Smart people live on both sides of that wall, but in general they speak different languages and are driven by different motivations.
What seems important to an engineer might appear trivial to a product manager, and vice versa.
If we don’t all share the same perceptions of what’s really important, it’s natural that wires will be crossed and miscommunications will happen.
This has certainly been true for many projects I’ve contributed to.
This phenomenon isn’t exclusive to technical and non-technical people; this happens all the time when consultants are producing work for their clients.
If we don’t continuously align on expectations, the paths of what the customer wants and what is actually delivered tend to naturally diverge.
Something we can learn from this is: the way you discuss your model matters.
You can frame the discussion differently if you are speaking to a technical audience versus a non-technical audience, taking care to emphasize the points that matter to your audience.
You might not be interested in the details discussed in a business development presentation, so don’t assume anyone shares your interest in the technical details of your model.
I believe this helps explain why a general linear model was able to capture the attention of decision makers, even while ‘better’ models were ignored.
The linear model was digestible and it gave the business a reason to care about it because it clearly illustrated relationships between past decisions, customer behavior, and business performance.
In that situation, the model was able to address the questions the business users had and it was better equipped to feed into their decision-making process.
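To make the idea concrete, here is a minimal sketch of the kind of model being described: an ordinary least squares fit whose coefficients read as plain statements about business drivers. The data and variable names (discount depth, email campaigns, revenue) are invented for illustration and are not from the project in the article.

```python
import numpy as np

# Hypothetical weekly data: two drivers the business already reasons
# about, plus the outcome they care about. All values are synthetic.
rng = np.random.default_rng(0)
n = 200
discount = rng.uniform(0, 0.3, n)        # fraction off list price
emails = rng.integers(0, 5, n)           # campaigns sent that week
revenue = 100 + 80 * discount + 12 * emails + rng.normal(0, 5, n)

# Ordinary least squares: an intercept plus one weight per driver.
X = np.column_stack([np.ones(n), discount, emails])
coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)
intercept, b_discount, b_emails = coef

# Each weight reads as: "holding the other driver fixed, one unit of
# this input is associated with this much revenue." That is a sentence
# a non-technical stakeholder can question, debate, and act on.
```

The appeal of this form is exactly what the article describes: the output is a small set of named relationships rather than an opaque score, so the model feeds a conversation instead of replacing it.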
Lesson 2: Emphasize the results, not the technology

As the tools our technical teams have at their disposal continue to advance, it becomes increasingly important to consider the growing divide between our technical and non-technical teams.
For example, think about people who enjoy a hobby you don’t, and consider how quickly you tune out of their conversations when they start tossing around jargon you can’t understand.
Now consider how a technical person’s vocabulary differs from that of their non-technical peers.
That doesn’t have anything to do with intelligence, but it does mean that there are different ‘business dialects’ depending on which team you belong to.
Despite the differences between various disciplines in the workplace, the teams working there all exist because they serve some function that benefits the business.
Just as you might find common ground with someone who doesn’t speak your language when traveling in a foreign country, you can find common ground with your co-workers regardless of the function you serve.
The key is to not lose sight of the common goals you share.
Therefore, when developing a predictive model which is intended to further the goals of the business, it follows that you should take a step back and consider what the model means to your business counterparts.
You might have fallen in love with a new framework, or spent late nights wrapping your mind around a new algorithm, but does that really matter to the business? I would argue that it does not.
Yes, you should celebrate your successes with other peers who are also excited about the same niche that excites you, but a predictive model is only useful if it gets used.
The business will only use your model if they are convinced it produces results, so focusing on the results appears to be a winning strategy.
The general linear model that impressed my client’s business users didn’t impress them because of its implementation or because of the programming language it was written in.
It impressed them because it produced results they could integrate into their decision-making process.
The results were presented to the business in jargon they understood. This was an example of the data science team checking its ego at the door and meeting the business on common ground to advance the goals of the entire company.
Lesson 3: Interpret the results as a guide, not as the answer

I tend to assume that a significant amount of bias exists in my models.
Working very closely with data over the years has had a sobering effect on me, now that I know how the sausage is made.
There are so many assumptions that go into building a model; there are assumptions when producing the data, collecting the data, cleaning the data, processing the data, and finally there’s the assumption that analyzing historical data accurately predicts future behavior.
Like a game of ‘telephone’, this system of assumptions almost ensures that the outputs are flawed.
For that reason, I’ve learned that it’s often better to treat the results of models as a guiding beacon rather than a certain truth.
We’re flying through the fog and our model’s results are the lights guiding us, but they are not necessarily the final destination.
I find that this approach resonates with the decision-makers in the business.
If the data is flawed, and all of the intermediate systems the data travels through are flawed, it would be ill-advised to interpret a model’s results as pure truth.
Pair that with the fact that many people don’t like being told what to do, and you have a recipe for non-technical people ignoring you if you tell them to listen to your model.
However, if you treat your model as a guiding hand for conversations, and you can demonstrate that the relationships your model suggests are real, then you are in a strong position.
That is a recipe that tastes better to the non-technical teams, who may not necessarily believe that data is the answer to all things.
This is one of the reasons the general linear model was such a success.
Rather than claiming to have all the answers, the analytics team that developed this model was careful to focus on the relationships their model identified.
Business users responded well to the model’s results, and I think their positive reaction was likely due to the model quantifying certain relationships they were already aware of on some level.
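One practical way to present a relationship as a guide rather than a verdict is to attach an uncertainty interval to each fitted coefficient. The sketch below (with invented data) computes a classical OLS standard error and an approximate 95% interval for a slope; the framing "roughly this much, plausibly between these bounds" is an assumption about how the article's team might have phrased it, not a detail from the source.

```python
import numpy as np

# Synthetic data for illustration: one driver x, one outcome y.
rng = np.random.default_rng(1)
n = 150
x = rng.uniform(0, 10, n)
y = 3.0 * x + 5 + rng.normal(0, 2, n)

# Fit an intercept and a slope by ordinary least squares.
X = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef

# Classical standard errors and an approximate 95% interval
# for the slope: residual variance times (X'X)^-1.
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(cov))
lo, hi = coef[1] - 1.96 * se[1], coef[1] + 1.96 * se[1]

# Framed for the business: "each unit of x is associated with roughly
# coef[1] more y, plausibly between lo and hi" -- a beacon, not a law.
```

Reporting the interval alongside the estimate makes the model's humility explicit, which is exactly the posture the article argues wins over non-technical teams.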
Lesson 4: Good models don’t always ensure good decisions

The final anecdote I will leave you with demonstrates that even a well-designed model can lead to poor results, depending on how it is interpreted.
In the case of the general linear model I’ve been referencing, several months after its initial success differing perspectives began to emerge regarding how to capitalize on the relationships the model helped quantify.
For example, one of the insights the model pointed to was that customers who bought product ‘X’ were associated with behavior that tended to lead to other lucrative purchases.
In summary, the conversation eventually became hyper-focused on how to sell more of product ‘X’.
The resulting business decisions optimized for selling more of product ‘X’ at all costs, against the advice from the analytics team, and the results were not nearly as profitable as the business expected.
The strategy they chose led to unintended behavior in their product marketplace, which hit the reset button on the relationships the model had revealed.
From my perspective, this was not a failure of the model.
The model had correctly identified the relationships that existed.
The problem was that the business, having tasted the success of the initial insights, jumped the gun.
They temporarily bought into a data-driven process, but then quickly removed those same data-driven processes from their decisions once they felt things were moving in the right direction.
Conversations soon became political, and different camps developed within the business teams.
People fortified their defensive positions, and the ensuing miscommunication detracted from the initial success of the predictive model.
I’m sure there are many more lessons we could unpack from this situation.
My intention here has been to illustrate what technical teams, specifically analytics and data-oriented teams, can do to improve the odds of their work being utilized by the business.
Although the outcome described here wasn’t exactly a fairy tale ending, the model was uniquely successful in its ability to penetrate strategic business conversations.
At the end of the day, I think this company would have benefited more if their business units had continued to collaborate with the technical team that built the predictive model.
Perhaps they could have identified some poor assumptions they were making about placing all their eggs in the basket of increasing product ‘X’ sales.
Generally, my opinion is that analytics teams should have a stronger voice in the decision-making process.
However, that’s a chat for another day.
I hope you enjoyed reading! Thanks for tuning in.