Shake Off These 3 Bad Habits in Marketing Decision-Making


As with many other business disciplines, decisions in marketing are typically informed by some kind of evidence (or so we hope). The evidence base for making marketing decisions can include many different things. For example, we might draw on tried-and-tested frameworks and textbook approaches recalled from our university days, or found through a quick Google search or a read of a Wikipedia page.

Sometimes we'll consult subject matter experts or find research to inform our considerations, and we might even conduct or commission research ourselves, such as running an experiment (A/B testing), fielding a customer survey, or analysing relevant data we've already collected.

Of course, it is also entirely possible that we go with what instinctively seems right for the decision at hand (trusting our gut), with what a “marketing guru” wrote on their blog, or with something that emerges from a vivid, yet entirely anecdotal, story told by a meeting participant.

There’s much to love about the “whole brain” blending of so-called art and science that sits at the core of what marketing is.

As managers, entrepreneurs, business leaders, executives, and, most of all, human beings, we make decisions by integrating the variety of inputs available to us at the time. There is nothing inherently wrong with any of the inputs listed above, provided that we optimise over the entire set of decision inputs by taking into account the strength of each piece of evidence, which in business translates into credibility and relevance.

For instance, there's nothing inherently wrong with gut instinct, and many would argue it can play a role, but alone it is rarely optimal. The same is often said of the other end of the spectrum: meticulously designed research studies can be scientifically excellent yet somewhat detached from the real-world context of the decision at hand. Our job when making important strategic marketing decisions, therefore, is to challenge ourselves and our colleagues to use multiple evidence bases whenever possible and to critically assess the credibility and relevance of each input.


But do you always do things this way? Do you always push your partners, colleagues, and bosses to critically challenge the evidence base on the table? I'd guess not. In my experience, we develop "bad habits" in our decision-making processes. Here are three key ones that, in my opinion, we should try to shake:

  1. An over-reliance on past experiences.

    Humans are creatures of habit, and so too are our customers. This is akin to a form of “homeostatic behaviour” or, put plainly in marketing terms, habitual buying. But it is equally true that something is always changing: the individuals we are attempting to market to, the external environment in which we operate (as we saw recently with the pandemic), or something else. There will be change. This is true even when we use prediction models based on machine learning and statistics. They can do a fantastic job of suggesting what to do next based on what has already happened. But if the forthcoming context is meaningfully different from the context that generated the historical input data, your prediction machine won't be so effective.

  2. Using heuristics when working in unfamiliar territory

    We all try to use heuristics — decision shortcuts or rules of thumb — because they can be highly effective and efficient in many facets of our personal and professional lives. But heuristics aren’t usually all that helpful when faced with decisions for which there is no precedent or sufficiently similar prior experience to call upon.

    When we’re in unfamiliar territory, such as during the pandemic with respect to consumers’ purchasing behaviours in certain product categories (e.g., toilet rolls and pasta), or when we start incorporating new types of digital advertising into our media plans, shortcuts and rules of thumb won’t really help us make well-informed decisions.

    Heuristics work well when we have learned how different patterns of inputs tend to be associated with certain outcomes time and time again. We learn from experience (or we build machine learning models to “learn” for us). But when there’s little experience to learn from, such as when we’re operating in unfamiliar territory, or, critically, when pertinent market conditions have changed, our human tendency to rely on the same old decision heuristics we’ve always used is unlikely to serve us well.

  3. Extravagant extrapolation from research, white papers and books

    The bigger concern arises when we stretch findings from, say, a carefully conducted research study beyond the context in which it was conducted. Empirical studies are great, but they are always limited: they rarely cover every situation, every possibility, and every context. When using this type of evidence, however, we often don't think much about this perennial limitation, which leads to what I call “extravagant extrapolation”.

    Think of it this way. We find a piece of evidence in a published study and we like what we hear. It fits our world view, or gives us what we need to win an argument. It might even be remarkably straightforward and easy to explain (and remember). What can happen next (and certainly has happened in our industry) is that we adopt it there and then. We might even treat it as a “law of marketing” akin to the laws of physics. (As an aside, as appealing as the thought might be, the social sciences have no “laws” akin to those of the physical sciences, because human social systems are inherently stochastic and complex and cannot be codified that elegantly.)

    The problem with this is that the research might not fit our situation. Too bad if you’re in the business of selling cat food in the UK but the study’s findings were based on advertising and sales data for industrial forklifts in Australia. Unfortunately, if we like what we hear and the source seems credible, we might inadvertently fall into this trap.
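The first bad habit above, over-reliance on the past, is exactly how prediction models fail when context shifts. A minimal sketch of the point, with entirely invented numbers: a trend model fitted on historical sales keeps projecting the old pattern even after the underlying conditions change.

```python
# Toy illustration (numbers invented): a simple trend model fitted on
# historical weekly sales extrapolates the old pattern regardless of
# whether the context that produced that pattern still holds.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x (pure Python)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# "Historical" data: steady growth of about 10 units per week.
weeks = list(range(8))
sales = [100 + 10 * w for w in weeks]
a, b = fit_linear(weeks, sales)

# The context then changes (say, a demand shock): actual week-8 sales
# collapse to 60, but the model still projects the old trend (180).
predicted_week8 = a + b * 8
actual_week8 = 60
print(f"predicted: {predicted_week8:.0f}, actual: {actual_week8}")
```

The model isn't "wrong" about the past; it simply has no way of knowing the future no longer resembles it, which is the author's point about prediction machines.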

There’s nothing inherently wrong with gut instinct and many would argue it can play a role, but alone it rarely will be optimal.

Given these three bad habits, what then should we, as thoughtful professionals, try to do more of?

  1. Be professional sceptics

    Professional scepticism is helpful when evaluating the evidence on which we base decisions. It puts us into a mindset of “thoughtful scrutiny” and reminds us and our colleagues to think carefully about the strength, credibility, and relevance of the evidence at hand. It doesn't mean assuming everything is wrong or untrustworthy, however.

  2. Think of proposed answers as “hypotheses to be tested”

    In marketing practice, that means being willing to test or experiment a bit before fully jumping in on something. Test and learn often can mean trying out lots of things to see what works. There’s nothing wrong with that. But “hypotheses to be tested” implies there’s a hypothesis (or hypotheses) as a starting point. A real test isn’t throwing a bunch of darts at the board blindfolded. A real test starts with some sense of what is being tested, why, what will be learned, and what courses of action might subsequently be influenced. Sometimes the empirical test is hard to do — because of limitations due to time, cost or data availability. Nevertheless, challenge yourself to think in this way.

  3. Be data informed, not data driven

    The famous statistician George Box said: “Essentially, all models are wrong, but some are useful.” Being data driven implies that we let the data (or rather, the statistical models run on the data) decide what we should do, wrong models and all. Being informed by data (and analytics) instead helps us get around this problem: we put weight on what our analysts' and data scientists' models tell us, but we pass that through a human filter or sense check. Healthy scepticism comes into play again here, and so, believe it or not, does gut instinct. A good empiricist always thinks about context when designing a valid hypothesis test or reviewing existing empirical evidence. Ideally, your test will be run in the same context in which your decisions will play out, so as to be as realistic as possible. That makes sense and should be straightforward to accomplish.
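The “hypotheses to be tested” idea above can be made concrete with a small sketch. Everything here is hypothetical: the conversion counts are invented, and the two-proportion z-test is just one common way to analyse an A/B test; the point is that the hypothesis (“the variant lifts conversion rate”) is stated before looking at the data.

```python
# Hypothetical A/B test: hypothesis stated up front ("variant B's new
# call-to-action lifts conversion rate"), then checked against the data.
# All counts below are invented for illustration.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test; returns (z, one-sided p-value for B > A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))  # normal upper tail
    return z, p_value

# Control: 200 conversions from 5,000 visitors; variant: 260 from 5,000.
z, p = two_proportion_z(200, 5000, 260, 5000)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

Note what the statistic does and does not tell you: a small p-value supports the stated hypothesis in this context, but, per the “extravagant extrapolation” warning above, it says nothing about a different audience, product, or market.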

Resist simplification and embrace change

As a final suggestion, I strongly encourage all marketing professionals to resist the urge to oversimplify things to the point where we believe there are certain immutable truths about customers and markets, or a set of unchanging “laws” of marketing. What we know from past experience, research, and so on can offer valuable, useful guidance. But it cannot tell us what to do in every situation. It can help us reach a solution, and indeed some problems and their solutions will, over time, look remarkably similar, almost constant. But as the world changes, our customers and competitors change. And as they all change, so must we.

If you want to know more about how to build a brilliant brand and improve your marketing game, schedule a discussion with our experts.
