Jonas Christensen 2:26
Ikechi Okoronkwo, welcome to Leaders of Analytics. It is so good to have you on the show today.
Ikechi Okoronkwo 2:34
Thank you, happy to be here.
Jonas Christensen 2:36
I am very excited about this episode because we are going to be talking about something that's interested me for many, many years. But I've never quite had the opportunity to be exposed to it on the other side of the fence, which is marketing analytics from an agency's point of view. And it's really a fascinating world of how we use analytics to optimise the way that we market, to advertise to people around the world. And you are a true expert in this field. So we look forward to hearing from you. But before I put too many words in your mouth, you have a very interesting and varied background. Could you tell us a bit about yourself, your career background and what you do?
Ikechi Okoronkwo 3:18
Absolutely. Yep, so, after I finished my undergraduate studies, I knew I wanted to work - well, before I finished, I really knew I wanted to work in a data driven field. I've always been drawn to storytelling. But I also realised that being a quant and using numbers to paint a picture is quite effective, because people always like proof points to justify the things that you're doing. So I worked in a couple of different areas. I worked for the New York Academy of Medicine, where I did a research programme to help improve certain parts of New York City and worked with the Clean Needles Program, where we were mapping out different parts of the city to help improve the programme and how they distributed services. So that was a really cool first, kind of, foray into research and using it for something tangible and meaningful. I also worked in the financial industry, for Standard & Poor's. And then I reached a point where I decided I needed to do something different. I wanted to use the knowledge that I had, or the knowledge I was accumulating, in a more creative and bigger way. And I was always drawn to the advertising industry. So, I went back and got my MBA, my Master's. And through that I got exposed to different types of methodologies to do the work that I was doing. And you know, there was a really important inflection point in my career, which was a case competition that happened towards the end of my MBA degree. I was doing different things during my MBA, but, you know, there was a case competition where we were brought in to analyse Super Bowl buzz and use that to make decisions, and it was something that I hadn't done at that scale. You know, I was introduced to the GroupM network through that competition. And that experiential learning opportunity really helped me fall in love with the advertising industry and how we can use data to power insights and decision making.
So for me, my team won, so that helps in terms of inspiration and motivation, but going to work at GroupM, getting exposed to the different agencies and creative agencies and content and innovation teams and all the different things that we can do all together, bringing it together for clients has really been an interesting journey and something that I really value. So, you know, I think it wasn't a straight line. It's not something I knew I always wanted to do. But I think that all those different pieces of creativity, data, innovation - and culture, I think is really important too. You know, advertising really drives culture. And so those are the things that I kind of latched onto as I formed my journey in this field.
Jonas Christensen 5:57
So you mentioned GroupM here, and you work for Mindshare today - I can reveal - which is a subsidiary of GroupM. So this case competition led you to be employed by GroupM. Is that what happened at that time?
Ikechi Okoronkwo 6:10
Yeah, that's correct. So, they had a partnership with Pace University, which is my alma mater for my MBA programme. And they put on this competition with four teams of four students per team. And the whole point was to evaluate how we approached the problem they put in front of us, the case they put in front of us. The teams that did really well - the winning teams and individuals on those teams - were brought in for interviews, and a bunch of us secured roles within GroupM. And, you know, I spent the first couple of years at GroupM before I moved over to Mindshare, to start - or rather grow - the advanced analytics practice there. It was a really great opportunity and we've continued doing that, actually. So, ever since we did that competition while I was at GroupM, and then when I moved to Mindshare, we kept doing the competition. We expanded to different schools. And so we now work with Columbia, Baruch and Fordham. We've continued to work with Pace. We've expanded to other schools that are not as close to the city, like Simon Business School. And we've hired tons of really talented individuals from these programmes. So it's something I'm very passionate about, and I think it is a really important tool for recruitment, as well as community outreach, especially when it comes to data and analytics. Because the nature of the work that we do requires some level of showing your work in a certain setting. And I think it's important for analysts to not just believe that they're sort of back office, building models and then handing them off to somebody else. The most impactful strategists are those who can deeply understand the data and the work that's being done, and then tell stories and inspire and advise and persuade. So you know, I can talk for hours about experiential learning and case competitions, because it's something that I've personally benefited from early in my career.
And I've seen how it's been a very valuable tool for others throughout my role when I'm on the other side of the fence, giving people that opportunity.
Jonas Christensen 8:11
Yeah, maybe let's not talk about it for hours. But let's indulge ourselves a little bit, because I also experienced that type of learning environment - case competitions - as a university student many years ago, and I really valued the opportunity to turn theory into practice, but also to show that I could do things that weren't necessarily what the book said at university. And at the same time, for the companies involved, it's such a great opportunity to identify those very creative thinkers that are otherwise hard to pick out of piles of resumes and so on. What is it that is so valuable in case competitions, that you've seen and experienced in your career, that has made you such a proponent of them?
Ikechi Okoronkwo 8:57
Yeah, that's a good question. And I think there's many things, but if I can really zero in on the core value, the thing that makes it stand out, I think it's the ability for people to show different sides of their competency. And what I mean by that is: It's one event, you know, it's a case competition. It's a presentation, but it's multi layered in terms of the information you extract from it. So let me explain what I mean by that. Let's say you give people 24 hours - a hackathon approach - or you give them two weeks to come up with a solution. From the quality of the deliverable and the presentation, you can tell a lot about the people, the team dynamics and their thought process, all from that one meeting. You mentioned, you know, people coming up with creative ideas and practical application. It tells you another story of ''Okay, if I was to work with this person, here's how their output would look''. When you're in an interview, for example, people are telling you what they can do, telling a story about themselves. In a case competition they're showing you and telling you. And I think a case competition is also really important because in an interview, you're asking someone a question and they're responding. You can give them potentially a case on the spot. But in a case competition, you're very specifically asking them for things that are important, and giving them time to work on it, which represents the actual real world. So when we talk about practical application, it's a much more practical evaluation tool, whether it's to see who's a star performer, whether you're using it from an interview standpoint, or even for the person on the other side of it. They can extract more from themselves through that approach, through a case competition. So, you know, the answer is it's many things. But the way I will crystallise it is: It is a multiplier effect.
The case competition tells you so many different things, just from that one activity. And we can talk about this as a separate topic, but I think it's also something that's going to revolutionise how we think about talent acquisition and how we think about the interview process. Because to be honest with you, with the people that we found through the case competitions, it was a no brainer. Everything that we were trying to extract from the interview was all right there and then. And when you're trying to do it at scale: a case competition can have, let's say, four teams of four students each, versus interviews, where you have to do multiple rounds to get to 16 people. With a case competition, you do it all in one shot. So, you know, there's a lot of things, but I'd say the multivarious output is the most valuable thing about a case competition.
Jonas Christensen 11:40
Absolutely, I can only concur with all those points, both as a former participant in case competitions, but also a hirer of many analytics talents. It is very hard to get to that detail without actually some sort of applied example. Now Ikechi, let's get back to you and the topic of today, which is data driven marketing. So, we heard that you started at Mindshare, and today you are the Executive Director, Managing Partner and Head of Business Intelligence and Analytics at Mindshare. So you've obviously risen in the ranks in the organisation. Could you tell us about the company and your role there? And perhaps what kinds of problems you solve for your customers?
Ikechi Okoronkwo 12:23
Yeah, absolutely. So, Mindshare is a global media agency network. As you mentioned, it's part of the GroupM network, as well as the WPP network. So there's many different companies. And as an agency, we kind of bring all of those things together as a full service offering for clients. So we're talking about strategy, planning, trading and investment, content partnerships, account leadership, but really where we differentiate ourselves is our specialised services. So these are things like dynamic creative optimization, technology consulting, and then where I work, which is data analytics. And, you know, Mindshare recently completed a merger with Neo, and that's really to enhance our performance chops. The reason why I'm bringing that up is, in everything that we do, we're trying to add in data driven, empirical evidence for clients, especially clients who are looking for really hardcore performance and proof points. And, you know, part of our branding is driving good growth. So it's not just about driving growth for companies, but doing it in a socially responsible way. And I'm sure we'll talk about that throughout this conversation. Another thing is, we also have a neuro lab. So we use neuroscience as well to look at different signals and metrics that we can evaluate to optimise our clients' investment. So when you think about my role in the organisation, it is to make sure that, when we're thinking about the people, the process and the products that we use to take data and extract value from it, I'm really part of making sure that that is a rational, scalable and effective system. My stock in trade is that I'm an advanced analyst. You know, I'm a model builder, using different programming languages to clean the data, model the data and visualise it.
But my role has evolved now to where I'm really kind of a business developer, right. So, we're developing a commercial model and an operating model to make sure that multiple people can extract value. Because a lot of organisations just, kind of, hire smart folks - expensive, smart data scientists - but sometimes they end up hitting the wall, and there's no vision, and the work that they do is not always very closely tied to the mission of the organisation. So, sometimes people say ''Oh, we're just gonna use creativity. We don't need data''. But the reality is, when that system is properly built and well informed by all the other parts of the organisation, it really can extract tons of value. And you know, I'm proud that Mindshare has really taken that seriously, both in investing in the tools and technology to be able to do that effectively, but also culturally. ''Data driven'' really is a buzzword; people say it because they think they're supposed to. But if it becomes a culture that is embedded in the way we make decisions, then we can do that better for clients, right. So those folks who act, or you know, pantomime being data driven, their clients are not really going to see that value. But those of us who organise ourselves intentionally make sure that when we are analysing a client brief, we ask: What historical data are we using to, number one, pull out the right kind of themes? How are we translating that into setting budgets? How are we translating that into measurement on an ongoing basis? How are we using that to quantify attribution, and then use that to optimise the things that we're currently doing, and then, you know, test and learn, rinse and repeat? All of those things require a very stable organisation and an intentional setup to do effectively, and also to work within the very complex martech and adtech ecosystem that we're operating in. So, you know, I'm helping to navigate that.
The things I'm focusing on a lot are, you know, making sure that our data layer is properly managed from a data governance standpoint, and then using advances in AI and machine learning to extract value from the data, to automate that process and make it more efficient, and to be an enabler of the system, rather than just some new shiny object. And I think it's really quite exciting. People don't always think of data and analytics as the most exciting part of the organisation, because it does require a lot of detail and process that sometimes is not the most fun. But once you get through all the fundamentals, that's when the fun starts, right. When you have all these things set up, when you have the right data layer, when you have the right tools, then your creativity is limitless. So I think we're at that point now at Mindshare. There's been a lot of hard work, and maybe not the fun part, to set things up. But now, I mean, we're rocking and rolling. And I can talk about a couple of examples as well. But I'll kind of stop there for a second.
Jonas Christensen 17:05
Yeah, let's walk through how you get from data to insight to strategic implementation of various campaigns and so on. So I'm interested Ikechi in what kind of data you collect. I am sitting here imagining you getting some from third party platforms, Google, AdWords, etc. Some from your clients, and some you generate in the house. What is that data landscape that you have and the different assets that you combine to get your insights? Could you talk us through that maybe with an example or two?
Ikechi Okoronkwo 17:41
Yeah, sure. So as you can imagine, there's so much data to collect when it comes to marketing specifically, because we're working in so many platforms. And in all those platforms, you have hundreds of metrics, and those metrics are being reported every second, every minute, right. So, there's so much data. An example is our buying platforms: We have data from our spend reconciliation platforms. We have data from the different websites that our clients use to communicate to consumers. We have data from measurement partners, right. So partners who are able to tag the media or, you know, provide services to measure attention, to measure reach, to do sales and brand lift studies. So I'll just lump measurement partners in one big bucket. We have data from a variety of vendors who help us to understand identity. And that's not even talking about the clients' data. Some of the clients have their own first party data, which is more data on the consumers, and that can take many different shapes and forms. But, you know, an example I'll use of a very valuable data asset we have is Choreograph. Choreograph is a GroupM organisation that allows us to collect data from various third parties, which gives us a view of - just using the US, for example - you know, 270 million consumers, and we're able to understand their behaviours and their buying patterns. But I have to point out that we do this in a very privacy compliant and socially responsible way. There's been a lot of news recently about data vendors that are not following very ethical approaches. You know, we take that very seriously. And due to our scale, we're really also influencing the industry to do that as well. So we don't use certain datasets, because at the end of the day, we don't want to be in the business of making money or making value in a way that hurts the culture or hurts society. We can do that in a way that makes advertising work better for people.
And so due to our scale, we can make those decisions. And I think it's something that we'll do anyway, either way. But that's an example of a data asset that's really powerful, because now what you can do is stitch it with clients' first party data and create a more nuanced view of the consumer. And then what you can do with that is things like propensity analysis: to understand people's probability to churn, people's probability to buy different products, people's probability to consume different types of media in different places. So you can deliver creative assets in the right moments of receptivity, or at the right point of purchase. So there's a lot of possibilities for how that data can be used. You know, that's a tangible example. But from an agency standpoint, we have so much data around the media activity and spend. And really, those are what I would call the independent factors, the independent variables that you're using, because, you know, you're measuring the probability for something to happen, or you're measuring it against sales. And so what we're optimising is how we spend, and where we place it, and then what the impact will be, and then we can optimise, right. So, a lot of the independent factors are data that we have as an agency, because we are the investment arm for clients in terms of media.
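The propensity analysis described here can be sketched in a few lines. This is a hypothetical illustration, not Mindshare's actual stack: synthetic consumer features stand in for the stitched first-party/third-party data, and a plain gradient-descent logistic regression produces the purchase-propensity scores used to rank an audience.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic consumer features: stand-ins for stitched first-party and
# third-party attributes (recency of last visit in days, media exposures
# last month, category purchases last year). Entirely made up.
n = 500
X = np.column_stack([
    rng.exponential(30.0, n),           # recency (days)
    rng.poisson(5, n).astype(float),    # exposures
    rng.poisson(2, n).astype(float),    # past purchases
])

# Hidden behaviour used only to generate labels: exposures and past
# purchases raise purchase probability; long recency lowers it.
true_logit = -1.0 - 0.03 * X[:, 0] + 0.25 * X[:, 1] + 0.6 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

def fit_logistic(features, labels, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression, no external ML libraries."""
    Xb = np.column_stack([np.ones(len(features)), features])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - labels) / len(labels)
    return w

Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise before fitting
w = fit_logistic(Xs, y)
scores = 1.0 / (1.0 + np.exp(-np.column_stack([np.ones(n), Xs]) @ w))

# Rank consumers by modelled propensity; the top decile would be the
# audience prioritised for the next campaign flight.
top_decile = np.argsort(scores)[-n // 10:]
print(f"mean propensity, top decile: {scores[top_decile].mean():.2f}")
print(f"mean propensity, overall:    {scores.mean():.2f}")
```

The same scoring step is where "moments of receptivity" come in: instead of purchase, the label could be consumption of a given media channel at a given time.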
Jonas Christensen 20:53
So we're talking here about data that's being used for attribution. Using attribution modelling, perhaps, that attributes different advertising spend and marketing activity to specific sales or trends in sales? Is that the sort of scenario that you're describing here?
Ikechi Okoronkwo 21:10
Yeah, I would say what the modelling does is it unlocks your ability to make data driven decisions around things like budget optimization and performance simulations. So for example, say you're planning with this media plan and you have different comms objectives: drive awareness, drive consideration and drive conversion, right. Let's just say the conversion event is sales. You can use these models to quantify attribution against each of those different outcomes, and then be able to run optimizations to say ''Okay, with different levels of spend, different media mixes, different tactical approaches, how do we affect each of those outcomes?'' So some clients will come to us to really drive just one outcome. But as a large media agency, we're usually answering full funnel types of questions for clients. So, it's across the brand-to-demand spectrum. And the way to think about it is: Okay, they give us money. They're like, ''Okay, where should I invest it?'' We use these models to help them understand that. They ask us ''Okay, where's my next best dollar spent?'' Okay, we can help you answer that. And then it's like ''Alright, now that we've done that, is it efficient?'' We can help you answer if it's efficient. They're asking ''Okay, well, we want to increase our sales by X percent. How do we do that? How should we invest to do that?'' Okay, we can help you answer that. Then ''Oh, we have a portfolio of products. And we have a set budget. How do I organise my investments across my portfolio of products and minimise risk in terms of my business objectives, whether that's profitability, or a certain product for a certain segment?'' We can help you answer that. ''Can we look at this at a granular audience level?'' Yes, we can. We can use the right type of data and the right models to be able to quantify it at that granularity.
''Can we manage our reach and frequency effectively across all these questions that we just answered?'' Yes, absolutely, we can do that. So it gets very complex. But there's very simple, tangible questions that our clients have. And you know, I spoke a lot about the infrastructure that we've set up. I don't want to make it seem like we're just jumping to the most complex things, where we're looking at identity and person level analytics. No. Some of these questions from clients are quite straightforward. It's ''What should my budget be for the next year? What should my budget be for this campaign?'' And that's a very simple answer if you're set up the right way. It can be quite complicated if you're not. Then you're just making guesses, you are doing napkin math. But if you have the right sort of ecosystem in place, then you can answer those simple questions and help clients build towards the more complex ones, which involve answering multiple learning agenda questions simultaneously, and then also talking about how those answers influence other questions, so to speak.
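The ''next best dollar'' question above has a simple mechanical core. The sketch below is illustrative only: the channels and their response curves are invented, but it shows how, once a model gives you a diminishing-returns curve per channel, a greedy allocator answers ''where should the next dollar go?'' by always funding the channel with the highest marginal return.

```python
# Hypothetical response curves per channel: revenue = beta * spend**alpha,
# with alpha < 1 giving diminishing returns. The betas and alphas here are
# made up; in practice they would come from a fitted mix model.
channels = {"tv": (8.0, 0.55), "search": (5.0, 0.70), "social": (4.0, 0.65)}

def revenue(channel, spend):
    beta, alpha = channels[channel]
    return beta * spend ** alpha

def allocate(budget, step=1.0):
    """Greedy 'next best dollar': give each increment to the channel with
    the highest marginal return. Near-optimal when curves are concave."""
    spend = {c: 0.0 for c in channels}
    for _ in range(int(budget / step)):
        best = max(
            channels,
            key=lambda c: revenue(c, spend[c] + step) - revenue(c, spend[c]),
        )
        spend[best] += step
    return spend

plan = allocate(1000.0)
total = sum(revenue(c, s) for c, s in plan.items())
for c, s in plan.items():
    print(f"{c:7s} spend={s:7.0f} revenue={revenue(c, s):7.1f}")
print(f"total revenue: {total:.1f}")
```

Because the curves are concave, the mixed plan this produces always beats dumping the whole budget into any single channel, which is the intuition behind full-funnel budget optimization.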
Jonas Christensen 23:50
It's very interesting. You talk here about different types of data sets, and there is an element of rational consumer behaviour, and then there is the emotional reaction to advertising. What kind of data and techniques do you use to optimise for both of these dimensions?
Ikechi Okoronkwo 24:10
Yeah, so for the more rational side, I would say that those techniques are more ubiquitous. So we're talking about things like marketing mix modelling on top of aggregate data. If we want to get a little bit more granular, you can use machine learning with both categorical and continuous variables. So, numbers in a time series fashion, as well as individual variables of a consumer, such as their demographic information or different actions they took at different points in time, like contextual information. And you can use that to, kind of, form a mathematical simulation of ''Okay, with these consumers, there was this much spend put in market. They took these actions. We can quantify what drove that''. So, quantify attribution, or quantify the relationship with certain things that happened, certain levers that were pulled. So, that's a much more rational, sort of, system, and I think it accounts for the bulk of the work that we do. But there's the other side of things, which is understanding why people take actions. Now, that's the Holy Grail, and it's not always very clear. And so in order to understand why people take those actions, you have to measure different things. So you have to measure their emotional state, right. You have to measure things such as: how does this certain creative, or certain content, or context of content, make them feel? And so this is, you know, social listening. This is looking at, like, sentiment analysis, and really using that altogether to create new metrics, right. New signals that can be used in conjunction with the more rational ones. When I say rational, I mean just things that are pretty deterministic, in terms of this happened and then that happened. Because humans are complex, right. And so you use similar methodologies, right. I would say machine learning is used much more with some of the other, more emotional datasets, because the data there is not as clean.
It's not as continuous as it is with, like, sales over time or something like that, right. So you're looking at different audiences. You know, for those audiences, you maybe have 5 to 15 different metrics of their emotional state or needs at different points in time. So you need something like decision tree analysis to help classify and categorise all of that, to make sense of it. And then you pair that with more regression based, kind of, straightforward approaches. And then what you can do there is build probabilistic simulations. And so what you're basically saying is, ''Okay, I took this action. I invested this much. I have a certain comms objective, and here is how it affects their consideration, or affects the way they feel about certain things''. And then you also have information around some sort of data point to say ''Okay, if you invest this much across this tactic, what's the probability to purchase or probability to take an action?'', and you can simulate all of this in the same system and draw linkages between those two things. And so it's that balance that we're looking to strike. I think that we have pretty good approaches for that. It would require a lot of explanation to really get into the nitty gritty of it, but the way I would summarise it is really, kind of, balancing across these two things. And making sure that when we're making recommendations, we are not just pigeonholed to the things that we're seeing; we need to try and get to a hidden layer, another layer of why certain things are happening. I think that's just good analytics, good science, right. You're always sort of questioning what you've measured, and trying to replicate the experiment in different ways. And part of good science is new data points, new information, or recording the information in different ways. So that's something that I would say is of particular interest for the industry to solve for in a very scalable way.
I mentioned Choreograph and some of the other data assets that we have access to. And, for example, we have a neuro lab where we have people come in, and we set them up with, you know, instruments that quantify their emotional valence when they see different things. And that's something that we're very interested in embedding into the way we advise clients to optimise their media plans and look at different outcomes, not just ''Oh, do these drive sales?''. Because, as we all know, organisations have more than one mission. And they come to us trying to get an understanding of different outcomes, and we want to make sure that we're addressing all of those things at the same time.
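A toy version of the probabilistic simulation described above, with all numbers invented: spend shifts the probability of a latent ''considering'' state (the emotional layer), and that state in turn shifts purchase probability (the rational layer), so the two layers can be simulated in one system and linked.

```python
import random

random.seed(7)

def simulate(spend_units, n_consumers=10_000):
    """Monte Carlo sketch of a two-layer spend -> state -> purchase model.
    All parameters are invented for illustration."""
    purchases = 0
    for _ in range(n_consumers):
        # Layer 1: probability of reaching a "considering" state rises
        # with spend but saturates (diminishing returns on the 20% base).
        p_consider = 1.0 - (0.9 ** spend_units) * 0.8
        considering = random.random() < p_consider
        # Layer 2: purchase probability depends on the latent state.
        p_buy = 0.08 if considering else 0.01
        purchases += random.random() < p_buy
    return purchases / n_consumers

for spend in (0, 5, 20):
    print(f"spend={spend:>2} units -> simulated purchase rate {simulate(spend):.3f}")
```

Running scenarios at different spend levels through the same simulator is what lets you quantify how an upper-funnel (emotional) objective propagates into a lower-funnel (sales) outcome.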
Jonas Christensen 28:34
Yeah. So talk to us about this neurolab. How does that work in practice? What are the typical use cases for it? And have you seen it really, I suppose, change things up and create results for clients?
Ikechi Okoronkwo 28:47
Yeah, I mean, a typical use case is obviously creative testing, right. Usually in the industry, when you build a new creative, you have to bring people into a room, show them the asset, ask some questions, and then put it into the marketplace based on that. Now, that's very expensive, time consuming, and not the most scalable approach. But with the neuro lab, what you can do is bring people in and show them different elements of a creative, whether constructed in one place or separately, and measure things humans can't see - measure the responses to these different things. Then what you do is use that to extrapolate: to take that training data and extrapolate to a larger dataset. And so that way you can simulate or predict how certain creative assets would work in certain contexts. But it's not just about the asset itself. It's about where the asset is, or what type of asset: a display ad versus a video, 15 seconds versus 30 seconds. And at what point in the video does it crescendo - at 15 seconds, or does it take five seconds? You know, all these very granular things. The way I've seen this data really help is using it to scale and do things much faster, and to be able to potentially come up with new things through, for example, dynamic creative optimization, right. So in the past, you know, something would happen, and then you would go back and figure out how to make an adjustment. But with these datasets, we can very quickly say ''These are the things that you need to do. These are the changes you need to make''. Some of them might be subtle, others might be wholesale. But, you know, human beings are much slower at making those decisions. And so I think the marriage between humans and machines to make those adjustments is quite valuable.
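The extrapolation step, from a small set of lab-tested creatives to untested variants, can be illustrated with a deliberately tiny sketch. The features, scores and the k-nearest-neighbour method here are all hypothetical stand-ins for whatever model the lab data would actually feed.

```python
import math

# Each lab-tested creative is a feature vector (video length in seconds,
# seconds until the "crescendo", 1 if video / 0 if display) paired with a
# measured engagement score in [0, 1]. All data points are invented.
lab_data = [
    ((15, 5, 1), 0.72), ((15, 12, 1), 0.55), ((30, 8, 1), 0.61),
    ((30, 25, 1), 0.40), ((6, 2, 1), 0.66), ((0, 0, 0), 0.30),
]

def predict(features, k=3):
    """k-nearest-neighbour estimate: average the engagement scores of the
    k most similar lab-tested creatives (Euclidean distance)."""
    dists = sorted((math.dist(features, f), score) for f, score in lab_data)
    return sum(score for _, score in dists[:k]) / k

# Score an untested variant: a 15-second video that crescendos at 7 seconds.
print(f"predicted engagement: {predict((15, 7, 1)):.2f}")
```

Scaled up, this is what lets a dynamic creative optimization system rank candidate variants without bringing a fresh panel into the lab for each one.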
Jonas Christensen 30:32
Yeah, and I'm sitting here imagining the complexities that you go through with clients that come with quite large budgets, but also big hopes for what their campaigns might result in. And they might have made promises, or close to it, at the other end with their executives, and they don't want to take risks and challenges and sort of experiment their way to success. So a neuro lab like that is really a way to test hypotheses before you put campaigns in market, I assume. Are there any other ways that you do hypothesis testing before you really push campaigns?
Ikechi Okoronkwo 31:06
Yeah, I would say one of the most important ways that we test hypotheses is through predictions from the data that is available to us. And the question becomes: Okay, how do we use that to validate an idea, right? Because everybody can have an idea: if I try this, I believe this will happen. So the best way to really test that out is to measure as much as you can historically, predict what is likely to happen, and look at the actual versus predicted over time. And that way, when you have a new hypothesis, you can reference those benchmarks and use them to validate your hypothesis. But I think that, obviously, if it's something that's completely net new, that hasn't been done before, that's where you do need things like A/B tests to measure incrementality. You know, the output of an A/B test is not something that you can feed into a scenario plan. But it really does help to get a really quick understanding of: Okay, we made this adjustment, and we're looking at these different markets or these different audiences, and we can parse out incrementality from that. So I'd say a culture of experimentation is very important when it comes to media and marketing, because there's a lot of new things and new platforms and new approaches coming out every week, and so we can't only rely on historical data. But I would say that the most scalable type of approach is being able to use machine learning, use AI, to measure what has been done before, to create proxies for new things. That's something that we do a lot, right. So if you're launching a new vehicle - let's say you're an automotive company, and you have an SUV, and you have a sedan, and you're going to launch a compact vehicle that is potentially in between the two - rather than just say ''Hey, let's not build any analysis or plans and let's just wait and see what happens''.
I mean, for companies that are investing billions of dollars, you can't really do that. And so you use the historical measurement to create proxies, and then, in very small pieces of time, validate that predictive proxy that you put in place. And then over time, you're collecting real data, you're using your proxy data - synthetic data, we'll call it as well - and you build from there. Because I always tell clients, ''If you're looking to analytics for certainty, then I think you have mismanaged expectations''. Because that's not what it's there for. It's really a tool. It's like a fork, like a spoon. You use it and figure things out as you go along. No one really knows anything for sure. But what we can do is use these tools to demystify the unknown as best as possible, and then try to get feedback from what we're doing as quickly as possible, and implement that into improving our understanding. So if we're looking for something to tell us exactly what happened, that's basically impossible, because there are so many variables. Like, we just talked about consumer behaviour. There are emotional variables. There's how that person felt that morning. Their baby might have been crying all night so they didn't get good sleep, and then they took a certain action, but we would say ''Oh, it's because they're part of this demographic that they made that decision''. That's not necessarily true. Could be, but not necessarily. So it's the same concept when you're looking at modelling and analysis. You really have to approach it with an experimental mindset, and try different approaches, and not just have one approach. You know, the reason I bring that up is we have some clients or partners who will say ''We don't trust anything but A/B tests, because that's how you get true incrementality''.
Like, alright, well, great, but that's an awfully restrictive approach to making decisions. And I would argue it's contextual in nature. With that incrementality you could say ''Okay, I feel within a certain degree of confidence that this is the reason why it happened, because of the really rigorous test that I set up'', but then it's like ''Okay, so can you tell me how I can use that to make another decision?'', and you can only use it for a use case that matches that very specific scenario. Versus what I'm talking about with creating synthetic data and proxies: you need other types of analysis, things that give you the response curves, because those response curves can be adjusted and used for scenario planning in ways that an A/B test can't.
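The incrementality read from a rigorous A/B test that Ikechi describes can be sketched in a few lines. This is a minimal illustration, assuming a simple test-versus-holdout split, with invented conversion numbers:

```python
import math

def ab_incrementality(conv_test, n_test, conv_ctrl, n_ctrl):
    """Estimate incremental lift and a z-score for a test vs. control split."""
    p_t = conv_test / n_test          # conversion rate in the exposed group
    p_c = conv_ctrl / n_ctrl          # conversion rate in the holdout group
    lift = p_t - p_c                  # absolute incremental conversion rate
    # Pooled standard error under the no-difference null hypothesis
    p_pool = (conv_test + conv_ctrl) / (n_test + n_ctrl)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_test + 1 / n_ctrl))
    z = lift / se if se else 0.0
    return lift, z

# Hypothetical campaign: 10,000 users per arm
lift, z = ab_incrementality(conv_test=560, n_test=10_000,
                            conv_ctrl=480, n_ctrl=10_000)
```

The z-score only says whether the lift is distinguishable from noise in that one setup; as Ikechi points out, a single read like this does not by itself give you a response curve you can scenario-plan with.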
Jonas Christensen 35:17
Ikechi, I want to pick up on a couple of things you mentioned there, specifically around experimentation and measurement. Because I think whether you are in an in-house analytics function or, as you are, in an agency, one of the challenges that we all face is that, as you said, analytics is not an exact science, but sometimes the expectation is that it is. And at the same time, as analysts, as scientists, we necessarily want to experiment and iterate to get the signal out, respond to the signal that the data gives us, and get better and better. That's not always accepted by the counterparty, the stakeholder, whether it's a client or an in-house stakeholder. They want results now. How do you create acceptance of those two dimensions, namely that things are uncertain, and that experimentation is actually needed? And measurement? Those two go hand in hand: the controlled experiment. When you have a difficult client that doesn't necessarily appreciate that, how do you turn them around?
Ikechi Okoronkwo 36:26
Yeah, that is a really good question and something that I've spent a lot of time thinking about, right. Because, especially in our world that's increasingly digital, speed is a really critical competitive advantage and need. People are like ''I need to make quick decisions'', right. But that's really at odds with the scientific method, because the scientific method says ''No, let's take some time, make some observations and make incremental reads'', right, to say whether something works or doesn't work, or to use it to influence decision making. So as an analyst, you're really trying to reconcile two things that seem in conflict, right. There's tension between them. So I think a really important approach that analysts need to take in navigating that scenario, or, I would say, in helping clients manage expectations, is to create, kind of, an outcomes framework, right? Say: what exactly are we trying to answer? What decisions are we trying to make? And really try to help clients understand: based on this decision that you need to make, this is the metric that we're looking at. This is the KPI that we're looking at. These are the metrics that affect that KPI that we know about. And that is the first level of constraint on being able to give you an answer, right. Then, from there, you can say ''Now, here are the tools or approaches available to us to sit on top of that data and provide you with an output. And based on that, here's what you can expect''. So I think what tends to happen is people jump into solutions first. The client's like ''I want to make data driven decisions'' and someone's like ''You know what? You need multi-touch attribution. That's what you need'' or ''You need to create a dashboard''. And a lot of clients jump straight to doing that.
And they don't really focus on: What's my learning agenda question? What decision am I actually trying to make? What data do I need to make that decision? What questions do I need to answer to make it? You need to map out that framework. So it's kind of a weird answer I give people, especially when I talk at schools. I tell them the most important part of analytics is not even the modelling itself. It's that upfront consultative work, and good analytics lives or dies by it. Without that qualitative exercise to really map out and categorise, number one, the levers that we need to pull, the questions associated with those levers, the metrics and then the tools, everything else becomes a bunch of noise. Because there's a fundamental distrust for - not distrust, but it's a murky thing, you know. People are like ''Are you going to build me a model that's going to tell me what to do?''. And you have to really educate them and say ''That's not exactly what we're doing. What we're doing is making sense of what's available to us, using very straightforward scientific methods, which is the way the entire physical universe has been made sense of, and then extracting information to answer those questions''. And when you're building analysis that's very uniquely focused on specifically answering the question, that's when you can really deliver some value. Now, a really important thing to build on top of that is understanding the metrics that matter. A lot of times, folks will say ''Let me get all the information. I'm going to look at it. We'll throw it all together. And I'm going to make some sense of it''. And I usually tell people ''That's not the right way to approach it''. Because you want to make quick decisions, and the best way to do that is to look at leading indicators.
And so you do analysis to quantify leading indicators across the levers that you can pull, and that's how you make faster decisions, right. Because sometimes that success metric that you're looking at may not be reported on as consistent a basis as you would like. So what you do then is look at other fast moving metrics, establish some causation or correlation to that metric, and then use those to make faster decisions. Now, as an analyst, I would always caution people: ''Don't make decisions off of something with a small sample size''. So again, you shouldn't just be looking at things hourly or daily and having a knee jerk reaction to something that you saw pop up in your dashboard. You should do analysis to understand the right cadence for decision making off of the insights that you're getting from that fast moving data, right. So to summarise what I'm saying: the most important thing is developing a framework to categorise what you're trying to do, to elevate and manage expectations, and then, from there, finding leading indicators, or doing leading indicator analysis, to help serve faster moving decisions while you wait for the larger data or analysis to come in, which gives you more confidence. Because it's just a fact that things that take more time, that move a little bit slower, that have multiple layers of validation, give you more confidence. And then the last thing I would say is that scenario planning and simulation is the most effective way to help clients like that, right? People are saying ''I can't wait for that. What can I do today?'' Well, that's why we need to build models and analysis that allow us to quantify the attribution or, you know, a mathematical connection between, let's say, media and sales. And then, if you need to make a quick decision, you use that model to simulate what will happen.
So based on level of spend, based on the media mix, based on the synergy of variables, this is what I think will happen. So they use that to make those faster decisions. And then you refresh your models, when new data comes in, append new data to your data set, build a new model, get a new read, and then rinse and repeat and continue that cadence. The mistake that people make is not investing in doing that type of analysis foundationally. And then what happens is, when the question comes, now we're scrambling to figure out how to answer the question. But if you have that stable, consistent approach, then when those curveballs come, you're ready for them. It just becomes part of the process. And again, analytics becomes more of a culture than a definitive kind of answer, right. It's something that's more embedded into just like continuous decision making.
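The leading-indicator analysis Ikechi describes, tying a fast-moving metric to a slower KPI, can be pictured with a toy correlation check. The metric names and numbers below are invented for the sketch:

```python
def pearson(xs, ys):
    """Pearson correlation between a fast-moving metric and the KPI."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical daily branded-search volume vs. next-day sales
searches = [120, 135, 150, 160, 155, 170, 180]
sales    = [ 30,  33,  36,  40,  38,  41,  45]
r = pearson(searches, sales)
```

If a relationship like this holds up over enough history, the fast metric can serve as an early read between full model refreshes; with a sample this small it would only be a starting hypothesis, which is exactly the small-sample caution Ikechi raises.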
Jonas Christensen 42:31
Yeah, so the directional message that the analytics is sending, rather than it being a true ''be all and end all'' answer, is really critical for stakeholders to understand. It is directional, and then we can become more and more certain about that direction, of course, as we get better. But you talk about speed a lot. Speed is often traded off against the level of certainty that you can or cannot give. The other element, as you also alluded to, is that if you don't plan upfront enough, there is a big chance you spend a lot of time coming up with brilliant answers to the wrong questions. And that is probably even more of a waste of time than anything else. Now, Ikechi, I'm interested - this was really helpful and I think your explanation there was really on point, actually, and very helpful for listeners. I want to step away from this a little bit and take a helicopter view of data driven marketing. And I want to know from you: what do you see as the biggest opportunities in data driven marketing? And why?
Ikechi Okoronkwo 43:34
Yeah, I think, for me, the biggest opportunity in data driven marketing, and, you know, speaking specifically as someone in a media agency, is that we really need to establish analysis that can connect media actions to outcomes. There's a lot of analysis that can be done against intermediary metrics. We can look at things such as reach. We can look at engagement metrics, and all of that's really important - I spoke about leading indicators, and those things are really valuable. But more analysis that ties the levers we pull to the business value of the client that you're working with, I think, is the most important thing, and those two need to come closer together. Because what happens a lot of times is that analysis is treated almost as a separate need. It's something that happens afterwards: ''Yeah, we'll do some attribution. A consultant will come in and tell us about the contribution of what we're doing to our KPI and give us our ROI'', right. And there is a fixation on ROIs. I understand the need for that, but if most of your measurement is focused on things that are not around driving outcomes, then you're not linking directly to, for example, the things that the business is going to be looking at to evaluate whether you should continue to do what you're doing. And so, from what I've seen, the clients that really excel are the ones that understand that ''Yes, understanding ROI at a high level is important. But I need to make sure that I'm linking all the very tactical things that I'm doing to that, and not just doing the analysis at a very high level. I'm really making sure that I can tie almost everything that I'm doing, at a very granular level, to an outcome''. And that is valuable for marketers, because now you can justify your budgets a lot better.
You can, for example, go into a conversation and say ''Okay, these are the creative ideas that we have. These are the new partners that we want to work with, and here's how it is going to drive value for the business. Here's the business benefit it's going to drive''. And if the analysis for the way we develop ideas and execute media is linked to outcomes, that is much more impactful. Because in the past, what would happen is: if you just have that high level ROI analysis, there's no direct link for, let's say, the audience of that conversation. They're just like ''Yeah, I know that we did this measurement at this high level, but how is this thing that we're talking about today connected to that?''. If your measurement is not set up that way, there's a disintermediated conversation happening, because you're trying to tell them ''No, I'm telling you. Media is impactful'', and you may even have a high level proof point for that, but it's not tied to the specific thing you're talking about that day, or that specific creative idea or that specific tactical shift. And so I think that that's the biggest opportunity: to have more outcome driven, performance marketing. And from there, it really becomes an exercise in getting the right data, right - making sure that we record the right data to facilitate doing that, and use the right methods. Everything I'm talking about, if you're going to be doing it at a very granular level, with very large datasets, that's why you need machine learning, right. We can't rely on the aggregate approaches of the past. I know cookies are going away, but there are other datasets that can be used to do this type of analysis.
And more investment, and more focus, on that will really empower the truly modern, data driven marketer, especially in the platform age, where the platforms we're using and the data we're getting are just going to explode in volume and complexity. And so outcome driven measurement in that ecosystem is going to help people make better decisions and win, be more persuasive, and retain their budgets. And it's not even just about retaining your budgets. It's really about driving growth for the business, quantifying that, and being able to speak to it rationally rather than hypothetically.
Jonas Christensen 47:33
Hi, there, dear listener. I just want to quickly let you know that I have recently published a book with six other authors, called ''Demystifying AI for the Enterprise, A Playbook for Digital Transformation''. If you'd like to learn more about the book, then head over to www.leadersofanalytics.com/ai. Now back to the show.
Yeah, and I think that the whole idea of quantification is something that has been a huge beneficiary and also, to some extent, a huge challenge for the marketing profession over the last, say, 15-20 years. Because if you look back further than that - so we're back now around the turn of the century - people would put up marketing, branding activity, advertising and so on, but it's not directly measurable, not easily, at least. What is the value of a billboard or a 30-second TV commercial? But with the advent of things like Google advertising, Facebook, all those platforms, you can literally measure the value of each click and who clicked on it and what demographics you get and all that. All of a sudden marketing is so quantifiable for those channels, but the old channels, to some extent, are still stuck in a paradigm where you can't be exactly sure who looked at that billboard on the side of the road. And that challenge of justifying the stuff you can't measure versus the stuff you can measure in great detail, that must be a constant tension that you're battling with. As in ''Why should we invest in TV advertising? Because we can't say that it delivered X, Y, Z, whereas if we target these 200 search terms in a search engine, then we know exactly what we paid for by the click'' and so on. How do you navigate that with clients? And how do you justify the different channels? How do you help them justify it, I suppose?
Ikechi Okoronkwo 49:31
Yeah, I mean, it goes back to what I was saying about outcome driven measurement, because, to your point, there are certain platforms, certain types of digital investment channels, where you can deterministically link what happened to, you know, ROI. And if you're only looking at that to make decisions, that is somewhat siloed. And then, to your point, you can't talk about your full funnel, holistic media mix. But if you're doing measurement that allows you to quantify the ''traditional'' channels - I'm using air quotes - as well as the digital channels, then instead of looking at ROI, what you're looking at is marginal ROI. Because the reality is that ROI on the digital platform is not fairly attributed; we all know that these things work together. And so if you're using measurement that doesn't incorporate all of those levers in the same system, to some degree you're not getting the full picture. And as I said before, that's not necessarily a horrible thing. I'm not saying that people who are looking at website analytics and ROI from different platforms are doing it wrong. I'm just saying it's one part of the picture. We talked about modelling not being absolute certainty. But if you want something that is more certain, more predictable, you need to take a bird's eye view, incorporate everything that you're looking at, and use marginal ROI - not ROI, marginal ROI - to make decisions about your next best dollar spent. So that allows you to not get into this tension between digital and traditional. It's all media. It's all marketing, right. A lot of times we hear some people talk about performance media, and I would argue that all media is performance. You're just looking at different outcomes, potentially. And in some situations, you're looking at the same outcome, right.
You invest in TV for awareness, but ultimately you do want it to help you drive sales. You expect that, right? Now, some things might have a longer tail impact than others. And that's fine. You just need to be able to balance those things together, and there are very straightforward methodologies to do that. We just need to all understand that and develop the foundational trust to do that work upfront, because what it entails is a culture shift. People like to just do things and then measure afterwards. And we really have to educate people to plan for measurement. You have to plan ahead to be able to answer the questions of the future. You know, there's a saying that my dad used to tell me, which is ''Be kind to your future self'', and I try to educate marketers: ''Be kind to your future self''. You're going to have these questions three months from now, four months from now, a year from now. And if you don't set up the right data capture, the right investment in measurement and tools, to do this analysis across the funnel, then when those decisions come, you're going to be in the middle of the ocean on a raft. And then you're going to start searching - you made a good point where people are just kind of looking everywhere. It's this frantic search for an answer. And when you have a culture of experimentation, it's never frantic. You may have serious questions, time sensitive questions, but you have a rational way with which to answer them. So it's less stressful. It's less back and forth. When we talk about the power of AI and machine learning to transform our industry, it's about being able to make these decisions faster, more rationally. It's about being able to draw inferences from all this data across all the different channels and simulate what's going to happen, rather than what has happened.
And then, when the information about what actually happened comes in, the work you're doing is about looking at that data. Okay, we thought this was going to happen, and then this happened, and we have the variables that we know impact that, so now we can more accurately diagnose why something is happening. It's really just the scientific method. It's used in all these other fields - very serious fields like medicine, because the stakes are high - and I think the same tools can apply. It's a different thing that we're solving for, but it's the same concepts, the same statistical methods, the same scientific methods, the same tools, right, to be able to feel much more certain, or much more accurate, about the way we're making decisions. But you have to plan ahead. You have to, or else you're really not going to be successful.
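Ikechi's distinction between ROI and marginal ROI can be illustrated with a toy diminishing-returns response curve. The curve shape and parameters below are invented for the sketch, not anyone's actual model:

```python
import math

def response(spend, a=5_000.0, b=20_000.0):
    """Toy diminishing-returns curve: incremental sales from media spend."""
    return a * math.log1p(spend / b)

def marginal_roi(spend, delta=1.0):
    """Return from the next dollar: finite-difference slope of the curve."""
    return (response(spend + delta) - response(spend)) / delta

spend = 100_000.0
avg_roi = response(spend) / spend   # average return per dollar already spent
mroi = marginal_roi(spend)          # return on the next dollar only
```

High up the curve the average read overstates what the next dollar will do, which is why scenario planning off response curves compares marginal figures across channels to place the next best dollar.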
Jonas Christensen 53:43
Be kind to your future self. That's going to be ringing in my head tonight, and for a little while longer. I think, Ikechi, that's a really good phrase. I'm going to steal that, and I hope listeners will too. That is good advice in so many dimensions, but especially in this field of data analytics, in understanding what is happening and gaining insights from data. You talked about it right at the beginning: having invested in the right data assets and platforms to actually be able to do this. That stuff takes time and it is ''boring'', but it yields results later on. You were kind to your future selves back then.
Jonas Christensen 54:24
Ikechi, we're almost at the end. I have two questions left. One is for you to pay it forward. I'm interested in hearing from you. Who would you like to see as the next guest on Leaders of Analytics and why?
Ikechi Okoronkwo 54:38
Yeah, I actually have a really good answer for that, and it's somewhat self-serving, but I'll explain why. With a lot of the work that we're doing at Mindshare, we are really bridging the gap between the way things were done before, taking the best practices of that and finding new ways to improve it and take ourselves into the future. And one of the people that has been instrumental in a lot of the vision that we've had, to really use advanced technology to transform the way we make decisions, is somebody who I work with named Fabio Giraldo. He leads the advanced analytics practice within my group, and we're doing a lot of work with many different specialty groups within the organisation to solve new problems and really high value problems. And I think he'll be a very interesting person to talk to, to dig deeper into some of these things. And I'll give an example. You might have seen it in the trades recently: we just launched Impact Index, which allows us to use AI to understand which publications are toxic and help our clients very quickly divert their media dollars and make better decisions, so that they're not perpetuating the negative aspects of our culture through media. And Fabio was really involved in a lot of partnerships with our content teams, our innovation teams, our invention teams, our strategic teams. So there's a big story there about how to navigate all these different points in the organisation to come up with a really interesting tool to transform the way media operates. So that's somebody who I would - you know, I can keep going. We have a carbon calculator that we're using. There are so many things. We partner a lot with the neurolab. So that's somebody who I would want you guys to hear more from, because I work with him every day and it's really a pleasure.
And you know, his team is doing really cool stuff.
Jonas Christensen 56:32
Great suggestion. Thank you, Ikechi. I will be speaking to Fabio very soon. Lastly, where can people find out more about you and get a hold of your content?
Ikechi Okoronkwo 56:43
Yeah, so, you know, I work for Mindshare. So if you google Mindshare or go to www.mindshareworld.com, you can learn more about Mindshare and the stuff that we're doing. Me personally, I am not as active on social media, but you can find me on LinkedIn - just search my name - and there are links to different podcasts I've done and different articles I've contributed to. So that's where you can find a lot of that content. Something that I also do is I'm an adjunct faculty member at Pace University and Fordham University, so there are different events and things that we're doing that people can find out about. But generally, LinkedIn is a good place to network with me. So I invite anyone who's interested to learn more about things you've heard today, or who has ideas, or wants to collaborate, or wants to become a client of Mindshare, whatever that may be - please feel free to reach out and I will respond.
Jonas Christensen 57:32
Great. I will put links to all those places in the show notes, listeners, so please go and check them out and do connect with Ikechi. He's a very nice guy, I can promise you. Ikechi, thank you so much for being on Leaders of Analytics today. I have learned a lot from listening to your ideas, your concepts and how you go about things at Mindshare, but also your personal approach. And I will definitely be kind to my future self a lot more after listening to this conversation. All the best with your future endeavours, and we look forward to hearing from you again in the future.
Ikechi Okoronkwo 58:09
Thank you for having me. This was fun.