Running data experiments to improve marketing performance, with Jason Widup, VP of Marketing at metadata.io
TheNextCMO’s latest podcast is with Jason Widup, the VP of Marketing for Metadata.io. Jason is a sophisticated SaaS marketing operations, technology, and analytics leader with over 20 years of marketing experience. In this podcast we discuss best practices for running data experiments, a new way to look at ABM, and advice for marketing ops leaders looking to become a CMO.
Jason Widup - https://www.linkedin.com/in/jasonwidup/
MetaData - http://metadata.io
Interested in being on The Next CMO podcast? - https://info.plannuh.com/the-next-cmo-podcast
For more info about Plannuh, check out our website
Kelsey Krapf: Welcome to the official podcast of The Next CMO, hosted by Plannuh, makers of the first AI-driven marketing leadership platform for quickly and easily creating winning marketing plans, maximizing budget impact, and improving ROI.
The Next CMO is a thought leadership podcast for those who are CMOs or want to become one. My name is Kelsey Krapf, and I'm the Senior Marketing Manager.
Peter Mahoney: And I'm Peter Mahoney, the founder and CEO of Plannuh. Welcome to The Next CMO podcast.
Kelsey Krapf: This week we have Jason Widup, the VP of Marketing at metadata. Jason is an experienced marketing ops and technology leader, known for his MarTech knowledge and for developing high-performing teams. Thanks for coming on, Jason.
Jason Widup: Yeah. Thanks guys. Thanks for having me. This is great. Yeah, I'm looking forward to it.
Peter Mahoney: Yeah, I'm really looking forward to this, Jason, because I've heard so much about metadata and hadn't really had a chance to get into it much. Doing a little research before the call, I got even more excited, because the whole idea of improving the performance of marketing using data is pretty central to what everyone at Plannuh wants to do right now. So you guys are in a pretty hot area. But rather than me ramble on about what metadata is, maybe you can tell us a little bit about the company, and it would also be helpful to understand a little bit more about your background. You can mix those in however you want. So tell us about those, Jason.
Jason Widup: Sounds great. So I'm Jason Widup, VP of Marketing at metadata.io. I'll start with a story about how I actually became the VP of marketing here, because it's actually interesting. My background is primarily in marketing operations and analytics. I've been doing it for a long time. I've got a lot of gray hair, and after doing it for so long, I was thinking I was ready to take on a bigger role, to helm a marketing organization and take all the experience and knowledge I've learned over the 20-plus years I've been doing it and try my hand at it.
And over that time, marketing has become so much more of a data-driven, data-optimized function, so all that experience I had in ops and analytics was great experience to have, even in a marketing leadership role. Last summer, summer of 2019, I started hearing about metadata in my network.
So I was working at Workfront, a great company, and just started hearing about metadata. In fact, I was at a conference, back when those were still happening, and metadata wasn't sponsoring it, but it was literally the only thing people were talking about.

I'm exaggerating a little bit, but it felt that way to me. I was like, they're not even here, and we're all talking about metadata, what it does, and who's behind it. So about a week after that, I reached out directly to Gil, our CEO, and just said, I don't know you, but I've been doing a little bit of research, I've heard a lot about your platform, and I'd love to see if we could work together. We met up in San Francisco and really saw eye to eye. We saw these problems in marketing; I'd seen them from an operator's perspective: a really difficult time proving ROI, a really difficult time doing what we all think we should be doing, testing and optimizing. Not enough resources. And even when you have enough resources, the emotional part gets in there: what wins, what doesn't. It's almost like trading in the stock market as a person and not a machine.
We just saw eye to eye, and what he was building was exactly what would fix that. There were three aspects of the platform where I was like, I've tried to build this at the places I've been, and I haven't been able to. So I started advising, and that advisor relationship lasted about six months. I really loved it. And then right at the beginning of the pandemic is when I decided to make the leap and come on full time, which was also another kind of scary time. But yeah, that's why I joined.

And metadata, what we do: we call ourselves the demand generation platform, and we'll talk about that maybe a little bit later. We really help B2B marketers understand the makeup of customers they can sell best into, and then we run and optimize sets of campaigns across channels using our AI, focusing on what drives the most revenue. That's the most basic way to say it. Of course, there are lots of components of the platform that make that work, but we're really trying to make B2B marketing more effective and more efficient, and to hone in on the quality of demand being driven from the platform, not the volume.
Peter Mahoney: That's a great nutshell, and it's a great place to start. I'm sure we'll unpack a little bit of that throughout our discussion here. One of the things I was going to ask you, though: having spent a lot of time in these operating roles, especially at big companies like Workfront, what were the big struggles you had as a marketing operations leader at a company that size? It doesn't have to be specifically Workfront, because I think everyone has the same issues. So talk about the pain points you had as a marketing ops leader.
Jason Widup: Yeah. In my experience, all the problems are really the same. I worked at Tableau, I worked for Microsoft, and at smaller places like Getty Images, all as an ops and marketing ops person, and they all had the same problems. The main one, I would say from an operator's perspective, is that all your resources are working toward a goal, but it's how those resources are being applied, and how much red tape, for lack of a better term, there is in these organizations that slows things down dramatically. So let's say you have a good idea on a big marketing team. You've got a scary idea. It doesn't fit the mold of what you've been doing historically. You pitch it, and you've got to pitch it to seven people. You've got to do this big business case. You've got to come up with all the details just to get approval to try a test out. And by the time you've got all that ready to go and all the approvals, maybe three other companies have already done it. They beat you to it, and now you don't get to do it anymore.

The other piece is they take all these resources and apply them to, okay, let's get this campaign done. Let's say we've got seven campaigns through the year we need to get done. And the goals are about getting the campaign out the door, not the campaign's performance. So their goals are on the activity.
Peter Mahoney: Yeah.
Jason Widup: And not the outcomes.
Peter Mahoney: So call it the "did you do it" measurement versus the outcome-based measurement. It's a huge problem, especially when you get to a complex organization. As you said, in a big team you have a lot of moving parts, and getting a campaign from ideation to delivery and result takes a lot. And since people who use Workfront tend to have complex things, because that's what it's really good at, keeping focus on what the outcome should be is really important. And not only keeping focused on the outcome, but understanding: is that a good outcome?

That's a big struggle I think people have: the difference between saying, hey, I did the campaign, that's step one, it was done, yay. Step two is, yeah, and we got a hundred leads. Okay, is that good or is that bad? How does it compare to other things? What was the cost per lead? What's the value of the lead? So what would the ROI be, et cetera. Really understanding those things. I know some big companies struggle with that, but some small companies do too. In fact, a lot of small companies really struggle with this understanding of what is valuable and what's good for them. Just because the outcome they've been getting, or the cost per opportunity or cost per lead, is the number they've been working toward, is that good? They don't know if it is, or if it could be better.
Jason Widup: So that's a big struggle: they stop there. They say, I know that goal, maybe it's a CPL number below something, or a number of leads, and hey, I met my goal. Not only did I meet my goal on volume, but the profile looks right too: they're the right company size, or the right job title in the right departments. But then all of a sudden it's not converting, and that's really where I think some of that disconnect is. Marketing feels like they're doing their job, but they're sometimes looking at the wrong metrics, or however they understood their unit economics, from a demo request or even an impression all the way to a closed-won deal, the numbers are off. They don't work out. I see that a lot as a struggle with these larger companies.

But what was interesting to me, coming from these larger companies and moving to a startup, is that it honestly feels like we're producing just as much output with two people on a marketing team as we were with 80 people on a marketing team. I know we're not, but it feels like we are. And I would venture to say that if someone looked at my website and the activities I'm doing outside of the website, and compared them to a larger company's, it would be hard to tell whether the sizes of the teams are different or relatively the same. When I started to see that, I realized I'm able to do a lot of this stuff because there are no approvals, there's none of this red tape, and there's an experimentation culture. I can try something and be okay if it fails, because I know I'm not going to get fired for it. That gets you doing a lot more activity, and that activity gets you to the place where you know what works faster.
Peter Mahoney: Yeah. So one good way to talk about this, Jason, is that you guys organize your products in this logical flow, the way marketers think. You start with identifying your target audience, using data to do that. Second is running a series of experiments to see which ones are going to be performant and which ones aren't. And then third is actually launching campaigns and acquiring customers, et cetera. So that should be a familiar set of steps for marketers to think through. The first thing they need to do is identify the target audience, so let's talk about that, because that's a huge struggle. In fact, before we hit the record button, we were talking a little bit about ABM strategies. Whether you're explicitly doing an ABM strategy, or you've at least defined an implicit ideal customer profile, you should have some kind of view of what that target audience is. So what are the kinds of things you should consider as you start to build out and refine your understanding of that target audience?
Jason Widup: Yeah, and it's interesting, because this is an area I get really excited about. Traditionally we're looking at all the standard stuff, right? Industry, revenue, number of employees, all this stuff that's pretty readily available. Then you get some more interesting things, like maybe they're on the Inc. 5000 list, or some other maybe interesting ways of segmenting. Of course you've got geography and all of that. But those are just the starting point. The things that are actually going to move the needle in your ICP are probably not those things. They're going to be behavior-related things, things that are harder to detect or that might be unique to an industry or a company.

I talk to a lot of our customers who use the platform, because they still need a marketing strategy; if they come to us with a bad marketing strategy, we're just going to amplify it. So I try to get on the phone with them and talk a little bit about the strategy, and the audience is really where I focus, because it's not quite true, but it's almost true, that if you've got the perfect audience, if you really nailed your audience, the message you get in front of them doesn't matter as much. I hate to say that, because it does matter, but it almost doesn't. If you really find the audience that needs your thing, and you give them the one pain-point message they're really feeling at that point, that may be all you need to do. So the question is: how do you get to an understanding of these other pain points?
For example, when I'm targeting people, the people that would be good for our platform are usually spending a decent amount of money on paid social, because we really help them optimize in a couple of channels, Facebook and LinkedIn. There's really no publicly available dataset for that. There's Kantar for how much people generally spend on advertising, but nothing on what they're spending on LinkedIn and Facebook specifically. So I built my own. I realized you can go to a company page on Facebook or LinkedIn, and in a couple of clicks you can get to a page where you can see every single one of the ads they have in that platform. So I had a company write a script for me to go out to thousands of these company pages and just count the number of ads, and I use that as a proxy for how much they might be spending in these channels. And I'm always advocating: look at your customer data.
So when you're trying to figure out your ICP, if you've sold enough, great: you've got a good volume of customers you can analyze to really understand your ICP. You're going to start by understanding the standard firmographic and technographic stuff, right? What technologies are they using, et cetera. But then it really is helpful to dig in and understand these other things. Were they growing at the time? Were they hiring at the time they became a best customer of yours? There are these other pieces of data. Then you use that to go out and find other audiences that are like them. Other easy ones: your current customers' competitors. If your current customers are doing well with your product, their competitors probably will too, and that's readily available data. And then the inverse, the one I always forget: your competitors' customers. You can use a technology like BuiltWith or Aberdeen to understand technographics, and some of these data sources even have the first month that a technology was detected on a website, so you can infer a contract ending and start to build programs around it: I know you use this technology, we often do better than that technology, you might want to check us out, that kind of messaging. So really understand your audience, but then know how to flex that a little and get some additional data. And then of course there's intent data, which we haven't talked about, which is one of the most important things as you're creating these kinds of audiences.
Peter Mahoney: Yeah. In fact, I was just having an online discussion in a group of CMOs, even though I'm what I call a recovering CMO, a back and forth about intent data and the best sources for it. Obviously, understanding demographics or firmographics is great, but if someone is looking to buy something that you happen to be selling, that's pretty handy. So can you talk about the kinds of techniques you can use generically? It would also be helpful to understand how you guys are doing it at metadata, because one thing people struggle with around this kind of data work is that they understand at a high level, conceptually, how it all works, but actually doing it without a bunch of data scientists is pretty tricky. So step one: what are the right things to think about when it comes to intent data, and what are the right sources? And then two: how do you make it actually doable, so you don't have to hire 14 data scientists?
Jason Widup: Yeah, and there are different flavors of intent now. There are companies like Bombora that you can buy from and subscribe to. They grab intent, oftentimes through journal and publication sites people go to, and what content they're reading. The system reads all the content and, with NLP, works out what they're looking at and stacks that up, so you can get things like companies surging on a topic. Then you've got intent from places like G2, review sites. I find that intent is oftentimes lower funnel: if they're on a G2 page researching alternatives already, I can capture some of that, because I have such low awareness compared to some of the vendors they would normally choose as a supplement or a replacement for us. There's really no better data, and you can get a direct feed of the accounts that are on your page, on a comparison page, or on the category page. Then you can flex your messaging based on that, add some job titles to the account list, and start advertising to them. And then there's first-party intent, which is the best intent: they're on your website reading the pricing page or some other high-value page, and you retarget them, because you've got a site pixel and they've already been on your site.

So those are the three kinds of intent I play in. I primarily use G2 intent for the stuff I do, because again, we have quite a bit lower awareness than some of our bigger competitors, and those people are lower funnel. As long as I can get in front of them, which my own platform helps me do, I can usually either intercept, or at least have a conversation, or at least get them to see there's an alternative out there.
Peter Mahoney: Yeah, that's a great way to break it down, Jason. What we see is that people are decent at the first-party intent stuff. People are pretty good at pixeling their appropriate pages and doing retargeting and things like that, and that works reasonably well. But obviously, those are people who already know your brand and have been to your website and all that good stuff. So there's finding the broader universe and filling the top of the funnel. The other really important point you made was that when you're doing a product comparison and trying to decide which vendor is the right one, which alternative is right for me, clearly you're pretty far down your buying journey. So that's a cool trick. I want to come back to how to operationalize this, but we should probably move on; that was part one. Don't worry, it's three parts, not 47, but that was part one. So Kelsey, I know you spend a lot of time on the optimization side of our platform. I think you had a couple of questions about optimization, right?
Kelsey Krapf: Yeah. What are some of the best practices you use when running these experiments with all this data you've collected?
Jason Widup: Yeah. First of all, it's the approach you use to optimize. Coming in with a test mindset is the most important thing: just being okay that 10, 20, maybe 30% of your experiments are going to fail miserably. When you think about optimizing, especially in B2B, we all want to optimize to revenue, right? That's the holy grail. But unless your sales cycle is a day, which no real B2B product has, it's going to take longer to understand what actually drove the behavior that led to revenue. It's not going to happen in a one-day sales cycle. So what's important, as we said earlier, is really understanding what you're trying to measure at the different stages. Early on, you're just going to have clicks, costs, and probably a couple of leads coming in, so you start to optimize to that. You start to look at those metrics. But if you only have one campaign, one thing out there, then you don't really know. You just have this one thing, and you're left to decide if it's okay or not. If you haven't set those goals before you started, you're basically going to talk yourself into it. But then: look, I've got an MQL; now I've got an opportunity; now I've actually got a closed-won deal after one or two sales cycles. And if you set up the connections between systems properly, you can actually see the campaign influence on those deals. That's something we do in our platform too: we connect to Salesforce, so we can see which campaigns influenced an opportunity or a deal, and then look at things like deal size.

That's really where it gets hard for humans to do this kind of optimization. It was easy for us to do optimization 10 years ago, because the data we had was really high-level, surface-level stuff. Now we have every single permutation of an ad and all these things we want to test, and we actually know, through all these years of doing it, that those things matter, so it's worth testing them. But if you're a small team, or even a decently sized team, and even on the large 20- or 60-person ops teams I've had, we struggled with just doing that amount of testing. That's really where the change happens. A lot of people say, I'm optimizing this campaign, but there's really no way to optimize unless you have a lot of different things to try. So it's: what's the breadth of things you can try? And then making it so a human can actually look at all of those different things is just really difficult. So having a platform, a dashboard, something that can do most of the heavy lifting on the analysis for you, is critical.
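The source doesn't say which algorithm metadata's AI uses, but the heavy lifting Jason describes, automatically shifting impressions toward the variants whose observed click-through rates look best while still exploring the others, is commonly done with a bandit algorithm. A minimal sketch using Thompson sampling on simulated, made-up click rates:

```python
import random

def thompson_pick(stats):
    """Pick the ad variant with the highest draw from its Beta posterior.
    stats maps variant -> [clicks, impressions]."""
    best, best_draw = None, -1.0
    for variant, (clicks, impressions) in stats.items():
        # Beta(1 + clicks, 1 + non-clicks): uncertainty shrinks as data arrives
        draw = random.betavariate(1 + clicks, 1 + impressions - clicks)
        if draw > best_draw:
            best, best_draw = variant, draw
    return best

def simulate(true_ctr, rounds=3000, seed=42):
    """Serve `rounds` impressions, shifting traffic toward whichever
    variant the posterior currently says performs best."""
    random.seed(seed)
    stats = {v: [0, 0] for v in true_ctr}
    for _ in range(rounds):
        v = thompson_pick(stats)
        stats[v][0] += int(random.random() < true_ctr[v])  # did it get a click?
        stats[v][1] += 1
    return {v: impressions for v, (clicks, impressions) in stats.items()}

# Illustrative CTRs only; the best variant ends up with most impressions
pulls = simulate({"ad_a": 0.05, "ad_b": 0.02, "ad_c": 0.20})
```

This is exactly the trade-off Jason hints at: a human reviewing a dashboard tends to either over-commit to a favorite or spread spend evenly, while the posterior sampling explores just enough to keep checking the losers without wasting much budget on them.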
Peter Mahoney: Yeah, it's interesting, Jason, because one of the things I see people struggle with a lot is a very similar problem to personalization. The first step in personalization is segmentation: you've got to figure out how to segment your audience, then figure out the map of messages appropriate to those different segments and personas, however you break them down. And when you build experiments, it means you actually have to build multiple things. So first you need to understand your audience, and then you need to figure out: how do I define a statistically valid segment of my audience that I can go after and run a test on? What is the scale of test I should be running at any given time? And then, how do I choose the variables I need to tweak as I run these tests? The thing that blows my mind is that sometimes it's the tiny things that make a huge difference in performance: how you punctuated, the size of the font, whether you put an emoji in the subject line of an email, whether you used bright green or pale pink. So how do you help people think about that and solve that problem? With this incredibly diverse matrix of variables to think about, how do I start? That's where I think people struggle a lot: how do I get going when I want to start running some experiments?
Jason Widup: Yeah. The funny thing is, when I first started doing a lot of experimentation in this role, I would have my favorites. I really like this one, let's do this one more, let's put some money on this one. And then the data would come back and I was wrong. I was basically the opposite of what my audience was clicking on, attracted to, and what was working. And it really told me a story, because I'm selling to myself: I'm selling to B2B marketers, usually with an ops angle or lens on things, trying to get efficient. It's me, and yet what I think is going to resonate is wrong. What I think will resonate with me is wrong. I don't know myself.

So that's the first thing I say. And then, when people first start getting into experimentation, the main thing is the amount of content you have to produce. Most organizations are already struggling with their content team and resources just getting one version out the door. So when you say, all right, great, now we need five more like that, or with just a subtle tweak, the sky is falling down. So the first hurdle to get over, and this is one of the hard conversations we have with some of our clients, is: do you have enough stuff to test? Because if you only have two versions of something, that's probably not enough. This is where we try to boil it down to component parts. Don't think you have to create a bunch of complete ads. Just give us a bunch of images you would use, a bunch of headlines, and calls to action, and we'll put those together in all the different ways, as long as they make sense. You obviously don't want to put a headline with a creative that's totally off balance, talking about one customer with the logo of a different one. But within the realm of possibility, have it run all of those, and then be able to understand which of those components is having an impact, and see how other campaigns with that same component are having a similar impact. It's just hard for us as humans to see data that granular and make inferences off it, when a computer has an easier time doing it.
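The component-based assembly Jason describes, images, headlines, and calls to action combined in every sensible way, is a straightforward Cartesian product with a filter. A small sketch, where the component pools and the coherence rule are hypothetical stand-ins for whatever a content team supplies:

```python
from itertools import product

# Hypothetical component pools a content team might hand over
images = ["team_photo", "dashboard_screenshot", "quote_card"]
headlines = ["Cut your CPL in half", "Stop guessing what works"]
ctas = ["Book a demo", "See it live"]

def is_coherent(image, headline, cta):
    # Hypothetical rule excluding a pairing that doesn't make sense together,
    # the "headline about one customer, logo of another" problem
    return not (image == "quote_card" and headline == "Cut your CPL in half")

variants = [
    {"image": i, "headline": h, "cta": c}
    for i, h, c in product(images, headlines, ctas)
    if is_coherent(i, h, c)
]
# 3 images x 2 headlines x 2 CTAs = 12 combinations, minus 2 incoherent = 10 ads
```

Three images, two headlines, and two CTAs already yield a dozen candidate ads from seven pieces of content, which is the whole argument for testing components rather than finished creatives.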
Kelsey Krapf: Jason, one thing I love that you brought up is having the mindset to test things. I think if you don't have that fundamental mindset as a marketer, wanting to test and being open to trying new things, you miss out. I know firsthand, from some of our digital campaigns, there are things we brought on board that we wouldn't have gotten value from, or wouldn't have had the ability to test, if we hadn't kept an open mind. So I love that you brought that up, because I think it's super important, and as a marketer, that's how you're going to get the data you need to make those data-driven decisions.

Jason Widup: It's scary for people, because oftentimes they're coming from a place of no testing or low experimentation. What they've done is take all their budget and resources and assign them to things they know are going to work. But by doing that, you're only guaranteeing average performance, because you're basically saying: I know I can do this already, because I've done it; that's what I'm basing this average on, and that's how I'm going to plan my budget, because I know what I'll get from it. But what you don't know is: what if there were a more efficient way to get that? What if there were a completely different tactic you haven't tried, or a completely different audience? The mindset people get into is, I have a finite set of resources, and they don't have a certain percent set aside that's not required to meet their performance goals but is there so they can test. I don't know if that's 10%, 20, 30; it depends on your budget and your unit economics. But that's where I find people have a hard time: "I don't have a budget set aside for experimentation, so I need every dollar to drive this average." Until they can separate themselves from that and say, okay, around 30% of my budget is not expected to drive results, I guess I would say. But then if I find something in there that does, it goes into the 70% bucket and I start to spend more money on it.
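The 70/30 split Jason describes can be sketched as a pair of helpers; the numbers and function names are illustrative, not anyone's actual planning method:

```python
def split_budget(total, experiment_share=0.30):
    """Split a budget into a core bucket (expected to hit the goals)
    and an experimental bucket that is allowed to fail."""
    experiment = round(total * experiment_share, 2)
    return total - experiment, experiment

def promote_winner(core, experiment, winner_spend):
    """When an experiment beats the baseline, move its spend into core,
    where it now counts toward expected performance."""
    return core + winner_spend, experiment - winner_spend

core, experiment = split_budget(100_000)            # 70000.0 core, 30000.0 experimental
core, experiment = promote_winner(core, experiment, 5_000)
```

The design point is the one-way door: experimental dollars carry no performance expectation, but anything that proves itself graduates into the core bucket, so the experimental share keeps refilling with genuinely unproven ideas.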
Peter Mahoney: The other thing to keep in mind is that you talked about going straight down the middle and getting the expected performance. But if you don't do some kind of optimization, you will likely see decay in performance over time, especially if you're continuing to market to the same audience. Kelsey, it reminds me of the book promotion we were doing, which has been our most successful promotion, a little plug for The Next CMO, the guide to operational marketing excellence, an awesome book. We were getting great performance out of our campaigns, and then at one point it started to decay, and we changed the ad color. That was the thing that revitalized it and made it pop. It's funny how that can work.

I also wanted to follow up on a question I hear a lot: how much should I set aside for experiments? There's probably a pretty big range, as you said, based on a lot of factors like your budget and your expectations, but 30% will feel like a big number to a lot of people. So if you're saying 30% is an experimental budget, what would your expectation be? Because you guys see a lot of clients; looking at your client page, you've got people like Drift and Pendo, so some pretty heavy digital marketers. Are they running that kind of scale of experiments? You don't have to tell me about those companies specifically. And when they do run a successful experimental program, what kind of performance boost do you expect to see?
Jason Widup: Yeah, oftentimes what we'll see is heavy experimentation upfront. A lot of our clients are coming to us because they want to figure out what works, and then they don't want to do a bunch more experimentation right away — they want to let that run for a while. You know what I mean?
And then we basically crack it back open when that starts to decay. On scale, what we try to do, especially when we're first onboarding a customer, is really plan out spend, and we talk about how we're going to pace it so that we can get answers the fastest and then bake those answers in.
But it's not an exact science yet, I would say. Some clients — given what they sell, their audience, how long they've been doing it, you know what I mean — might have different performance than others. But as a general rule of thumb, we try to get to that point with our clients where we're like, hey, let's talk about a certain percent.
And oftentimes it's lower than my 30. It's often around 10% of that budget that you want to continue to experiment with, so that you can find the next winner before you need it. And that's really the thing: don't wait until you need it to then go digging for it.
Take 10% of what you've got, always, and always be trying to feed new things into that mix. We find that our larger clients of course have a lot more content, right? Our larger clients have more spend, more content, and so it's a lot easier and faster for us to figure out what works for those folks.
For folks that are a little smaller, it might take us two or three months to get to that point where we really understand what works, because we can't pump enough budget or content through the platform. Oftentimes they have smaller budgets, but also smaller teams. And that's something we're actually trying to figure out: how do we solve that for some of these smaller clients of ours?
And what we're doing is we're actually building playbooks now. Like you mentioned earlier, we see across a hundred-plus customers what works, what doesn't, what creative has worked for them. We're going to pull together all of our learnings from the last couple of years and then try to help new customers, and even existing ones, with these playbooks — we just know this works, because we have proof that it does.
Peter Mahoney: That makes a lot of sense. And it's great that you've started to write that down, because there's a huge thirst for that kind of information among marketers — the landscape of what's possible and what works in marketing is changing all the time. So it's great to be able to
give people some guidance around those practices. So one thing we're going to —
Jason Widup: We're going to publish them too. We're just going to let them out the door for everybody. So yeah.
Peter Mahoney: I think that's a great approach. And we think the same way — the idea of creating content and helping people
get better at their jobs. People will remember that, remember your brand, come back, and appreciate it. So, a lot of people, if they're doing any kind of optimization, typically do it within a channel. If they're using LinkedIn, they're going to use the LinkedIn tools;
if they're on Facebook or Google, et cetera, the same. So what's different about the approach of running these optimizations outside the individual digital engines? And I have 17 follow-up questions, but let's get to the first one.
Jason Widup: Yeah. So you mentioned the main thing. So a couple of things, As everybody knows, the platforms themselves have their own algorithms that they use to optimize your campaigns.
And so you can set it up a little bit based on what you're trying to do, what goals you're trying to achieve, and your campaign will go through a learning mode — some learning time — and then they'll start trying to optimize. Our platform can actually take advantage of that. But obviously Facebook's optimization is good within Facebook, and LinkedIn's is good within LinkedIn; where it falls down is across channels — what's actually working for you across the two.
And in fact, for myself, because of the way that we build audiences in Facebook, which is very similar to how we build them in LinkedIn, we can get such targeted audiences on Facebook and then get the lower cost of that marketplace. So in my own campaigns, I end up optimizing almost out of LinkedIn for a lot of the stuff that I do.
A lot of times you won't see my ads on LinkedIn, because I've realized I can get to the same people at two-thirds lower cost — a third of the cost — on Facebook, and I can get the same conversions. Now, it's not the exact same audience; my match rate is probably 30% instead of 50 or 60%, but it's better than any other way I can match to Facebook.
And so that's really one of the bigger ones: it'll help you broker that budget across those two channels, because LinkedIn is really expensive. All of us B2B marketers are like, we have to do LinkedIn, because that's where they all are — but most of them are on Facebook too. You know what I mean?
Everybody's on Facebook. And by advertising on Facebook, you're also advertising on Instagram, so now you're pretty cross-channel. We also talk to people about budgeting some spend for Quora. Our platform doesn't automatically go to Quora, but we have optimizations, and the CSM can say, here are some Quora ads that we can run.
Google AdWords is another channel we're going to bring into the platform, so that it can actually see across those. And — this gets way too technical — it's also just the way that our AI is set up. Facebook and LinkedIn rely on groups of ads and things like that to optimize.
We actually inject each one individually as an individual experiment, so we can see different types of data from those individual experiments. And so we have other ways we can optimize based on how we pull the data out, even within a channel, not just across. We also allow our clients to put safeguards — thresholds — in the platform.
So you can think of it almost like a stock trading platform, where you can set a stop.
Peter Mahoney: Yeah. no runaway trades.
Jason Widup: Yeah, exactly. So you can set thresholds around: how much do I want to spend on an experiment if it doesn't generate at least a single lead — how far am I willing to go
just to see if it's going to work? You can put those limits in, and others — I can't remember all of them; there are lots of different limits. Obviously your daily spend and all that, your range for cost per lead, those kinds of things you can plug in. So it knows
a little bit more about you and how you want to optimize, not just how the system wants to do it.
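The stop-loss idea Jason describes — pause an experiment that spends too much without producing a lead, cap daily spend, bound cost per lead — might look something like this in code. This is a minimal sketch under assumed thresholds; the function, parameter names, and dollar amounts are hypothetical, not the platform's actual API:

```python
# Minimal sketch of spend guardrails for an ad experiment, in the spirit
# of a stock-trading stop loss. Thresholds and names are hypothetical.

def should_pause(spend, leads, max_spend_without_lead=500.0,
                 daily_spend=None, daily_cap=1000.0,
                 cost_per_lead_ceiling=200.0):
    """Return True if an experiment has tripped any guardrail."""
    if leads == 0 and spend >= max_spend_without_lead:
        return True          # spent the limit with nothing to show for it
    if daily_spend is not None and daily_spend > daily_cap:
        return True          # exceeded the daily spend cap
    if leads > 0 and spend / leads > cost_per_lead_ceiling:
        return True          # cost per lead is out of range
    return False

print(should_pause(spend=600, leads=0))   # True: no lead after the $500 limit
print(should_pause(spend=300, leads=2))   # False: CPL is $150, within range
```

The design point is the same one Jason makes: the limits encode how *you* want to optimize, rather than leaving every decision to the channel's own algorithm.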
Peter Mahoney: I imagine that if you take this approach of optimization outside of the proprietary channels, there's also the advantage of data ownership — you own the data — and the advantage of proprietary
data targeting as part of the optimization algorithm. So you can bring in stuff that's just not part of their thing.
Jason Widup: Yeah, that's where our audience part really comes in. We have this huge audience — 1.4 billion profiles — and then LinkedIn has their audience, and when you combine the two together, you get this super audience.
Facebook obviously doesn't have the same thing; you can do personal-interest queries and those kinds of things on Facebook, but we let people do B2B queries — industry, job title, et cetera. And we've figured out how to use our database as a personal-to-corporate identity graph.
Since everybody logs into Facebook using their personal email address, we have their business profile, and then we have their personal email addresses — all of them — attached to that business profile. When you create a segment that includes that person, we upload all of their personal email addresses to Facebook to guarantee the best match possible.
And we never send anybody those email addresses, of course, because those are all PII. But that's really how we create B2B audiences on Facebook and unlock that for our customers.
Peter Mahoney: Yeah. And how do you help your customers navigate the data privacy rules and the ever-changing GDPR landscape? It's complicated,
and it's one of those existential risks that we all think about as marketers. You certainly don't want to damage your reputation or run into some giant set of fines. So how do you think about that?
Jason Widup: Yeah. So we have actual data protection attorneys —
you know what I mean — that help us navigate all of that lovely content and all the rules. But what we do is we just don't expose it. We don't expose any PII to our customers, and so through that alone, the only thing they see is what they're querying for: industries.
They can actually query for people at a company, but they're not going to see the people. They're going to see this company with this job title, and they're never going to see the person's name or email address or anything else. And then we have everything in the background.
So if there were to be some kind of a breach, all of the PII we have is already hashed, so it wouldn't be able to be reconstituted — I guess that's the right word. And you have to sign all these data protection agreements; we try to take on as much of that responsibility for our clients as we can, so they don't have to worry about it. But just legally, we have to sign data transfer agreements and data protection agreements to keep everybody, I guess, legally safe. We haven't had any issues, which is nice.
We don't have a lot of European clients yet, but we have US clients that target people in Europe, and yeah, it's no problem.
Peter Mahoney: Great. Maybe a couple of quick questions — I know we have to wrap pretty soon. Do you have any real, tangible advice: what's the one thing that every data-driven marketer should be doing right now that they may not be doing?
Jason Widup: I would say if I had to choose one thing,
Peter Mahoney: You can pick two, if you want. That's like a bonus.
Jason Widup: I would say
Focus on who you're targeting. Really get clear about how you build your target account list. But it's not only how you build the target account list — and I'm going to take one minute on this — you have to consider your unit economics. The challenge I see at a lot of companies is that they define their target account list based on all the things they know would make somebody right for the product,
and they have no care or concern about the size of that target account list. Then they try to create a marketing plan that matches it, and they realize: I can only serve each person in that account half of an ad, because I didn't do the math to figure out how many target accounts I can actually advertise to.
So that's the thing I'd say. I see it with almost every one of our clients that I get into and help; I try to work them back up from that unit economics. Just make sure you understand how many people you can properly market to — especially if you need five touches or so from somebody with no awareness — and make sure you have enough resources and that your list isn't so big that you're
serving half an ad to everybody.
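Jason's "half an ad" math can be made concrete with a quick back-of-the-envelope check. All the rates and counts below are hypothetical examples, chosen only to illustrate the kind of arithmetic he's recommending:

```python
# Back-of-the-envelope check: how many target accounts can this budget
# actually cover with the full number of touches? Numbers are hypothetical.

def max_accounts(budget, contacts_per_account, touches_needed, cost_per_touch):
    """Number of accounts that can receive every needed touch."""
    cost_per_account = contacts_per_account * touches_needed * cost_per_touch
    return int(budget // cost_per_account)

# Example: $50k budget, 5 contacts per account, 5 touches each, $2 per touch.
supportable = max_accounts(50_000, contacts_per_account=5,
                           touches_needed=5, cost_per_touch=2.0)
print(supportable)   # 1000 accounts; a 10,000-account list would get only
                     # a tenth of the needed touches — the "half an ad" trap.
```

Running the check before building the plan tells you whether to shrink the list or grow the budget, rather than discovering the mismatch mid-campaign.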
Peter Mahoney: Totally makes sense. That's great. we'll include some, links and information to learn more about metadata.io. I really appreciate you sharing your expertise, here with us. Jason, this has been really fascinating. Some really good tangible stuff. I think Kelsey has one more question and then we'll wrap up.
Kelsey Krapf: Yep. So Jason, I'm interested to hear your perspective as someone who is marketing ops by trade: what advice would you give to CMOs, or those aspiring to be one? Because there's a massive trend of marketing ops leaders becoming CMOs nowadays.
Jason Widup: Yeah. Here's what I'd say: I've seen two types of CMOs —
there are not just two, but there are the really data-driven ones, and then the ones that are like, I don't even want to see the data, I don't even know what to do with it. To the ones that are very data-driven, I'd say: don't go too far down that path, because there's still so much marketing that we do that is not attributable —
that's not a direct line to results. If you go all the way to that side of the pendulum and you're like, everything we do has to have an ROI — that's not great; that's just too far. You've got to be able to have things that are good for the brand, good for your potential prospects, things that may not be measurable — like word of mouth.
How are you going to track that? But you're still going to try to do word-of-mouth programs and influencer programs and stuff. So don't get so mired in "I need to see one plus one equals two" that you basically select yourself out of a lot of other things you can do.
Peter Mahoney: Great, excellent advice.
Really appreciate it. And with that, Kelsey, I think we're ready to wrap, right?
Kelsey Krapf: Yes, we are. Thanks so much for your time today, Jason — really appreciate it. Make sure to follow The Next CMO and Plannuh on Twitter and LinkedIn, and if you have any ideas for topics or guests, you can visit our website or email them to thenextCMO@plannuh.com. Stay tuned for The Next CMO community launching soon, and have a great day, everyone.
Peter Mahoney: All right, thanks Jason.