In the hyper-competitive world of e-commerce, how do the top performing brands drive continuous improvements in conversions and revenue?
This week on The Inbound Success Podcast, Dexter Agency CEO Joris Bryon talks about the importance of A/B testing, and why small improvements to your website can drive big increases in revenue.
From the process he uses to identify which website pages need to be optimized to how he determines what aspects of the page are underperforming, and the nitty-gritty details of setting up and running tests and user surveys, Joris lays out, step-by-step, a process anyone can use on any type of website to improve conversions and pipeline.
Check out the full episode to get the details. (Transcript has been edited for clarity.)
Kathleen: Welcome back to the Inbound Success podcast. I'm your host, Kathleen Booth. Today, my guest is Joris Bryon, who is the founder and CEO of Dexter Agency and the author of Kill Your Conversion Killers. Welcome to the podcast, Joris.
Joris: Thanks, Kathleen. Great to be here, actually.
Kathleen: I am so excited to talk to you, but I have to start with a question. Kill Your Conversion Killers, is that kill or be killed, conversion-rate optimization style?
Joris: Actually, it came from the baseline we had when we started the agency. So it was Dexter Agency. And Dexter, the serial killer who kills ...
Kathleen: Oh, yeah. I didn't even think about that [crosstalk].
Joris: Serial killers. Yeah. And that's where it came from. And conversion killers is a thing, actually. So, we kill conversion killers, and it's a bit of a play on words. And that's where the title came from and the baseline for the company as well. Yeah.
Kathleen: I love it. So speaking of the company, maybe you could just briefly introduce yourself to the listeners and talk a little bit about your background and what Dexter Agency does.
Joris: Sure. So, I've been in marketing for 20 years now, and I started my career in traditional advertising agencies and did that for about 10 years. But I got fed up with the typical discussions you have with clients, like, "Make this blue, make this red, put this on the left, put this on the right," that kind of stuff based on nothing. Then I discovered online marketing and started learning about SEO and PPC. I went to work for an agency as well. There was a stop in between where a company had failed, and then I went into digital marketing. Anyway, at that digital marketing agency, I learned a lot, and at one point, I fell in love with conversion optimization. That's how it all started. I learned everything I could about conversion optimization, and first I tried to implement it in the agency where I was working. It was a great agency, but back in the day, conversion optimization was still pretty new, and there weren't any clients prepared to start doing it, so I had to venture out on my own. I started out as a freelance CRO consultant, and that grew into an agency. And yeah, here we are now, doing this for six years already.
Kathleen: And I saw in my notes that you have done over 1,500 A/B tests. So, you have a ton of data that you bring to this conversation, which I love. Conversion optimization obviously is a broad topic. We're going to focus specifically on e-commerce, which is an area that historically we haven't talked about a ton on the podcast. Although, I've started talking about it more lately because selfishly I'm working in the e-commerce area. And so I'm really interested in learning more, and so I'm excited to dig into this with you. Let's just start with a background on conversion optimization because I don't know that everybody fully really understands what it is, why you should be doing it, how it works. Give me just a really quick summary on that.
Joris: Yeah. I'd say conversion optimization for me is trying to make more from what you already have. You already have visitors; try to make more out of the visitors you already have. They already buy from you, so why don't you try to increase the average order value? And you have them as customers, so why don't you try to sell to them again? For me, that's conversion optimization. It's basically working with what you already have. I know there's definitions out there that focus entirely on conversion rate optimization, but I think that's too narrow. It creates wrong expectations. I don't think conversion rate optimization is a great name for the discipline, as such. So, if I have to say something about it, I usually say conversion optimization as you do rather than conversion rate optimization because that creates false expectations.
Kathleen: Yeah. I spent a little over a decade as the owner of an agency, and I used to always talk about this and frame it as if you want to double your revenue and you look at a traditional marketing funnel, there's two ways to think about it. You could say, "Well, I have these conversion rates and this number of visitors and this number of leads. If I want to double the number of visitors who ultimately turn into customers, I can double my traffic, and if the conversion rates all stay the same ..." And I'm going back to conversion rates right now, so I'm deviating a little from what we just said. But I think it's a helpful rubric. You can either try to double your traffic and just stuff twice as many people in the top of your funnel, hoping that it produces the same outcome, or you can work on converting more of the people that are already coming into your funnel. And in a perfect world, you're probably doing a little bit of both, but the reality is that the fastest path to more revenue is the second thing that you focus on. It's much harder and longer-term effort to double your traffic than it is to double the number of leads you're getting that ultimately turned into customers. And I'm sure the same is true of repeat purchasing and things like that.
Kathleen: When you talk about ultimate impact on the business, trying to squeeze more juice out of the orange you already have is always a better approach in the short term, certainly, than trying to grow more oranges.
Joris: No, absolutely. And I think a lot of business owners are so focused on traffic that they forget there are other ways to double their revenue. And I get it. In the beginning, the fastest way to grow is adding more traffic, especially PPC. If you pay for that traffic, it's going to get you quick growth. But at some point, you'll hit a plateau, and it's going to get harder and harder to attract relevant traffic. You can dump traffic on your site, but if it's not relevant traffic, why bother, and why pay for it? And what I feel is, by then, most business owners are so stuck in a traffic mindset that they look for ways to still make it work. Maybe they try something new, some new campaign or some new channel, or they fire their agency and work with another agency, whereas they miss out on the opportunity of working with what they already have and improving that instead. I think one question that helps is, do you want more traffic, or do you want more revenue? And when you put it like that –
Kathleen: That's a pretty easy question to answer, I would hope.
Joris: Yeah, yeah. Right. And that's a bit of an eye-opener, but most business owners are so stuck in a traffic mindset, whereas there's a lot of potential in increasing the other levers. I think for e-commerce, the formula that I always use is revenue equals your traffic times your conversion rate times your average order value times your purchase frequency. There are only four levers you can use to grow your e-commerce business. There's nothing else. If we're talking beyond your online store, you can start selling on marketplaces, but that's a different story. If we're working on your online store, it's still those four levers. Yet, most companies focus only on traffic, and they miss out on the other three levers, whereas if you look at that formula and you can increase those three other levers by 30% each, which is pretty doable, then you double your revenue. And doubling your traffic is sometimes going to be very, very hard. For me, it's sometimes a mystery why people get so stuck in a traffic mindset when there are other levers that you can pull.
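Joris's four-lever formula is easy to sanity-check in code. The figures below are invented for illustration (they're not from the episode); the point is that a 30% lift on each of the three non-traffic levers compounds to more than double the revenue:

```python
def annual_revenue(traffic, conversion_rate, avg_order_value, purchase_frequency):
    # Revenue = traffic x conversion rate x average order value x purchase frequency
    return traffic * conversion_rate * avg_order_value * purchase_frequency

# Baseline store (illustrative numbers)
base = annual_revenue(100_000, 0.02, 60.0, 1.5)  # $180,000

# Lift the three non-traffic levers by 30% each; traffic stays flat
lifted = annual_revenue(100_000, 0.02 * 1.3, 60.0 * 1.3, 1.5 * 1.3)

print(f"{lifted / base:.3f}x")  # 2.197x, i.e. 1.3 ** 3: more than double
```

The multiplier is the same whatever the baseline numbers are, since the traffic term cancels out of the ratio.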
Kathleen: Well, and I think everything you just said honestly applies to, really, almost any type of business. And in fact, I just had this conversation last week. I have a weekly marketing meeting with my team, and we had been tracking traffic and conversion rates and all that stuff. But we're also tracking marketing source, pipeline and revenue. And it was really interesting meeting because I've been feeling lately like I'm beating my head against the wall trying to increase traffic. And it isn't working as quickly as I would like it to, but my marketing source revenue is really good. And so I finally said, "You know what? I'm not even going to report on traffic anymore," because, clearly, it's not a good leading indicator for what really matters. And I don't want to keep pouring a lot of time and energy into changing a number that isn't going to get us necessarily where we need to go. It's not that I'm not going to ever work on traffic anymore, but I do feel like as a marketer, I could let it really eat away at me when it doesn't need to. So I think your point is really well taken.
Kathleen: But what I want to start with on this topic is, how do you know if your conversion rates aren't good? How do you know if you're functioning in a way that there's real low-hanging fruit from conversion optimization? Because I think a lot of the marketers I've talked to in principle are fans of it. Everybody says, "Sure. We should all optimize our conversions to the greatest extent possible." But I do feel like there's this feeling out there that, well, I already have a pretty well-running system, so why should I invest in that? So how do you look at your existing funnel, your existing business, and identify whether that is the right thing to invest in and when it is the right time to invest?
Joris: Yeah. That's an excellent question. I think, first of all, I've never seen a site that cannot be improved. We've always made our clients a lot of money; you can always improve. So, never assume that you're at the top of your game and cannot improve anymore. The second thing is to always look at the data. For me, that's Google Analytics. Look at the data and try to figure out where you're losing money. It's really about finding those areas on your site. Maybe there's a huge drop-off on a certain page. Look at bounce rates as well. If you drive a lot of traffic to a certain page with very high bounce rates, start there. So, look at those numbers in Google Analytics, and they'll tell you where you have opportunities. And whenever you can, try to also put a number to it in terms of dollar value.
Joris: Let's say you have a cart page on an e-commerce site with a 50% drop-off. So, people reach the cart page, and 50% drop off. There's always going to be a drop-off. But what you ask there is, what if we can get it up to 60% going through to the checkout, so only 40% drop off? What would that represent in terms of annual revenue? Then you can look at, say, checkout pages, see what the drop-off is there and what a more normal level would be, and try to put a dollar value on it on a yearly basis. Then you know where you have to start, because if fixing one page could get you an extra $200,000 a year and fixing another an extra $1 million a year, you know where to start and can dig further to understand why that is happening. So, we always start with where it's happening, and then you try to figure out why it's happening and solve that problem, basically.
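Joris's "put a dollar value on it" step can be sketched as a quick back-of-the-envelope calculation. Every figure here is hypothetical, and `downstream_conversion` is a simplifying assumption of ours (the share of shoppers who, once past the cart, actually complete checkout):

```python
def annual_value_of_fix(annual_cart_visits, current_continue_rate,
                        target_continue_rate, downstream_conversion,
                        avg_order_value):
    # Extra shoppers pushed through the cart step each year...
    extra = annual_cart_visits * (target_continue_rate - current_continue_rate)
    # ...times how many of them complete checkout, times what they spend
    return extra * downstream_conversion * avg_order_value

# Cart page: 50% continue to checkout today; what if we got it to 60%?
value = annual_value_of_fix(
    annual_cart_visits=500_000,
    current_continue_rate=0.50,
    target_continue_rate=0.60,
    downstream_conversion=0.40,
    avg_order_value=75.0,
)
print(f"${value:,.0f} per year")  # $1,500,000 per year
```

Running the same calculation for each leaky step gives you the ranked list Joris describes: fix the $1 million leak before the $200,000 one.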
Kathleen: I like how you frame this. If I look at my Google Analytics and I find a page that has a really high bounce rate, in this case, maybe it's my cart page, I've identified where. And then how do I go about identifying why? What's that process look like?
Joris: So, there are different research methods that you can apply here. One of the most effective ones I find is user testing. Basically, you give a couple of assignments to regular people, and they have to complete those assignments while commenting out loud on what they're doing. If you're doing that remotely, you'll get a screen recording of it as well. You see them moving through your site trying to do the assignments you gave them, commenting out loud, and it's going to give you a lot of insights. It doesn't always have to be that kind of setup. You could just ask someone random, I don't know, in a Starbucks, pay them with a coffee, and say, "Hey, do you want to take five minutes, go to my site, and tell me what you think?" Something like that. Just try to get feedback from people who actually use the site. That's usually going to be one of the most valuable things.
Joris: Obviously, we look at it ourselves as well. We are conversion experts, so we know what might be issues. But you always have to test it and never assume that you're right, because even as experts, we sometimes get it wrong. What works on one site doesn't necessarily work on another site, and I know a lot of people don't grasp that idea. They think, Oh, it should work on every site, but that's not the case. So you can look at click maps and scroll maps. You can watch visitor recordings so that you see people moving through the site. There's a bunch of research methods that you can use to get qualitative feedback. Another very good one is on the checkout. Basically, on a thank-you page, you can trigger a survey and ask a couple of open-ended questions so that you get valuable, qualitative feedback. Because that's the thing: it's not always about percentages and the hard data that you find in analytics. You have to understand the why. So the qualitative feedback is going to get you so much further than the hard data alone. You really have to try and step into the minds of your consumers. That's where tools like user testing and surveys come in and can give you very valuable insights.
Kathleen: I love those little surveys. And I know there's a lot of tools out there from Hotjar to Lucky Orange and platforms like that that you can use to create them. But I've tried using them before, and I don't get many submissions. And so I'm wondering, what is a typical survey response rate? And are there any ways to increase the odds of getting somebody to actually fill it out?
Joris: So, there's two ways to do it. You can set it up before someone checks out, so before they buy. In that case, it's visitors, not customers, and that's different. Your response rate is going to be lower than when you do it after the checkout, for instance, or when you send them an email with a questionnaire. In those cases, you're going to get a lot more responses. If you trigger that Hotjar pop-up for any visitor, it's really going to depend on the quality of your traffic as well, and on the timing of the pop-up, because I often see those pop-ups being triggered after five seconds. I don't even know what you do yet, and you're already asking me what my experience is on the site.
Kathleen: How long do you think it should be? Or should it be on exit?
Joris: Well, it could be an exit, for instance. If you look at the average time on site in Google Analytics, that's when you know it's probably going to be a somewhat qualified visitor already. They've spent some time on the site. And I would start there and trigger it there, and then you can start playing around. Another thing that helps –
Kathleen: Wait, I just want to make sure I understand what you just said. Do you look at Google Analytics for that specific page and see how long people spend on that page or the overall session length?
Joris: Yeah, session length. So, basically, if you see the average session length is one minute 27 seconds, then you trigger it at one minute 27 seconds to start with, and you see what happens. You'll get much better feedback too, because people who bounce usually aren't interested anyway. It might be a mismatch, so it's not only about the number of responses you get but also the quality of the feedback. So, you've got to play around with those things. There's no one formula for it, just a couple of criteria that you can use and play around with. Another thing that you can do is ask two questions, where the first one is a yes-or-no question and the second one expands on it with an open-ended question. What happens is that people want to be consistent, so if they've clicked on yes or no and then you ask them a follow-up question, you'll get better feedback. More people will fill that out. So that's another good approach to test.
Joris: Never make it longer than that if it's one of those pop-ups before someone buys. If it's after the checkout, you can ask more questions. People will be more involved already. So, I don't have a hard answer there. It should be this percentage, but you just have to try and improve what you already have. I think that's the main message here because it can vary wildly. Also, if you have a site with a lot of diehard fans and you have a good, interesting brand and a lot of brand fans, you're going to get much more response. Yeah.
Kathleen: No, that's good feedback. I like the idea of looking at session length. That certainly is a little bit more scientific than just randomly picking a number of seconds. So, I have a bunch of questions on this topic. I've been in the situation before where I know I have a page that has issues, and it's that feeling of like, Oh my God, where do I start? And you can certainly go to user testing, and there are other things like that. But I also know there's some gold to be mined within Google Analytics. And sometimes the issues you have have to do with devices or browsers. So how do you approach sussing out whether the issue is something that's on the page or whether it's more of a device or browser issue?
Joris: Yeah, that's an excellent question. And I think it's a matter of keep asking the right question. So when you see a number that looks off, then you have to think, Oh, OK, is this only on mobile? This is on desktop as well. Is this on a specific browser? So, you start checking all those reports. There's no one way to go about it. You start very high level to see if there's some numbers that seem a little bit off, and then you start digging and look at segments and apply other reports and that kind of stuff. And it's a matter of asking the right questions and being curious and really trying to figure it out. And you can spend a lot of time in Google Analytics before you find something that is off and find a reason for it.
Joris: It could be bot traffic, for instance, as well. If you see very high bounce rates, then you might want to look into, is it a specific browser or even a browser version causing this? Is it only on the homepage? Is all the traffic going to the home page? Do you see 98% bounce rate or 99% bounce rate? It's probably a bot. If that then comes all from one location, it's very likely it is a bot, and you have to exclude that traffic and it's just skewing your data. So, you really have to keep asking questions to find the answer. And I start at a high level and then dig deeper.
Kathleen: So wait, the bot traffic thing is fascinating to me. What do you do about it? How do you fix that?
Joris: You can exclude some of the bot traffic if it's really clear. If let's say, it's all from one particular location, you could –
Kathleen: Meaning one IP address?
Joris: Yeah. You could just exclude that. Or sometimes you see bot traffic coming in on, let's say, an old Internet Explorer browser version. If the only traffic coming in on that browser version is from the bot, you could exclude that particular browser version as well. You try to find a unique identifier for the bot, and then you exclude it. Sometimes it's just impossible to do, but at least give it a try and see if you can find one unique identifier that you can filter out.
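Joris's "find a unique identifier and filter it out" approach can be sketched on a toy session log. The field names and data below are entirely invented for illustration; in practice you would apply the same logic as a filter or segment inside your analytics tool rather than in a script:

```python
from collections import Counter

# Toy session log; in practice this would come from an analytics export
sessions = [
    {"ip": "203.0.113.7",  "browser": "IE 6.0",      "bounced": True},
    {"ip": "198.51.100.2", "browser": "Chrome 120",  "bounced": False},
    {"ip": "203.0.113.7",  "browser": "IE 6.0",      "bounced": True},
    {"ip": "192.0.2.9",    "browser": "Firefox 121", "bounced": False},
]

# Flag IPs where every single session bounced, the near-100% pattern Joris describes
bounces_by_ip = Counter(s["ip"] for s in sessions if s["bounced"])
totals_by_ip = Counter(s["ip"] for s in sessions)
suspect_ips = {ip for ip, n in bounces_by_ip.items() if n == totals_by_ip[ip]}

# Exclude the suspect traffic before analysing the rest
clean = [s for s in sessions if s["ip"] not in suspect_ips]
print(suspect_ips, len(clean))  # {'203.0.113.7'} 2
```

The same grouping works for any candidate identifier (browser version, location, landing page); whichever dimension isolates the bot cleanly is the one to filter on.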
Kathleen: And then, I've always been taught with any conversion optimization or any A/B test you only change one thing at a time. So, walk me through when you're trying to really optimize a specific page. You pick one thing, and how long do you let that experiment run? Is it a matter of time? Is it a matter of volume of page visits? How do you know when that experiment is up and it's time to move to the next thing? Because I'm assuming these all layer on top of each other. You're testing multiple things sequentially, correct?
Joris: Yeah. So, typically, we test on several pages at the same time. So you could have a test running on all product pages. We don't test on one product page, but on all product pages at the same time, [on] collections pages or category pages, one on the cart, one on checkout pages. So, you could have those running simultaneously. But you have to have enough data before you call it. To keep it simple, it's more complex than that, but what we usually look for is at least 300 to 400 transactions per variation. So, you need quite a bit of traffic to pull it off. And I prefer to put that number in transactions rather than in traffic because, ultimately, that's the main conversion that you're tracking. So, you need at least 300 to 400 transactions per variation, and you always have to let it run in increments of seven days. The reason for that is you can see a big difference: conversion rate on a Thursday night could be totally different than on a Sunday morning. And a variation could work better on a Thursday night because it triggers something in your consumer that is relevant at that point, but not so well on a Sunday morning, or the other way around.
Kathleen: Right. The type of person who shops on Thursday night might be completely different behaviorally than the type of person who's shopping Sunday morning.
Joris: Absolutely. Yeah. Especially if, for instance, you're going to buy something you need on Saturday, so you need it to be delivered on Friday. Then you're going to decide a lot quicker than on Sunday morning. So, there's different times of the day, different days of the week that have different conversion rates and different behavior from your consumers. So, always run a test in increments of seven days. If you have enough data after five days, sit it out. Wait until you have a full seven days. If you have enough data after eight or nine days, too bad, you have to wait 14 days. We had one client where we joked about it because on Friday nights, it was always like, Oh, this variation seems to be winning, and on Monday morning, it was totally flipped. So the behavior ...
Joris: ... on the weekends was usually completely different there. So, it's really something to be very strict about. And if you do this for clients and they look over your shoulder into the A/B testing tool, you have to educate them, because they're going to say, "Yeah, but we have enough data. This version wins. Let's implement it." No. Just be calm. Let's give it some time, because you don't want to implement something that ultimately ends up being a loser. That's going to cost you money.
Kathleen: So this is interesting to me. And I started thinking about this as you were talking because you mentioned doing something on a category page or product page, a cart page. Any given customer, and I'm not telling you anything you don't know, goes through a journey. And when they're on their path to purchase, even in an individual session, they're going to visit a lot of different pages on your site. And so I already mentioned once changing only one thing. So, if you're doing an experiment on your cart page, you pick one thing to change and you test it. But how does that reconcile with the fact that a customer has this journey, they're visiting all these different pages on your site, if you have experiments running on your category page, your product page, there is a chance that that person in their single journey to purchase could encounter three different experiments at once on three different pages. It's one thing to say we're testing a change on the cart page, but if there are experiments going on three, how do you know what really led to that purchase? I feel like that's where it starts to get complicated. How many things can you have going at one time on different pages?
Joris: That's a very good question, and it's something that a lot of people struggle with. The easy version is that as long as you divide the traffic equally for every test, there's no issue. Let's say you have two tests running, one on a product page and one on a cart page. Then 25% of your traffic is going to see version A on the product page and version A on the cart, 25% is going to see version A on the product and version B on the cart, 25% is going to see version B on the product and version A on the cart, and 25% is going to see version B on both. It's hard to follow and very hard to explain without a visual, but the bottom line is that if you divide the traffic equally, there's no issue.
Kathleen: So, it basically allows you to create cohorts, and each cohort has one single experiment running for them.
Joris: Yeah. You could also double-check all of that in analytics, for instance, by making a segment. So, anyone who's seen version A here, if you suspect some influence from one to the other, then you can basically make those segments in analytics and see if there's a different behavior and a different outcome. So, you could double-check that. Apart from that, there's also something like multi-page experiments. So, let's say you want to test something that is on several pages at the same time. A typical one would be the navigation or the footer or a benefits bar with all those USPs you typically see on an e-commerce store. That's something that is across pages. That's not an issue either because it's very consistent across all those pages. So sometimes you want to set up a multi-page experiment as well. If someone does something on the product page and this needs to happen on the cart page, that could be an option as well. But I'm making it a bit complicated. So, it really depends on what you're testing, but if it's two separate tests and you split the traffic evenly, there's no issue.
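The cohort math Joris describes follows from independent random assignment: if each test splits its traffic 50/50 on its own, the four combinations each end up with about a quarter of visitors, so the cart test's effect washes out evenly across both product-page arms. A quick simulation (illustrative only, not from the episode):

```python
import random
from collections import Counter

random.seed(7)
N = 100_000
cohorts = Counter()
for _ in range(N):
    product_arm = random.choice("AB")  # test 1: all product pages
    cart_arm = random.choice("AB")     # test 2: cart page
    cohorts[product_arm + cart_arm] += 1

for combo in sorted(cohorts):
    print(combo, round(cohorts[combo] / N, 3))  # each combo lands near 0.25
```

This is also why the analytics double-check Joris mentions works: segmenting by "saw version A of test 1" still leaves a balanced 50/50 split of test 2 inside that segment.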
Kathleen: So, as I listen to you talk about this, I'm fascinated because this is not my primary area of expertise. But I'm also a little intimidated because it's starting to sound like you a) could potentially need to be really, really well-trained in this, and b) it sounds like it could be a complex tech stack to support this, having these multiple experiments running with different cohorts going through your site. Can you talk me through really what does it take to pull this off?
Joris: Yeah. So, in terms of tech, Google Optimize works perfectly fine as a tool to get started. It's free, and we still use it a lot for our clients. There are other tools out there as well, but don't spend any money if you can do it with Google Optimize. So, that's the tech side of things. For the rest, it's a matter of understanding a couple of best practices in terms of how to set up a test (letting it run in seven-day increments, letting it run long enough) and how to analyze those tests. I describe all of that step-by-step in my book without making it needlessly complex, because what I find is that the whole CRO community likes to make things complicated, sound very high level, and put a lot of statistical stuff in there to look smart.
Kathleen: Yes. And it makes me feel really dumb.
Joris: I think that's a mistake on the part of the CRO community, because we alienate the people we do this for when they don't understand it. And it is complex. There's a lot going on when you do conversion optimization. At the end of the day, you need to know a lot about design, copywriting, psychology. It takes research, all that kind of stuff. So, it is a hard discipline to grasp, there are a lot of misconceptions about it, and clients don't always understand it. And then, on top of that, the CRO community tends to make it look much harder than it actually is. That was one of the reasons I wanted to write my book: to make it as pragmatic as possible without ... . My objective was not to make myself look smart but really to help people out. And usually, people in the CRO community want to look smart. I had a discussion about that the other day on LinkedIn, where someone was attacking me because I didn't share all the statistical stuff behind an A/B test. And I think that's not always needed. It's about inspiration, about opening people's eyes. [inaudible] Obviously, that's our duty as well. But if we use all that statistical stuff around it, we scare people off. And as you said, they feel dumb.
Kathleen: Yeah, and then they check out.
Joris: Yeah, absolutely. And we're marketeers. We say it to our clients. Hey, you have to understand your clients and speak their language, and then we don't do it ourselves. So there's a big gap there, I think, in the CRO community and a big responsibility in the CRO community as well. But most of the CROs, they just want to look smart, and that's a mistake, I think.
Kathleen: Well, I appreciate that, that you are focused on making it accessible, because I will be the first to admit that sometimes I read this stuff, and I think I just must not be smart enough to fully understand it. So, I want to do a few rapid-fire questions for you on this theme of debunking myths and making it more accessible to people. The first one is, and you alluded to this earlier, why should people be running A/B tests? Why can't they just read a best practice somewhere and do it on their site or look at what their competitor's doing and do it on their site?
Joris: We see that all the time. So there's something called best practices. I prefer to call them prototypical principles or common practices, because what we see is that when you test a best practice on a site, it may fail and cost you money. We've seen that many, many times. There's just no way of predicting it. Those are great ways to start. If you're just starting out, use those best practices by all means. But at some point, you'll have to start questioning whether this is actually working for you. And if you have the volume to test, then you definitely should start testing it. So the [phrase] best practices is very misleading.
Joris: The second part of your question, following what anyone else is doing, that's also thinking, first of all, they know what they're doing.
Kathleen: They're right, exactly.
Joris: So, you assume that. That's not true. Or you think like, Oh, I know they're [inaudible], but then you're assuming that they've already beat us at that, which may not be the case. If it's, let's say, a call-to-action, maybe they haven't even tested it yet, so it may not be working. It may be a problem for them. And then you're maybe implementing problems on your site. You don't know their data. You don't know anything, and you just assume that they are doing a good job. And you just implement it. And it's very dangerous to just follow what your competitors are doing. The only situation in which you can do it for me is when you see you have a problem and there's an issue and you're looking for inspiration on how do other people solve it or go about that problem and maybe work around it, and then you start testing different solutions. So, you can go out there and look for some inspiration, but don't just follow it and implement it.
Kathleen: That makes sense. Use it as the basis to inspire your tests as opposed to assuming that the test has been successful.
Kathleen: So, if somebody is listening and they're in e-commerce and they're thinking, OK, I'm willing to give this a shot, what are three things that you think they should start out doing A/B tests on?
Joris: Yeah. First of all, start doing the research. You'll know where your problems are. But where we often see a lot of opportunities is anything around value proposition on the homepage. If you don't have that yet, that's usually a good area to start. Then anything else that underlines your USPs, basically, and that gives people a reason to buy from you, and especially if those reasons are unique. We see a lot of value around that as well. And then what I think is usually a good place to start as well, product pages, anywhere in what I call the decision area. So, anything near the picture and the button, anything you can improve there is usually also a good place to start. But a product page in general usually is a place where you can make a lot of improvements.
Joris: But do the research first, and try to understand where your biggest problems are. I remember one time we were working for a client, and we were testing on the product pages. We just looked in Google Analytics, and we saw they had a huge drop-off in one of the steps in their checkout. So they were losing millions of euros a year in that step of the checkout, a Dutch client, hence euros, and they didn't even know it. And they were testing a product page, which looked acceptable at the time. So look at the data first. That's the best place to start.
Kathleen: So, on that note, look at the data first, and you mentioned Google Analytics. We've all got it. It's the one tool I think all marketers use. At least all marketers that listen to this podcast I'm sure use it. What do you think? If we're going to rely on the data there, we better be looking at it correctly. Are there certain mistakes that you find that marketers make when looking through Google Analytics that somebody should be aware of if they're listening?
Joris: Yeah. I think a lot of people focus on the wrong data. So, focus on data that are actionable, [inaudible] metrics. As a test of whether you're looking at the right data, always ask yourself, How can I use this number to improve my site or my marketing? If it's time-on-site, for instance, you can use that to trigger a Hotjar pop-up, but that's about it. If time-on-site increases, it could be that people are having a harder time finding what they're looking for. So you can't really act on that metric, yet some people look at it anyway. So, ask yourself that question. I think that helps because what I find is that most people don't have a lack of data. There's not a lack of data but a lack of insights. They get overwhelmed by the amount of data in analytics, and they don't really know where to look. So we always ask the question, How can I use that? And if you don't have an answer, then don't use those data.
Kathleen: Yeah. Your point is so good about session length and time-on-site. It seems like it would be good to increase it, but it's true. If somebody is frustrated and not finding what they want, they could have a long time-on-site or, what I find often happens, if they're a job seeker and they're reading all of your content. I'm in B2B SaaS, and the DIYers, the ones who are never going to buy from you, but they're like, I'm going to read all the educational content so that I know how to do this myself, those are the people that spend a lot of time on site, whereas somebody who comes in knowing they want to buy often has a very short time-on-site because they're high intent, going right to your contact us page and filling it out.
Kathleen: And it's funny because I used to look at this too when I had my agency. We would try to look at patterns of people who bought from you, how many pieces of content did they consume, thinking, Gosh, is there some rule where if somebody consumes more than 30 pieces of content, then they're a good lead? And I found actually it was almost like the inverse was true, that the customers that were converting consume very little content on site because they came in ready to buy. So, it's really interesting. And I do think you have to question the assumptions that you have as a marketer before you go and try to implement changes. Well, last question on conversion because I am just curious. What is a good conversion rate? And I guess we'll talk about e-commerce because that's where your expertise is.
Joris: Yeah. Absolutely. And it's a question I get a lot. And the only right answer is: a good conversion rate is one that's better than last month's. Just compare against yourself, because there are benchmarks out there, and people always want to look at benchmarks, but that doesn't really help you. Benchmarks usually mention something like 2% or 2.2%, but it depends on so many variables.
Kathleen: And is that for a visitor-to-purchase conversion?
Joris: Yeah, yeah.
Joris: Yeah. But it doesn't mean anything. If your average order value is $20 and you have a 3% conversion rate, you might think, based on a benchmark, that you're doing all right. But I would say your site probably sucks, because it's only a $20 average order value. You could probably get that rate up to 6%, 8%, or maybe even higher. And so you could be complacent about it: Whatever, it's 3%, I'm doing good. Whereas if your average order value is $1,000 and you're at 2%, then you're probably doing pretty well. Or you might be at 1% and thinking, Oh, I should be improving a lot; I can easily go to 2%, maybe more. [inaudible] a little bit difficult to get to 2%, but you could probably get it to 1.2% or 1.5%, and that could be a lot of money as well. So there are a whole bunch of factors, including the quality of the traffic that you drive to your site. So I would suggest you don't pay too much attention to benchmarks. And I get it. People want to have an idea: Am I doing all right or not? But basically, just look at your own conversion rate from the past and try to improve on that. The only good conversion rate is one that's better than last month's.
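Joris's point that a benchmark conversion rate means little on its own comes down to simple arithmetic: revenue is the product of traffic, conversion rate, and average order value. Here is a minimal illustration; the two store profiles below are hypothetical examples in the spirit of his $20 vs. $1,000 comparison, not figures from the interview.

```python
def monthly_revenue(visitors, conversion_rate, avg_order_value):
    """Revenue is a product of three levers, so a 'good' conversion
    rate only means something alongside the other two."""
    return visitors * conversion_rate * avg_order_value

# Same traffic, very different businesses:
low_aov = monthly_revenue(100_000, 0.03, 20)      # "beats" a 2% benchmark
high_aov = monthly_revenue(100_000, 0.01, 1_000)  # "fails" the benchmark

# The 3% store makes $60,000 a month; the 1% store makes $1,000,000.
```

Which is why the only benchmark worth chasing, as Joris says, is your own conversion rate from last month.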
Kathleen: That's great advice. All right. We're going to shift gears because I can talk with you about this forever, and I feel like I'm getting a Masterclass here. But I have two questions I always ask my guests. And I'm just curious about your viewpoints on this. The first is, of course, this podcast is focused on inbound marketing, and the definition of inbound has evolved quite a bit. And so I really look at it as anything you're doing to attract the right type of customer to your business. So, when you think of it that way, is there a particular company or individual that you think is really raising the bar or setting a great standard for what it means to be an inbound marketer today?
Joris: Yeah. There's a couple of marketers that I follow, and I tend to follow people more than brands, although you could say they're brands too, just personal brands. When it comes to anything B2B and LinkedIn marketing, I very much follow Matthew Hunt, who's doing a great job. For anything small business and startup marketing-related, I really look up to Noah Kagan from AppSumo. He makes great YouTube videos that you just keep binge-watching, and he has a great style but also great advice. And when it comes to e-commerce, there are two people I tend to follow: Ryan Daniel Moran and Ezra Firestone. Ezra Firestone does a lot of inbound marketing the right way. He experiments a lot, invests a lot of money in trying new techniques, and then shares those findings with his audience. Ryan Daniel Moran is really good at building e-commerce brands and making it less complex, because we tend to overcomplicate things, and he helps you focus on what really matters.
Kathleen: Oh, interesting. I'll have to check those out. Those are some new ones for me. All right, second question. The marketers I've talked to consistently say that one of their biggest pain points is just keeping up with everything that's changing in the world of digital marketing. How do you personally stay educated? Are there certain sources of information that you really rely on to stay on top of your game?
Joris: Yeah, it's hard to stay on top of the game. Fortunately, conversion optimization is pretty evergreen. New things emerge, new tools, for instance, and obviously you want to follow all of that. But if you look at the global digital marketing game, it changes every single day. And there are a lot of marketers who suffer from shiny-object syndrome, I guess. You see something, you want to try it out, whereas I believe you've got to be consistent and try to find what works and give it a real shot. Don't try something today and then give up tomorrow. Sometimes you have to make a choice and stick with it for a while. I suffer from shiny-object syndrome myself, so I try to stay away from too many things that distract me from the real path that I want to walk.
Joris: In terms of spotting new trends, LinkedIn is my source of information. If my network shares something, it's probably going to be worth reading. So that's where I spend most of my time, and that's where I discover things. And when I see there's a topic that's really important to go deeper on, I just buy a course or read a book. I stay away from too many blog posts because it's so fragmented. You don't know who wrote it. You don't know if they know what they're doing. When it comes to courses, I think CXL Institute is a really good source of high-quality courses, so that's where I look first. Yeah, CXL for me is the benchmark when it comes to marketing courses.
Kathleen: Well, I am a huge fan of Peep Laja, so I definitely agree with that. He was an early guest on the podcast too, and he's doing great work and, in fact, when it comes to message testing, has a really neat new platform called Wynter, W-Y-N-T-E-R, that if you're listening and you haven't checked it out, you should, for sure.
Joris: Yeah, absolutely. For B2B, it's a great tool.
Kathleen: Yeah. I found it really useful in my experiments. Actually, it's funny. I recently interviewed Chris Walker from Refine Labs, and he talked about how at a certain point in your career, you really can't rely on outside educational materials to stay on top of your game. At some point, it has to be about doing your own experiments and doing the work and testing. And so I feel like as somebody in the field of conversion optimization, you have a leg up because that's literally what you do for a living. So, that's great. Well, if somebody is listening and they're interested in connecting with you or learning more, what's the best way for them to find you online?
Joris: Yeah. I'm pretty active on LinkedIn, so just add me there. You can also email me at email@example.com. And if you want to learn about this yourself, you can buy my book on Amazon, but you can also download a free PDF version at dexter.agency/free-book. If you want to get started on your own, it covers the things we talked about, like what the best practices are around A/B testing. So, that's all in there.
Kathleen: That's great. And I will put those links in the show notes. So, if you're listening and you're interested in connecting with Joris or getting a copy of the book, just head there, and you can hit the links and get all of that information. And if you're listening and you enjoyed this episode, you learned something new, please consider heading to Apple Podcasts or the platform of your choice and leaving a review. That's how other folks find us. And, of course, if you know somebody doing amazing inbound marketing work, tweet me at @WorkMommyWork because I would love to make them my next guest. That's it for this week. Thank you for joining me, Joris. This was so interesting.
Joris: Thanks for having me. It was great to be here.