Want to create a better user experience for your customers?
Start with user research. Before you develop a user experience design, you need to understand what your customers really want.
In this episode, Els Aerts joins us to explain what user research is, why it’s important, and how you can complete user research on a budget.
Louis: Bonjour, bonjour, and welcome to another episode of EveryoneHatesMarketers.com, the marketing podcast for marketers, founders, and tech people who are just sick of shady, aggressive marketing. I'm your host, Louis Grenier. In today's episode, we're going to talk about user research, why it matters, and how to do it on a small budget. I'm super happy that my guest today is the co-founder and managing partner of AG Consult. She's worked with organizations like Ghent University, the European University Institute, and the Red Cross to create their online strategies.
She also does user research for companies like Daikin, Atlas Copco, Bridgestone, and Orange to find out how their websites and online strategies can be improved. She has also developed a series of web-writing courses for commercial and non-profit organizations.
And finally, she shares her knowledge at conferences worldwide, in particular one called CXL Live, the ConversionXL conference, where she was rated the #1 speaker. And this is why she's on the show today. I always like to look at speaker ratings at conferences, because then I just steal their speakers; I know they'll be good.
I'm super happy to have you, Els, on board. Before I butcher your last name, I'm going to ask you to pronounce it, for the listeners.
Els: Here goes, Aerts.
Louis: Okay, so Els Aerts. There you go, I said it.
Els: Close enough.
Louis: Not good, but close enough.
Louis: So, what is user research?
Els: What is user research? That's a very broad question. I would define it as finding out what your prospects and what your clients really want. And what their emotions are, what their hopes are, what their fears are, what is their drive to either buy or not buy the product or the service you're selling. What is really in their minds? What is going on in their lives?
Louis: Why is it so important to care about those people?
Els: Because, basically if you don't care about those people, then I don't think you're going to do well as a business. I think it's pretty much common sense, and also just good business sense to care about the people that you want to sell to. If you don't know what really drives them, if you don't know what their problems are, what their expectations are of your product, then I don't think you will be able to sell it to them really well.
Louis: But surely, all that matters is profit, no? In this world.
Els: Oh, Louis. Stop breaking my heart, man! I mean, I know you're cold, but that cold? Really? No, profit is good. Profit is healthy. Profit is what keeps the business going. But really caring about your users, empathizing with them: one, it is good for profit.
And two, let's not be sleazy. Let's not be those guys. You know? So, no. If you have a good product, then getting to know your users; that's a great combination to really do well online.
Louis: So you said, let's not be like those guys. What are you referring to? What are those guys?
Els: Well, people who want to sell or market to you absolutely relentlessly, who will beat down your door with their offers. Who will not stop emailing you, even if you have made it very clear that you do not want their business.
People who will bombard you with all kinds of dirty pop-ups or overlays. And I have nothing against the occasional overlay if it offers me something based on what I'm doing on the website at the time. If it's offering me something based on research, something you know is of value to me, then sure, offer me something in an overlay.
But don't, yeah, be overly aggressive about it. I think the overly aggressive part is very often in the way--not just when the pop-up appears, although that certainly plays a part. I don't want to see a pop-up right when I enter a website: subscribe to our newsletter and get 10% off. I don't even know yet if I want your dirty socks.
Els: Also, the way the messages are worded, that can sometimes really make all the difference between a popup or an overlay that is appreciated by a user, and one that really leaves a bit of a nasty taste in your mouth.
Louis: Right, so just to be clear, I'm playing devil's advocate.
Els: You're not?
Louis: Trust me, at the end of the day, all that matters is people. A lot of people say that, and actually another question has come up--I wasn't planning to ask you, but it's popped into my mind. A lot of people, when we explain this point of view, whether in my business--I work for HotJar--or in my personal vendetta against shitty marketing--
Louis: --they say, yeah, it's all well and good, this vision of taking care of people and listening to them, but what if the business is in the shit and they're like, we need to make money or else we won't survive? It's not about people first in this instance.
Els: It is. It is, because that's the whole point. It's like that saying, you know the one: for your personal well-being, exercise at least half an hour a day; if you're super stressed, do it for an hour a day, because that will help you. If you are not doing well as a business, very often that is precisely because you are not listening to your potential customers.
It is because you are not paying attention to them. It is because you are trying to impose your will upon God knows who, and people will leave you in a hurry. I think that is a very old-school and simply backward way of looking at things. You can't force people to buy a product. You can't force people to like you. To make sure people like you, you have to invest in them as well.
Louis: And what I find to be true most of the time is that there is a strong correlation between how shitty a product or service is and how aggressive the marketing is.
Louis: The better the product, the less you need shady, aggressive marketing, because the product sells itself. Word of mouth is the first layer: things happen, people talk to each other, and as a marketer you just surf on that wave; you don't try to create a fake wave. The worse the product, the more you have to really force yourself to sell it. That's an interesting way to look at it.
Louis: We want to talk about user research and how to do it on a budget. But before that, I just want to ask you: could you describe the types of activities user research entails? Then I'll ask you about the biggest misconceptions, the biggest reasons people think, no, I don't want to do that.
Els: I think user research, depending on how broadly you define it, can entail a lot of things. You can do quantitative user research, and that can include surveys with closed questions, or looking at your Google Analytics. There's qualitative research, and for me, you can also do qualitative research with a survey, if you ask the right questions and if you ask open questions.
Of course, there is moderated user testing, and let's not forget heat maps and user session recordings. Those are, for me, all great tools for user research, like other online user testing methods. I'm a big fan of the Optimal Workshop suite of tools myself as well, more for information architecture research. They have tree testing and card sorting, and I love first-click testing and five-second testing.
Some of those are a mix between research and validating a hypothesis--especially those last two, I would say, the first-click testing and the five-second testing. All of the other stuff, depending on the kind of project you're working on and the stage the project is in, can be a very valuable and valid method of research.
Louis: When you talk about those methods--really extracting what people are thinking, qualitatively most of the time--what are the biggest objections or misconceptions people raise with you? One of them, maybe you weren't planning to mention it, is that people don't respond to surveys. That's something I hear all the time too.
Els: Yeah. I would say that people don't respond to bad surveys. We run a number of surveys. Basically every information architecture project that we start, and a lot of our conversion optimization projects as well, we start with what we call a top-task survey, because we want to find out who's visiting the website--the target audience--and what the purpose of their visit to this website is today.
We get response rates of up to 15% on this survey, which, if you know anything about online surveys, is not bad. But this completely depends on the site you're running them on. We get those rates on sites where there is a very good relationship between the visitor and the organization or company that runs the website. The bigger the fan base you have, the higher the response rates we see.
We see the worst response rates--and I'm sorry if there are any B2B marketers out there--on B2B sites. For us, for that type of survey, I'm happy if I see a 3.5% response rate. But for B2C--and we work a lot with cities and communities, non-profits--response rates can absolutely skyrocket, and there's a definite correlation with how your audience feels about you.
Louis: Any other misconceptions or objections that you hear people mention when you bring up user research as a way to grow the business?
Els: Yeah, a lot of people think you can just skip that step. They think that they know things about their customers--like they know a lot of socio-demographic stuff. Very often, that stuff doesn't really help you sell all that much because what you need to know about your audience is not so much whether they're 25 or 28, or whether they are male or female or whatever.
No, you need to know: Why is this person looking into the solution that I am offering them? What exactly do I need to tell them about my product, my service to convince them?
What are the fears that I have to take away? Who are my competitors? How can I position myself against them, for example? No, you really need to know a lot more about your audience than just the basic socio-demographic shit.
Louis: Is there anything else that springs to mind before we go into this step-by-step together?
Els: Oh my God, dude. I open one of my talks at Conversion Hotel, in the Netherlands--which is a great conference, by the way; it's coming up in November--with a tirade against people who discredit moderated user testing. This is also something that, well, the sad thing is, I kind of understand: when you do user testing badly, it's a crap method. But the point is--
Louis: Can you define what it is, because not everyone will know what moderated user testing is.
Els: Ah, okay, there we go. Moderated user testing is where you basically invite potential or existing users of your product to attend one-on-one sessions. You sit them down behind your laptop, or desktop, tablet, whatever.
You have a moderator who asks them to perform tasks on your website or your app, to really try to observe real user behavior. This is different from interviews, where you ask people's opinions. With moderated user testing, you're after the observation of user behavior, in a one-on-one setting.
What I get a lot as an argument against it is that it's not scientific enough, or that you don't get good data from it. The point is, if you do it wrong, you're absolutely right: you don't get good data. You don't get good information out of user testing.
But you don't get good data out of your Google Analytics if the set up is wrong. You don't get good data out of heat maps if you don't know what you're looking for. It's the same as with anything, but I feel there is a much greater false sense of security that people have with quantitative research than with qualitative research.
Louis: Yeah. I'm not going to try to name the psychological principle behind that, because I'd fail miserably, but stats appear to be true: when you say something like "25.3% of people said that," it appears to be true in your head.
Louis: If you say "three people told me that," people go, three people? That's nothing. But no--those are three people who fit your demographic and your persona, so they're likely to be representative of the full sample.
I agree with you as well. There's a great episode we did with--I'm going to forget the name now--the CEO of the Buyer Persona Institute, actually, which is an interesting company. The same point was made about demographic data: who gives a shit that I'm 28 and I do this and that?
What matters is why I'm doing it. What matters is what I'm looking for, what type of pain I'm suffering from. That's a real persona. Qualitative data these days is so important for that.
I always say, if you're able to do five user tests with people and truly look at what they do, but also how they feel, their faces and their actions, that's going to give you so many more insights than any report in Google Analytics. Otherwise you're missing the empathy, the human-to-human connection. That's what I believe. This is when good things happen, usually.
How to do that? That's kind of the subject of today's episode: how to do user research, how to leverage it and get insights even if we don't have a huge budget, right?
Louis: We are talking, to be clear, about a digital context, a website context, to simplify things. We're not trying to do user tests on a real-life product or anything--more of a website scenario. So, when you're working with a client who doesn't have a lot of budget, or anything like this, what are the typical steps you take in order to do user research?
Els: If they don't have a lot of budget, then you could go for--well, I personally really like, and always push for, in-person user testing. When you have a person sitting next to you, there is very often so much more they're willing to tell you. They're automatically more comfortable; it's just a much more natural conversation.
I love talking to you like this, Louis, but to be fair, it would be much better if we were sitting at a bar having a beer. It's pretty much the same with user testing: the closer you get to your test participants, the better.
I understand that getting people to your office, or maybe the client's office, involves a bit of organization, but remote user testing takes away that barrier. As long as you have your webcam on so you can actually see each other--it's very important, I find, to always see the test participant's face, because it says so much.
It says whether they're bored or not. It says whether they're really having trouble with something; if they're sighing or frowning, those are very important things to pick up on. Basically, you can do user testing with just Zoom or GoToMeeting, or, if you don't need to record it, with Skype--though I'd say try to record it. That way, you take away a lot of the effort, on your side and on the user's side, to participate in the test. That's a very cheap way to do it.
If you know where your test participants are, just go out and find them. "Hunt them down" is maybe a bad phrase, but you know what I mean. This is a tricky one, because it's very important that you always test with the right people. Like you said earlier, you can really learn a lot if you just do a user test with five people.
I love hearing you say that because it is just completely true. The point is though, you need to get the right people in, and then again, we're not talking about demographics, we're talking about people who are actually interested in your product.
When you do that--and I'd basically invite any business to do that--go out and talk to five people. Go out and have five people do the tasks that are most important, the tasks you want your users to do on your website. And just observe. This is another very important thing: don't feel the need to interrupt too soon or ask too many questions. Observe. That's very important as well.
Louis: Don't try to sell anything.
Els: Oh God, no.
Louis: There are a lot of times where, yeah, it's difficult to remove your salesman hat or your marketer hat and put on a journalist-observer hat, but it's so important, isn't it? Instead of asking leading questions--like I just asked--it's so important, isn't it?
Open-ended questions are much better, aren't they? I've done it twice now. Those are examples of leading questions, where you want someone to tell you something and they're probably going to be forced to say yes if you put it the right way. Open-ended questions are the opposite: why did you just do this? Those kinds of questions, right?
Let me go back to one thing you mentioned. I know for a fact--because a lot of listeners email me about this--we've talked about user research and talking to people in the past, right? It's not the first time. Most of the time what happens is, I get emails saying it's all well and good, but how do I do that?
What is the first step? How do I contact those people? What do I tell them? How will I convince them to spend time with me? What's the value there for them?
Els: I think depending on what your business is, you could try and find them amongst your actual clients. I always think it's very interesting to do user testing with people who have just recently become clients. Also, potential clients. Maybe people who have signed up for your free trial, if you're SaaS.
If you sell regular services or products, have a look at a forum where your product is being discussed and put out a message there that you're looking for people to participate in user research. Again, if you're SaaS, maybe give them a couple of months of your product for free, or give them a $10 Amazon coupon, or 10 euros.
You'll find that a lot of people are quite happy to participate. Try and make that monetary value not the main reason for them to participate, because then you might just get people who want to make a quick buck and think oh, I'll tell them what I think and there's $50 in it for me. Try to always make sure that you get the right profile of people in who are interested in doing it, not for the money.
Louis: Have you watched a show called Silicon Valley?
Els: Oh, God yeah. I love Silicon Valley.
Louis: So if you're listening to this episode right now and you've watched Silicon Valley, you'll know what I'm talking about. If you haven't, Silicon Valley is basically a TV show about a startup trying to make it in Silicon Valley. It's all the clichés, but it's so fucking funny because it's so true.
There's many things in there about user tests, user testing, like an in-person user test, the panels or whatever it's called.
Els: Ah! Louis! Please don't make that mistake, man.
Louis: Yeah, so--
Els: Do not say this in the same sentence. No, they have--
Louis: It's not the same thing.
Els: ... focus group. It's a hilarious scene.
Louis: Focus group, yeah. There's a scene where they go through each individual and ask them whether they like the product, and everyone in the room says they hate it. Then the moderator names every single person in the group--Els, Louis, Liam, John, Sean--and says that they don't agree with the statement.
Anyway, I think it's a good point to make: an in-person user test is not the same thing as a focus group. You can't see this if you're listening to this episode, but I can see Els is losing her shit right now; she's losing it. She really wants to punch the screen. So tell us, what's the difference between user tests and focus groups, and why should you probably avoid the latter?
Els: Yeah, I think of focus groups as market research, and doing market research is something completely different from doing user research. I don't know a lot about market research, so I'm not going to say a lot about it.
I just know that for user research, for finding out whether people can use your product, can use your website, and what their difficulties are, focus groups are just crap. Because one, when did surfing become a group activity? No. It doesn't happen. You don't think, oh, let me check out whether I'm going to try this product, let's invite all my friends over, so we can do this together. No.
One-on-one is very important. Also, when you have that group dynamic going, seriously, put me in a focus group, and I am pretty sure I will manipulate these other people to basically say what I want them to say. This can happen.
You can have a loud mouth in there who will drown out all the other voices. No. Focus groups are also, again, about opinions, and not about facts. User testing is about gathering facts.
Louis: How do you conduct a proper user test?
Els: Haha! Well, a lot of people think it's very easy, and it's not. Because everything basically rests, as far as I'm concerned, with the quality and the experience of the test moderator. Jared Spool has written a great article about this, about the multiple personalities that a user test moderator should have.
He says you should have three personalities. One is that you have to be a flight attendant. You have to be somebody who is personable. You have to put your test participant at ease, make them feel comfortable sitting next to you. Make them feel as if they can do whatever they would do at home, that's what we're after, natural user behavior.
Second of all, you have to be a sportscaster. You have to be a reporter because it's not just you doing the user testing, very often you're recording it, or there are people following the test in an observation room. As a moderator, you're sitting quite close to your test participant, so you can actually see everything that's happening.
People in the observation room can't. So sometimes you have to point out things that are happening on the screen, or things the test participant says that don't make sense without context, so that the people in the observation room can follow along.
And third, you can't forget: you're not sitting there having a chit-chat with someone. No. There are research questions that you want answered in this user test session. This is always something to keep in mind. Whenever you set a task for a user--and I always try to say set a task, not ask a question--keep in mind: why am I asking this? What am I trying to find out here, on this page? Why am I doing this?
Those are Jared's three personalities, but I've recently added a fourth. I haven't told Jared this; maybe I should. My fourth personality is Switzerland. Like you said already, you can't ask leading questions. You can't ask: so, do you like the search feature?
And especially not with a look of expectation on your face, because then your test participant is probably thinking, oh, maybe she designed it, maybe she made it, I'd better say I like it. No. Open questions, and, yeah, take a step back. Let your test participant do the talking; that's a very important bit.
Louis: What do you want to find out, typically, in a user test? Let's take a proper example--you mentioned SaaS, software as a service, a few times. Take a typical SaaS website: you have the homepage, the pricing page, a trial sign-up page, and then you move on to the product, right? That's the traditional funnel. What do you typically, in detail, want to find out? What are the usual questions you want answered?
Els: I would say that on the homepage, you want to make sure that--duh, here come the basics--what you are selling is clear. This sounds extremely basic, but it's not. I see so many homepages where I think: and what exactly is your product? What exactly does it do for me?
That in itself is a homepage question you want answered. And you want to see whether any concerns your test participant might have about the product are answered, yes or no, and answered in the right place. Do they have to go look for them on, God forbid, an awful frequently-asked-questions page, for example?
No, all your frequently asked questions should be answered in your copy. Basically, you should answer them before your test participant even thinks of asking them; try to anticipate them. Those are the things I think you should look out for on your homepage or your product landing page.
Els: Then, of course, the clarity of your pricing. Again, very basic, but my God, there are so many confusing pricing models out there. And whether people understand the difference between, for example, different flavors of the product you're offering.
Then, your sign-up process. Is it painless? Is it in line with what the test participant expects? Sometimes, and this might sound strange, a sign-up process can be too short. A sign-up process that is too short also goes against what a user expects, and doesn't leave a good feeling.
Testing all of those things, all of those steps is very important. Very often, when you go into a user test because usually, you don't do it out of the blue, you already have a number of questions that have come up in other types of research.
Maybe you have seen in your analytics that there's a big drop off at a certain point, why is that? Because that is the question that analytics doesn't answer for you, why? Then you have to zoom in on that page, during the test, and try and find out, okay what's holding you back here?
I think that's also the difference with the thin data you get from Google Analytics--and this is a problem for a lot of people: 76% of people drop off here, but that's the only thing you know. That many people drop off.
Whereas in user testing--and true, you might only have five people, but if three out of those five drop off at that page, they probably will--then you know: oh, okay, it's unclear to them that they have to fill out this form, because the label of the field is wrong. Or, oh, there's an error message that sucks donkey balls, alrighty. It could be any of those things, and you very often see that during user testing. You told me I could swear, Louis.
Louis: No way, keep going.
Els: I'm gonna swear now. It could be all of those things, and that's the beauty of user testing. It gives you thick, rich data, it gives you a lot of answers to your why question. That is, yeah, just beautiful.
Louis: Yeah, in my experience, traditional web analytics solutions give you the what. Easy--well, sometimes not that easy to know the what--but user research and qualitative data give you the why. And the why is what matters at the end of the day.
So that's user tests, one type of user research you can do on a budget. We probably won't have time to go through many others, but perhaps you can pick your second favorite, or the one people could try tomorrow and get a lot of value out of.
Els: Gosh. It's a bit of a toss up for me between surveys and user session recordings.
Louis: Let's do surveys.
Els: Okay. Surveys. I really like the right small survey on the right page, at the right time. Which basically means that you ask a question that is very targeted to the page your user is on, and that is also correlated with the behavior they're exhibiting.
So for example, we recently ran a survey on the website of a retail client of ours, and on the product detail page we asked them what was important to them in choosing a particular product. There's no point in asking that on the homepage.
They might be after a whole range of different products. The same goes for asking people who are about to drop out of your funnel: okay, if you're trying to leave, what is holding you back from sending a request for an offer, or from placing your order today? Getting the wording of those questions exactly right, getting the timing of that survey exactly right--that is where you get the value.
I recently saw a survey on the website of a project we're just starting, and it was just so cute, because they're using HotJar, and they asked, "Isn't there anything we can do to improve this website?" That's not the question you should ask. That is a huge open question that puts the burden of thinking completely on your user.
Also, it means that you think your users will tell you exactly what you need to change. No. Your users will tell you what their problems are. You have to figure out the best way to fix that.
Louis: To go back to what you mentioned--because I know what you're talking about; as you mentioned, I work for HotJar, but to be clear, this episode is not sponsored by HotJar at all. Those surveys are on-page surveys; in HotJar we call them polls, but there are other names for them: mini surveys, on-page surveys, whatever.
You can picture it as a small box that appears on the website, traditionally where the live chat usually sits. You can ask an open-ended question or multiple-choice questions, and you can even have logic: you answer one question and you're brought to another one. I just wanted to explain that in a bit more detail.
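The branching logic Louis describes can be sketched as a tiny question graph. This is only an illustration of the idea, not HotJar's actual API; every question id and piece of wording below is hypothetical:

```javascript
// Minimal sketch of on-page poll branching: each answer maps to the
// id of a follow-up question, so "yes" and "no" lead down different
// paths. All ids and question text here are made up for illustration.
const poll = {
  start: {
    text: "Did you find what you were looking for today?",
    answers: { yes: "whatTipped", no: "whatMissing" },
  },
  whatTipped: {
    text: "Great! What made you choose us over a competitor?",
    answers: {}, // open-ended: no follow-up, the poll ends here
  },
  whatMissing: {
    text: "Sorry to hear that. What were you hoping to find?",
    answers: {},
  },
};

// Given the current question id and the visitor's answer, return the
// text of the next question, or null when the poll is finished.
function nextQuestion(currentId, answer) {
  const nextId = poll[currentId].answers[answer];
  return nextId ? poll[nextId].text : null;
}
```

A real widget would render `poll.start.text`, wait for the visitor's input, and keep calling `nextQuestion` until it returns null.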
Talking about this, you've mentioned a few already that are super interesting. From your experience, what are the core ones? If listeners listening to this right now want to implement this today, on their website, what type of questions and what behavior triggers should they set up? What are the typical ones you would recommend?
Els: Well, I think trying to figure out why people are dropping off at a certain point in your funnel is a very important one. You can do that either on exit intent, on desktop, or time on page on mobile.
For that, you'd have to look at your analytics and see: what is my average time on the page? What I also like to do is go to that page myself and time myself, and see how that stacks up against the average time in Google Analytics.
I find that a lot of marketers could do well to use their own website more, and not just look at it, because I know a lot of marketers like looking at their website and commenting on the looks of the website.
Really, you should go through every funnel, every sign up flow, every order flow, at least once a month yourself. Then you will also know when is the right time to put this question in front of people. What is the right timing for that?
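The two triggers Els describes--exit intent on desktop, time on page on mobile--boil down to one small decision function. A minimal sketch, assuming hypothetical field names; the thresholds would come from your own analytics, and this is not any particular tool's API:

```javascript
// Decide whether to fire an on-page survey.
// Desktop: exit intent, i.e. the cursor moving above the viewport
//   (y <= 0), toward the browser chrome and the close button.
// Mobile: time on page passing the site's average, taken from
//   analytics, since there is no cursor to watch.
// All names here are illustrative assumptions, not a real API.
function shouldShowSurvey({ isMobile, secondsOnPage, avgSecondsOnPage, cursorY, alreadyShown }) {
  if (alreadyShown) return false; // never nag the same visitor twice
  if (isMobile) {
    return secondsOnPage > avgSecondsOnPage; // time-on-page trigger
  }
  return cursorY <= 0; // exit-intent trigger
}
```

On a real page you would wire this to a `mouseout` listener on desktop and a timer on mobile, pulling `avgSecondsOnPage` from Google Analytics as Els suggests.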
Louis: What question do you ask exactly, when it comes to figuring out why people are dropping off at a certain point, for example?
Els: For example, you could ask: what's the main thing holding you back from--and this could be either requesting an offer or completing your purchase--today? Sometimes the answer is, I'm still checking out other vendors, I'm still in the process of deciding. Or, what we've had in one case: "I can't choose. This is super interesting. You have so many nice products. I can't decide." This was a company that sold watches and glasses.
Louis: Yeah, choice paralysis. The paradox of choice: the more options you put in front of someone, the less likely that person is to make a choice.
Els: Yes, exactly. We spotted this problem because we ran that survey, and then we actually fixed it by adding a reassuring message in the cart. We said: you made a great choice. That takes away the user's fear; they think, oh okay, well, if they say I've made a good choice, I bet I have.
It's basically what Booking.com does when they show a green checkmark after you've correctly entered your first name. They give you pats on the back the whole time, and it works. It helps to make your potential clients feel good about themselves. Positive reinforcement can really work.
Louis: So that's one, and it's super interesting: identifying drop-off and asking people, "Is there anything holding you back from taking the action we'd like you to take today?"
Louis: What other scenario, what other type of question do you like to ask?
Els: I think another one that is very interesting is asking people who have ordered, right after they've ordered, what tipped them over into becoming a customer? What's the main reason for shopping with us today, and not one of our competitors?
When you know what drives your fans, you also know: oh okay, these are really our strong points. Very often, what a company thinks its USPs are is not necessarily what its clients think they are, and the answers at that point might surprise you.
This is a survey you can do right on the thank-you page. That's a great moment, because people have just bought from you. I don't know what you're like when you've just bought something, but a lot of people are on a little bit of a high, you know.
Els: No, you don't feel guilty, Louis, you feel happy that you've bought yourself something good. They're happy with you at this point, otherwise they wouldn't have bought it. And so they're also quite willing to answer this question right then and there on that page. Obviously, you could also send it separately in an email, but I would do it right there on the page.
Louis: Agreed, that's a nice one as well. It's something that traditionally isn't asked that often. As human beings, we're really good at spotting problems, but quite bad at spotting the things that are working well and doubling down on them. I like that.
So: why are people dropping off, but also, on the other hand, what made those successful customers successful? What happened in their heads? What was the trigger? For what key reasons did they choose us over competitors, as you said?
Now, I like the number three. I always like to do things in threes, so you're going to have to pick another one: another example, another typical survey that could bring a lot of value to our listeners.
Els: Okay. Well, this is not a survey that makes sense on every website. It's one we like to do in our information architecture projects. We call it the top-task survey.
It basically asks people, who are you? And what is the main purpose of your visit to this website today? It helps get a good picture of who your website visitors really are.
For a university we work for, for example, it's important to know the ratio between future students, current students, and staff, and also what each group is looking for. When you get that information and correlate it with your Google Analytics, it can be really interesting.
People always say, why do you need to do that survey? What are you here for? Google Analytics tells you what they are here for. I always say no, Google Analytics tells you what they do. It doesn't tell you what they want to do.
Sometimes what they want to do, you might not even have. Google Analytics will never tell you that. That you're missing content, or that there is something that a lot of people are actually looking for but won't pop up in your top pages in Google Analytics because it is just super hard to find.
These methods are best used in conjunction with each other. I think that goes for all the user research techniques we've mentioned today. You can use a lot of them on their own, but the point is, if you use them together and put all the data together, that is when you get really valid insights. That is when you really know: oh, okay, this is how I should fix it.
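The "who are you, and what did you come here to do" tallying Els describes can be sketched like this. It's a hypothetical illustration: the `Response` shape and the segment names are assumptions, standing in for whatever your survey tool exports. The output ratios are what you'd then compare against your Google Analytics segments.

```typescript
// Tally top-task survey answers into visitor-segment ratios,
// e.g. future students vs. current students vs. staff.
type Response = { who: string; task: string };

function segmentRatios(responses: Response[]): Map<string, number> {
  // Count how many respondents fall into each "who are you?" segment.
  const counts = new Map<string, number>();
  for (const r of responses) {
    counts.set(r.who, (counts.get(r.who) ?? 0) + 1);
  }
  // Convert counts into a share of all responses.
  const ratios = new Map<string, number>();
  for (const [who, n] of counts) {
    ratios.set(who, n / responses.length);
  }
  return ratios;
}
```

The same tally applied to the `task` field tells you what each segment came to do, including tasks your analytics can never surface because the content doesn't exist yet.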
Louis: I think one of the reasons these kinds of polls, these questions to understand who your website visitors are, are so important is the privacy side of things, right? GDPR came into force a few months before this episode is being published, and you can feel that in today's world user privacy is getting more and more important. You do not want to collect data against people's will, and the data is always better when people have consented to giving you the information.
Louis: If you do a mini survey on the homepage ... I used to do that when I had my own conversion optimization consulting agency. We used to run a mini survey on the homepage of our clients' websites, asking exactly those questions: who are you, and what are you looking for today? Or, why are you on this website today?
Usually, the answers were completely different from what we got from analytics. Analytics at least told us what people were doing at a very high level: which pages, how long. But it certainly didn't tell us why they were there, what they liked or didn't like, who they were, or what they were trying to achieve. So the survey gave us a much fuller picture of the situation as well.
I think that gives a pretty good picture of how to do user research on a budget, in a simple way. So thanks so much, Els, for going through this exercise with me. I know it's not easy; you were really efficient. I'm curious, because I can feel you're a very authentic person: before I go through the last questions that I always ask my guests, what has been the biggest marketing fuck up in your career so far?
Els: Oh gosh. The biggest marketing fuck up in my career so far? I couldn't possibly just choose one, Louis. I can see a reel of failures passing before my eyes right now. I've been doing this for a long time, you know? I think there's actually not one huge thing that stands out. What I will admit to is making lots of mistakes. I think that's only normal. Making mistakes is also how you learn.
Thank God, I make fewer mistakes these days, but that's also just because I've been doing it for longer. It's as simple as that. I'm not one of those embrace-failure, fail-forward type of people, but I'm also not a big believer in punishing every little fuck up. I've made mistakes just like everybody else. Thank God, I've learned from them, and I keep learning from them.
Louis: I know, our listeners aren't going to be satisfied with this answer. You're going to have to pick one mistake.
Els: I'm going to have to pick one mistake?
Els: Oh, gosh. I'm very bad at this. Hmmm. Biggest marketing fuck up? Does making the wrong choice of partner count? I feel like, we as a consulting agency should really be able to pick good partners, to work together with.
I have fucked up there very recently, choosing an advertising partner for ourselves that was completely not aligned with our values. As a result, there was a campaign for our online training that I didn't feel very comfortable with. I consider that as much a failure of mine as of the partner we chose. Basically, it was our choice to go with them. My God. You made me squirm, Louis!
Louis: Thanks for being transparent with us. I know people enjoy this type of insight.
Els: [Laughs] Thanks for that.
Louis: You're welcome. So, what do you think marketers should learn today that will help them in the next 10 years, 20 years, 50 years?
Els: I know there's a lot of emphasis on data-driven marketing right now, and let's be fair, I'm a huge fan of that, because it's very important. It's also why user testing and user research are so important. But what I think is a very underestimated skill is the psychological side of things.
And also copywriting, because I think that is something AI is not going to take over in a hurry. Analyzing data, machines might get better at than we are, but really putting feeling into copy? I don't know about that.
Language is something very intrinsically human and I think you need real emotion to write really convincing copy, so I would say psychology and copywriting.
Louis: Yeah, it's such a difficult thing to foresee. Data analysis, definitely: computers are already much better than us at analyzing data, and it seems they're also able to make decisions from it. But the feeling, the emotions, we'll see. Who knows at this stage.
But thanks for answering that, which brings me to the second question, which is, what are the top three resources you would recommend our listeners today? So that could be anything, podcast episode, podcast, books, conferences, anything.
Els: I think that for online training on digital marketing, you can't beat CXL Institute. They have online courses on a lot of topics, ranging from conversion optimization to analytics and copywriting, and they just really find the best people to teach. So I think that's a great resource.
When it comes to copywriting and getting really deep down and dirty with it, I love Copy Hackers, and Joanna Wiebe as well, so I would check out CopyHackers.com.
If you're interested in user testing, actual moderated user testing, and you want to find out more about it, I'm going to refer you to my old mentor, who is basically Jakob Nielsen. He's the guy who introduced me to user testing, and I know he's not hip, but the resources on his website, nngroup.com, sorry, you still can't beat them. There is just absolute gold there.
Louis: Thanks so much, Els, you definitely didn't suck donkey balls today. I was waiting to say that. It's been a pleasure talking to you, and I'm pretty sure people listening are also enjoying you quite a lot. Where can listeners connect with you and learn from you?
Els: Well, I'm on Twitter, my Twitter handle is @els_aerts, or you can always drop me an email at firstname.lastname@example.org. That saves you the Aerts bit: email@example.com.
Louis: Alright, once again, thank you so much.
Els: Thank you very much.