[00:00:00] Sani: Welcome to a new episode of the NoHacks podcast, where we explore the ins and outs of optimization, whether it's for your online presence or your personal habits. Today I'm thrilled to be joined by someone who is truly a powerhouse in the world of user research and UX, Els Aerts. Els is the co-founder of AGConsult, and with her no-nonsense, data-driven approach, she has spent years helping companies make smarter decisions. She's here to talk about the fundamentals of user research, UX, AI myths, and some real-world insights. Before we start, I want to invite you to go to nohackspod.com/follow and follow this podcast, rate it, review it, whatever you want to do. But yes, please do that. Els, it's so great to have you back on the podcast.

[00:00:42] Els: It's great to be back. I really enjoyed our first conversation, so I'm looking forward to diving deeper this time.

[00:00:50] Sani: I cannot wait. So let's start with the evolution of user research. You're someone who has had a very successful career. Can we talk about how user research methodologies have changed and evolved?

[00:01:04] Els: Absolutely. I've been doing this for a long time, which gives me the benefit of knowing what we had at the beginning and appreciating what we have now. I come from a time... oh my God, it sounds like I'm doing a fairytale, once upon a time. I come from a time, young people...

[00:01:24] Sani: But you know what, Els, it's all about the storytelling. That's what everyone says.

[00:01:29] Els: So I come from a time from before Google Analytics even. This is hard to imagine for some people, but we didn't have a lot of analytics in the beginning of the web, which is when I started: the 1990s. Yes, the previous century. A lot of what we had was qualitative research. So there was user testing, moderated user testing, interviews.
Of course there were surveys, but there weren't all the survey capabilities that we have now. You couldn't really target surveys surgically like you can now, like triggering one at a certain scroll depth, on a certain behavior, on certain pages, et cetera. There was no big-scale, unmoderated user testing. None of this existed. I'm happy with some of the additions, less happy with others, but all in all, I think having all of these extra tools just creates more value in your toolkit. And the expertise remains the same: you always have to know which tool to pick for the right research question. So I'm just grateful that my toolbox has basically expanded.

[00:02:56] Sani: So does that mean your work is now easier than it was when you got started?

[00:03:01] Els: This is lovely. Yes and no. Yes, in the sense that thanks to AI, and we're both big fans of AI, a lot of the extremely repetitive work has gotten a lot faster. Somebody in my team recently said, oh my God, it's like I've got my own full-time intern. So, apologies to all interns listening, you are valued and loved. AI can take away a lot of that repetitive work, and it can be a great sparring partner. If you're thinking of research questions, you can rubber-duck with AI if you don't have anybody else to rubber-duck with in real life. But it doesn't take away the fact that you still have to have the expertise and know, in the end: What are the right questions to ask? What is the right order to ask them in? Is this the right research method? And you have to really check whenever it gives you something as an analysis, because I find this is still a field where it's a bit dicey sometimes. You have to check whether the insights it tries to give you are really real. Because AI does have a tendency to, well, hallucinate a bit.

[00:04:25] Sani: It really, it really...

[00:04:27] Els: You have to keep it in check.
So you have to know what to use it for and what not to use it for. Has it gotten easier? I would say it's gotten more interesting.

[00:04:36] Sani: I would agree with that. Life has gotten more interesting in the last year or two with AI tools and all the possibilities that people now have. Like you said, you can have not one but thousands of interns in one day if you want to. And if that's not crazy, I don't know what is. The way I see AI and building with AI, because that's what I've been doing lately: if your process or your tool doesn't work without AI, don't add AI to it. Make it work, and then make it better. Otherwise you're just contributing to the mess, you're making things up. You're basically doing, and we'll talk about it, synthetic user research. And that's kind of an analogy for all things bad with AI, for me. So over the years, what is the one major change in how user research is done? Let's leave AI aside: one major change you have witnessed and had to adapt to.

[00:05:33] Els: Whew. Major change. I would actually say one thing that added a lot of value to the research that we do for our clients. Some of it is exploratory research, but a lot of it is evaluative research: research into how to optimize, how to improve existing solutions, existing products, existing websites, et cetera. That is really the surveys I talked about earlier, the surveys that you can target to a precise audience, at a precise point of their customer journey. And if you then ask the right question, the insights you get are just gold. I think that's a really great addition and a great change, because it's also a very accessible tool. It doesn't have to cost the earth, and you can do it on pretty much any digital product.
So I think for me, when we're talking about qualitative user research, and surveys can be qualitative user research if you ask the right questions, that's probably the biggest game changer. Yeah.

[00:06:54] Sani: So that, again, is supposed to make things easier. But if you do it wrong, you can also make mistakes faster; it's easier to make mistakes.

[00:07:03] Els: Oh, absolutely. One of my research heroes, Erika Hall, the author of Just Enough Research, hates surveys with a passion. And I used to be like, damn, Erika, why are you always bitching about these surveys? But then I had a look at all of the surveys out there and it was like, shit, yeah. There is just so, so much awfulness in the world of surveys. But I'm like, let's not throw away surveys just because a lot of people do them wrong. I had this discussion with somebody else who was like, user testing is useless testing. If you do it wrong, it is; everything is bad if you do it wrong. But for methods, the biggest game changer, I would say, is surveys: online, targeted surveys.

[00:08:08] Sani: Hard to disagree with that. And when you said there's so much awfulness in the world, I thought there was going to be a period after that, but you said "world of surveys." The first could be true as well.

[00:08:18] Els: Trying to keep it light.

[00:08:19] Sani: Trying to keep it light.

[00:08:20] Els: But...

[00:08:20] Sani: Yes.

[00:08:22] Els: Uh...

[00:08:22] Sani: I need to read this, because when my research team of one was preparing for this podcast, I found "legend of experimentation" and "queen of user research" attached to your name. So, for someone who has reached the status of other people saying that, what is the most impactful advice you have received as a mentee over the course of your career?

[00:08:46] Els: I would say I haven't had a lot of mentors.
I have had a conversation which really shaped me, and that was with somebody who gave me a reality check. It was when we were first starting our company, my business partner Karl and I, and we were extremely cocky. We were very young. We thought we could do everything, even though we could actually do very little. And we were talking to this really experienced exec about all the things we thought we would be absolutely brilliant at, which we had never done. And he was like, well, it sounds like I'm in the company of greatness here. Now tell me: what are you really good at? What do you have proven expertise in? We were actually good at two things. Great, go and do that, maybe, before you think you can do everything else. So it was very humbling, and it was a very necessary lesson for us to receive. But what it helped me do was focus: focus on UX and user research. I have these conversations sometimes with young people. They're like, I want to get into experimentation, what should I do? And I'm like, whoa, that's such a broad field. I think in the beginning it can be really valuable to get to know a lot of different expertises, to dabble and learn the fundamentals of everything. But you can't be an absolute expert in everything. Maybe you know some of these people, Sani, but I don't know anybody who's an expert in analytics and a magician copywriter and also a good qualitative researcher. I don't know anybody.

[00:10:57] Sani: Now, Marina...

[00:11:00] Els: Listeners, viewers, if this is you, please connect with me, because I'd love to get to know you. But no. So I think when you're starting out your career, it's great to look around and see what really interests you.
And then it can also be interesting to just go, okay, this is the field that really interests me, like user research or UX or analytics and statistics, and then go deep.

[00:11:27] Sani: Ostrovsky sent in a question that reads: what is the one mistake rookies should avoid in this field? And I think this is it: don't try to be everything, just specialize in something. Right. Let's go back to the origin story of AGConsult. You felt like you could do anything: you're smart, you can do anything. Now, user research is one area, CRO as well, where a lot of people feel like, ah, I'm just good at this. I can figure it out, I'm smart, I know how to think, all that, even if they don't have any formal training or experience. How do you feel this impacts the field?

[00:12:07] Els: Yes, well, I have to admit I'm not the biggest fan of what is sometimes called the democratization of user research. Not because I fear for my job; I do not. But nobody talks about the democratization of baking. If you want to...

[00:12:29] Sani: Are you sure? The pandemic changed that. Let's update our examples.

[00:12:35] Els: You're right. You're right. Baking is a terrible example.

[00:12:39] Sani: I get what you mean.

[00:12:40] Els: Yeah, but seriously, nobody's going to go into a kitchen like, I'm going to make a cake, open the fridge, not even watch a YouTube video, not even read an article on the basic ingredients of a cake, and just go: what's in the fridge? Spinach, eggs, some soda pop. I'll make a cake work. I'm not saying that you need to spend years and years in training. No. But I do think having a basic background is very important: knowing the fundamentals of setting up a survey, the do's and don'ts of asking questions. Because if you do it wrong, this can be very bad.
I don't think any company would be like: oh, intern, great to see you. AI broke down; could you do our Google Analytics setup? Could you do our Adobe Analytics setup? Because it's easy, right? Yeah, if you want every event to fire twice, sure. It'll get you wrong data. And that is what happens if you do user research without any background or training. I would say educate yourself. There's a whole interweb out there for you. NN/g, the Nielsen Norman Group, is still an amazing source of information for anything on user research as far as I'm concerned. Just Enough Research by Erika Hall: excellent book. Rocket Surgery Made Easy by Steve Krug: a classic. There are some really good resources out there, and yes, they've been out there for a while, because you know what? Talking to people is still talking to people. Those fundamentals actually haven't changed very much.

[00:14:55] Sani: So basically, no winging it when it comes to user research. It's very dangerous to just hope and try to do it well on the first try. You know what? A friend of the podcast, Tracy Laranjo, described what you just talked about as a no-hacks mindset in a conversation we had. And I think this is it. This is basically why this podcast has that name, because I'm very anti "let's just hack it and try it and hope it works." You need to spend some time learning about the process and about how it needs to be done. I spent months, if not years, learning about the equipment for the podcast and how to set it up, and only then did I try it. This approach is what I really appreciate. And yes, like you said, talking to people will never go out of style. But let's talk about synthetic user research. There's a lot of buzz around it.
There's more and more buzz, mostly from companies who are promoting tools that do it. They're saying it's the best thing ever. Why is this so bad? What do people miss in trying to push running fake surveys on fake users? And what is the risk we get out of this?

[00:16:11] Els: Hmm. I think synthetic users is taking AI just that step too far. I see zero use for it in evaluative research, like zero, because you can't ask an AI to pretend to be a user of a certain tool. That just does not work. I don't believe in it. You will never get the same insights. You will never get the frustration that a real user feels. And we've experimented with this: whenever you ask AI these things, it comes up with very general and very broad problems that are the problems of every tool, and that is exceptionally unhelpful if you're out there looking for things to really improve, because it will just spew generic stuff at you, and that is actually not helpful at all. I don't do market research; it's just not my expertise. Can it be interesting for market research? I don't know. I'm not going to say anything about that, because I'm not a market researcher. That's not what I do. Evaluative research? Hard no, like really, really hard no. Exploratory? Eh, I'm willing to give it a go, but I'd really want to have it next to some solid data from real people to be able to compare it to. Because we're not going to sell it to the synthetic users, are we? We're going to sell it to the real users.

[00:18:01] Sani: Do we know that? How do we know what the future holds?

[00:18:04] Els: Well, if there's a future product made for synthetic users, then absolutely, do your research with synthetic users. Good luck with that, and leave me out of it. But no, I'm just wary of the validity of the insights.
And the thing is, people are like, yeah, but it saves so much time. And I'm like, yeah, so you're going in the wrong direction faster. What's the use of that? I'm appreciative of the fact that you want things to move along at a certain pace. I totally agree with everyone who says, oh, we don't need a six-month research project. No, you don't need to do six months of research. But to use synthetic users only to save time so you can make your sprint planning? Maybe no.

[00:19:02] Sani: But that's just checking a box without knowing if the result is what you want it to be. And the biggest issue for me: if you're going to have good synthetic users that represent your user base properly, you need so many real users to train that AI on that you don't need AI and synthetic users. That's the catch-22 with synthetic users for research, for me, and that's why I don't think it will ever be useful. Follow-up there: what if a client says, I want this research to be done faster, can you use synthetic users? I know you'll say no, but what is the exact response?

[00:19:41] Els: Why are we doing this in the first place? What are we trying to achieve? What do we want to find out? And are you serious about wanting to find this out, or are we indeed just ticking a box? Because if we're just ticking a box and doing research theater, this is not something I want to be involved in.

[00:20:03] Sani: Research theater. That's such a cool name for this.

[00:20:06] Els: It's exactly what you said there, just ticking the box. I'm like, no. No. I had a conversation about this recently as well, with a client who came to me with a research question, and I was like, actually, the best way to find this out is A/B testing. Yeah, but it would have to be a runtime of three weeks and we want to launch sooner. Can you do user testing? No. Because this is not a user testing research question.
This is not a qualitative question that you're asking me. So can I do this? I could, but I won't, because no.

[00:20:50] Sani: The integrity, the integrity. I respect that.

[00:20:53] Els: No, that's just how it is.

[00:20:56] Sani: Integrity, because there are people who would say yes and take the money. Let's talk about some real ethical risks and the consequences this could have, using only synthetic users, because it's not just about, hey, it's going to get you going in the wrong direction. There could be actual consequences. What are some of the examples there?

[00:21:15] Els: If your synthetic users are going to have any value, then they will have to really be representative of your actual users. And they will also have to be representative of the diversity of your actual users, something that very often gets forgotten in user research without AI too, so that's not purely an AI issue. But that's one of the things. I'm very uncomfortable with the thought that anybody would think that real humans are that easily replaceable. I still see AI as a great assistant to help you with repetitive tasks, the rubber ducking, but not as a real research subject. Hmm. What are things you're thinking of?

[00:22:18] Sani: I'm thinking it could lead to you making a decision that's going to leave out an entire group of users who are not going to be able to use your product in a good way, or understand it, or whatever it is, just because some processor told you to do that. I think there are more consequences, terrible consequences, that synthetic user research could lead to in the long run than there are benefits. Because what is the real benefit of doing synthetic user research versus just doing it properly, even if it takes, not six months, but let's say a month?
If you can do it in a month, you should do it in a month. Just because some tool is there for, like, 99 euros or dollars a month that will give you a million users, which are, I don't know what they are, just processing power, I guess, not representative of real people, doesn't mean you should use it. We talked a little bit before recording about standup comedy. Louis CK had a bit about why people are always online, plugged in, complaining about the internet going off on planes: just because it's there doesn't mean you need to be using it all the time. And this is where I am with synthetic user research. It's not ready. It might be at some point, but it will really need to be more advanced than what we have right now. That's my take. I'm not going to say a hill to die on, because this is not my industry. But I wanted to ask you this: what is your non-AI hill to die on, alone if that's what it takes, when it comes to user research? What's the hot take?

[00:24:03] Els: I don't have any hot takes. I don't believe in hot takes. I think most hot takes are lukewarm at the very best. I do have pet peeves, though.

[00:24:18] Sani: That's good. That's good enough.

[00:24:20] Els: I've got a lot of pet peeves. Ha! People who know me well know that they only have to mention the words "focus group" around me to get me going. A focus group is market research. A focus group is not UX research. I mean, when's the last time you gathered with a group of people and sat in front of a laptop and did anything together? No. So if you're evaluating a product or an interface, a focus group is rarely a good idea.

[00:24:55] Sani: Not the way.

[00:24:56] Els: No. I'm also not the biggest fan of personas. Yeah, I know.
[00:25:04] Sani: How do they tie in with the synthetic users, though?

[00:25:08] Els: Well, that's...

[00:25:09] Sani: A similar thing?

[00:25:10] Els: Yeah. What I probably dislike most about personas is this idea that a person can be identified by your gender, your age, where you live, whether or not you have a pet, whether you're an early adopter, all of that stuff. Your behavior on a website or an app, your expectations of a product: they are not determined by any of those characteristics. They're determined by: what do you want from this product? What do you want from this service? What do you already know about this? What is your level of expertise in this field, for example? So this is why I don't like these personas. I'm more of a top-tasks user researcher, more of a jobs-to-be-done user researcher. And while I have some sympathy that personas, if they're done right, can be helpful to talk about these tasks in a broader setting that doesn't just include user researchers, they have to be really based on research. And then the tasks or the jobs to be done should be brought to the fore a lot more than, you know, Jenny the early adopter.

[00:26:58] Sani: I'm not going to go on a rant about personas, but I could, because I think in 99 percent of cases they're completely made up and useless. Why do I care if a 35-year-old cat owner from Oregon is going to be into my product or not? How does that help me do anything? Whereas with jobs to be done, for example: this persona needs a tool to help them with their accounting because they don't have time for it. Now you know what you're talking about. Personas like "someone in a lumberjack shirt who is into digital marketing," which is something I saw in a document recently? Please, everyone, stop.
Let's just stop. Let's leave it to the people who write Netflix shows and movies; let them make things up, and we should focus on the real world. I don't like personas. That's the end of my rant.

[00:27:54] Els: We keep having things in common, Sani.

[00:27:57] Sani: Oh, yes. This is why we're doing a second episode. But let's move on to rapid fire. I asked my connections on LinkedIn to leave some questions. I didn't say who it was, I just said, well, I said "legend of"...

[00:28:12] Els: Don't repeat what you said.

[00:28:14] Sani: Okay, I said it already. And there are some very interesting questions. So let's try to answer them in 10 to 20 seconds each. Trina Moitra asked: how do you keep bias out of user research?

[00:28:31] Els: The main thing is to be aware that this is impossible, and to be aware that you will be biased. That is the first step to keeping bias out of it. Just admit it, and then do your best.

[00:28:48] Sani: And then do your best. Okay. Valentin Radu, a shout-out to Valentin, a wonderful person and a guest of NoHacks as well: how many career experiments should you do in a decade?

[00:28:59] Els: The amount that makes you happy, because when it comes to your personal life, optimizing for happiness is the most important thing. So as many as you want, or as few as you want.

[00:29:12] Sani: I love that. Kelly Wortham asked: why do you prefer moderated studies over unmoderated studies?

[00:29:17] Els: Great question. Because in my view, when you do user testing, it is about user behavior, and you don't always know exactly why a person is doing something when they do it. I mean, I don't know about you, but sometimes you can watch a user... I can't do this in 20 seconds. I'm sorry, this is not a rapid-fire question anymore, Kelly.
So I have in the past called unmoderated user testing the missing link between user session recordings and moderated user testing. Sometimes you can see it just from the behavior on the page: the mouse going left to right, left to right, and you go, oh, they're doubting between these two things; or rapid scrolling down, oh, they're looking for it at the bottom of the page instead of where we put it. So why would you need the unmoderated sessions? I'll tell you who basically taught me to see the value of unmoderated user testing: a copywriter. Not just any copywriter, THE copywriter, Joanna Wiebe. Unmoderated is great if you want people to vocalize what they're thinking. Unmoderated is actually not so much about the behavior; it's about what people are saying. It's more of a way to get a lot of interviews done in one go, quote unquote, and to get people's real words, than it is for behavior observation. So, yeah.

[00:31:06] Sani: The 20 seconds are up. Final question, from Brendan McNulty: which methods do you use when you have limited time and/or resources?

[00:31:16] Els: Man, I always have to come back to surveys. They're cheap, they're fast to set up, and if you know what you're doing, they're gold.

[00:31:26] Sani: That was under 20 seconds, I'll give you that. Let's talk about research in real life. Let's leave work aside; it's Friday anyway, I always record on Fridays. I assume that you cannot just shut off this thing in your brain that makes you wonder what's happening, be skeptical, be a researcher. So how do you find yourself applying these principles, or any of the research methods or research thinking, in your everyday life? Can you share some examples?
[00:31:59] Els: The thing that I keep coming back to, more often than user research actually, is UX and usability. I think if you have a background in usability, you notice bad UX and bad user experiences everywhere.

[00:32:19] Sani: More than the good ones, right?

[00:32:21] Els: Sadly, there are a lot of bad user experiences out there. Basically 8 out of 10 elevators, 9 out of 10 showers. Which knob do I need to turn? Do I need to flip it? Some knobs you need to turn while you flip them. I mean, there are no instructions. So all of these things, and you're like, oh my God, this Airbnb host should have put up a little how-to sign for the shower, et cetera. Or they could have optimized this line for lunch at this conference. Not at German conferences, though: not at André Morys's conference in Frankfurt, Growth Marketing Summit, they've optimized the lunch lines there. But so, yeah, you look for optimization opportunities everywhere. And I think if you're a researcher, you just have an inquisitive mindset, and you wonder: why? Why? My husband is an architect, and when he talks to me about the projects he's doing, I just go: why? Why did you do this? Why did you do that? So I must be like a five-year-old sometimes.

[00:33:53] Sani: You have the mindset of a five-year-old. So, you said you wonder why they did it this way. Do you sometimes wonder why they didn't even bother to think about how they should do it, versus just slapping it together? Is it laziness? What is it?

[00:34:11] Els: I'm not sure whether it's always laziness.
I think it's sometimes not realizing...

[00:34:18] Sani: Hmm.

[00:34:19] Els: ...not realizing that there's actual value in doing that research. If you've never done it, then you don't know how valuable it can be. I don't want to attribute it to laziness. I don't know if ignorance is better...

[00:34:45] Sani: No, I don't know if it's better, but it's just as likely.

[00:34:48] Els: Yeah, I want to think it's because they don't know any better. Because if you don't know any better, then you don't.

[00:35:15] Sani: Does it get tiring to have that always-wondering mind that says, why isn't this better? Why did they do it this way? Is it exciting and pleasant to have that mindset, or does it actually get you tired of all the thinking?

[00:35:16] Els: Hmm. It might be tiring for other people.

[00:35:19] Sani: Okay.

[00:35:20] Els: I can completely see that. It's not tiring for me, because it's just the way I'm wired. And it really can be unpleasant sometimes for people, because if you show me something, I will... and the same goes for a lot of people in our team: when we talk about something, or present something, the first response is usually not, oh, that is a great idea. The first response is: but wait a minute, have you thought about this, that, and the other? It's always looking for optimization possibilities, which means you have to look for the holes. You see the holes in a theory, or you see the points of improvement to be made. And I've actually had to learn to not just point out the holes, but to also acknowledge all of the good stuff. Because obviously this is very important, and I understand it's very important to hear that you're also doing well.
But it's not tiring for me, though I'm sure I can be very tiring for other people.

[00:36:36] Sani: Let's call it negative feedback, even though it's not negative. Finding holes in an idea or a concept is a good thing. It's the best kind of feedback you can get. But let's say negative versus positive. I never cared for positive feedback in my life. When I'm doing something, in work specifically, if it's good, I don't want you to tell me it's good. I don't need praise or a pat on the back. Tell me what's wrong with it, so it can be better. Every time I've asked for feedback, and you know, we talked about a secret project that will launch very soon, I want to know what's bad about it so I can eliminate the bad parts. I don't know, it's easy to insult people these days by giving them feedback that seems harsh or negative when you're just trying to make their work better. Does that make sense at all?

[00:37:27] Els: I do think that for a lot of people, getting praise is very important as a validation of their work, and I totally understand that. It's great if you don't need it; it means you're very secure in what you do.

[00:37:42] Sani: No, it means I grew up in Eastern Europe.

[00:37:45] Els: Oh, maybe it's cultural. Yes. But I think you should also always be open to negative feedback. Like... I love praise. I love getting praise, I'm not going to lie.

[00:38:02] Sani: I'm not buying that. I'm not buying that.

[00:38:04] Els: I love it when somebody says, ah, you did a great job there for us, or, wow, these insights were a lot better than we had expected. I love that. I also love when I've done a talk or we've done a project and it's like: it was good, it's great, we learned a lot, but for this part we would have expected a little bit more of this and that, or maybe next time we can do such and such.
And I'm like, excellent. Great. Indeed. Because it's only with, quote unquote, negative feedback, with critique, that you can grow and do better next time. And the thing is, with everybody and everything, there is always room for improvement. Always. You can always do better. It doesn't mean it wasn't good to start with, but wouldn't you want to do better? [00:39:03] Sani: We need more of that. We need more of that mindset in the world in general; that's just how I feel about it. People shouldn't dread getting bad or negative feedback. People should want to be given the roadmap to doing something better. Maybe it's a cultural thing, maybe it's the Eastern European in me talking right now, and of course I'm half joking. Eastern Europe is not all suppress-your-emotions and that kind of stuff; that's a stereotype, so I have to go with it. But I think there are more benefits to getting, let's call it negative feedback, or improvement-roadmap feedback, maybe that's a nicer way to put it, than just, hey, this is great, you crushed it. What do I get out of that? A standup comedian, no, it was George Carlin, the great George Carlin, said that two words have ruined American culture more than anything else, and that's "good job": parents telling their kids good job, whatever they do. I was a lifeguard in the US years ago, not in the nineties, but almost, and I thought it was a joke before I got there. The number of times I heard parents say good job when, you know, maybe it wasn't a good job, like in swimming classes, all that stuff. You're not really helping anyone if you're just praising them. Don't be harsh, no, absolutely, don't be harsh, but just don't falsely praise people.
I think that's a terrible thing to do. You know, I have one more question, obviously, and this is the fun one. When I sent you the intake form, you said you were a bird-phobic bird lover. I'll just tell you my perspective: I don't trust birds. This is the only animal I don't trust, because they can harm you in many ways. I watched The Birds movie when I was like six or seven, it was on TV, and I still don't trust flying birds, because they can harm you in ways that no other animal can. But explain that sentence, please. [00:41:15] Els: You are not making this better for me. They can harm you in ways that no animal can. Oh my God. Yeah, no. I have had a bird phobia since forever. People ask me, but how did you get it? You don't want to know, so many terrible bird stories. I'm not going to go into that. Just know, [00:41:37] Sani: But you still love birds. [00:41:39] Els: From a distance. Through binoculars, preferably. [00:41:43] Sani: You admire birds. [00:41:44] Els: I admire birds. I really enjoy the activity of birdwatching. And yes, I know, how old can you be? Old enough to enjoy birdwatching. [00:41:59] Sani: No, it's had a comeback. It definitely had a comeback. Yes, for sure. [00:42:03] Els: Like baking, birdwatching has come back. So I really enjoy birdwatching. I'm lucky enough to have a garden, and we have a woodpecker, actually a family of woodpeckers, and when I hear the tap, tap, tap, I get my binoculars out and I'm scanning the canopy for where this woodpecker is. I know where he is now; he's got his favorite spot there. So I love birdwatching. And I hear a call and I'm like, oh, I know this, ah, this is a buzzard, and I'm scanning again. I love it, but birds can't come close to me, because then I freak out completely, but really, like, completely. So you don't want to, yeah, yeah, yeah.
You don't want to be near me when that happens. [00:42:47] Sani: You know what? They're basically dinosaurs. It's okay to be afraid. [00:42:52] Els: I totally agree. I'm super happy that you said that, you know. But I might disagree with you that flying birds are the creepiest ones. Chickens are the creepiest birds there are. I can feel my heart rate going up, and I can feel my tummy doing things that I don't want it to do, [00:43:11] Sani: Time to stop it. [00:43:12] Els: because yeah, it's really, it's too much for me. [00:43:17] Sani: Time to end that, and time for me to thank the audience for listening to this episode. Like I said earlier, you can go to nohackspod.com/follow, subscribe to the podcast on Apple, Spotify, YouTube, Instagram, Twitter, even Twitter, I don't know why, but even Twitter. And Els, you know what I'll do, I'll praise you, because you were a great guest. You were an amazing guest, and I'm so thankful. [00:43:40] Els: Do not say good job at me, man. Do not. [00:43:43] Sani: Oh, it was better than good. It was way better than good, so I'm not going to say that. And I really want to thank you, because this was the second episode, and I think it was even better than the first one. I enjoyed it even more. And I can't wait to meet you in person at Conversion Hotel. But one final question for you: if you had to say one word or one phrase to yourself six months from now, what would it be? [00:44:06] Els: Don't forget to feed the cats.