Paul Blunden: Hi, I’m Paul Blunden, founder of UX24/7, and today I’m going to be carrying out another interview in my series on rolling research, or rapid research. I’m also looking at continuous research and iterative research, and really trying to find out why and how rolling research differs from those, and why you need it.
Today I’m going to be interviewing Amanda Gelb, who has written and spoken extensively on rolling research, so I’m really looking forward to learning more from her. Let’s go and meet Amanda.
Paul Blunden: Hi, Amanda, and thanks so much for meeting with me today. For the benefit of our viewers, can I ask you to introduce yourself?
Amanda Gelb: Absolutely. Hello, everyone. I am Amanda. I’m framing myself as a professional question asker these days. I’ve been a design researcher, a strategist, a product researcher, a usability researcher. For the most part, I, you know, ask questions to help people gather meaningful data to make impactful decisions. And I’ve done that for over a decade now, mostly in tech, at Google and Lyft and a range of other companies.
Paul Blunden: Cool. And how did you get into research? What was your kind of journey into the profession?
Amanda Gelb: Yes, it’s so funny. I feel like nowadays you can study it, you know, at school. But when I was coming up there wasn’t really a language even for user experience or any of this kind of thing. So I studied business undergrad; I got a bachelor of commerce. I thought I was gonna do corporate social responsibility and change the world from inside businesses. And I studied business ethics and entrepreneurship and all of these different things. And then I was like, you know, I don’t... it’s funny, cause now I do play with data and numbers all day.
But I didn’t wanna be, you know, kind of just focusing on the business side. I really was like, oh, I love the ideas of what was coming through. And so a friend of mine sent me this woman’s writings; she had this creative consultancy, a boutique experience design firm. I kind of cold called her and was like, hi, I’m graduating and I’ve done these things. I ended up working for her a few years later, and that was kind of my in to the industry. And then she was really instrumental in encouraging me to go to graduate school, where I learned about user experience and coding and wiring and 3D printing and all kinds of different things. And it was there that I learned about user research as a term, got an internship that summer, and was kind of off to the races.
Paul Blunden: Fantastic. It’s a funny thing, the number of people who come into research with some sort of altruistic desire somewhere back in their past, you know, like CSR and all the rest of it. And I think it’s one of those jobs where you’re a professional, you say, question asker, but also a professional helper. You’re trying to help people. So it’s a great career, I think, for that.
Amanda Gelb: Yes. And I feel we can’t use the word empathy; it’s just been too overdone. But in the earlier days, when I was describing what I did, it was that, which is really cool. You know, I have mostly worked for product teams that are in service of the people that are going to be using whatever we’re building. But you know, we can’t build for ourselves. And so being able to learn and create knowledge is really cool. I’m grateful to love what I do.
Paul Blunden: Yeah. And you’ve joined the industry at a great time here, and technology is going bananas. And obviously this interview series is primarily about sort of rolling research, rapid research.
Paul Blunden: And you mentioned in your introduction you worked at Lyft, and obviously I was really interested to learn about what you did there. You set up the rapid research program.
Paul Blunden: I’m just really interested, you know, to kick that off: how did you arrive at the research operating model you eventually implemented?
Amanda Gelb: Absolutely. Well, first, perhaps I’ll share how I arrived at Lyft, which is really...
Paul Blunden: Yeah, that’d be great. Yeah.
Amanda Gelb: There was, you know, a medium-sized team, mostly based in San Francisco, and I was hired as the first researcher for the New York office. I showed up, and there were 80 other, you know, cross-functional folks there. And you know what it’s like as a researcher: first people don’t really understand your value, and you’re like, I’m here! Hello! Is anyone out there? And then people get a taste of what you can actually do for them and the product and, you know, who you’re trying to learn about. And then it’s like this total stampede of research requests.
And so, you know, I had done some lunch and learns to kind of share the value of research, and I had some phenomenal design partners off the bat who already knew, you know, how to utilize my services. But for the most part I was a little bit overwhelmed, to be completely honest with you, you know, kind of servicing this whole office and all of these different teams and all of their research needs. So the program kind of stemmed from that, as a way to scale myself.
And then, yeah, we can go into more details. But I also was really fortunate with this group of designers and product partners to be able to iterate with them over time. And so, you know, I sensed this was going to be the case, Paul. I didn’t walk in and get super surprised; I’d been in house before. There’s always more demand than supply for research and researchers in house, just in terms of how we’re staffed proportionately to other disciplines.
And so, you know, I’d done a little bit of research. I spoke to a friend at Etsy, and they had a program going there, and I’d spoken to someone at LinkedIn, and they had their bento box thing going. Again, this was maybe eightish years ago or so. But I’d kind of come in with a little bit of knowledge about how these types of programs had worked elsewhere, and the nuances between them, and had that in the back of my head, or up my sleeve, if you will, as something that I could lean on if needed.
Paul Blunden: Got you. And I love the idea of you trying to work out how to scale yourself.
Because so many researchers are not only the researcher, but they are the capability that the organization has. And I’m not surprised to hear you say that demand, you know, exploded. So often it’s the case: when organizations start doing research, they want to do more, and there’s never enough time. So, okay, you started off needing to scale yourself, and you had some ideas. How did you go about it? How did you get buy-in? What did you do?
Amanda Gelb: Yeah, you know, I kind of pulled back and thought about how to set up a system. And I set up some sort of straw-dog type system, which is, you know, how we tend to do research anyway: some sort of intake form, right? Someone comes to you with an idea, or something that’s launched, or, you know, some producty thing at some level of fidelity.
So it’s like, okay, I need a way to get a sense across the office of what do we actually need research on? And then some sort of filtering system, right? Like, what makes it through that barrier? And we can talk about that in a minute. Initially I was kicking off with every team in person, and I would kind of say, tell me, what do you think is going on here? What are your hypotheses? What are your big questions? What do you want to happen if this thing is successful? Or, you know, we’ve launched something out in the world and you haven’t done any research on it prior; what can we glean that would be helpful? And I’d get a little bit more color there. And then I would write up the test plan, right? So the goals, the objectives, the method, the approach, how many people we were gonna speak to, all of that kind of thing. And then I would throw that back to the team to give feedback on and to edit.
And then I would also write up a discussion guide, right? Like, what are the questions that you’re asking, the order that you’re asking them in, all of that kind of moderation piece. And then I ended up creating a moderation training. So in my program, as part of scaling myself, I was kind of the camp counselor, pulling all the strings from behind the scenes and setting it up well. But it was my partners, my cross-functional partners, engineers, product managers, designers, data scientists, you name it, who were conducting the actual interviews. And so, in order to kind of trust them in front of real customers, I would have them go through this training that I created.
That was about two hours and had kind of all of the best practices, and I’d have them practice; you know, there’s this interactive piece. And then I would have them do a dry run, right? So practicing it with someone that wasn’t familiar. And then they would show up on the day, and it would be kind of like a round robin where, you know, everyone would just rotate. This was pre-Covid, so it was in person in our offices, this very exciting, you know, rapid research day in the office. We’d kind of clear the space, and every team had their own little room, you know.
I made sure that every team had a partner who was taking notes, and, you know, all of the pieces were kind of tidied up and everyone was ready to go. And then I would bring in this group that I had recruited, who were general population, you know, type users. Because Lyft is a two-sided marketplace, sometimes it was riders, sometimes it was driver focused; we picked one of those. And the teams would go off for 20 minutes, and I had a literal bell that I would ring, and that would mark that everyone would rotate, you know, to the next group. I built in bathroom breaks and check-ins with the teams. But those are the components and the flow of what it looked like.
Paul Blunden: Wow. So yeah, the way you describe it, you know, it’s a research day; everybody’s participating, everybody’s doing it. It must have been a fantastic way for these stakeholders to get in front of customers.
Amanda Gelb: It was phenomenal. And when I think about what was most important to me, it was building a culture of learning, right? Like, these folks are not professional moderators. You know, they weren’t gonna ask the best questions or follow-up questions; they weren’t gonna extract the most meaningful data. But they were, there’s this term I’m seeing circulating now in the industry, "people who do research," right? And everyone has these folks that do research internally anyway, whether it’s marketing or sales or a number of different folks who aren’t necessarily designated researchers but are doing that kind of research. And for me it was: how do I get people who don’t necessarily, you know, do research, or sit face to face with someone that’s using or going to use a piece of our product?
How do I generate that connection with them? How do I put them in, you know, the driver’s seat of what I love, which is just hearing from people in their own words? And it’s funny, Paul, because, you know, I think folks can come at me, and have, with, like, oh, researchers should be doing this research. I, you know, kind of shifted my thinking there. But for me, again, my goal was building this culture of learning, not necessarily executing on perfect research. And, you know, I had projects that were cut for that reason, and projects that I took on, etc.
But it was: how do I help my partners learn as best they can? Because I, as a researcher, could create, I mean, the most beautiful insights deck, and it could be sparkly and shiny and have the most amazing, you know, emotional pull quotes, and this phenomenal video that I’ve edited, and all of these strategic, you know, whatever it is. I could create this whole narrative, but it will never be as impactful.
And I have found this to be true, as it is for someone sitting face to face and hearing directly from the customer. I’ve had engineers after this, you know, rapid research day who went off and changed code immediately, like overnight. They’re like, oh my gosh, this thing is broken, I need to fix it now. Whereas, as we all know as researchers and designers, sometimes you know what’s broken, and you gather this data in support of that, and then it still needs to be put on the roadmap, or approved, or whatever it is. So that was kind of my secret sauce, beyond that learning piece: how to get people really bought into learning from folks and making changes to our product as a result.
Paul Blunden: Yeah, it’s like in the pre-Covid days, when you used to have labs and viewing rooms, when you’d get someone into the viewing room for the first time and they’d suddenly be enlightened by seeing their customer failing to use the interface they’d designed. And what you’ve designed there is getting them even closer, which is really impressive. And you sort of alluded to quality there, and to democratization; we might talk about that later on, actually.
But to start with, I wanted to understand a bit about the cadence, cause when people talk about rapid research and rolling research there’s a lot about: what’s the cadence? How frequently is it happening? So how did that work for you?
Amanda Gelb: That’s a great question. So, a few things. The first is that we called it rapid research; that’s what we named it internally. But as more researchers got bought in, and we did this across the entire company and multiple lines of business, we were like, oh, this isn’t actually rapid for us as researchers. It actually takes a lot of time to pull together on the back end, so that, you know, we’re getting the insights, perhaps, more rapidly, but it was a, you know, medium-sized lift for us. But yes, I started running it every six weeks.
And I have seen companies do it monthly, where, like, this is the day that the customers come in, you know, it’s the end of the month, or the first Friday, or that kind of thing. We ended up shifting to quarterly. And we kind of created this process of who we were weeding out and who we were letting into this program. It was, you know, kind of a drum roll to get to submit your application. And then we had some really thoughtful, you know, conversations, just as a group of researchers as more folks implemented this program, about the types of projects that this would be good for or not, and who had a researcher staffed to them or didn’t, right?
There’s a number of different components that we thought about in terms of who got in. You know, we ended up doing it quarterly and had a lot of success there, because it gave people the amount of time to think about it with their own roadmapping. And ultimately it got to the point with a few of our lines of business where, because we knew we were offering this quarterly, when teams came to us with these larger foundational research asks that were kind of out of scope given the other priorities we were playing with,
we’d say, okay, we can’t do this whole three-month engagement, but we do have rapid research coming up in, you know, five weeks’ time, or whatever it is, and I think we can help you answer, you know, a subset of these questions, or this particular set of questions; we can add you to the lineup already. So there’s kind of that bartering-behind-the-scenes system too, just because we knew we had it in the schedule. When we were building out at Lyft, we were building in six-month increments, so on the half cycle, because we’re building for H1 or H2, we also knew which projects, perhaps, to throw a guaranteed spot as well.
Paul Blunden: Gotcha. Okay, so when you were talking about people making applications to be part of the program, you’re referring to the stakeholders; is that right?
Amanda Gelb: Yes.
Paul Blunden: Gotcha. Okay. And so you described it, or it is described, as rapid. Where does the rapid come from? Is it because it’s happening so quickly for the stakeholders?
Amanda Gelb: Yes. I think oftentimes research is perceived as slow, and that is both true and untrue, and maybe a topic for a future chat. But perception-wise, it’s that it could take a while. And it can: to find the right people and bring them in, it could take up to two to three weeks to do the recruit, and, if you’re doing foundational research in particular, to really get clear on what you’re trying to learn and what the approach is and all of those kinds of things. And this was seen as something that was happening more frequently at the company, and that gave you, as a cross-functional partner, access.
And for us, we really determined the best types of projects for rapid research were ones that needed an incremental change. Right? There was a very clear, kind of low-lying question, and the answer to that question, whatever the research, you know, pointed to, could be acted on in a pretty quick way. So that was the rapid piece there, also. You never know fully what you’re gonna find in research; that’s why we do research. You know, there are times we’re like, whoop, gotta rethink that one, or this, you know, design system isn’t quite working.
But for the most part we tried to focus on questions that could be pretty easily answered by a, you know, group of five to eight customers, and doing that in a way in which, executionally, there’d also be some room for change. And for me as a practitioner, that’s something that I’ve hammered home more and more, which is that as I’ve gained more trust and credibility where I’ve worked, I won’t do research, and this was the case for our rapid research program, unless there are people staffed to act on it. We do not have the luxury of doing research so that it can sit on a shelf and be kind of just validation: okay, that’s good to know. We are doing research when we have design or engineering or product who are then gonna rewrite a product spec, or gonna make some sort of design tweak, or that engineer who’s going to go off and fix that bug, whatever it is; people who are kind of staffed to what happens next.
And, you know, when we think about our intake form, those were some of the questions we asked: what are you going to do with this research? How are you going to act on it? Who do you have on your team who’s ready to action it if needed? And that’s the rapid piece as well.
Paul Blunden: Got you. And that must create a bit of a sense of scarcity, almost, that makes people, you know, if they wanna access the program, they gotta be ready. Is that what you saw, or how did that work out?
Amanda Gelb: Yeah, I think that was mixed. And more, you know, kind of the interpersonal piece that we do as researchers, right? The conversations in the hallways, or on Slack, or that kind of piece. Like, I had an engineer, you know, also pre-Covid, who came and showed up at my desk and was like, okay, I built this thing and I think it’s really cool, and I haven’t shown it to anyone. And normally I would say no to those types of things.
You know, we again can only pick five or eight projects to be in any given rapid research cycle. But in this case it was really thoughtful, and getting that bit of feedback for him, and being able to have those pull quotes, would enable him to go to leadership and say, look at how this can be something that we build. And that turned into a brand new feature. So there are these kind of edge cases where, you know, someone’s kind of just showing up like, hey, I’ve got this thing, or, you know, I was in this hackathon; can we play this out a little bit more?
Paul Blunden: And you’ve mentioned, or alluded to, evaluative research, and you mentioned foundational research. So, in your opinion, the kind of research these programs are best suited for: what would that be?
Amanda Gelb: Evaluation.
Paul Blunden: Right. Okay.
Amanda Gelb: Yeah, across the board. We’ve played with foundational, you know. When someone has a great question, I, you know, just professionally really wanna help answer it. And so there have been times where there have been foundational questions. And of course, you know, this is meant to be 20-minute interviews with eight people, wrapped in one day, kind of thing; we’re not going to get at that foundational depth and nuance that we’d like to. But I’ve helped folks who weren’t going to have resources otherwise tease that apart into a set of questions that we could get really clear direction on, to be able to move forward with the next step of thinking. So mostly evaluation, and that’s where it shines best: what’s the tactical thing, what’s, you know, usability, or things at that level. But occasionally I’ve let some more foundational projects trickle in that were thoughtful in that way.
Paul Blunden: Gotcha. It’s quite similar in the rolling programs we run, where sometimes we evaluate, sometimes the product isn’t ready, or whatever, and cause you’ve got people coming you don’t wanna lose the session; it’s perishable. So, you know, if we can ask some questions, we do, and that’s fine. That’s quite a difficult one, particularly with stakeholders not familiar with research: trying to help them understand the reliability, and what we can and can’t do. Did you find that problem? Because I know you’re a big, passionate research-as-a-team-sport person. How did you do with that side of the stakeholder stuff?
Amanda Gelb: Yeah, it’s funny that you say that. I feel like 50% of your job, especially if you’re in house as a researcher, is evangelization and education of some sort. And so, yeah, with this intake process, again, I would just kind of get as many people to fill out the form as possible, and that forced a conversation, or, you know, an asynchronous digital follow-up, that I would have with them. And sometimes it would be like, oh, this is a great question; that should be a survey, you know, not the right program for you. Let me connect you with,
you know, our quantitative researcher; we can get that going in that bucket of work. Sometimes it was like, oh, this is a great question, it is foundational, you know, I hadn’t considered that; thank you for bringing it to my attention. I’m going to be doing roadmapping for research priorities in the next two months. Is that too late a timeline, or can we work it in there? You know, a lot of it is kind of just teasing apart. And, I think, particularly cause I’ve worked in tech for a while, everything seems urgent. But part of the work, to your point about the stakeholders, is: is it urgent? Like, what will actually break, or go awry, or what wheels will fall off, if this thing is not answered? And the urgency piece was one of the components that we would consider in determining what projects make it in, in addition to, you know, whether you have folks that are staffed and ready to act on the research.
Paul Blunden: Okay. Well, I said we’d return to the question of democratization, and you mentioned people who do research. And clearly the Lyft program was bringing in people who weren’t researchers to run the research. I was really interested in your view; I wanted to ask you at the time about, you know, quality, and how that worked for you. But I think beyond that as well, the research industry the last two or three years has been pretty volatile. I wondered: has your opinion changed? Do you still think that’s a valid thing? Tell me what you think.
Amanda Gelb: Yeah, it’s a spicy question, because democratization is this really polarizing word. And I’m sure you’ve seen on LinkedIn and elsewhere people attacking democratization, people praising democratization. It’s funny: I have so many product folks, who either have their own consultancies or are in house, who talk about continuous discovery, which, when I peer behind the curtain, truly sounds like the exact same thing that they’re trying to set up. And I’m like, oh, you’re doing research. I just saw a post from a co-worker who has her own product, you know, strategy group, and she was like, here’s why we need to answer the why of the question. And I’m like, oh, that is qualitative research; you were talking as a product person about doing qualitative research. And I don’t think we can prevent that.
And it’s unrealistic for research to just grab and have ownership of the entire discipline and be the only people that do research. I do think we as researchers bring a rigor and a knowledge of methodologies. And moderation is really hard; really quality moderation takes, truly, I think, five to eight years just to get to be a medium-good moderator. And that’s something I’m super passionate about.
And the questions that you ask, going back to our introduction, directly impact the data that you get and that you action off of. And so, you know, taking someone for granted, or thinking you understand what they’re saying when you don’t, not taking that moment to go back and clarify something, you know, whatever it is: avoiding those are kind of the marks of really seasoned researchers that do phenomenal rapid research, or any kind of research, and that I don’t know that our cross-functional partners honestly can do, just because they are doing 12 other things as part of their jobs.
But that being said, they’re still doing it, and I don’t think we can prevent them from doing it. And so my approach is: how do I help you do it better, and create the parameters, right? Like, what are the boundaries here? And for me at Lyft, and, you know, when I consult with other companies that are considering this kind of approach, maybe it’s training their research team, right, to be able to do that, with the cross-functional partners supporting it another way. Maybe it’s doing a series of a few moderation trainings to up the skill level for those cross-functional folks.
Maybe the rigor the researchers bring, and the education, lies in the projects that make it in or don’t make it into this type of program. I think there are a number of ways that we, as expert researchers and facilitators and folks that understand the entire scope, right, you were talking about the stakeholder piece, can have those conversations and be valuable internally. And whether you call it democratization or continuous learning or, you know, whatever you want to call it (it’s been rebranded 12 different times), I think it’s the skill set we bring to it as researchers, and making sure that we’re still involved to some degree, whether that’s levelling up our cross-functional folks or taking a little bit more ownership of particular pieces of the puzzle.
Paul Blunden: So it’s a turkeys-voting-for-Christmas kind of question, I think, but obviously the message I’m getting is you still believe in the research specialism. Do you think that organizations value that skill?
Amanda Gelb: You know, we’re in an interesting moment in time right now, and I think we have to acknowledge that there are tons of layoffs. I think the research reckoning that’s been written about, and the various spin-off articles, is a little bit overblown.
One thing I’ve been thinking about is what we do well as researchers, how we think, how we interact with folks, what we bring there, and how to position that better internally. We do need to be close to the business; we do need to be close to our customers. I think we could do a better job at some of those elements. But yeah, I think these organizations that are just mass letting go of all of these researchers and designers and folks still need to put out a product, and that product isn’t gonna be as good. And they’re gonna feel that later; they’re really gonna feel that burn.
You know this, Paul, from the work that your organization does. There are these paper cuts that you get whenever you deal with a bad user experience, and if you have enough of those, or those are severe enough, it’s enough for people to not be able to sign up, or, you know, to kind of go to one of your competitors. And so I’m kind of excited, in that dramatic popcorn way. I hope everyone, you know, is doing well in the world, but for the folks that have kind of written off the discipline, I’m excited to see that come back and bite them, if I’m being completely honest. It’s gonna hurt you, and it’s gonna hurt your bottom line. Like, we as researchers do help ROI, even in helping prevent folks from building something. Right? Like, maybe sometimes rapid research comes back and we’re like, oh, we shouldn’t invest in that.
It’s invisible to the bottom line, but we’ve saved the company, you know, hundreds of thousands of dollars of engineering time or staff time not building that thing. So I think part of it is just that hidden value that doesn’t necessarily make it, you know, onto the balance sheet, but can be really heavily felt as well.
Paul Blunden: Yeah, I think we’ve got a better chance of getting onto the balance sheet than we used to have as researchers, because research used to be a cost, nobody knew what happened, and nobody was really measuring impact. But I hear more and more from clients: we want to measure impact. And I think that can only be a good thing. And I have to say, my hope is that this is an economic issue, because what you opened up with, you get into an organization and they want to do more research, that happens everywhere.
And I feel like, you know, we’ve just gone through this explosion, and companies had to make some layoffs, perhaps because the economy is terrible, but the desire to carry on doing it, I think, exists; they just can’t afford it, perhaps. Anyway, that’s what I hope; I might be completely wrong. So, I mean, we’ve spent quite a bit of time, and I must move on, because I am very conscious of taking too much of your time.
But we talked a bit about democratization, and I think that was probably the last big shift in the lives of researchers, and AI may well be the next. How do you think AI will impact maybe the role of researchers, or our day-to-day lives?
Amanda Gelb: Absolutely. It’s already impacting my life as a researcher. I use it to do quick competitive audits. I use it to get up to speed on, kind of, the day in the life of, you know, an audience or a customer that is new to me, or that I’m trying to learn about. I use it to help clarify my headlines, right? So if I’m trying to create really clear insights, I’ll kind of use it as a writing partner. It’s a teammate right now. I know all of these products have rolled out and are rolling out to do the actual interviews. That’s where I get a little worried, because, again, we as researchers can pick up on tone of voice, on body language; there are all of these other elements that are going on in the background.
And right now, at least, AI doesn’t understand that full picture when you’re talking to someone; it’s just taking the literal responses. And just to give an illustration, I was once doing contextual research, so I was in someone’s home, and I was asking about parent and kid digital media usage. So I said to a parent, hey, how often do you let your children watch TV? Is that a limit that you set? Etc. And the parent was like, oh, I’m very strict, only on the weekends, etc. Then the kid came home from school in the middle of our interview, naturally put their backpack on a hook, went into the den, and turned on the television.
There was no permission asked. There was no conversation. This seemed like a very natural thing that was happening. AI can’t pick up on that; it can’t pick up on that thing that’s happening in the background. And I think part of our value, and what we bring as a research discipline, is knowing that what people say isn’t necessarily what they do, and as much as possible we try and marry attitudes and behaviors to get the full picture. So forget sarcasm; I think AI will get good at that kind of stuff. But if it’s actually conducting the interview, I do think you miss a ton of data and potentially go in the wrong direction, because you’re not picking up on those other contextual clues.
And then, yeah, at least right now, and this will improve, it’s not reliable. I uploaded a transcript the other day, and it started telling me things, and I was like, no, this is not right. I had to go back and reread the transcript and do it all myself. And so I think that accuracy gap will close. But the other piece is, you are still responsible: as the researcher, the product manager, the designer, whoever you are on the product team, it’s your name that is attached to whatever the insights or the findings or the recommendations are. And I think there’s only a degree to which we can trust these machines, and not our own expertise and our guts as well.
Paul Blunden: Yeah, I couldn’t agree with you more. The term you used, writing partner, I think is a very good one. And I was interviewing some product leaders and asking them about AI, and got a similar thing; I think someone said to me, it’s more an assistive intelligence than anything else.
And actually, we’ve done research ourselves where we ran a project, then ran another project using AI to do the analysis and everything else, and, quite frankly, it made some things up about participants that weren’t even there. You just can’t rely on it for that kind of thing. And when you’re moderating, how can you rely on it there as well? It’s a good assistant, but not quite ready yet. I’m with you; I suspect it will be one day, so we shall have to see.
Good stuff. Well, look, Amanda, it’s been fascinating talking to you. Have you got any other final thoughts about stuff that’s cropped up while we’ve been talking, where you were thinking, God, I hope he asks me that, or I want to get a chance to share that?
Amanda Gelb: Yeah, thank you. I mean, this is coming across, also, just in the conversation: I’m a big proponent of rapid research. I think we can leave the democratization and the terminology aside. Having a process where you are getting continuous input from the people you are trying to build for can only be helpful to companies. And so, regardless of what’s happening economically or elsewhere, I think it’s a huge investment that feels important to make.
Paul Blunden: I couldn’t agree with you more, and certainly the evidence of what you did at Lyft supports that. So you should be very proud of your work there, and it’s just been a pleasure speaking to you. Thank you so much for giving up quite a lot of your time, more than I had asked for; I really appreciate it. And yeah, best of luck with the next big thing.
Amanda Gelb: Yeah, thanks for having me on.
Paul Blunden: I hope you enjoyed listening to Amanda as much as I enjoyed interviewing her. The case study on Lyft was brilliant, particularly how that then led into democratization, what it is and how it’s changing, and then AI; yeah, really interesting thoughts.
You can get in contact with me via LinkedIn; I’m Paul Blunden, founder of UX24/7. And you can find me on our website; that’s ux247.com. And the email address there is hello@ux247.com.
And of course you can subscribe to this channel, and there’ll be other interviews in this series coming along. And you can access some of the interviews I’ve done with researchers around the world, and also people in product leadership roles.
Thank you very much for watching.