In this episode of The Optimal Path, Hannah Anokye joins host Ash Oliver to discuss her work on the Intuit Futures Team and the role of qualitative research in exploring emerging technologies, understanding users' needs, and identifying opportunities for new product development.
Hannah shares best practices for conducting moderated user interviews and how to generate insights that spark ideas and drive growth. Discover how integrating user-centric approaches in qualitative research nurtures innovation and redefines mental models, leading to more resonant and effective product solutions.
About Hannah:
Hannah is a product innovation practitioner with a unique background. Through a career with titles in product marketing, user research, and now product management, her through line has been user-centered experiences that drive meaningful impact.
Most recently as a Senior User Experience Researcher on Intuit's Futures team, Hannah explored how Generative AI technology could be responsibly and meaningfully integrated across the Intuit product ecosystem.
She recently transitioned into a product management role, focused on building the next generation of AI-driven products for Intuit’s future customers.
Connect with Hannah:
You can connect with Hannah on LinkedIn.
To get notified when new episodes come out, subscribe at maze.co/podcast.
See you next time!
Ash Oliver:
The value of user insights is well-known to product innovators, with generative, qualitative research holding the key to turning conversations into breakthrough products that are both relevant and revolutionary.
Hannah Anokye:
Qualitative research is incredibly important because when you're working with a scenario that's a what if, asking someone, "How would you do X, Y, Z?" or, "How would you use this tool?" is maybe not the most impactful type of question you're going to ask. Instead, we want to really understand the person, so what does their life look like? What problems do they encounter on a regular basis? How do they see the world? And knowing those things can help us understand, if we introduce this technology into their life, how will they more than likely react to it or use it or integrate it, knowing these things about them in terms of their context and the problems that they face.
Ash Oliver:
Today on The Optimal Path, we're getting an inside look at the Futures team at Intuit and how qualitative research is a cornerstone to innovation.
I'm Ash Oliver and this is The Optimal Path, a podcast about user research and product decision-making, brought to you by Maze. Our guest is Hannah Anokye. Hannah is a product innovation practitioner at Intuit. Through a career with titles in product marketing, user research, and now product management, her through line has been user-centered experiences that drive meaningful impact.
I'm really excited to have you on the podcast. Thanks so much for being here.
Hannah Anokye:
Yeah, thank you. I'm super pumped to be here as well.
Ash Oliver:
So you're a product innovation practitioner and have a unique background, given you've been a product marketer, a UX researcher, and have recently transitioned into product management. I'm looking forward to getting your perspective on research across these disciplines, but want to start with our core topic of conducting qualitative, generative research within the Futures team at Intuit and how it connects to innovation. So I thought we could begin with some setup around the Futures team. Maybe you can describe what this is, how the team's structured, and what the main goals of the team are.
Hannah Anokye:
Yeah, definitely. So I work at Intuit on the Futures team, which is housed within our technology organization. And we are essentially tasked with exploring emerging technologies and building out proof of concepts and MVPs that can eventually be imported into our roadmaps across our different products. When I joined the team about a year and a half ago, we were exploring a number of different technologies, from blockchain and crypto to AI to even the metaverse, so really exciting stuff. And most recently we've really dove into AI experiences because, as we all know, that's where the world is shifting. So really looking at where we can push the envelope in terms of offering AI-driven experiences to our customers in a way that makes sense for our business, and of course is done responsibly.
Ash Oliver:
It's such a cool team to be a part of.
Hannah Anokye:
Yeah.
Ash Oliver:
Is there a combination of roles that make up this internal tiger team or is this a combination of different pods that come together? How does that work?
Hannah Anokye:
Yeah, that's a great question. It's a rock star team. Everyone on the team is really, really talented, really open to trying different things outside of their wheelhouse, which I absolutely love. In terms of the makeup, the majority of our team is engineers, because of course they're the ones building out these proofs of concept and MVPs. And then we also have a number of product managers who lead the charge on the roadmaps. And then we have a really strong design team that comprises product designers and product researchers. We also have program managers. We have a fully stacked team, which I think sometimes is rare for these types of innovation legs of organizations, but it's really empowered us to explore things from really early stages to, okay, how can we actually build it and how can it actually be integrated into our product team roadmaps?
Ash Oliver:
Yeah, I love that. Since we're going to be spending the majority of our time together talking about qualitative research, how does qualitative research drive innovation within the futures innovation team at Intuit?
Hannah Anokye:
Yeah, so when I was a researcher on the team, we definitely skewed more to the qualitative side of things. I will say we value mixed methods research a lot because it's this great yin and yang where depending on what stage of the process you're in, qualitative research is incredibly important to really help you understand the root cause of things, the core problems, how your target users see the world. And then the quantitative research can really help you validate some of your initial thoughts and understanding about the user and how they might utilize the technology or experience we're building. And so it's wonderful when we're able to marry those two things together. But I will say, for the Futures team, because we're often working with technology or concepts that people have never heard of or don't encounter in their regular day-to-day life, I think qualitative research is incredibly important. Because when you're working with a scenario that's a what if, asking someone, "How would you do X, Y, Z?" or, "How would you use this tool?" is maybe not the most impactful type of question you're going to ask.
Instead, we want to really understand the person, so what does their life look like? What problems do they encounter on a regular basis? How do they see the world? And knowing those things can help us understand, if we introduce this technology into their life, how will they more than likely react to it or use it or integrate it? Knowing these things about them in terms of their context and the problems that they face. That's where having a strong researcher on your team matters, and also just having the strong triad. So we work with a traditional triad model where it's a designer, a product manager, then a tech lead. And the designer could be a product designer or a researcher.
And together, we really work to outline, okay, what are the key questions we want to go after given the stage that we're in? Do we have a lot of information about this problem space or this use case? Do we have a little information? And depending on those questions, we can figure out are we at the early stage where we really need to know more about the user, the problems they face, or are we at the stage where we have a lot of the information and now we need to narrow once we have more data? And then we keep iterating through that process until we have enough information to build something and put it in front of users.
Ash Oliver:
I'd love to contextualize this if we can, to maybe share an example of a time when interviews or qualitative research helped your team reframe a problem or user needs.
Hannah Anokye:
A project that I worked on when I first joined the futures team was in partnership with MailChimp. A lot of small businesses leverage it for marketing campaigns to send emails to their customers and leads. And so this was in really, really early stages of generative AI, before the big boom and before ChatGPT launched. We were exploring how can we leverage this technology within the email experience, so we embarked on this study with MailChimp. And we were interested in integrating generative AI into the email-building experience. We were most specifically looking at long-form email. Could we use it to generate an entire email campaign start to finish with very little input from the user? And so we were interested in a couple of different things. We wanted to see if people saw the value or how the technology worked. Would that be enough for them to be like, "Okay, I'm okay with just approving it. I don't need to edit it too much," or would they want a lot of control over it?
So at that point, it wasn't really about changing mental models because we didn't have a solid understanding of how people react to this technology because it was so new. It was more so about, okay, how can we validate or invalidate our initial assumptions? What we ended up learning was that users did want a lot of control. They needed a little bit more control over the input, even if that wasn't directly tied to the output. So it was a semblance of control in some sense, because it's such a new technology. And because, for these customers of ours, they're small business owners and they care a lot about their relationship with their customers. And so that was a really interesting project because we were so early in the adoption of this technology. I think if we ran it a year later, it would be quite different just because of how big ChatGPT got. But that was a really cool experience to dive into this new technology before it became mainstream.
Ash Oliver:
That's really interesting, and I'd love to get into some of the aspects around how you might effectively glean some of these insights. How do you ensure that the insights from this research are substantiated with stakeholders?
Hannah Anokye:
Yeah, that's a great question. I think as a researcher, this is a key thing you need to be thinking about. You can be the best researcher ever. You can have the most interesting insights, but if you don't know how to work with stakeholders throughout the process, you're limiting the ability of your insights to make impact. So there's a couple of different ways I suggest people do this. One is bringing your team along from the beginning. As a researcher, a place where you can collaborate with your product manager, your engineering lead, and your product designer is when you outline the research questions you want to go after, the assumptions you currently have, and the riskiest assumption you need to probe into more deeply. So getting people on board early makes it so that when you get to the phase of outlining what's most insightful about this research, you can always tie it back to those questions and objectives, and because people were bought in at that point, they'll hopefully be bought into those insights that you come to at the end.
The other thing is making sure you understand what an insight actually is. So there's a big difference between an observation, a finding, and an insight. Observations and findings are really important because those are supporting details or other things that can help you understand the user or the problem space. But an insight needs to be directly tied to your objective, and it also needs to have design energy. And so what that means is if you look at your insight and you read it, does that spark ideas? Does that spark opportunities? And if the answer is no, then maybe you want to rephrase it or maybe look at some of your other data to come up with a stronger insight, because insights should lead easily to the "So what? Why does this matter?"
And if you think about the stakeholders that researchers work with, product managers, engineering managers, et cetera, they're really busy. Product managers are always in meetings. They have to make decisions based on a lot of different data. And so you want to make it as easy as possible for your product manager to look at this insight and be like, "Oh, this is just giving me ideas." This is something I think is really, really key for researchers to have impact on their team, so really just integrating and thinking about your partners from the beginning to the end of the study.
Ash Oliver:
I love the way that you've described this factor of design energy and if it sparks ideas. I think that's a great way of approaching this. What are some of the key things that pop out for you when you know that you have design energy?
Hannah Anokye:
I definitely did not coin the term design energy. I learned it in grad school in a class with Martha Cotton, who's an awesome design researcher. The strongest insights I have seen identify contradictions. So what they often look like is it'll start with, "While users behave in X, Y, Z way, in fact, they believe A, B, C." The beginning of the insight explains maybe a behavior that was surprising or a behavior that maybe we expected, and then the second part of that phrase describes the contradiction. And so a contradiction introduces tension. And if you observe tension, that's somewhere where you can visualize how to break that tension. And so looking at an insight like that, it's like, oh, that's fascinating. It sparks ideas because there's that tension, that friction, or it completely goes against what maybe an initial assumption was when you started that project and you outlined what are your leap-of-faith assumptions or what are your riskiest assumptions? The insight might completely bulldoze that initial assumption. And then from there, you can think through, okay, what are these opportunities and some ideas that are sparked from that?
Ash Oliver:
I think it's an excellent way of being able to identify when you have the makings of an insight. We're going to get a little bit into the different methods under the branch of qualitative research, so thinking about when you're in that interview session, for example, in what ways do you look for body language or contextual references that might contribute to a deeper understanding or insight?
Hannah Anokye:
This is one of my favorite things about interviewing users, and something that researchers, designers, and product managers can use. Everyone who is tasked with really understanding the user should try to incorporate it into their practice. When you conduct research, you typically have a script or a discussion guide. But I think the strongest researchers go beyond the scripts to really connect with the person that you're talking to and observe, identify, sometimes even mirror how they're communicating with you. So sometimes people might take long pauses. Just live in that pause. I think an initial reaction and reflex is, I'll fill it up with words. I'll try to break this awkward moment, when it may not even be awkward. It may just be someone is thinking or they need a little bit more space. And rather than you as the interviewer filling the space, let the person you're interviewing fill that space, because it's your job to really listen and let them lead the conversation regardless of whatever your discussion guide is.
The other one is leveraging their own language. So if you're talking to someone who is referring to something that you commonly refer to as X, but they're referring to it as Y, or maybe they call it the wrong name, you typically want to mirror what they're talking about and use their own references and their own language to keep them comfortable. And also, think through what does that tell you about them, the fact that they call it something else? You can maybe even ask about it. And these are things that go beyond a traditional discussion guide and set apart more advanced researchers or interviewers. And the other thing is paying attention to contradictions, because we as humans, we contradict ourselves a lot.
And oftentimes, when you're first meeting someone, getting to know someone, you want to showcase your best self or showcase the version of yourself that you want to be or should be, or sometimes you think that the person interviewing you wants you to be. So as an interviewer, you have to be creative about getting past that. So really trying to make them comfortable, connecting with them, asking them certain questions multiple times but in different ways can help you get past that initial guard that they might have up and really get to the heart of what they're actually experiencing.
Ash Oliver:
Those are great suggestions and deep listening techniques. I'm thinking about, especially given your background where research has been intertwined across these different disciplines, how would you suggest non-specialized researchers, so perhaps your product managers or your designers, balance the need for rigor and best practices within research while also paying attention towards this intuition in an interview setting?
Hannah Anokye:
All product people, whether you're a designer, product manager, researcher, should try to learn how to interview people well. And so that means that whether you have an hour with a person, 30 minutes, 10 minutes, you know how to maximize that time. So I think it's a skill that people should actively try to learn and get beyond just reading a discussion guide. That will take most of us really far in where we're trying to go. The other thing is just being self-aware. Just know where you are in that journey. If you're not the strongest interviewer, if it's a really complex problem, I do not suggest you try to embark on that on your own. Consult your researcher and see if they have bandwidth to help or can make things a little bit easier for you.
Ash Oliver:
Any resources that you would point out in terms of being able to develop interviewing techniques and skill set, or is this where you could be a silent observer in certain scenarios where researchers lead the interview and your PM counterparts or your designer counterparts may be able to witness that? What would you recommend?
Hannah Anokye:
Yeah, definitely. Researchers' counterparts should for sure get involved in interviews where they can, so sit in on them, take notes. Being a note-taker can help a lot: let's say you're the secondary note-taker tasked with just taking notes on behavior, observing the intangibles or the things that are unspoken. That can help you learn to look for those things when you are the interviewer. There's a book I was recently rereading: Interviewing Users by Steve Portigal, and he's awesome. He's really well-known in this space. And in this book, he's really great about saying it's not about qualitative, quantitative, it's not about ethnography, it's not about title even. It's just here's how you interview users in an impactful way regardless of what role you are in. And then my last tip is really that you just have to do it. If you're a PM or a designer, you're probably going to be in front of users anyway, and practice is the best way to get better at any skill.
Ash Oliver:
I'd love to address the common misconception that qualitative research may yield insignificant results due to the sample size or objection around how much time is required in order to run this type of research. I'd love to just get your take on how you might be able to overcome those objections.
Hannah Anokye:
Yeah, definitely. It's something I'm keenly aware of. Thankfully it's not something I had to face in my role on Futures, but I've heard stories about how the types of data that seem to be valued at many companies tend to be quantitative, like product analytics. So the common misconception is that because with qualitative research, you're interviewing sometimes six people, sometimes 10, sometimes 12 or 20, that's still a lot fewer than the number you get from a quantitative study. And so it feels sometimes less impactful, or why would I listen to a handful of people when I could get data from a thousand? I think that's a strong misconception because you can have bad data both ways. You can have bad data with a poorly recruited qualitative study, and you can also have bad data with a poorly recruited or poorly analyzed quantitative study or product analytics that are sliced the wrong way. So that risk is there for both types of research and data.
But for qualitative studies, I strongly recommend really investing in a solid recruit, so your data is only as good as your recruit. So that means really thinking through what is the screener going to look like? Who do we want to get answers from or information from? How diverse are the individuals that we're talking to? Are they all within the same income bracket? Are they all within the same ethnic background? If so, how can we go beyond that? Because if you find themes across a diverse set of individuals, that's a really strong signal that there's something there.
Ash Oliver:
Yeah, that's a really important highlight. I wonder if you could speak about different methods like co-design and how having more of a comprehensive approach might contribute to product innovation.
Hannah Anokye:
Working on an innovation team is really awesome because we get to utilize all of these different methods to explore problem spaces, explore reactions to technologies, the concepts that we're building, et cetera. So some of my favorite methods besides strictly interviewing users, the top one for me is definitely ethnography. And so what that is is really just observing people in their natural environment and just taking note of how they interact with others, what activities are they doing, what language do they use, et cetera, and having that inform your understanding of the end user. Just sitting with people observing them can be incredibly powerful because, again, there's a lot that you just can't get in an interview that you get from watching real behavior. You bridge that say versus do gap in a stronger way when you're doing ethnography.
The other one is co-design. I really love doing co-design because it helps us go beyond our own environment, our own little bubble, and our own biases. If you're co-designing with users that are completely different than yourself or the people that you typically surround yourself with, you're often going to come up with incredible ideas that you would've never come up with yourself. It's a really powerful method to use on an innovation team because we're building from zero to one, we're building from the ground up, and it's really important to make sure the folks that contribute ideas are diverse and don't really look like the majority of our team, because that's who we're building for. On that same type of method is concept testing. So typically, we'll co-design with our target users, and then we'll also test concepts with them. So we'll build out some of the ideas that we came up with together in a low-fidelity sketch or Figma file and put that in front of them and have them express what stands out to them and why. And then from there, it's a collaborative process to build new products and experiences.
Ash Oliver:
I'm curious, in the co-design process, if there were any surprises or key findings that maybe reshaped your approach?
Hannah Anokye:
I can think back to, again, the early days of generative AI, before ChatGPT became mainstream. Our team was experimenting with it, but we also knew that we wanted an internal instance of it. And so there was the technology behind it, but then there was also the front end. What would the front end look like? Were we going to replicate the playground of what OpenAI had built, or would we build our own unique one based on what our internal users needed? I ran a study where I interviewed people, but I also showed them a couple of the different features of OpenAI and had them ideate, brainstorm, based on this feature, how would you use it and what kind of things would you want to see in our internal experience? And what we learned from just the co-design session was people, first of all, didn't even know all the things that GPT-3, at the time, could accomplish.
And we initially had a hypothesis that developers, engineers, they all were really familiar with the interface that OpenAI had at the time. It was not as user-friendly as ChatGPT. It was like a playground, and it used very technical terms. And so we thought, "Engineers, oh, you guys already know how to use this. It's going to be great. We'll just design it to be the very same as OpenAI's playground." But as we co-designed, we realized people were very excited by what OpenAI could do, but they also just had no idea how to use it. They didn't understand that it could do things like translate and summarize and things like that. So most of us were operating from a very similar place of not understanding how this technology works and not knowing how to utilize the UI, and so the co-design session revealed the limitations of people's imaginations at the time. It was interesting because it revealed that our initial hypothesis was wrong, and it revealed this new emergence of the mental models of some of the groups that we were looking at.
Ash Oliver:
Yeah, it's really cool to contrast that with the earlier MailChimp example, where trust emerged through that research, versus this context, which underscores the importance of education and exposure, as you described. This has been great. Really exciting to learn more about how you and your team operate and how you're able to incorporate research into how you're innovating. I'd love to transition into the last part of our segment. This is a series of questions that we ask every guest just to get to know you a little bit more personally. And my first question for you, Hannah, is what's one thing that you've done in your career that's helped you succeed that you think few other people do?
Hannah Anokye:
Asking bold questions. Bringing up angles that other people in the room may not have thought about is really important, and sometimes asking a question can help you assert an opinion without having to actually assert that opinion. You're just asking the question. So being curious and asking bold questions has been incredibly important for me.
Ash Oliver:
Love that. I know you already recommended a book, but what is the industry-related book that you've given or recommended the most and why?
Hannah Anokye:
I recommend Designing Your Life all the time. At this point, I think they should sponsor me or I should get an affiliate link. The authors are Bill Burnett and Dave Evans. This book came out in 2017 and it truly changed my life and how I think about my career. And I think it's great for people who are early in their careers, but also people who are feeling like they're at a crossroads or maybe not so engaged in their work or certain aspects of their life. It helps you apply the principles of design to different aspects of your life, so I think it's a wonderful book. There's lots of great exercises in there, and I highly, highly recommend it.
Ash Oliver:
Yeah, I have it as well, and a big plus one to that. My last question for you is what is an unusual habit or an absurd thing that you love?
Hannah Anokye:
I didn't know it was that unusual until recently, but I'm really big on power naps. I think sleep hygiene is super underrated. I had a sleep specialist come to my high school. Well, two of them actually. One of them told us that 10-minute naps are all you need, and then another one said 26-minute naps are great. And so depending on how much time I have, I'll do 10 minutes or 26 minutes, and it's been amazing for me over the years.
Ash Oliver:
That's super great. Hannah, this has been great. It's super interesting to get your perspective. Thank you so much.
Hannah Anokye:
Yeah, you're very welcome. This has been such a pleasure. I love talking about this stuff.
Ash Oliver:
Thanks for listening to The Optimal Path, brought to you by Maze, the user research platform that makes insights available at the speed of product development. If you like what you heard today, you can find resources and companion links in the show notes. If you'd like to stay connected, you can subscribe to the podcast newsletter by visiting maze.co/podcast, and send us a note with any thoughts or feedback to podcast@maze.design. And until next time.