How I’m Doing More User Research (with a little help from AI)
Three ways to make user research faster, lighter, and more fun — without losing the human touch.
Every product manager and designer I know wishes they had more time for user research and testing. We all know how valuable it is, but it’s also the most common thing to slip when people get busy.
But what if AI could help us get some of those insights faster — without losing the human nuance?
I’ve been experimenting a bit recently with how I could use AI more in this space, both for work and for my side projects.
The goal here isn’t to replace real user research, but to explore how to use AI to make it easier to ask better questions, faster.
In this week's post I want to talk about my three favourite methods at the moment. They're saving me a bunch of time and helping me get more user insights into everything I'm working on.
AI + Surveys: My shortcut to valuable customer insights
I love surveys. They're one of my favourite ways to get information from users when you need data and insights quickly or at scale.
But even though they're quick and easy to do, sometimes I still struggle to find the time to get one done.
A few weeks ago I wanted to get some customer feedback on a new feature we’d released.
I could see people were using it and seemed happy — but I wanted to dig deeper to learn what they really thought, and how we could improve.
Normally I'd draft some questions, share them with the team to get feedback, see if I've missed anything, and then set up the survey.
But I wanted some real user feedback for a presentation I was giving later that day. And I needed it fast.
I decided to see if ChatGPT could help me spin something up quicker. I gave it a quick brain dump of what I wanted. Here’s a slightly shorter version of what I asked:
I want to create a quick survey to send to customers who have used our new <feature>. The goal is to understand if they are happy with it, and any suggestions they have to improve it. I also want to understand how big of an issue <potential feature we’re working on> is for them.
The last bit is because there was a feature we believed addressed a strong pain point, but we'd kept it out of the MVP due to time constraints. I wanted to understand how much the people already using the product needed this feature, so we could decide whether it was something we needed to build now or whether it could wait.
AI did a great job converting the request into survey questions, including suggested formats for each.
I then asked it to help draft a short email that I could use to send out the survey invite.
While these things probably wouldn't have taken me too long to do, getting the AI to help made this a 10-minute job instead of an hour spent overthinking and polishing it into something that felt 'good enough' to send.
It wasn’t perfect, but it didn’t need to be. And that’s kind of the point — AI helped me get to something valuable faster.
The survey went great, and within two hours I already had a bunch of awesome user feedback that helped me understand so much more about how users felt about the new feature. The good news: they actually loved it.
And that feature we thought addressed a major pain point? Turns out 80% of our users don't care about it at all. We're still missing some of the market by not solving for the other 20%, but now we have real feedback to help us decide when and how to solve for it.
Using AI to help validate an idea (without spending hours reading Reddit)
For weeks I've been wanting to do some more user and market research for a side project I'm working on. But each week it just kept getting pushed to the following week because I couldn't find the time I thought I needed to sit down and do it.
So one day, between sets at the gym, I decided to get my phone out and dumped a quick brief of what I wanted into ChatGPT.
Hey can you help me research what features users want in a journaling app? I was thinking about using reddit as a source. I want to understand more about what people want, current pain points etc to better understand how to make my app better
I use AI all the time, so I'm not sure why I kept putting this off. I think part of me wanted to read first-hand exactly what users were saying, worried that the AI would oversimplify things or miss something important.
But I was actually really happy with the result. The LLM took my very basic prompt and expanded on it. It pulled info from Reddit, Hacker News, Product Hunt, and other sites, and compiled a full report detailing exactly what I was looking for (with links back to the sources). Then it went a few steps further: it suggested some subreddits I could post in for more feedback, and drafted what to post.
I’m sure this could be polished up and expanded more, but I think the lesson here is you don’t need the perfect prompt.
Research doesn’t always need to be a project — sometimes it just needs ten minutes and a few good questions.
Building (and interviewing) AI versions of my users
Inspired by a recent post from Elena Calvillo, I’ve been wanting to test out building some AI user personas.
I decided to try this out for my current personal project – building a reflective journalling app. It was actually really fun and insightful, and it helped me understand more about my potential users: other apps they might use, why they stopped using them, what they want, and even what they'd be willing to pay.
The first persona that AI created felt like looking into a mirror, which is not necessarily a bad thing since it knows I’m building this for myself. It also made it easier to test if the AI was giving what felt like realistic responses.
But as I interacted with the others, I kept seeing more reflections of myself, which made me wonder whether ChatGPT had customised them all to be relatable to me, or whether everyone is exactly like me (unlikely).
I decided to get a second opinion and brought Claude into the mix. I use Claude a lot as well, but it knows less about me, and I felt like it would give less biased and less personalised responses, which is actually what I needed here.
The personas it created were similar: an anxious professional, a growth seeker, a creative, a gratitude practitioner, and a memory keeper. But they felt a little less like me (which was a good thing).
I actually really liked that it included one persona that isn't my target audience (the memory keeper, who wants a visual archive of their life). I know it's a large part of the audience, but it's not who I'm building for, so having that persona in the mix might be interesting for understanding how those users differ from my target ones.
Here’s the simplified step-by-step process that I used. I didn’t time it, but it felt like the whole process took less than 30 minutes:
1. Create your personas. I started broad, and this helped me understand a bit more about who I'm really building for and which personas I care about most.
2. Create a project or custom GPT where the personas will live. I like to have just one where I can swap between personas using commands. I also set it up with a /all option for times when I want to see how all the personas might respond to a new feature or change.
3. Perform virtual interviews with each AI persona. To streamline this, I got an LLM to draft the first set of questions, but once I had those I felt my brain open up with new ideas and questions to ask. I then ran through the questions with each persona, and it felt surprisingly similar to doing real customer interviews, just a lot quicker. I could ask follow-up questions, interact with the persona, and start to see what was similar and different between user types.
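If it helps to picture the command-switching set-up, here's a rough Python sketch of the idea. The persona names come from my Claude run (I've only included three of the five for brevity), but the descriptions and prompt wording are purely illustrative — they're not the actual instructions in my custom GPT.

```python
# Illustrative sketch: turn a /persona (or /all) command plus an
# interview question into the prompt(s) you'd paste into the chat.
# Persona descriptions below are invented for this example.

PERSONAS = {
    "anxious_professional": "a busy professional who journals to manage stress",
    "growth_seeker": "someone journalling to track personal growth goals",
    "memory_keeper": "someone who wants a visual archive of their life",
}

def build_prompt(command: str, question: str) -> str:
    """Build one in-character prompt per selected persona."""
    if command == "/all":
        targets = PERSONAS  # every persona answers the same question
    else:
        name = command.lstrip("/")
        targets = {name: PERSONAS[name]}
    prompts = []
    for name, description in targets.items():
        prompts.append(
            f"Answer as {name.replace('_', ' ')}: {description}. "
            f"Stay in character.\nInterviewer: {question}"
        )
    return "\n---\n".join(prompts)
```

In practice the custom GPT does this switching for you; the sketch just shows why one project with commands beats maintaining a separate chat per persona.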
One of the interesting things about this process wasn't just seeing what the personas said and learning things I hadn't considered; it also gave me more ideas about the kinds of questions I want to test with real users.
AI personas aren't a replacement for real user interviews. I'm sure there are still things real users will say or pick up on that an AI wouldn't, but combining the two could lead to better customer experiences and better-designed products.
AI isn’t replacing user research, but it can help us do more of it
Using AI for user research shouldn’t replace talking with real users, but it can make it quicker and easier.
It helps me move from wanting to do more research to actually doing it.
If you’ve been putting off your next round of research, try bringing AI in as a partner. You might be surprised how much momentum — and clarity — you can build in just 10 minutes.
My top three things you can try that will probably take less time than you think:
Ask AI to help you draft a user survey
Ask AI to help you with a research project you’ve been putting off
Create some AI personas and chat with them
If you try any of them out, I’d love to hear how you go! Drop a comment below.



