How to Use AI in User Research – Three Easy Wins

We present three ways of using AI in user research that can help you speed up menial tasks and focus on strategic work.

AI won’t take your job. Someone using AI will.


In the early days of the generative AI hype, user researchers might have been worried that their jobs were on the line. After all, that job involves processing data, understanding patterns, connecting those patterns to high-level concepts, and creating actionable insights – things that artificial intelligence excels at. 

However, now that AI-powered tools have entered the mainstream and we’ve coexisted with them peacefully for a while, we can see that Prof G was right. Designers are still designing, copywriters are still copywriting, developers are still developing, and user researchers are still researching.

In fact, research from the leading UX organization Nielsen Norman Group confirms that many of us are using AI in user research regularly – and probably wondering if we really did all that by hand before.

In this article, I’ll show you three ways we are leveraging AI in user research without overhauling our whole process.

AI in user research – the new tool in your toolbox

AI is not a new user researcher but rather a new Maze or R. Like any other method, technique, or tool, when deployed properly, it helps you do your work more quickly. When deployed poorly, it helps you be out of work more quickly.

You are still in the driver’s seat. And the gist of user research remains the same: find valuable insights, present them to the product team, and engage in the ideation of improvements and new features. 

User research has always been a game of balancing time and value. Just think about the dozens of menial and repetitive tasks included, the painstaking grind needed to get to the good stuff, all the things you’d happily delegate to your junior colleagues. 

Unfortunately, you don’t have infinite junior colleagues – but you have AI (tip of the hat to Benedict Evans, who calls this the era of infinite interns).

Assuming you’ve at least experimented with AI in user research, I’ll skip the basics and the full list of tools that leverage AI. You can find those in our Product Strategy Handbook, anyway.

We’ll focus on the enhancements in the user research process made possible by conversational AI tools, tried and tested on actual projects.

#1 Crash course on any domain

When I checked my schedule for last year, I realized I worked on projects in banking, telecom, hospitality, e-mobility, gaming, and social media. All of those clients expected me to be knowledgeable about their domains. But this is agency life, right? It’s part of the job. 

Before AI tools, you’d do online research and interview a few subject matter experts (SMEs) and client stakeholders. Now, you have an SME at your fingertips – and if you’re using a tool that can search the internet, you’re working with up-to-date info.

AI doesn’t fully replace talking with SMEs but speeds up the process immensely. You’ll understand the domain more quickly, and your questions will be more “insider” when you interview them.

As an added bonus, asking these types of questions does wonders for your image in front of clients.

The questions below can probably be useful in a number of scenarios, but agency user researchers are likely to get the most mileage out of them.

This is what we like to ask AI when diving into a new industry:

  • What are the three main players in the industry? Describe each in three main bullet points expanded on with three sub-bullet points. (Ask for your geo market when applicable.)
  • Who are the newcomers to the industry in the last few years?
  • What’s the most common business model in the industry?
  • What are some common business KPIs in the industry?
  • What are some non-business factors influencing the industry? (Go through the PESTLE framework if unsure.)
  • Who are the common users/customers in the industry? Describe their demographics and psychographics.
  • What are users complaining about the most in the industry?
  • What are the most common features of digital products in the industry?
  • What trends will be shaping the industry in the next five years?
  • What are some industry lingo phrases that I should know? List at least five and provide an explanation for them. (This works wonders to make you look like an insider.)

Keep in mind that these are conversation starters. Put your curiosity hat on and ask follow-up questions the same way you would with an SME. The nice thing about talking to an AI chatbot is that you don’t need to keep notes. You’ll always have the conversation log. 

Additional pro tip: ask AI to summarize your conversation about that industry.

#2 Fast-track research script building

Research scripts are pesky little things to build. Even when you know how you want to conduct the research, you have a template, and you’ve done something similar before, it’ll take you a lot of time to structure the script in a useful way.

Lucky for you, your AI junior colleague can get your script to 80% done in minutes. It will still need some additional tweaking, but those are value-adding tweaks that are both impactful and interesting.

When I say “script,” I mean anything you might use to elicit and capture a response or behavior from participants: interview and survey questions, usability testing task instructions and helper text, follow-up questions, instructions for moderators, response gathering tables, metrics to be tracked… 

Here are the steps to get the most out of AI when compiling research scripts:


1. Provide context on your product or service, and be as specific as possible. You can even share wireframes or sketches (respecting the boundaries of an NDA, if one exists).

2. Provide the goal of your research. NN/g’s qual-quan and behavior-attitude frameworks are a good way to go.

3. Provide the method you plan to use. Sometimes it makes sense to ask the AI which method it would suggest – you might be pleasantly surprised.

4. Ask for the whole package. Cover the “before, during, and after” materials you’ll need – for instance, recruitment emails, usability tasks, and testing metrics.

5. Ask AI to overdeliver and provide alternatives. If you plan to have four usability tasks, ask for ten, plus two alternatives for each.

6. Instruct AI to structure the outputs. You want neatly organized titles, subtitles, and bullet points.

7. Tinker with the outputs. Tell the AI tool which parts are OK, which ones you want removed, and which ones it should rewrite or expand.

Once you’re happy with the script, copy/paste it into your text editor of choice. A few final edits, and you’re good to go.
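If you find yourself rebuilding the same brief for every study, the inputs above – context, goal, method, the whole package, and the over-delivery ask – can be templated. A minimal sketch, with all names and wording illustrative rather than tied to any specific tool:

```python
# Illustrative prompt builder for a research-script brief. It bundles the
# context, goal, method, deliverables, and an over-delivery request into
# one structured prompt for a conversational AI tool.
def script_brief(
    product_context: str,
    research_goal: str,
    method: str,
    deliverables: list[str],
    tasks_needed: int = 4,
    extras_per_task: int = 2,
) -> str:
    """Assemble one structured prompt covering the script-building steps."""
    package = "\n".join(f"- {d}" for d in deliverables)
    # Ask for more tasks than needed (e.g. ten when you plan to use four)
    # so you can cherry-pick the best ones.
    overdeliver = tasks_needed * 2 + 2
    return (
        f"Product context: {product_context}\n"
        f"Research goal: {research_goal}\n"
        f"Method: {method}\n"
        f"Deliverables (before, during, and after the sessions):\n{package}\n"
        f"Draft {overdeliver} candidate tasks so I can pick {tasks_needed}, "
        f"plus {extras_per_task} alternatives for each.\n"
        "Structure the output with titles, subtitles, and bullet points."
    )
```

The defaults mirror the rule of thumb above: plan for four usability tasks, ask for ten, with two alternatives each.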

#3 Summarize qual raw data

Qual data is a gold mine, but not many user researchers enjoy the mining part. Luckily, parsing text data and finding patterns is where AI shines.

Your own interview and testing notes are still irreplaceable. They’re your playground for hypothesizing, marking high-level trends, and ensuring you follow up on interesting participant remarks.

However, the process of going through interview transcripts and tagging topics in participants’ answers is very much replaceable – and delegatable to your AI assistant. As a bonus, you don’t need a multilanguage team if your qual data comes from multiple countries. Most AI chatbots are very good at translating. 

Here’s how we use AI to get actionable summaries of qual participant data:


1. Tag your data with “Participant 1”, “2”, “3”, and so on. Remove any personal information that might be in the transcripts and other notes.

2. Provide context on your product, your research goal, questions or tasks, and participants’ backgrounds.

3. If needed, break down your qual data per question or task. This makes the data easier for the AI to digest and keeps major trends from getting buried under heaps of raw material.

4. Ask AI to find 3-5 key themes in the data, provide three key bullet points on each theme, and reference the participants talking about it.

5. Play around with the outputs: ask additional questions, get AI to rephrase some themes, provide three additional ones, ignore some, or dive deeper into others.

6. Once you have a better understanding of the most common themes, ask AI to structure the findings into three categories based on the rose-thorn-bud framework: already good, pain point, and potential improvement.
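The tagging and theme-extraction steps above can be sketched in a few lines. This is an illustrative sample, not a complete anonymization pipeline – names and wording are my own:

```python
# Illustrative sketch: tag transcripts as "Participant N" and build a
# theme-extraction prompt for a conversational AI tool.
def tag_participants(transcripts: dict[str, str]) -> dict[str, str]:
    """Replace participant names (dict keys) with 'Participant N' labels.
    Real PII scrubbing - names, emails, phone numbers *inside* the text -
    still needs a manual pass or a dedicated tool."""
    return {
        f"Participant {i}": text
        for i, (_name, text) in enumerate(sorted(transcripts.items()), start=1)
    }

def theme_prompt(tagged: dict[str, str], research_goal: str) -> str:
    """Build a summarization prompt over the tagged transcripts."""
    body = "\n\n".join(f"{label}:\n{text}" for label, text in tagged.items())
    return (
        f"Research goal: {research_goal}\n"
        "Find 3-5 key themes in the transcripts below. For each theme, give "
        "three key bullet points and reference the participants who mention it.\n\n"
        + body
    )
```

Feed the result into your chatbot, then iterate – rephrase themes, ask for extras, or request the rose-thorn-bud breakdown as a follow-up.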

You can use AI in workshop tools such as Miro or FigJam for a similar purpose. However, that doesn’t mean you should.

Since you can’t provide any context or play around with outputs, AI tools in Miro or FigJam are better suited for summarizing workshop notes, not proper qual data.

Such qual analysis capabilities will probably soon reside in UXR tools. Dovetail has sentiment analysis, and I wouldn’t be surprised if they added something similar to the process above to their transcript analysis.

I suggest you leave summarizing to AI only if you’ve been present in the sessions with participants. If the AI summary is off, your gut feeling will immediately catch it. Trust your instincts.

Use AI in user research with a grain of salt

The magic of AI can make you believe you can pull off even the most complex research projects, and this is something to be wary of. AI is a Dunning-Kruger effect catalyst. With its help, beginners might feel they are on par with experts in mastering the subject. 

Always take what you get from AI in user research with a grain of salt. Catching issues or illogical outputs requires mid- or senior-level expertise in user research.

AI tools are great at generating text outputs that look and sound convincing, so if you’re only starting with user research, tread lightly.

And regardless of your level of expertise, trust but verify what you get from AI. Check the key information through independent desk research, testing the hypotheses with participants, or talking to SMEs.

As I said, you’re in the driver’s seat – don’t delegate creating the research goal. Your strategic and product-specific perspective is unique, and AI is just a tool you’ll use to achieve that goal more easily.

Human+AI = a great user researcher

AI can enhance any UX researcher’s work, but the trick is to deploy it wisely. Use it on tasks where its capabilities shine and human motivation tends to drop. 

The three techniques described above will power up your user research game. You’ll be able to focus on finding actionable insights and providing value for your product team while delegating less demanding tasks to AI.

Since these techniques allow us to focus more strongly on high-level strategic and research work, we’d be happy to discuss where you want to take your product. You can always reach out to us here.