Analyzing User Feedback
User feedback is everywhere: support tickets, sales calls, Reddit, social media, and surveys. The challenge is synthesizing it into actionable insights without losing critical nuance. Treat every escalation as a gift, use AI to process large volumes, and balance quantitative data with direct qualitative observation.
The Guide
5 key steps synthesized from 56 experts.
Centralize and aggregate all feedback sources
Valuable product insights are often trapped in silos like Salesforce, support tickets, and sales call transcripts. Use tools like Dovetail or AI-driven analysis to aggregate and synthesize feedback from all sources into a queryable database that prevents redundant research.
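A minimal sketch of what that centralization can look like, assuming a plain SQLite table as the shared store; the source labels and schema are illustrative, not how Dovetail or any particular tool actually models feedback:

```python
# Normalize feedback from every silo into one queryable table.
# Source names and fields are illustrative assumptions.
import sqlite3

conn = sqlite3.connect("feedback.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS feedback (
        id         INTEGER PRIMARY KEY,
        source     TEXT NOT NULL,   -- e.g. 'support_ticket', 'sales_call', 'survey'
        customer   TEXT,
        created_at TEXT,
        body       TEXT NOT NULL
    )
""")

def ingest(source: str, customer: str, created_at: str, body: str) -> None:
    """Land one piece of feedback from any silo in the shared table."""
    conn.execute(
        "INSERT INTO feedback (source, customer, created_at, body) VALUES (?, ?, ?, ?)",
        (source, customer, created_at, body),
    )
    conn.commit()

ingest("support_ticket", "acme", "2024-01-15", "Export keeps timing out")

# 'What do we already know?' becomes a query instead of redundant research:
for source, count in conn.execute(
    "SELECT source, COUNT(*) FROM feedback GROUP BY source ORDER BY 2 DESC"
):
    print(source, count)
```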
Featured guest perspectives
"The first product we launched is called Reforge Insights, which acts like your AI product researcher, aggregates all the feedback from all the sources, uses AI to analyze it, helps you explore it, but also will start to identify what are the gaps, the things that you don't have in your feedback today."— Brian Balfour
"It's also about aggregating all the interviews or customer research that's been done so people can query it and start to see what do we already know so we're not going out there and duplicating a bunch of research."— Melissa Perri + Denise Tilles
Cluster feedback into behavioral pathways, not demographics
Don't just look for themes; cluster feedback into 'pathways' that reveal the different reasons users hire or fire your product. Look for conflicts where one user group wants speed while another wants thoroughness. Use LLMs to bucket user intent from unstructured conversational data.
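To make that concrete, here is a rough sketch using TF-IDF vectors and k-means as a stand-in for whatever embedding model you prefer; the sample comments and k=2 are invented for illustration:

```python
# Cluster feedback by the language of the job it describes, not by who said it.
# TF-IDF + k-means is a cheap stand-in for real embeddings; comments are made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

comments = [
    "the export needs to be fast",
    "fast export matters most to me",
    "I check every number twice for accuracy",
    "I check the numbers for accuracy before sending",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(comments)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# Each cluster approximates a 'pathway': one set of reasons users hire the
# product. A speed cluster conflicting with a thoroughness cluster is the signal.
for label, comment in sorted(zip(labels, comments)):
    print(label, comment)
```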
Featured guest perspectives
"And then what we do is instead of trying to look for themes across all of them, we actually do something, instead of segmenting them, we cluster them, we find the pathways because what you start to realize is it's not one reason why people do it, it's sets of reasons."— Bob Moesta
Treat every support ticket and escalation as a product failure
Customer support interactions are signals of product friction. Treat every escalation as a gift that enables deep root cause analysis. Track 'customer confusion' tickets as a primary metric and share every negative review with the relevant tech lead, PM, and designer monthly.
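If the confusion metric is to be more than a slogan, the tracking is simple. A minimal sketch, assuming tickets carry tags; the tag names, sample data, and freeze threshold are assumptions, not a prescribed policy:

```python
# Count 'customer confusion' tickets as a first-class quality metric.
from collections import Counter

tickets = [
    {"id": 1, "tags": ["billing", "confusion"]},
    {"id": 2, "tags": ["bug"]},
    {"id": 3, "tags": ["confusion", "onboarding"]},
    {"id": 4, "tags": ["how-to"]},
]

tag_counts = Counter(tag for t in tickets for tag in t["tags"])
confusion_rate = tag_counts["confusion"] / len(tickets)
print(f"{tag_counts['confusion']} confusion tickets ({confusion_rate:.0%} of volume)")

FREEZE_THRESHOLD = 0.25  # assumption: above this share, quality work wins
if confusion_rate > FREEZE_THRESHOLD:
    print("Confusion exceeds threshold: pause new features, fix root causes")
```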
Featured guest perspectives
"The first principle there was saying, 'Well, every support ticket is a failure of our product.' We literally have that as a quote just posted on all those channels. It's a failure."— Geoff Charles
"There's no greater gift to me as a product executive than receiving an escalation from a customer... Escalations are a gift... it's people who are just particularly skilled at pistol whipping other people in the company to get to real root causes."— Matt MacInnis
Monitor unfiltered community platforms like Reddit
Reddit and similar platforms provide higher-signal feedback on product flaws and 'real' user sentiment than polished social platforms or direct surveys. Use Reddit's upvoting mechanics to prioritize which user pain points to address first.
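A hedged sketch of that monitoring loop using the PRAW library, with upvotes as the prioritization signal; the credentials and subreddit name are placeholders you would supply:

```python
# Rank this week's subreddit posts by score: the community has already
# voted on which pain points matter most. Credentials are placeholders.
import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="feedback-monitor/0.1",
)

for post in reddit.subreddit("YourProductName").top(time_filter="week", limit=20):
    # Score and comment count together approximate 'how much this hurts'.
    print(post.score, post.num_comments, post.title)
```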
Featured guest perspectives
"A few of us are constantly on Reddit. There's praise up there and there's a lot of complaints... Reddit is a little more negative but real actually. So I've started increasingly paying attention to how people are talking about using Codex on Reddit actually... you get good signal on what matters and what other people think."— Alexander Embiricos
Create a 'bad day' chart to track friction
If you're unsure what to measure, emit a log line every time a user hits a 404, delay, or decline. Create a stacked bar chart of 'bad day' reasons to identify the most frequent pain points. This creates a prioritized backlog of quality improvements.
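A minimal sketch of the whole 'bad day' pipeline with Python's standard library; the reason labels and log format are assumptions, and in practice you would bucket the counts per day to build the stacked bars:

```python
# Emit one log line per bad event, then tally reasons for the chart.
import logging
from collections import Counter

logging.basicConfig(filename="bad_day.log", format="%(message)s", level=logging.INFO)

def record_bad_day(reason: str) -> None:
    """Call wherever a user hits friction: a 404, a delay, a decline."""
    logging.info("BAD_DAY %s", reason)

record_bad_day("404")
record_bad_day("payment_declined")
record_bad_day("404")

# Offline: count reasons; per-day counts of these become the stacked bars.
with open("bad_day.log") as f:
    reasons = Counter(line.split()[1] for line in f if line.startswith("BAD_DAY"))
print(reasons.most_common())  # the top entry is your next quality fix
```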
Featured guest perspectives
"If you are unsure what to measure... just make a user having a bad day chart, emit a log line and count it as a bar chart... It's like, oh, I'm working on making this chart go up and it feels good to just say the name out loud."— Jeff Weinstein
Common Mistakes
- Over-relying on dashboards at the expense of qualitative understanding
- Filtering feedback through intermediaries and losing critical nuance
- Taking user requests at face value instead of understanding the underlying need
- Dismissing research insights as 'obvious' due to hindsight bias
Signs You're Doing It Well
- You have a centralized, queryable database of customer feedback
- Product decisions are informed by both quantitative data and qualitative context
- You regularly uncover 'knowledge gaps': things customers aren't telling you
- Negative feedback trends decrease over time as you address root causes
All Guest Perspectives
Deep dive into what all 56 guests shared about analyzing user feedback.
Alexander Embiricos
"A few of us are constantly on Reddit. There's praise up there and there's a lot of complaints... Reddit is a little more negative but real actually. So I've started increasingly paying attention to how people are talking about using Codex on Reddit actually... you get good signal on what matters and what other people think."
- Monitor niche subreddits for unfiltered product complaints and bug reports
- Use Reddit's upvoting mechanics to prioritize which user pain points to address first
Aishwarya Naresh Reganti + Kiriti Badam
"You cannot practically sit and evaluate all the traces. You need some indication to understand what are the things that I should look at. And this is where production monitoring helps. And once you get this kind of traces, you need to examine what are the failure patterns that you're seeing in these different types of interactions."
- Use production monitoring to flag problematic traces for deeper review.
- Identify recurring failure patterns in user interactions to inform the creation of new evaluation datasets.
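A toy sketch of the flagging step they describe; the trace fields and thresholds are invented for illustration:

```python
# Flag production traces worth a human look instead of reading everything.
# Field names and thresholds are assumptions.
traces = [
    {"id": "t1", "latency_ms": 900,  "thumbs_down": False},
    {"id": "t2", "latency_ms": 8200, "thumbs_down": False},
    {"id": "t3", "latency_ms": 450,  "thumbs_down": True},
]

def needs_review(trace: dict) -> bool:
    # Monitoring signal: slow responses or explicit negative feedback.
    return trace["latency_ms"] > 5000 or trace["thumbs_down"]

review_queue = [t for t in traces if needs_review(t)]
# Recurring failure patterns found in this queue become candidates for
# new evaluation datasets.
print([t["id"] for t in review_queue])
```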
Anneka Gupta
"One way that we're using AI today is summarizing our user research calls. ... it'll find you the call, it'll find you the context, it'll find you the transcript and summarize exactly what we've learned from that call."
- Use tools like Dovetail to automate the transcription and summarization of customer discovery calls.
Barbra Gago
"We did case studies throughout the process. We got feedback on logos. We got feedback on brand and designs, and there were plenty of people who were like, 'Oh, I like RealtimeBoard. It's straightforward.' They loved their product, but ultimately, we kept many of them informed and we really involved them in the process to the degree of not slowing us down but still getting feedback."
- Run case studies and feedback sessions on new logos and designs with existing users
- Keep early advocates informed of major changes to maintain their sense of ownership
Bob Moesta
"And then what we do is instead of trying to look for themes across all of them, we actually do something, instead of segmenting them, we cluster them, we find the pathways because what you start to realize is it's not one reason why people do it, it's sets of reasons."
- Identify the 'hire and fire' criteria for different clusters of users.
- Look for conflicts where one user group wants speed while another wants thoroughness.
Brian Balfour
"The first product we launched is called Reforge Insights, which acts like your AI product researcher, aggregates all the feedback from all the sources, uses AI to analyze it, helps you explore it, but also will start to identify what are the gaps, the things that you don't have in your feedback today and auto-generate the research to go gather all those new insights."
- Aggregate feedback from all sources into a centralized AI-analyzable repository.
- Use AI to identify what customers are NOT saying to guide future research efforts.
Brian Tolkin
"All these ideas and feedback that comes from everywhere, make sure it's written down for a number of reasons. One, you can then go reference it, but two, part of the job is making sure the people who present those ideas are heard and respected and know that it's at least somewhere where it was considered."
- Capture all feedback in a centralized backlog (Google Sheet, Jira, etc.) even if it won't be acted on immediately
- Acknowledge the source of the feedback to maintain cross-functional respect
Bret Taylor
"Literally taking what a customer says or what a user says in a focus group or a usability study is rarely correct... it's very important to get right."
- Practice intellectual honesty to distinguish between surface-level complaints (e.g., price) and root causes (e.g., lack of value)
Chandra Janakiraman
"The ask is really to create a meta analysis of all of the analysis... scan the historical archives at the company and sort of synthesize and condense that into a very sort of digestible macro themes."
- Create a meta-analysis of historical behavioral data and feature launch performance
- Synthesize 'soft' signals from customer service, social channels, and UXR into macro themes
- Have the strategy working group observe a user directly to build empathy
Christine Itwaru
"The synthesis of both this qualitative and quantitative data as a theme that I saw arising across our customer base... bringing all of these different inputs that would traditionally be handled by product manager... to the surface when they're going through their product development lifecycle planning."
- Aggregate qualitative feedback from sales and success with quantitative usage data
- Surface synthesized feedback themes specifically during the product development lifecycle planning phase
"We brought those folks in a room together... our head of professional services team at one point was like, 'Whoa, this could truly impact what we do from an onboarding perspective and now we have this data.'... The PM got risk data, high-priority deals, feedback from our feedback product. What are we hearing from prospects versus paying customers."
- Bring revenue, success, and product teams together to review feedback collectively
- Segment feedback by customer type, such as prospects versus paying customers, to identify high-priority deals at risk
Claire Butler
"We implemented Intercom back in the early days... Dylan would jump in, an engineer would jump in. And he'd open up a chat with people and they'd actually debug the product with us live. They'd be like, "I have this bug," and this engineer would be like, "Let me QA it right now.""
- Have the entire team, including engineers and founders, participate in early customer support
- Debug issues live with users to demonstrate extreme care and responsiveness
"We decided a couple years ago, we had this idea where we were like, oh, what if we package all of those quality updates up into one thing and launch them together and we could even show the tweet or the forum request that spurred us to do this. And that was where our idea of little big updates came."
- Run 'quality weeks' where engineers fix small, high-annoyance bugs
- Publicly credit the specific user requests or tweets that inspired the updates
Dan Shipper
"Same thing for if you've got tons of customer interviews or tons of customer data you want to go through, it's incredibly powerful for going and figuring stuff out from big data sets like that."
- Use an agentic CLI tool to read through every individual customer interview file to find specific behavioral patterns
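This is not Shipper's actual tooling, but the sweep itself is easy to sketch; `ask_llm` below is a placeholder for whatever model client or agentic CLI you plug in, and the directory layout is assumed:

```python
# Read every interview file and ask a model for behavioral patterns.
from pathlib import Path

def ask_llm(prompt: str) -> str:
    # Placeholder: swap in a real model call (API client, local model, etc.).
    return "(model output goes here)"

findings = []
for path in sorted(Path("interviews").glob("*.txt")):
    transcript = path.read_text()
    findings.append((path.name, ask_llm(
        "List behavioral patterns in this interview, with supporting quotes:\n\n"
        + transcript
    )))
```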
Donna Lichaw
"Take a data-driven approach to the stories that you tell yourself. For example, the story, 'I'm too nice,' it could be true, it could not be true. How did we get down to the bottom of that? In this case, what we did is I went out there and talked to his team... And I found out how people actually experienced him and his leadership."
- Conduct informal 360-degree research to compare your internal self-story with how your team actually experiences your leadership.
Eeke de Milliano
"We use Slack very heavily to talk to our customers... we have hundreds of Slack channels with customers. Every time we're testing a new product... We will work with them in Slack, to get there off-the-cuff feedback, just back and forth."
- Create dedicated Slack channels for key customers or beta testers
- Use 'off-the-cuff' back-and-forth messaging to iterate on new features
Elizabeth Stone
"We talk about it as a superpower internally in combining those skill sets. So I think the Consumer Insights team at Netflix has had a lot of credibility in a certain area of expertise and we took it to the next level by combining it with other functional expertise... combining attitudinal research, qualitative and quantitative with behavioral research on more of the data science, data engineering analytics side."
- Integrate user research (UXR) and data science teams to solve complex product problems
- Validate behavioral data trends with qualitative user interviews to understand the 'why' behind the 'what'
Ethan Smith
"What are all the questions people are asking you on your sales calls, customer support on Reddit? Mine all those questions that exist somewhere else. Probably those same questions are being asked in chat."
- Review sales call transcripts for specific product comparison questions.
- Analyze customer support tickets to identify 'how-to' queries that aren't currently covered in your public documentation.
Geoff Charles
"The first principle there was saying, 'Well, every support ticket is a failure of our product.' We literally have that as a quote just posted on all those channels. It's a failure."
- Share every negative review with the relevant tech lead, PM, and designer monthly.
- Track 'customer confusion' tickets as a primary metric for user experience quality.
- Halt new feature shipping if operational burden or confusion metrics exceed a threshold.
Gibson Biddle
"I'm a feedback freak, so you'll notice at the end of every talk I do, every essay... there's a link to give me feedback. And it's always the same question: On a scale of zero to 10... how likely would you be to recommend this to a friend?"
- Include a feedback link in every product or communication
- Ask for a numerical rating (0-10) plus 'what you liked' and 'what could be better'
- Review feedback daily to identify patterns for improvement
Hamel Husain & Shreya Shankar
"The first step in conquering data like this is just to write notes... You sample your data and just take a look, and it's surprising how much you learn when you do this."
- Sample production traces and manually annotate them
- Perform 'open coding' by writing down the first thing seen that is wrong
"Keep looking at traces until you feel like you're not learning anything new... theoretical saturation."
- Aim for 'theoretical saturation' where no new error categories emerge
- Review at least 50-100 traces to build a representative taxonomy of errors
Guillermo Rauch
"Create a lot of opportunities for people to give you feedback inside the product... a feedback button with a very slick inline form, with four emojis that would allow you to decide how you were feeling about the feature... That would go straight into Slack. And we were building day in and day out, just streaming users' thoughts right into our consciousness."
- Embed a simple feedback button with emoji-based sentiment selection.
- Integrate in-app feedback directly into a shared Slack channel for the entire team to see.
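For the Slack half of that pattern, a minimal sketch using a Slack incoming webhook; the webhook URL is a placeholder and the message format is an assumption:

```python
# Stream each in-app feedback submission into a shared Slack channel.
import requests

SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def post_feedback(emoji: str, comment: str, feature: str) -> None:
    # One message per submission keeps users' thoughts in the team's view.
    requests.post(SLACK_WEBHOOK, json={"text": f"{emoji} on {feature}: {comment}"})

post_feedback(":rage:", "export dialog keeps freezing", "PDF export")
```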
Gustav Söderström
"If you look at what people do on Spotify's homepage... it is almost 90% what we call recall... When we tested the design... we switched it from 90/10 to 10/90. 10% recall, 90% discovery. And while people want discovery, they probably don't want 90% discovery, instead of 90% recall."
- Distinguish between 'recall' intent and 'discovery' intent in usage data
- Compare new user cohorts (who have no old habits) against long-term users to identify if feedback is about habit-breaking or bad design
Inbal S
"Spend time with customers and learn from your customers, because a lot of the innovative ideas are coming basically from conversation with customers because they're sharing with you their frustration, they're sharing with you what they would like to have, they're sharing with you what will be an amazing invention for them."
- Create mechanisms to gather feedback from both large and small community foundations
- Listen for 'amazing inventions' customers describe when discussing their pain points
Howie Liu
"doing a lot of LLM calls against long transcripts of let's say, sales calls to extract different types of insights like here's the product apps, identify or here's summaries, et cetera."
- Run LLMs against sales call transcripts to identify product gaps and positioning insights
- Use 'map-reduce' LLM patterns to aggregate insights from massive datasets
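A sketch of that map-reduce pattern under stated assumptions: `ask_llm` stands in for your model client, and the chunk size is arbitrary:

```python
# Map: extract insights per chunk so no prompt exceeds the context window.
# Reduce: merge the partial notes into one deduplicated report.
def ask_llm(prompt: str) -> str:
    # Placeholder: swap in a real model call.
    return "(model output goes here)"

def chunks(text: str, size: int = 8000):
    return [text[i:i + size] for i in range(0, len(text), size)]

def transcript_insights(transcript: str) -> str:
    partials = [
        ask_llm("Extract product gaps and positioning insights:\n\n" + c)
        for c in chunks(transcript)
    ]
    return ask_llm("Merge these notes into deduplicated themes:\n\n"
                   + "\n\n".join(partials))

report = transcript_insights("<paste a long sales-call transcript here>")
```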
Jeff Weinstein
"If you are unsure what to measure... just make a user having a bad day chart, emit a log line and count it as a bar chart. ... It's like, oh, I'm working on making this chart go up and it feels good to just say the name out loud."
- Emit a log line every time a user hits a 404, a delay, or a decline.
- Create a stacked bar chart of 'bad day' reasons to identify the most frequent pain points.
Jen Abel
"It is the founder's job to interpret that because a lot of feedback you get is this is the old way, they're responding this way because it's the old way of working... 80% noise, 20% had I not asked this question, I wouldn't have gotten that gold in terms of where they are today."
- Apply the 80/20 rule to feedback: 80% is noise based on legacy habits, 20% is the 'gold' that guides the future product.
Judd Antin
"Everything is Obvious If You Already Know the Answer. And it's about hindsight bias... we end up selectively remembering things and then constructing narratives around them in a way which makes us feel like we already knew that, when we in fact did not."
- Use research to explain the 'how and why' behind data, which AB tests cannot provide.
- Be wary of creating simple, convenient stories about past events that ignore conflicting evidence.
Julie Zhuo
"Conversational analytics is totally different... it's actually harder for us to tease apart what is the user intent... we probably have to use an LLM or a machine learning model to bucket user intent."
- Use LLMs to bucket and synthesize user intent from conversational data.
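A minimal sketch of that bucketing, assuming a hand-picked intent taxonomy and a placeholder `ask_llm`; neither is Zhuo's actual setup:

```python
# Classify free-form conversational messages into a small intent taxonomy.
INTENTS = ["seeking_help", "feature_request", "bug_report", "exploration"]

def ask_llm(prompt: str) -> str:
    # Placeholder: swap in a real model call.
    return "(model output goes here)"

def bucket_intent(message: str) -> str:
    label = ask_llm(
        f"Classify this user message as one of {INTENTS}. Reply with the label only.\n\n"
        + message
    ).strip()
    return label if label in INTENTS else "exploration"  # safe fallback on bad output
```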
Kristen Berman
"We try to understand what people actually do versus what they say they will do. And by doing that, it becomes wildly clear that we should all be skeptical that budgeting would actually work to change the behavior that you're trying to get to."
- Perform a behavioral diagnosis to see if a requested feature actually reduces the friction required for the target behavior.
Laura Schaffer
"I just started sharing a voice of the customer report. I started sharing my insights, started writing down and just sharing them. It became with digest and eventually people were like, 'Hey, can you share it with me? Can you share with me? Can you get on your list?'"
- Create a regular 'voice of the customer' digest to share with the company
- Host quarterly sessions to present customer insights to the product organization
- Compile feedback from multiple departments to provide a holistic view of customer pain points
Maggie Crowley
"People who are really excited about being data-driven, to me that is oftentimes a red flag for their product thinking... they're over-emphasizing quantitative data at the expense of qualitative data and they're not using good judgment."
- Balance quantitative dashboards with direct user interviews to understand the 'why'
- Talk to 10 users to get better insights than a dashboard can provide
- Use judgment to identify 'obviously better' improvements that don't require exhaustive data justification
Marty Cagan
"When we're doing user research, we're finding all the reasons they don't like it. In fact, that's an Elon Musk quote is when you do user a research, you should be focused on finding all the reasons they won't use your product."
- Focus user research on finding reasons why a customer would NOT use the product
- Use evaluative research to uncover flaws in the solution
"The second thing is they have to be an expert in the data. How is your product used? How is that change over time? What's the sales analytics? What's the user analytics? So you have to know how your product is actually used"
- Study user analytics to understand how the product is used over time
- Analyze sales data to understand the business impact of product usage
"Another is you're supposed to be the expert on the data. How is our product being used? How is that usage changing over time? How is it being purchased? So that's big."
- Become the primary expert on product usage data for your team.
- Track how usage patterns change over time to inform product decisions.
Matt MacInnis
"There's no greater gift to me as a product executive than receiving an escalation from a customer... Escalations are a gift... it's people who are just particularly skilled at pistol whipping other people in the company to get to real root causes."
- Treat every escalation as a gift and a chance to perform a deep root cause analysis.
- Trace errors back through the software, then the system, then the process that created the system.
Melissa Perri + Denise Tilles
"It's also about aggregating all the interviews or customer research that's been done so people can query it and start to see what do we already know so we're not going out there and duplicating a bunch of research."
- Use tools like Dovetail to aggregate qualitative research
- Create a 'findings database' that allows team members to query past interview results
"Another piece of this too that we talk about is getting qualitative insights from sales and support... what we're trying to do is get that out of these individual systems and into somewhere where a lot of people can take those qualitative insights and start to learn from them."
- Extract customer feedback from Salesforce and support tickets into a central system
- Establish a feedback loop with sales to track churn reasons and feature requests systematically
Melanie Perkins
"We get more than a million requests from our community every year and we've got a whole incredible team that then tallies them, breaks them down, and then delivers them to all of our product teams and then those actually get closed."
- Tally and categorize community requests to identify the most 'hotly requested' features
- Create a 'closing the loop' process where product teams are assigned specific community requests to resolve
Nick Turley
"I go through those in detail because it's not like I knew about those use cases either. They're very, very emergent and I just go through the comments and process because there's so much to learn. ... I find it very, very useful. It's just fun to watch people talk to each other about the various use cases that they have."
- Read comment sections on viral social posts to find novel user applications
- Use conversation classifiers to identify high-level trends in user data at scale
Nicole Forsgren
"Data from people and data from systems are really important compliments because we can get certain insights from people that we'll never get from systems. Let's look at lead time from changes, for example, from commit to deploy. The speed might be fine, but people might tell you it's taking absolute heroics. It's some ridiculous Rube Goldberg machine. The system will never tell you that."
- Survey developers at least once a year to uncover insights that automated systems cannot capture.
- Look for disagreements between system data and user surveys; the surveys are often the more accurate reflection of reality.
Nikhyl Singhal
"You have to be great at pulling feedback, listening to it. You have to triangulate it from people that don't see you all the time, that do see you all the time, your peers. But you have to create an environment of safety where people feel like there is no worry about retaliation or concern"
- Repeat feedback back to the provider better than they articulated it to show internalization
- Publicly share and credit constructive feedback you've received to encourage a feedback culture
- Look at the 'discard pile' of feedback—anomalies you previously dismissed—to find hidden development areas
Nikita Bier
"Put live chat customer support in your app 24 hours a day... it's the best vehicle for getting feedback and doing user research because users will literally tell you the problem they're having."
- Embed live chat in the app during the early launch phase
- Have a dedicated process for piping interesting support feedback directly into product development channels
Nilan Peiris
"And when you read the comments, and now obviously we've got all kinds of fancy models sitting on top of these things, customers kept telling us the same things, 'Make it faster, make it cheaper, make it easier to use.' ... And customers were pretty clear, the ones that were evangelical, is the word we use, are the ones that had a much... Had this cheaper experience, the ones that were talking about it had a fast experience."
- Read the specific comments from 'evangelical' users (9-10 scorers) to understand what drives their excitement
- Aggregate feedback into high-level pillars like price, speed, and ease of use
- Share raw customer comments with the entire company to build collective conviction on what matters
Oji Udezue
"Customer listening is different. It's not really discovery. It is the scarfing up of customer signals that are happening constantly anyway. So people are talking on social, people are talking on app stores and G2 crowd. If you have a instrumented churn survey, people are talking. If you have NPS, people are not only giving you the scores where they're putting in the verbatims."
- Triage signals from Zendesk, Salesforce (closed-won/lost notes), and app store reviews to see frequency distributions.
- Organize a system where PMs and designers have customer signals automatically surfaced to them to reduce the friction of 'listening'.
Patrick Campbell
"Only one out of 10 companies actually do customer research or development on a quarterly basis. This should be a continuous thing, it should be a monthly, weekly type thing."
- Commit to 10 non-sales customer conversations per month
- Send short, targeted surveys (30-45 seconds) rather than long, complex ones
- Use research to understand customer perception and worldviews rather than just taking feature requests
Ramesh Johari
"There's a great concept in the literature on rating systems called the sound of silence, which is this idea that there's a lot of information in ratings that are not left... this was much more predictive of downstream performance of a seller. So there's a lot of information in that lack of a response."
- Incorporate 'non-responses' into your quality metrics to get a more accurate picture of participant performance.
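A toy worked example of the idea, with made-up numbers: two sellers look identical if you average only the ratings that were left, but diverge once silence counts as signal:

```python
# 'Sound of silence': treat missing ratings as information, not missing data.
sellers = {
    "A": {"rated_5": 40, "rated_other": 5, "silent": 5},   # buyers usually respond
    "B": {"rated_5": 40, "rated_other": 5, "silent": 55},  # most buyers say nothing
}

for name, s in sellers.items():
    total = sum(s.values())
    naive = s["rated_5"] / (s["rated_5"] + s["rated_other"])  # ignores silence
    adjusted = s["rated_5"] / total                           # silence counts against
    print(f"{name}: naive {naive:.0%}, silence-adjusted {adjusted:.0%}")
# A: naive 89%, adjusted 80%.  B: naive 89%, adjusted 40%.
```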
Robby Stein
"When we asked people to understand people like, 'Why aren't you posting to your story? What's preventing you from doing it?'... It was like this commonality was audience problem. Someone had an issue with people watching them."
- Look for commonalities in why people *don't* use a feature to identify core product gaps.
- Use qualitative research to uncover 'audience problems' or social friction in sharing apps.
Sachin Monga
"We call it build with writers, build with readers... do it in a way where we bring writers along... We actually have now we've set up something called the Product Lab... an invite only little group of a hundred or so writers that we know are interested in being on the bleeding edge."
- Run small pilots with a subset of users to test high-risk or fundamental changes.
- Use direct feedback from these groups to iterate on the feature's design before it hits the broader dashboard.
Sean Ellis
"One of my favorite questions is... 'What is the primary benefit that you get?' And then I use that initially as an open-ended question to kind of crowdsource different benefits people are getting. But then I run another survey where I turn it into a multiple choice question, force them to pick one of four distinctive benefit statements. And then the question that follows on that next survey is, 'Why is that benefit important to you?'"
- Crowdsource benefits with open-ended questions first.
- Use multiple-choice follow-ups to force prioritization of benefits.
- Ask 'Why is this important to you?' to understand the user's context and pain points.
"Why don't we just ask them why they signed up and didn't download the software? ... we just asked, 'Hey, notice you haven't had a chance to use the product yet. It looked like it was coming from customer support. What happened?' And the answer we got back... was, 'Oh, this seemed too good to be true. I didn't believe this was free.'"
- Send personal-looking emails from 'customer support' to users who drop off at specific funnel steps.
- Ask open-ended questions about why they didn't complete a specific action (e.g., downloading or installing).
Shaun Clowes
"He used to call this concept a Feedback River, and he basically said that really smart product managers are constantly swimming at a Feedback River. They set out to surround themselves by Feedback River and I really deeply believe in that."
- Set up automated streams of user interview data, NPS, and competitor info to 'wash over' you daily.
"We use LLMs to take in those asks to summarize what they're about, to find other asks that are like that one, really in a compelling way, a real way, like a semantic way, not other words, exactly the same, are these the same concept? So that we can look across all of the inbound demand on us and say, 'Well, the most popular idea is this one and is getting more popular.'"
- Use LLMs to perform semantic clustering on inbound customer requests to identify true trends in demand.
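A lightweight sketch of that semantic grouping, with TF-IDF cosine similarity standing in for real embeddings; the sample asks and the 0.3 threshold are invented:

```python
# Group inbound asks by meaning so popularity reflects concepts, not wording.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

asks = [
    "please let me export my dashboard to pdf",
    "need to export the dashboard as a pdf file",
    "dark mode please",
]

sim = cosine_similarity(TfidfVectorizer().fit_transform(asks))
pairs = [
    (asks[i], asks[j])
    for i in range(len(asks)) for j in range(i + 1, len(asks))
    if sim[i, j] > 0.3  # with these toy inputs, only the two export asks match
]
print(pairs)
```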
Shweta Shriva
"The human brain is trained to, or the human drivers are sort of in subconsciously, they slow down when they go downhill in those slopes. The autonomous vehicle doesn't necessarily have to do that if it's safe... But we learned that this is a more natural driving experience and this is what our riders would also expect in terms of the experience. So that's something that we then modify the behavior on."
- Identify gaps where technically 'correct' behavior feels 'unnatural' to users.
- Modify system behavior to align with the subconscious habits and expectations of users.
Tomer Cohen
"Our research agent basically is trained on the personas of our members... It's using not just world knowledge, it's using all the research we've done in the past, all the support tickets coming in. So it's pretty good at understanding that persona at LinkedIn."
- Train research agents on internal personas and historical user research
- Feed support tickets into AI models to provide a 'voice of the customer' during the design phase
Tim Holley
"we were worried about certain sellers not being able to meet the demand that they were seeing. And so we did the old-fashioned thing, of not personally, but we called them and we said, 'How are you guys doing? What can we do to help?'"
- Call power users directly during crisis or high-growth periods to understand their immediate needs.
- Ask open-ended questions like 'What can we do to help?' to uncover hidden pain points.
Uri Levine
"In order to improve, we need to speak with those that fail, those that were unsuccessful, those that did not register, or they did register and did not use, or they did use and did not come back, because they know something that we really need to know."
- Interview users who churned or failed to complete registration to find the 'why' behind the failure.
Vijay
"What this created was this culture where all engineers and designers could consume that raw feed of direct points of customer with no gatekeeper, no process to access it, no pre-aggregation"
- Pipe customer 'gaps' from sales and success teams directly into a Slack feed
- Encourage engineers to reach out directly to customers to ask 'the five whys' behind a specific feedback point
- Enrich feedback feeds with account data (ARR, CSM name) to provide context for the team
Yuhki Yamashata
"We created this new channel, private channel called Concerning Tweets, and it just, we're this small group of us that Dylan can drop us in. And these are tweets that aren't going viral, by any means. They're just things that you see is with one like, sometimes zero likes, but he feels there's an essence of truth to them and we make sure that we look at what's going on there and see if there isn't something much bigger that we should be focusing on."
- Create a dedicated channel for 'concerning' low-volume feedback
- Balance vocal minorities on social media with broader data from support tickets and sales
- Treat individual complaints as potential signals for systemic blind spots
Dylan Field
"I definitely look everywhere trying to constantly ingest information about Figma, and it's not just Twitter/X, whatever that's called now, but anywhere on the internet, support channels, et cetera. And I'm always trying to understand. I also ask a lot of questions and I try to get to root problems and understand where people are coming from and what are they actually trying to solve. Sometimes people are saying, 'Hey, I need X', but they really want Y or Z."
- Monitor non-traditional channels like social media and support forums to ingest broad feedback
- Ask follow-up questions to uncover the 'Y or Z' problem behind a user's request for 'X'
Josh Miller
"Membership... we need to have a deep, genuine, ongoing relationship with them. And on day one, that may be air quote, 'customer success.' ... On day 37, that might be, 'I have a bug.' ... And on day 58, that may be, 'Hey, we're shipping the mobile app soon. What do you want from mobile?' Other companies view that as different orgs and disciplines. We view that as, there are a bunch of people at the other end of this that we are serving"
- Own the user relationship 'full stack' from first touch to long-term retention
- Use the same team for bug reports and feature discovery to maintain context
Megan Cook
"First of all, I mentioned we had that survey and so we had really rich feedback. So it's not just a rating, what we get, we get people talking about why they gave that rating and that can really help us zero in on what are the key aspects that's bringing this down."
- Focus on the 'why' behind CSAT or NPS ratings to find actionable usability gaps
- Share visceral customer feedback (videos or quotes) with the team to build emotional investment in quality improvements
- Connect usability improvements to business metrics like acquisition and expansion
Tamar Yehoshua
"One of the mistakes that I see a lot of product managers make is they over index on people who are going to be unhappy with the products they're launching... design it for the bigger number of people who are going to be using it tomorrow. If you have to redo the UI and the Who Moved My Cheese, people will be unhappy, but all the new people are going to be like, 'This is so much easier.'"
- Be transparent and authentic when explaining why a change was made, rather than using 'marketing speak'.
- Give users a choice or a transition period when sunsetting features to make them feel heard.
Install This Skill
Add this skill to Claude Code, Cursor, or any AI coding assistant that supports Agent Skills.
Download the skill
Download SKILL.md
Add to your project
Create a folder in your project root and add the skill file:
.claude/skills/analyzing-user-feedback/SKILL.md
Start using it
Claude will automatically detect and use the skill when relevant. You can also invoke it directly:
Help me with analyzing user feedback
Related Skills
Other Product Management skills you might find useful.
- Defining Product Vision
- Problem Definition
- Prioritizing Roadmap
- Setting OKRs & Goals