How do you balance user feedback and data analytics in product decisions?
As a product manager, you need to make informed and strategic decisions that align with your product vision, goals, and user needs. But how do you find the right balance between user feedback and data analytics, two sources of valuable insights that can sometimes contradict or conflict with each other? In this article, we will explore some of the challenges and best practices of using user feedback and data analytics in product decisions, and how to leverage them to create better products.
User feedback is the direct input from your customers or potential customers about their experiences, preferences, and expectations of your product. User feedback can help you understand the problems, needs, and motivations of your users, and validate your assumptions and hypotheses. User feedback can also help you generate ideas, prioritize features, and improve usability and satisfaction. However, user feedback also has some limitations that you need to be aware of. For example, user feedback can be biased, subjective, or influenced by external factors. User feedback can also be incomplete, inconsistent, or outdated, and may not reflect the actual behavior or outcomes of your users.
-
You need both user feedback and data, but I like to remember this quote from Jeff Bezos when the two inputs disagree: "The thing I have noticed is that when the anecdotes and the data disagree, the anecdotes are usually right. There is something wrong with the way that you are measuring it. You do need the data, but then you need to check that data with your intuition and your instincts." If you work in a big company, make an effort to go talk to 5 customers first instead of only looking at your metric dashboards every day.
-
Diving deep on user feedback is both a critical ongoing pulse check of how users see your product over time and a way to see the "whole field" so you can analyze for opportunity spaces. When paired with data and company vision, you can start to see opportunities on that "field" and build out a strategic roadmap that you can fine-tune. A strong UX research team and facilitator is critical to help reduce risks like bias and leading the user, which can throw off the insights.
-
To balance user feedback and data analytics for product decisions: 1. Collect quantitative and qualitative data. 2. Segment users and analyze feedback by group. 3. Prioritize feedback based on frequency, impact, and alignment with goals. 4. Use A/B testing to validate hypotheses and optimize UX. 5. Communicate decisions and rationale to users. 6. Foster a data-driven culture and empower teams.
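The prioritization step above can be sketched as a simple weighted score. This is one possible way to operationalize "frequency, impact, and alignment with goals" — the themes, 0–10 ratings, and weights below are hypothetical examples, not a standard formula:

```python
def priority_score(item, weights=(0.4, 0.4, 0.2)):
    """Weighted score from frequency, impact, and goal alignment (each rated 0-10)."""
    w_freq, w_impact, w_align = weights
    return (w_freq * item["frequency"]
            + w_impact * item["impact"]
            + w_align * item["goal_alignment"])

# Invented feedback themes with illustrative ratings.
feedback = [
    {"theme": "slow checkout", "frequency": 9, "impact": 8, "goal_alignment": 9},
    {"theme": "dark mode request", "frequency": 6, "impact": 3, "goal_alignment": 2},
    {"theme": "confusing onboarding", "frequency": 7, "impact": 7, "goal_alignment": 8},
]

# Highest-scoring themes come first on the roadmap discussion.
ranked = sorted(feedback, key=priority_score, reverse=True)
for item in ranked:
    print(f"{item['theme']}: {priority_score(item):.1f}")
```

The weights are a team judgment call; the value of a scheme like this is less the exact numbers and more that it forces the trade-off between "loud" feedback and strategically aligned feedback to be made explicit.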
-
User feedback can also be subjective and difficult to interpret at times. Additionally, users might not always provide constructive feedback or articulate their thoughts effectively, making it challenging to extract actionable insights. Furthermore, the sample of users providing feedback might not represent the entire user base, leading to potential misinterpretation of broader user sentiments. Understanding these limitations is crucial when analyzing user feedback to ensure a more balanced and informed decision-making process.
-
User feedback and data analytics are a powerful combo for product decisions. 1. Feedback (qualitative) tells you "why" and "how" users behave: Interviews, surveys, and tests help you understand user needs and frustrations. 2. Data analytics (quantitative) reveals "what" and "how much": Clickthroughs, engagement, and conversions show user actions. To balance them: 1. Gather data from various sources (surveys, tickets, social media). 2. Use the right method (interviews for why, A/B testing for clicks). 3. Analyze and categorize feedback (group similar comments). 4. Look for connections between feedback and data (e.g., many complaints + slow loading time = high priority). 5. Translate insights into actions (testable ideas to validate).
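Step 4 above — cross-referencing feedback themes with metrics to flag priorities — can be illustrated with a minimal sketch. The theme names, complaint counts, metric values, and the threshold are all made-up examples:

```python
# Count of user complaints grouped by theme (step 3's output, invented numbers).
complaint_counts = {"slow loading": 42, "missing export": 5, "login errors": 18}

# A corroborating quantitative signal per theme, where one exists
# (assumed units, e.g. p95 page load in seconds, error rate as a fraction).
metrics = {"slow loading": 6.3, "missing export": None, "login errors": 0.02}

def is_high_priority(theme, min_complaints=15):
    """High priority when a theme is both frequently reported and
    corroborated by a measurable signal in the analytics data."""
    return (complaint_counts.get(theme, 0) >= min_complaints
            and metrics.get(theme) is not None)

priorities = [t for t in complaint_counts if is_high_priority(t)]
print(priorities)  # prints ['slow loading', 'login errors']
```

The point of the sketch is the "and": qualitative volume alone or a bad metric alone is a lead, but the two agreeing is what earns a spot at the top of the backlog.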
Data analytics is the process of collecting, measuring, and analyzing quantitative and qualitative data about your product and its users. Data analytics can help you track and evaluate the performance, impact, and value of your product, and identify patterns, trends, and opportunities for improvement. Data analytics can also help you test and compare different solutions, and optimize your product for efficiency and effectiveness. However, data analytics also has some limitations that you need to be aware of. For example, data analytics can be complex, costly, or time-consuming to collect, process, and interpret. Data analytics can also be misleading, inaccurate, or irrelevant, and may not capture the underlying causes of your users' behavior or outcomes.
-
The biggest challenge with data analytics is that it doesn't necessarily show you the full picture or an objective picture. Just like with ChatGPT, the output of data analytics is: 1) Dependent on how the question was asked or how the dashboard was implemented. According to Chamath Palihapitiya, the AI role of the future will be "prompt engineer". The same is true with data analytics: how was the dashboard wired? How was the data aggregated and presented? Garbage in, garbage out. 2) Shaped by the story the interpreter of the data is looking to tell. A former nuclear scientist once told me: "A scientist can tell any story they want off of the same data". Like any data, including user research, analytics data is subject to interpretation.
-
PMs also have to know when not to use data, or when not to rely on it. Data helps uncover the stories behind underlying issues, but if it is over-emphasized, it leads to decision fatigue: PMs start to believe that data must back every decision. Sometimes the speed with which a decision is made matters more than making the perfectly right decision. The skill of knowing when not to rely on data comes with instinct, backed by experience and risk appetite.
-
Here are a few additional considerations: 1. Data Quality and Accuracy: The accuracy and quality of the data collected can significantly impact the reliability of the analysis. Inaccurate or incomplete data may lead to incorrect conclusions and decisions. 2. Privacy and Ethical Concerns: With increased data collection comes the responsibility to ensure user privacy and adhere to ethical standards. Misuse or mishandling of data can lead to legal and ethical issues. 3. Resource Intensive: As you mentioned, data analytics can be resource-intensive in terms of time, costs, and expertise required. It involves employing skilled personnel, utilizing advanced tools, and sometimes dealing with vast amounts of data.
-
Data analytics is only as good as the person wielding it. It is the sword in the hand of a PM warrior: if they know how to swing it, and know their opposition and tactics, they can win, i.e. produce meaningful insights. Using it can be powerful, but knowing when and how is most important. Restriction can be liberating. Otherwise we just have data that tells us eating chocolate correlates with living longer.
-
In addition to the constraints mentioned, a pivotal factor in optimizing the use of data analytics for product decisions is a resilient data infrastructure: the effectiveness and precision of your analytics depend on it to guarantee that data is accurate, dependable, and accessible.
When it comes to striking a balance between user feedback and data analytics in product decisions, there is no one-size-fits-all formula. However, there are some general principles and tips that can help you make the most of both sources of insights.

To begin with, it is essential to define your product goals and metrics, as this will enable you to focus on the most relevant and meaningful feedback and data, while avoiding being overwhelmed or distracted by irrelevant or conflicting information.

Additionally, it is important to use both qualitative and quantitative methods, as user feedback and data analytics are not mutually exclusive, but complementary. Qualitative methods, such as interviews, surveys, or user testing, can help you explore the why and how of your users' behavior and outcomes, while quantitative methods, such as analytics, experiments, or benchmarks, can help you measure the what and how much.

Furthermore, it is necessary to triangulate and synthesize your insights, and translate them into actionable and testable insights that can inform your product decisions. Finally, you need to iterate and validate your product decisions, and measure their impact and value. By using user feedback and data analytics to monitor and evaluate your product changes, you can learn from your results and solicit and incorporate feedback and data from your stakeholders.
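The "iterate and validate" step often comes down to checking whether an A/B experiment's result is more than noise. One common way to do this (a sketch, not the only method) is a two-proportion z-test on conversion counts; the traffic and conversion numbers below are invented:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: control converts 120/2400, variant 156/2400.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below your chosen significance threshold (conventionally 0.05) suggests the difference is unlikely to be chance, but the qualitative work still has to explain *why* the variant won before you generalize the result.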
-
Starting with qualitative research helps you create an intuitive understanding of the problem, giving you both depth and empathy in the space. Bringing in a quantitative approach as a second layer helps you ensure you are working on meaningful enough problems and creates focus. This dual approach works better and faster than either approach independently (boil the ocean with quantitative only, or miss the forest for the trees with qualitative only).
-
Qualitative methods play a pivotal role in product development and analysis. They not only help in identifying and understanding the root causes behind data anomalies but also act as a rich source of innovative, user-driven ideas. By engaging directly with users, these methods provide invaluable insights, bridging gaps that raw numbers might overlook. Thus, when integrated with quantitative data, they offer a comprehensive view, ensuring that product decisions are both informed and user-centric.
-
It depends on the stage of the product. When building early-stage, 0-to-1 products, when you can afford to fall in love with the problem, validation via user research is efficient, especially since there isn't enough user data available. When working on large-scale, mature consumer products with millions of users, the quantitative approach is relatively faster. For such products, qualitative user research provides anecdotal evidence that helps you avoid paralysis by data analysis.
-
The saying at Amazon is that when anecdotes and data disagree, you trust the anecdotes (user feedback). That's because there is usually an issue with how you're measuring that prevents you from getting to the ground truth of the user experience. The other factor is that it's inherently difficult for anecdotes to scale representatively. At a certain scale, validation via testing becomes the only way to reliably get insights. You can measure your input metrics regardless, but those won't tell you the "why" in the way that a well-designed test will.
-
The key is about finding that delicate balance between what users are saying and what the numbers are showing. Understanding user needs and behaviors is paramount. That's why it is crucial to dive deep into user feedback, whether it's through surveys, interviews, or user testing. At the same time, metrics like user engagement, retention rates, and conversion funnels give us the hard numbers we need to validate our decisions. Shreyas Doshi, a former product lead at Twitter and Google, once highlighted the importance of intuition in this mix. Intuition acts as a guiding compass when the data isn't crystal clear. It's that sixth sense that helps us make sense of the 'why' behind user actions.