For all you tl;dr-ers out there, here are the recommendations:
1. Make it a priority to capture the data you foresee needing in the future, and keep adding to it as soon as you identify what is missing.
2. Start using the data actively - even if the numbers might be small at first - and habitually review it and build hypotheses. This is how you build products that better meet user needs.
3. Talk to users to determine how to interpret the data. Without this, you will start assuming and guessing what the data means. By understanding users' views, motivations, and the context of their interactions with your product, you can make sense of the underlying patterns and numbers the data presents.
A typical project lacks data to make product decisions
Modern product teams have long had the mantra "data should steer decisions" when, in reality, that is rarely the case. Most often, several issues hinder this:
- Legacy systems with no instrumentation and a reluctance to invest development effort in a product or platform on its way "out."
- Management is unwilling to pay the “additional” development cost associated with data collection.
- Or simply the attitude that we do not need it this early/for the MVP.
When a team has no data about their product, the team and management typically acknowledge this, but getting the data never becomes a top priority; instead, they prioritize building features they believe will move the needle. So, lacking data, we must guess, and these guesses slowly harden into facts, whether or not they are true.
Having the data at your fingertips
After you track the significant data points, the next step is to ensure you have easy access to the data. Everybody on the team needs direct access to the data, meaning anyone can go in and check the facts about the product without any manual steps.
This might seem trivial, but having the data right at hand when you have a question or thought makes a huge difference. It makes you continuously use the data you have rather than only periodically check the main trends.
Linking product data to business objectives
Our team worked with OKRs (Objectives and Key Results) as a goal-setting framework.
We were responsible for reaching business objectives and agreed on key results that would indicate we were moving toward each objective.
The data we needed for the key results was apparent, but the work was to create hypotheses for what could impact the key results. The question became: to validate or invalidate these hypotheses, what data points do we need?
This process is pretty straightforward, but it is critical to recognize that key results are still proxies meant to drive toward the objective at hand. It might seem trivial, but we often treat the key results as the ultimate goal, when you need to keep tracking your actual objective and other ways to reach it outside the given key results.
An illustrative example of this could be a company that sets an OKR to improve customer satisfaction.
Objective: Increase overall customer satisfaction.
Key Result: Achieve a customer satisfaction score (CSAT) of 90%.
Initially, the company might address common complaints to improve this score, viewing it as a direct success measure.
However, it's crucial to understand that CSAT is a tool, not the final goal. The real aim is enhancing overall customer satisfaction, which requires delving into why customers are satisfied or dissatisfied. For example, user interviews might reveal a preference for detailed support articles over quick ticket responses. Acting on such insights, like enhancing the knowledge base, may not immediately lift the CSAT score but contributes to long-term customer satisfaction.
By not fixating solely on CSAT, the company can explore various strategies like product refinement, customer education, and community engagement. This comprehensive approach not only targets a metric but also fosters genuine improvements in customer satisfaction.
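To make the key result above concrete: CSAT is commonly computed as the share of survey respondents who give a "satisfied" rating, typically 4 or 5 on a 1-to-5 scale. Here is a minimal sketch of that calculation (the survey responses are hypothetical):

```python
def csat(ratings, satisfied_threshold=4):
    """CSAT: percent of responses at or above the 'satisfied' threshold (4-5 on a 1-5 scale)."""
    if not ratings:
        return 0.0
    satisfied = sum(1 for r in ratings if r >= satisfied_threshold)
    return 100.0 * satisfied / len(ratings)

responses = [5, 4, 3, 5, 2, 4, 5, 4, 5, 4]  # hypothetical survey ratings
print(f"CSAT: {csat(responses):.0f}%")  # 8 of 10 responses are 4 or 5 -> 80%
```

Seeing the formula also makes the earlier point tangible: the score only moves when ratings cross the threshold, so work that improves satisfaction gradually (like a better knowledge base) may not register immediately.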
How we used the data on an ongoing basis
In my recent assignment, we used our data in various ways to direct development. Here is a set of examples of concrete ways we used our data to focus our efforts.
1. Monitor the adoption of new features
Example: How many users have booked a time this way over the last 30 days?
2. Remove under-used features
Example: This page is rarely visited, and this link is never clicked. After discussing with users, we could understand why specific features were not used, and we could remove them.
3. Direct our attention to what to build
Example: Why is this section opened 5 times as frequently as the others? Data spurred discussions with our users and customers.
4. Promote specific behaviors with gamification
Example: Displaying metrics that drive behaviors we want; in our case, simple things like the number of projects started and messages sent.
5. Knowing what users to educate
Example: We worked with several regions that operated somewhat in their own ways, so we could use our data to plan educational activities for specific regions or even individuals.
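The first use case, measuring feature adoption, boils down to counting distinct users who triggered an event within a time window. A minimal sketch in Python, using hypothetical event names and an in-memory event log standing in for whatever telemetry store you query:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical event log: each entry has an event name, a user id, and a timestamp.
now = datetime.now(timezone.utc)
events = [
    {"name": "booking_created", "user": "u1", "time": now - timedelta(days=3)},
    {"name": "booking_created", "user": "u2", "time": now - timedelta(days=45)},
    {"name": "page_view",       "user": "u1", "time": now - timedelta(days=1)},
]

def adoption_count(events, event_name, days=30):
    """Count distinct users who triggered `event_name` in the last `days` days."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    users = {e["user"] for e in events
             if e["name"] == event_name and e["time"] >= cutoff}
    return len(users)

print(adoption_count(events, "booking_created"))  # only u1 falls inside 30 days
```

Counting distinct users rather than raw events avoids one power user inflating the adoption picture.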
As you can see, there are many ways to leverage data as a product team. Just remember to circle back to your objectives to keep focus on what you are trying to achieve. It is easy to lose track when you see all the possibilities with your new data.
Pairing with qualitative data
Even with a ton of data available, it was only usable alongside an ongoing discussion with users about what was behind the numbers. Many times, we saw an interesting pattern in the data but relied on user interviews to understand how to interpret it.
An example of this from our product was a sudden increase in usage of a specific feature from one week to the next. After talking to our users, we found that the feature had not been understood correctly; once one person learned how to use it, the knowledge spread quickly to others, who adopted it immediately.
Typically, this became a ping-pong: we found an interesting pattern we could not quite understand; user interviews led to new insights; those insights led us to implement a new set of data points to monitor new things, which led to further questions for the users, and so on.
We continuously used insights from our data paired with insights from talking to users. Typically, we talked to 5-10 users each week, some recurring and some new, in different settings: many in video chat, visiting them in their office, or tagging along to their worksites.
When interviewing users, there is sometimes a discrepancy between what the user claims and what the data can tell you. For example, in an interview, a user said, "I have done this around 80 times now," but when we checked the data, the real number was just above half of that: 42. This does not mean the user exaggerated; rather, our interpretation from the interview was that the user had formed a strong habit of using the product and felt they had used it a lot.
Learning HOW your product is used, WHERE, and in WHAT SITUATIONS is crucial and needs to be addressed. In our product, some users exclusively used desktop while others exclusively used mobile devices; a minority used both. But we also had specific features whose usage was tied to particular devices and situations.
How we did it - the tech
Now, for the techies out there: we used Azure's Application Insights to collect data using default and custom events. It allowed us to quickly create the dashboards we needed. We ended up with a set of central dashboards containing all the data we monitored daily, continually updated with new and more information as needed.
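The dashboards were built on queries over the collected events. As a sketch of what such an Application Insights query can look like (the custom event name here is hypothetical), this counts daily occurrences of one event over the last 30 days and charts the trend:

```kusto
// Daily usage of a hypothetical custom event over the last 30 days.
customEvents
| where name == "BookingCreated"
| where timestamp > ago(30d)
| summarize count() by bin(timestamp, 1d)
| render timechart
```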
In addition to these central Application Insight dashboards, we set up additional dashboards for specific needs, such as short-term initiatives or projects.
To get a feel for how people used the product in real time, we added dedicated channels in Microsoft Teams to see the stream of events as they came in. We used this to monitor how new features were working and gaining adoption. Getting live notifications when customers began to use the new stuff was very valuable, and super fun and motivating for continuing the relentless work of continuous discovery and delivery.
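The article does not describe the wiring behind those channels, but one common way to push events into a Teams channel is an incoming webhook, which accepts a simple JSON payload. A minimal sketch, with a hypothetical webhook URL and event names:

```python
import json
from urllib import request

# Hypothetical URL: configure an "Incoming Webhook" connector on a Teams
# channel to obtain a real one.
WEBHOOK_URL = "https://example.webhook.office.com/webhookb2/..."

def format_event_message(event_name, properties):
    """Build the simple JSON payload a Teams incoming webhook accepts."""
    details = ", ".join(f"{k}={v}" for k, v in sorted(properties.items()))
    return {"text": f"New event: {event_name} ({details})"}

def notify(event_name, properties):
    """POST the event notification to the channel (fire-and-forget)."""
    payload = json.dumps(format_event_message(event_name, properties)).encode("utf-8")
    req = request.Request(WEBHOOK_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)

message = format_event_message("feature_used", {"feature": "quick-booking", "region": "north"})
print(message["text"])
```

In practice you would call `notify` from the telemetry pipeline only for the handful of events worth watching live, so the channel stays a highlight reel rather than a firehose.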
In addition to the SQL-like queries in Application Insights, we used R in Posit Cloud for more advanced data crunching. Using R, we also made visualizations that helped us understand the users and how to enhance the product.
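The team's crunching was done in R; as an analogous sketch in Python of the kind of cut involved (hypothetical event rows), here is a per-region usage summary like the one that fed the region-specific education planning mentioned earlier:

```python
from collections import Counter

# Hypothetical exported event rows: (region, user id, event name).
rows = [
    ("north", "u1", "report_created"),
    ("north", "u2", "report_created"),
    ("south", "u3", "report_created"),
    ("south", "u3", "report_created"),
    ("south", "u4", "page_view"),
]

# Events per region: the kind of cut that showed regions operating differently.
by_region = Counter(region for region, _, event in rows if event == "report_created")
for region, n in sorted(by_region.items()):
    print(region, n)
```

The same totals can hide different stories: here "south" matches "north" on volume but the usage comes from a single user, which is exactly the kind of pattern worth taking into a user interview.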
Our experience working with this product has taught us that while a lack of data often obstructs the road to data-driven product development, the pursuit is worth every effort. We were able to move beyond guessing, letting data guide our product development and ensuring every feature and decision was informed and intentional.
Remember, the key to effectively leveraging data is threefold:
- prioritize capturing data
- integrate the use of data insights into your daily workflow
- pair the data with the rich context of user feedback
Doing so transforms data from a static resource into a dynamic tool that propels your product from concept to reality, from mere functionality to market fit. Let's not just collect data; let's use it to carve out our competitive edge.
As we've seen, data is not just a tool for measurement, but a compass for innovation. How will you harness its power to transform your product's journey?