In the realm of analytics, measuring success can be a complex endeavor. It’s crucial to differentiate between evaluating the success of your entire analytics platform or practice and measuring the success of individual dashboards, initiatives or products. Each requires a unique approach and set of metrics. Let’s dive into these distinctions and explore how to measure success effectively, incorporating a real-world story to highlight the importance of user feedback.
Measuring Individual Data Products
When it comes to individual analytic products, the key is to align your measures with the success metrics that matter most to the business users. If your analytics work is closely tied to the goals and KPIs of your users, your metrics will be more meaningful and impactful.
1. Align with Business Objectives
It’s not always practical to perfectly align analytics metrics with business outcomes, but the closer you can get, the better. Understand what success looks like for your business users and tailor your analytics goals accordingly. If a sales dashboard aims to boost revenue, measure the dashboard’s success by its impact on sales figures.
2. Start with Usage Metrics, but Beware of Vanity Metrics
Usage metrics, such as the number of users or the frequency of use, are an easy starting point. However, these can be misleading. High usage doesn’t always equate to value. Sometimes, it merely indicates that users are logging in without deriving significant insights or benefits. These are often referred to as “vanity metrics” because they look good on paper but don’t necessarily reflect true success.
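For illustration, here’s a minimal sketch in Python, using a hypothetical list of view events (user, timestamp, seconds spent), that separates the raw view count, a classic vanity metric, from a rougher but more honest signal of repeat, engaged use:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical view events: (user_id, timestamp, seconds spent on the dashboard)
events = [
    ("ana", datetime(2024, 5, 1), 240),
    ("ana", datetime(2024, 5, 8), 310),
    ("ben", datetime(2024, 5, 2), 15),
    ("ben", datetime(2024, 5, 2), 10),
    ("cam", datetime(2024, 5, 3), 400),
]

# Vanity metric: total views, easily inflated by quick, low-value visits.
total_views = len(events)

# A slightly better signal: users who came back in more than one week
# and spent meaningful time per visit.
weeks_by_user = defaultdict(set)
time_by_user = defaultdict(list)
for user, ts, seconds in events:
    weeks_by_user[user].add(ts.isocalendar()[:2])  # (year, week)
    time_by_user[user].append(seconds)

engaged_users = [
    user for user in weeks_by_user
    if len(weeks_by_user[user]) >= 2
    and sum(time_by_user[user]) / len(time_by_user[user]) >= 60
]

print(f"Total views: {total_views}")
print(f"Engaged users (repeat visits, over a minute on average): {engaged_users}")
```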
3. Predefine Metrics
Before developing your analytics product, define your success metrics. This foresight helps ensure that you remain objective about the results and can learn from any shortcomings. By establishing clear metrics upfront, you set a benchmark against which to measure your progress.
4. Consider Phased Rollouts for Large Audiences
If you have a sizable user base, rolling out your product in phases can provide valuable insights. Measure the impact on users and compare it to non-users to gauge the true effect of your analytics product. This method, while more complex, yields the most accurate assessment of your product’s impact.
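As a simple illustration, assuming hypothetical per-rep revenue figures for an early-rollout group and a control group still waiting on the new dashboard, the core comparison might look something like this:

```python
from statistics import mean

# Hypothetical weekly revenue per sales rep (in thousands) after the dashboard launch.
rollout_group = [52, 61, 58, 64, 57, 60]   # reps who received the new dashboard
control_group = [49, 51, 50, 55, 48, 53]   # reps scheduled for a later phase

# In practice you would also want comparable groups and a significance test;
# this only shows the basic cohort comparison.
lift = mean(rollout_group) - mean(control_group)
print(f"Average lift: {lift:.1f}k ({lift / mean(control_group):.1%})")
```

Even this basic cohort comparison is far more telling than usage counts alone, because it ties the product directly to the business outcome.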
5. Time Saved
One measurement that is often overlooked is simple time saved. A data project is frequently an automation project at its heart, freeing people from manually sorting and compiling the data they need for critical business processes. You can take this even further by incorporating data applications and metric layers into your offerings, as we discussed in the Maturing Analytics series.
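A quick back-of-the-envelope calculation, using entirely hypothetical figures, shows how fast the time saved can add up:

```python
# Entirely hypothetical figures: 15 analysts each spent 2 hours a week
# manually compiling a report that the data product now produces automatically.
analysts = 15
hours_saved_per_week = 2
hourly_cost = 75  # assumed fully loaded hourly cost

annual_hours = analysts * hours_saved_per_week * 52
print(f"Hours saved per year: {annual_hours}")                      # 1560
print(f"Estimated annual value: ${annual_hours * hourly_cost:,}")   # $117,000
```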
Measuring an Entire Analytics Platform or Practice
Evaluating the success of an entire analytics platform or practice presents a greater challenge. Here, the focus should be on broader adoption and satisfaction metrics that indicate the overall health and effectiveness of your analytics efforts. It can also be helpful to group your user base into user personas that can be individually measured and interviewed.
1. User Adoption
Track how widely and consistently your analytics platform is being used across the organization. High adoption rates suggest that your platform is valuable and integral to daily operations.
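One simple way to track this, sketched here with hypothetical monthly active-user counts against an assumed pool of eligible users:

```python
# Hypothetical monthly active-user counts against an assumed pool of 200 people
# who could reasonably be expected to use the platform.
eligible_users = 200
monthly_active = {"Jan": 62, "Feb": 85, "Mar": 110, "Apr": 128}

for month, active in monthly_active.items():
    print(f"{month}: {active / eligible_users:.0%} adoption")
```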
2. Net Promoter Score (NPS)
NPS is a metric that measures user satisfaction and loyalty. By asking users how likely they are to recommend your analytics platform to others, you gain insight into their overall satisfaction and the perceived value of your platform.
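The score itself is easy to compute: responses of 9 or 10 count as promoters, 0 through 6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. Here’s a minimal sketch with hypothetical survey responses:

```python
# Hypothetical responses to "How likely are you to recommend this platform?" (0-10)
responses = [10, 9, 9, 8, 7, 7, 6, 5, 9, 10, 4, 8]

promoters = sum(1 for r in responses if r >= 9)   # 9s and 10s
detractors = sum(1 for r in responses if r <= 6)  # 0 through 6
nps = (promoters - detractors) / len(responses) * 100

print(f"NPS: {nps:.0f}")  # scale runs from -100 to +100
```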
3. User Retention
Retention rates indicate how many users continue to use your analytics platform over time. High retention rates are a strong signal that your platform is meeting user needs and providing ongoing value.
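A minimal sketch of a period-over-period retention calculation, using hypothetical sets of active user IDs:

```python
# Hypothetical sets of user IDs active in each quarter.
q1_users = {"ana", "ben", "cam", "dee", "eli"}
q2_users = {"ana", "cam", "dee", "fay", "gus"}

retention_rate = len(q1_users & q2_users) / len(q1_users)
print(f"Q1 -> Q2 retention: {retention_rate:.0%}")  # 60%
```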
The Power of User Interviews
While these are all powerful measures that you should be employing, keep in mind that quantitative metrics never give you the full story. To truly measure the success of analytics, it’s essential to go beyond the numbers and delve into qualitative insights. You need the context of your users to understand what these metrics are actually telling you. Let’s take a visit to a health product company to understand why.
A Visit to a Health Product Company
During a visit to this health product company, I was tasked with helping them define their data and analytics strategy for the next few years. On the surface, they seemed to have everything in order. They were tracking high-level metrics of their offerings, and generally, people were happy with the analytics solutions provided.
One topic that came up frequently was their OLAP Cube. The IT department was particularly proud of it, and they repeatedly assured me that their users “loved” it. However, I wanted to dig deeper and understand the real impact of this tool, so I started conducting user interviews.
Uncovering the Reality
As I spoke with various users, a different picture began to emerge. IT believed that their users were leveraging the Cube extensively and effectively, but the reality was quite different. It turned out that only one user group was truly using the Cube, and even then, they were pulling all the data out of it to create a separate data warehouse. This secondary data warehouse allowed them to have snapshots of the data over time and to enhance it with their own sources.
Surprisingly, every other group that highly rated the Cube was actually working from this secondary, shadow data warehouse that IT didn’t even know existed. The measures of success, which assumed that users were interacting with the Cube directly, were extremely misleading. The business users were, in fact, happy with their own process, and IT was blind to how far it was from delivering on the actual needs of the business.
Shifting the Strategy
The discovery was eye-opening. IT was not delivering a complete solution, and instead, the business had created an ungoverned, potentially risky shadow data warehouse. This was not exactly the model of success the company aimed for. Armed with these insights, we were able to shift the strategy to more holistically meet the business users’ needs, ensuring that the analytics solutions provided were truly effective and aligned with how users were working.
Conclusion
Measuring the success of analytics requires a comprehensive approach. Align your metrics with business goals, avoid vanity metrics and use phased rollouts for more accurate impact assessment. For platforms, focus on user adoption, satisfaction and retention. Remember the value of qualitative insights: user interviews can reveal crucial details that metrics alone might miss. Above all, empathize with your users. Every method we’ve discussed is an imperfect tool for understanding and empathizing with the people you are here to help. Be thoughtful in your approach, be self-reflective about how you can grow, and always be willing to listen and pivot. By combining these methods with genuine care for your users, you can set yourself on the path to continued analytics success.