Maturing Analytics: Analytics as a Product

Welcome back to our journey in maturing analytics. In Part 3, we discussed why user adoption stumbles and explored different delivery paths for analytics. Now, in Part 4, we will talk about product management.

I know — you’re in analytics and data. What does that have to do with products? Quite a bit: we should be treating our outputs as products. First, because that’s how users interact with them. Second, because adopting product management principles can significantly enhance our analytics outcomes. So, let’s start with the difference between an analytics project and an analytics product.

Projects vs. Products

Before diving into specific Product Management techniques, it’s crucial to understand the fundamental difference between a project and a product. This distinction can transform how you approach your analytics initiatives, shifting the focus from completing tasks to delivering ongoing value. Let’s explore how these two approaches differ and why transitioning from a project-centric to a product-centric mindset can lead to more successful and impactful outcomes in your analytics efforts.

Projects:

  • Focused on Discrete Tasks: Projects have specific tasks and objectives that need to be accomplished.
  • Aim to Deliver Within Time and Budget: The success of a project is often measured by its ability to meet deadlines and stay within budget.
  • Guided by Project Plan: Projects follow a predefined plan with set milestones and deliverables.
  • Ends at Completion of Tasks: Once the tasks are completed, the project is considered finished.
  • Linear and Deterministic: Projects typically follow a linear path with clearly defined steps and predictable outcomes.

Products:

  • Focused on User Value: Products are designed with the end-user in mind, aiming to solve their problems and meet their needs.
  • Aim to Increase User Value: The goal is to continually enhance the value provided to users, not just complete a set of tasks.
  • Guided by Value Metrics: Product success is measured by metrics that reflect user satisfaction, engagement, and overall value.
  • Ongoing: Unlike projects, products have no definitive end date. They are continuously improved and updated.
  • Evolves and Adapts with Discovery: Products evolve based on user feedback, market changes, and ongoing discoveries. They are adaptive and iterative.

In the end, this is about mindset. Is your north star the end of the project and successfully checking off deliverables? Or are you aiming to impact business outcomes for your users? While a project might be aligned to your users at the start, that alignment drifts over time in this ever-changing and fast-paced world. Remember, your job to be done isn’t delivering a dashboard. Product management helps us keep users as our north star and deliver on the real job to be done.

The Product Lifecycle

As ongoing efforts, products follow a lifecycle:

  1. Build
  2. Measure
  3. Learn

The cycle repeats, allowing us to iterate and adapt as we guide the product toward delivering on customer needs. The key is to keep cycles as small and as fast as possible; this keeps learning high and adaptability strong. Small and fast cycles mean breaking down your assumptions into smaller chunks that are easier to test. Want to see if users will respond to a dashboard subscription? Don’t build a whole automated process; send out a manual email first, gauge user satisfaction and measure open rates.
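Scoring that kind of manual test doesn’t need infrastructure, either. Here’s a minimal sketch, assuming a hypothetical CSV export (`manual_subscription_test.csv` with `opened` and `clicked_dashboard_link` columns) from whatever email tool you used to send the test:

```python
# A minimal sketch of scoring a hand-sent "dashboard subscription" test.
# The file name and column names are hypothetical; swap in whatever your
# email tool actually exports.
import csv

def email_test_results(path: str) -> dict:
    """Summarize a manual email test: open rate and click-through rate."""
    sent = opened = clicked = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sent += 1
            opened += row["opened"] == "yes"
            clicked += row["clicked_dashboard_link"] == "yes"
    return {
        "sent": sent,
        "open_rate": opened / sent if sent else 0.0,
        "click_rate": clicked / sent if sent else 0.0,
    }

if __name__ == "__main__":
    print(email_test_results("manual_subscription_test.csv"))
```

A throwaway script like this is the point: the cycle stays small, and you learn whether the idea deserves real automation before you build it.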

Learn

When tackling a cycle, one of the best pieces of advice I’ve received is to start in reverse. We probably all have exciting ideas and concepts that we want to put in front of our users. It’s important to resist the urge to dive right into the idea you’re most excited about. Instead, we need to dig deeper, because buried in those ideas are assumptions. Untested assumptions. In other words, we have a hypothesis about what will work, and a hypothesis is something we can test. That’s where we start; that’s what we need to learn.

Measure

Once we know what we need to learn, we can figure out what measurements we need to take to learn it. This might be as simple as user adoption metrics, but it could be something more specific, such as particular behaviors we are trying to modify or encourage. Start small. Many startups begin by testing whether there is even user interest in a concept, for example by seeing if users will hand over their email address. Ever signed up for a waitlist for a product entering beta? Often, they are using this as an early benchmark. Internally, this will look different, and what is appropriate will evolve with the maturity of the product. In the early stages, you’re often looking for proxy metrics that you hope will lead to impacting key metrics like revenue. This may be as simple as measuring whether users will visit the tool. Over time, however, it’s important to align more and more with the company outcomes you want to accomplish.
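As a concrete example of a proxy metric, here’s a minimal sketch answering “do users come back to the tool?” from a hypothetical visit log (`tool_visits.csv` with `user_id` and `visit_date` columns). The file name and columns are assumptions; in practice, this data usually comes from your BI platform’s usage or audit logs:

```python
# A minimal sketch of an early proxy metric: did users return to the tool?
# The event log format (user_id, visit_date) is hypothetical.
import csv
from collections import defaultdict
from datetime import date

def returning_user_rate(path: str) -> float:
    """Share of users who visited the tool on more than one distinct day."""
    days_by_user: dict[str, set[date]] = defaultdict(set)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            days_by_user[row["user_id"]].add(date.fromisoformat(row["visit_date"]))
    if not days_by_user:
        return 0.0
    returning = sum(1 for days in days_by_user.values() if len(days) > 1)
    return returning / len(days_by_user)

if __name__ == "__main__":
    print(f"Returning-user rate: {returning_user_rate('tool_visits.csv'):.0%}")
```

A metric like this won’t show up on an income statement, but early on it tells you whether the behavior you’re betting on is actually happening.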

One key thing to watch out for is the use of vanity metrics in your measurements. A vanity metric is one that makes us feel good but that we have little control over, and it may or may not have a big impact on our success. For external products, that often looks like blog impressions or YouTube views. These often say more about algorithms and the whims of culture than they do about product success. They don’t reliably translate into product sales and often aren’t repeatable. Internally, this may look like user excitement or other measures of interest. Sometimes a solution sounds really good to a user, but they aren’t actually willing to change their behavior for it.

Build

Okay, so we’re back at the beginning. We determined what we need to learn, we understand what measurements will help us learn it, and now we can figure out what we need to build to take those measurements. A key concept here is keeping the build to a minimum. It’s all too easy to race off and build your dream product or feature, but the goal is to learn, and we want to do that as fast as possible.

One of my favorite stories about creative build steps comes from the book “Sprint” by Jake Knapp. Slack was looking at solving an onboarding problem. Users could log into the system fine, and while some became long-time users, too many found themselves overwhelmed by the experience and never returned. They used a Design Sprint (see Part 5 of this series for more on that) to hypothesize two new onboarding methods. One used a chatbot to talk users through onboarding and introduce new features. The other was a wizard that would appear over the user interface and guide them by pointing directly to features.

Instead of building these large projects, the team got creative. For the chatbot, they armed human volunteers with scripts to mimic what a bot would say. This meant that, with minimal code, they could test the results: how often clients got lost and whether it increased user stickiness. For the wizard, they did something equally simple. Instead of coding all the dialog boxes to guide the users, they took screenshots of the tool and created a simple presentation. In the end, the wizard appeared to give the most promising results, and that’s the direction they took for further development.

Conclusion

By shifting from a project-focused to a product-focused mindset, you unlock the potential for continuous improvement and sustained user value in your analytics initiatives. Embracing the product lifecycle — build, measure, learn — allows you to adapt quickly and keep your solutions aligned with user needs. This approach not only enhances user engagement but also drives meaningful outcomes that go beyond simply completing tasks.

In the next part, we will delve into practical techniques and tools to further embrace product management for better analytics outcomes. So, stay tuned and get ready to take your analytics practice to the next level with strategies that truly put users at the center.
