In the realm of product management, two seminal works stand as contrasting pillars of wisdom: “Build” by Tony Fadell and “The Lean Startup” by Eric Ries. These books represent the two sides of product management: the art of visionary design and long-term craftsmanship, and the science of data-driven iteration and rapid experimentation.
For data leaders, the path to better analytics requires that we begin managing analytics as products. Doing that well means understanding and blending these two approaches, leading to better decisions and stronger platforms. So, let’s dive into these books and figure out how to strike a balance between art and science in product management.
The Science: The Lean Startup and the Power of Iteration
“The Lean Startup” by Eric Ries skews heavily toward the scientific, data-driven side of the spectrum. At its heart is the idea that we can continuously learn through rapid and iterative experimentation. The key principles include:
1. Build-Measure-Learn
In analytics, this cycle helps us continuously improve our tools. We start by building a basic version of a dashboard or tool — the MVP (see below). We then measure its effectiveness by looking at usage data or collecting user feedback. Finally, we learn what worked and what didn’t, allowing us to improve the product in the next iteration.
Benefit: By quickly moving through this cycle, analytics teams can develop tools that adapt to user needs, helping decision-makers get better insights faster.
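To make the “Measure” step concrete, here is a minimal sketch in Python. It assumes a hypothetical export of dashboard view events (a usage_events.csv file with user_id, dashboard and viewed_at columns); the file and column names are illustrative, not prescribed by the book or any particular BI platform.

```python
import pandas as pd

# Hypothetical export of dashboard view events: user_id, dashboard, viewed_at
events = pd.read_csv("usage_events.csv", parse_dates=["viewed_at"])

# Weekly active users per dashboard -- a simple "Measure" signal for the cycle
events["week"] = events["viewed_at"].dt.to_period("W")
weekly_active = (
    events.groupby(["dashboard", "week"])["user_id"]
    .nunique()
    .rename("active_users")
    .reset_index()
)

# Share of users who came back after their first week -- a rough stickiness signal
return_rate = (events.groupby("user_id")["week"].nunique() > 1).mean()

print(weekly_active.tail())
print(f"Users returning after their first week: {return_rate:.0%}")
```

Even two or three simple signals like these are enough to close the loop: they tell us whether the thing we just built is actually being used, which is the “Learn” input for the next iteration.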
2. Minimum Viable Product (MVP)
An MVP is the most basic version of a product that still solves a problem. Often, we arrive at the definition of an MVP by thinking about the Build-Measure-Learn cycle in reverse. What do we need to learn? If our idea is a hypothesis, that means there are untested assumptions that we need to validate. This validated learning is the output we want to achieve with our MVP. So, the MVP is not packed with features — just enough to gather the needed user feedback and see if the core idea works.
Benefit: Instead of spending months building a full analytics solution, we can release a small version quickly and get real-world feedback. This allows us to focus our time and resources on what matters most to users.
3. Validated Learning
Each time we release an updated version of an analytics tool, we gather data on how users interact with it. This helps us test assumptions — whether a new filter makes the dashboard more user-friendly or whether a new dataset adds value. This takes a certain bravery, as the data often challenges our preconceived notions about what is working and what is not.
Benefit: This process helps data leaders improve products based on real feedback, ensuring that development is guided by actual user needs and behaviors.
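As a small illustration of what testing an assumption can look like in practice, the sketch below compares the share of sessions that end in a saved view with and without a new filter, using a simple two-proportion z-test. The cohort sizes and the “saved view” success metric are invented for the example; substitute whatever behavior actually signals value in your tool.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical cohorts: sessions without and with the new filter,
# and how many of those sessions ended with the user saving a view.
control_sessions, control_saves = 400, 88    # dashboard without the new filter
variant_sessions, variant_saves = 380, 121   # dashboard with the new filter

p_control = control_saves / control_sessions
p_variant = variant_saves / variant_sessions

# Two-proportion z-test: is the lift larger than chance would explain?
pooled = (control_saves + variant_saves) / (control_sessions + variant_sessions)
std_err = sqrt(pooled * (1 - pooled) * (1 / control_sessions + 1 / variant_sessions))
z = (p_variant - p_control) / std_err
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"Save rate: {p_control:.1%} -> {p_variant:.1%}, z = {z:.2f}, p = {p_value:.3f}")
```

The statistics are deliberately simple; the point is that the assumption (“the filter helps”) gets written down and confronted with data rather than settled by whoever argues loudest.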
4. Pivot or Persevere
After each cycle of Build-Measure-Learn, this is the point where we ask ourselves: Should we stay on this path (persevere) or change direction (pivot)? This decision is based on the data and user feedback we’ve gathered and requires honesty with ourselves about whether we’re achieving the goals we set at the start of the cycle.
Benefit: This helps data and analytics teams avoid wasting time on solutions that aren’t working. If our analytics tool or product isn’t delivering results, we can quickly adjust and try something new. Pivoting isn’t failure; it’s finding a different solution space to explore so we can better deliver value to our users.
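One lightweight way to keep that honesty is to write the cycle’s goals down as numbers before we build, then compare them to what we observed when the cycle ends. The metric names and thresholds below are invented for illustration; what matters is that they were agreed on in advance.

```python
# Goals committed to at the start of the cycle vs. what we observed (illustrative numbers)
targets = {"weekly_active_users": 50, "return_rate": 0.40, "avg_load_seconds": 5.0}
observed = {"weekly_active_users": 31, "return_rate": 0.22, "avg_load_seconds": 4.1}

# Load time is the one metric here where lower is better
lower_is_better = {"avg_load_seconds"}


def missed(metric: str) -> bool:
    """A goal is missed if we fell short of the target (or exceeded it when lower is better)."""
    if metric in lower_is_better:
        return observed[metric] > targets[metric]
    return observed[metric] < targets[metric]


misses = [metric for metric in targets if missed(metric)]

if len(misses) > len(targets) / 2:
    print(f"Most cycle goals missed ({', '.join(misses)}): time to discuss a pivot.")
else:
    print("Cycle goals largely met: persevere and start the next iteration.")
```

The script doesn’t make the decision for us, but it keeps the conversation anchored to the goals we set before we saw the results.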
For data leaders managing analytics tools, the science of this approach lies in gathering feedback as early as possible. Focus on small, iterative builds like a dashboard or other analytic deliverable, gather user insights, and make rapid adjustments based on their behavior. This method minimizes waste by ensuring we’re always learning from real data, rather than building features based on assumptions. If you need a starting point for what measurements to use, check out our Quick Start Guide.
The Art: Build and the Craftsmanship of Vision
Tony Fadell’s “Build” takes a different approach, focusing on the art of creating something truly remarkable. Coming from the world of physical devices like the iPod, iPhone and Nest Thermostat, Fadell argues that great products are not just about data and iteration, but about intuition, design, and a deep understanding of customer needs — often before customers even realize those needs themselves. His principles include:
1. Enduring Design
Products should strive not only to be functional, but also intuitive, visually appealing and long-lasting. For data and analytics, this means designing tools that are easy to navigate, with clear visualizations and interfaces that users can rely on for years to come. It’s not about constantly reinventing, but about getting the design right from the start.
Benefit: The skillful craft of our analytics has a direct impact on customer experience. Users can become frustrated by complex interfaces or confusing layouts, making any measurements we take on the solution suspect: the failure may be one of presentation rather than the validity of the solution itself. Skillful design provides a seamless experience that people will continue to use and trust, leading to higher user engagement and long-term satisfaction. We see this with Curator by InterWorks, where thoughtful design of the navigation experience around dashboards improves user adoption and NPS metrics. For many of our customers, the dashboards aren’t the problem; the path to them is.
2. Sweating the Details
Paying attention to every aspect of the user experience, no matter how small, is the sign of a well-designed tool and respect for our users. In the context of analytics, this could mean carefully choosing how data is visualized, optimizing the speed of loading dashboards, or ensuring that filtering options are intuitive and efficient. When every interaction is smooth and purposeful, it shows empathy and care for our users.
Benefit: By focusing on these small details, data leaders can create products that feel polished and professional. Users will notice when things just work as expected, which builds trust in the analytics tools and encourages deeper usage over time. Often, this leads to a snowball effect, building more trust and excitement in the work of our analytics teams.
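Dashboard load time is one of those small details that is easy to quantify. As a minimal sketch (the timing data, dashboard names and five-second threshold are all hypothetical), we can watch the p95 render time per dashboard and flag anything likely to frustrate users:

```python
import statistics

# Hypothetical render timings in seconds, e.g. pulled from a server or usage log
load_times = {
    "Sales Overview": [1.2, 1.4, 1.1, 6.8, 1.3, 1.5, 7.2, 1.2],
    "Inventory Detail": [2.1, 2.3, 2.0, 2.4, 2.2, 2.5, 2.1, 2.3],
}

SLOW_P95_SECONDS = 5.0  # illustrative threshold, not an industry standard

for dashboard, times in load_times.items():
    p95 = statistics.quantiles(times, n=20)[-1]  # 95th percentile render time
    status = "investigate" if p95 > SLOW_P95_SECONDS else "ok"
    print(f"{dashboard}: p95 load {p95:.1f}s ({status})")
```

Using the 95th percentile rather than the average matters here: a dashboard that is usually fast but occasionally painful still erodes trust.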
3. Visionary Leadership
We must lead with a long-term vision in mind. This is not something that can be completely data driven. For data leaders, this means thinking beyond the immediate needs of today’s users and considering how the analytics product will grow with the organization. Visionary leadership drives decisions about which technologies to invest in, how to future-proof the platform and how to anticipate the needs of end users in the years to come.
Benefit: If we build things with a long-term vision, anchored in deep empathy for our users, they won’t become obsolete as new business challenges emerge. Our products can be flexible, scalable and address key pain points for our users in a way that can continue to grow and evolve with the business.
For analytics leaders, the art in Fadell’s philosophy encourages us to think beyond the immediate. A robust analytics platform isn’t just about delivering fast iterations — it’s about ensuring the design is intuitive, the system is scalable and the product can evolve with the organization’s needs. This craftsmanship ensures our tools are not just functional but a joy to use.
The Tension: Balancing Art and Science
In many ways, the approaches in “The Lean Startup” and “Build” feel like opposite ends of a spectrum. “The Lean Startup” emphasizes the scientific method — hypothesize, test, iterate — while “Build” advocates for a more artistic approach, stressing intuition and long-term vision. This tension reflects the very real challenges data leaders face when developing analytics products:
- Speed vs. Quality: In the scientific model, speed is crucial. We release early and often to gather data. But in the artistic model, we may take longer to ensure the product feels refined and polished. Data leaders need to navigate between the need for rapid feedback and the demands for scalability and user satisfaction.
- Iteration vs. Vision: While “The Lean Startup” pushes for quick cycles of improvement based on user feedback, “Build” urges leaders to think about the bigger picture. Balancing users’ needs today against their needs tomorrow takes thoughtful consideration of our context. Sometimes the right answer is to continue to iterate, and sometimes it’s time to set out to build the future.
At the core of this is both the limits of data and process in product development and the unique circumstances each of us works within. Your business context, and even your project context, is going to weigh each of these things differently. Consider some of the following factors around a needed solution:
- High Stakes, High Visibility or Long-term Use: In situations where the impact is going to be high or sustained over a long period of time, quality and craft are going to matter more than they would for a one-off, ad hoc report.
- Short Deadlines or Many Unknowns: When we’re under time pressure, it becomes important to prioritize speed and iteration.
- Evolving vs Stable Environment: If things are rapidly changing, we should be structuring our solutions to rapidly change and iterate along with it. However, if our environment is stable, we have more time and space to craft a long-term solution.
- Resource Availability: The size and skills of our teams are also going to greatly impact the ways we invest our time and efforts.
The Art and Science of Product Development
For data leaders, managing analytics products is about mastering the blend of vision and design (the art) with iteration and data-driven decision-making (the science). In some cases, the scientific method is essential — quick, measurable iterations help ensure tools meet immediate needs. In other cases, the artistic touch is required, investing time in platforms that are scalable, intuitive and built to last.
Essentially, it’s important to understand the limits and complementary nature of both approaches. Vision is often subjective and needs to be grounded in reality through testing and validation. Data, especially qualitative data, provides valuable insights but can also be an incomplete representation of reality. To truly understand user needs and behavior, data leaders must critically assess what the data is revealing — and what it might be missing. Vision fills the gaps where data falls short, guiding long-term decision-making in ways raw metrics can’t.
Conclusion: Harmonizing Art and Science in Analytics Products
The art and science of product management may seem like opposing forces, but the key is in knowing when to lean on each. Use the scientific rigor of “The Lean Startup” to guide your iterative processes, validate decisions with data and gather feedback from users. Let “Build” remind you that sometimes, great products require long-term vision and careful craftsmanship to create tools that provide lasting value. All of this takes a lot of self-reflection to parse what you know, what you intuit and what you just wish were true. With honest reflection, you can wield both the art and science in purposeful balance to build analytics platforms that not only work today but continue to evolve and serve users well into the future.
If you’d like help in navigating the balance of science and art, send us a note. We’d love to hear about your challenge and iterate on your next great solution together.