I’m seeing so many people in the data space say that companies need to get their data right before they can do AI (we may even be guilty at times). That advice is simply wrong.
Look, I’m not saying data hygiene isn’t important, but telling people they must eat their vegetables before they can leverage exciting new technology isn’t just incorrect; it’s counterproductive. This “data first” mentality creates unnecessary friction at exactly the moment when organizations should be moving fast to capture AI’s competitive advantages.
The Perfect Data Fallacy
The reality is that you can start AI projects now. They just need the right scope and a plan for proper data prep. This sort of iterative process is how you get to clean, useful data in the first place: by using organizational excitement to take on projects that matter and solve individual slices of the data problem.
Can you think of any organization that has ever achieved “perfect” data? I’ve been doing this for almost two decades and can’t name one. Not even us, by the way! The companies that are winning with AI today aren’t the ones who spent years perfecting their data infrastructure first. They’re the ones who identified high-value use cases, scoped them appropriately and built data solutions incrementally as part of delivering business value. AI projects fail because of overly ambitious, ill-defined scopes that treat AI like magic, not because the data wasn’t in a perfect state.
Learning from the Tableau Revolution
This iterative process is exactly what we saw happen with Tableau in the early years. People were excited to have a tool that made it easy to see and understand their data (and those shiny new interactive dashboards certainly didn’t hurt when it came to impressing the C-suite). The data was often not ready, so project after project we went in to enable people to use Tableau, discovering and solving the data problems at hand as we went.
We didn’t launch huge data warehouse boondoggles. Instead, we delivered iterative slices that built up into a shared data platform everyone could benefit from, with a wake of useful dashboard projects along the way. Each project made the data a little cleaner, a little more accessible, a little more valuable.
One of the key lessons I’ve learned is that you should leave a wake of finished projects: small, important, iterative projects that build up into a more meaningful whole. You’ll not only deliver value quickly; you’ll discover new value along the way that a monolithic plan would never have contained.
The AI Opportunity Is Now
The same dynamic applies to AI, but with even higher stakes. While your competitors are waiting for their data to be “ready,” you could be:
- Streamlining communication by summarizing updates by product line or business area (see the sketch after this list)
- Building customer service chatbots that handle early research so your support team can focus on client relationships
- Enabling citizen developers with coding agents like Claude Code to build small apps that solve your business problems
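To make the first item concrete, here’s a minimal sketch of what summarizing by product line could look like. It’s an illustration under assumptions, not a prescription: it uses the Anthropic Python SDK with an ANTHROPIC_API_KEY set in your environment, and the DataFrame columns, prompt, and model name are hypothetical placeholders you’d swap for your own.

```python
# Minimal sketch: summarize free-text updates by product line.
# Assumes `pip install anthropic pandas` and ANTHROPIC_API_KEY in the
# environment; columns, prompt, and model name are placeholders.
import anthropic
import pandas as pd

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Hypothetical stand-in for whatever messy source you actually have
# (tickets, CRM notes, status emails, etc.).
updates = pd.DataFrame({
    "product_line": ["widgets", "widgets", "gadgets"],
    "note": [
        "Q3 rollout slipped two weeks due to a vendor delay.",
        "Churn ticked up after the pricing change.",
        "New onboarding flow cut setup time in half.",
    ],
})

for line, group in updates.groupby("product_line"):
    notes = "\n".join(f"- {n}" for n in group["note"])
    message = client.messages.create(
        model="claude-sonnet-4-20250514",  # placeholder model name
        max_tokens=300,
        messages=[{
            "role": "user",
            "content": f"Summarize these updates for the {line} product "
                       f"line in two sentences:\n{notes}",
        }],
    )
    print(f"{line}: {message.content[0].text}")
```

Notice what even a project this small surfaces: the moment you group by product_line, you find the rows where that field is missing or inconsistent. That’s the slice of data prep the project forces you to fix, which is exactly the point.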
Each AI project becomes a catalyst for better data practices. When stakeholders see the value of AI-driven insights, they suddenly become much more interested in data quality initiatives.
Starting Smart, Not Perfect
The key is starting smart. Choose AI projects that:
- Have clear business value even with imperfect data
- Can be scoped to work with your current data capabilities
- Include data improvement as part of the project deliverables
- Generate enough excitement to fuel continued investment
Don’t let the perfect be the enemy of the good. Your data will never be perfect, but your AI projects can still be transformative. The organizations that understand this are the ones that will lead their industries into the AI-driven future.
Your data isn’t ready for AI, but you should be. Be ambitious, take on projects that matter and use AI as the forcing function that finally gets your data strategy right.