Welcome back to our journey in maturing analytics. We’ve gone from identifying the actual job to be done through the methods and tools you’ll use to deliver on it. In this final installment, we’re going to tackle the elephant in the room: artificial intelligence and what it means for this whole process.
If you want to know how you might leverage LLMs and Generative AI, check out my quick primer on building a chatbot for an overview, or the video below:
This article is going to be much more focused on the implications of AI, not the implementation of it.
So Where Does AI Fit in Analytics?
Let’s take a common prompt structure that would be used when building an AI application. A prompt is simply the set of instructions we give the AI, which usually includes the user’s question or chat response (a minimal sketch of assembling one follows the list):
- [Our Instructions]: “You are an analyst answering users’ questions”
- [Relevant Context]: “Answer the question using these resources…”
- [User Query]: “What are my sales in the northwest?”
- [Reinforcement]: “Be concise, answer honestly using markdown”
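To make that structure concrete, here’s a minimal sketch of stitching those four pieces into a single prompt string. The function name, the example document and the sales figure are all made up for illustration, not taken from any particular framework:

```python
# A minimal sketch of assembling the prompt structure above.
# Section wording and the example context are illustrative only.

def build_prompt(context_docs: list[str], user_query: str) -> str:
    instructions = "You are an analyst answering users' questions."
    relevant_context = "Answer the question using these resources:\n" + "\n".join(context_docs)
    reinforcement = "Be concise, answer honestly, and respond using markdown."
    # Instructions -> context -> user query -> reinforcement, matching the list above.
    return "\n\n".join([instructions, relevant_context, user_query, reinforcement])

print(build_prompt(
    context_docs=["Northwest region sales, Q2: $1.2M (hypothetical figure)"],
    user_query="What are my sales in the northwest?",
))
```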
Much of what we’re going to be delivering will live in the “Relevant Context” portion. This is where we insert content such as documents when building RAG (Retrieval-Augmented Generation) applications. For instance, if you were building a Tableau bot that helped users, you might put a page from Tableau’s documentation into that Relevant Context so the AI can answer the user’s questions. A typical RAG architecture is going to look something like this:
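In code, the retrieval step of that architecture might look like the toy sketch below. A real pipeline would use embeddings and a vector store; simple keyword overlap over an in-memory list stands in for that here, and the Tableau snippets are paraphrased placeholders rather than actual documentation text:

```python
# A toy stand-in for the retrieval step that fills the "Relevant Context" slot.
# Real RAG pipelines embed documents and query a vector store; keyword overlap
# over a hard-coded list is used here just to show the flow.

DOCS = [
    "Tableau help: to filter a dashboard, click the filter icon on a worksheet.",
    "Tableau help: extracts refresh on the schedule configured in Tableau Server.",
]

def retrieve(query: str, docs: list[str], top_k: int = 1) -> list[str]:
    query_terms = set(query.lower().split())
    # Score each document by how many query words it shares, highest first.
    scored = [(len(query_terms & set(doc.lower().split())), doc) for doc in docs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

question = "How do I refresh an extract?"
relevant_context = "Answer the question using these resources:\n" + "\n".join(retrieve(question, DOCS))
print(relevant_context)
```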
In analytics, our content is dashboards, reports and metrics. Metrics will be the easiest for the current state of AI to use and interpret, which is one of the reasons metric layers are a growing trend. There are, of course, many other reasons why metric layers help meet users’ needs (see part 2 of this series), such as creating a simpler experience and allowing for unique delivery options like alerts (see part 3 of this series). So in our analytics context, Metrics are not only solving problems now, but are also key elements that will make future AI applications work. If we go back to our architecture diagram, this is where Metrics would live in our AI application:
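To illustrate, here’s roughly what serving a metric from that layer as context might look like. The metric name, definition and value are hypothetical; in a real application they would come from your metric layer’s API rather than a hard-coded dictionary:

```python
# A sketch of how a metric layer could feed the Relevant Context slot.
# The metric name, definition, and value below are hypothetical examples.

METRIC_LAYER = {
    "northwest_sales": {
        "definition": "Sum of closed-won revenue for accounts in the Northwest region.",
        "current_value": "$1.2M (example value)",
    },
}

def metric_context(metric_name: str) -> str:
    # Render one metric as a small block of text the prompt can include.
    metric = METRIC_LAYER[metric_name]
    return (
        f"Metric: {metric_name}\n"
        f"Definition: {metric['definition']}\n"
        f"Current value: {metric['current_value']}"
    )

print(metric_context("northwest_sales"))
```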
Of course, one of the promises of AI is that it’ll do more than just be a better search experience, that it will actually begin to do some of our work. This is often called an AI agent, and in its simplest form it looks very much like a RAG application, but with some additional logic and functions that allow it to call out to the world. If we update our architecture, it might look like this:
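Concretely, that “additional logic and functions” layer can start as nothing more than a registry of functions the model is allowed to request, as in the sketch below. The tool names and the call format are illustrative, not any particular vendor’s function-calling API:

```python
# A minimal sketch of the extra logic an agent adds on top of RAG: a registry
# of functions the model can ask for. Tool names and call format are made up.

def get_metric(name: str) -> str:
    # In practice this would query your metric layer.
    return f"{name}: $1.2M (example value)"

def send_alert(channel: str, message: str) -> str:
    # In practice this would call your alerting or messaging tool.
    return f"Alert queued for {channel}: {message}"

TOOLS = {"get_metric": get_metric, "send_alert": send_alert}

def run_tool_call(call: dict) -> str:
    # Execute a call the model returned, e.g. {"tool": "get_metric", "args": {"name": "..."}}
    return TOOLS[call["tool"]](**call["args"])

print(run_tool_call({"tool": "get_metric", "args": {"name": "northwest_sales"}}))
```

Keeping the dispatch explicit like this also gives you a single place to log and audit every action the agent takes.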
Coming back to our analytics context, what would we put there? Well, again, we have the answer back in Part 2 of this series, where we talked about incorporating data input and data activation into our toolkits. This kind of integrated analytics solves real problems now by allowing users to take action, and in the near future those same methods are what we can hand to AI agents to make them more autonomous:
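To make that direction a bit more tangible, here’s a hypothetical data-activation function, a write-back that flags an account for follow-up, written so an agent could call it. Everything in it is illustrative, and the confirmation guardrail is a design choice: one way to keep a human in the loop for actions that change data:

```python
# A sketch of exposing a data-activation step (a hypothetical CRM write-back)
# as something an agent could call. The confirmation gate is a design choice,
# not a framework feature.

def flag_account_for_followup(account_id: str, reason: str, confirmed: bool = False) -> str:
    if not confirmed:
        # Return a draft of the action instead of executing it.
        return f"Draft action: flag {account_id} for follow-up ({reason}). Awaiting user confirmation."
    # Here you would call your CRM or write-back API.
    return f"Account {account_id} flagged for follow-up: {reason}"

print(flag_account_for_followup("NW-0042", "Sales dropped 20% month over month"))
```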
So, by building the right things today, we’re actually enabling the capabilities of tomorrow’s AI. Really, getting ready for AI is about getting the boring stuff right. It’s about having good documentation, good processes and good tools. Many of your vendors are already handling the technical side of incorporating AI for you. You’re not falling behind by not having a chatbot now.
What vendors can’t do is give AI in their tools the information and functions it needs to do a good job in your environment and work with your other systems. If you’re falling behind, it’s because you aren’t working on these core issues. As I heard someone comment recently, RAG is great, but most companies don’t have a lot of valuable information to look up. Do you?
In Conclusion
As we wrap up this series on maturing analytics, it’s clear that the path to integrating AI into your analytics practice is paved with the foundational work we’ve discussed. Here’s a recap of the essential steps to prepare for AI and ensure your analytics are impactful:
- Start Managing Data and Analytics as Products: Treat your analytics outputs as products to enhance their value and user adoption.
- Define the Job to Be Done: Understand and articulate the specific needs your analytics are addressing.
- Deliver More Than Dashboards: Pick the right complexity for the use case. Deliver on the job to be done instead of sticking with the same output for all use cases.
- Embrace Diverse Delivery Mechanisms and Automation: Utilize various tools and technologies to automate and enhance your analytics processes.
- Keep User Value as Your North Star: Focus on delivering continuous value to your users.
- Build -> Measure -> Learn -> Repeat: Adopt a cyclical approach to development and improvement.
- Use Analytics and Get User Feedback: Continuously gather and act on user feedback to refine your analytics solutions.
- Prepare for AI by Meeting Users’ Needs Now: Ensure your current solutions are robust and user-centric to facilitate future AI integration.
By following these principles, you’ll not only improve your current analytics practice but also lay the groundwork for leveraging AI effectively. Remember, the future of analytics is not just about advanced technology but about delivering real value and making a significant impact on your organization. Keep pushing the boundaries, stay user-focused and embrace the exciting possibilities that AI brings to the world of analytics.