Thanks to InterWorks for hosting the webinar about embedded analytics. By way of introduction, I lead the embedded analytics business for ThoughtSpot in EMEA. I joined approximately twelve months ago, but I've worked in analytics and data of one form or another for the last twenty years, and spent the last eight to nine years at Tableau and Salesforce, where I also led their embedded analytics business. Unfortunately, I'm the guy that has got the slides at the front: Matt will have all the interesting demo content, and Ricky will have all the future agentic stuff, so I get to do the slides, I'm afraid. But I just want to start by explaining one of the reasons that I moved over to ThoughtSpot, which was largely because it feels like we've moved into a third generation of analytics. First we had Cognos and Business Objects (I'm old enough to remember all of those things) and distributed cubes to try and help business teams get the analytics that they need. Then Tableau, Qlik and Power BI moved the world forward in terms of visual analytics and self-service, which was the second generation. And now, with natural language, it feels like we have a new generation of analytical capability driven by AI, where we can really start to put business users in the position of asking their own questions regardless of data skill level, something that previous generations never really managed to do. Tableau, in particular, did a great job of democratizing analytical capability, but largely for analysts. I think it never quite cracked the problem of: I'm a domain user, I'm a business expert, but I don't know what the underlying data is or what it looks like, and as a consequence I'm not very comfortable navigating around a dashboard. That has really moved on in terms of capability, and that's why ThoughtSpot is such a good fit for embedded analytics, because typically that's the audience. So I'm going to spend a bit of time today talking about how the market is changing and evolving, and a little bit on why ThoughtSpot is such a good fit for externally facing analytics. So what's going on in the market today? What has happened, and probably something everyone has experienced over the last two or three years, is that AI has become mainstream. I know it's in everything, and some of it is a little bit dubious; I think my washing machine has AI, and I don't think that constitutes a proper AI use case. But it has become mainstream, and people's perception of what can be done has changed. If you look at normal adoption rates across organizations, you can see that technology evolution travels really quickly and adoption trails behind it quite considerably. But every now and again something happens in the world which marketeers would call a cataclysmic event (I'd probably call it something like a change event), and it changes people's perception of what can be done. Then there is a clamor to get that type of technology, because it has an obvious business impact: I need to be able to see that in all of the different types of applications and use cases that I have, because it delivers value to my business. And so consumers start to look to their providers and solutions and find the answers substandard by comparison to what they see in the wider world.
NPS scores drop, churn increases, and demand for these AI-led BI solutions massively spikes. And that's what we see in the ThoughtSpot business today. By twenty twenty six, eighty percent of business consumers will prefer intelligent assistants and embedded analytics over dashboards for data-driven insights. So it's not even just putting embedded analytics into applications anymore; it's how do I put something in there that is actually my own personal analyst, something conversational that I can interact with? And by twenty twenty six, more than two thirds of line-of-business personnel will have immediate access to cross-functional analytics embedded in activities and processes. This is all about putting intelligence and capability into the workflow of people's day-to-day work practices, so that they can get insights at the point where they actually make an impact on day-to-day decisioning. Because, ultimately, that's the gold standard, right? If we can give people access to the right capabilities at the right time in the right place, so they don't have to deviate from what they're doing, then that's making a significant business impact. And if that's what we're trying to achieve here, then there's obviously a great place to put this: into everyone's analytical experience. But it's not just the technology that's changing; the entire marketplace, the ecosystem, is changing. If you go back probably ten years or so, most analytical use cases were pretty introspective. Most businesses were using analytics, but it was embedded purely for improving the efficiency of that business's internal processes: how do I make more money, how do I save cost? If we move on to where we are today, you've got this enormous emerging data ecosystem. You've got data-supplier businesses on one side creating their own data product strategies, and data-consuming businesses on the other buying those products through purchases and subscriptions. And in between you've got this middleware piece, which is self-service portals (I know Matt's going to talk about Curator a little bit), analytics platforms like ThoughtSpot, data clean rooms and data marketplaces. But the advent of AI means that a lot of those consuming businesses, or what would historically have been consuming businesses, are now using AI to take all of their data assets and create their own market-facing data products. And so you've got this huge cyclical process of consuming businesses becoming data suppliers, and data-supplier businesses starting to consume products from the other side, all of which manifests itself through things like self-service portals and analytics platforms. So it's a really interesting market space: not only is the technology evolving, but that is driving changes in the marketplace as well. And what that means is that there's a tremendous market opportunity. If you look at how most businesses make their revenue, it's typically made up of a set of high-yielding key accounts, the classic rule where twenty percent of the customers make up eighty percent of the revenue, and then you've got this long tail. The key accounts typically get all the good stuff, right? They get the white-glove treatment, the account management, the full suite of products, because they're high-yielding customers.
And the long tail largely gets left alone, because the cost of delivering that type of service is too high. But if we can build a long-tail capability as an AI-driven analytics SaaS service, then we can provision that much bigger cohort of customers with really high-level, insightful analytics in a way that that market typically doesn't get serviced. Most people ignore that market segment, whereas, arguably, it is the data-starved area of the marketplace, and so the opportunity for provisioning that type of capability in an efficient way is really high. And so the exam question that we come to, and this is what people are trying to get to grips with as they bring data and analytics to market, is this: everybody's got data, ubiquitous data, lots of databases, lots of data rows, all sorts of historically interesting data points, and everybody's got data knowledge, domain knowledge, databases, ETL and analytics platforms. But how do I transform that into last-mile business value? Because it's not a ThoughtSpot thing, it's not an InterWorks thing, and it's not a customer thing; it's how do we make your customers more successful with data? Because if we can drive them to the right set of insights, where in their day-to-day work they can make better decisions on behalf of their business because they have access to the right insight, then they will buy more of your stuff and they won't churn. As soon as you're providing that kind of high-level capability where people can genuinely ask questions and get the right answers (how do I grow sales, how do I reduce cost, how do I increase efficiency?) as a consequence of consuming things from your business, then that really is the gold standard of embedded analytics. That means we've achieved what we're meant to achieve, and as soon as people are getting genuine, real value from it, that really is the ultimate aim. And so if the market's changed and the technology's changed and there's this tremendous market opportunity, why isn't everybody doing it? I think it's a really good question. But, really, the products aren't delivering the intelligent, AI-native experience that users expect, and this is where reality hits. It starts with the ever-extending backlog: static embedded experiences create a flood of report requests, dashboard tweaks and new drill paths, and it never stops. Product and dev teams are swamped with ad hoc requests, and it's unscalable. That leads to delayed time to market: building those experiences delays core roadmap delivery, you miss launch windows, you lose momentum. Then there's cost: engineering teams are stretched thin, split between building analytics and trying to innovate the core product, so burnout increases, velocity drops, budgets balloon. And all of this stalls growth without standout AI-powered features. It's hard to attract new users, because that's what the market is demanding, and every small friction point drives up the cost of customer acquisition in an already crowded, difficult market. And even when you acquire new users or new customers, retention then suffers, because static dashboards just don't deliver ongoing value, users disengage, and churn creeps in.
But perhaps most critically, and arguably the most important point of all of these: potentially, you miss the AI wave. Competitors are already launching new analytics experiences, smart apps, intelligent agents and embedded copilots. If you're not on that train already, this is moving so incredibly quickly that if you're starting now, what does the future look like in six months? Everything is changing so fast that people need to build that type of competency into their products and services right now. So if we look at what a ThoughtSpot embedded platform does from a business-value perspective, it delivers in three ways. If we take the red boxes specifically, this is where a lot of people start: they come to ThoughtSpot because answering data questions from all of their customers carries an incredibly high delivery overhead. So what we're going to do is build ThoughtSpot into your product or service or portal, and access to data will become really easy and increase client satisfaction. You'll be able to demonstrate new business capabilities, and customer satisfaction and NPS scores will increase simply because people now have access to data and insight. The cost of delivery is reduced because it's automated; there are no prebuilt drill paths, you can drill anywhere, and so that technical debt is minimized. But your account management, CSM and dev engineering workforce can also refocus their attention onto high-value tasks, because they've just been doing data changes for the last two or three years, and actually you want those people focused either on the core application or on high-value work with your customer base. That on its own would probably deliver significant enough ROI to invest in a platform like ThoughtSpot. But what you've also done, while creating that, is deliver a platform for the monetization of data on top of it. The ability to take an insightful data point that somebody's consuming and wrap other capabilities around it, to the point where people want to buy those capabilities, means that by exposing all of those people to data insight, you've created a platform where you can upsell and cross-sell. Whether it's direct monetization or indirect monetization, you've created a platform where people can adopt and buy into new capabilities. The third bit of real value is that you've also created a platform for agentic and autonomous offerings. If you look at how ThoughtSpot is evolving, what you're doing is building in all of the capabilities: all of the underlying data governance, control and authentication is already being resolved as part of this implementation, so you can use that same framework and protocol to deliver the evolving agentic capabilities as they come along. And this is how ThoughtSpot is progressing. If you go back to twenty twenty three, ThoughtSpot has always been a search-driven application from a BI tooling perspective; that was the original ethos, and it continues to be true. We moved into Sage in twenty twenty four. We have our first agent in Spotter in twenty twenty five. And twenty twenty six, I'm sure, will bring multiple agentic and autonomous agents to deliver lots of value. And what I think this slide really shows is the rate of change.
So if you're planning an embedded analytics product today and you haven't considered the impact of agentic or autonomous capabilities, then you really need to, because in eighteen months' time I suspect the world looks very, very different. And this rate of change is really driving why people are adopting even at a fairly simplistic level, like Liveboards and dashboards. They're choosing to embed analytics now, at this point, because it enables them to hook into ThoughtSpot's development roadmap, where we're mapping to all of these future agentic and autonomous capabilities. So you'll be pleased to know that that's me done; that's the boring slide bit out of the way, and we can now move on to the exciting demo bit that I know you all want to get on to. But hopefully that gives you a bit of an insight into the types of things that we're seeing in the market space and the driving factors that are really causing people to adopt ThoughtSpot. And I'll hand over to Matt, who's got all of those really clever capabilities to bring that to life. Thanks, Mark. That was really interesting. So for those of you who missed the first couple of minutes at the start, I'll just do a quick introduction to myself. I'm Matt Whiteley, and please excuse the terrible photo there. I am an analytics lead at InterWorks, and I've been here around five years. For those of you who aren't too familiar with InterWorks as a company, we are a data consultancy, essentially. We specialize in data services all across the data spectrum, from data architecture and data engineering all the way up to data analytics, visualization, all that front-end stuff. And we are a ThoughtSpot partner, so we carry out services within ThoughtSpot itself. I'm going to do a quick few slides around where we see analytics currently in the market: why embed analytics, what advantages we see for clients who do that, what some of the real-world challenges are that we need to consider, especially in this new AI world that we live in, and then why we think ThoughtSpot is the perfect tool for embedding and what makes it stand out amongst other tools in the BI market. Then I'll do a quick demo of InterWorks' Curator. Curator is our CMS platform; it's our portal, which we sell as a product, allowing you to embed not only analytics but other content into that application as well. So to start, I'll just do a quick overview of where we see analytics used as a product when it comes to embedding. This can be broadly split into two categories. Now, these aren't mutually exclusive, they can overlap, but essentially the most common use case is using embedded analytics as a data monetization product. The analytics itself is the thing that you're selling to customers and clients as a standalone product. These can be added-value services to differentiate your core offerings; what we often see is a tier one, tier two, tier three case where you give tier one free and then charge more for access to tier two and tier three analytics. And then there's also that organizational intranet side as well. It doesn't have to be monetized analytics.
We often see people embedding analytics in their own internal intranets, which is especially key for companies who want to keep all information together in one place and have one consistent brand across both their analytics and other information like white papers, documentation and so on. It's something we often see in ad tech and creative industries. The second side of that is analytics within the workflow itself. This is where we see embedded analytics as a side product, essentially, to the main service a company is offering. This is usually seen in CRM tools: the main focus of a CRM tool is customer relationship management, but you often embed analytics alongside that, for things like activation of customers, number of visitors and so on, to give extra information alongside the main product. It's often seen in ERP tools as well for management of operational processes; that's the main part of the software, but there will be an analytics part which allows you to measure what's going on inside the tool itself. And another real-world application that we see quite often is call center software: analytics will sit alongside the main product to show things like call rates, number of calls handled and so on, to measure agent performance. And like I said, they're not mutually exclusive; you can have both offerings, and we do see that in the wild. But those are, broadly, the two categories they fit into. So why do we like embedded analytics, and why do we promote it? There are a few reasons, both aesthetic and functional. One of the main reasons is that you don't need to switch your focus between applications. When we talk about embedded analytics, we talk about getting a lot of information into one front-end portal: one login, multiple tabs of all the stuff you need to see. It means you don't have to log in to multiple BI tools, and you don't have to log in to internal sites with documentation, white papers and so on; you can put all that focus into one application. This allows users to explore and interpret data within the familiar context of a host application. Users don't need to learn six different UIs, don't need to learn where the buttons are, and don't need to log in to six different systems; they can log in to one system with all the familiar branding and the UX and UI that they're used to, and have all that information there. Just as Mark said earlier, it can lead to much faster decision-making: users can analyze the relevant data and gain insights much quicker, without the need to transfer data outside that system and use other external tools. And the main benefit, although this isn't always the case, is that it looks great. The main goal of embedded analytics is to make it look like it doesn't come from a third-party tool; you want it to fit seamlessly into your brand, into your website. With traditional BI tools, when you log in, it looks like the vendor's server, and you don't have much control over the design. When you embed analytics, you have complete control over the customization of that website, and the analytics can look like they fit seamlessly into it. And embedded analytics works because data has real monetary value. It's been a long time since that phrase came up, that data is the new oil, but in reality it is a very valuable resource.
And, again, we split this into three categories. The traditional side is the visualization side; this is BI two point oh. This is where you get your nice-looking Liveboards, your ability to drill down, your ability to get those insights fast, with all your key metrics at the top. And when that data is enriched with ELT or ETL on the back end, with external data added to your data source, that only enriches the value. Then, on the other side of it, it's about turning that descriptive analytics into prescriptive analytics, where you can augment with AI, data science and machine learning. And that's where ThoughtSpot sits across these categories, which is why we like ThoughtSpot as an embedded tool, which I'll come on to shortly. In terms of real-world considerations, if you've not currently embedded analytics and are considering it, or you're considering improving your current offering, there are broadly four categories that we think about before we would embed any analytics. First of all, we start with the back end. Do you have secure storage of your data, which is even more important now in this generative AI world? Is the data ready for analysis? Can it be enriched or improved with added value? And is it important enough? Is it going to be easy to use, and are users going to get their insights quickly? Then we look at how we control access to the data. How do we manage users? How do we federate access? Is there a conditional access model? We want to control who has access to what data, not just visualizations but things like search as well; we want to make sure that the correct people are searching the correct data. Then: does my current BI platform support embedding? And what should we embed? Should we embed everything that sits in our BI platform, or are we selecting certain things to get value from? And then, on the front end, almost what we call the storefront: can we create that seamless brand and UX? What actionable insights can we provide? And what other data can we bring in to supplement the analytics that we're providing? Just to give a very crude example of how this can look if you were rolling out your own market research hub: you would pick a cloud data warehouse on the back end like Snowflake, which could equally be Google BigQuery or Databricks, it doesn't really matter; that would give you your secure storage. You'd have an Okta authentication layer, which provides that fine-grained control for the users; you'd tier them with free, basic and pro, and get that SSO in place. Then you'd have ThoughtSpot as your analytics layer, with your Liveboards, your search interface, and the new AI capabilities as well, all wrapped up in that. And the front end would be your content management system, a portal where all of this is visualized and managed. So why do we, at InterWorks, like ThoughtSpot for embedded analytics? With a lot of traditional BI tools, their main goal when they were first developed wasn't embedded analytics; they were meant to be viewed within a server, within a browser, and they weren't built first for allowing people to embed them.
ThoughtSpot was built with embedded analytics in mind, which means it's pretty seamless to use. It means you get a lot of customization: you don't just have to embed full dashboards, full Liveboards or the search interface, you can choose which elements of the whole workflow you want to embed, and it just makes that experience a lot easier for an analyst. And because there are low-code software development kits to get you started, it's not just software engineers who are able to do the embedding. You can learn pretty quickly as an analyst, and there's a developer playground you can use, which gives you a lot of flexibility and lets you get started quickly without having to read hundreds of pages of documentation just to get going. And as I mentioned above, it's not limited to just embedding charts and dashboards; it's built for this new agentic AI era. You can embed the full Spotter AI capabilities as well as Liveboards. It's really built ready for the new wave of analytics that we're seeing right now. On the design side, it's fully customizable, with seamless brand integration. One of the main benefits of embedded analytics, as I mentioned above, is making it look like it doesn't come from a third-party tool: being able to control the colors, logos and all the text so it fits seamlessly into a website. ThoughtSpot does that very natively, allowing you to embed content that fits with your company branding. And then the less sexy stuff is the enterprise-grade security. There's column-level security, row-level security, built-in single sign-on authentication, all the things that make it much less stressful for your IT team to allow that content to be embedded. Again, this is more and more important as we go into this AI world and let agents have more control over our data. So I'll just jump out here quickly and give you a quick example of what this kind of analytics can look like. This is our Curator front end. We have a login; you can log in as corporate or as a franchisee. Then we have all these different areas where you can access information up here: you can have all your analytics in your self-service area, you can have your documentation, and you can even embed content from other legacy BI tools if you have them, so users still get that one experience inside one website and aren't limited to one tool. If I just log in quickly as corporate and go to my Liveboard demo, this is an example of a demo Liveboard that we've added for this fictional company called Interburger. This is what you'd call your classic embedding, your Liveboards, but you can still have those AI capabilities embedded within this Liveboard as well. And, again, it doesn't have to be just charts and dashboards; you can also bring in the classic Search Data function, and it's also fully possible to have that Spotter capability embedded as well. And like I said, it doesn't have to just be analytics: you can bring in other things like demo videos, Liveboard best practices, data validation, all different types of documentation into that same front-end system, allowing users to log in once and get the analytics and the supporting documentation that they need as well.
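For reference, the low-code kit Matt is describing is ThoughtSpot's Visual Embed SDK, and a minimal sketch of dropping a Liveboard into a portal page like the one just shown might look roughly like this. The host URL, Liveboard GUID, username and token endpoint are placeholders, and a trusted-authentication token service (for example, one sitting behind the Okta SSO layer from the crude stack above) is assumed rather than shown:

```typescript
import { init, AuthType, LiveboardEmbed } from '@thoughtspot/visual-embed-sdk';

// One-time SDK initialization for the page. The host and auth details below
// are placeholders; trusted auth assumes your portal exposes a token endpoint.
init({
  thoughtSpotHost: 'https://your-company.thoughtspot.cloud',
  authType: AuthType.TrustedAuthToken,
  username: 'jane.doe@interburger.example',            // hypothetical signed-in user
  getAuthToken: async () => {
    const res = await fetch('/api/thoughtspot-token');  // hypothetical token service
    return res.text();
  },
});

// Render a specific Liveboard into a container element in the host application.
const embed = new LiveboardEmbed('#analytics-container', {
  liveboardId: '00000000-0000-0000-0000-000000000000',  // placeholder Liveboard GUID
  fullHeight: true,                                     // size the frame to its content
});

embed.render();
```

The other embed types in the SDK, such as the search experience, follow the same initialize-then-render pattern, which is part of why an analyst can get a first embed working in the developer playground without deep engineering support.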
So, yeah, that was a very quick example of Curator, but feel free to reach out to us; we do one-hour or two-hour walkthroughs of our Curator system on a regular basis, so if you want to jump into it in more detail, please let us know. I'm just going to hand over now to Ricky, who's going to talk about really exciting stuff around the ThoughtSpot MCP server. Lovely. Thank you very much, Matthew, and hi, everybody. Great pleasure to be here. My name is Ricky, and I look after the sales engineering function here at ThoughtSpot. I've been in analytics for about fifteen years in various different roles, all the way from gathering data, ETL, data modeling and data design. And I think what I'm about to show you next is probably one of the most exciting things I've seen in my career. It's a scary place to be, but it's here and it's with us now, so I'd love to take you through some of this. Let me just share my screen and get going. Okay, lovely. So, just to set the scene and the context for what I'm about to show you: what Gartner is saying is that about twenty percent of decisions are going to be automated with the use of AI. And what we mean by automated is: how can the AI make some decisions for you and then serve you the decisions it has made, or potentially serve you the decisions you need to make? There are two things that I want you to take away while I'm doing the demo, two things that we at ThoughtSpot are building for. One is how we empower business users to be involved in these workflows where they want to be, because you don't want to automate absolutely everything; you automate some of the low-hanging fruit but still present the information so the human can use their critical thinking to make the next leap. But, also, none of this is possible if data sits in silos. Matthew mentioned MCP. MCP is a protocol developed by Anthropic and backed by OpenAI which allows multiple agents to talk to each other, and if multiple agents are talking to each other, we can start breaking down the silos within the data. So that's one thing I want you to bear in mind as I go through this. Some of this is futures, but ThoughtSpot is there today: it's offering an agentic capability for you to embed into your external portals today. Everyone gets an analyst, so they can ask and answer their own questions, even very pointed ones. It's embedded inside applications, so data is connected, like Matthew showed you: your Liveboards, your dashboards, your search, your documents, everything is connected. And these apps are intelligent, so they can push you automated insights on the fly. But what autonomous means is: how do we start breaking down the data so it's boundaryless? How can we combine structured data, unstructured data, the Internet and documents together so an AI agent can make decisions for you? How do we move to an interface that is much more familiar from GPT and Claude and Cursor, where AI is the only UI you will need? And then, how can these agents make autonomous decisions for you? So that's what I'm going to go into today and show you what the next wave of ThoughtSpot embedded into your applications and Curator could look like. Okay. Demo time.
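For anyone wondering what "agents talking to each other over MCP" looks like mechanically before the demo starts, here is a minimal, hypothetical client sketch using the open-source MCP TypeScript SDK. This is not ThoughtSpot's actual MCP interface; the server command, tool name and question are placeholders, purely to illustrate the discover-tools-then-call-a-tool handshake that the protocol standardizes:

```typescript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

async function main() {
  // Hypothetical local MCP server exposing analytics tools over stdio.
  const transport = new StdioClientTransport({
    command: 'node',
    args: ['analytics-mcp-server.js'],
  });

  const client = new Client(
    { name: 'churn-agent-demo', version: '0.1.0' },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Discover which tools this server offers (each MCP server advertises its own).
  const { tools } = await client.listTools();
  console.log('Available tools:', tools.map((t) => t.name));

  // Ask a pointed, governed question through one of those tools.
  const result = await client.callTool({
    name: 'answer_data_question', // placeholder tool name
    arguments: { question: 'Which accounts are most likely to churn next quarter?' },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```

Because every server advertises its tools through the same handshake, an orchestrating agent can discover and combine ThoughtSpot, Salesforce and Slack capabilities without bespoke point-to-point integrations, which is what makes the silo-breaking Ricky describes possible.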
To set the scene, imagine I work for a telecommunications company. We've seen churn within our data, and I want to start to address some of it. So off I go into what we call the agent builder interface, where instead of asking a natural language question, or getting a visualization or a dashboard that I need to look at in order to go and make some decisions, I'm going to start creating an agent. The first thing the agent is going to ask is: what do you want me to do? The way I like to describe these agents is to imagine a colleague, a junior colleague you've got with you who will go and crunch some numbers and present you with information. The first thing that colleague is going to ask you is what you want them to do. So I'm going to give it a bit of a description here: identify customers at risk of churn next month. And when they are churning, what do I want to do? I want to send those account owners a summary, and I want to get a detailed summary on Slack. Okay, brilliant, so let's click go. It will go away and create a job description. It's going to look at all the data sources that you've got available, identify certain metrics that it might want to look at, give itself a description and give itself a name. Churney is what it's decided to call itself, but it's got a clear definition of exactly what it's going to do: it's going to help the sales team identify customers that might be churning, and it's going to send me a summary of this information as well. So I can say, right, that looks good, or I can make some other suggestions; but for the sake of the demo, I'm just going to say that looks good. Then it's going to give me success metrics. We can't just tell it to go and do something without setting goals, so success metrics are the way we tell this autonomous agent what its goals look like. So I'm going to use this metric: decrease churn by ten percent. What does that mean? It's going to have a little think. Okay, so now what it's going to say is: we're going to decrease churn by ten percent, and we're going to break this down into two things. One is new logo retention, and the other is dollar retention. So it's going to work to these two goals, and hopefully, by working on these two goals, we'll reach our overall objective of reducing churn. Again, I'll say that looks good. Then it's going to recommend the data sources. This is where the combination of ThoughtSpot, plus your embedded application, plus potentially some of your customer's data via MCP, brings all of these things together. Matthew showed us earlier that Snowflake could be your source of truth for all of your data. You can approve certain external sources, like the World Wide Web and certain places on the World Wide Web, from a platform point of view, for world knowledge. You can also allow your customers to upload a document; so, for example, here I'm allowed to upload a strategy document that I've got on my side, a presentation, a PDF, spreadsheets, whatever it might be. Click go. I'll also pick another dataset, like GTM. Press go. And now we're breaking the silos: this agent has access to all the things your analytical colleague would have had access to. But they would have had to go into loads of different systems.
But with MCP, what we're doing is bringing that together. And what we're enabling you, as a user or a platform admin of ThoughtSpot, to do is control which other governed sources of truth it can use, so you can ensure that you get good, governed outputs because you've got good, trusted data. Okay, I'm going to say that looks good. It's going to create the actions for me: what tools do I have available from the MCP marketplace that it can use? It's going to recommend some suggestions, and it's got two tools it can use for this job: the Slack tool and the Salesforce tool. So it's going to suggest all the things it needs to do, at nine AM on Slack, and then on Salesforce it's going to go and create loads of accounts for me. Okay, so I'm going to say that looks good. It's going to go and validate everything. Brilliant, it's validated all these things, so this is what I'm going to do. I can test the agent, and testing the agent is critical, because maybe some of the data sources have expired or you've got more data that you can access; so it's just going to test that all of this is good and confirm the credentials. Okay, brilliant. So I'm just going to go live with this agent now, and it's ready to go. In the interest of time, let me just show the agent library, because the idea is that you, as an organization offering embedded analytics, will create these little agents for people to use, to have a conversation with, to chat with, but you're governing the boundaries of what data is available to those agents. And now, when I get a report (I'm just going to flip back to my slides to simulate the next morning), what's it going to give me? It's going to give me an interactive place where I can continue my conversation with Churney. Churney has told me: here are the accounts that are potentially going to churn next quarter, from a new-revenue point of view, from a dollar-retention point of view, and the average number of users potentially in those accounts. It's going to give me all the things that I need, but it's also sent me this information on Slack as well. And, critically, what I can do with Churney is carry on having that net-new conversation, because this agent knows that it has access to the World Wide Web and to all the data that I've given it. If it hasn't answered a question initially, I can still use my human critical thinking to bounce ideas off Churney and come up with a strategy for reducing churn in the next quarter. So this is where we feel we are building towards. We've got the foundation and the frameworks already there with MCP in order to do this, and this is what conversing with an autonomous agent could look like. And it's awesome. So I'm just going to stop sharing and pass over to the next presenter. Well, that is actually our last presenter; saved you till last, Ricky. But we do have some questions that we would be happy to answer. The first one is for the ThoughtSpot team: how about data security? Does ThoughtSpot store any interim data, or is it directly passed to the public LLMs? So, like I mentioned, we have two offerings right now. One is our agentic offering, which is the best-in-class structured data search, and around that best-in-class structured data search we have implemented a RAG architecture.
So we use standard, base LLMs, provide them with the context to go and answer the question, and then the LLM will come back and say: this is the SQL you need in order to answer that question, and it goes from there. In that context and that flow, we use public LLMs like OpenAI, Gemini and Snowflake Cortex, but we don't use them to train on your data; we just give them context in order to get the best SQL back. Now, when we're talking about MCP, then absolutely yes: your front-end LLM-powered application, or your AI-powered application, which you host in your domain and your application, will make a call from that LLM via MCP to ThoughtSpot, where I might say, show me which accounts are likely to churn next quarter, and that list of accounts will flow back through MCP to the AI-powered application. So in the second, autonomous flow, ThoughtSpot acts as that best-of-breed structured data search and will add more and more capabilities to that, so you can do unstructured search, voice search, Slack search, all those sorts of things. But right now, see it as your best-of-breed structured data search, and, yes, the data will flow back to your AI-powered application. Thank you, Ricky. Just to follow up from that one: can you use your own on-premise LL... sorry, do you know what, that's really tough to say, isn't it? LLMs. Yes. So ThoughtSpot itself is always going to be cloud hosted; it's always going to be a SaaS offering, so that part you host in the cloud. The data itself, as Matthew mentioned earlier, sits in something like Databricks, Snowflake or Google BigQuery; we predominantly work with the large cloud vendors. So ThoughtSpot runs in the cloud, co-deployed in the public cloud, like AWS or Google, as closely to your data as possible. Now, the agent application, if that's the question, can that be locally hosted and locally powered? Absolutely. And we'll set up the right network protocols to ensure that the data being retrieved and fetched travels over a dedicated pipe, using VPNs or bridges or network security, whatever it might be. But, yes, in theory, your application is wherever it needs to be. Fab. Another question: could this also take Databricks as the data store and ServiceNow for support tickets? Absolutely, yeah. ServiceNow themselves are building an MCP endpoint for ServiceNow data, and Databricks would be us, ThoughtSpot, interacting with that. So you've got the two sources coming together: you've got the Spotter agentic offering, which is the best-of-breed structured data search into Databricks, combined with ServiceNow's MCP agent. And so now you can build an agent inside Curator and ThoughtSpot which will say, look, go and tell me what tickets I should be prioritizing tomorrow, and it'll come back with a list of suggestions on which tickets and why. And we have a philosophical question, so thinking caps on, gentlemen: what is the role of human data analysts in the world of AI analytics agents? I mean, I'm happy to take this one as well, but I don't know if anyone else on the call, as this is less technical, wants to chip in. Yeah, I don't mind chipping in with that one from my perspective. Well, hopefully we'll still be around, because obviously I have a vested interest in that question.
But I think for those of us who are data analysts out there, you're probably similar to me in that we spend a lot of our time cleaning data, making small changes to reports and dashboards, exporting stuff to PowerPoint and Excel; we're still doing things that we did ten years ago. I'm hoping that this new era frees up a lot more time for us to not do those sorts of things and actually drive more business value. So I think the role is going to become more important, but more important in less of a technical sense and more in the sense of driving business value. You're still going to need to be close to internal teams, sales teams and service teams to understand exactly what they want, and we're still going to be in charge of building that and giving the context needed for all these applications to work well. But I think, over time, it's going to change to become more focused on driving business value rather than maybe building a dashboard, which I believe still has a place; it's going to be more focused around actually delivering outcomes, which is the fun stuff, in my opinion anyway. Spot on. I wouldn't have put it any differently myself, Matthew. Is there any more? I think there's one more. There is, and apologies for the confusion on my end here. So the question is: can you continue the conversation in Slack? This is referring to Churney sending a notification to Slack, and whether you can chat with the agent in Slack or Teams or whether you have to go back to the embedded analytics. So the idea, like Matthew mentioned earlier, is all about reducing friction for the users. Instead of the users coming to the data, we want the data to come to where the users are. So if that is in the Curator portal, where they carry on conversing with Churney there, then so be it. If the implementation with ThoughtSpot and Curator is pushing this into their corporate Teams or Slack, then let's do it there. A lot of people are now starting to create mobile-first interfaces on top of this, so if you want a notification to go to mobile and continue the conversation there, let's do it there. Slack is the first entry point that we're going to be doing, but the roadmap for us is more and more integration points into wherever the users are. That could also be ServiceNow, which was mentioned earlier: continue the conversation in ServiceNow, continue the conversation where the users are. Amazing. And another question coming in: are there data size restrictions if Databricks is exposed as a data store, like gigabytes or petabytes? So there is an exposure, in the sense that we live-query Databricks, so that Databricks data needs to be made available, all the width and depth of the data needed in order to answer the question. But we're not exposing everything to ThoughtSpot. With most traditional, legacy BI tools, you need to aggregate the data and then shift that data out of Databricks into the tool's own data store. We don't do that. What we'll do, on this high-level question about churn, is break it down into fifteen, sixteen, twenty different little questions that we need to go and answer against the data.
So it will go and ask the data, which might bring back an aggregated result, and it'll take that; then it will ask another question, get an aggregated result, and take that. And so what you'll find is that the data that actually comes back is very small. It's compressed and sent over HTTPS and TLS one point two, so it's all encrypted. The actual data that leaves Databricks and goes back to the web browser, or back to the calling LLM, is incredibly small. Wonderful. It looks like we are at the end of the questions, in which case I think we've given everybody a lot of pause for thought. We're going to get a copy of this recording sent out to everybody, so people who have registered will also receive a copy, and obviously please feel free to share the recording with colleagues whom you think would benefit from seeing it. All that remains is to thank our wonderful panel for being with us today. We really appreciate your time, and I hope you all get the opportunity to go outside and enjoy this fabulous English weather, which is not something we can say very often. Thank you, everybody, for your time today; it's much appreciated. Thank you. Thanks a lot. Thank you very much. Take care.