Let's get going because I think we have a really interesting conversation today that we're excited to have. I'm James Wright. I'll be your host today and your moderator for our panel discussion. My role here at InterWorks is to drive strategy. At a consultancy, that means strategy in terms of understanding where the market is going and the direction of the analytics and data businesses and partnerships out there. It means translating that into what impacts that's gonna have for you, our clients, whether those impacts be technological or personal or some combination. And then in terms of driving our strategy here at InterWorks, what partnerships do we choose to lean into? What technologies do we need to learn? What skills do we need to be hiring or training? And we thought it would be fun. I thought it would be fun to run a series of conversations around where we are in the state of analytics in twenty twenty three. It certainly feels like a time of change, of excitement and of opportunity. And so this is going to be the first in a series of panel discussions I'll be hosting throughout the summer here in North America; for our Australian and Asian folks joining, obviously, in the winter. Today's conversation is going to be focused on analytics and BI, or simply put, the consumption layer for data, or maybe the feedback layer back into the business. And this is really nicely timed with us having just attended Alteryx Inspire, Tableau's conference, ThoughtSpot Beyond, and Microsoft's Build sessions, so we'll be able to bring back some insights from those sessions. For argument's sake or discussion's sake, I'm going to draw the boundary for today at the semantic layer, because our next conversation is going to be focused on the cloud data platforms, ELT, ETL, etcetera. And that'll be nicely timed on the heels of attending Snowflake Summit in Vegas and the Databricks conference in San Francisco.
And then depending on the feedback we get from the folks listening, and I hope this is fun for you too, we'll have at least one and maybe a few different sessions tying it all together into a big picture. Our goals here are to digest incoming information, help figure out what's hype and what's gonna create real value, and start looking at the horizons of that value: when are we going to see these impacts, and when do we need to start specifying different actions to adapt to or anticipate these changes? As a quick disclosure or disclaimer as well as an advertisement, InterWorks maintains active partnerships with all the companies we're gonna discuss today, at least the four conferences, as well as quite a large collection of other technology partners out there, because our goal is really to stitch together a best-of-breed approach to analytics. And I think that gives us a chance to customize the right solution for our client base around the world. And the advertising part of that is, we'd love to talk to you about how the general principles and insights we have in the conversation today can and should impact your business going forward. So don't be shy. In terms of today's session right now, just a few call-outs before I move on to introducing our panel. We will have the Q and A function open inside Zoom, so please use that for Q and A, and we're gonna reserve the last quarter of the call or so for that conversation. I'll make sure that I try and manage questions effectively in there. Also, I think we're gonna send out a few polls with some multiple-choice questions for each of you out there. Take a few seconds to click in there if you can. It'll be a great chance to help us adapt what we're talking about based on what's important to you and maybe even adjust that in the Q and A. Okay, so without further ado, I wanna introduce our panel today. Again, I'm James Wright, the Chief Strategy Officer here at InterWorks.
With me, a diverse collection of folks here from the US. Keith, maybe you can kick us off and tell us about yourself and what you do here, and we'll go down the line. Sure, yeah. I'm Keith Dykstra and I'm based in Washington, DC. I've been with InterWorks for about five years now. Started out just as a consultant on the ground and most recently have taken on a new role helping lead our analytics practice. So I work with all of our analytics consultants to figure out what kind of work we should be doing to bring value to clients. Great. Ben Young, maybe from the DC area as well, I suppose. Introduce yourself, I'd love to hear. Yeah. Thanks, James. A little bit further north, just up outside Baltimore, but it's just a quick ride down ninety five to get to DC. But my name is Ben Young. I'm the regional director for the east half of the US, which means I run our go-to-market efforts, including partnerships and major account relationships. I also spend a lot of time talking with our clients about tool selection and strategy. When we're looking at use cases and personas, what tool fits the bill? What should they be looking at? What makes sense from a licensing perspective? What makes sense from an innovation perspective? So I'm excited to bring that perspective today and chat with my friends here. Last but not least, Ben Bausili. Yeah, I'm Ben. I lead our product group here at InterWorks. And so, you know, what I spend my day-to-day thinking about is, you know, how we can integrate the experience around analytics and data and make it better. So that results in products like Curator, as well as our support products that we deploy at customers worldwide. Great. Well, thanks everyone for joining today. And thanks obviously to the nearly a hundred folks on the participant side. I'm gonna turn off the screen. We don't have a lot to share other than just our conversation.
And maybe just to kick off the conversation: we were just at a variety of conferences over the last month or so, and I'd like to talk through four of them: ThoughtSpot, Tableau, Microsoft's conference Build, and then also Alteryx Inspire. And across all of these, maybe as we go around the panel here, can we talk about the next step in the platform, what we heard from these companies in terms of what direction they want to go, and if you have any insights on what that means for what we do now from an implementation perspective or an adaptation perspective, let's call that out. So maybe starting with the newest, certainly the newest competitor here and disruptor, ThoughtSpot. Ben Young, I know you spent some time maybe even keynoting part of that conference, if I remember right. Tell me about your takeaways. Yeah, for sure. And I was able to speak in the keynote there. It was a fun few minutes, which was great. Dropped some dad jokes and references to my kids just to make sure we were having a good time there. But ThoughtSpot was really interesting in the sense that, like James is saying, it's kind of our newest tool that's out there, and I'd argue one of the ones that's best positioned to take advantage of the AI and large language models that are making the news recently. ThoughtSpot has built its product off this idea that it's able to go through and do natural language processing. They like to refer to themselves as Google for data. And so you'll go through and have your datasets ready. You'll type into a search bar, hey, what happened? Like, what were my sales yesterday? Or how does this compare to last Christmas season if it's retail? Right? The promise has always been that they could get there. As we've rolled it out and as we've worked with it being kind of a side-by-side tool with some of our more headline partnerships like Tableau, the promise hasn't really been delivered, in full transparency, on where things are at.
Because if you were looking for that pure natural language interpretation, where it takes your question and gets you an answer back, it kind of fell over, to be honest, and we'd spent a lot of time saying, well, it's not like a Google search, it's an Amazon search, and so you do your keyword search, and that's fine. But when LLMs broke onto the scene with ChatGPT, all of a sudden ThoughtSpot was kind of in pole position to take the lead there, and their conference reflected that. So the big headline announcement from them was ThoughtSpot Sage. That is their AI plugin that takes advantage of GPT. It's on GPT-3.5; GPT-4 is coming down the pipeline soon. And its goal is taking your comments and your questions and translating them to the already very robust keyword and semantic layer inside the product. So in terms of fastest out of the gate, or tools that are able to really apply these from day one, ThoughtSpot really put in a very impressive showing. So not only are you able to ask questions of your data, but you're able to then follow those up and dig further and drill in at any point, which has always been ThoughtSpot's main key differentiator. You don't have to deal with a static dashboard. You're able to then drill in and ask five more follow-on questions. And they're expanding that so it doesn't have to only go through the lens of, like, a Liveboard or a search bar. They announced a couple other features like ThoughtSpot Monitor, which is now available on mobile, where you're able to kind of get prompted on, hey, here are things that you should be looking into. Here's a KPI that's changed, and here's why. Go check this out. They use the kind of pull-versus-push analogy a lot: whereas a BI tool had to pull you in when you're interested in something, they're saying it can actually push things out to you to help you separate the signal from the noise.
So those two, I think, were their headline announcements that they were very excited about. Are dashboards dead? That's funny. No, but it's a real question, right? I mean, I feel like that's part of the zeitgeist right now in analytics: it feels like most of these companies are moving to, or most of what we're looking at in terms of the consumer base is moving from, exclusively the thing I click on in my email in the morning, where I open it and I see the answers and I leave, to a much more diverse experience in terms of how I'm going to consume data. I think we're going to hear metrics, we're going to hear search, we're going to hear semantics, we're going to hear even Ben talking about some technical integrations. It feels to me personally that I don't know that I think dashboards are dead so much as they're no longer a solo act, is maybe the way I'd put it. But I'm curious about the panel's perspective on that larger narrative. I mean, I think you can say that dashboards aren't dead if you call them Liveboards. I think ThoughtSpot has essentially a dashboard, they just call it something different. But I think your point, James, of it's a component of the analytics experience is sort of what we're seeing. Moving away from that singular, I go to a dashboard, I get my insights, to a much more multifaceted way of interacting with data, pushing notifications to you. Yeah, we'll talk about metrics layers, and Tableau definitely released some new features that they're working on at the conference that are related to that effort to not just confine that user experience to a single dashboard as we've traditionally known it. I think ThoughtSpot was probably a little early to the game saying dashboards are dead. I think there's probably always gonna be a use case for seeing some metrics side by side. Like, you know, it's really convenient to have KPIs that are highlighted there.
And maybe which ones I see is more determined by an algorithm, but it's still helpful to say, hey, what's the status of my business? Here it is. I think one thing that will change radically, though, is data exploration tools. A lot of what I've built in my career over fifteen years in Tableau has been tools for people to explore a data set. Here you go, here are the general questions you have. It's not a daily dashboard, but it's for when you have questions, because we didn't have these capabilities. And I think things like ThoughtSpot's search functionality are a better way to do that. But chat functionality also seems like a really good way to go about that: let me ask a question, let me see it, let me reframe it. And I think that's something that ThoughtSpot hasn't talked about, right, is chat or kind of this responsive thing. It's a one-time search. It gives you a destination and then you can use their tool. So it's kind of LLM sprinkled on top of their user interface, and it helps improve it a lot, but not the full vision, I think, of what people think we're going to head towards. And I think when we're talking... Sorry, James. Please go ahead. I'd say when we're talking about features that are there today with the Sage announcement they talked about, that is absolutely the case. And I think ThoughtSpot recognized that there was a gap there too. So they also announced a couple more features that, for the record, are coming down the pipeline. No hard deadlines, classic, like, product manager: we're gonna show a great demo and wave our hand past when it's actually coming. But when we look at something like ThoughtSpot Monitor, which is that piece that sends out notifications or nudges you places, there was a question in the chat saying that sounds very similar to Tableau Pulse. That's where I would line up those two.
But then to Ben Bausili's point, there's also the broader ThoughtSpot search experience that is exciting for people, where you're not only looking for, let's automatically surface some insights, but you can still go through and explore down to the nth degree. It's always been ThoughtSpot's strength that it sits on top of a data model down to the most granular metrics, so you can be querying billions of rows and get down to a single transaction ID. Then also coming back to the chat point as well, they did announce, and there's nothing available with this yet, but they announced Spot, which is their AI assistant that's available in Slack. And the goal with that coming down the pipeline, again, future capabilities, is to be able to, kinda like you're bringing up a GIPHY in Slack, put a slash, type spot, or type at-spot, and then ask it a question of your dataset, and it'll return it in your Slack workflow. So it kind of plays into how ThoughtSpot is very much leaning into the modern data stack: we wanna be able to integrate with everything via API connections, as opposed to being a walled garden where things only work with our specific platform. I'll tell you, and I think we probably need to move on because we have a lot of topics. But I'll say that I think the thing I loved about ThoughtSpot over the last few months is the nowness of their implementation of ChatGPT and sort of that whole cycle, because it really felt like what I wanted ThoughtSpot to be the whole time. And I saw in the chat that Leanne, one of the folks attending this, is going into production with Sage, which is ThoughtSpot's GPT implementation, in July. That's exciting because I certainly think that ThoughtSpot's really caught lightning in a bottle in terms of the ability to implement on a short time horizon, which I know is gonna be a question I have later on when we talk about some of the other things we've heard in other implementations.
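[Editor's aside: the "translate a question into keyword search" idea described above can be sketched as a toy. This is purely illustrative and is not ThoughtSpot's actual implementation; the synonym map, aggregation words, and column names are all made up for the example.]

```python
# Toy sketch of natural-language-to-keyword translation: map a free-form
# question onto the keyword/semantic search terms a tool already understands.
# All names below are hypothetical placeholders.

SYNONYMS = {
    "sales": "revenue",           # business term -> column in the data model
    "customers": "customer_name",
    "region": "region",
    "yesterday": "yesterday",     # recognized date keyword
    "last year": "last year",
}

AGG_WORDS = {"total": "sum", "average": "avg", "how many": "count"}

def to_keyword_query(question: str) -> str:
    """Map a natural-language question onto known search keywords."""
    q = question.lower()
    tokens = []
    for word, agg in AGG_WORDS.items():
        if word in q:
            tokens.append(agg)
    for phrase, keyword in SYNONYMS.items():
        if phrase in q and keyword not in tokens:
            tokens.append(keyword)
    return " ".join(tokens)

print(to_keyword_query("What were my total sales by region yesterday?"))
# -> "sum revenue region yesterday"
```

The real products replace the hard-coded dictionary with an LLM plus a governed semantic model, but the shape of the problem, question in, keyword query against known terms out, is the same.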
But okay, interesting stuff in ThoughtSpot for sure, particularly when we look at it being an "and" conversation with other technologies. One of the technologies ThoughtSpot's often gonna be sitting in the same organization with is Tableau. I know we just attended Tableau Conference. I was in Vegas for, I don't know, my tenth year of Tableau conferences. Keith, I know you were there and you've presented and been a big part of that community in the past. Tell us kind of what your responses are for what you felt and experienced and heard at Tableau Conference. Sure. Yeah, I mean, I think if we have to recap TC23, Pulse is the first thing that often comes to mind. If you haven't seen it, think of Pulse as: you log into Tableau Server, you can click a couple of metrics you're interested in, and it will generate a dashboard for you. You can subscribe to get notifications about changes in that metric. You can ask some follow-up questions. So it's this whole new consumption layer that is sort of divorced from the traditional build-a-dashboard experience. There was a question in the Q and A as Ben was describing ThoughtSpot, and someone said that sounds really similar to what Tableau Pulse is. And as I was sitting at Tableau Conference, I was chatting with people at InterWorks and we were saying, are we at a ThoughtSpot conference or a Tableau conference? There was very much a blurring of the line between those two technologies. And so I think it's Tableau's effort to sort of catch up and also take advantage of a lot of what's happening in the AI space, try to close that gap and have those auto-generated insights as well. But I think the fine print on it is that it won't even be in beta until this fall, and it looks like it's only gonna be available for Cloud customers. And so I think there were a lot of people at the conference saying, this is great, but it might be years before it makes it to my actual implementation. But it was definitely impressive.
There are great things about it. I think it's gonna make customized, personalized metric tracking a lot more scalable than building individual dashboards. But Tableau still has a long way to go, I think, to close that gap and make sure that it's broadly available to the common user. It seemed like one of the reasons why they're waiting till fall is because they're building a new metrics layer. I mean, my understanding is that it looks like a replacement for the current Metrics feature. And part of the requirement there was the VizQL Data Service they also announced, which is really addressing some deep technical debt in the Tableau platform by separating the query layer from the visualization layer. Now they can go after that data and do something like Pulse. So the vast majority of folks here, I suspect, really are Tableau customers; I'd be curious to see the poll results there. But if I'm a Tableau customer today, what do I do based on what I heard this year? What's changing for me? You know, one of the things that was really interesting when I was sitting at the conference: I was talking to a customer, and she was describing the situation on the ground, that her organization wasn't even aligned on what metrics matter and how to define those metrics. And so I told her, none of this is gonna be useful for you if you don't even know what kind of metrics matter. You have to have clear definitions, you have to have data sources that are clean and ready to feed into these systems. And so I think that's the place where most people are at, or if you're at that place, that's where you should start: figure out what matters, define it clearly, get some data sources that are ready to be fed into these algorithms. Because until you have that defined, it's really sort of useless to you. That feels like one of the points of consultation we've had with lots of clients in the conversations around ThoughtSpot.
It feels like it's gonna be happening more and more with the rest of these platforms as well, which is finding a way for an analytics tool to be both a collection of point solutions to point problems, as well as something that bolts onto a general-purpose, known data layer that has known semantics and known relationships. And I feel like Tableau very much started out being the first of those things. Everything Tableau did in the early days was trying to be a Swiss Army knife for any data source you had, for any use case. And now we're seeing BI evolve to being something which is much more about shallow usage on more known data. And so I think Tableau is really moving to embrace both of those, whereas ThoughtSpot, I mean, sort of starts out in a more modern perspective. Maybe I'm wrong though. Ben, didn't mean to cut you off going into that. No, you're totally fine. No, and I think that's right. So that more modern perspective is taking advantage of where data lives and resides these days. So if you look at Tableau's early days, it was, we've got to connect to Access databases, we have to connect to Excel files. We're very much dealing with file-based data sources, and then it matured into on-premise databases and then adapted to cloud data warehouses. Whereas ThoughtSpot, it's actually been around for eleven years at this point, but it kind of languished for a while until they recently said, nope, you know what? We're going cloud first and cloud only. And so they'll, like, qualify you with, what cloud data warehouse do you have? You don't? Okay. Fine. Like, we have solutions, but they're not great. We'll wait till you move that direction. So I think that cloud-first mentality came out across both conferences. With ThoughtSpot, all of their new features are on ThoughtSpot Cloud. Their on-premise product is an afterthought at this point; they're working to move customers over to the cloud.
With Tableau, the balance is a little bit trickier, where all of their new features are on Tableau Cloud, and they're very excited about where those will launch, but they still have a big book of customers who are on premise. And so you'll see not necessarily a first-mover advantage, but a second-mover advantage, where the people who started on the modern data stack, running cloud first, are the groups who will get the newer features faster. It does feel to me, though, that Tableau Cloud is a better story today than it was, and it's making a lot of progress. And it seems like it's first in the feature queue in terms of what they're gonna develop. It feels like it's really growing to the point where, for a lot of even the largest customers or larger Tableau customers, it feels like as good or better an answer for a lot of those deployments, which I don't think I would have said, I certainly said the opposite, five years ago or three years ago. Yeah. And that was definitely a key message at the conference: Tableau said explicitly, we wanna move as many people to the cloud as fast as possible. And so I think they're gonna have a very intentional focus on what's holding people back from making that transition and how do we close that gap? How do we make this feel like a platform that all of our customers feel comfortable switching over to? Because it just makes sense for them to have people concentrated on that platform. I know, looking at our business around the world, that's a big piece of what it feels like we're doing right now: answering that question of when should we and how should we get to the cloud. Okay. Well, let's move on then. Microsoft, the Build experience. Ben, maybe, what are your thoughts there? What did we, what did you hear from Microsoft? Yeah, so I think Microsoft is interesting. I mean, it was not a Power BI conference.
It was Microsoft, so it was much more broad-ranging and was a little bit like drinking from the fire hose, because I think Microsoft is just all in, and I think could very easily be considered the leader in the space as far as how they're pushing their vision. Copilot was a big thing for them. So we've seen Copilot as their AI technology that was introduced with GitHub at first, then it showed up in Bing, and now it's in the Office products. And of course, it came to Power BI as well. And so I think one of their visions is that everyone needs a silent partner, right? In software development, we call it pair programming, right? Hey, have someone who can help me; I'm a better coder if I have someone questioning me, giving suggestions, stuff like that. And I think they're seeding that technology everywhere. And so in Power BI, that can mean writing code: it's gonna help you get your query, your DAX, your calculations. And we saw that in some of the other tools too, with Tableau, or in ThoughtSpot with it suggesting synonyms for the data layer. But I think Power BI just showed more. So where I think Tableau showed Pulse as its own kind of set-aside experience, and then showed something that will come out next year in preview with generating a single chart, Power BI was showing off generating entire dashboards, and that preview is happening now. So they just seem further down the road in their execution. And of course, a lot of this is devil-in-the-details, right? How does this actually work? How clean does your data have to be? How good do the descriptions have to be for it to work? But, you know, the things they were showing of generating an entire dashboard based on a set of descriptions, being able to iterate on that dashboard design within a natural flow of dialogue, changing out pieces of it, adding narration to it, changing the style.
The thing that maybe blew my mind the most is pointing at another Power BI dashboard and saying, hey, could you use the same style? That sort of thing. I mean, we spend a lot of time trying to get Tableau dashboards to look the same across things, so having that sort of shortcut is pretty amazing. Certainly it feels like we've had a growing and large subset of our customer base that's active in Power BI. I think even in the poll results, I just saw maybe half of the folks here on the call have at least Power BI and maybe Tableau, for example. I think we originally saw it as a sort of good-enough answer, right, at a low price point. It does feel like there's a change; they're rounding the corner on being a fully featured platform, maybe one that's focused a little bit less on the beauty of the experience, or the visualization experience, let's say. But it feels like the pieces are slotting together, whereas, you know, five, eight years ago we would have said they have Power Pivot, they have Power Query, they have Power BI, and these things are completely independent. Does that feel fair, Ben, in terms of, you know, where we think Power BI is fitting in for most of our customers today? Yeah, a hundred percent. With Curator, we overlap with Power BI quite a lot. Yeah, yeah. And we embed Power BI now, and we do see that in a lot of customers as far as crossover. But I think the interesting thing, I mean, you mentioned Power Query, Power Pivot, Power View, they were separate tools, separate experiences. And Microsoft has kind of relentlessly said, no, we're building this BI platform and we're going to do the integration. And they've kind of just slowly marched down and done that. And we saw them run the same playbook with Microsoft Teams in taking on Slack, and Teams now has higher market share. And that kind of relentless approach is pretty big.
So I think actually the sneakiest announcement at Microsoft Build didn't relate to Power BI directly, but it was Microsoft Fabric, which is really them saying, hey, we're going to be the integration for all of this. And we're gonna bring the data lake, data warehouse, ETL, and Power BI with visualizations all together, and then make it so that it's one governance, one access layer, and you can do things like take action out of there. So when you start combining that, especially with their vision of AI, I think you have a really compelling view of the world. And traditionally, I think the experience has always been where Microsoft lacked. I always felt like they were eighty percent there, and there were always these things that were just hard. Well, if a lot of that is being done through natural language, you know, maybe that rounds off those corners and makes it good enough. You know, it becomes very compelling. I'd say that's kind of where the two major players line up, right? So on the one hand, you've got Microsoft with Fabric, where everything is aligned, everything is integrated, you have the single login point, and then it's all there together, and it can do all the different things at varying capabilities. And then on the other side of the coin, you have the modern data stack. And it's saying, look, I'm gonna take the full best-of-breed approach to this. I'm gonna go in and say, look, for data modeling and my metrics layer, rather than having a BI tool own that and getting it locked in there, let's use something like dbt to make sure that I have that metrics layer ready and built out. So, like, ThoughtSpot announced that they can now import a dbt model. And they're saying, look, take dbt, run that on top of something like a Snowflake, and then have ThoughtSpot be a reporting tool, and then something like InterWorks' Curator can wrap it all together. And you have these multiple tools that all can talk to each other via API connection.
You don't have to buy the same tool for all of them to work, and that's one approach. It's higher on integration cost, since it's not all part of the same thing, but you are trading that higher integration cost for arguably better performance inside each of them. Microsoft's approach to the platform is, nope, let's just do Fabric. It's all there, and it's all there together. And the real big question mark here is that AI plugin: if we can go in and start asking questions across all of these, does it make that API connection easier? If so, then maybe the modern data stack is a lot easier than it used to be and doesn't require as much of an engineering burden. Or maybe it's, you know, it's baked into Fabric end to end, and so that's where we need to go. To be honest, I don't have a solid answer on which direction wins there, but it's a space we're watching very carefully in terms of how those play out. I don't think there's necessarily a right answer for each group, but those are the kinds of trade-offs that you're looking to make as you evaluate the platform versus the full-stack solution. Certainly, I think it's safe to say that Microsoft, in terms of being a full-stack solution, seems to have some head start in terms of having a believable answer for cold storage in the cloud, warm and organized storage, and now organizing it all into Fabric, all the way up to consumption. It's a really compelling narrative in a way that maybe Salesforce, as a counterpoint, isn't nearly as mature, when we look at: will Salesforce be the home for all of my data? Certainly for CRM data, I think we can all say it's winning, it's amazing. But mostly, I think most of our clients on Tableau are looking at a Snowflake or Databricks or some sort of warehouse or lakehouse strategy behind it.
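[Editor's aside: for readers unfamiliar with the dbt-owned metrics layer mentioned above, a metric defined in dbt looks roughly like the sketch below, shown in the pre-1.6 `metrics:` YAML syntax. The model and column names (`fct_orders`, `order_amount`, etc.) are hypothetical placeholders, not anything discussed by the panel.]

```yaml
# Illustrative sketch of a dbt metric definition (dbt 1.3-1.5 "metrics:"
# block). All names here are hypothetical placeholders.
metrics:
  - name: total_revenue
    label: Total Revenue
    model: ref('fct_orders')          # the dbt model the metric is built on
    calculation_method: sum           # aggregate order_amount with SUM
    expression: order_amount
    timestamp: ordered_at             # column used for time-based grouping
    time_grains: [day, week, month]
    dimensions:                       # attributes BI tools can slice by
      - region
      - product_category
```

Defining the metric once in dbt is what lets multiple downstream tools, a ThoughtSpot, a Tableau, a portal like Curator, agree on the same number, which is the whole argument for the best-of-breed stack above.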
Acknowledging we're tight on time, and I think we could have that conversation all day long, maybe I'll just move us along a little bit to talk about Alteryx, which, Ben, maybe you can tell me about the impact of what you think you heard. Alteryx seems like it's gone through a huge amount of soul searching, maybe, or change in terms of its integration of a new cloud product. Alteryx Server from years ago, I don't think we had a lot of belief, was the future of what cloud analytics looks like. I think Alteryx is still today one of the best answers, if not the best answer, for local, file-based analytics. What did you see in terms of where they're going in the future? Yeah, I think that was a very apt summary. If you look at the history of Alteryx over the last five years: five years ago, it was, oh shoot, we need to go to the cloud, something's happening. There were some failed, aborted attempts to get started, and then they went out and said, you know what, let's try and get there via acquisition. They bought Trifacta, and then they integrated that into Designer Cloud, and now they're saying, oh wow, actually we have something that's really compelling. So the main narrative out of Alteryx Inspire was that they are building out an end-to-end analytics solution. They have some really nice PowerPoint graphics where they're going through and saying, this is where everything plugs in, from your data preparation and modeling side, to your reporting and visualization, to your machine learning, to your automated insights. It has the potential to do everything that's there. Now, have they delivered on that? They're working on it, I think, is the best answer. To summarize that, they launched a new AI tool that's called AiDIN, so A-I-D-I-N, and it's going through and helping out with things; one of the sneaks was auto-documenting your workflows.
Another one was leveraging their Auto Insights tool, which automatically figures out why things are happening inside of your data, to then get that auto-summarized into an email or a presentation that gets sent out. So I wouldn't necessarily call them a dark horse here, because they're not trying to replace BI tools. But for those customers that have data quality issues, and Mark Usher, I saw your question there, what is the answer here? It's kind of split. You either can get it through your data warehouse, making sure that that's being built out with a robust layer and that you have good pipelines and feedback loops there. Or you can take the Alteryx approach and say, actually, I'd rather just have a self-service prep tool dig in with all of these. And Alteryx continues to be best in breed there. The most exciting announcements, the ones people loved, were all around Alteryx desktop. Alteryx Designer Desktop, that historical tool that everyone loves, is getting things like control panels, and dark mode was a crowd favorite that got a lot of applause. But there was also great momentum around Alteryx Designer Cloud, which is coming in and saying, hey, we'll be that next step. We'll be showing off all of our future capabilities, but we're not going to forget everything that we have on premise. Designer Cloud already has roughly eighty percent of the most commonly used tools live and in there. It also leverages the Trifacta engine underneath to be able to schedule out content. And so whereas before, on Alteryx Designer Desktop, you had to worry about, do I get the desktop scheduler? Do I upgrade to Server? Designer Cloud has that built in natively. So it's very interesting to see the dual paths they're taking going forward to try and enable people to have both. And a lot of the breakout sessions were, okay, do I choose one or the other? Alteryx was saying, you know what?
We're not gonna try to force you to do anything. We'll support both going forward and make it so that you can have the best of both worlds. Everything was talking about hybrid deployment rather than one versus the other. I think that's an interesting jumping-off point then, Ben, and I appreciate that insight from the Alteryx conference. It's interesting to jump off to what I'd like to spend the next few minutes talking about, which is the philosophy we see in the market right now around how we will integrate AI. Will it be something like the augmented analyst, where it's going to suggest things that basically make me faster or maybe more insightful? Or will it be, on the other end of the spectrum, a tool that does everything for me? Maybe where BeyondCore was years ago, where it just spits out the answer and says this is what it is. Keith, from your perspective, looking at your history with Tableau Desktop and the analytical perspective, what did you see there that was interesting? Maybe Ben Balsey, what's your perspective on that conversation as well? Sure. Yeah. And I think especially if you're looking at Tableau versus Power BI Copilot capabilities, it feels like Tableau is sort of adding it on, and it's not as deeply integrated into the traditional analytics experience. There was definitely, I think, this collective angst at Tableau's conference among the traditional Tableau Desktop dashboard developer crowd of, what is this going to do to my job of creating dashboards? And I'm not terribly concerned. I feel like the way that they're integrating it is gonna make my job easier. It's gonna allow me to create some things much faster, and probably more reliably than I could do individually, and then hopefully just free me up to do some of the more complicated, more nuanced stuff. Again, going back to a client that I'm working with now, they're still figuring out, well, how do we define a metric like efficiency?
And where is there data that we actually wanna exclude from these comparisons? What is a fair comparison? There's still a place, I think, for the traditional analyst to operate in that space, and then just use AI to augment their capabilities and help them make things faster or make better decisions. But from a Tableau perspective, it definitely feels like something that's being layered on top, and not nearly as deeply integrated as what I think we're seeing come out of Power BI. Yeah, I mean, I think Microsoft is interesting in the sense that they have the dev resources to go all in in all of the places. So they can do the layering on top of the experience, but they can also go down the path of deeply thinking about how to change things. Some of the announcements that don't seem related to this but that I thought were interesting were, like, they announced interoperability of their plugins with ChatGPT. Plugins, if you don't know, give a tool to the AI so it can go and reach out to get Zillow information or something like that. But if you think about their play of making things very interchangeable, I could see a world where you're in PowerPoint and ask for visualizations that are generated by Power BI and embedded automatically. That doesn't seem like a very far-fetched thing for Microsoft to be able to execute. And so I do think it changes where we're doing things, how we're doing it, the accessibility across the board. But I do think the safe thing is, if you think about what it takes to onboard a new person at your company, it's hard, right? You can't just hand them a book and be like, read this. And that's kind of what you need to get AI to be appropriate. And so access to this data makes a lot of difference.
And I think all of a sudden, all of the technical debt we ignore, good documentation, understanding our business, what these metrics mean, whether we have a shared definition across the company, all that stuff matters a whole ton in the AI space. And so if there's one takeaway, it's that customers can start on that stuff now: start addressing technical debt and identifying those areas, because that's the sort of thing we're going to need to make available if you really want this kind of unified AI that can help you do your job. On "customers can start," I wanna play on that phrase for a minute, Ben, and this is an open question to the panel. I feel like in this narrative of what we hear from these companies, ThoughtSpot feels unique in the sense that someone on this call is going to production with their GPT implementation this summer. Looking at the poll results, it feels like most of the folks here aren't there yet. And most of the companies, it feels like, aren't there yet. Am I wrong in thinking that a lot of the improvements we're going to be seeing in terms of the AI implementations may be six, twelve, twenty four months out in terms of our clients feeling value from them in most of these cases, outside of ThoughtSpot Sage? Or is that maybe a bit pessimistic? And if a client wanted to inflect that, what would they do, which I think is probably the important question. Yeah, I'll jump in first on that one. And I'll caveat this: this is something I could easily be wrong about, and in three months I may look very silly for having said it. But I completely agree with that time frame. I think it's more like twelve to eighteen months, to be honest. I think we're at peak hype cycle right now, which is great because people are looking at the capabilities that are there. And ThoughtSpot Sage, we actually have access to its public, sorry, private preview right now. And it's good, but it's a first step.
So we've got a narrative going on in the industry where you have all the possibilities that come out of conferences that people are really excited about, and then the rubber meets the road: who can bridge that gap first? And I think it's still a few months away, at least, for getting there. A big portion of that is what Keith was talking about: getting your data ready and making sure that you understand what metrics are there, because it's very much a garbage in, garbage out scenario. Now, the large language models are getting very good at sorting through a lot of that garbage to understand intent, to where it can come back with what is good. But that being said, nothing we have seen with any of these major tools has been something where I can ask literally any question and have it come back with literally any answer on wide, unstructured, unorganized data. So if we're looking for what we can do now to get ready for all of these AI integration capabilities, it's gonna be focused on: let's get large volumes of data in there to be trained on, but also start putting some organization around it. Not necessarily perfect data models, but definition around these are the things we care about, so that these tools can find things easier and, like I said earlier, separate the signal from the noise. I mean, the thing I'll say, as far as what you can start using right now, is just using it to start your thinking about a problem or to handle a calculation that you're having trouble with. All the demos have things like, oh hey, it'll write regex for you. But we had something internally where someone wanted to create a nice curve on a chart in Tableau and started iterating in ChatGPT, feeding it the error messages it was getting. And they got good results. And it was something that I don't think they would have achieved without a lot more work on their part.
So I think finding ways to do that is important, because it gives you an intuitive sense of where some of the limitations are going to be and how you can approach this. There's kind of an art form to it, too: how do you ask the question, and what context do you give? And I think developing those skills is really useful. As consultants, we know the power of knowing Google really well, right? Sometimes knowing how to find the answer is an important part of the expertise. For me, one of the questions we had in the Q and A was around data quality, and I think it's a really essential question. If I was gonna sum it up: if I have data of indeterminate quality, will the AI, will any of these improvements, help me get answers of improved quality out the other side? We have garbage in, garbage out, or any version of those narratives. But the importance of that question is also that, with pattern recognition by a machine being an essential component of any implementation of this AI, what we need to make sure is that we're giving it good patterns, right? And so data readiness feels like a huge part of the conversation for every client today. And certainly it's going to be, I think, the key to the door for tomorrow. Can I have my data ready and accessible in the cloud or in a cloud lakehouse? And can I have some notion of the model or the structure or what's important in that dataset, to be able to give to either my team or a tool like ThoughtSpot or an LLM or something that's gonna make interpretations based on it? It feels like if you're using Power BI, if you're using Tableau, if you're using ThoughtSpot, and we'll talk about Qlik in a second as well, if you're using any of these modern BI tools, the next stage in their evolution is gonna be predicated upon: can you define an effective semantic layer across your datasets, at least your known datasets? What do you think, panel: agree, disagree?
And if I wanna get started today, how do I start thinking about that semantic layer if I'm all the way back at, this is a collection of point solutions? So first off, completely agree, in that as you poke at everything under the hood, all of these tools were showing off: okay, great, here's what we can surface from metrics, or from your data models that exist, or from some sort of semantic layer that combines those metrics and brings them in. And the AI is just sorting the wheat from the chaff and saying, here's what you should be caring about across all of these things that you're interested in and have defined. Now, there are interesting things out there. Scott, your question in the Q and A: there seems to be a model for let's look at transactional data, and one for let's look at your web pages, right? I can expand that out to where transactional data, like Pulse, or maybe even Auto Insights on the Alteryx side, automatically finds key things. That's looking for what's the biggest contributing factor. You can think of the underlying regressions that are looking at which coefficient is the largest: let's pull that out and surface it for you. Versus OpenAI and Microsoft searching web pages. That's why the Copilot demo that Ben was talking about was so interesting: it took natural language and then it built out a dashboard with some key contributing factors as part of it. ThoughtSpot Sage also has the release where you go in and it shows off your KPIs and says, here's what has happened, here's an interesting change. You click on that and it identifies the five major things that are happening. So I think they're converging, where either it's an LLM talking to an underlying mathematical model to identify the outliers, or it's an LLM that can do both. I think it's gonna be under the hood for end users. So let's call it three years from now.
Again, I'll hang my hat on something that I can look silly about later. We're gonna have a point where these AI models have been trained, or are able to interoperate, in a way where the LLM interprets, underlying solutions find the results, and then it translates that back through an LLM, to where you get all the way back to James's original question of what is happening that makes these things interesting. It's going to be built off of that underlying data layer. So I think it's absolutely crucial. And if we want to say, what can I do today to get ready for AI? It's going to be investing in that lakehouse or that data lake or whatever your preferred approach is, making sure that these amazing accelerators will be able to do that for you. It almost doesn't feel important whether you land that as a dbt metric, as a Tableau metric, or into Collibra or Alation or any of these platforms. It feels like the important part is: stage the data in a place that's cloud accessible, and determine the structure of what's important in there. Any of these systems will be able to derive value from there. Looking at the chat and the questions, I think there's an interesting one here. And Ben Valsa, I know you've spent a lot of time recently talking to our security operations team around data accessibility in the world of GPT and AI. I'll say that one of the interesting things for me that came out of the Tableau conference was actually a chat with, I can't remember his name off the top of my head, the gentleman who started BeyondCore and then sold it to Salesforce, and has now started another company called Aible, which I thought was really interesting. I thought his approach to leveraging AI as a sort of two-stage process was really valuable, because it kept local learnings, kept local security, and then was able to leverage generalized LLMs for the interpretation component of it.
But Ben, maybe you want to just comment, somewhere in the umbrella of, how do we think about security, and also that multistage component there? Yeah. I'll also say Aible was a very interesting demo, and I think they have a lot of interesting tech there. Most of the players here are choosing to do similar things, right? So if you look at the Tableau space, they made Salesforce GPT, which is then the basis for Einstein GPT and Tableau GPT, which is what goes into Tableau Pulse. And what they're doing is basically creating a foundation where they have a special relationship with OpenAI. They can take those models and run them in an environment where the data is not going to go out to the world, and then they can provide some security there. There's also a lot people are doing to make sure that the models can change: whether you want to use one of these open source models, or GPT, or something that comes along like Bard from Google, they could swap those in and out and be using the best of them on the same infrastructure. Aible was, I think, the only one I saw that really did something a little bit different. And partially that could be because they're not a giant company that can create that special relationship with OpenAI. But they were creating a place where you could run these different models, and when it came to the data, they were doing very interesting and complex masking to send the data into the model. So those are all, I think, ways we can protect sensitive things, especially things like PHI. But everyone should know: you should be careful if you're using an API, because that is data that's going into some other system. So it's up to you whether you trust that third party. By default, using the APIs means it's not going to be used to retrain the thing, so it's going to be deleted.
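As an aside, the masking idea mentioned here can be illustrated with a minimal sketch. This is a generic stand-in, not any vendor's actual technique: identifiable values are swapped for placeholder tokens before the text ever leaves your environment, and a local map restores them in the response.

```python
import re

# Hypothetical PII-masking pass: swap identifiable values for stable tokens
# before text reaches a third-party LLM API; keep a local map to un-mask later.
PATTERNS = {
    "EMAIL": r"[\w.+-]+@[\w-]+\.\w+",
    "SSN": r"\b\d{3}-\d{2}-\d{4}\b",
}

def mask_pii(text):
    mapping = {}
    for label, pattern in PATTERNS.items():
        # dict.fromkeys dedupes matches while preserving order
        for i, value in enumerate(dict.fromkeys(re.findall(pattern, text))):
            token = f"<{label}_{i}>"
            mapping[token] = value
            text = text.replace(value, token)
    return text, mapping

def unmask(text, mapping):
    for token, value in mapping.items():
        text = text.replace(token, value)
    return text

masked, mapping = mask_pii("Contact jane@example.com about SSN 123-45-6789.")
restored = unmask(masked, mapping)
```

Only the masked text would be sent over the API; the mapping never leaves your environment. Real implementations are far more sophisticated (entity recognition rather than regexes, format-preserving substitution), but the shape of the trade is the same.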
They're only keeping it for a short period of time; you can refer to their policies for that. So your level of trust kind of depends on that. I personally would not put things that are highly sensitive, like PHI or the personal data of individuals, into that space. But those are things that you should be evaluating with your teams locally. The other thing I'll say, as far as ThoughtSpot and a lot of these: they're actually not exposing the data, just the metrics. We were talking earlier about website versus data AIs. They're all the same LLMs, for the most part, being used here. There's a lot of process that goes on behind the scenes, and a lot of back and forth with the model that the user doesn't see. So in the ThoughtSpot space, they're actually giving a fake table to GPT and saying, create a SQL statement from here. And then from the SQL, they generate the ThoughtSpot language for doing the query, right? And so that data never actually gets fed to the LLM. There are a lot of things happening where it's like, hey, LLM, do something useful for me, create a SQL statement or write some code for me, and then we'll go execute it securely in our environment. So that's the other way that people are getting around these things. It certainly seems like the important thing here is looking at the specifics of the implementation and asking: has this been built from the ground up to take advantage of this and stay secure? Or is this maybe the world of food, like truffle salt, where we felt for a little while we could just throw it on top of everything and it would make it taste magical and better? I don't think that's the way GPT and AI are gonna work for analytics, particularly around sensitive data. I will say that in the next episode in this series, we'll be talking specifically about the lead-up to the semantic layer.
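The schema-only pattern described above, where the LLM sees a table definition and the question but never the rows, and the generated SQL executes locally, can be sketched roughly like this. Everything here is a hypothetical stand-in (including the canned `fake_llm`), not ThoughtSpot's actual implementation:

```python
import sqlite3

def schema_prompt(table, columns):
    # Share only column names and types with the LLM -- never row data.
    cols = ", ".join(f"{name} {dtype}" for name, dtype in columns)
    return f"Given CREATE TABLE {table} ({cols}); write SQL answering: "

def answer(conn, question, table, columns, llm):
    sql = llm(schema_prompt(table, columns) + question)  # LLM sees schema only
    return conn.execute(sql).fetchall()  # query runs locally; data stays put

# Demo with an in-memory database and a canned "LLM" response.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("West", 150.0), ("East", 50.0)])
fake_llm = lambda prompt: "SELECT region, SUM(amount) FROM sales GROUP BY region"
rows = answer(conn, "Total sales by region?", "sales",
              [("region", "TEXT"), ("amount", "REAL")], fake_llm)
```

The design choice worth noticing: the prompt contains metadata, not data, so even a fully untrusted model endpoint never receives a single row. (A production system would also need to validate the generated SQL before executing it.)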
How do we look at lakehouse architectures in Databricks and Snowflake, or in Microsoft as well, and ask ourselves: how do we leverage that technology to better ready ourselves for the next generation of BI? It certainly seems like one of the hot buttons we've hit on in the questions and comments here is data quality. And I think more and more of that is going to start being handled at the database level, rather than in the Tableau world of five or ten years ago, where you could mask it at the analyst level. I think we're gonna see more of a reliance on that being done earlier on. We are just about at the end of our time, and I can see that, for the most part, we've gotten through the questions in the Q and A. There was a question in chat around Qlik. I can't say that I have personally spent a huge amount of time this year keeping track of where they are. They feel very much in the middle of the pack in terms of evolution of the platform. I think we originally saw QlikView being a very interesting product, maybe in the more MicroStrategy sense of requiring a huge amount of pre-investment in that semantic layer, ironically, while having less of the flexibility of the BI two point zero products like Tableau. Qlik Sense certainly looks and feels more like a Tableau or Power BI in its behavior. I don't know if anyone here on the call has anything specific to say about Qlik, or about Pyramid Analytics, which I also see in the chat here. I don't have anything to tell you suggesting that either of those companies is doing anything that would not fit within the general trends we discussed today. Although I'll open up the mic to anyone on our team who wants to comment on either Qlik or Pyramid. So I think, first off, we're a victim of sampling bias here, in that a lot of people don't come to us with Qlik questions. So I do wanna caveat all my comments with that.
But for our clients, at least the ones I've worked with in the past year who are using it, it's been interesting: they use it mostly for this data preparation point that we're talking about. So post-acquisition, Qlik actually became a piece of their ETL for their nontechnical users. They don't have a big data team. And so they were saying, look, we have Qlik in place. Let's just make sure that we're using it to roll things out and make sure that your end users can do that. So their answer to, how do we define the metrics, is: bake it into our BI tool and platform. And you can see similar things happening with ThoughtSpot trying to make their developer layer a little bit easier, with Tableau Prep, and with Power BI and its Power Query, all trying to figure out where they fit inside of that. But in general, if we survey our customers, with sampling bias as the caveat here, Qlik has not come up as much as Power BI and Tableau. So I'm not sure we have a fantastic opinion about it, just due to lack of data. Yeah, I think that's totally fair. Look, everybody, I'll just say thanks. On your way out of the webinar, there'll be a quick survey. Give us your thoughts. Does this format work? Is this useful? Is this interesting? What would you change? I'd love to hear it. Personally, I love this part of the conversation. It's my favorite conversation to have with our clients: what does this direction look like, how do we change our teams, how do you change your teams, how do you change your technology to adapt? So we'd love to hear that. In terms of the quick question I just saw come up, do we have training sessions? Yes, Gary. Reach out to us, or we'll reach out to you if we know how to. We do a variety of high-level strategy education like this, as well as everything down to what buttons do I push in these technologies, and we do that around the world.
So with that, we'll say, look, thank you to our panelists. Thank you, Ben and Ben and Keith. Thank you to everyone for joining us. We hope you enjoyed this, and we hope we get a chance to have another conversation with you.