Jumpstart Your AI Journey with Snowflake

Transcript
Fantastic. Welcome, everyone, and thanks, Rob, for that lovely introduction. Today it's all about jumpstarting your AI journey with Snowflake and InterWorks. I'm Manish Kukreja, partner enablement lead for ANZ. I joined Snowflake about six months ago, but as Rob suggested, I have broad experience working across business domains from telecommunications and healthcare to management consulting. For over four and a half years, I've been applying Snowflake extensively in fields ranging from marketing and telecommunications to finance, fraud, and customer service. So I'm very eager to share some practical insights on how Snowflake can accelerate your AI initiatives. I want to start with the safe harbor and disclaimer statement, as I might make some forward-looking comments in this presentation.

Let's quickly walk through our roadmap for today so you know exactly what to expect from this conversation. First up, I'll set some ground on why this matters now: AI is playing a crucial role in the landscape, and data an even more crucial one. Next, we'll dive into unstructured data insights and explore how to unlock the immense value hidden in unstructured data directly within the platform. From there, we logically move to data-powered AI agents and look at how we can leverage those insights to build powerful, reliable, and sustainable AI agents. And that's the exciting part: I'll show you a demonstration of these concepts in action, so you can see how they practically facilitate your AI journey. Finally, as Rob just suggested, we'll open the floor for Q&A and dedicate some time at the end for your specific questions. So it's a fabulously packed session today with actionable insights. Let's dive right into it.

As you can see here, the world is intensely focused on building an AI-centric future. The driving force is clear: we need to achieve business outcomes. Within this landscape, generative AI has emerged over the past three or four years as a particularly powerful tool. It has the potential to offer capabilities that help businesses augment, enhance, and fundamentally rethink how every function in their organization works. But what is crucial to understand is that this is not the end goal. The end goal is not implementing GenAI. The true destination is transforming our processes so that, ultimately, we redefine how we do business. Essentially, the key lever for this change is using our data more effectively, with AI serving as a powerful catalyst. The potential of AI, and especially generative AI, is undeniable. However, when I speak with partners and customers, organizations are investing heavily in the future and a critical challenge appears, which is what I like to call the value and scale problem. If you look at the line chart on the left, the adoption rate of AI overall has been growing steadily, especially in the machine learning space. Generative AI adoption, however, has been skyrocketing; it's almost a vertical line. Various studies suggest that around forty-four percent of these organizations are already seeing significant value, and at an enterprise level it's adding revenue to the business. So the value is proven: AI needs to be invested in, adopted, and scaled. But this is where the dichotomy occurs: when it comes to scaling AI applications, over two thirds of organizations are struggling.
And this is the gap I'm seeing: the value is there, but the scale is not. The investment is there, but the focus and rigor on data is often missing. This is precisely where the partnership between InterWorks and Snowflake comes in: we enable the data-driven transformation we just discussed, with AI in the mix. Fundamentally, the power lies in connecting the silos. These silos can take many forms: data spread across business units, data in different types, whether structured or unstructured, different architectural patterns like data lake or data warehouse, different kinds of workloads, whether AI, analytics, or data engineering, or an enterprise using multiple clouds or hybrid strategies. These silos are what cause the headaches when scaling AI applications. Doing this well requires consistent governance, reliable performance, and exceptional ease of use, and Snowflake provides all of it through a unique, unified architecture: one fully managed platform. The key advantage is that our product teams are rapidly innovating, and these enhancements roll out frequently. When I joined Snowflake six months ago, the best part was that I started getting emails from the product teams about new features being released almost every day. To put things in perspective, in the last year alone two hundred and fifty or more features were released, which is around five new capabilities every week, none of which require complex tuning or any extra action from customers.

Everything we deliver is grounded in three core tenets: it has to be easy, connected, and trusted. These have remained relevant over many years. Easy, because we provide a managed platform that adapts to your business needs, whether that's BI to machine learning, on-prem to cloud, machine learning to AI and generative AI, or cloud to multi-cloud. Connected, because we want to enable you to join, share, and leverage data, AI models, and applications across the ecosystem, friction-free; it should be as easy as downloading an application from the App Store and installing it on your phone. And trusted, because even though we take care of complexity that spans clouds and data estates, data interoperability and universal governance should scale and always be available to you. That foundation lets you focus on unlocking value, not managing infrastructure.

In our last quarterly results, we shared that over eleven thousand customers now trust Snowflake for their foundational, critical data and AI needs. That scale speaks volumes. Pun intended. And importantly for today's topic, more than four thousand customers are actively leveraging AI to enhance their business decision making, integrating intelligence directly with their governed data on Snowflake. In my role, I've been fortunate to witness many of these successful implementations across Australia and New Zealand, where GenAI is scaling within the customer base. Based on those observations, here's what I want to do next: share the key, repeated learnings, the core factors that consistently make these customers highly successful with their data initiatives.
So we want to increase that first number, the thirty-three percent of customers who are able to scale successfully, and reduce the sixty-four percent who struggle. It boils down to a simple understanding: your AI strategy is your data strategy. The two are inseparable, and together very powerful. This visual illustrates how Snowflake is architected to be the essential center of data, and now it's becoming the center for AI as well. At the core of the diagram, you'll see Snowflake positioned centrally to handle all types of data, whether structured, semi-structured, or unstructured, and natively at that. Within this core we integrate powerful capabilities like LLMs, Snowpark Container Services, and robust data governance, all operating seamlessly across the major clouds. You're not stitching together two hundred different services; it's one unified product. Notice how data comes into the platform from various sources, but also from third-party applications via the Marketplace ecosystem, where third parties share not only their data but also AI intelligence and applications. Which brings me to the point: the governed data that powers everything now also fuels AI at the application layer. It enriches the tools where users go to interact and engage, whether CRMs, ERPs, or instant messaging platforms, all of which have their own AI built in, whether it's Agentforce from Salesforce, Now Assist from ServiceNow, or Copilot in Teams. Each of these application AIs can be enriched, powering high-value use cases like talk to my data, gain insights from my documents, and automated BI. This unique architecture, where diverse data in closed and open formats, structured or unstructured, comes in, gets processed, becomes AI-ready, and is governed centrally with role-based access before powering intelligence out to those applications, is what truly positions Snowflake as the center of data and AI.

Now, how does the platform make this central role effective? It comes down to three principles: easy, efficient, and trusted, delivered through Snowflake's unified platform for both development and governance. Let's first talk about efficiency. Instead of the cumbersome process of moving massive datasets to the models, Snowflake has brought industry-leading, top-tier models to the data; it's the only data platform with OpenAI, Anthropic, Meta, and DeepSeek models all in one place (you'll see a small SQL sketch of this just below). So the models have come securely to the data, and this has reduced the time it takes for applications to go into production. I'm not talking months; I'm talking days and weeks. Cortex Search provides best-in-class retrieval accuracy, more than eleven percent better than OpenAI. Other research suggests Cortex Analyst's text-to-SQL accuracy is ninety-two percent out of the box. That is industry leading, and it's the most difficult thing to crack: generating SQL accurately on top of your data is the hard problem Snowflake solves out of the box. The second pillar is making it easy. We don't want to exclude people who don't have Python or SQL skills, so we offer fully managed infrastructure that simply works, from CPUs to GPUs to containers. And all of these tools are available via code-first surfaces, like Notebooks, or intuitive no-code experiences, like Document AI.
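To make that models-to-the-data point concrete, here's a minimal sketch of calling two different providers' models through the same SQL function. The model names are examples and availability varies by region; the prompt is purely illustrative:

```sql
-- Same COMPLETE function, different hosted models; the data never leaves Snowflake.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
  'claude-3-5-sonnet',
  'In one sentence, what is a data silo?'
) AS anthropic_answer;

SELECT SNOWFLAKE.CORTEX.COMPLETE(
  'llama3.1-70b',
  'In one sentence, what is a data silo?'
) AS meta_answer;
```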
And so Snowflake is truly accessible to anybody in the business. The third point I want to make is trust, and I can't stress enough how important it is. Snowflake is trusted by over eleven thousand customers, as I mentioned. And what happens when data moves inside Snowflake? Governance and granular role-based access are applied directly on the data, and that carries through to the AI models as well, without extra complexity. More recently, we've introduced an LLM evaluation framework within AI Observability, so users get grounded, relevant responses in production GenAI apps. Now, while the platform supports the full spectrum of AI and machine learning, like feature store, model registry, experimentation, monitoring, and alerting, our focus today is generative AI, especially as delivered through Cortex AI.

When we look across the successful implementations, two primary value drivers really crystallize. First, on the left, the ability to unlock unstructured data insights. A huge amount of data is buried in documents, emails, images, survey free text, and call transcriptions. Snowflake makes it drastically easier to extract insights from it, and at scale: you're not making single LLM calls for each translation, sentiment analysis, or classification; you're doing it in bulk. Customers like Nissan and Zoom are applying this capability to millions of records and seeing the benefits. The second key area, on the right, is data-powered AI agents. This is about moving beyond generic AI assistants to agentic systems that deliver accurate, trustworthy, contextual answers. As I mentioned on the previous slide, AI Observability plays a key role in creating those evaluations. Customers like Siemens Energy and Bayer are leading the way here, using conversational assistants to truly democratize how data is used within the organization. Both of these pillars depend on the foundational layers of powerful AI capabilities and governed data: the models are managed from within the platform, which means the data never leaves your organization's security perimeter; the governance layer that data stewards create; the guardrails needed for the AI to reply within scope; and the observability needed to ensure the AI generates relevant, accurate responses. So in the next part of the talk we'll explore these two pillars, unstructured data insights and data-powered AI agents, and I'll also share a couple of demos along the way.

Let's double-click on unlocking unstructured data insights. For Snowflake, this isn't just a theoretical capability. We want customers to think about how they will transform their business, to focus on the value they can get from the platform rather than on infrastructure and complexity. This slide highlights four key use cases, or patterns, that I see. First, as I mentioned, I worked in the call center analytics space, and that's the obvious one. You're surrounded by thousands of call transcripts and chats, and you have to make sense of what each customer said previously and make notes in your tools. Here, Snowflake can transform that raw text into customer need and intent, unlocking opportunities for either a save conversation or a sell conversation.
And this opens up new revenue streams. Moving from call center analytics to social sentiment, which plays a key role in brand messaging: businesses that monitor social media platforms often just get word clouds. But the language, the text, the acronyms, they evolve and change, and these online sources can be used to quantify brand perception and even understand emotional resonance. People share their experiences online in real time, so marketing strategies need to be nimble enough to take advantage of changing perception and appreciation among customers. The third one, which I also mentioned earlier: you get a survey as soon as you finish a call or make a purchase, and the free-text field where people type their feedback goes unused and underutilized. That's where a lot of product innovation sits. How do I improve my customer and product experience so that my product and operations teams stay on top of their priorities, and the things they're working on truly drive NPS or customer satisfaction scores? The fourth is working with metadata. Coupled with generative AI's image and video generation capabilities, this opens up new opportunities to create personalized content at scale. These examples show that the vast amount of varied structured, semi-structured, and unstructured data is both a challenge and a powerful asset for transformation.

Drilling down a bit further, a particularly powerful application we frequently see is gaining a deeper understanding of your product and your customer. The key benefit here is being data-led: the business scales, the processes improve, and what you get is faster time to insight. These outcomes are exactly what customers like Nissan saw. Their marketing team had millions of records that used to take weeks and months to process to understand customer experience; run at batch scale, the time to generate those insights dropped to hours. Or there's Irish Life, who gained a clearer understanding of evolving customer needs. Ultimately, when you have unstructured data, you want to extract the signal from the noise so that your business can grow. That's the example I was sharing earlier.

So how does Snowflake make this so straightforward for customers like Nissan? The key, again, lies in access to LLMs. Now, the maturity of every customer is not the same. Starting from the left: not everybody has AI engineers in their workforce, and not everybody needs to do prompt design or prompt engineering. So we've created prebuilt functions for the most frequently automated tasks: classifying text into categories, sentiment analysis (positive or negative), translation, summarization, even parsing documents via OCR with layout analysis. All of these are available as functions that can be called in a SQL statement or from Python (see the sketch after this passage). Then the second tier: if you're a little more mature and the prebuilt functions don't satisfy the need, you can customize them. In fact, you can fine-tune them, or use models available from partners that are domain specific, with the ability to understand legal jargon or complicated medical terminology. This simplifies how you build intelligence without worrying too much about infrastructure. Do I need containers? Do I need GPUs in the platform? How do I run Terraform scripts? You're not spinning up any of that here.
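To make that first tier concrete, here's a hedged sketch of the prebuilt task-specific functions called from plain SQL. The `call_transcripts` table and its columns are hypothetical; the functions are the ones mentioned above:

```sql
-- Set-based enrichment of a hypothetical call_transcripts table:
-- one query over thousands of rows, no per-record LLM plumbing.
SELECT
    transcript_id,
    SNOWFLAKE.CORTEX.SENTIMENT(transcript_text)             AS sentiment,    -- score from -1 to 1
    SNOWFLAKE.CORTEX.SUMMARIZE(transcript_text)             AS summary,
    SNOWFLAKE.CORTEX.TRANSLATE(transcript_text, 'de', 'en') AS english_text, -- German to English
    SNOWFLAKE.CORTEX.CLASSIFY_TEXT(
        transcript_text,
        ['complaint', 'cancellation risk', 'upsell opportunity']
    )                                                        AS intent
FROM call_transcripts;
```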
But you might want something at the far right of that spectrum: perhaps you've built something in the past, or you've found something in the open source community. With Snowpark Container Services, you can bring any model to run beside your Snowflake data. So, depending on where you are in your journey, you can use prebuilt functions for a shorter time to value; optimize, fine-tune, and prompt-engineer state-of-the-art LLMs; or bring open source models from partners or from Hugging Face directly into Snowflake's secured boundary. This multifaceted approach gives you both ease of use and customization.

Let's pivot to the next key pillar where I'm seeing a lot of our customers using AI on Snowflake. The fundamental goal here is to deliver on the promise of data democratization. And when I say data democratization, I don't mean stats, I don't mean charts. I mean giving users the ability to use data in their decision making. If I'm a marketer, I want relevant marketing data and insights out of the plethora of documents, surveys, and clickstreams I already have inside Snowflake. So these chatbots have to be grounded: they need to understand the worlds of structured and unstructured data, figure out a plan, go find the right information I have access to, and retrieve it with the state-of-the-art retrieval performance I mentioned earlier. Customers like Bayer and Thomson Reuters are genuinely using these capabilities to empower their users with validated data.

Let's explore some examples in a bit more detail. These are the four common patterns our customers are applying today: product research, claims processing, customer support, and supplier management. All of them require a complex understanding of data that is qualitative, transactional, and interaction-based, coming from many different sources. Take claims processing. Users typically submit a claim through an online form, along with their historical claims and images of the incident. So we're talking about form data, historical unstructured text data, and multimodal image data, all of which needs to be processed quickly so we can assess whether the claim can be approved or needs a human to look at it. These examples are what I like to call data-powered AI agents. The concept of AI agents is very interesting, and in the context of data agents it's even more powerful, because when you ask questions of these chatbots and they retrieve data behind the scenes, you want to ensure the data retrieved is accurate and trustworthy and that you have the right access to it. Then the answers are grounded, and the architecture allows you to scale. Looking at this diagram, the architecture starts in the left box, the request box, where a user poses a question in natural language. For example: who signed this contract? Or, how many customers purchased this particular product last year?
The middle layer, the central layer, is what we call the data agents layer in Snowflake. It orchestrates: it figures out what task the user is asking for and decides which tools to use, whether to query data from tables and rows, retrieve data from documents via a search index, or call another service to fetch data in real time. Once the tools come back with answers, the data agent reflects on whether it has received the right, relevant information; if not, it runs the whole process again. So there's planning, execution, and iteration, and on top of that, monitoring and improving based on user feedback, using the thumbs-up and thumbs-down signals or sampling at a later point. This data agent concept then needs to be exposed through the outer layer. Before I get to the right side, let me go a little deeper into the data agents layer, specifically the semantic layer and the retrieval box. Data agents don't operate in isolation; they interact directly with your data. In fact, they understand the business language and the way metrics and aggregations work within your environment. They use specialized tools like Cortex Analyst, which converts text into SQL using the semantic layer, and Cortex Search, which takes a hybrid approach: text-based and semantic search with reranking, provided out of the box. And then we come to that important layer I keep referring to: if fine-grained access controls are applied to the data, the agent adheres to and enforces them when retrieving it. If I have access and ask the agent to retrieve on my behalf, it goes ahead. Finally, the output is generated and sent via APIs to various tools like Slack, Teams, or a React application. We'll see a demo where I've built a React-based application that talks to my data living in tables and in text files.

Just to stress the performance indicators from the research I shared: Cortex Analyst has reached ninety-two percent best-in-class text-to-SQL accuracy compared with other providers, just out of the box, and if you tune and improve your semantic model it goes even further. The search capability is likewise world class. Other retrieval engines use ranking and reranking capabilities, but they're not close to where Snowflake's Cortex Search is with the Arctic models. Having access to both of these leading retrieval engines lets you trust your data without sacrificing efficiency: the less time it takes to retrieve the right information, the faster the job finishes and the lower the total cost of ownership.

This is a snapshot of how everything I've just shared comes together. The aim is clear: power data-led, scalable business processes with faster time to insight. Whether it's getting data from various sources, creating scalable pipelines in batch, micro-batch, or real-time streaming modes, or using Document AI for entity extraction, all of it can be expressed as declarative pipelines, the same pipelines I'd use in ETL processes to build a medallion architecture. And I can use the same concepts of streams and tasks to build pipelines that make unstructured data AI-ready; a sketch of that pattern follows.
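Below is a minimal sketch of that stream-plus-task pattern, assuming a stage with a directory table enabled; every object name here (stage, warehouse, tables) is hypothetical:

```sql
-- Track newly arrived files on a stage (requires a directory table on the stage).
CREATE OR REPLACE STREAM new_docs_stream ON STAGE raw_docs_stage;

-- Incrementally parse new documents into a table, with layout-aware OCR,
-- using the same streams-and-tasks pattern as structured ETL.
CREATE OR REPLACE TASK parse_new_docs
  WAREHOUSE = transform_wh
  SCHEDULE  = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('new_docs_stream')
AS
  INSERT INTO parsed_documents (file_name, content)
  SELECT
    relative_path,
    TO_VARCHAR(
      SNOWFLAKE.CORTEX.PARSE_DOCUMENT(@raw_docs_stage, relative_path,
                                      {'mode': 'LAYOUT'}):content
    )
  FROM new_docs_stream;

ALTER TASK parse_new_docs RESUME;
```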
I've also talked about efficient inferencing. And finally, we're moving away from dashboards to insights delivered into systems of engagement. This end-to-end pipeline operates under robust security, governance, guardrails, trust, and compliance throughout the process. That concludes our look at why customers are leveraging Snowflake for AI.

Now I'm excited, and a little bit nervous, to show you this in action. The demo is based on a quickstart that's available for anyone to try with a Snowflake account; if you don't have one, you can spin up a trial account without a credit card. Just scan this, and you'll get the link to the quickstart. Let's jump right in.

Here I have an application; let me start a new chat. It's running inside Snowflake on Snowpark Container Services. It's a React-based application I've created to talk to data living inside Snowflake in two forms. One is the transcripts, the meeting notes that human sales representatives who have gone into the field typed against each customer. The other is the sales metrics. (I don't know what's happening here... let's see. Oh, there you go.) So the sales data essentially tells us which of my deals were won, lost, or still pending, the product category, who the customer is, and how big the deal is: typical sales data that comes from tools like Salesforce or your CRM, where you capture the opportunity and which representative is working on it, and the notes get updated as customer conversations happen. Both of these are available to this user.

Here I'm playing the persona of a sales manager. Let me start by asking: what can you help me with? For this, the data agent recognizes that it doesn't need to call any tool, neither text-to-SQL nor semantic search; it just answers through the LLM, because the user's intent isn't to retrieve data. But now suppose a new motion has been announced: sell more security products. So I'm interested: as a sales manager, do I have a strong enough security presence in my customer base to set some targets? I want to do some data analysis, but I don't want to write complicated queries or page through endless dashboards just to filter down to the right information. So by just asking the natural-language question, I get the answer. The data agent understood this is a question for my sales metrics table, and even though I only said "security", it figured out that the product line is Premium Security. It understood my question, rephrased it, generated the SQL (I'll show an illustrative version of that query below), ran it, and returned the data to me. I can see some deals are still being chased, but many are closed, and quite a handful of them are won, meaning we do have a presence in security products.
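For illustration only, the generated query probably looked something like this; the table and column names are reconstructions, not the actual demo schema:

```sql
-- Hypothetical shape of the SQL Cortex Analyst generated for
-- "do we have a presence in security products?"
SELECT deal_status,
       COUNT(*)        AS deal_count,
       SUM(deal_value) AS total_value
FROM   sales_metrics
WHERE  product_line = 'Premium Security'
GROUP  BY deal_status;
```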
So now I want to go even deeper. As a sales manager, I want to see what my sales teams are doing. Are they talking about security products? Because this is the motion we're pursuing. As the question is asked, the agent goes deeper and checks customer interactions; it looks at the deals we have won. And I might push further: how about the transcripts? Is there any mention of security products there? This is a multi-turn conversation, meaning it understands and keeps the history as I converse. It now understands that this question requires going to the unstructured side, the table where conversations are stored. And when the response comes back, I get citations as well (I'll sketch how the underlying search service is defined a little later). These are very interesting responses: if we want to inspect one, we can just click on it and see the entire transcript that was retrieved for this question. So some of these customers really are having these conversations with my sales teams.

What I'm interested in now is qualifying one opportunity that came to mind. I want to see the purchases made by a specific company, CloudTech, because they're a prospect: go win them for enterprise security. So I can ask a specific question. Notice I did not say this is my customer; all of that information is provided in the semantic layer itself, and I'll show you what I mean by the semantic layer. When I asked what they purchased previously, the agent went ahead and ran the query, filtering on deal status and the customer CloudTech Solutions. And you can see the use of percentages, all of which comes from Cortex Analyst.

So let's see how this is created in Cortex Analyst. To build a semantic model, you specify which table to use, its dimensions (the different columns you want), and then the facts and time dimensions. The interesting part is the verified queries. Here you can give examples and synonyms based on your knowledge, such as how to do joins across different tables. And you can see this exact question with the exact query. If I want to test whether it really works for other customers, I can ask a question right here in the playground. We go back to our actual demo, find another customer, Pharma Corp, and ask: what was the product that Pharma Corp purchased? And here I might lowercase everything, just to see if it still fetches the right information. What I'm doing is testing the product before putting it into production. Once I'm satisfied, I can just save it, and voila, the updated model is available to the chatbot just like that. Here you can see the product lines purchased by the customer named Pharma Corp, and you can see it resolved Pharma Corp correctly. That's what I meant: these examples teach Cortex Analyst how querying works in your own environment. (They have not purchased anything from us, by the way.)

Let's continue. So I've qualified that CloudTech is a prospect that's an existing customer, but they don't have security. Now, what is actually our win rate on security? Do we have a big enough market, and do we have a good enough product? If most of our customers are using security, it could be because it's a good product. And you can see we win around two thirds of the deals compared to the ones we lose. Again, this is a very specific attribution metric that was created. I'll end with one last question: who is working on this deal? Who is the sales representative? From here, we can go deeper and continue the chain of thought into sales performance: what are the previous wins by this representative, Mike Chen, or what combined deals has he worked on in the past, and so on. The user never leaves this interface; they can continue their thought process and get the relevant output.
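Before we look at how this is built: the transcript citations above come from a Cortex Search service over the meeting notes. A minimal sketch of how such a service might be defined and spot-checked; all object names are assumptions:

```sql
-- Hypothetical search service over the sales meeting transcripts.
CREATE OR REPLACE CORTEX SEARCH SERVICE sales_conversation_search
  ON transcript_text
  ATTRIBUTES customer_name
  WAREHOUSE = demo_wh
  TARGET_LAG = '1 hour'
AS (
  SELECT transcript_text, customer_name, meeting_date
  FROM sales_conversations
);

-- Spot-check retrieval from SQL before wiring the service into an agent.
SELECT SNOWFLAKE.CORTEX.SEARCH_PREVIEW(
  'sales_conversation_search',
  '{"query": "mentions of security products", "limit": 3}'
);
```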
So how is this built? Behind the scenes, the data came in from our CRM systems and landed in an unstructured raw layer, which was then processed to create a Cortex Search semantic index. The existing tables holding raw deal information from the CRM were transformed, using tools like Coalesce or dbt, or streams and tasks, into a sales metrics table, on top of which Cortex Analyst is used. The user then works through a React application running on Snowpark Container Services, talking to a Cortex Agent, which in turn uses the capabilities I've described. So your data never leaves the secured boundary colored in blue, and users access only the data they're entitled to. Due to the shortage of time I couldn't show you the access control mechanisms or the AI Observability piece, but those are also key components of this architecture.

Let me quickly jump to a new private preview feature Snowflake is working on, called Snowflake Intelligence. It again lets users query data from various sources and generate insights, and the new concept of generative UI comes into play: the UI for this chart did not exist, this tabular visual did not exist; Snowflake creates them based on the output. And not just in static form, but in an editable form where users can pick from a dropdown and make updates, and those actions are carried into the relevant source systems. That's a very exciting space to watch. So I'll pause here. A lot to take in. We still have a few minutes, five to seven minutes, I guess. Over to you, Rob.

Awesome. Thanks, Manish. That was wonderful. Lots of great stuff in there. We do have three questions. This one is coming from Ken: when will the DeepSeek model become private preview in Australia? I believe that's what he means by "PrPr".

That's a very requested feature. The guidance I have is that, at the moment, due to a shortage of GPUs in the region, we're unable to roll this capability out here. But if your account supports cross-region inference, which you can enable by talking to your account admin, you can utilize DeepSeek today. The data, again, stays within Snowflake's governed boundary; it's just that, for the time being, it will be processed in a different region.
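For reference, cross-region inference is controlled by an account-level setting. A minimal sketch, assuming an ACCOUNTADMIN role; 'ANY_REGION' is one possible value, and which value you choose determines where requests may be processed:

```sql
-- Allow Cortex LLM calls to be processed in another region when the
-- requested model isn't available locally (account-level parameter).
ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';
```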
Awesome. Thank you. Here's a question from Adrian: from your previous experience, broadly, where have you seen customers generally start with AI? Adrian and his team have started with Document AI extraction pushed into their ERP, but what else have you seen?

The chart I opened the conversation with, around ML adoption, spans ten years or so. Over those ten years, some organizations have built what we call a feature store. If you're one of those organizations with a feature store on your structured data, models like classification and anomaly detection are very easy to build inside Snowflake today. So that's one path. And if you're on the unstructured journey, where you've started using Document AI and you have the skills to build fine-tuned models, that's a perfect way to start as well. But what I'm seeing more and more customers start with is the documents they already have in SharePoint. A feature that was recently GA'd lets customers in all regions not ingest the data from SharePoint, but zero-copy reference it and create indexes on it inside Snowflake with Cortex Search, then use it in real time in chat analytics, chatbots, and agents. So I preserve the data in SharePoint, and when I update the permissions there, those changes are reflected inside Snowflake. You can use it today. And the best part is you're not paying anything extra, because all of these features are bundled and available to every customer. You don't have to subscribe to anything or stitch anything new together. So batch AI and SharePoint documents are the starting points I see most, and most customers in our region can start today.

Awesome. Thank you. And the last question, I think a little bit of a simpler one, from Josh: how do we get to the AI chatbot pictured, from the main Snowflake menu? I think you were showing a demo, and maybe he spotted that AI chatbot.

The application in my demo is hosted, so to get the link I need to run a couple of commands. Because it's hosted on my compute pool, I set up a compute cluster and run the application as a Docker container, and then I get a URL that I can paste into my browser to navigate to it. But if you're asking about Snowflake Intelligence, that hasn't been released yet.

Awesome. Alrighty. So we're just about at time. I want to say thank you to Manish for coming and delivering such an insightful presentation, and thank you to all of you for attending. We'll be reaching out from InterWorks over the next day or two with a bit of a follow-up. We'll have this as a recording, so if you want to share it with your organization, that will certainly be available. And we'll have a special offer for you as well related to AI, so look forward to that. Thank you, everybody, and we'll see you next time.

In this session, Manish Kukreja explains how organizations can leverage Snowflake to jumpstart their AI initiatives. He emphasizes that while AI is essential for achieving business outcomes, effective data management is crucial for scaling AI applications. Snowflake's unified architecture addresses data silos, enabling organizations to unlock insights and democratize data access. The presentation showcases how AI agents enhance data processing and retrieval, allowing users to interact with data using natural language queries. The session concludes with a demo of a React application that simplifies data analysis for sales managers.
