Alrighty. So let's get going. Welcome. This is our third webinar that InterWorks has co-hosted and co-presented with Informatica. The webinar series we are delving into is how we use AI to build better data governance. In the first webinar, we had Daniel Hein speak, and then Rhys Alex joined us. Today, we have Asek Manutin joining us, and he's going to be speaking about building an intelligent data enterprise with AI-driven governance. There's going to be a lot of talk about master data management as an example of some of the things he'll be covering. Asek is the MDM, or master data management, leader for ANZ, based out of New Zealand. He's been with Informatica for a long, long time, with twenty-five years of experience in the IT industry overall and extensive knowledge of data, AI, governance, building governance communities, and implementing best practices, and he has gotten to work all across Asia Pacific, the Middle East, and Africa. So thank you, Ash, for joining us. Without further ado, the screen is yours.

Thank you, guys. Rob, thank you for that introduction. Hopefully, everyone can see my screen and hear me okay. There was some mention of echo, so hopefully you can hear me all right. So, look, guys, the topic today is about building an intelligent data enterprise. That's the goal. Right? That's the nirvana. Now, as Rob said, I've spent about fifteen-odd years focusing on data and AI in general, but focused a lot on master data, so you will see my natural bias come into play as I talk about it. But to be honest, having been in this industry for such a long time, I've never seen the need for a quality data foundation, built on governance and MDM more specifically, be greater than it is today. It's critical right now. Right?
So, hopefully, today I want to cover some of those topics: what I mean by an intelligent data enterprise, and where AI and some of the modern capabilities we are seeing around us can naturally start to deliver value to us. Right? That's going to be the objective. So, look, pretty much every day you can look at the media, YouTube, X, or whatever platform you're interested in, and there are a lot of changes coming in. But when I take a step back, the key thing I take away from it is the possibilities. The possibilities AI is telling us it can deliver. And the reason I'm harping on that word, possibility, particularly for this audience on this call, those of you representing small, medium, and large enterprises today, is that making it a reality in the enterprise context is something I think we're still somewhat off from. Right? You will see some of the analysts talking about a tremendous amount of effort going in, but the results, or the value, are not quite there yet. And I pin that primarily on one of the core ingredients of that value, which is the data. Now, we've got to keep reminding ourselves: when you look at the likes of Gemini or ChatGPT or Grok, or whichever flavor you prefer, they have been trained on a vast amount of publicly available data. That's critical, because that does not include the data you have on your mobile device or your laptop, or in your data center or your cloud operation. Right? That is protected, private, organization-owned or individual-owned data that is critical to our business function. That's the element that is missing, and as enterprises, us included as an organization, and whichever organization you represent, the focus is to bring value out of that data. And that data is the critical ingredient in driving that value.
But I want to take it a step further, because if you look at some of the results we are seeing in early piloting, and there are lots of organizations piloting with different types of AI, there's something else missing: the reliability of, and the dependency on, that data. That brings me to the next level, which I will call master data, though we can replace that with governance more broadly. This is about building trust in the process as well: not just about delivering value, but making it adaptable, reliable, and repeatable in the organizational context. Right? When I started this journey around master data, or data governance more broadly, it was about getting the reporting right. It was about getting the decision making right, whether you were using predictive AI or other sorts of advanced analytics. But when we are talking about building this intelligent data enterprise, where we are expecting agentic bots running around making decisions on our behalf, it is more critical than ever before to make sure that what those AI components are working on is reliable and that we can trust it. Right? And I think this is where we are seeing a lot more conversation happening on data management more broadly: getting the data AI-ready so we can start delivering that value.

So let me start with an example. Before that, one thing I was looking at in the recent news: I'm a frequent flyer with Qantas, as an example, and there was a data breach. Right? Some personal information was lost. I was relieved to learn it wasn't my credit card and some of the other details, but some personal information was lost. Obviously, this is alarming. A lot of people are looking at it and trying to understand what happened.
It's even scarier when it happens to your financial services organization, for example, or your superannuation account and the likes of it. As your customers trust you with more data, beyond protecting that data, which is almost a given, they are also expecting better personalization and a better experience. Right? But when you try to operationalize that within your organizational context, it often isn't as easy as it sounds. And I want to illustrate that with an example. Say you've made a travel arrangement, you missed a flight, and you're calling your airline or travel agency to rebook your connecting flight. Right? Even if you go back five, six, seven, eight years, we had AI-powered voice services, I wouldn't call them chatbots, that would take the call, try to interpret what your challenge was, and try to redirect the call, maybe to a human operator, to serve it. But sitting here today, looking at all of these agentic AI opportunities, you're thinking: okay, I want to use agentic AI to build on that process, to take this particular situation, which I'm using as an example, and activate a few agents. Maybe it's a notification agent, who will start to notify certain parts of your business; a booking agent, who will make the booking arrangement; and a luggage or baggage agent, who will then redirect the baggage to the right place. And then we're all set. Right? That's what it looks like when you whiteboard it within your organization, and you can replace this with any other scenario that is more contextual and relevant to you. I'm just using this as an example. As you whiteboard it, you go: all right, this is what I want to use. But what does that mean in reality? Right? We have the technical components there.
People are talking about MCP, the Model Context Protocol, and agent-to-agent communication. So the technology pieces are there. But as you start to uncover what you then have to do, you get this hotchpotch. Right? You realize it's not that simple, because I now need to coordinate data that is probably sitting in a multitude of systems in different formats. Maybe the records don't even match each other. Maybe some are in the cloud, some are on-prem, some are in somebody's Excel file. And believe me, I've seen that too: critical customer data or production data in somebody's Excel file or Google Sheet as part of the process. It is much more complex when you start to bring it into reality. Right? That kind of picture, I would go so far as to say, pretty much ninety-nine percent of medium to large enterprises will likely see. So is there a better way, right, in that coordination? This is where the concept of governance, and of bringing the customer master profile into the picture, comes in. If I were able to streamline this multitude of silos that constitutes my organization into something more manageable, something governed centrally, something I now have to communicate with only once, then the agents have a better way to interact and understand where to go and look for the information. This also resolves the problem of identification, in this case of customer data, or of product, service, or location information, because I have a centralized place for these agents to be activated against. So there's a layer often missing at the moment for us to get to that full-on automation, where we can simplify, deliver a much better service, recognize the customer better, and personalize the content so they are satisfied even more.
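To make that coordination idea concrete, here is a minimal Python sketch of a customer master index that several agents can resolve against. Everything here is invented for illustration (the system names, IDs, and `resolve_golden_id` function are not a real Informatica API); it only shows why a central profile beats stitching silos together per agent.

```python
# Each source system knows the customer by its own local ID.
# A governed "crosswalk" maps every local ID to one mastered customer ID.
crosswalk = {
    ("booking", "BK-1001"): "CUST-42",
    ("frequent_flyer", "FF-88213"): "CUST-42",
    ("baggage", "BG-774"): "CUST-42",
}

# The single golden profile every agent reads, instead of its own silo.
golden_profiles = {
    "CUST-42": {"name": "Joe Bloggs", "tier": "Gold", "phone": "+61 400 000 000"},
}

def resolve_golden_id(system: str, local_id: str) -> str:
    """Map a source-system record onto the mastered customer ID."""
    return crosswalk[(system, local_id)]

# The notification agent and the baggage agent arrive with different
# local IDs, but both land on the same mastered profile.
gid_a = resolve_golden_id("booking", "BK-1001")
gid_b = resolve_golden_id("baggage", "BG-774")
assert gid_a == gid_b == "CUST-42"
print(golden_profiles[gid_a]["tier"])  # tier lookup that could drive a lounge offer
```

The point of the sketch: each agent does one lookup against a governed layer, rather than every agent needing its own logic for matching records across eight or ten silos.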
To bring in all of that, we need to work on that layer, which I would call the governance layer, and bring in that centralized capability. I will talk about some of the additional benefits it yields further down the line. But in this case, not only are you able to serve the basics, notify the customer, book a new flight, check the baggage, but now that I have a better understanding of my customer profile, I can go further. Because they had the bad experience of a delayed flight and a missed connection, I might want to offer them access to the lounge, if I can establish that the person has a tier level that makes them eligible, or even offer a better in-flight service or an upgrade. You can start to elevate the experience and add those value-add capabilities because of the streamlined profile you have established. Right? And as I talk about governance in general, I'll keep coming back to this concept of mastering: taking the critical pieces of information about your business, enforcing the processes, bringing technology and people together, and making sure the data is available to the right person, or for that matter the right AI agent, at the right time. Because otherwise, going back to the earlier example, the agent is going to be dealing with a multitude of silos and trying to stitch them together. Right? That's what we want to avoid. So, hopefully, this gives you an idea of what we are trying to solve when we talk about an intelligent data enterprise: streamlining the core data capability to facilitate some of these advanced capabilities we're looking to bring into our business. As I said earlier, I bring it down to this: AI-ready data is what is required here, not just any data.
There is a school of thought that if you can aggregate all of your data and put it into a data lake or a data warehouse somewhere, it will somehow figure itself out. The truth is, it often isn't that simple, because there are process elements to it. There are compliance elements to it. There is being able to bring the data together and interconnect it in the right manner in the first place. There's a whole bunch of other things that come to mind. The way we approach it is to establish three R's of AI-ready data. First, relevancy. In the earlier example, I was showing you the customer you're servicing: making sure you're working with the right customer record, whether it's in the billing system, the baggage system, the frequent flyer system, or anywhere else. If you think about your own organizational context, you will see critical information like customer data, product data, or supplier data spread across different places. Again, as I said, some of it is even in an Excel file sitting on somebody's desktop, or in a Google Sheet somewhere. Right? How do you make that data relevant in the enterprise context, going beyond those individual silos? Second, responsible AI, which I think is getting more and more important as more regulations come into play. If you want to let AI loose on your enterprise data, you do have to make sure you're complying with the law. For example, if you're capturing data for a certain purpose, you need to ensure the AI is using that data only for that purpose as well. Right? So you need to have those governance policies already in place before you bring AI on top. Otherwise, you have to go to the effort of enforcing them in each and every element of AI you bring in to make your enterprise more intelligent. And third, being able to deliver it in a robust way at enterprise scale. So let's face it.
We are doing all of this to deliver better value for our customers and stakeholders, to grow our business, to build more products, to deliver more services. So you want to scale it, and you want to do that as part and parcel of everyday operations, without having to run a three-year transformation program every two years. Right? That's how we look at enterprise data governance. It is about encapsulating your current business processes, fixing the challenges that are there where applicable, but really wrapping around those processes. Because when you don't do that right, what happens is your adoption fails. You go through the transformation, you try to bring in all of this newer capability, but either the trust isn't there among your stakeholders and employees, or your customers start to distrust it when an event happens: a data leak, an issue around bias, or whatnot. People start to trust it less, and you fail in adoption. So you want to make sure you have AI-ready data that supports all three of those elements. Now, we did a survey of CDOs earlier this year, and as I was alluding to, I don't think there is any organization that is not dabbling with AI at the moment. Most organizations are trying it in various forms, many are in pilots, and some are in production to some extent. But the vast majority are concerned that these AI pilots are taking place without addressing the prior challenges, some of which I alluded to earlier: data is spread over silos, you have data quality issues, and, obviously, cybersecurity is a big issue when you're making some of this data accessible.
Privacy compliance: just because AI is looking at the data does not mean you can use data collected for one purpose for some other reason. Right? So you have to align with those governance principles as well. Responsible use of AI: bias has been one of the most common concerns. In a general sense, as you bring in AI, you want to make sure the outcome is a better experience for the stakeholders, the customers, your employees, the management, everybody. And the reliability of results. Right? AI is often touted as having an answer for everything, but is that answer reliable? Is that answer repeatable? Do you have visibility into it? Do you know what training data was used to create that outcome, and is it aligned with your organization's values and the principles you're trying to bring to bear? And then the overall quality of the data, which is a major problem. Even in today's day and age, with all the technological advancement we've seen in the last couple of decades, data quality is still one of the biggest challenges stalling these progressions. So there is still a lot of concern, and a lot of what I want to talk about today will focus on addressing some of these. At Informatica, together with our friends at InterWorks, this is what we focus on. We want to bring trust and credibility into the process. We want to be able to say: if you are introducing a new operational process built around the intelligence of AI, you have visibility into how those decisions are made. You understand the training is based on a certain set of data which has gone through validation checks, etcetera. You're reducing bias. You're lowering the risk of exposure, whether that's privacy concerns or data access rights. It's baked in as part and parcel of the process.
When you're doing those things right in the process, you're going to help improve the reputation of your organization. Not only will you be seen as easy to do business with, because you're making this capability available to your end users, but you're also doing it in the right way, without compromising certain aspects of it. And I touched on reliability earlier. The way we look at achieving that is through our IDMC platform, the Intelligent Data Management Cloud. Some of my colleagues, if you attended the previous webinars, will have touched on it as well. But I want to take a broader look at it. Obviously, I will focus on the governance and MDM aspects, which are about matching, merging, and bringing data together, creating that central profile from the example I was showing you. But it also comes with other capabilities, like data cataloging: understanding your data. If you struggle to understand your data, I can assure you your AI will too, to some extent. Right? If you have named your data components in different ways in different places, AI will have difficulty consolidating them unless you figure out a way to bring them together. That hotchpotch I was showing you: you need to be able to bring it all together and then make it accessible through data integration, APIs, or app integration of different sorts. And then you start to bring in agentic AI capability, something we will have available later in the year as well, on top of the data platform, not as a side project somewhere, but on top of the data platform. That way, all of the checks and balances you're putting in place through this platform are honored, whether it's a human agent interacting with it or an AI agent. That's the level of capability you want to bring to bear. And then overseeing all of that is CLAIRE, which I will touch on and explain.
CLAIRE is our AI engine that brings capabilities like Copilot, to help you do things better, faster, and quicker, recommending new ways of looking at data or addressing data issues; agentic AI, as I touched on; and even a GPT-style experience, like ChatGPT, so you can converse with your data in natural language. Right? But the core of it, and this is the real differentiator, is built around what we call the metadata system of intelligence. Going back to the intelligent data enterprise: I believe the intelligence comes not only from the data you're working with day to day, your product information or the customer data you're capturing, because you're not always talking to the customer face to face. You are sometimes, but not always. It is also how you're reacting to that data as it comes across your desk, your laptop, your iPad, or whatever device you're using. AI has the opportunity to learn from that as well. If you're applying a certain type of data quality rule to a certain type of data, that's an opportunity for AI to learn. And the next time it sees data like that, it can say: hey, by the way, this is similar data; you applied the masking rule before; I recommend you do so here again, because this is credit card information. That's the type of intelligence you're looking for the platform to deliver, and it's possible when those interactions with the data are collected centrally in a system of intelligence. As I said, I'm quite biased toward master data, and that's because I believe master data, by definition, is the most critical element of your enterprise. That's where the bulk of your work happens. So if, for example, your organization has a view toward, or a plan toward, customer centricity, it starts with the information you capture about your customer. Right?
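The "learn from a steward's action, then recommend it" idea can be sketched very simply. This is an illustrative toy, not how CLAIRE actually works internally: the regex heuristic, the `apply_masking` rule name, and the learning store are all assumptions made up for the example.

```python
import re

# Heuristic: does a column's sample data look like credit card numbers?
CARD_PATTERN = re.compile(r"^\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}$")

def looks_like_card_number(samples):
    """Classify a column as card-like if most sampled values match the layout."""
    hits = sum(1 for s in samples if CARD_PATTERN.match(s))
    return hits >= len(samples) * 0.8

learned_rules = {}  # data class -> rule a steward applied previously

def record_steward_action(data_class, rule):
    """Capture an interaction in the 'system of intelligence'."""
    learned_rules[data_class] = rule

def recommend_rule(samples):
    """Recommend the previously learned rule when similar data reappears."""
    if looks_like_card_number(samples) and "card_number" in learned_rules:
        return learned_rules["card_number"]
    return None

# A steward masks one card-number column; the platform learns from it...
record_steward_action("card_number", "apply_masking")
# ...and recommends the same rule for a similar column seen later.
print(recommend_rule(["4111 1111 1111 1111", "5500-0000-0000-0004"]))  # apply_masking
```

Real metadata intelligence would use far richer signals (profiling statistics, lineage, prior rule assignments) than a single regex, but the feedback loop is the same shape.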
Having it complete, having it deduplicated, having it error-free is going to be the goal, so that if you're recommending an offer or a product to your customer, it's relevant to them. Right? If you have three different versions of the record, a couple of them with errors like a wrong postal address, then even if you get the interaction right, you'll probably deliver it to the wrong address, as an example. So having that concept of mastering is critical. And underneath that platform, you will see it requires a whole bunch of capabilities for the mastering itself: matching, linking, and survivorship, to start with. But you also have to make sure the data is coming into the system. You need to be able to define what the data is: your data modeling, the cataloging of it, managing hierarchies. I'll show you some examples later on. And you need to be able to build workflows. One example, and I think a customer I'm talking to right now is in the process of implementing this, is where customer onboarding or product onboarding takes three or four weeks. In fact, in one case, a health care customer, their product onboarding took six months. That's because the onboarding process, after they built the product and decided to launch it, had to go through five to ten different systems. Some of them were Excel files and spreadsheets sent by email. You're following up through email to get approval, and then, a couple of chases later, you're finally going and creating the record in three other systems. They're now replacing that with an MDM-based workflow, so there is a single user interface to capture data from the various parties and stakeholders, seamlessly interacting with those systems behind the scenes, so you don't have to do that stitching together. Right?
And it reduces a process that takes weeks or months down to days, or hours in some cases. Right? There's huge benefit in optimizing that. Now, it doesn't mean you get rid of those eight or ten systems. Maybe you will get rid of the spreadsheets, but the core systems you're using are still there; you're just creating a layer of interaction in between, an intelligent layer, to do it seamlessly. From your users' perspective, they're now interfacing with only one environment and managing the rest from there. And this applies broadly; by the way, we call this multi-domain MDM. So you can manage customer data, product data, supplier data, or reference data. I'll share some examples of those as well. It could be finance data: your GL, cost codes, some of those complex spreadsheets I've seen finance teams and CFOs going through, streamlining and rationalizing codes across Excel sheets. It could be location data, or, if you're in manufacturing, material data. These are applications built on top of the platform. Then we have industry-specific and application-specific applications. For example, Salesforce integration or SAP integration, where you're seamlessly connecting to your SAP environment, onboarding a product or customer in SAP, Salesforce, or MDM respectively, but making sure all three are connected underneath, right, so you have the same consistent data across the board. And we bring along the context of the industry as well. If you're an insurance or financial services company, there's a data model built for that; if you're a life sciences or health care company, we can support you there too. So it's about combining industry knowledge with the applications, and then streamlining those workflows I talked about. That's how you can achieve this. Right?
It simplifies the interfaces, making it easier for your stakeholders and employees to engage with and reference that data effectively, and to deliver the products and services they're required to at the time they're needed. Right? That's going to be quite critical. So let me expand on this a little. I've been talking about master data broadly, but generally we break it down into two types: reference data and master data. And then the third one, obviously, is transactional data. Reference data is data like your country codes, city codes, and currencies. Oftentimes, if you get it wrong, it doesn't necessarily create a huge challenge, except when you're reporting, for example. When you build a beautiful BI report at the end of the day and say, show me my breakdown by cities and countries, and that data is captured differently in different systems, it creates a whole bunch of chaos. So this is an area where we would bring in reference data management, or master data management for the reference data domain. Then your master data is your people, your organizations, the customers you're tracking. It could be employee information, product information, location information. This is critical information that you care about. And the rest is what I would call transactional data: the price, the discount, the date the transaction took place, the payment type, and all that kind of stuff. Right? Now, just to give you an example: say I was a customer and I went online and made a purchase. In this case, Joe Bloggs made a purchase, providing a phone number, tapping in an address, and buying an espresso machine for four hundred and fifty dollars. That's how I'll break it down.
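The reference data piece, standardizing values like state codes so reports roll up correctly, can be illustrated in a few lines. The mapping table and `normalize_state` function are invented for this sketch; a real reference data management tool would hold governed code sets with stewardship around them.

```python
# Governed reference data: every variant a source system might capture,
# mapped onto the standard code the enterprise reports against.
STATE_CODES = {
    "nsw": "NSW",
    "new south wales": "NSW",
    "vic": "VIC",
    "victoria": "VIC",
}

def normalize_state(raw: str) -> str:
    """Map whatever a source system captured onto the governed code."""
    key = raw.strip().lower()
    # Unknown values pass through untouched, flagged for stewardship review
    # in a real implementation rather than silently guessed at.
    return STATE_CODES.get(key, raw)

# Two systems captured the same state differently; both report as NSW.
print(normalize_state("New South Wales"))  # NSW
print(normalize_state("NSW"))              # NSW
```

With this in place, a "breakdown by state" report sees one NSW bucket instead of several spellings of the same state.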
Now, part of this is reference data, because I'm entering information like state, country, or location. And then I have the name, address, and phone number over here, which is master data. But then the same person, in this case Joe Bloggs, turns up in the store a couple of days later. This time, the name on his license is Joseph Bloggs, or that's the information he provides, but there's no address information. And this time it says New South Wales, not NSW, right, and he buys espresso capsules for thirty-five dollars. The point of this example is that, typically, we end up with two different records. Right? There's no easy way for us to connect them. But there is an opportunity: for example, when he's in the store, if I can identify him as the same person who just bought an espresso machine, I could offer some additional kit for the machine at that time, or provide a discount for buying additional capsules, or something along those lines. Right? And that is what I would call the golden record, which enables us to do exactly that: essentially combining these two pieces of information, which often sit in two distinct systems in most organizations, and establishing that single view. It not only establishes that this is actually Joseph Bloggs, with this phone number and the full address, but also manages the reference data, so I will now be tagging the record as NSW. No matter where I do the reporting, I will end up with one Joseph Bloggs who made two purchases and is based in New South Wales, represented here as NSW. Right? That's a simple illustration of how master data helps not only improve reporting quality, but also serve the AI agents we ultimately want to streamline our processes, to make things easier and faster, to deliver more and deliver better; they would require that level of assessment.
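The Joe Bloggs scenario above can be sketched as a tiny match-and-merge. This is a toy, not a real MDM match engine: the similarity measure, the 0.7 threshold, and the "longest value survives" rule are all illustrative assumptions, and production survivorship rules are far richer (source trust, recency, data quality scores).

```python
from difflib import SequenceMatcher

# The two records from the example. State is shown already standardized
# (the reference-data step), so the merge focuses on identity and completeness.
online = {"name": "Joe Bloggs", "phone": "0400123456",
          "address": "12 Example St", "state": "NSW"}
in_store = {"name": "Joseph Bloggs", "phone": "0400123456",
            "address": "", "state": "NSW"}

def name_similarity(a, b):
    """Fuzzy similarity between two names, 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(r1, r2, threshold=0.7):
    """Link two records when the names are similar enough and phones agree.
    The threshold is where industry governance rules come in: health care
    would demand near-exact evidence, retail marketing can relax it."""
    return (r1["phone"] == r2["phone"]
            and name_similarity(r1["name"], r2["name"]) >= threshold)

def survive(r1, r2):
    """Toy survivorship: for each field, keep the more complete (longer) value."""
    return {k: max(r1[k], r2[k], key=len) for k in r1}

if is_match(online, in_store):
    golden = survive(online, in_store)
    # Golden record: Joseph Bloggs, shared phone, full address, state NSW.
    print(golden)
```

Both downstream reports and downstream agents now see one Joseph Bloggs with two purchases, instead of two strangers.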
And by the way, this is not automatic, as in: oh, we just decided Joe Bloggs is the same as Joseph Bloggs. Depending on your industry, you will establish how you make that determination. If you're in the health care industry, for example, you'll have very stringent rules for making those linkages. Whereas if you're a marketing agency selling espresso machines, say, you can probably be a little more relaxed. Either way, you need to apply your own organization's governance rules to get that level of outcome and, in the process, improve your analytics, improve your AI, and bring in newer technology. So, focusing on customer data and staying on that topic, that means ensuring you have the ability to connect to all of your data sources, wherever they reside. This is one area where Informatica stands out, because we have thousands of connectors to many different systems, data formats, and file formats, many in batch as well as in real time through APIs, to access the data and then make it accessible to all parts of your organization. Now, this is where governance becomes important, because you may not want certain information made available to certain people. Right? Credit card information, for example, may be available only to the few finance people who might be interested in the mode of payment; otherwise, we shouldn't even store it. These are the types of policy considerations you want to employ from a governance perspective. So: building what we call the single view, the profile, and bringing that level of trust through enrichment.
I'll show you an example of that later on: enriching the data to build trust in it, and then ultimately bringing in the three-sixty view, which expands beyond the customer to their interactions with your products, the suppliers who supply those products, and the marketplaces, online channels, or other channels where you're making those products available; bringing all of that together, right, and making that master profile accessible to the various consuming systems. So you enrich your warehouse. You enrich your data lake. When you're now reporting on that Joe Bloggs purchase, you don't see two separate entries; you see one person who made two purchases through two different channels. So when you start to apply predictive analytics, you get a far different outcome compared to before, when we couldn't even relate the two. Right? That's the level of improvement we're seeking to deliver in terms of decision making and, for that matter, automation. Now, if I click into the product story a bit more, the same thing applies. Master data isn't just customers; it can be applied to products, suppliers, and other things as well. And when it comes to products, especially if you're a medium to large organization, you're often dealing with hundreds, if not thousands, of products. You are trying to ensure the same product description is available online and offline, through your product catalog and at the storefront, and that they're consistent. Right? If you're manufacturing it, you want to make sure what's on the packaging is what's on the online channel. It's a significantly complex process. You also probably have to identify where the supplies are coming from. By the way, we hear about tariffs and the like.
That knowledge, that relationship of where the products are coming from, which route they're taking, who the suppliers are, and how a tariff impacts pricing, that knowledge is often missing. Right? So you wanna be able to consolidate some of this information. Your product might be augmented or packaged or distributed by a business partner, and they might have to put in additional information. Right? All of that comes in and, again, goes through that governance process: you enrich it, you collaborate among yourselves, you make sure you have the right information available to the right team that's dealing with it, and then you push it out in a consistent manner across all of your channels, no matter where they are. So the information that is on your website should be the same as on your social media, and should be the same as what the product packaging says on the package itself. Right? Making that consistent and making that process available, that is very much a governance function, a master data management function, from a product perspective. And then, tying this together, I've talked about customer and product, but as I said, it can be expanded to anything else you might deal with. We have had customers dealing with an employee master, a student master if you're an educational institution, or a product master, whether that's financial services products, retail products, or manufactured products. Tying it through a governance process brings in capabilities like a recommendation engine. I was giving you an example: based on the type of data you are entering, you want your intelligent platform to be able to recommend what sort of data quality rules you need to apply to make sure it's consistent as the data goes into your platform. Right?
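As a toy illustration of that recommendation idea, here is a sketch that inspects incoming values and suggests which data quality rules to apply. The rule names and patterns are my own simplified assumptions, not the platform's actual recommendation engine:

```python
import re

# Hypothetical rule library: each data quality rule is just a pattern here.
RULES = {
    "email": r"^[^@\s]+@[^@\s]+\.[^@\s]+$",
    "phone": r"^\+?[\d\s-]{7,}$",
}

def recommend_rules(sample_values):
    """Suggest data quality rules whose pattern matches incoming sample data."""
    suggestions = set()
    for value in sample_values:
        for rule, pattern in RULES.items():
            if re.match(pattern, value):
                suggestions.add(rule)
    return suggestions

print(recommend_rules(["joe@example.com", "+64 21 555 0100"]))
```

A real platform would infer this from profiling and metadata rather than two regexes, but the shape of the idea, "look at the data, propose the rules", is the same.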
You wanna create workflows, as in the example of reducing multiple weeks or months down to days or hours, to automate and streamline that process. And there are various other KPIs that, once you have that singular view of your data enterprise, the three-sixty view, you are able to derive. Otherwise, you are not gonna be able to. But, again, as I said, this is not some random mishmash of data where we're trying to predict what it is. It is applying your processes, your governance, your regulation, depending on the industry you're in, all of that, and then making sure it is accessible through a data marketplace, for example, to share the data. It's no good capturing all of this information, mastering it, and spending a huge amount of time and effort, but then not making it available to all the users at the time they need it. And I talked about compliance and reporting. It could be regulatory reporting. ESG is a good example of how master data can help create an environment for automation and process collaboration within your enterprise to capture things like ESG data, building it into your business process. For example, if you're a manufacturer distributing various products, ultimately your ESG calculation or ESG rating will depend on not just the product you build, but also where you're getting the supplies from, the ESG rating of your suppliers, and the ESG rating of your distribution channels. So capturing that in a consistent manner, in a singular interface, to facilitate that reporting whenever it becomes mandatory, would be critical.
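A minimal sketch of that ESG dependency, assuming a simple weighted roll-up (the formula, weights, and scores are illustrative, not a real ESG methodology):

```python
def esg_rollup(own_score: float, partners: list[tuple[float, float]]) -> float:
    """Effective ESG score: the organization's own score plus weighted partner
    scores. partners is a list of (partner_esg_score, weight) pairs whose
    weights sum to at most 1; the remainder is the own-operations share."""
    partner_weight = sum(w for _, w in partners)
    return own_score * (1 - partner_weight) + sum(s * w for s, w in partners)

# 60% own operations, 25% a supplier, 15% a distribution channel.
score = esg_rollup(80.0, [(60.0, 0.25), (70.0, 0.15)])
print(round(score, 1))  # 73.5
```

The takeaway is structural: the rating depends on mastered relationships to suppliers and channels, so those relationships have to live somewhere consistent.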
But, again, if you have ten different systems where you're capturing this information, or you're trying to audit and manage it across them, you will save yourself a significant amount of effort and energy down the road if you have that centralized mastering concept from day dot, which then allows you to bring in automation on top of it, not on the side of it. I'll quickly touch on CLAIRE, which is our AI engine. That is the one informing our users and customers, the Copilot, in terms of prompting, recommending, providing insight, etcetera, but also being able to interact through natural language. Now, as I said earlier, the reason this is possible is because it is learning from your interaction with the data. There are about ninety-seven trillion customer transactions per month taking place (I think this figure is slightly dated, from the last time we captured it), and that is generating petabytes of metadata, not data. It's not learning from the data; it's learning from the metadata. And that is helping us provide the right recommendation at the right time. And if you extend that, as I was mentioning, CLAIRE GPT allows us to converse and communicate with your data in natural language. So rather than relying on a report or a graph, you can now inquire about the data in language and ask it to generate answers. We've seen ChatGPT and Gemini do that, but this time it is based on your data underneath it. Right? But what's important is that as you ask those questions and get the answers, it is also ensuring it is aligning with and abiding by your governance policies. So if you didn't have access to that data because your access controls restricted it, CLAIRE GPT will not leak information about it accidentally. It will certainly not give you that information. Right? So you're able to bring in that capability of AI, having that conversational capability, without having to go through a separate lockdown.
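The governance check described there can be sketched as an entitlement lookup that runs before any answer is returned. The record, roles, and field names below are hypothetical, and this is a toy stand-in for how such a policy check works in spirit, not CLAIRE GPT's implementation:

```python
# A mastered record and a hypothetical entitlement table mapping roles to the
# fields they are allowed to see.
RECORD = {"name": "Volvo Cars", "credit_card": "4111-****", "revenue": "1.2B"}
ENTITLEMENTS = {
    "finance":   {"name", "credit_card", "revenue"},
    "marketing": {"name", "revenue"},
}

def answer(field: str, role: str) -> str:
    """Return a field's value only if the asking role is entitled to it."""
    if field not in ENTITLEMENTS.get(role, set()):
        return "ACCESS DENIED by governance policy"
    return RECORD[field]

print(answer("credit_card", "marketing"))  # policy blocks the field
print(answer("credit_card", "finance"))    # entitled role sees the value
```

The key property is that the same question gets different answers depending on entitlements, so the conversational layer cannot leak what access control already hides.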
You know, I was talking about security breaches at the beginning. Not having to do a separate lockdown process is a significant game changer for customers, right, to be able to bring that in naturally. And here's the other aspect that we often don't recognize, because this is what's typically happening. You remember I showed you that fan chart with the various capabilities. Now, there are a multitude of vendors in each of those capabilities, and most of the organizations, most of the customers I'm talking to, have different players operating in each of them. They may have two or three or four different data quality tools. They might have multiple integration tools or data catalog tools. Now it becomes your responsibility to try and stitch it all together in order to generate the level of insight you're after, as opposed to a single platform having that metadata. Because even if you do stitch it together, reaching that level of metadata awareness as an enterprise will take you significantly longer, as opposed to a solution or a platform that is able to talk to all of these different types of data sources, capture that metadata automatically as it operates and does its day-to-day job, and build those capabilities for you on top of it. That is a significant value proposition. I'm gonna skip through this slide, but this is a reference architecture for our master data. Just generally, the point I wanted to make here is, of course, the capability I talked about from an integration perspective, matching and linking, APIs and all that, but also being able to seamlessly integrate with the rest of the data ecosystem, which is critical.
So if you're building your next AI capability in Snowflake or Databricks or Azure or AWS S3, whichever is your choice, being able to seamlessly provide that mastered, governed data, the AI-ready data I was referring to earlier, into that platform is gonna be the difference. Right? Because otherwise you're having to build rules there that either won't scale or will continue to remain siloed, as your organization most likely has been if you look back over the last fifteen, twenty years. Right? So this is one of the things I wanted to highlight: being able to bring in not just the data sources, but the consuming applications, to really drive the value from AI. So let me quickly switch, because I've talked a lot about MDM and data governance more generally. I wanted to walk you through a quick click-through demo, and I wanna make sure we have some time available for Q&A. I don't see the questions; there might be a few, but I'll make sure we have a few minutes left for questions and answers. So this is the landing page for what we call Customer 360, the customer master. Here, I'm gonna be pretending to be a data steward who's exploring customer information. And the first thing I do as I go into the system is look at whether there is a workflow. Remember I talked about governance processes? In this case, I have a job waiting, saying it found two people, EJ Bishop and Eric John Bishop, who are most likely the same person. And it's giving me the information to assess and analyze and decide whether it's the same person or not. The reason it came to me is because the score wasn't high enough for the system to make the decision on my behalf, because that's how I want it to be. It says, look, this fell under the threshold; I'd like you to go and check whether they're the same person or not. But let's do a little bit of exploration first.
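That steward workflow can be sketched as a simple routing function: scores above an auto-merge threshold are linked automatically, scores in the grey band go to a steward's queue, and low scores are left apart. The two threshold values here are illustrative assumptions:

```python
# Illustrative thresholds; real values are set by the organization's governance.
AUTO_MERGE, REVIEW = 0.95, 0.75

def route(score: float) -> str:
    """Decide what happens to a candidate match based on its score."""
    if score >= AUTO_MERGE:
        return "auto-merge"
    if score >= REVIEW:
        return "steward-review"
    return "no-match"

print(route(0.97))  # auto-merge: confident enough to link without a human
print(route(0.85))  # steward-review: e.g. "EJ Bishop" vs "Eric John Bishop"
print(route(0.40))  # no-match: left as separate records
```

The middle band is exactly the job that landed in the steward's inbox in the demo.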
I'm gonna search for a customer, and in this case I'm gonna say I'm looking for Volvo Cars. Right? So it pulls up the information on Volvo Cars. Now, this is what I would call the golden record. Remember the Joe Bloggs example I showed you? This is all of my data available in my data ecosystem, and you will see some examples of the source systems. It might be sitting in S3, or SAP, or another ERP environment, or a CRM environment. And this is a consolidation of that on a single page, collecting the critical information that I care about. Now, you will design this within your own organization's context; it's all drag and drop, with no coding or customization, and you can make it suit your requirements. The other aspect of it, remember I was saying, if I didn't have access to certain information, it will automatically not populate the screen with that particular information. Let's say credit card information, or date of birth, something that can be sensitive or PII-type information. It will manage that for you based on the organization's governance rules that you've set up. Now, if I click on the source systems, it then says, by the way, this particular Volvo Cars record is coming from multiple source systems. And you will see, and this is often the case even within your environment, the data captured in different ways in the different systems. Right? And it used your organization's rules to decide what that golden record should look like. It also captures information like all of the source systems, so you have the ability to trace back to the source system where it came from and what ID it had there, so that if you need to go and make changes, either automatically or manually, you can do so as well. If I go to related records, remember I'm talking about Customer 360, so this isn't just about company information.
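A minimal sketch of how a golden record can be assembled from those source systems, assuming simple survivorship rules: for each attribute, take the value from the most trusted source that has one. The trust order, source names, and values are all hypothetical; real survivorship rules are organization-defined:

```python
# Hypothetical trust order, most to least trusted source system.
TRUST_ORDER = ["crm", "erp", "web_signup"]

# The same company captured differently in three systems (made-up values).
records = {
    "crm":        {"name": "Volvo Cars", "phone": None},
    "erp":        {"name": "Volvo Car Corp", "phone": "+46 31 59 00 00"},
    "web_signup": {"name": "volvo cars", "phone": "+46 31 590000"},
}

def golden_record(records: dict) -> dict:
    """For each field, survive the value from the most trusted source that has one."""
    fields = {f for r in records.values() for f in r}
    golden = {}
    for field in fields:
        for source in TRUST_ORDER:
            value = records.get(source, {}).get(field)
            if value:
                golden[field] = value
                break
    return golden

print(golden_record(records))
# name survives from the CRM; phone falls through to the ERP
```

Keeping the per-source values alongside the golden record is what makes the trace-back to source IDs possible.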
You can now start to see the person information, the people who work at Volvo Cars. Right? Their multiple contact details, some service and transaction information, and also product purchases they had made in the past. Right? I click on hierarchy. This is another interesting one, because in this case it's showing the hierarchy based on where the organization sits within its state in the United States. But if you wanna look at the global hierarchy, you can also create a different set of hierarchies. Most applications will allow you a hierarchy of one type only. But if you think about capturing data from multiple systems, or if you're a global company, you have different ways of slicing and dicing the customers or products or any other master data, so you want the flexibility to create an unlimited number of different hierarchy types. In this case, it's capturing that. Then there are other types of hierarchy showing where they sit within the organization structure. Maybe they have subsidiaries; maybe there's information about their parent companies, and you can enrich that. Remember, I was talking about enrichment with sources like Dun & Bradstreet as an example. So in this case, I can show more and see where this particular customer sits within that organization hierarchy. Now, then I click on history.
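The "unlimited hierarchy types" idea can be sketched as the same entities participating in several independent parent-child maps at once. The entity names and hierarchy types below are hypothetical:

```python
# Each hierarchy type is its own child -> parent map over the same entities.
hierarchies = {
    "geographic": {"Volvo Cars USA": "Volvo Cars Global",
                   "Volvo Cars NJ":  "Volvo Cars USA"},
    "legal":      {"Volvo Cars NJ":  "Volvo Car Corp"},
}

def ancestors(entity: str, hierarchy: str) -> list[str]:
    """Walk up one hierarchy type from an entity to its root."""
    chain, parents = [], hierarchies[hierarchy]
    while entity in parents:
        entity = parents[entity]
        chain.append(entity)
    return chain

print(ancestors("Volvo Cars NJ", "geographic"))  # ['Volvo Cars USA', 'Volvo Cars Global']
print(ancestors("Volvo Cars NJ", "legal"))       # ['Volvo Car Corp']
```

The same record rolls up differently depending on which lens, geographic or legal, you look through, which is why one fixed hierarchy type isn't enough.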
Now, this is also very critical, particularly if you're in a regulated industry and you have audit and reporting requirements: who has access to the data, what changes are they making, when are they making those changes. You will now have the ability to view all of that change history in a centralized manner, in one place, irrespective of which source system, whether it's the CRM or SAP or anywhere else, those changes were made in. You have the ability to track that a value changed because somebody put in a different value, and you have a full audit history in one place. And then, finally, the capability around multi-domain MDM is being able to span across different types. So in this case, I started with Volvo Cars, and I see the contacts related to Volvo Cars, where they're located, what sort of products they have purchased in the past, and so on, and then drill through to, let's say, one of those contacts. Now we are looking at a profile that is an individual as opposed to an organization. So not only are we mastering these two types of data, we are now building the relationship between them. Going back to my first example around that airline booking, taking that hodgepodge and turning it into a tidy, neat, related environment, this is how you start to put those bricks in place. So whether it's you as a data steward approaching that later on, or an AI agent doing it, there is a regulated, structured, standard way of approaching that data. Now, if I click on hierarchy for that person, I can see, hey, by the way, we've also built a hierarchy based on household, so I can see the family members of that household, as long as I have the access rights to see them. If I click on the three-sixty view from the person's angle, let's say, alright.
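The centralized change history can be sketched as an append-only log that every source system writes into. The field names, users, and systems below are hypothetical, and this is only the shape of the idea, not the product's audit store:

```python
from datetime import datetime, timezone

# Append-only audit log: every change, from any source system, lands here.
audit_log = []

def record_change(record_id, field, old, new, user, source_system):
    """Append one immutable audit entry for a field change."""
    audit_log.append({
        "record_id": record_id, "field": field,
        "old": old, "new": new,
        "user": user, "source": source_system,
        "at": datetime.now(timezone.utc).isoformat(),
    })

record_change("MASTER-42", "phone", "+46 31 590000", "+46 31 59 00 00",
              "jsmith", "SAP")
record_change("MASTER-42", "address", "Gothenburg", "Göteborg",
              "agent-ai", "CRM")

# One place to answer: who changed what, when, and from which system?
for entry in audit_log:
    print(entry["user"], "changed", entry["field"], "via", entry["source"])
```

Because every entry carries the source system, the old value, and the actor, the regulated-industry questions (who, what, when, where) are answerable from one place.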
She's an employee of Volvo Cars, part of that household, and she herself made a purchase of a Chromecast with Google, right, as an example. Then, extending the idea of multi-domain MDM, there's, say, Zena B, who's actually an account manager for our company. So she's an employee, the account manager managing Volvo Cars as an enterprise customer. Now, when I click on Zena, I'm in an employee record, looking at it from an employee perspective. Right? One other thing I wanted to point out, again touching on enrichment: you see some of these little checkboxes, which basically say that this billing address or business address or phone number has already been validated. I was talking about that earlier. In the customer context, if you're in a customer record and there is invalid data, you now know that you need to do something from a governance and data stewardship perspective to fix it, so that you don't get into a situation where you're shipping something or sending a mailer and it never reaches its destination. Right? So, again, it's bringing some of those day-to-day interactions with your data, what you do with it as a business, into a single place to be able to manage it better. And you can do the same thing from the account manager's perspective, now looking at that relationship. Now I see not only Volvo, but all the other customers managed by Zena, which loops back to the Volvo Cars record we had there. So let me pause there. I don't know if there are any questions; that's pretty much all I wanted to cover. Rob, I'm not sure if there's any question. Thank you, Ash. So if you do have a question, we've got about five minutes left, so time for maybe a couple. Chuck those into either the chat or the Q&A. Ash is more than ready to take some of your questions. We had quite a few on our last two, so don't be shy. Let's see if we can stump him. And I don't see any... oh, we got one.
Great. This is from Kamesh: thanks for the demo, Ash. There are a lot of relationships between multiple entities of information. How is this set up? Is it an automated process, or inherited from an ERD system? From a data relationship perspective, you can automate that. I touched on CLAIRE earlier in terms of being able to automate some of this and being able to establish or build rules, for example. So you might wanna correlate a household, as an example, but obviously things like the organization people are working in have to be inferred from the data that you're uploading. Right? But one of the differentiators from a master data perspective, which is why I kept touching on enrichment, is linking it up to the most accurate information. So oftentimes, for example, if you go and look into Salesforce, if you're using Salesforce or any other CRM, you capture the person's information and you capture the company they work for, but the company information is often just typed in freehand and doesn't relate to the actual organization or the hierarchy of the organization. And this is where the opportunity to enrich the data comes in. So you are able to set those rules; you can enrich the data; and the platform will then help you make the deduction about which organization within the structure it relates to. It will help you make those connections better. You can also create these relationships automatically as part of the load process, if you already hold them in some of your systems that capture the relationships between entities. So both are possible. Awesome. Thank you. A couple of other quick questions. Can we get the presentation? We are recording this, and it will go up, as have the other two Informatica-InterWorks co-presented webinars. Those are already up on the interworks.com website, so this will be up probably in the next twenty-four to forty-eight hours.
We'll be sending an email out to all of you. And maybe, Ash, if you would like, you could share your deck with us, and we could send that as a PDF for the folks that want a copy of it. Happy to. I'll share that so you can include it in the email that you send out. Awesome. We also transcribe the webinar into a script, written out in regular old language, so that's also useful. One more question, and then we will call it. This one is from Pandurang: how can we take practical training on intelligent MDM and the 360 applications, plus CLAIRE? Yeah. So, look, there are a lot of resources. If you go to the Informatica website, you can look for product demonstrations in the demo center. There is an experience lounge as well, where you can actually play with specific areas, including some of the things I touched on, like the agentic AI that is due for release later in the year. You can also start to explore the concepts from our website. If you're interested, Pandurang, in something specific, drop us a note; Rob or myself will be happy to look into where we can direct you. But there's a lot of information online. There are videos being released. Follow us on LinkedIn or other social media; we share things there as well. And last but not least, there's also Informatica University, where there's packaged training available that you can also consider. Awesome. Thank you, Ash. Thank you, everyone that has joined. As a reminder, we will be sending out an email in the next day or two with the recording of this. We'll share all the materials that we can. The recordings for the previous two webinars are already live on interworks.com; you can find those. Otherwise, if you have any other questions, or if there is anything we can do to help you with your governance journey, please reach out. Thanks, everyone.