Welcome, everyone, to today's session on how we can create real ROI with data science. I hope you've all had a lovely week so far. Let's get started. Next slide, please, Austin. Awesome. So today you're going to be part of our Solutions Spotlight series, which we run every single month. Last month we spoke about data strategy, and this month we'll be looking at how we can create real ROI with data science. Next slide, please. Now, what is the Solutions Spotlight series? This series focuses on client outcomes and use cases. What we're going to talk about is best practices across data and analytics. As I said, last month we covered strategy; this month we'll cover how to create real ROI with data science. I'm so glad you were able to join us, and let's get started on this new topic. Next slide, please. Before we get into our topic for today, I just want to touch on who we are, who InterWorks is. I'm just going to take a few minutes there. Next slide, please. Awesome. So, I want to take a few minutes to introduce InterWorks. Maybe this is your first webinar with us, or maybe you're returning. Either way, welcome, and I'm super glad to see you here. You might be wondering who InterWorks is. We do a lot, and sometimes that's quite a bit to explain. Put simply, we specialize in data strategy. If you work in analytics, you know the challenges of an ever-changing tech landscape and the pressure of keeping up with the high demand for insights needed to drive change within an organization. This is where we come in. Our specialty is building the best data strategies alongside you and being your trusted adviser when you need one. Everything we do is backed by our people, so we're constantly learning, and we want to share that with you. Next slide, please. I think, Azu, we jumped a few slides. Sorry. Yes, that's okay. We have a global presence.
In the APAC region, in Australia, we have offices in Sydney, Melbourne, and Perth, so if you want to come up to our offices and have a chat with us, that would be amazing. We also have a presence in Singapore. Our mothership is in the United States, with headquarters in Oklahoma, and we also have offices in Canada, the United Kingdom, Germany, and Holland. So we have a global presence, and you get the collective global knowledge that we have to share. Next slide, please. Beyond our mission and our people, we can also help you navigate the right tools to align with your goals. You'll see some of our partners on the screen right here. And if you're looking for more resources on data, analytics, or any of the technologies we discuss today, be sure to visit the InterWorks blog. It's world famous, and it's a great knowledge base for you or anyone in your organization who works in data. Before we get into today's webinar, a little bit of housekeeping. We hold these webinars, especially this series, every month, and we value your feedback from different customer communities and audiences so we can curate the best content based on your biggest challenges. As mentioned, today's webinar is being recorded, and you'll get roughly a thirty-minute recap of what we discuss today. It'll also be up on our blog, so please feel free to check the blog to catch up on today's content. One final request: we'll take questions towards the end of the session, but to help us out, please use the Q&A function at the bottom of your Zoom controls, and use that only for questions during today's presentation. If you have general commentary, put that into the chat; we'll also be popping links in there. So, let's meet our presenter for today. Next slide, please.
So we have Azisina Coronial with us today, our data science lead for the APAC region. I'll let Azu do their own introduction, but Azu will be presenting on how we can create real ROI using data science. Over to you, Azu. Awesome, thanks for that beautiful intro and for explaining InterWorks, Carol. And welcome, everyone, to this webinar today. I'm quite excited about it. I always love to talk about data science and data, and more importantly, how we can use them to bring good opportunities and real ROI to our businesses. So I'm very glad you're all here. A little bit about me: I started as a business intelligence consultant more than fifteen years ago, so I have quite a lot of experience across the data, business intelligence, and data science areas, and I'm always happy to talk about data and all things strategy. Let's get started. What are we going to talk about today? Basically, three points. The first and most important: why do we want to do data science? And the key here is thinking about business value upfront. If you keep business value in mind throughout, you'll really relate to this webinar, you'll enjoy it, and it will be very helpful for your organization. Next, we'll talk a little about the challenges and opportunities in this area, and finally, how you can get started. To set the scene, I found this very interesting graph: the IBM Global AI Adoption Index. It caught my attention because I can see us here, with twenty-four percent of Australian companies having already deployed AI and forty-four percent just exploring it. That's probably lagging a little in comparison with other countries, and I started chatting with colleagues about why this might be. We came up with several possible reasons. Maybe it's because salaries are going up.
Recruitment costs are up to double from last year, so that's quite significant. Maybe it's complicated to find the talent. Maybe less talent is coming in, because in previous years we had COVID, so maybe there are fewer people we can bring in. Or maybe it's because it all sounds a little difficult and complicated, all this stuff we're going to talk about, and that's something we can help with. So first of all, where do you sit today? Are you on the right here with Drake: "I want to try it all, I'm actively using it as we speak, it's very exciting"? Are you playing with some use cases but haven't really seen the value yet, though you're excited and keen? Or are you more on the "it's too difficult" side? Maybe you have more pressing issues at the moment and you're not thinking about how to incorporate machine learning into your business. Maybe you feel you're not ready, or that you don't have the skill set in house. So let's do a little live poll to see how our audience tracks on those thoughts and questions. Carol, if you could please help us with the poll, that would be fantastic. Perfect, I've launched the first poll. Let's wait a bit while people answer; we'll give it a minute or so and then end this one. Nice, I love seeing a few high numbers there, from one to ten. Seven is getting quite a bit of traction. We'll give it another minute; we have about sixty-eight percent participation. Okay, I think we've gone stagnant, so I'm going to go ahead. Seventy percent, perfect. A few of you are jumping in towards the end; I'll give it another few seconds and then pull up the results. Alrighty, I'm going to end that poll, and let's look at the results.
The results look pretty good, especially with seven getting twenty-four percent and six getting seventeen percent. So those data science initiatives are getting quite a bit of value, and seven is really good. We have four, five, and six sitting at seventeen percent, which is great. It's very interesting that no one went for nine or ten. So of course there's always room for improvement, which is actually very exciting, right? It's not all done and set; we can do more in this area for our businesses. Let me stop sharing and pull up the second one. Yes, I've launched it. Thank you. This one tries to identify the big challenge, so let's see what our audience thinks in that regard. Interesting. I love how options one, two, and three are just battling it out with each other. We'll give it another thirty seconds. Alrighty, I'm going to end that poll, and let's look at the results. It looks like the first one, identifying use cases that bring real value to the business, is the biggest challenge, which is pretty spot on. I'm really excited to see these results, because that's precisely what we're going to talk about, and hopefully we'll be able to shed a little light on it. And of course, we're here to help you out. I also love "issues with my data." I know: garbage in, garbage out, exactly spot on. You'll see that it all starts with the data; if you don't get that right, everything gets a little more complicated. Thanks very much for participating. I'll close this now and go to the next slide. Okay.
So, as I said at the beginning, business value is basically the main and only reason we should be thinking about when we experiment and ask why we should do data science. I really love this framework; it's from our friends at Dataiku. How do we need to think about this? Well, there are three very good reasons to do data science: number one, to make money for our business; number two, to save money; and number three, to enable our organization to do something it couldn't do before. If you start thinking about different use cases, or things you'd like to explore, and you use this framework well, I'm pretty sure you'll find something of real value for your business. So without further ado, let's look at some examples of businesses that drove real business value thinking along these lines, especially since several of you said that having a framework was an important challenge. This first one is a company in the banking sector, and it has to do with how they could reduce employee attrition. Why would they want to do this? Well, we all know the cost of hiring new employees is always higher than the cost of retaining the ones you have: all that onboarding, getting used to the business, and so on. So this is an important issue: businesses want to retain their good employees. In the banking sector, specifically in some areas, the attrition rate is up to ten percent, which is quite high. This specific company had a problem where the data was siloed in different systems, and of course, if everything is scattered everywhere, it's difficult to see the whole picture. HR really needed a single view of employee attrition, and with that, they needed to understand what was happening. So how did they overcome this challenge? First of all, we talked about garbage in, garbage out.
So, of course, the first step was to retrieve all the necessary sources and bring that data together. Very importantly, once they had consolidated it, they started doing machine learning; in this case, they built attrition probabilities for each employee using different predictive tools. And finally, once they had these predictions, it's not about stopping there, with only the machine learning model and a few data scientists or analysts knowing what's happening. It's very important to share the results and bring them back to the business so people can act on them. In this case, they visualized the results in dashboards so everyone, HR principally, had easy access and could take preventative actions to keep employees from leaving. And finally, the value, matching back to our framework. First, they got time savings, because with this easy, fully automated process they didn't need to redo the same thing over and over. And of course, they also reduced employee attrition: once they understood more deeply why people were leaving, they could predict it in advance, giving them greater opportunity for early intervention. Thinking about our framework, what were they doing? Pretty much saving money, because they were reducing the cost of hiring new employees. So this was our first use case of how machine learning really gave value back to the business. The second one is an example of a company in the retail and CPG sector: a global CPG company with multiple brands, doing business with different retailers. What happened here is that since COVID-19, their customers were purchasing more and more online; of course, no one could go out, so online sales went up.
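To make the attrition step concrete, here's a minimal sketch of what "attrition probabilities for each employee" might look like. Everything here is invented for illustration: the feature names (tenure, salary ratio, overtime), the synthetic data, and the choice of logistic regression are assumptions, not the bank's actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Toy consolidated HR dataset for 200 employees:
# tenure (years), salary vs. market ratio, monthly overtime hours.
X = np.column_stack([
    rng.uniform(0, 15, 200),     # tenure
    rng.uniform(0.7, 1.3, 200),  # salary vs. market
    rng.uniform(0, 40, 200),     # monthly overtime
])
# Synthetic label: short tenure with low pay, or heavy overtime, -> left
y = (((X[:, 0] < 3) & (X[:, 1] < 1.0)) | (X[:, 2] > 30)).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# One attrition probability per employee, ready to surface in a dashboard
attrition_prob = model.predict_proba(X)[:, 1]
high_risk = np.argsort(attrition_prob)[::-1][:10]  # top 10 for HR to review
```

In practice, the dashboard layer would simply display `attrition_prob` alongside employee attributes so HR can prioritize early interventions.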
But the issue for this company was that customers tended to buy on the retailers' websites rather than directly on their own website, so they were losing a little margin there, and they found that their discount strategy was very generic; it wasn't very interesting for their customers. So how did they approach this challenge? Once more, garbage in, garbage out: this is going to be the first step in all of these use cases. They needed to retrieve all the necessary sources and put them in one single spot so they could see the complete picture of what was happening. Later, they built a discount-elasticity model by training a demand forecast model. With that, they were able to see the effect of the discounts and optimize them to maximize profit. And finally, again, sharing the results rather than leaving them in the analysis: they empowered their business team with dashboards and a proper UI so they could play with the model and do what-if analysis. "If I discounted this product this much, in combination with products X and Y, what does my revenue look like?" And the value? This was a pretty impressive pilot: up to twenty-three percent more revenue, a pretty good number, up to thirty-one percent more total margin, and up to twenty-seven percent more total volume. That's pretty good, going from customers not really excited to buy on their website to these numbers and increased sales. And of course, it was also important to take this pilot all the way to production and into real use, and it was a very fast time to impact: two months to a fully customized solution. That's a pretty good turnaround. Once more, our framework: the numbers are there, and we're talking about making more money. So again, real business value, because they thought upfront about this impact and how it was good for the business.
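The what-if analysis above can be sketched in a few lines. This is a hedged toy version, not the client's model: the demand function, the constant elasticity value, and the price are all invented assumptions standing in for a trained demand forecast.

```python
def demand(base_units: float, discount: float, elasticity: float = 2.0) -> float:
    """Units sold at a given discount, assuming a constant discount elasticity."""
    return base_units * (1 + discount) ** elasticity

def revenue(price: float, base_units: float, discount: float) -> float:
    """Revenue after discount: discounted price times the lifted demand."""
    return price * (1 - discount) * demand(base_units, discount)

# What-if: which discount level maximizes revenue for a $10 product
# that sells 1000 units at full price?
candidates = [0.0, 0.1, 0.2, 0.3, 0.4]
best = max(candidates, key=lambda d: revenue(10.0, 1000, d))
```

A real discount-elasticity model would replace `demand` with forecasts trained on historical sales, but the business UI on top asks exactly this kind of question.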
Let's move on to our third use case. This is a company in the manufacturing sector: a giant manufacturer, over one hundred years old, producing trucks. The challenge here is that for the maintenance of these trucks, they used a calendar-based schedule, without really considering historical operations, oil degradation, or any of the other variables they could have used to do it more efficiently. What this meant was unnecessary stops in operations for maintenance: even when the trucks didn't really need it, because the schedule was just calendar-based, they would stop, do the maintenance, and then move back to business, which was not very efficient. Once more, the first step was to retrieve all the necessary data sources. After that, they built a model to predict remaining useful life and prescribe actions. So rather than "let's stop every six months no matter what and do the maintenance," they started saying, "Okay, this truck is probably going to fail in X number of days; let's book it for maintenance." What this meant for the business was, of course, reduced maintenance intervals on one side. It also improved safety, because it prevented critical failures. What if one of the trucks wasn't going to reach that six-month calendar interval and was going to fail before? Now they could see it, and hence they prevented accidents. And of course, cost savings: a fifty percent downtime reduction, which is a pretty good number, plus avoiding unnecessary maintenance interventions, and hence avoiding spending on all those materials, staff, and work hours for maintenance that wasn't needed. Thinking once again back to our framework, this one was focused on saving money. And let's pivot a little here: we've talked about specific use cases.
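The "predict useful life" step can be illustrated with a deliberately simple sketch: fit a trend to a degrading sensor reading and extrapolate to a failure threshold. The wear data, the failure level, and the two-week booking window are all made-up assumptions; a real remaining-useful-life model would use many sensors and richer techniques.

```python
import numpy as np

FAILURE_LEVEL = 100.0  # wear level at which the component fails (assumed)

rng = np.random.default_rng(0)
days = np.arange(30)
# Synthetic wear readings: roughly linear degradation plus sensor noise
wear = 10 + 1.5 * days + rng.normal(0, 0.5, days.size)

# Fit the degradation trend and extrapolate to the failure threshold
slope, intercept = np.polyfit(days, wear, 1)
rul_days = (FAILURE_LEVEL - wear[-1]) / slope  # estimated days until failure

# Prescribe an action instead of a fixed calendar: book maintenance
# only when the estimated remaining life drops under two weeks.
book_now = rul_days < 14
```

The point is the shift from "stop every six months" to "this unit has about `rul_days` days left; schedule it accordingly."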
Now let's talk about this company, a semiconductor manufacturer. They realized they needed to enable and support machine learning use cases in their factories so that they could have a really short time from starting development all the way to production. How did they do this? Through a citizen data scientist upskilling program. We saw on that earlier slide that one of the concerns might be "we don't have the skill set in house." And this is real: again, getting talent is pretty difficult. But the people already in your company have a lot of knowledge of the company itself and of the business, and it's important to recognize that and give them the opportunity to upskill, to gain the technical acumen to pair with that knowledge and bring value back to the business. So what did they focus on with this program? They focused on building a pipeline of machine learning use cases with potentially tens of millions of dollars of return value. They really streamlined the process, and in the end they were able to produce use cases such as forecasting mass-market demand and segments, which is pretty good. They also created a repository for operations research, because, again, as a manufacturing company doing a lot of research, they streamlined all of this to support that function. And finally, they applied some machine learning techniques in the supply chain. All of this thanks to that citizen data scientist program, upskilling their own employees. And once more, the value: improved forecasting accuracy with those mass-market demand and segment forecasts, and faster time to analyze complex data relationships. Again, people in this industry had a lot of experience; they kind of knew what was happening, but they just needed to learn the tools and techniques to analyze the data more deeply.
They also gained the ability to reuse and adapt those successful solutions for other parts of the business. This is very important: once you build your data science center of excellence, you can take one of the solutions you've already developed and put into production, tweak it, and use it for another area. You don't have to start from scratch. So going back once again to our framework, what did they do here? They enabled new capabilities. Perfect. So we've talked about a few use cases that delivered real business value and how they achieved that. Let's now talk a little more about the challenges we briefly touched on at the beginning. First of all, there might be too many, or too few, ideas. As some of you mentioned in the poll: how can you prioritize and really identify those machine learning use cases? I like this framework very much. First of all: keep calm and prioritize. That's the first idea. And the framework is here: you have this matrix from less business value to more business value, and from less effort to more effort. Of course, you'll want to start here, with the low-hanging fruit: the things that will really give value back to the business and, at the same time, aren't too difficult; they won't take a big effort to get started. Remember, we're just getting started, and it's fine to start with the simple things that deliver fast value and then keep moving up to the more complex stuff. So, for example, and these are just examples, this prioritization will depend on the organization, the skill set in house, and so on, so don't take these as "the use cases that belong in that quadrant"; they're just examples of what it could be.
For example, predictive maintenance, like the use case we discussed; or maybe customer churn prediction. Similar to employees, the cost of acquiring new clients is higher than the cost of retaining the ones you have, so this might be a good one to get started with, and it would bring real business value almost immediately. Maybe demand forecasting: a little more effort, but still very high business value. Maybe automated defect detection: imagine you're in a manufacturing company and you need to identify defects on your production line; maybe you want to use computer vision to detect them automatically, save some costs, and make things more agile. Then the other side: less value and less effort, maybe spam filtering, because it has been done many times; different products and your email already have this, so there's really no point spending time on things that have already been done. And in the less value, very high effort quadrant: maybe image classification of your lunch photos. It sounds interesting, and maybe you can do it as a pet project to practice a little machine learning, but it probably doesn't bring high value to the business. So remember this framework: put all your ideas together, then use it to identify where to begin. Okay, let's talk about the second challenge: everyone is singing to their own tune. Well, we are data people; it's in our DNA to be curious, to want to explore things, to want to play with new tools and all that. It's great for us. It's fun. It's probably not that fun for the organization if everybody is trying their own things. If you have some data scientists in house, you might recognize the old R versus Python dilemma; it's with us all the time. We all have our own preferences on tools.
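The keep-calm-and-prioritize matrix can even be reduced to a tiny scoring exercise. The ideas and the 1-to-5 value and effort scores below are made up purely to show the mechanics; as noted above, the actual scores depend on your organization.

```python
# Score each candidate use case on business value and effort (1 = low, 5 = high)
ideas = {
    "customer churn prediction":  {"value": 5, "effort": 2},
    "predictive maintenance":     {"value": 5, "effort": 3},
    "automated defect detection": {"value": 4, "effort": 5},
    "spam filtering":             {"value": 1, "effort": 1},
    "lunch photo classifier":     {"value": 1, "effort": 5},
}

# Rank by value-per-effort: high-value, low-effort ideas float to the top
ranked = sorted(ideas, key=lambda k: ideas[k]["value"] / ideas[k]["effort"],
                reverse=True)
```

The churn use case lands first and the lunch classifier last, which is exactly the low-hanging-fruit ordering the matrix is meant to surface.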
Maybe you have people in your organization who are very good with SQL and really feel comfortable doing all their analysis there. Maybe you even have some people who love Jupyter notebooks; they can do their analysis there and see the results inline. The Jupyter notebook is fantastic, one of my favorite tools. Talking about different work styles, maybe you have people who are more visual: they want to see graphs, more visual things, to help them do their analysis. And maybe you even have people downloading data and doing their magic in Excel. Everything is possible; data people really like to explore, and they're quite resourceful, so it might look a little like this. And this is still only five different icons there, five different tools; it's still pretty neat and easy to organize. But what if we move forward and think of everything else that's out there? Then it starts looking like this. Not so neat anymore, very complex. One of your data scientists might be working with one thing, another with something else; it gets a little complicated from the organization's point of view. So how do we tackle that? First of all, frameworks. Frameworks are great. You need a framework to guide you from ideation to taking your models all the way into production. With this framework, with that skeleton, it's easier for the people working on it to see where to start, or, if they've already started, what the next step is. So frameworks are great for that. You also need to consider a tool set for different work styles and skill levels to support team collaboration.
So again, if some people are doing things in SQL and others in Python or R and so on, you'll want to look at a toolset that supports all these abilities, that allows your people to work in their own ways, but, very importantly, under one roof, so they can collaborate. You also need to consider a platform that enables your team and grows according to your needs. For example, maybe you're a small team that needs help getting started with machine learning models; you don't have data scientists in house. You have very good data analysts, but they haven't played much with machine learning models. Maybe you can start with AutoML; that's perfectly valid. Let me explain what AutoML is. Basically, in the old days, before these tools existed, a data scientist would need to open an R or Python notebook, grab different algorithms, run different tests, and so on. AutoML automates this: it takes a series of different algorithms with different parameters, runs them, and tells you which one is better for the metric you want to optimize. So it's a very good way of getting started. Cool. But maybe you have a bigger team, a very well-established one, with data scientists working in their own tools: they're using Jupyter notebooks, they have their own preferred coding environments, and so on. Well, your platform needs the ability to bring all these things they're already using inside the same platform. And why do I say that? Because it's important to allow everybody to work under the same roof: from an organization's point of view, this makes governance easier, team collaboration easier, and getting everything into production under the same framework easier. So it's very important to think about this.
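What AutoML does under the hood can be shown in miniature: try several algorithms, score each with cross-validation on the metric you care about, and keep the winner. This is a generic scikit-learn sketch of the idea, not any particular product's implementation, and the synthetic dataset is just for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a prepared, machine-learning-ready dataset
X, y = make_classification(n_samples=300, n_features=8, random_state=0)

# A small "series of different algorithms" an AutoML tool might sweep over
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(random_state=0),
}

# Score each candidate on the chosen metric (accuracy, via 5-fold CV)
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best_name = max(scores, key=scores.get)
```

Real AutoML tools also sweep hyperparameters and preprocessing steps, but the select-the-best-on-a-metric loop is the same.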
It's about enabling your team while at the same time making things easier for the organization. Cool. The third challenge: data is everywhere, right? As we saw in those use cases, the very first step was always to collect all the data and make sure it was organized and ready for machine learning. So this is definitely one of the challenges, and we need to be mindful that not all data is created equal. Your data needs to be representative of your everyday operations. Thinking back to that predictive maintenance example: you might be using sensor data, and you might have left the sensor on while the machine was off, which could corrupt your data with periods of inactivity. Thinking about these little things, how the data is being created and hence how you're going to use it, is very important. You also need to consider that your data needs to be rich enough for the machine learning to detect meaningful patterns. Once again, in the predictive maintenance example: maybe you have fifty different sensors, but for some reason you're using just three. Maybe the machine learning won't detect those meaningful patterns. You need to be sure you have a complete picture of your full operation. We know: garbage in, garbage out. But don't worry, we can help you out with your datasets; we can make them machine-learning ready. And once again, data is where everything starts, so this point is really important. On to the second point here: data science really requires flexibility. You'll need to run a lot of experiments; you'll need to add new features or variables; maybe you'll need to look at new datasets to make your data richer for the machine learning models. So you need a platform that gives you this flexibility and allows you to prepare your data assets as you need.
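The sensor-left-on example is worth making concrete: before training, drop the readings logged while the machine was idle. The column names and the idle threshold here are assumptions for illustration only.

```python
# Raw sensor log: the sensor kept recording while the machine was off,
# so idle readings would corrupt the training data with inactivity.
readings = [
    {"rpm": 0,    "temp": 21.0},  # machine off, sensor still logging
    {"rpm": 1450, "temp": 74.2},
    {"rpm": 0,    "temp": 20.8},  # machine off again
    {"rpm": 1500, "temp": 76.9},
]

IDLE_RPM = 10  # below this we treat the machine as off (assumed threshold)

# Keep only readings that represent everyday operation
active = [r for r in readings if r["rpm"] > IDLE_RPM]
```

It's a trivial filter, but exactly this kind of "how was the data created?" thinking decides whether the dataset is representative.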
In my experience, I've seen tools that don't have really great data preparation abilities, and data science teams really struggle with this, because they need to go back and ask other people, maybe the data engineers, to add one variable, or split the data, and so on. So having good data preparation abilities under the same tool is really key for your team and for the time to value of your development. And finally, data science is all about experimentation, so you also want to make sure the tool allows different experiments to be run and lets you save the different versions of all those datasets and models. Versioning is very important to have in a tool while doing data science. Once more, in the good old days of the Jupyter notebook, it might be version one, version two, version final, version final final, version real final. So it's very important that your platform handles this in a more organized way. And let's talk about this fourth challenge. So, it's all good news: you have a model now. You've identified something that's optimized and giving decent results. Now what do you do? According to Gartner, up to eighty-five percent of machine learning projects don't really make it into production. What does not making it into production mean? It directly means lost value for the business, because you spent the time developing and testing, and then it doesn't go anywhere. It's there, it's ready to go, but you don't have the framework to put it into production, monitor it, and make sure it's delivering value to the business. So really, as it says here, developing a machine learning model is just halfway through the journey; you still need a bunch of extra work. But again, don't fear: we can help you with that.
Let me talk a little bit about the machine learning lifecycle. When we identify our problem, because once more everything starts with the value we're trying to give back to the business, we start with data discovery. Then we do data preparation, as we saw in our use cases: the very first step of everything is really getting the data together. Then we start model development, where we'll do a bunch of iterations and experiments. Then we move to making sure the model is ready: we identify our winning model, the one we want to put into production, and finally we have our model signed off. Happy days. What happens next? You go to model deployment; you need to allow the business to consume the model's results. And it doesn't stop there: you need to really monitor it, make sure it isn't drifting, that the datasets you're using haven't changed significantly such that your model isn't relevant anymore. So you need to do a lot of monitoring. And finally, with this, you'll find that everything leads to a cycle of continuous improvement. Maybe something in your operations changes that needs to be reflected in the model. Maybe you get a new idea for something that might enhance the model; it needs to be put to the test, and if it works, then once more through the cycle: put it into production, and so on. So we've talked about a framework for identifying the high-value, low-effort use cases, and we've talked about some of the challenges to keep an eye on and the different tools our team will require for the different areas of the machine learning lifecycle. Okay, so how can we put this into practice? We've talked a lot about theory here, and some use cases of real-life value. Let's get started in practice.
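The monitoring step deserves a tiny sketch. This is a deliberately minimal drift check, comparing the live feature distribution against the training distribution by mean shift; real platforms use richer tests (population stability index, Kolmogorov-Smirnov, and so on), and the tenure numbers are invented.

```python
import statistics

def drifted(train: list, live: list, threshold: float = 2.0) -> bool:
    """Flag drift when the live mean sits more than `threshold` training
    standard deviations away from the training mean."""
    mu = statistics.mean(train)
    sd = statistics.stdev(train)
    return abs(statistics.mean(live) - mu) > threshold * sd

# Feature distribution the model was trained on (e.g. customer tenure, years)
train_tenure = [2, 3, 4, 5, 6, 7, 8]

ok = drifted(train_tenure, [3, 4, 5, 6])       # looks like training data
alarm = drifted(train_tenure, [20, 22, 25])    # distribution has shifted
```

When `alarm` fires, that's the signal to loop back into the continuous improvement cycle: retrain, re-test, redeploy.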
So if we zoom into that 2023 machine learning and artificial intelligence landscape, the big screenshot with all those logos, you'll find a few logos for enterprise machine learning platforms: for example H2O, Amazon SageMaker, Dataiku, DataRobot, Google AI, and so on. We partner with best-of-breed technologies, the partners we have seen in practice delivering real value and solving real problems for our clients. This is why we here at InterWorks partnered with Dataiku for the machine learning needs of our clients. So let's match some of those challenges we talked about with the capabilities that Dataiku has. First, we said we need to support all skill levels and work styles: people who are keen to do their analysis in SQL, Python, or R, and people who have a more visual style for their data exploration. Dataiku enables the collaboration of all these work styles, abilities, and skill sets under one roof, which is very important. Second, it enables flexibility, because Dataiku is pretty good for data preparation. Once more, thinking of all the skill levels, Dataiku has visual recipes that are very easy to use for analysts who haven't practiced much coding, but it also allows Python and SQL recipes for people who are keen to do things in code. It allows for experimentation because, of course, we have model versioning in that cycle, so you can iterate as many times as you want; you can go into production with something, and you can always go back to previous models if necessary. And, very important, and I love this fourth point, it covers the full machine learning lifecycle. I even needed to grab a drink for this.
With this one, I have seen in my experience several tools that are great at high-performance algorithms, great at running experiments, and so on, but that don't provide a platform for the full machine learning lifecycle: going into production, being able to monitor from the same platform, being able to detect drift, being able to take a model back to the shop and repair it, if you will. Dataiku does that very well. It really allows you to go from data exploration, to deploying into your production environment, to monitoring and continuous improvement, all under one roof, which makes things easier for the organization from a governance point of view and lets everything flow more smoothly. Dataiku also has AutoML capability. So if you have a small team that isn't highly skilled in building machine learning models, you can always start with Dataiku's AutoML capabilities, which are very simple and straightforward. Okay, so moving on to how we can help you. I saw this in a presentation before, and it's from a book by Eddie Obeng. I really like how this framework explains the four different types of projects. Think about the problem expertise that you and your company have, and the solution expertise: the tools you're used to, the things you already know. If you think about it this way, there are really four types of projects. If you don't have much knowledge of the solution or the problem, then you are basically walking in fog: you don't know what to do, how to get started, or where you're going. Maybe you know the toolsets, you know exactly what you want to work with, but you don't really know the problem; you don't have the expertise or a real idea of where you want to go. That's like making a movie: you know the toolsets exactly, but you're not really sure what the outcome will be.
If you know the problem very well but you're not really sure about the tools, this quadrant is like going on a quest: you know what your goal is, but you don't know how you're going to get there. And finally, if you're an expert in the problem and an expert in the solutions and tools, then it's like painting by numbers. This is the best type of project, because you know what to do, you know how to do it, and it's happy days. Of course, the projects most likely to fail are the ones walking in fog, and the ones most likely to succeed are the painting-by-numbers ones. So that's where we want to be: we want to take our projects and move them through the quadrants towards painting by numbers, so that it's easier to succeed. And how can we help you? Well, if you really don't know where to start or what the endpoint is, we have a machine learning accelerator for you, which I'll explain in a little bit. If you know your tools but you don't know much about your problem, we can help you define what's possible and get you started today. If you know exactly what you want to do but you're not too sure about the solution, then we can map out your project and manage delivery for you. And if you have everything, you know what to do and you know how to do it, we can still help you: our consultants are experts, with about sixteen years of experience across the team, and we can help get you there and be successful with your project. Okay. We're almost out of time, and I want to leave some time for questions, so this will take just a couple of minutes more. We can help you accelerate your machine learning journey with InterWorks. What does that look like? It's a series of workshops and support that takes you from zero to machine learning excellence.
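Obeng's quadrant framework described above can be summarized in a few lines. This is just a toy sketch (the boolean expertise flags and function name are illustrative assumptions, using the labels from the talk):

```python
def project_type(knows_problem: bool, knows_solution: bool) -> str:
    """Eddie Obeng's four project types, keyed on two kinds of expertise."""
    if knows_problem and knows_solution:
        return "painting by numbers"  # best odds: you know what and how
    if knows_problem:
        return "going on a quest"     # clear goal, unknown tools
    if knows_solution:
        return "making a movie"       # known tools, unclear outcome
    return "walking in fog"           # highest risk: neither is known


print(project_type(True, True))    # painting by numbers
print(project_type(False, False))  # walking in fog
```

The goal of the engagement models below is precisely to supply whichever expertise axis is missing, moving a project toward the painting-by-numbers quadrant.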
So, we have, for example, the machine learning use case workshop, where we talk through all the ideas you have and help you prioritize and get started on what could be most valuable to your business. We can also do a platform rapid start: if you're keen to test-run the platform, we can set you up and get you started today. The next step on this mountain is the use case deep dive and prioritization of those use cases. By then we've already identified the use case you want to work on, and we need to check whether the data is ready, because, again, garbage in, garbage out; we really need to make sure the data is there. We can help you with the different capabilities and frameworks to really develop this. Then platform enablement: of course, we strive for our clients to be self-sufficient and to really own their solutions, so we'll get you enabled on the platform and make sure you can manage your solution. And also machine learning enablement: once more, for example with a citizen data scientist program. We know that the people in your organization have a lot of experience and a lot of knowledge; we just need to give them the tools of machine learning, so that with all that experience and a little bit of aid from machine learning, they can do great things for your organization. And of course, we can help you with use case support: we can accompany you on that journey and finally take that model, and hopefully many more, into production, delivering real value to the business. And that's me done. If we have any questions from the audience, that would be great. Thank you, Azu, for enlightening us.