Hi, everyone, and thank you, Vicky. Welcome to today's webinar on AI demand-driven forecasting, in partnership with Dataiku. My name is Michael Condon, and I am a Delivery Architect on the solutions team at InterWorks. At InterWorks as a whole, we've got one phrase to sum up everything we do: we make data work for people. To make data work for people, we have to look at a lot of different elements in the data journey, and so we're involved in a lot of different activities: from the very high-level strategic side of things, where we can help assess your platform, decide on a platform, and design data governance strategies, all the way over to the other end of the spectrum, where we do data engineering work and can help connect to and extract your data. We can even then start to visualize your data and present it to your users. And in case you need additional support, we also offer training and enablement services so that your people can grow with your data. With all of these things, technology plays a key role, and different tools come up. If you're in this world, you know there is a vast array of choices in terms of what tools you could be using. For us, we like to take something called a best-of-breed approach. All that really means is that we look at the whole journey of data and all the stages in between, and we try to determine what is the best tool for each one of those stages. Now, this is a bit high-level and theoretical, but we can visualize it, and that helps us look at what we think is the BI architecture for a modern stack. Here in blue, I've listed a lot of the key elements involved in a BI architecture, and we've also thrown in labels for the tools we would recommend at each of those stages. At the very bottom, in bright green, you can see machine learning and how it supports the rest of this BI architecture. When it comes to choosing a tool for machine learning, we do like to recommend Dataiku, for a lot of reasons; I couldn't put them all on this screen. To start with, if you've done any type of data science work, you know that data processing and cleaning take an inordinate amount of time, and with Dataiku's data preparation features you can productionalize that. It makes your life a whole lot easier; I highly recommend looking at it. Dataiku also offers visual model creation, it supports generative AI, and you can automate and deploy all of these models from the platform. One of the cooler features that I didn't put on here, but that I really appreciate about Dataiku, is that it's very transparent: it offers a kind of whiteboard-like workspace where you can bring in users from different roles to collaborate on the whole journey of data, all the way up to getting a model ready for production. That way, we can combine the expertise of data scientists with the knowledge and context of business users to make sure we end up with a very holistic model. Again, this is all very good and theoretical, so for today we have a recording from Dataiku's Benoit Rosher, and in this recording he's going to walk through how you can apply one of Dataiku's pre-built templates to your own data so that you can get this AI demand-driven forecasting.
Now, this is a recording, but we would still like people to ask questions. So if you do have questions, please put them in the Q&A section of this Zoom call. We also have Manuel and Quentin on the line, and they're here to answer any questions, but since it is a recording, it's probably best to answer those at the end. Before I get started with that, Vicky, are there any other questions that have popped up?

We don't have any questions at the moment. However, I would like to add an interactive element, if that's okay, Michael. I'd love to understand, for the people on the call, since we're going to be looking at a use case: what are you using, or what would you like to use, AI for? In this recording we're going to see Benoit go through a couple of use cases, so it would be great to understand what the perfect use case would be for you if you were to be using AI. We're getting some answers coming in, and it's actually really quite interesting to try to understand the purpose. It looks like the majority of people are saying demand forecasting, so you are absolutely on the right call today, which is great news for us. It looks like quite a few people have participated in the poll, so what I'm going to do is end it and share the results so that everybody on the call has visibility. As I said: demand forecasting, internal questioning of data, and customer service. So I'm going to stop interrupting you, Michael, and I'm going to let you hit that play button right now.

Fantastic. So, without further delay, here we go.

Hi, everyone. I'm going to run you through our approach to AI-driven demand forecasting, and for that I'm going to start by introducing myself. My name is Benoit Rosher. I'm a director within the business solutions team here at Dataiku, in charge of the retail and CPG industry. Today's agenda covers three main areas: first, I will introduce you to the business solutions; then I will show you what it looks like in practice for a specific area, which is demand forecasting; and we'll end with a demo of the solution within Dataiku. So let's start with the Dataiku business solutions. The idea of solutions is to reconcile the platform and solutions together, so that we can help our customers a bit better and boost their AI trajectories. Our platform is a perfect foundation for all types of use cases, and we are deeply convinced that by bringing data solutions on top of it, we accelerate time to value and help our customers quickly implement ready-to-use business utilities. Our value proposition as of today is a set of thirty ready-to-use use cases: fifteen in the form of plug-and-play applications, where you just need to connect your data, set your parameters, and run the application, and fifteen analytical templates, which make it easier for you to customize and give you an interesting starting point, putting you further ahead than if you were starting from scratch. The idea of those thirty pre-built use cases is to accelerate, as I was saying, actionable business outcomes for both data and business experts, all powered by our Dataiku platform. And when we talk about the Dataiku platform, as you might already know, this is the platform for everyone in the enterprise:
whether it be IT teams or data ops, data experts who are usually coders, business experts who are more clickers, let's say, or even process owners who consume the outputs of a solution or of projects, for example. And so, bringing the solutions, as I was mentioning on the previous slide, on top of those capabilities is a good way to accelerate time to value across the board. Here in the middle of the slide you can see our main value propositions, let's say: cataloging and exploration, data preparation and pipelining, visualization and analysis, machine learning, of course, MLOps, AI governance, AI apps, and lastly solutions, which is the topic I will focus on right now; I'm just moving around so that you can see the full slide. The solutions are built around four main principles. First, they're ready to use, in the sense that you can get access by simply uploading a solution, or downloading it directly from our interface. Second, they can be customized: there is no black-box effect at all; everything is fully transparent and documented, and of course everyone can customize a solution according to their own needs, so that it can be used the right way. Third, they are industry-proven, meaning we build the solutions with a vertical rather than a horizontal approach; that's why I'm responsible for retail and CPG, and we have other team members responsible for other industries. We think this brings an additional set of workflow capabilities by better answering specific needs. And finally, they're built through a modular approach, meaning that solutions can be combined together to create incremental value. The goal of business solutions, as I was mentioning, is to accelerate time to value, and to do that we aim at tackling tangible business challenges. The idea is to identify critical challenges within each industry and to try to address them as much as we can with a global approach, so that our customers can then leverage it in their own context. Ultimately, the goal is to accelerate time to value within organizations on those AI-driven projects. As of today, as I was saying, we have roughly thirty solutions, and today I will mainly focus on one of them, which is demand forecast; I will introduce you to it right after. But here you've got the full solution catalog, which of course I won't go through in detail, but you can have a look, and you can also go to our website for more information. Everything is listed and documented, so feel free to go over our website and to reach out if you want to engage in any form of discussion. So, introduction to demand forecasting. Let's start with some business context, let's say, that comes from all the interactions we have had with our customers. It always starts with strategic business needs: whether it be a demand planner asking what is the right amount of product that needs to be stocked in a specific area, a pricing manager asking what is the right price for launching a new product, or a category manager who wants to know which products should be kept in the portfolio. All of those questions, in one way or another, link back to demand forecast. Why is that?
Because demand forecast is a foundational use case that is needed to unlock further topics. So it's a really critical, strategic use case. In its basic form, of course, it answers the ultimate question: what is the demand for a specific product, or a specific category of products, in a given time frame? That's, let's say, the immediate problem the solution is solving, but it also unlocks quite a number of other projects, and we have listed quite a few here. It's of course not exhaustive, but it illustrates how it can help answer the questions we started with. Demand forecast is needed to then reach a price optimization project: you need to be able to forecast the demand for every single price scenario you would like to implement, so that you can make the right decision on your pricing optimization strategy. Same for inventory management: if you want to optimize the number of products that are going to be stocked in your warehouses or in your stores, then you need to forecast the demand at every single location, so that you can optimize it accordingly. And so forth and so on: you have quite a number of use cases that are linked to this strategic use case, so it's really critical to have this interoperability and these links in mind prior to working on and implementing such a use case. Once that is clear, the two last elements that need to be answered for a proper implementation are aligned with the geography, the geographical scope of the implementation. It can be implemented at a specific geo level, as well as for specific product lines, so you need to define what is the right granularity: whether it be at product category level, at subcategory level, or even at product SKU, so individual product, level. That is really the critical and initial piece of thinking and structuring you need to address. We have built this demand forecast use case for many reasons; I gave a few on the previous slide, but ultimately, as I was saying, forecasting the demand robustly is also a good way to improve your planning. So what have we learned in terms of pain points through our discussions with our customers? First, forecasting the demand is often attached to poor accuracy: it's actually quite difficult to make it really accurate. We have tried to address this point by taking into account both business specificities and external factors; the way we built our solution takes that into account, and we'll have a look at it during the demo. Secondly, handling multiple scenarios with the same level of quality is quite hard. We saw on the initial slide that it needs to be deployed at geo level and at product granularity level, but that's always quite difficult and tricky, and it can require quite a substantial amount of effort to achieve. So the way we have built the solution allows for the ability to handle all needs in terms of granularity, as well as allowing for cold start, because that's something we have not mentioned earlier on: forecasting the demand for existing goods is easier because you have historical sales; forecasting the demand for new products is harder. That's what we call cold start.
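To make that granularity point concrete before continuing: here is a minimal sketch, in generic pandas rather than anything from the solution itself, of rolling raw transactions up to one agreed-upon granularity. The file and column names (transactions.csv, store_id, subcategory, date, units_sold) are invented for illustration, not the solution's schema.

    import pandas as pd

    # Hypothetical transaction history: one row per sale, invented columns.
    transactions = pd.read_csv("transactions.csv", parse_dates=["date"])

    # Roll transactions up to the granularity the business agreed on:
    # here, weekly demand per store and per product subcategory.
    weekly_demand = (
        transactions
        .assign(week=transactions["date"].dt.to_period("W").dt.start_time)
        .groupby(["store_id", "subcategory", "week"], as_index=False)["units_sold"]
        .sum()
    )
    print(weekly_demand.head())

Changing the groupby keys (region instead of store, SKU instead of subcategory) is all it takes to answer a different granularity need, which is why settling that choice up front matters so much.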
Lastly, what we also witnessed concerns domain experts: a supply chain planner, for example, and data teams might struggle to find consensus when it comes to looking for the right solution. So we try, as much as possible, to bring those teams together within the same interface, with clear roles and responsibilities. Those are really the main challenges we have tried to solve with this solution. So now, more pragmatically, what we have built is, we think, a path to success through a structured process. It requires a few datasets: transaction history and product hierarchy. We'll get into more detail later, but that gives you a very first understanding of what needs to be inside the solution, in terms of data, for it to work. Usually what we see is that either a data analyst or a data scientist will do the setup, and we have built an easy way to set up these solutions through what we call a Dataiku application, which is a simple interface where you don't need to code: it's really just setting up parameters and linking your data, pointing to your data in the right repository so that it can be used accordingly. You just need to set all that up for the solution to be working. Once it's set up, the idea is to assess the performance of the model, and the way we have done that is to allow data analysts to leverage evaluation dashboards, so that they can build their own explainability in terms of model evaluation. That's a really important part to build trust, and therefore to allow for a trustful collaboration with the end users, the end user being, in our example, a demand planner. The way a demand planner will leverage the solution is through another component, which is called a web app in our case. It's a really user-friendly interface where you can adjust the way you want to visualize the results by leveraging the demand forecast visualizations. That's a really quick snapshot of the web app; I will do the demo, but at least you see a first version of what it looks like. Prior to going into the demo, here's just a very quick overview of the processing steps. Maybe I should have started with that: what you see here is all the processing steps within a Dataiku flow. A flow is a set of processing steps: you start with some datasets on the left-hand side (those are the square boxes), and your data is transformed through processing steps, one step after the other, up to the last phase, the web app zone, on the right-hand side of this screenshot in our case, where the final data is consumed to then be retrieved and leveraged within the web app. It all starts with the ingestion of the data: you just need to connect to your data sources, and all of that is done, as I was saying, through a user-friendly interface, so you don't need to actually go into each of the steps here; I will show it to you right after. So you connect to your data sources; then there is a feature engineering part, which generates several datasets that will allow the model to work with the expected data. The third part is around gathering the features computed in the previous step. The fourth part is the modeling part, with a train and test split: we train the models and predict the demand for the selected datasets. And finally, as I was saying, there is a web app zone, retrieving all the data and allowing for web app consumption.
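As a rough illustration of that flow's shape, here is a compressed sketch using generic pandas and scikit-learn, not the solution's actual recipes or models, with hypothetical file and column names: ingest, engineer lag features, split by time, train, and predict.

    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    # Ingestion: a hypothetical weekly demand table produced upstream.
    df = pd.read_csv("weekly_demand.csv", parse_dates=["week"]).sort_values("week")

    # Feature engineering: lag features so the model sees recent demand.
    keys = ["store_id", "subcategory"]
    df["lag_1"] = df.groupby(keys)["units_sold"].shift(1)
    df["lag_4"] = df.groupby(keys)["units_sold"].shift(4)
    df = df.dropna(subset=["lag_1", "lag_4"])

    # Train/test split: time-based, so we hold out the most recent weeks
    # (mirroring the five-week horizon mentioned later in the demo).
    cutoff = df["week"].max() - pd.Timedelta(weeks=5)
    train, test = df[df["week"] <= cutoff], df[df["week"] > cutoff]

    # Train the model and predict the demand for the held-out weeks.
    features = ["lag_1", "lag_4"]
    model = GradientBoostingRegressor().fit(train[features], train["units_sold"])
    test = test.assign(forecast=model.predict(test[features]))

The point of the flow is that each of these stages is a visible, inspectable step rather than one opaque script, which is what lets different roles review and adjust it.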
So now let's go through a quick demo. I will start with the end-user interface first, so that you can see in practice how it works. Here I'm connected to a Dataiku instance; I have loaded the Demand Forecast project with some data, and I will consume the project as a demand planner would. A demand planner would first need to select the granularity; the granularity is attached, in this case, to individual stores, whether it be selecting a set of stores, all stores, or just one store, so you just select what is relevant in your own situation. Let's say I need to do a full analysis of the forecast for all my stores. Then I'm offered the possibility to select the product categories. In our case, we have three product categories: foods, hobbies, and household. Let's say I'm selecting only one category, which is foods. Then a new set of filters is offered, now at subcategory level; all of that depends on your product hierarchy, but in our case we have attached to this project a product hierarchy going down to the subcategory. So we select all three subcategories. And then finally we reach the final layer, the most granular one, being the available product IDs. Let's say we want to build the forecast for all the products in those categories. I just apply the filters, and it will automatically build the forecast based on the different parameters I have selected. It builds the forecast over a given time horizon; in our case, we set up this time horizon at the week level, and I will show that to you right after. We can also compare it to the actual sales data: those are the actual sales, to which we have also attached the demand forecast for the previous period, so that you can compare the performance of the model against the actual sales data. This is the important part: it allows for bringing confidence in the way the forecast is built, so that you can trust, or adjust, the recommendations of the model in terms of forecasting. You can also extend the time frame of the projection, so that you can see a wider window ahead of time and, again, compare actual sales data to the demand forecast. If I scroll down, I have all the details, all the data attached to the graph displayed here, so that you can look into it in much more detail if needed. Another option is to display the average historical sales: there is another set of data being used, and the average history is populated here. You can also display the products that are in stock. If you have questions about what something means, you can just click on the question mark here so that you have some information; you can also adjust the display, whether it be on the left or on the right-hand side. So this is the final user interface. Now you may wonder: okay, that's interesting, but what about what's under the hood?
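Before lifting the hood, here is a minimal sketch of the kind of forecast-versus-actuals comparison the demo just showed, using a generic accuracy metric (MAPE) with made-up numbers; the solution's dashboards may well report different metrics.

    import pandas as pd

    # Made-up aligned series: what actually sold versus what was forecast
    # for the same past weeks.
    eval_df = pd.DataFrame({
        "actual":   [120, 95, 140, 80],
        "forecast": [110, 100, 150, 70],
    })

    # Mean absolute percentage error: the average relative miss per period.
    mape = (eval_df["forecast"] - eval_df["actual"]).abs().div(eval_df["actual"]).mean() * 100
    print(f"MAPE: {mape:.1f}%")  # lower means the forecast tracks actuals more closely

Checks like this, surfaced next to the forecast, are what let a demand planner decide whether to trust or adjust the model's recommendations.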
What we have done is explain everything within a wiki. All the web app and processing steps of the project are documented, so that if you have any questions, if you ask yourself how it is working, what model is being used, and so forth and so on, everything is documented so that you have a good understanding and can build your forecasts in a trusted environment. You can also adjust the project to your own needs if you need to. As you can see, we can deep dive into the user manual, for example for the project's dashboards, which I was showing you right before; everything is explained so that you can have a good understanding of what's inside the dashboards, and it's the same for all the parts of the project. Now I will get to the Dataiku application I was mentioning. The Dataiku application is the place where all the setup is done. Here we are in the application of the demand forecast solution, where you have several parts, let's say, to configure. I'm just going to quickly show it to you. There are quite a few parts, because the demand forecast solution requires quite a number of parameters to be properly running. But don't be afraid: it's not that long; it's just a set of parameters that actually have to be aligned with your business objectives and business outcomes. If everything is properly framed before you run the project, it should be quite straightforward. First, you need to define your connection; in our case, we defined a Snowflake connection. Once you define the connection, you retrieve your datasets from where they are stored. Then you do your data specification, which is basically mapping your data against the datasets that are required for the solution to be running: here, the transaction datasets, the prices as well, if you've got the prices and you want to leverage them, the product datasets, the forecast calendar, and so forth and so on. You can add a couple of parameters that are not necessarily mandatory but can be used to optimize the way the model is run. Then you continue: once you have mapped your tables and your data, within the same form you have column identification, where for all the mandatory columns you select the fields that are part of your datasets, so that we are sure the solution is leveraging the right data. Here we do all the data mapping, as I was saying, for all the tables we have seen. Then we continue with the setup: the forecast granularity. Here we go one step further into the project, where we tell the project what we want in terms of forecast granularity, whether it be at the store hierarchy level or at another level, and whether it be over a time frame of five weeks, as in our case; it can of course be customized according to your needs. As you can see, everything can be set up accordingly. Also, just one quick comment, taking one step back: you can create as many Dataiku applications, of demand forecast in our case, as you want. Each one is an independent version of the project, so several needs can be answered simultaneously without affecting one another. Then there is an additional data quality step where you can flag outliers, let's say.
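As a hedged illustration of the kind of setup just described, here is that configuration expressed as a plain Python dict; every key and value is invented for illustration and does not mirror the solution's real parameter names or API.

    # Hypothetical stand-in for the Dataiku application's setup form:
    # connection, dataset mapping, column identification, granularity, horizon.
    forecast_config = {
        "connection": "snowflake",               # where the source tables live
        "datasets": {                            # map your tables to the expected inputs
            "transactions": "SALES_TRANSACTIONS",
            "products": "PRODUCT_HIERARCHY",
            "prices": "PRICE_HISTORY",           # optional, as noted above
        },
        "columns": {                             # the column identification step
            "date": "ORDER_DATE",
            "product_id": "SKU",
            "quantity": "UNITS_SOLD",
            "location": "STORE_ID",
        },
        "granularity": ["STORE_ID", "SKU"],      # geo level x product level
        "horizon_weeks": 5,                      # forecast window from the demo
    }

The design point is that these choices are made once, in one place, and the whole flow downstream rebuilds itself around them.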
Then we reach the feature engineering part, where we set an additional set of parameters, and all of that is done at the level of the different tables. The first part relates to the products, or the SKUs; the second part to the calendar events; then we have the sales windows, that is, let's say, the lookback period used to build the features, and the same for the past years. Finally, we've got the feature engineering for the pricing part, and we reach the demand forecast modeling part. Here we basically tell the interface what we want to forecast and, let's say, the problem the modeling part is supposed to be solving. So we do our setup and set all the other parameters. Once it's all set, you just need to run the application and look at the results. You can also automate it so that it is executed at a regular interval. And lastly, I'm going to show you the flow: once you have set up your Dataiku application and run it, it will automatically rebuild the flow with the new data and the new settings; all of that will be built automatically, accordingly. And of course you can explore everything you have attached to your project in the settings. Let's say, for example, you want to go into your transactions: you can explore all the tables and have a look at your data, so that you have full trust in what is being processed by the platform. So, I hope it was clear. Of course, don't hesitate to reach out; sorry for not being able to be here today, but I will be happy to follow up on any question you might have.

And there we go. That concludes the recording from Benoit. Just to recap, he walked us through a little bit about the platform, and then that last piece showed how you would use one of these pre-built templates on your own data, so you wouldn't have to rebuild all of the steps involved. Of course, you can always take your own approach and make your own steps, but this was just one of the features to show. That's all we have from the recording. Vicky, did we have any questions in the app?

We did get a question, and I think it's quite interesting to share: is it one model for all users, or is the model trained for each company? The answer from Quentin: the models are trained on the data provided, so you can have a global model or specific ones on subsets of your products. The model shipped with the solution is trained on sample data and should not be used on new data without retraining first. So I think that's quite a useful answer to the question; thank you very much for putting that forward. We don't have other questions at this time from the audience, but I do have one last question, if I may. I'm keen to understand from the people on the call if they're actually using AI at the moment; you know, is there an AI solution in place? So you'll see a poll that's just popped up. I think it's important to understand: AI and ML have been around for quite some time now, but it would be great to get an understanding of who is planning to look at that on the twenty twenty-four roadmap, or if anybody is still getting to grips, like me, with what it's all about and what that entails. So I'll leave that poll up for a couple more moments, Michael, but if you want to carry on?

Well, actually, this is going to be the end of the presentation, but just to piggyback off of the last question:
yeah, you can adjust your data to fit some of these templates, and they have several other templates in there, all meant to be applied to your own data so that you can get real results very quickly. They have them in several different sectors, including finance and healthcare, so it's very interesting to check out all the different templates they have. But that was all we had for today.

Wonderful. Just one last point from myself: if anybody is interested in reviewing Dataiku and learning more about it, we will be sending you a link to a trial, so you can take the platform and give it a try for yourselves. Here at InterWorks, we also have an AI readiness assessment that we can offer, which is basically one of our consultants coming in to support you, understanding whether your data is in the right shape and in the right place, and what you would need to do within your organization to be AI-ready. So thank you so much, Michael, thank you to everybody who joined us today, and we'll get a copy of the recording out to everybody. Appreciate your time.