Awesome. Thank you so much, Vicky, for this nice introduction. Let's go with our webinar for today, exploring the semantic layer. My name is Sebastian. I am based in Augsburg, Germany, if you were wondering about my accent. I'm an analytics lead at InterWorks, a global data consultancy, and in that capacity I'm here today to share my knowledge about semantic layers. And I'm going to hand it over to Michael and Rachel for a quick one-sentence introduction, please. Yep. Hi, my name is Michael, and I work on the solutions team as a delivery architect. I help clients basically find out which solutions could be the best choice for them given their goals or their problems. And then, if they like those solutions, I can help deliver them — one of those being implementing dbt and/or Tableau Cloud. And my name is Rachel Kurtz. If you're wondering about my accent, I'm based in the US, on the East Coast, so it's eight a.m. here right now. I'm an analytics architect at InterWorks and have been here for a little over seven years now. I kinda work with the entire life cycle of data. I have a background in data science, but I work with data prep, data visualization, and model building, so I do a lot with dbt and Tableau as a combination. So, excited to help you guys understand what we're talking about here. Thank you so much, you two. Alright. As I said before, we are a global data consultancy. We have existed for over twenty years by now — a long time. We have a mantra here: best work, best clients, best people. If you want, you can take a screenshot, or look up the recording afterwards, and read through all of that. This is a mantra we live by, and there's far more to it than even those few sentences. But what I want to say with that is: we cover the whole data pipeline. So we start with the data in the first place, and we hopefully end up with usable insights at the very end. So much about us.
For our agenda today, we have a tiny prologue where we go into pipelines and data prep. We talk about what a semantic layer actually is — tool-independent, at the very beginning at least. Then we have a quick dive into two selected tools: dbt, of course, which is our focus tool for today, and also a tiny bit about Tableau, because, hey, in the end we want to show that data somewhere and do something with it, and Tableau is one of the tools we can use to do exactly that. Then we'll dive into the semantic layer directly in dbt, and we'll close the whole thing today with a tiny discussion among ourselves, or based on questions you ask in the Q&A section or in the chat. So, pipelines and data prep. What you are looking at right now — for those of you who have not seen a reference architecture, or an architecture in the first place, before — is what we usually look at when we are trying to build a data pipeline. So, how to read the whole thing? Usually we start on the left. We have some data sources somewhere: could be on-premise databases, could be something in the cloud, and so on. And then we have to do something with that data. Usually there is an ETL or ELT process involved in there. We have to store the data somewhere, do analytics with it, and, ideally, share the whole thing in the end. I'm going to show you another one, because this is only one example of so many different architectures that are possible. Very similar in terms of content, but looking a little bit different. So here we also have a start marker, which is quite helpful. We start with the data sources we have somewhere and extract that. For those of you who have not really dealt with ETL or ELT pipelines: extract, load, transform, or extract, transform, load, depending on which direction we go. Our arrows guide us through the whole processing here. So we do transformation. We clean the data.
We enrich the data and transform specific fields. We store everything in a database — could be a lakehouse, could be a data warehouse, could be anything else; there are so many options around that, actually. We have an orchestration layer in here. There are two middle sections guiding from the database to the analytics development: machine learning, and the one we are talking about today, our semantic layer. From there, it can go into validation, if we have a validation tool, and then into the actual reporting and sharing area. This is where we produce dashboards, this is where we produce reports and everything else. Hopefully, as I said before, this all leads to actionable insights in the end. Sometimes you also see this red dotted line down here. Not ideal, but it can still work in some edge cases, where we directly connect to whatever data we have and look at what is happening there. Not ideal, not recommended — usually we go through all the steps, or at least a few steps, of our pipeline. So, you might have noticed the semantic layer is somewhere here in the middle, in the center of the whole piece. It's not really a data prep layer or data prep tool, as I've heard recently. Actually, the question of what a semantic layer even is has come up a lot in recent times. First, we have quite a few tools out there currently developing semantic layers, offering semantic layers in their tool sets. And, also, there's a lot of — I wouldn't say misinformation, but misguided information — around. For example, just last week I heard: hey, a semantic layer is an AI layer, right? And I have to say no. Definitely not. It can definitely work with AI. It can be supported by AI solutions. But the semantic layer is actually a very old concept.
It goes back to the seventies and eighties, and I think it was first built, not just theorized, in the nineties, something around that. But semantic layers are really experiencing a hype right now, since we are all a little bit more aware of how data pipelines work in the first place. So, that being said, let's go a little deeper into what we are actually going to talk about. I asked ChatGPT recently: hey, how would you explain a semantic layer to someone who has never heard of it? And it told me: hey, it's a kind of translation — think of it as a universal translator for everything. Well, at the time, I believed that. Today, I am not so sure this is true anymore. But at the time, I immediately looked into universal translators, and stuff came up, and I fell into a rabbit hole of different iterations of universal translators. So I couldn't really use this, and I asked: hey, what else could we use? And it came up with the Babel fish from The Hitchhiker's Guide to the Galaxy — this guy here. And I thought: yeah, well, this also translates stuff, but, actually, maybe it's not a translation tool, at least not really. So I thought a little bit more, got feedback from a lot of my colleagues and also from the Internet, and, well, we came to a really cool explanation for the whole thing. So let's consider another data pipeline, a little bit simplified. Here we start with raw data somewhere and go through the whole process, with the semantic layer here, in this example, in the center. And at the end, we share stuff. So the example here is: we start with grains on a field. Imagine we are currently in the food industry, and we have to, well, get the grains from somewhere — usually from a field somewhere. Well, this corresponds to our raw data.
We have our ETL or ELT process to extract, load, transform our data, which is basically the tractor here, carrying the stuff around, maybe sorting it, filtering out everything that we don't want to use later on, and so on. We have our database, our storage, which actually has a similar expression in the real world too: we have a storage building somewhere, silos, where we store everything we gathered and collected before. And now I'm going to skip the semantic layer for just a second and go straight into the kitchen. This is our analytics layer. This is where our chefs start to do something with those ingredients and ideally come up with dashboards, with reporting, with everything that we can later use to base decisions on. Because this is usually what we do with data, right? We have data somewhere; we want to do something with it so we can make a decision upon that data. And this we usually do later on in our sharing space here, in our restaurant, where we then have a buffet — not an easy word to pronounce, that one. Buffet. Thank you so much; at least that's how we pronounce it here. Where we have all those delicious dishes, and we can consume the data as we need it. Now, what does the semantic layer do? I also recently heard another expression that I haven't heard that often before in my life: the mise en place. This is partly prepping data, but more defining and prepping data. So, in our kitchen here, this corresponds to getting all the ingredients into a specific order, so that as a chef we can easily pick them and immediately start cooking. And we also have a little bit of prepping in here, in terms of making sure our produce — our vegetables, our fruits, whatever we have — is prepared as we need it later on. So the chef, and basically all the users after that in our pipeline, can work with something that is already in a workable form.
I really like this example, by the way. I think I'm going to use it for many other things in my life, because we can actually enrich the whole thing with all the tiny steps in between. Think of a governance layer, for example, and how that would fit into this example. Quite limitless, actually. Alright. Why would we think about the whole thing in the first place? Why should we think about a semantic layer? Here I'm going to start with a buzzword slide — I'm very sorry about that. There's a little bit of text here, but at least a few cool colors, or at least one right now. So: consistency. When we build a semantic layer — and we'll get into what it technically is in a few moments — we get consistency, meaning we have metrics and definitions centralized: one source of truth, a single source of truth for our data, which can then be used across different other tools. We have implications for governance: in the dbt semantic layer, for example, but also in semantic layers in other tools, this is directly connected to the lineage capabilities of those tools, where we can immediately see where a metric came from and what data it actually uses. We have version control in most of those tools. And our end users, or whatever dashboard uses those metrics, can be sure that the code for those metrics has been reviewed before. And this is not just: hey, we have coders who build stuff — because they usually review their own code. There's also a kind of iteration, a feedback loop, involved. Like with every other BI tool around the world, when we build a dashboard, ideally there's feedback later on and our dashboard improves. The same goes for metrics in a semantic layer. When we see there's something wrong with one, there's usually a feedback loop to those who built that semantic layer to improve on it, which then leads to more consistency and more centralized data that we can use.
There's also this big argument around portability. Portability not in the sense of using different devices or anything like that — no, here it's about untying the business logic from specific platforms. So, for example, if you have a pipeline in your company currently where a database leads directly to a BI tool — could be Tableau, could be Power BI, Looker, or whatever you're using — quite often you are building that business logic in that BI tool. Which, of course, means it's kind of difficult afterwards if you want to switch to another BI tool, because you have to migrate all that business logic to the new tool. If we move this business logic into a semantic layer before that, we can basically swap BI tools later on without having to think about that business logic at all. And that, of course, also leads to this tiny perk here: we can avoid vendor lock-in, where a vendor tries to force us into one specific universe, one specific platform, with a lot of dependencies on that specific platform. What can we do with that afterwards? Well, our usual BI and reporting, self-service analytics — you know, all of that — and, of course, AI applications. Not the most important part of the whole thing, but at least something that is emerging: the more streamlined, consistent, and governed data we have, the better the outcomes of our AI solutions become afterwards. Okay. But I think this is enough theory for today. Let's go into our tools. Vicky, can I quickly ask for our first poll, please? Absolutely. Fantastic. That poll is now up. We'd love for you to participate — please take a moment to complete the question: how comfortable are you with the term semantic layer? Hopefully, after Sebastian's wonderful introduction, we're all feeling extremely comfortable. So we're just gonna leave that open for a little bit longer. It looks like over half of the people who are attending have participated.
Awesome. Thank you so much for that. Wonderful. I'm gonna stop the poll any second now and then share the results. And those results are up. Awesome. Thank you so much — that gives us a little bit of context around what we're talking about here. So the majority picked "I have more than one idea of what it might be." Yeah, I have to say I was in that bucket as well for quite some time. And the other items in our tiny list are pretty much equal, actually. Okay. Awesome. Thank you so much. We are going to get into another one in a few moments. But first, I'm going to hand over to Rachel for a quick introduction to the first of the two tools we are going to talk about today, dbt. Rachel, should I just go to the next slide? That'd be great. So, yeah, just doing a quick little three-to-four minutes, basically, to give you an introduction to what dbt is, since this is where we'll be talking about the semantic layer specifically — within this instance of dbt. So I just wanna give you an understanding of what dbt is, for those of you who haven't heard of or utilized dbt at this point. If you'll go to the next slide. So what is it? dbt stands for data build tool. One thing I also like to point out is you'll notice that every time I write dbt — hopefully, if I did it correctly — it's always lowercase, and that is very important to them. They're very particular about making sure that dbt is all lowercase, even if Microsoft Word wants to capitalize that first letter. But dbt is a data build tool. I'm just gonna read this sentence: it enables data analysts and engineers to prepare and transform — didn't spell that correctly — transform data in their warehouse using familiar SQL syntax while providing features for testing, documentation, and version control.
So kind of think of it like this: you've got your data engineers who've created your data and brought it into Snowflake, Databricks, Redshift — wherever your data lives — and then you're trying to visualize it within something like Power BI or Tableau, whatever it might be, but you realize that there are some edits that need to happen. Some kind of "last mile data prep" is a phrase that we use. This is where dbt comes into play. It's a transformation of the data into a form that you can actually utilize — renaming fields, creating calculations, that kind of thing — before you start visualizing and making your insights. So dbt lives in that middle space; it's kind of the middleman between your big data parts — your data warehouses — and your visualization layer. If you go to the next slide. Like I was saying, this is SQL-based. So all things that are available within SQL — loops, conditionals, etcetera — you're able to utilize within dbt. On the right-hand side of this, you'll see a little bit of code. This is actually from within one of our dbt model-building instances, where this is creating a model — creating a dataset, basically — that is selecting four different fields and joining things together in a way that makes it reproducible and able to be utilized afterwards. So: all SQL things. If you can do SQL, dbt isn't too much of a stretch for you. It allows for creating tables and views, similar to what we were just talking about; testing, so making sure that your data matches certain criteria that you have — making sure that everything is unique and not null, making sure everything is greater than zero, whatever test you're trying to run on your data, you're able to do that before you even bring it into your visualization layer; and then also documentation of the process.
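To make the "model is just SQL plus some templating" point concrete, here is a minimal sketch of what such a dbt model file can look like — this follows the pattern of dbt's public Jaffle Shop example, with illustrative table and column names rather than the exact code shown on the slide:

```sql
-- models/orders.sql — a hypothetical dbt model, Jaffle Shop style.
-- ref() tells dbt to build this model on top of the staging models,
-- which is also how dbt draws the lineage graph between models.

with orders as (
    select * from {{ ref('stg_orders') }}
),

payments as (
    select * from {{ ref('stg_payments') }}
)

select
    orders.order_id,
    orders.customer_id,
    orders.order_date,
    sum(payments.amount) as order_total
from orders
left join payments
    on orders.order_id = payments.order_id
group by 1, 2, 3
```

Running `dbt run` would materialize this as a table or view in the warehouse, and tests and documentation can then be attached to it in YAML.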
So if we're talking about governance, and the semantic layer being one of the ways of utilizing that, the documentation within dbt is similar in spirit as well. On the bottom of this, you'll see this lineage graph. That's something that comes with dbt, so you can see how your models interact with each other — how orders and payments come together to make fact orders, and then fact orders and customers come together to make dim customers. It's just showing you this lineage around it. So dbt is basically a nice, SQL-based way of doing all of this, with a visual layer on top. And I should say, for those of you who are not familiar with dbt: this is talking about dbt Cloud, and the semantic layer for dbt is only available in dbt Cloud, not dbt Core. dbt Core is more local, code-based work within your own environment. dbt is great in either version, but I just wanted to note that some of the things we're gonna talk about today are specific to dbt Cloud. So that is a quick little crash course in what dbt is, and now I'll pass this back to Sebastian. And if you have any questions about dbt — we work with dbt a lot, and they're one of our partners, so we've got a lot of blogs on them — you can always ask in the chat, and I can answer. Awesome. Thank you so much. Vicky, can I ask you to run our next poll, please? Yes. This is a bit of a double poll, so we're exploring something new here — a couple of quick questions. We want to understand what your skill level is with dbt, but if you keep scrolling down, we are also asking the same question for Tableau. So I'm gonna leave this open for a moment longer than the previous one, because I appreciate we're asking you two questions for the price of one. And it looks like it hasn't put anybody off, which is fantastic. Thank you so much, Linda.
Everyone seems to be participating, which is great and really helpful for our hosts so that they can tailor the presentation a bit more as we go through. It looks like the majority of people have answered — eighty-five percent of people have answered — so I'm going to end the poll, and I'm gonna share the results right now, if you're interested. Yeah. The results for dbt look pretty well spread across the board — I mean, twenty-five percent, twenty-five percent, twenty-nine percent. So, yeah, it's a big spread for the dbt tool. For Tableau, do you have the same or different? In Tableau, it looks like a lot of people here are frequent users — at least two-thirds are using it. So people are very familiar with Tableau, it seems. In that case, I am going to shorten our Tableau introduction for today and make this into a one-minute crash course directly in the tool. Let me quickly think how to do that in one minute. So, we're looking at one tiny sheet in here. Tableau is a drag-and-drop tool for fields that come from a data source — fields on the left. Every field can be dragged into the view. In this case, we have flight data from two airlines. But, actually, let me do that quickly for you. You know what, I'm going to connect to our beloved Global Superstore from Tableau. If you want to stop the time, feel free to do that. So I'm going to quickly create a tiny dashboard here as soon as Tableau has loaded the data. I'm going to take a category here, maybe a subcategory, also one of my measures down there. I sum up the whole thing, and I color the whole thing, maybe with a cooler color palette in here. Yeah, that's cool. Let's create another sheet. In this case, I want a tiny scatter plot with those subcategories in here, with a few circles, a few names, and also a few colors. And I'm going to create a tiny timeline here with an order date field and a measure.
I'm also putting this one onto color, and now I'm going to take my three sheets that I've built, throw them onto a dashboard, get rid of everything that I do not need, and switch the filters on. And with that, I have a full and interactive dashboard in front of me that we can already use to analyze our data. So I hope this was one minute. I feel like you can do that in your sleep somehow, but that was quite impressive. Well done. Awesome. Alright. And with that, I think we have waited long enough: we are going to dive into dbt. And I'm going to hand it over to Michael to talk a little bit about the semantic layer directly in dbt. I'm going to stop my sharing here, if you're comfortable with that, Michael. Yep, that's fine. Good. And I will go into here — the dbt semantic layer. So, it's been a good lead-up, talking about what dbt is and also Tableau. Now for how dbt views the semantic layer — the vision to come. So, right now, we can imagine this is a normal workflow. We can see that we have some raw data that's coming into our cloud data platform. We might already be using dbt to transform that data and make it a bit more consumable for all of our audiences downstream, such as business intelligence and data science, or if we need to store it in a catalog or feed it into an application. These are all consumers of this data, and they're all using different tools to do this: Tableau, ThoughtSpot, Mode, Google Sheets, maybe some data cataloging tools — all of these different tools. Now, the problem that dbt is trying to solve here is that these separate units keep trying to recreate the same metrics, or have to recreate the same modeling, in each one of these tools. That can cause issues with governance and consistency; we have a hard time trusting the data, and we lose time and efficiency. So the way that dbt sees the semantic layer playing out is by adding that piece just above the cloud data platform.
So then, instead of going directly and querying the data source itself, you can go to this kind of premade idea of what your data should look like. And these are models — not necessarily tables that are sitting there, but models: instructions. So that when a business intelligence analyst says, hey, I need the metric for how we calculate revenue — it sounds like an easy one, but depending on your data sources, it might be quite complicated — somebody has built that in the semantic layer, and that way I no longer have to recreate it every single time I build a dashboard. As well, if the data science team says, hey, I need to make sure that I'm taking revenue into account, we also wanna make sure that they're using that same kind of logic inside of their projects. And doing this will allow for that to happen. So this is how we see the idea of the semantic layer supporting every piece, every discipline, downstream. Now, this is just the concept; we're gonna dive a bit deeper, layer by layer. So if we're talking about this piece, the semantic layer, what does this actually look like? Well, this is a screenshot from our dbt development environment, and it's based off a very well known example in dbt called the Jaffle Shop. What we can see here is that we have a bunch of these blue models here. You can imagine this is staged data for payments, this is staged data for orders. And then we did some combination, some joins, and we made an orders table out of it. We have some staged customer data, and then we are able to make a dimensions table called customers. So this is very normal — a lot of people on this call today are using Tableau, and you can see how you could already start modeling your data this way and then start consuming it inside of Tableau. But going a step further, we can then create these semantic models here, one for orders and one for customers.
And then, from there, we can start to create measures and dimensions from those without creating tables. These are already kind of preaggregated instructions about how we should interpret the data, and then we can start to get different metrics out of that. Now, this can be a lot just by looking at it, so we're gonna jump over into the dbt development area to explore this a little bit more. So I will exit out of there. What you'll see is that we are in the dbt development environment, and this is that exact same model that we had up on the presentation — I'll just make this a bit bigger. And there's some interactivity here: we can move this around if we want to. And if I wanted to, I can see, hey, I have this metric over here called average order value, and I can see the lineage of how we actually got there. But what is making up this graph — how did we get here? Well, we can go up a little bit. Now, this might look strange to people who haven't interacted with dbt before, or haven't interacted with YAML files. For these semantic models, it's all in this declarative format, a YAML file, and you can see it's highlighted over here as semantic_model. What we've done inside this file is we've created two semantic models, one for customers and one for orders — so you'll see here's one for orders and one for customers. Inside of these, we gave it a couple of things. We said: hey, this is a description for it — it's just a semantic model for dim customers. That might not make a lot of sense; maybe we wanna add more information to it, and this can be used later on, because we can pull metadata about this. So then we can start providing context to other downstream consumers about what this model is and how it should be used. And we can also see what data source we're using for this. And we can see that we create a couple of dimensions.
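For reference, a semantic model definition in one of these YAML files looks roughly like this. This is a hedged sketch following dbt's MetricFlow semantic model spec, with illustrative names rather than the exact file on screen:

```yaml
# models/semantic_models.yml — illustrative dbt semantic model
semantic_models:
  - name: orders
    description: Semantic model built on top of the orders model.
    model: ref('orders')          # the dbt model this reads from
    entities:
      - name: order_id
        type: primary             # the grain of this semantic model
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum                  # aggregate the order_total column
      - name: order_count
        expr: 1
        agg: sum                  # count orders by summing 1 per row
```

The description and metadata fields are what downstream consumers can pull to understand how the model is meant to be used.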
If you're familiar with Tableau, we can see that we have customer name here, we have first order date, we have most recent order date. And those of you who've done cohort analysis — this will seem very familiar, where you're trying to find out: okay, when did they first make a purchase, and what was their most recent purchase? We can also create some measures out of this and look at the total number of orders they've made with us since they've been a customer, and we can see how much they've spent as a customer overall. And now you can start to see that if you were in Tableau, you could say: okay, yeah, I know how to do that — I might have to make, like, an LOD expression with a FIXED calculation in there and work all that out. But now all of this is already being done upstream, and no longer just for Tableau, but for everybody else downstream. Also, here, we can create some metrics. If you're used to Tableau Pulse and you've seen how you can create metrics there, this is a very familiar framework. We've gone ahead and — okay, here's an order total for our metrics. And we said, okay — where is it now? — we've labeled it as order total. That's all it's doing: it's just putting out the order total. But we can go further down and see how we can actually aggregate these. And here's a ratio one, where we said: hey, for average order value, we want it to be a ratio where we use order total as the numerator and order count as the denominator. And then this will be able to be consumed, again, downstream. Now, again, sorry if this seems a little complicated. We're not gonna do a whole tutorial about how to use the semantic model and define it and build it at the moment; we just wanted to give a bit of an introduction to what it looks like behind the scenes.
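The two metrics being described here — a plain order total and a ratio — would be declared in YAML along these lines. Again, this is a sketch in the style of dbt's metric spec, assuming measures named `order_total` and `order_count` exist in the semantic model:

```yaml
# Illustrative dbt metric definitions (MetricFlow spec)
metrics:
  - name: order_total
    label: Order Total
    type: simple                  # just exposes the underlying measure
    type_params:
      measure: order_total

  - name: average_order_value
    label: Average Order Value
    type: ratio                   # numerator divided by denominator
    type_params:
      numerator: order_total
      denominator: order_count
```

Because the ratio is defined once here, every downstream tool that queries `average_order_value` gets the same numerator-over-denominator logic instead of rebuilding it per dashboard.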
If you are looking for more information about how to build this and get the right model, measures, dimensions, and metrics, we can definitely help out with that later. But let's go one step further and see what we can do with this now that we've created this whole model. So I'll jump into Tableau, and here we are. What I've done is I've taken the dbt connector that you can find on the community exchange and installed it on my desktop, and that now allows me to connect to my dbt Cloud instance. And now I'm able to pull in all of those measures, dimensions, and metrics that we just created inside of our semantic model. So I'll go over here — I was playing around with this a little bit earlier. And what we can do is grab order dates here and throw them into columns. We can look at, maybe, percent of orders that are large — so over a certain dollar amount. Again, I don't have to build that logic; it's already been built for me, so I can bring that into my rows. And, actually, let's put this down to day. And there, I can already start to see, by day, how many orders I had and which ones were large orders and which were not. So on this particular day, a hundred percent of all my orders were large orders. I can then even take order count, slip it into the tooltip, and then I can see how many orders we're talking about there. But here, I haven't had to do any other aggregation. I haven't had to create any fields or calculated fields — this is all done for me. As well, I can look at total orders over time, doing the same type of thing: taking order dates, throwing them in, going by — oh, I was wrong, that's order dates; I need to practice my Sebastian skills — and going into order count. I feel very honored. And then, again, I can already start to build this. Now, these are simple use cases.
You could probably do some of this by just connecting to the raw data, but it's more to illustrate how we can start thinking about the semantic layer and some of the possibilities that will save us time down the line. And I don't have to rely on just this semantic layer data source. This is a normal Tableau workbook; I can bring in other data sources as well. So I can have these as maybe a high-level row of metrics on my dashboard, and then I have the rest of my data down below. That saves me time from having to recalculate all of those every single time. And I can even link them to filters so that we have that true interactive experience. So, again, just a couple of different ways that we can think about the semantic layer and what it looks like downstream. Now, going back to the presentation — I'll see if I can get back to here. So what you just saw was the Tableau Desktop version connecting to a — Tableau Cloud, I'm sorry — a dbt Cloud instance to look at the semantic layer. Now, the cool part is that dbt and Tableau Cloud have partnered up, and they are now going to make that possible in Tableau Cloud. It hasn't been released yet, but it is coming soon; we saw this was published back in early October, maybe around October eighth. And another cool thing they're doing is that they're going to allow this idea of building metrics in dbt and then pushing those to Tableau Pulse. So, again, instead of having to create all of those metrics in Tableau Pulse, you could create them one time in dbt and have them pushed directly to Tableau Cloud. So, some good functionality that's coming through the partnership between dbt and Tableau Cloud. And that is it in terms of exploring the semantic layer. We have a little bit of time left to talk about questions — if anyone has anything they'd like to discuss, happy to discuss. Oh, we have a few things, actually, but I think the questions have priority here.
So, we have a few things directly in the chat and also in the Q&A area. First, Alexander, hope I'm pronouncing that correctly: is the dbt semantic layer available in dbt Core? If not, what are the alternatives for dbt Core as a semantic layer? Thank you. This is a tricky one, I think. Right? Michael, I'm looking at you for a second here, but, Rachel, if you want to chime in, you are invited to. I don't think there really is a good alternative right now. No. Because, basically, the semantic layer is using some of the things that are available within the open source side of dbt Core, like MetricFlow. But it then uses some API and querying pieces that are very dbt Cloud specific. So, unfortunately, you could set up metrics all you want within dbt Core, but the layer that translates that from your model and your code to your BI tool of choice is not available with dbt Core. At this moment, at least. Makes sense. Yeah. I was very fortunate to go to Coalesce, which is dbt's conference, in October, and they were talking about trying to create a version of dbt where Core and Cloud work a bit better together, because so many companies have both in place. Thanks a lot. Yep. Alright then. From the chat, I think this was for you, Michael: can you share the repo of the code you are showing, for reference? The repo of the code? Yeah. So this one can be found on dbt's website; it's the Jaffle Shop semantic layer walkthrough. You can look it up, and they've got this whole set of exercises you can do. It's free to anybody who wants to try it, and it will walk you through how to build this entire thing. And I think the dataset in there is being pulled from some type of S3 bucket. But, yeah.
Rachel, do you remember where the source was for that? For how we created this? Yeah, we could probably find the link. We have a link that has the step-by-step on how to do that, so we can send that out. But it's a very good tutorial, and it'll take you through this entire thing. And once you go through that and start building it out for the first time, a lot of things start to click, and you can already start to see how you might be able to do that for your own organization. Mhmm. Okay. I'm jumping a little bit ahead here to the latest question that came in just now. I'm not sure how the name is pronounced, I'm very sorry about that: M-o-i-s-e. Okay. About limitations in terms of calculations and aggregations in Tableau with metrics from the dbt semantic layer: are LODs possible? And, also, is joining possible in Tableau? Reading those out, Michael, in case you need to test that beforehand, or if you want to test it. No, I actually tested a lot of that. So, yeah, it's a really great question, because part of the reason why the semantic layer is there is governance. And so they've put a lot of guardrails in place around how you can use things downstream. So let's say, for instance, what I've done in this example is I've brought order count over, and it's already putting it as sum. I didn't say to do that; it just already did. If I go here and try to change this, it's not going to let me. I can't say, oh, actually do a distinct count of it. Anything else, it's not going to let me either. So, yeah, it is important to note that if you are using measures and dimensions coming from the semantic layer, you will be restricted in what you can do with them, because a lot of it has already been pre-aggregated. Awesome. And for the joining: can we join tables with a data source that is dbt? Yeah, I actually gave that a shot the other day.
Let's see. I, you know, I got rid of it. We believe you, actually. Yeah, it's okay. But I was able to bring in, so we've got this one here. These are the source tables for that one, and I was able to join them with the semantic table. So, yeah, you can do joins on it. Awesome. Hope that answers the question. Alright. Yep, sorry, there's one in the Q&A that I do wanna take; it's a nice easy one to answer, so I'm just gonna do that one. In the Q&A, someone asked if the semantic layer connector is available for Power BI. There are a few native integrations right now; Power BI is not one of them. I believe it is on their road map. I'm putting the link to the available integrations in the chat, but you can use exports in order to do something. If you go on that link that I sent, about halfway down under custom integrations, it says exports enable custom integration with additional tools that don't natively connect with the semantic layer, such as Power BI. So I believe connecting natively for Power BI is on their road map, but right now there are workarounds for it. Awesome. Then we have one in the Q&A section, let me read it out. It seems to me that a semantic model would typically be implemented and managed by a central team. What would be the best approach if your organization follows a data mesh architecture where decentralized domain teams all build their own data assets? One semantic layer for all, or one per domain? Would the latter defeat the purpose of a semantic layer in the first place? That's an excellent question. Yeah, I don't think there's one answer to that. Right? No, that's great, I was gonna say the same thing. No, there isn't. I think it's gonna come down to specific organizations: what the pool of people with these types of skills is and how that's distributed, and how you get access to this tool.
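To make the exports workaround mentioned above a little more concrete, here is a rough sketch of what a saved query with an export might look like in dbt's semantic layer YAML. The names, metric, and target schema are hypothetical, exports themselves require dbt Cloud, and dbt's documentation has the exact syntax.

```yaml
# hypothetical sketch of a saved query with an export
saved_queries:
  - name: daily_order_metrics
    query_params:
      metrics:
        - order_count
      group_by:
        - TimeDimension('metric_time', 'day')
    exports:
      - name: daily_order_metrics
        config:
          export_as: table        # materialize the query result in the warehouse
          schema: analytics       # assumed target schema
```

The export materializes the governed query result as a table in the warehouse, so a tool without a native semantic layer connection, such as Power BI today, can read that table directly.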
I think all of this is going to play a part in there. dbt does have a dbt Mesh feature. I have not yet played around with it enough, but I think it's an excellent question that needs to be brought up inside the organization, and then somebody, or probably a group of people, should debate what's the most scalable and efficient way forward. On one hand, you do wanna put some guardrails and governance in place. But if that makes it too hard to work with the data downstream, then we need to start thinking about how else we put that out there. What can we loosen up a little bit? What do people still need to access tables for? So, yeah, I think it's going to involve a lot of discussion. Sorry, I can't give a concrete answer on that, but it is an excellent question that does need to be answered. I do also just wanna comment that there are a lot of articles, and a webinar online, not ours, about utilizing dbt Mesh and the semantic layer together. So there's a lot of material out there where they talk about things to consider. Basically what Michael was saying, but with specifics. So I definitely recommend you just Google dbt Mesh semantic layer, and you'll get a whole bunch of things. Yeah. I guess the main argument here is that when you have different semantic layers, the purpose of having a single source of truth basically doesn't exist anymore, especially when there's overlap between different teams using the same data. Yeah. And you wouldn't want to create metrics for every single thing that's out there. I think there's still gonna be some stuff that has to be done custom, where you could probably still use the models. But you will probably not create metrics for everything and just live off of metrics and consume only those.
At least not for now; I think there's still too much tailoring of the data that needs to be done. So I would look at it as using it when it makes sense: taking that high-level organizational business logic and scaling it out so that we don't have to repeat ourselves, and we know that at least our most important KPIs are correct no matter where they are. The way I like to think of the semantic layer is that it's for production-level reports, production-level insights. Those are the big decision-making pieces. It's also good when you want answers quickly. I mean, you saw how quickly Michael was able to build this now that the semantic layer was created. Right? He didn't have to change a lot of things; he didn't create these calculations. So if you're trying to get answers quickly, that's another good use of the semantic layer. You don't have to do the joins, you don't have to do the calculations, you don't have to do anything like that. But there are still times, like Michael was saying, when you're trying to do more exploratory analysis, or you're trying to figure out what a new calculation would be, where you would probably not connect to the semantic layer but instead connect directly to the models themselves. So it feels bad to constantly give this kind of consultancy answer of "it depends", but it kinda depends on what your goal is and when you're utilizing it. So that's one thing to think about. So, yeah, here, like with his orders or customers or whatever it is, these metrics that were created, you can connect to those immediately by going to the orders. But if there are some columns that aren't in your semantic layer, you're gonna connect to the fact orders model on its own. So knowing when to use it is a big thing.
And like he was saying earlier, we have a whole paper on it, written by somebody at our company named Jack Colbert together with dbt. It's a nice little white paper that talks about the semantic layer and when to use it. One more. Andrew is asking: do you have any guidance on when dbt would be useful for an organization? So not just the semantic layer, but dbt as a tool. Is it based on number of users, size of the org, complexity of data? And this is something we can definitely talk to you about, Andrew, if you want, in a more extensive call, because it's probably not an easy answer. Yeah? Yeah. But, generally, maybe we have one or two talking points here. I would say, again, unfortunately, we won't have a hard cutoff or anything. But I have seen organizations as small as three people using this for managing all the transformations inside of their cloud data platform. They had multiple data sources, I think it was like fifteen different data sources, and they all needed transformations. And, also, the version control is extremely important, especially when it's just two people and they're probably not touching every single one. It's got the documentation piece. It's easy to pick up. So it's hard to say when you need it and when you don't. But if you start finding yourself asking, how do we come up with this table, and you start tracking things back and it's taking your organization a lot of time, or if you're saying, oh, we can't trust these numbers, we're getting too many conflicting numbers out there, then putting tools in place such as dbt to help with that governance piece can help with trust and can help with consistency. So, it ranges, is the answer. But, yeah, again, happy to talk in more detail about what your organization is like, what your data is like, and whether dbt is a great answer for you.
Rachel, did you have any other thoughts on that? No, I think the biggest question mark is where your data lives. Right? That's gonna be one of the bigger deciding factors in whether you use it or not, and again, I'm specifically talking about dbt Cloud here. If it's a bunch of Excel files that live on your computer, dbt Cloud may not be the thing for you. But if it's a cloud-based infrastructure, that's gonna be a better fit. It really depends on a lot of different things. The big thing that we always talk about with dbt is, if you're thinking about ETL or ELT, extract, transform, load, dbt does the T within ETL. It only does the transform; it's not doing the extraction. So the idea is that all this data already lives in a central data warehouse. That's the thing to keep in mind as well when we're talking about this. But I think those are the only other points. I think almost everybody could use dbt, but I'm a little biased; I really like using it. But, yeah, I think that's pretty much it. Michael pretty much covered it, but we can always chat with you specifically about your company's needs and goals as well. That'll be a big thing of, like, where are you trying to build to? Maybe you're a team of two right now, but the goal is to be a team of ten in the next year, and then this may be setting you up for that so you don't have to retroactively fix things later on. Yeah. Cool. Michael, would you quickly jump to the last slide, our actual last slide, please? While we still have thirty seconds until I hand it back over to Vicky for a closing remark, one last question just came in, from Marsen. So from the dbt level, you define connected models and the list of fields available to users. If you don't want to let users list two million orders in Tableau by accident, you simply don't include them in the semantic layer.
Right? Correct. Yeah. So, yeah, you wouldn't put it in there. And you can do a lot of upstream modeling, so you can have a middle layer in there and then create, kind of, the final semantic layer that defines what is going to be exposed. So, yeah, there are definitely workarounds to do that. Not a problem. Awesome. Thank you so much. Vicky, any closing remarks for today? No, just to thank you all for your time; it's much appreciated. We hope you got value out of this session. We will be sending a copy of the recording out to everybody who registered to attend. Finally, don't forget to take a moment to enjoy that lovely QR code, they seem to be everywhere nowadays, which takes you to a link to our Semantic Layers in Action white paper. So please take a moment to download that. If you have any further questions, please reach out to us at interworx dot com forward slash contact. We are more than happy to discuss anything that we've covered here today in more detail. Thank you very much to our hosts, and thank you all very much for attending. Appreciate it.