Hello everyone. Welcome to Dataform. My name is Garrett Sauls. I work for a data consulting firm called Innerworks. We're in the US, UK, Germany, Australia, but I am decidedly not one of the consultants. I work on the marketing team. I'm a content manager, and I always joke that I'm a corporate English teacher. My tie to enablement is largely through Annabelle, and largely through getting our consultants, getting very smart data people, to share their thoughts with the outside world, whether it's on a blog or on a webinar. So I just try to get people to talk and facilitate discussion, and that's exactly why we're here. So I'll pitch it over to Annabelle to introduce herself, and then she'll introduce our lovely guest. Pleasure. Thank you very much. A warm welcome also from my side. My name is Annabelle Lincoln. I am tuning in from Switzerland. I am available for hire, I should say that. Anyway, I'm passionate about data visualization and also enablement. And I think it's the eleventh episode that we have, Garrett, with Dataform, about enablement, because it's something that we consider is missing in a lot of places when they want to do a data or digital transformation or implement a new software. Sometimes enablement is the missing piece, so that's what we wanted to bring attention to. And today it's a great honor that we will be receiving Ant Alfred. Ant is the group head of data services at Cognita Schools. I will let him introduce himself a little deeper. But to get you started, Ant: you recently joined Cognita Schools, and we know that you have considerable experience in data and analytics. By the way, I saw him speak at the Tableau summit, I think it was some months ago, and I found what he said about data enablement and literacy very interesting. So that's why I'm very happy that he's come to speak with us. And you recently joined this new role.
But can you tell us a little bit more about your previous experience and how it translates and helps you in this new role? Yeah, absolutely. So, yeah, thank you for inviting me and for having me today. It's a pleasure to talk to everyone. My name's Ant. I've spent the last fifteen or so years in data and analytics, and that's been across a range of industries, really. So I actually started off doing a placement year at my university for an old steel organization based in the North of England, which was pretty grim, to be honest. But my first real job was working as a category analyst for a large retailer. I've then worked in an insight agency. I've worked for a tech startup that did smartphone and wearable technology. I then spent six years at a company called Immediate Media. And, yeah, I've recently moved to an organization called Cognita as their new head of data. That's a group role. Cognita owns over a hundred private schools across the world, and they've got locations in Latin America, the US and Europe. There's a large presence in the Middle East and then in Asia. So in that role I'm leading a global function that covers data platforming, engineering, data governance, analytics, AI, and machine learning. And we support these hundred schools and twenty thousand employees with trusted data and insight. So my route to this, as I said, has always had a data and analytics thread, but it's probably slightly opposite to people who usually end up managing teams. I started more business facing, and a lot of what I was doing in my formative years was kind of account management, but with data and analytics on the side. So when I worked for an agency, I was there for four years. At one point I had a hundred and fifty different clients, and I would travel around the UK presenting data and insight to them.
So quite early on, in my twenties, I learned to be able to communicate and build narratives and stories with data and present those in front of CEOs that were managing really large FMCG companies. So I think that gave me a good head start in being able to translate what we do on the technical side of things and bridge that with the business needs. And then as I progressed, I actually got closer to the technical side of things. When I was at Immediate Media, our team ended up being a multidiscipline team. We introduced analytics engineering. We deployed the first machine learning models. As I left, we were doing a lot of stuff around vectorization and AI models. So, yeah, I kind of grew to really enjoy that side of things, and, you know, as a leader, I probably shouldn't be in the weeds as much. But I love that, and I continue trying to learn new skills and new techniques. But, yeah, I think having the business side at the beginning of my career gave me a good balance of understanding that everything you do has to be attached to a business outcome. Otherwise, as I mentioned at the beginning, it's, you know, redundant effort. That's really interesting. I'm curious, Annabelle. I know Ant's path seems a little different, right? Sometimes you can be a business user, you do a little data on the side, and all of a sudden everyone's just like, oh yeah, Ant's the guy who can do the data stuff, and it becomes your unofficial, and then official, responsibility to manage those things. I'm curious how that contrasts with your experience. And we might have talked about this on prior webinars, but for people who might be hearing of you for the first time, did you start in a more technical role?
You know, doing technical things and then expanding out into the people side, or, kind of like Ant, did you start on the business side and then adopt some of the technical things, or is it something totally different? So I had a master's degree in finance and risk management. I studied, like, advanced statistics. In my time, machine learning didn't exist. And I started working more on the financial markets. Then I worked in a bank as a relationship manager. I was given responsibility for teams very quickly. So I have been managing teams, like, since I was twenty-three years old. Wow. That was very complicated, and sometimes with people that were double my age. Then I had this technical experience a little bit later. But still, when I worked at the Bank of France, the French central bank, I had teams to manage and also technical skills to learn. I had to learn SAS and a lot of different skills. So, yeah, it was like building both skills together. Yeah. Do you feel like that's consistent with your experience now at Cognita, where you're responsible for managing a team, but there is responsibility for you as well to pitch in and do some of the work? Yeah. I think, I mean, where it differs from previous roles, and is actually probably more what I wanna focus on in the future, is I guess I'm taking my technical understanding and translating that into more strategic architectural plans. So we're actually at the beginning of a big digital and data transformation at Cognita. That was one of the main reasons why I joined the organization. So, yeah, it's given me a lot of opportunity to really think about what the ideal future state architecture is, which is gonna support the business in becoming more data centric and positioning data as a strategic asset, and also then helping the business make more data driven decisions and reach our goals.
So, yeah, I actually haven't touched any code in four months, which is a bit different, right? Yeah. You know, on the side I was maybe spending a bit too much time fiddling around with data products and some machine learning models and all of that fun stuff. So, yeah, I miss it, but I think I'm flexing a different muscle now, which is also quite nice. And if I can ask, do you already have a plan for what you want to do, like, for one year, two years? We're forming one. In progress, yes. Yeah. I think when I joined, I was pretty dogmatic in having a three month plan, and a lot of that was really just spending time understanding the business. It's a pretty complicated business, because each regional team has almost its own organizational structure, its own market complexities, and I was coming into the education sector completely cold. I'm used to selling advertising space in magazines on how to grow tomatoes and those sorts of things. So it's a slightly different product. And some of our schools are, you know, very expensive. They're very, very good schools. So you're coming from a very kind of mass user base but low value product to a slightly smaller user base but an extremely valuable product. So that's taken a little bit of getting used to. But, yeah, it's been good just having the head space and the time to really understand the business, understand where the capability gaps are, what the opportunities are, and then translating that into some organizational structural plans, some operating model plans, reviewing our technology architecture, thinking about what's got us to where we are now, where we wanna go in the future, and making sure that we're building for the future. Because, yeah, the data maturity at Cognita is growing.
But if we were to have the shiniest new platform delivered tomorrow, we wouldn't be utilizing it just yet, but we need to be cognizant that in the next one, two, three, four, five years, we do wanna have that capability in place. So it's kind of preempting what we might need in the future. That's great. We had a question here. Let's see. It's from Stephen, and it says: I find that most or all data catalogs are not aware of user permissions provisioned in databases and cannot sort or present data products the user is authorized to see versus not see. How have you advertised data products to analysts when each group may have an entirely different authorization to data? Well, going straight in at the deep end there. Yes. I'll probably have to bring it up a couple of levels, because that might be slightly outside of what I can extract from my mind at half four. No, that's alright. Yeah. I think cataloging and data products and the overall data governance piece is usually a second thought, and it's really fundamental if you wanna have proper data democratization. So where I've done this in the past, though probably not as sophisticated as the question indicates, is just making sure that, firstly, you've got good lines of communication and you're using the right channels to at least put out there what's available, whether it's cataloged formally or not. So I've had great success with things like data communities of practice, where we're sharing what we've got in our repositories and what we've got on our Tableau servers and those sorts of things. We consistently used to review what we had in our catalogs, and we went through a really large exercise, which we labeled "cut the crap," where we just deleted loads of stuff that hadn't been used, and actually unlocked loads of space and some optimization savings in our data warehouse, which was really good.
But, yeah, I think it's just about being really clear on what is available. If you can make a catalog and keep lists of things, then that's great. If you can automate that properly, then even better. You know, things like Microsoft Purview are really good for data cataloging, and in Databricks you've obviously got Unity Catalog, so there's lots of out-of-the-box tooling that you can utilize if you're mature enough to use it. But if you're not, just keeping a log of what's actually in your repos, what's on your servers, and what's being used is a good starting point to then make the decision as to how to get more engagement with those. Is it a training problem? Do people just not understand how to navigate to where they are? You know, that's always a good starting point to understand how you can drive adoption. Not sure if that was actually answering the question or not, but I think it's important. I think that's a good fundamental answer to back it up as well and kind of view governance a little more holistically. And I would imagine, as you get into the specifics, you can parse those apart, and those are gonna vary from practice to practice depending on what tools you use, how you're set up, what kind of industry you're in, and what types of users you're serving. Another question here real quick, and I think this question kind of dovetails with a question that we had as well in our initial conversations. They're asking: can you define what is a data product, with some examples? And the question that we would ask as well: we talked a lot about a data-as-a-product operational model. So maybe those two things dovetail a little bit. But, yeah, how would you define a data product?
I know that's probably a very broad question that can vary as well as you get more granular, but what is that? And then how have those data-as-a-product operational models worked out for you all? Yeah. Interesting. Again, my answer might probably not be what the question is looking for. I think the data-as-a-product piece is more about mindset than actually what is a product and what isn't. Because, really, I think anything that the team produces can be classed as a product. So it could be a semantic model. It could be a data pipeline. It could be a dashboard. It might be a machine learning model. It might actually be the data platform in its entirety. You know, any of that stuff could be the product that the team's accountable for. I think where it becomes an important model to implement in the business is when you wanna try and shift away from being a ticket repository that just delivers stuff to the business, and where you're, firstly, forcing the team to take a bit more ownership of what they're actually producing and managing the life cycle of what they produce like you would a product in a business. So really making sure that the owners of those products, the team that puts them together, are continuously looking to understand how they're being utilized, getting feedback from their stakeholders, working through how to iterate them and increase their value to the business, and just producing higher quality work. So it's not necessarily this is a product and this isn't a product. You know, if you wanted to get into semantics, you could just define that in your own way. But, as I said, I think it's more about the mindset shift of owning that and being accountable for its performance and its value to the business. So that's one part. The second part is also how you interact with the business and how you engage with the business users.
And I think where this really starts to work and sing is where there's clarity between what tech delivers and what the business uses, but there's also quite a strong collaborative relationship: you know, we've delivered this product, it's for you, you're using the product, you're the owner of that product. How can we work together to make sure that we are continuously improving what the whole point was in the first place? So, yeah, if you can get that data-is-a-product mindset to permeate outside of just tech and into the business, then you get into a really powerful position where everyone is viewing the data products as important strategic products, and therefore they're given enough airtime to actually improve and utilize and elevate them. So, yeah, it's less about this is product A, this is product B. I think it's more about that mindset approach. We have a question from Angela, and I think that is going along this way too: when scaling your data team, how did you handle the tension between centralizing data expertise and empowering business units to make their own data decisions? What trade-offs did you encounter? Yeah. This is a really interesting point. I think, again, it depends on the maturity of the organization. In the last two roles I've had, we were kind of coming in cold, not starting from scratch, but the data maturity was relatively low. And I think when you're in that situation, the best foot forward is a centralized model, because you need to be able to build the expertise and the capabilities with a fair amount of control and governance over what you want to create. Think of, you know, terms like a data fabric, basically, where you've got the centralized team owning and governing the platform, thinking about how it all hangs together, the CI/CD that you're gonna use, the way it's meant to be utilized by the business.
You have to define those policies and standards and best practices first and foremost. As the maturity scales, then you can start thinking about a broader adoption methodology. So whether that is kind of a full data mesh approach, or a kind of harmonized bit in between, where there are obvious use cases to enable those localized teams, domains, regions to have access and almost curate their own data products, then you don't wanna block that. You know, you want to scale the democratization of data usage in the business. So you want to really enable that, but you have to be super cognizant that that takes a lot of effort. It takes a lot of well curated governance over how everything's used, and there's a big training piece, a big enablement piece, that goes alongside that. So where we've had success in the past is very specific use cases. So have small pilots. In my previous role, there were a few different data teams dotted around the business, and one of them was definitely growing in their competencies and their expertise. And they were asking for more stuff, which we just couldn't deliver because, you know, we had a massive backlog, and we ended up being a bottleneck to them. So we had to move into more of a federated approach, where we gave them access to a certain point to be able to build their own ingestion pipelines, curate their own data models, build their own Tableau dashboards. But because we defined all of the governance and the CI/CD pipelines and processes and, you know, all of the rituals, essentially, first and foremost, you can bring them into your world, make sure that they've got the right environments, and nothing's promoted to production without the right sign-offs and those sorts of things. So, yeah, the utopian goal is definitely to have everyone accessing data everywhere. I remember my old CEO actually used to say, can't you just train everyone in SQL?
And I was like, well, yeah, you could. But do you then want to give hundreds or thousands of people access to the data warehouse? And, you know, there's loads of really bad stuff that would come with that. But that's the goal, basically: being able to give everyone access to everything they need, but you need to do it in a controlled and quite pragmatic approach, I think. Yeah. That's a really smart answer. I think a lot of times people kind of get, I don't wanna say trapped, but, you know, you pick a side, you pick centralized versus federated, and people are like, well, this works, and sometimes this works, and people have strong opinions about it because both can work. But I kind of like that attitude of having some dynamism, having some flexibility in terms of, okay, what works if one group is getting higher on the maturity curve whereas others aren't, treating them accordingly and being flexible in that way so that you are not a bottleneck causing frustrations. I think that's really intelligent versus just planting your flag and saying we're centralized, by God, you know, like, no, I don't care if you want to do more. So I think that's a really interesting answer, and I don't know that I've heard quite that shade of nuance in terms of which direction to go. So, yeah, that's great stuff. You mentioned your team; how do you empower them to learn and continue to grow? Yeah. Well, this is really important to me, and, you know, with my own growth and development, I'm really passionate about this. And I think, firstly, it helps to try and promote a growth mindset. So I remember having some leadership training a few years ago, and we had some third party consultants come in and talk about growth mindset.
And I'd never heard of it before, and I had a bit of a eureka moment where I realized I thought I potentially had a growth mindset, and I actually didn't. And when you realize you don't have to see and think of problems in the same way anymore, it becomes really liberating, because everything is growth and a learning opportunity. So I think it's really important to try and model those sorts of behaviors when you're coming in to lead a team. So that's one priority thing to do. And then I think there are just lots of different approaches to empowering teams to learn. We've, in the past, had really basic things like, one Friday a month, we would just shut down our emails and have a dev day, and we would get people learning stuff. And it doesn't necessarily have to be directly related to their job, but having that sense of freedom to learn and experiment was actually really valuable. And a lot of the proofs of concept that people were putting together in their dev days we actually ended up scaling and putting into production. So, yeah, I think just making people feel like they have the permission to spend time on learning and development is really important. And then if you wanted to be more sophisticated with it, you can think about more structured training programs. In the UK, we're quite lucky. There's something called the apprenticeship scheme: if your company pays a certain employer levy into a fund, then you can access that fund for training programs. And data is one of the kind of bedrocks of those programs in the UK. So I've had maybe four or five of my team do data apprenticeship programs, and they span from twelve month programs on introduction to SQL and Python and some visualization to, actually, you can end up doing a master's in them.
So, yeah, I think trying to understand what someone wants to do with their career, leaning into their strengths more as opposed to their weaknesses, and trying to shine a light on those opportunities is how you then grow someone's confidence in wanting to continue to learn and train and develop. And, you know, when you've got that environment and that culture in your team, it becomes really powerful, because everyone's constantly bouncing off each other, and we're all testing and failing and learning at the same time. So, yeah, those are some of the kind of techniques I've used in the past. I really love what you said about trying to focus on their strengths and not weaknesses. I think it's very, very powerful. Yeah. Because, you know, everyone's got learning and development opportunities. And if you're underperforming in certain areas, then obviously you need to action those. But, generally, if someone has a deep interest in something and is gonna go above and beyond to expedite their skill set in that interest, then why wouldn't you want to shine a light on that? I remember I hired a data analyst in my old company who quite quickly showed that he was really, really competent with the technical side of things. We then moved him into an analytics engineering role. We kind of branched that out into some machine learning. He was one of the first people in our business to deploy a propensity-to-churn model. And, you know, that just kind of grew out of his constant desire to continue learning and developing. So I think when you've got someone that has a growth mindset like that, they're a Ferrari, and you're gonna wanna let them just have as much road as possible to see how fast they can go.
So you need to enable that as opposed to hinder it and call out whether they're underperforming or not doing too well in certain things. You know, there's a time and a place for that, but I think shining a light on the positives is critical. Curious, there's a question in here from Vamsi, and it has to do with enablement, with kind of this growth. How do you measure success, whether you have a center of excellence or whether it's less defined and it's just measuring the growth of individual users and how they're developing? How do you report back on that enablement growth for their personal journeys? And then also, how do you measure growth of the overall health of analytics and data? Kind of those two things: how you're serving the business and how the people you manage are growing. Yeah. I mean, I think this is something that people in our world always struggle to communicate and articulate well. It's quite hard when you're essentially a service provider to really follow through to the end of a project to understand what it delivered and the potential ROI. So I usually split this into two categories. Firstly, there's one that's very easy to manage and report on, and that's adoption. How many people are using the reports? How many people are accessing the models? How many active users do you have? What's the stickiness? You know, all of those kind of typical web analytics metrics you can apply to your Tableau dashboards, Power BI dashboards, etcetera. So that's quite easy to do. The impact is slightly harder. And, again, there are kind of two parts to this. What I focused on, first and foremost, is the thing that, again, you can control and own a little bit more. And we started logging our tickets more formally. We weren't really doing this when I first joined.
And at the same time, we launched quite a large data literacy program across the business. And what we started to notice was that, where the usage of some of these programs and different bits of course material increased across the business, in some areas our BAU issues, the kind of generic tickets, started to die down a little bit. We then doubled down on some of our super users, and we created a data champions learning program where we gave, I think it was about twenty people in the business, access to the data warehouse. We gave them training in SQL. We gave them some basic Tableau dashboarding training. And then we specifically looked at the volume of requests that would have typically come from them and how that changed over time. And that was then quite easy to work out. Okay, so over the course of a year, we've actually saved about twenty-five or thirty percent of this one person's resource because their time's not spent on these generic BAU items anymore. We could have just said, okay, that was X thousands of pounds a year, and that's a number that the CEOs are gonna wanna understand. But then what was really good is we then said, okay, instead of just doing nothing, we actually spent that time doing some added value work. And that's where this individual was the one that then built the propensity modeling, which then has a real business outcome, because then we could turn around and say, okay, if we reduce churn by ten percent, then that's gonna save us five million pounds a year. So you kind of have a two-pronged approach there. It's the optimized resourcing time that you might have saved and then redeployed somewhere else. But then if it's attached to a business outcome, then that's even better.
So, kind of following that, we moved to a bit more of a lean canvas approach to generating our request intake, and we would always try and get the stakeholders to specify what the KPIs were that they thought this initiative was gonna drive and what the perceived ROI was. And then, if you've got this product mindset, you stay very close to the product owner, the business owner of the product that you're defining. And you can then understand from them, you know, how much time is this saving? Did this actually drive an increase in conversion or a reduction in churn or whatever it is? And you keep a finger on the pulse of the actual impact that what you produced had. Yeah. That's a great answer. I really like that. Again, having that open line of communication with the data product owners as well as the business owners, the people who it's directly affecting. That way, you're not having to grasp at straws trying to figure out, okay, what metrics are they gonna care about? They're gonna tell you, because they're, to a degree, in the trenches with you in terms of managing this product and rolling it out, and they will be able to tell you, again, that ROI thing, versus you having to guess at it. And I really enjoy it too, because a lot of what you're saying here ties back to that original poll question we had at the beginning: a good part of identifying any goal is identifying from the outset how you're going to measure it. Right? You know, what are the metrics, or what are the things that your stakeholders are going to be looking for that you're going to report on? Because you see this all the time. I see this in marketing, where, you know, we can gather a bajillion stats. I can have an email marketing campaign with open rate, with unsubscribe rate, with click-through rate.
But if leadership doesn't care about open rate or click-through rate, but I'm like, oh, this is important, then regardless of whether there is real ROI happening, it's not gonna be perceived, and therefore it's not gonna be as successful, because you don't have that buy-in. But I think that's really, really great stuff, to have that close collaboration. That way, you're doing less fishing for how we are succeeding, and it's more real time, you know? Yeah. Yeah, absolutely. And as the data maturity scales in the business and you'd expect the volume of requests to increase, it's then also quite a good way of fielding off work. Because if it's not tied to a business outcome and a perceived ROI value, then you're not gonna do it, because there's only a limited amount of capacity and resource, and you need to direct that to where the biggest bang for your buck is. So, you know, you can also use it as quite a good way to field off work and focus on the things that matter. We have two questions left, but I kinda wanna shift gears a little bit, because I know we had talked a fair bit about AI and its potential now, in this moment in time, to maybe accelerate some enablement efforts, or at least to accelerate some of those data products, or whatever it may be. But I'm curious how you have seen AI accelerating progress in your current role, as well as maybe some things that you're looking at that can potentially accelerate. You know, what role is that playing for you right now in terms of AI hype versus this is actually delivering, or will deliver, real value? Yeah. I think it's a tricky one, to be honest. I think it's definitely created more of an experimentation mindset in a lot of organizations, which probably was missing before.
And I think a lot of people have got to the point where you need to get on the bus regardless of whether you actually understand it or want it to happen. So I think that's been a good thing, just to accelerate people's understanding and appreciation of experimentation. Some of the projects that I've been involved in with AI have been great learning exercises. When I left Immediate Media, we were quite close to being able to create a vector search that we could actually deploy onto our websites, and that would be a much more sophisticated product for users to search recipes or how-to guides or whatever it was.

So, in terms of the acceleration of all of that, I think it goes back to the experimentation mindset: making sure, firstly, that you have that growth mindset and you're carving out the space for people to play around with these things and to think of some ideas. You can have loads of bottom-up ideas as well as the strategic top-down ones. And then you can be a bit more formalized in your approach. In my previous role and where I am now, we have quite clear stage gates for projects. If something's gonna require a certain level of budget, then you need to have a bit of a business case together. You can then work through a proof of concept that can potentially move to a prototype, to an MVP. So you're making sure that you're constantly testing and learning and iterating at each of those stages, and you've still got the experimentation in your mind. The goal is to hopefully deliver something of value, but, you know, I think there was an MIT study recently that referenced something like ninety-five percent of AI pilots not delivering measurable ROI, which I probably slightly disagree with, because it's still valuable learning.
You know, if something doesn't work, then at least you know that doesn't work. And where we've had quite good success accelerating the development of things is just leveraging what's already out there. There are a lot of frameworks that you can implement. A project we're working on at the minute is building a proprietary AI assistant tool, so our back office staff can have access to a GPT tool where they can securely upload proprietary information, all that sensitive documentation, etcetera. And we've harnessed a lot of the templating and the scaffolding that you can get from OpenAI and those sorts of places. So, yeah, utilize what's available; don't reinvent the wheel. I read recently that Databricks have just partnered with OpenAI as well, so quite soon you're gonna have access to all of the OpenAI models within Databricks, and they've got a lot of out-of-the-box ways to productionize Databricks apps.

So I think it's a combination of having that experimentation mindset; making sure that, if you are gonna invest the time and the resource and the money, you're quite clear on those definitions of done and what those use cases are, so you don't just bleed and hemorrhage resource and effort; and then, to accelerate the learning, utilizing the frameworks and platforms that are already out there. And, you know, if you can just plug and play and do something straight away, then great. If it then looks like it might work and it needs to be redesigned and rearchitected for productionization, then you can do that. But you've not wasted time building something gold-plated which is not gonna get there in the first place. Yeah.
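To make the vector search idea above a little more concrete, here is a minimal sketch of similarity-based document retrieval. The bag-of-words embedding is a deliberately simple stand-in for a real embedding model (in practice you would call a hosted embedding API and compare dense vectors); all function and variable names here are illustrative, not from any product mentioned in the conversation.

```python
import math
from collections import Counter

def embed(text):
    # Stand-in for a learned embedding model: a bag-of-words count
    # vector. A production system would use dense model embeddings.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def vector_search(query, documents, top_k=3):
    # Rank documents by similarity to the query and keep the best matches.
    q = embed(query)
    scored = [(cosine(q, embed(d)), d) for d in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for score, d in scored[:top_k] if score > 0]

recipes = [
    "how to bake a chocolate cake",
    "guide to fixing a leaking tap",
    "quick weeknight pasta recipe",
]
print(vector_search("chocolate cake recipe", recipes, top_k=1))
# prints ['how to bake a chocolate cake']
```

The same shape, with the toy `embed` swapped for real embeddings and the list swapped for a vector store, is roughly what a site-search deployment like the one described would build on.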
I like that perspective on experimentation with AI: managing your expectations with it, but also understanding that if an AI use case or a POC doesn't work out, well, what did you learn from it? Because a lot of times, whether it's a dashboard or whether it's AI, you do it, you get a bad result, and you ask, why did that happen? Well, the underlying data probably wasn't clean, or wasn't in the shape it needed to be in, and so you can go back and manage those data pipelines, do some data or analytics engineering. That is a valuable effort, because it uncovered, maybe not the insight you were looking for, but a valuable insight: your data needs some work before you can start building off of it. So that's useful, right?

Yeah, a hundred percent. Completely agree.

Let's see here. I'm curious: you had mentioned AI, and speaking of this and speaking of dashboards, have you found, whether in your prior role or your current role, whether it's agentic AI or just its ability to connect people to answers via natural language search or whatever it may be, that the need for dashboards is less, because people can find those answers quicker? Or is that more, well, that's the promise of it, and the reality might not be there yet? What's that relationship like? Because within Tableau circles, within a lot of circles, agentic AI is kind of the big buzzword, and that's the hype, but what's the reality on the ground, have you found?

Yeah. I think that's the North Star, but I would say, in my experience, we're way off being there quite yet. I've said in previous talks that the dashboard is dead; it's not, not right now anyway. I think there's a bit of a shift in trying to bring the intelligence into the flow of work.
So I think, like, early examples of that were Tableau integrations with Slack; there was a recent one with Teams, where you can put charts and data in front of people in their usual ways of working. I think Tableau Pulse was probably the first trial of them trying to have some kind of generative insights delivered through the platform. For me and for the business I was in at the time, it wasn't adopted very well; I think it was pretty limited in functionality. The effort was there, but it just wasn't executed properly. And I know that, even though we've not used it yet, Power BI has things like Copilot, and that looks quite good from some of the examples that I've seen, but Copilot licenses are pretty expensive as well.

So I think that's the direction of travel. As part of this AI tool that we're looking to experiment with, my end goal is to have it connected to our data warehouse and our semantic layers, so it can be a one-stop shop for someone to query how many days' holiday they might have left, but then also what our enrollment conversion was last month and how much EBITDA we generated last quarter. There will still be a place for visualization, I think, because that helps tell the story of the data; if you wanna tell good narratives, you need nice pretty pictures to help you do that. But a lot of the innovation I see in the industry is definitely leaning into that more agentic space. And I've mentioned Databricks a couple of times already, but something they've recently launched, Databricks One, is almost like a single pane of glass that sits on top of a lot of unified capability, more for the business-facing users. And within that, you do get access to some of their out-of-the-box dashboarding.
There's Databricks Genie, which is their kind of generative AI chatbot. Again, I've not actually used it; on paper, it sounds like a really good step forward. But as you mentioned earlier, you need the good data going in. If you don't have that, then all of this stuff fails and is a bit pointless. So I always think: don't just chase the shiny toys. You need to have your foundations absolutely watertight. If you've got really good data quality and data governance, then you can build these cool, shiny, and probably super valuable products on top of that. But if you're gonna just chase the latest tech and everything else is a big spaghetti mess at the starting point, then it's completely pointless.

Yeah, I think I agree with you, and I think we are still not ready. I think a lot of CEOs and CFOs will not want to ask questions; they want to have the information already there when they switch on their phone or their computer. That's what they want. Of course, you can customize; you can use AI to maybe summarize things for them. But they often want to see the data. So I don't know if the newest generation will do it differently. We still have some years, I think.

I mean, you've firstly gotta get them away from just knee-jerk asking the data analyst, which is probably what they'll still do when we've got all these shiny agentic tools. They're still gonna ask that one person they trust to give them the right data. So that's the first nut we need to crack, really.

Yeah, hard to change human nature.

Yeah. I mean, some of them would like to have it printed. Can you print this dashboard? Yes. Yeah. Think of the trees. This kind of... Oh, go ahead, Annabelle. What were you saying? No, I was just saying that that's an ugly truth. I know. Truly, though. Truly. Yep, there will always be that.

We're kind of at five minutes, so I kinda wanna get these last two questions out.
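The "foundations watertight" point above can be sketched concretely: before any chatbot or dashboard sits on top of the data, a batch of records has to clear basic quality rules. Below is a minimal, hypothetical quality gate of the kind a team might run in a pipeline; the field names and rules are illustrative only, not from any system discussed here.

```python
def quality_gate(rows, required_fields):
    """Return a list of issues found in a batch of records.

    An empty list means the batch passes the gate and can be loaded.
    """
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: required fields must be present and non-empty.
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")
        # Rule 2: record ids must be unique within the batch.
        rid = row.get("id")
        if rid in seen_ids:
            issues.append(f"row {i}: duplicate id {rid}")
        seen_ids.add(rid)
    return issues

# Illustrative batch: one clean row, one missing a field, one duplicate id.
batch = [
    {"id": 1, "school": "North Campus", "enrolled": 420},
    {"id": 2, "school": "", "enrolled": 310},
    {"id": 2, "school": "East Campus", "enrolled": 275},
]
print(quality_gate(batch, ["school", "enrolled"]))
# prints ['row 1: missing school', 'row 2: duplicate id 2']
```

Checks like these are the unglamorous groundwork: if the gate fails, the shiny products downstream have nothing trustworthy to stand on.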
And this first one, from Peter, is relevant, or adjacent, to what we're talking about. He's asking: have you had to take on the seemingly impossible task? (Yes.) Of mapping out tech stacks in a legacy organization or government entity that has grown its tech over decades, where there are large unknowns across product lines meeting all of the business needs, which by design were siloed for governance? So has that been your experience, where you've had legacy systems, or legacy processes and frameworks, that you've either had to work in or slowly change? Or has your experience been more, we're just rebooting from the ground up, and it's whatever tool stack you prefer as, you know, group head of data?

A bit of both, really. Much earlier in my career, when I worked for the tech company, there were no automated Power BI reports, and there was no SQL; everything was done in Excel. So, yeah, I've had experience of trying to work through the legacy systems, of tracing where all of those spreadmarts came from and migrating them into something slightly more sophisticated and automated. Thankfully, when I was at Immediate Media, the data warehouse build had been going for a few years prior to me joining, so we were already in quite a good situation there. And the data volume was significant: we were processing ad click data, and that was millions of rows a day. But the actual number of sources was pretty small, so we had quite clear governance over the structure of all of that. With that baseline, we could just build on top using more sophisticated tooling.

Where I am now is slightly different. Each school might have its own system, and the business I'm in buys schools. So every time we buy a school, we either need to onboard them onto our systems or try and integrate with their legacy system.
So, I mean, it's one of the most complicated, fragmented systems architectures I've ever been a part of, and it's a real problem. The data quality and data governance challenge is pretty significant, but we're making steps forward to define what the right processes are, the right ways of working, and what our blueprints are. And if we're gonna create single customer views and golden records, then we need to be really clear on our expectations of the data owners and data custodians in the business to help us get there. As I said earlier, data transformation, generally, across all businesses, is cultural and business transformation at the same time. So it's so important to have the buy-in from your leadership team and the alignment with the strategy, because otherwise what you do is gonna fail. So if you are gonna try and move off legacy systems, firstly: what's the point? What's the business value of doing that? Because if there isn't any, it's gonna be wasted, redundant effort. So I think working out exactly how the architecture is gonna support the business goals is the starting point, before you even think about migrating to anything new.

And to just jump in with this: we have another question from the audience, about how to convince people, which I think is quite interesting. Even if you have the buy-in from your superiors and the stakeholders, you may have people who will block the idea, maybe because they are afraid or they don't want to go through this change. How do you manage to convince them and drive them to change their mind?

It's pretty common, to be honest. And I think I said it earlier: it's all about teasing out the quick wins and those priority use cases. Because if you've got someone knocking on your door, then work with them, because generally, in our roles, we're the ones knocking on people's doors.
So if someone wants to let you in and they're knocking on your door, utilize that. Work with them, understand exactly what capability you can help them build, what business outcome your output can feed into, and use that as a test case. And if you can demonstrate the value there, then other business leaders are gonna start recognizing: oh, okay, these guys actually can help me deliver what I want. So you really need to understand what the outcome is. What is the "so what"? Don't boil the ocean. Start small: start the embers small and then build that fire. You need to take people on a journey, and generally those detractors are only gonna jump on your ship when they really see what's in it for them. And if you can go to a commercial director and tell him that you can make him ten million more pounds if he gives you X, Y, and Z information and supports you building a product for him, then he's gonna buy into that. So, yeah, there's a big stakeholder management piece, but it's also being able to demonstrate value in those early quick-win use cases, which then helps spread the word and gets everyone on board.

I love that analogy. I would say, too, leading into that: if you're gonna work with someone who's knocking on your door, never underestimate the power of friendly competition. When someone knocks on your door and they, almost inadvertently, become a champion, and then all of a sudden their peers want that same level of success, that sort of thing can be contagious and make really good inroads. Versus trying to communicate to all of them and do all of them at the same time, you start with that person who's excited and eager and interested, and all of a sudden someone else from another department, one of their peers, is gonna be like, hey, can you do that for me too? Yeah.
Which is just really valuable. Okay, one more question. I know we're just a minute over time here, but if you have a minute or two to answer this last one, I think it's right in line with what we're talking about, and you may have partially answered it already. This is a follow-up question from Angela, and she asks: can you share an example of a time when unclear data role definitions caused challenges in a transformation project, and how you resolved it? She's asking what process you use to clearly define boundaries between roles like data engineer, data analyst, and data architect during major transformation projects.

Yeah. I mean, again, it kind of does go back to maturity. I'd say that when I first joined Immediate Media, we were growing the analytics department, which then ended up including the data engineering team, and we got into a way of working and a culture where the lines blurred a little bit. The handshake and trade-off between engineering and analytics wasn't that clear, and it started causing some issues. The quality of code sometimes wasn't as good; pipelines were failing; we were, as analysts, building our own ingestion pipelines that weren't conforming to the engineering standards; and whose responsibility is it then to manage and govern all of that? So we got to a point where the scale and sophistication of what we were doing needed clearer definitions of roles and responsibilities, and that's what led me to create the analytics engineering part of the team. And we were really prescriptive: what does the engineer do, and when does their job end? What does the analytics engineer do, and when does their job end? And then, what does the data analyst do?
And there is gonna be some bleeding of the lines; there's obviously a handshake, and that can happen at different parts of the life cycle. But I think it's really important to have the freedom and flexibility to grow to the point where you're starting to accumulate maybe too much tech debt, or you're running into ways-of-working issues or coding issues or whatever it is; then that's the time to take a step back, review what you've done and what you've got going on, and reassess what you think the business needs in the future. As I said, tie it to the business strategy and those outcomes. We knew that we wanted to move into the AI space; as I mentioned, we were building this vector search tool, so the data we had going in needed to be absolutely gold-plated. In order to do that, we needed roles that specifically looked at the data modeling, and those roles were analytics engineers, and they needed tools like dbt. We needed proper data models, and we needed a proper CI/CD process to sign everything off. So we changed a lot of that: we changed people's roles, we had new jobs, we had new tooling, but it was all to enable us to move into that space we knew we needed to move to. Our output supported the business outcome.

Yep, that's great. Sometimes you don't know what you don't know at the outset, right? You start a project, you have your goals, you have your team in place, and then you start getting into the work and you're like, oh, maybe we need a few more guidelines to reassess and move forward successfully.

Well, very cool. I will share our last little slide. I'm curious: Ant, if people were to reach out to you with questions, what's your preferred mode of contact?

Yeah, I mean, if anyone wants to continue talking, then you can get me on LinkedIn.
I'm quite active on there, so that's all good. And, yeah, big thanks from me for having me on. It's been a really interesting conversation, and I've really enjoyed it, so I appreciate the invite.

Thank you. Thank you, Ant, for coming. Yeah, thank you so much. Likewise. Our challenge, as with every webinar, is not enough time, and we really thank you for your time and for your insightful conversation. Lots of good gems in here. So thanks for joining us. Appreciate it. Thank you. Take care. Bye-bye. Alright, take care, everyone. Bye.