Ensuring Data Integrity

Transcript
We formed in 1996 as a local IT and networking company out in Stillwater, Oklahoma in the US. After several years, we became a Dell partner. We then transitioned more into the BI space. We became one of Tableau's first partners; we've pretty much been there since they were ten people in an office. We were also the very first Tableau Gold partner. Since then, and we're talking twenty-seven, twenty-eight years now, we've started to partner with some fantastic BI tools, the likes of Snowflake and ThoughtSpot. What we envision is to be involved and offer services around your whole BI journey, whether that's the data ingestion into a warehouse, the ETL from that warehouse into a tool like Tableau, anything post-Tableau, and then setting your company up for success. I'm going to hand over now to Bruno and Ace, and they'll give you a little rundown of Wiiisdom as a company. Over to you.

Thank you very much, Ryan. Hey, Ace. Do you want to say a couple of words?

Sure. Hi, everyone. My name is Ace. I'm a solution consultant with the Wiiisdom team, and I'm so excited to be here to show you a little bit about what we do and a little bit about the tool. Just a small background about me: I've been in the BI space for over a decade, and most of that has been working directly with the Tableau product. Working with the Wiiisdom team has been a pleasure because I get to work with a lot of customers who are working with Tableau but struggling with some of what we're trying to solve: how are they testing, and how are they spending their time on testing? Hopefully we're identifying the areas they're covering manually, helping them automate some of that, and building a bit more trust into their ecosystem. Excited to be here, and I'll be showing you a little bit more. Over to you, Bruno.

Thanks a lot, Ace, and good morning, everyone. I'm Bruno Masic.
I've actually been with Wiiisdom for about ten years now, and I'm located in the Boston office, so I'm working daily with Ace. Thank you very much again, Ryan, for having us. As Ryan was mentioning, business is all about customers, and it's the same for us: today, our number one focus is our customers. That's why we live, and this is what we really care about.

When it comes to Wiiisdom, for the past seventeen years we've been supporting different analytics platforms. Initially, we started with BusinessObjects, then we went to Tableau, and we also now support Power BI. The solutions we deliver are about analytics governance, whether it's for testing, for ensuring that the content you have in production is tested and completely accurate, or for version control. And talking about version control, this is something we're going to be delivering by the end of Q3 for Tableau.

Analytics governance is a topic that's extremely important. When we talk with our customers, they typically describe four different pains. Why is this critical today? Because every organization cares about making sure that their data is correct and, of course, that their analytics are correct. Typically, their journey starts on the data side, making sure that a good data product is delivered to Tableau. But one of the issues is that Tableau itself has its own transformation engine, so the data is going to be transformed there. As a result, despite all the investments you've made on the data side, there are still issues on the Tableau side. So here it is, at a high level.
These are four cases we see from our customers. It would be cool to get some of your feedback on whether any of these resonate with you. Typically, the first one is people saying: hey, Bruno, hey, Ace, we've invested a lot in our data, but we still have dashboards that are showing wrong data. Why is that? That's the first part. The second part is about security: I want to make sure that Ryan can only see information for Europe, so show me that this is actually correct. The third part is about performance. Here, Wiiisdom has its own performance engine within our solution, so we're able to put load on and then measure the performance, and this works on both Tableau Server and Tableau Cloud, which is pretty cool. And the last part is going to Tableau Cloud: AI is definitely one of the reasons many organizations are moving to the cloud. When it comes to the cloud, you need to make sure that your data sources are correct and certified, and that they can be trusted, because anything that is wrong in your data source is going to be amplified. And then you need to make sure that, for example, your Tableau Pulse metrics are actually refreshed. So at a high level, these are the four topics we see with our customers.

And today, how do we fix that with Wiiisdom? How do we put that into a governance process? There are really three different elements we bring. One of the key factors at Wiiisdom is that we're able to test, promote, and certify Tableau content at scale. This is what we do. And importantly, at the end of this journey, we're able to certify dashboards.
It's going to be red if the dashboard is decertified or green if it is certified. So each time a Tableau dashboard consumer receives their dashboard, they will know that it was tested, and they will know what the results are. This is something we're really excited about, because we have a patent currently pending that lets us do this dynamically on essentially any BI solution today. This is really important when it comes to governance, because there is no quality process in the world that exists without a final test and a certification. That has been a missing piece that we have brought to analytics and to Tableau. Ryan?

Yeah, perfect. A quick question on that, Bruno: obviously, we've talked about the performance testing and a few other pieces, but it would be really great to understand what can be tested. I'm going to hand over to Ace as the technical SME on that point. Is it just performance testing? How can you take Wiiisdom that step further within your BI solution?

Absolutely. There are a lot of components we can test on the Wiiisdom side. We generally focus on data validation as well as the functionality of the dashboard, but just as Bruno mentioned, performance is a big component of testing too. Ultimately, what we're really trying to solve for is ensuring that you, as the end user, trust the dashboard that you're opening. You trust the data, you trust that it's performant, and you trust the quality of it on a daily basis. So I'm going to show you a little bit of what that looks like, both on the Tableau side and in the application from Wiiisdom.

Perfect. Thank you.

Absolutely. So you should be able to see a Tableau dashboard right now. Is that correct? Yes. Perfect.
I like to start with the Tableau dashboard first because everyone on the call looking at this dashboard is probably coming from a different background. You might be an analyst, a developer, a governance leader, or an executive, and every time you open this dashboard, you are testing it without even realizing it. What I mean by that is: when we open these dashboards, if any of these numbers or KPIs are zeros or nulls, or they're not loading, that's part of the testing process. We know the data exists, but for some reason it's not being loaded. That's a problem, and it's a problem we want to catch early so that our executives are not the ones finding it. So for those reasons, what we're doing here is automating these tests around all these different components, so your end users are not the ones running them.

Now, the data is one piece at the aggregated level, but how do I make sure that this number of a thousand and seven for my sales per customer for the East region is actually correct? For that, I might go to my data warehouse, ask my DBAs to give me an export of the data, go back to that single source of truth, and then compare those values. But of course, testing doesn't stop there, because all the filters are here, and we might want to interact with them and then see the dashboard in that state, ensuring that when I'm slicing and dicing and interacting with the dashboard, both at the aggregated level and at the row level, the data is matching; for example, that this customer actually does have the sales for this value. As you can see, this can be a time-consuming process, because just one dashboard can take up to eight hours for every component to be tested.
And you can just think about how many dashboards your organization has. What are your top dashboards? What is the critical dashboard that you're running with?

Now, enough about what the testing looks like on the Tableau side. Let's see how this testing looks on the Wiiisdom side. So Wiiisdom, the application itself: this is what the UI looks like. For anyone who hasn't seen the tool before, I'm going to give you a quick tour of what we're looking at. On the left side, I want you to focus on Tests. You can see a series of existing tests that I've created in the past; these are part of the examples I run through, and they're all solving for a different reason. They might be testing dashboard or extract certification, making sure that my filters and parameters are working, validating the data at different decimal points, or even just opening a series of dashboards and making sure things are running. All these tests are part of the several testing components that we have, like extract testing, functionality, and regression or performance testing.

Right now, we're just going to focus on functional testing, and I want you to think about what we just saw on the Tableau side. The functional test has a drop-down here ready for me where I can choose all the different tasks I need. For example: does the dashboard open? I can use an open viz task. Do the filters work? I have a series of set filter tasks that help me run through that. The same goes for interacting with parameters, validating the data, selecting marks, or switching tabs. For all those native functionalities of Tableau, we've created tasks around them. Ultimately, as the tester, what you really focus on is creating this flow of individual tasks that you see here, so that the test goes one by one, making sure every component is tested.
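As a mental model (not Wiiisdom's actual implementation), a test built this way is just an ordered pipeline of named tasks that short-circuits on the first failure:

```python
def run_test(tasks):
    """Run an ordered list of (name, check) pairs; stop at the first failure."""
    for name, check in tasks:
        if not check():
            return {"status": "fail", "failed_at": name}
    return {"status": "pass"}

# Hypothetical tasks mirroring the demo flow: open the viz, set a filter,
# then validate a KPI against a business rule.
tasks = [
    ("open viz", lambda: True),
    ("set filter: Region=East", lambda: True),
    ("validate: sum of sales >= 40k", lambda: 26_500 >= 40_000),
]
result = run_test(tasks)  # fails at the validation task
```

The short-circuit matters: a later task usually depends on the state an earlier one set up, so there is no point continuing once one breaks.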
If at any point any of these tasks fail, the test fails and it stops. Now, I want to run this test just to show you what happens. It will take a few seconds to load up, but just to give you some background on what's really taking place from a technical standpoint: we're using the APIs that Tableau has already made available. Some of these APIs might already be in use at your organization, for example the REST API and the JavaScript API, APIs that allow us to interact with the Tableau environment, extract a certain amount of information from it, and come back and perform that analysis. Based on all those APIs, we're able to make all those different decisions. The benefit is that if, after you create these tests, your dashboard's visuals move around or your filters change position, it doesn't matter: the test will continue to run because it is API-driven, and we're still extracting all that information.

Perfect. While this is running: when you talk about REST APIs, as a platform and DevOps engineer, security is at the center of everything I talk about. So how is it interacting with those REST APIs? Is it using username and password?

Great question, and it comes up often. We actually use connected apps on the Tableau environment, which is a very secure way to connect to that environment, and it also helps your tests run smoothly, especially since I'm sure your organizations have MFA. We work with that so that your tests are ultimately automated, they run smoothly, and that information is collected. We're able to work with every authentication setup, and connected apps make that possible very smoothly.

Okay, great. And I can see this is loading now. So is this pulling up that dashboard and doing the interactive steps that we've defined in that test?

Exactly. What's happening here is that I'm completely hands-off.
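For context on the connected apps mentioned above: a Tableau connected app authenticates with a short-lived JWT that is then exchanged for a session via the REST API sign-in endpoint, so no password crosses the wire and MFA is not in the loop. A minimal sketch of building such a token, with HS256 signing hand-rolled here for illustration (in practice you would use a JWT library, and the exact scopes below are assumptions; grant whatever your tests need):

```python
import base64
import hashlib
import hmac
import json
import time
import uuid

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def connected_app_jwt(client_id, secret_id, secret_value, username):
    # "kid" carries the connected app's secret ID; "iss" its client ID.
    header = {"alg": "HS256", "typ": "JWT", "kid": secret_id, "iss": client_id}
    payload = {
        "iss": client_id,
        "exp": int(time.time()) + 300,  # short expiry; Tableau caps JWT lifetime
        "jti": str(uuid.uuid4()),       # unique token ID
        "aud": "tableau",
        "sub": username,                # the user the automated session runs as
        "scp": ["tableau:views:read"],  # scope list is an assumption for this sketch
    }
    signing_input = (
        b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(payload).encode())
    )
    sig = hmac.new(secret_value.encode(), signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(sig)

token = connected_app_jwt("client-id", "secret-id", "secret-value", "user@example.com")
# The token is then posted to Tableau's REST API sign-in endpoint in place of credentials.
```

The token's five-minute lifetime is what makes this workable for unattended test runs: each run mints a fresh token, and a leaked one expires almost immediately.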
That means Wiiisdom is opening this dashboard. It's going to run through a series of filters, as you can see on the right side, selecting all the different filters I've created in my test, to ultimately come back and tell me: is my data validated at that point? And from the test results coming back in the console, I can see there are some failure points. The goal really is exactly this: being able to run tests and catch the errors faster, so that when things do fail, you're notified faster as the admins and the testers, and you can assess them much faster than your end users can.

Now, what exactly failed here? With every test we run on the Wiiisdom side, you're always left with a report, and this report is where we come to troubleshoot. I can see in my latest report that every task seemingly ran successfully except for one specific validation. When I click on it, it tells me the sum of sales was supposed to be greater than or equal to forty k for my KPIs after running those filters, and the value is twenty-six thousand five hundred. That's obviously below forty k, so it's flagged, and I come here and check it. Now, this is a very simplified version of data validation. We're able to validate data based on formulas and if statements, down to decimal points if you want, or by connecting to your data warehouse to ensure the data stored in the tables is truly represented in what's displayed in Tableau. But the goal is ultimately to run these tests, automate them, and enable a post-process action so the dashboard is certified, so that when your users look at these dashboards, they immediately see this badge. It tells them this dashboard is not certified because, as we saw, the test failed.
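A threshold rule like the one that failed here ("sum of sales >= 40k") boils down to a small comparison against a value read off the dashboard. A hedged sketch of how such a validation result might be represented; the record shape is my own, not Wiiisdom's:

```python
# Map operator strings to comparison functions.
OPS = {
    ">=": lambda v, t: v >= t,
    "<=": lambda v, t: v <= t,
    "==": lambda v, t: v == t,
}

def run_validation(name, dashboard_value, op, threshold):
    """Evaluate one business rule against a value extracted from the dashboard."""
    passed = OPS[op](dashboard_value, threshold)
    return {
        "task": name,
        "passed": passed,
        "detail": f"{dashboard_value} {op} {threshold}",
    }

# The failing case from the demo: the KPI shows 26,500 but the rule expects >= 40,000.
result = run_validation("Sum of Sales KPI", 26_500, ">=", 40_000)
```

Keeping the rule as data (name, operator, threshold) rather than code is what lets a report later say exactly which rule failed and by how much.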
But hopefully, your end users are going to see more of this certification here, showing them that the dashboard did in fact run smoothly, when the last test was, and why it's certified, to help bring trust back into your systems.

Fantastic. On that reporting, Ace: these reports are great, but you're actively going to the report. Is there a way we can notify end users if a test fails? Because the last thing an end user wants is to be the person who identifies the issue rather than being told: proactive versus reactive.

Totally, that's a great question. Think about all the times that dashboards were failing or had issues with the data. Who did you reach out to? How did you find the owner, the admins, the data stewards? Well, in Wiiisdom, we have capabilities for integrating with email or Slack so that you're always notified via those channels. Or, if you have any application that offers a webhook API, you can plug that into your system. So if you have ServiceNow, Jira, Microsoft Teams, anything that offers that API, you're able to build that communication between the platforms, so that every time you run a test, you're notified that the test ran, whether it ran successfully or, more importantly, whether it failed, so that you're notified as the admin. And again, the certification is also a great piece for your end users: when they open the dashboard, they know exactly when the test ran, at what time, and who to reach out to if they have questions.

Perfect. And I saw that most of those integrations were notification ones, but there was also a database. So how is Postgres involved?

Great question. With the Postgres integration, you're able to start storing metadata about the tests. How many tests are you running? What's the success rate or failure rate of all these tests?
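As an aside on the webhook integrations just mentioned: a webhook is plain HTTP, so notifying a channel means POSTing a JSON body to a URL the receiving tool gives you. A minimal sketch, assuming a hypothetical endpoint; the payload shape here is deliberately simplified, since Slack, Teams, and ServiceNow each expect their own schema:

```python
import json
import urllib.request

def build_failure_payload(test_name, task, detail):
    """Shape a simple message body; real channels each define their own schema."""
    return {"text": f"Test '{test_name}' FAILED at task '{task}': {detail}"}

def notify(webhook_url, payload):
    """POST the payload to the webhook endpoint and return the HTTP status."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

payload = build_failure_payload(
    "Sales dashboard QA", "Validate Sum of Sales KPI", "26,500 < 40,000"
)
# notify("https://hooks.example.com/T000/B000", payload)  # hypothetical URL; fire on failure
```

Wiring this to fire only on failure keeps the channel quiet enough that people actually read the alerts.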
So that ultimately, you have a dashboard that looks like this. You can find this dashboard in our Tableau Public environment; it's available for everyone to download as a template. The goal here is to give your admins a summary of every test that's run. How many tests are running in total? How many are functional? How many are regression? But more importantly, hopefully you can identify patterns within the failures. Are there specific tests that fail more often? Are there specific views, data sources, dashboards, or owners of these pieces of content that are having issues? Or are there specific times of the day or the week? This will hopefully get you closer to the cause of these problems so you can solve them and enable a much smoother process for everyone.

Perfect. Yeah, I think that's fantastic. Like you say, you can spot some of those trends: is there a performance issue on a Monday when everyone's using the server, and does it tail off later in the week? And thinking back to my time as a client, there was always this us-and-them attitude between IT and the end user, and this bridges that gap with clarity: look, we are testing, and we're open about what's failing and what's passing, and about what we're actively doing on those failures to make things better for the end user. And you can evidence it: you can see that we've made a change, and it's now passing every time instead of failing every week.

Exactly. As data people, this is the best way to communicate: with data that shows exactly what's going on.

Perfect. I'm just going to take control of that screen share again, and we're going to go over to our first interactive poll. It's a two-part question. One: are you actually doing any testing, and when are you doing that testing?
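One brief aside on the Postgres metadata just described: the admin summaries (totals, failure rates, failure patterns by weekday) are simple aggregations over the stored run log. A sketch with hypothetical field names, not Wiiisdom's actual schema:

```python
from collections import Counter

def summarize_runs(runs):
    """Aggregate a list of test-run records into headline admin metrics."""
    total = len(runs)
    failures = [r for r in runs if r["status"] == "fail"]
    return {
        "total": total,
        "failure_rate": len(failures) / total if total else 0.0,
        "runs_by_type": dict(Counter(r["type"] for r in runs)),
        "failures_by_weekday": dict(Counter(r["weekday"] for r in failures)),
    }

# Hypothetical rows as they might come back from a SELECT on the run-log table.
runs = [
    {"type": "functional", "status": "pass", "weekday": "Mon"},
    {"type": "functional", "status": "fail", "weekday": "Mon"},
    {"type": "regression", "status": "pass", "weekday": "Tue"},
    {"type": "performance", "status": "fail", "weekday": "Mon"},
]
summary = summarize_runs(runs)
```

Grouping failures by weekday is exactly the "is Monday the bad day?" question raised above, answered with a one-line Counter.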
And two: who do you feel is responsible for that testing within your organization? While that's running, Bruno, do you want to talk through a few of these additional pieces on the screen?

Definitely, Ryan. And by the way, I had one question for Ace: the test that you showed before, how long did it take you to build?

That test was very quick to build. Depending on the complexity of the test, we can go much deeper, but for the most part, tests take about thirty minutes to an hour to get started with the foundation. It's very easy to connect to the data, very easy to connect to the workbooks, identify the pieces of every element, and ultimately design the test for yourself.

Thank you. In terms of testing, Ryan, we've essentially got five different types of tests today. The first one is all about accuracy. What happens very often with our customers is that they've got their dashboards and their worksheets, and they question how accurate the data there is. So what we do is run a SQL test between your data source and what you actually have in your Tableau worksheet or dashboard. On top of that, in terms of accuracy, we're also able to test different business rules, to make sure that whatever you have in your dashboard meets your business rules. These are the types of tests we do when it comes to accuracy. The second one, as you saw with Ace, is all about user experience: does my dashboard open? Any interaction you have with your dashboard, we're able to simulate, to ensure they all work accordingly. The third part is about security: I want to make sure that Ryan can see what he's supposed to see.
This is a visual way to confirm that Ryan has the proper credentials. The fourth test is about performance testing. This is again something really unique, where we've done some excellent innovation: we're able to put load on and measure the performance of your Tableau dashboards, whether on Tableau Server or Tableau Cloud. It's extremely powerful, and of course we're able to compare that data through time. And the last test, which matters for Tableau Cloud migrations and after, is regression testing: side-by-side comparison, A/B testing, to see how things were before the Tableau Cloud migration and how they are after.

Perfect, that's all. Fantastic. So I can see the results have come in, and they're as expected. Most people are trying to do tests, mostly before each workbook release, some occasionally, some never. In an idealistic world, we would test every dashboard before anything gets promoted to production, but there are always outside factors: resource constraints, with no one being available to test, or time pressures, where a dashboard was needed out yesterday, and things skip past some of that testing. What is interesting is the gap: no one answered that they have a CI/CD process, and this is something we'll cover later on. It's what drew me to Wiiisdom originally; it's not something I'd seen in the Tableau space previously. And also, who should be responsible for that testing? Obviously, quality analyst is the big answer, but depending on the size of the organization, you don't always have quality analysts available.
And if they are within your business, getting their time is sometimes quite difficult if there are other things to look at as well. So we're going to have a look at how InterWorks can incorporate Wiiisdom, and how I've looked at it as well. It's this notion of using the performance testing, one of those five, as well as some of that A/B testing. For InterWorks, the big hot topic, as most people who have touched Tableau or been to any event or user group probably know, is Tableau Cloud. There's a big drive to get people from Server to Cloud, based on not having to host and manage things yourself, and instead moving to the cloud, where it's a lot more open and SaaS-based. How we incorporate that is that we offer a service where we migrate our customers to the cloud. We automate a bit of it, and we work with them to make sure we move the right things and that they're accurate.

Where Wiiisdom comes into play is this test, migrate, validate scenario, where we want to test things before we even migrate them. Should we be migrating dashboards that don't actually work in your own environment first? Are we just wasting time and effort getting them into Cloud? Should we fix them at the point of source? We then decide what we want and migrate everything that's passed those tests. But once we migrate, we need a validation step, because again, in an idealistic world everything would work when it moves, but we know that doesn't happen. We want to see what hasn't passed, and that's where we want to focus our resource. And as Ace was saying, some dashboards can take up to eight hours to test. If you've got two hundred dashboards that you've just migrated to Tableau Cloud, I can't see any quality analyst wanting to take on that job.
So having something that can automate some of that is fantastic. We had a look at a few dashboards and followed this strategy with this performance recording: we pointed it at a Tableau Server instance, at five dashboards in total, and calculated an Apdex score. An Apdex score is an application performance index, a quantitative measure of end-user satisfaction: if there's an end user sat there, what are they happy to sit and wait for, when do they start merely tolerating the performance, and when do they actively become frustrated? And we can see here we didn't score very well. It's a huge server with lots of users, and we got an Apdex score of 0.67, so sixty-seven percent. A lot of that was in the tolerating band. We didn't have any frustrated, which is great, but we didn't have a lot in satisfied either.

What we then did was migrate those workbooks over to our Tableau Cloud instance and rerun the exact same test. The Apdex score jumped from 0.67 to 0.88, and immediately we know that the right decision was to move to Tableau Cloud. Our end users are going to be less frustrated. We're still getting a little bit of tolerating, but the satisfied far outweighs it. There's potentially a bit of work we need to do on some of those workbooks, reducing complexity and data volumes to improve load times. But I can go back to the business now and say: look, moving to Cloud is the right thing to do. You invested in this SaaS solution, and we can prove to the business that it was the right choice. However, performance is just one element, and that's why Wiiisdom doesn't just test one thing, it tests five, because a multitude of different things can affect a workbook.
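For reference, Apdex has a standard formula: with a target response time t, responses at or under t count as satisfied, those between t and 4t as tolerating, and anything slower as frustrated, giving Apdex = (satisfied + tolerating / 2) / total. A small sketch; the five-second target is an assumption for illustration, not the threshold used in the webinar:

```python
def apdex(load_times, t=5.0):
    """Apdex = (satisfied + tolerating / 2) / total samples.

    satisfied:  load time <= t
    tolerating: t < load time <= 4 * t
    frustrated: load time > 4 * t (contributes zero)
    """
    satisfied = sum(1 for x in load_times if x <= t)
    tolerating = sum(1 for x in load_times if t < x <= 4 * t)
    return (satisfied + tolerating / 2) / len(load_times)

# Hypothetical dashboard load times in seconds:
# two satisfied, two tolerating, one frustrated.
score = apdex([2.0, 3.0, 6.0, 8.0, 25.0])  # (2 + 2 / 2) / 5 = 0.6
```

Because tolerating samples only count half, a jump like 0.67 to 0.88 usually means samples moved from the tolerating band into the satisfied band, which matches the before-and-after picture described above.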
So we took a couple of dashboards off Tableau Public, the ones that we migrated, and here we can see a side by side: the left is Tableau Server, and the right is Tableau Cloud. If anyone can spot a difference here, I'd be amazed. If anyone can see anything, pop it in chat and let me know. But we're actually going to run a test in Wiiisdom live and see if there is a difference. So I'm going to pop over to my Wiiisdom instance, which Ace kindly showed before. We built quite a simple test where we're looking at two dashboards: a Spielberg dashboard, and a data visualization catalog on the best ways to visualize your data. We're going to go ahead and run that test. I know that there is a difference in those dashboards, and I know exactly where it is, but I'll pop back over while that's running. If I was to take out a tape measure: the measures of the lines are the same, everything looks the right way, the circles are the same size. So, Bruno, Ace, is there anything you can spot? You're my QAs today. I've given you this dashboard, and the data looks right because everything looks accurate. Is there anything wrong with it?

Look, visually, I can't see any differences with the dashboard.

Yeah. Let's see if Wiiisdom can. So it's gone away and run that test, and it's working in a headless way. Where I showed you before it brought up that browser and you could see it interacting, here we've just said: run it, but don't show us the browser, because we don't really need to see it. It will then spit out a report, as Ace has shown.
The great thing with this is that once you've done that migration and that validation step, you can just send the report out to your QAs and say: look, we've done a load of testing for you. We don't need eight hours per dashboard for a hundred dashboards; we need three hours per dashboard for two dashboards, and this is where you need to focus your time. They're much happier, they can go and work on other things, and they're not sat in front of a Tableau screen. And we can see it's spat out a report, which we're going to open in a browser so we can see it a little more clearly. It has actually failed, on a visualization view. I know what I changed: I changed the pink-red hue of the dashboard by one step. It's indiscernible to the human eye; you wouldn't have noticed it. But Wiiisdom has compared it and already found an issue. If I was a betting man, I wouldn't have bet on a QA finding that problem, but the tool has found it.

And it doesn't just stop there, as Ace was saying. I've gone away and looked at another dashboard. I can see, highlighted here in red, there's a difference, and I know that there's a difference. When I look at it, is it just a color difference? Well, no. This is actually telling me there's a difference in the data summary, so something has changed within the data. I can see a snapshot of it and confirm that, yes, there is a data change. Is it because calculated fields have stopped working? Is it because the source has changed between now and the migration? Do we just need to retest on the server and make sure it matches again? Or is there something fundamentally wrong with our dashboard? Let's not release that to production when we go to Cloud. Let's move it into quarantine within a project, work on it, and promote it to production in the right way, with the right testing, going forward.
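Catching a one-step hue change that no human would spot comes down to pixel-level comparison of the two rendered images. A toy sketch of the idea, operating on plain nested lists of RGB tuples rather than real screenshots, and not Wiiisdom's actual algorithm:

```python
def pixel_diff(img_a, img_b, tolerance=0):
    """Return (x, y) coordinates where any RGB channel differs by more than tolerance.

    img_a and img_b are equal-sized 2D lists of (r, g, b) tuples, standing in
    for two rendered dashboard screenshots (e.g. Server vs. Cloud).
    """
    diffs = []
    for y, (row_a, row_b) in enumerate(zip(img_a, img_b)):
        for x, (pa, pb) in enumerate(zip(row_a, row_b)):
            if any(abs(a - b) > tolerance for a, b in zip(pa, pb)):
                diffs.append((x, y))
    return diffs

# Two 2x2 "screenshots" where one pixel's red channel shifts by a single step.
before = [[(200, 80, 90), (255, 255, 255)], [(0, 0, 0), (10, 20, 30)]]
after  = [[(201, 80, 90), (255, 255, 255)], [(0, 0, 0), (10, 20, 30)]]
changed = pixel_diff(before, after)               # flags the shifted pixel
ignored = pixel_diff(before, after, tolerance=2)  # a small tolerance ignores it
```

The tolerance parameter is the practical knob: set to zero it catches invisible one-step shifts like the demo's, while a small nonzero value suppresses harmless anti-aliasing noise between renderers.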
So that's how we would incorporate it with your Tableau Cloud migration. And I think people would say: well, great, I've moved to Tableau Cloud, I don't need Wiiisdom anymore. That's not the case; that's just one piece. Wiiisdom is much more a tool for your daily BI life, and that goes back to what Bruno was saying about the other main tests. You've got your enterprise-level testing, your testing for promotion to production, and you want to certify some of your data sources. That's where we move to this notion of governance. Data governance is the keyword everyone's throwing around, but actually it should be analytics governance when we're talking about things like Tableau, because you can only verify data to a certain point before someone multiplies it, divides it, or adds another field to it and then shows the KPI in a completely different way. All the effort to the left of Tableau can be ruined very quickly. I always use this analogy with clients: trust within BI is so hard to build, but so quick to lose. As soon as you put a dashboard out with the wrong KPI and the CEO quotes it in a trading meeting, he's going to be banging on your door asking why it's wrong within minutes. If you don't want that to happen, let's catch it beforehand. Let's certify those dashboards so your CEO says: great, I can see it's tested today, I can trust it, I can start making my business decisions off it today, let's go and share that number. I'm going to hand over to Bruno to look at that daily BI life and the continuous loop of Wiiisdom.

That's a very good transition.
And on your story, Ryan, it reminds me of something from TC24. A C-level customer was chatting with our CEO and said, I had a couple of bad meetings where some dashboards, or rather the data behind them, had supposedly been tested, and we had to provide some information for the SEC, and we found out the dashboards were not accurate. That's the kind of situation where the customer said: with your solution, we're able to continuously monitor the dashboards and continuously certify them, or decertify them if there is any issue. That has huge value. So, essentially, for all the testing and validation we do, as I mentioned before, you first need to determine the tests you're going to run; we went through the five tests earlier. Our solution also has a promotion engine behind the scenes, which is really useful when you want continuous monitoring and continuous testing of your Tableau dashboards. We're able to promote dashboards based on the test results: for example, should it go from QA to production if it passes? Or once it's in production, should it be demoted back to a sub-environment? And then, ultimately, based on these results, we're able to certify or decertify the dashboard. All of that can be fully automated: you prepare your tests once and then let them run. That's extremely powerful for enterprise-grade reports.
One thing we're showing here as well: we're also going to come out with version control for Tableau, which we'll definitely be working on with Ryan and the rest of InterWorks. One last element I want to bring up about Tableau Cloud: Tableau Cloud is great, but one game changer is that you will no longer control the upgrade. Tableau Cloud has forced upgrades. In the past you could say, hey, I'm going to test as much as I can before I do an upgrade. Here, you're going to get the latest innovations from Tableau all the time, which is great, but you won't get a testing window. So it's even more important to build tests before or after the migration, then maintain those tests to deal with the forced upgrades, and perhaps test at least your enterprise-grade dashboards every day. Perfect. Yeah, and I think that leads on to this view we've got here of analytics governance, certification, and CI/CD. Like you say, pre Tableau Cloud, with Tableau Server you could build a Tableau Server in a QA environment, restore your backup, and quality-test your key dashboards. That option is removed from you now. With Tableau you may get a notification, but you might have resources assigned elsewhere, or you may not have seen the pop-up. The last thing you want is your enterprise-level reporting suddenly breaking because a calculated field, function, or feature was deprecated and is no longer available, and your whole BI estate stops working.
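Since Tableau Cloud decides the upgrade schedule for you, the daily testing Bruno recommends usually comes down to a scheduler entry on whichever machine runs the exported test command. A minimal crontab sketch, under assumptions: the path and script name below are placeholders for wherever you saved the command copied from the Wiiisdom console, not the real Wiiisdom CLI.

```shell
# Illustrative crontab entries (hypothetical paths and script names):
# run the enterprise-grade dashboard suite every morning at 06:00,
# and a post-refresh suite nightly at 02:30, so a forced Tableau Cloud
# upgrade is caught by the next scheduled run rather than by an end user.
0 6 * * *   /opt/bi-tests/run_wiiisdom_tests.sh enterprise-dashboards
30 2 * * *  /opt/bi-tests/run_wiiisdom_tests.sh post-refresh
```

The same entries work on any Linux or Mac box already running other BI jobs; on Windows, Task Scheduler plays the equivalent role.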
I think that's a great point, and a great testament, like you're saying, from that client at Tableau Conference: identifying the problem and then actively making a change. If you said to someone before Wiiisdom, look, you need to test your critical dashboards every day, they would say no: they don't have the resource, they don't have the time. That's the gap Wiiisdom fills. As I've shown, it doesn't take long to build a fairly in-depth test covering data, visualization, and summary functionality. Once you've built it, that dashboard can be tested every day, every hour, every time the data refreshes if you want that level of granularity. It's within your control; you're not at the mercy of the resource, cost, or budget of having someone available at that time. And that's where the CI/CD piece comes in. As a DevOps engineer by trade, somebody who sits there writing code on a black-and-white screen, using things like Git and CI/CD was critical to my role in the past: you couldn't release anything to production unless it had gone through the right testing. There was version control, there were pull requests, there were approvers. That was something missing when I moved into the BI space with Tableau: any creator could publish something, and you're trusting that that person has done the right testing. Whereas with the CI/CD piece in Wiiisdom, you can build that test and integrate it. Ace can probably give us more information on the tools it integrates with. But I tested with Jenkins, and I pushed a workbook up to GitHub. That triggered a test.
It went away and ran a performance test and a functional test on that dashboard, then reported back on my pull request. If it failed, it commented on the pull request; if it passed, it commented and closed the pull request. I didn't need another developer to approve the pull request, and it automatically used Wiiisdom's promotion-to-production feature to republish the workbook and overwrite it. So I've made a feature change, it works, and I've taken out the effort of needing one, two, three people to get something to production; it relies on that test instead. And that test will be version controlled shortly, within Q3, so it can be managed elsewhere as well. I'm just going to ask a quick question. We won't share the results of this one; it's just for us, a bit of a talking point. With everything you've seen and discussed today, do you feel Wiiisdom could bring value to your Tableau governance and testing? Do you think it could automate a big chunk of it and remove some reliance on resources around the business? While that's running, Ace, do you want to touch on some of those other CI/CD tools? Because not everyone uses GitHub, or wants to push things there; people use tools like Trello and Jira. Absolutely. Ultimately, what's really being run on Wiiisdom's side is a batch file. The code is available in the console when we run it; you can copy it directly from Wiiisdom, save it, and store it anywhere you like. Any CI/CD platform you might have, any orchestrated pipeline applications you might already be using, even homegrown in-house tools, can run these commands. From an integration standpoint we're very flexible, because all we are is that piece of code that runs the test.
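The pipeline described here (copy the test command from the Wiiisdom console, run it in CI, and gate the pull request or promotion on the result) could be sketched as a small shell wrapper. This is a sketch under assumptions: `run_wiiisdom_tests.sh` stands in for whatever command you copied from the console, and the promotion and PR-comment steps are placeholders for your CI platform's own mechanisms, not real Wiiisdom CLI calls.

```shell
#!/bin/sh
# Sketch of a CI gate around an exported Wiiisdom test command.
# "$1" is the test command (e.g. the snippet copied from the Wiiisdom
# console, saved as run_wiiisdom_tests.sh); "$2" is the workbook file.
# Names and the promotion/comment steps are illustrative placeholders.

run_gate() {
  test_cmd="$1"
  workbook="$2"

  if "$test_cmd" "$workbook"; then
    # Passed: safe to promote (here you would trigger the promotion
    # step and close the pull request via your CI platform's API).
    echo "PASS: promoting $workbook to production"
    return 0
  else
    # Failed: comment on the pull request, leave it open, and
    # quarantine the workbook for a human to review.
    echo "FAIL: quarantining $workbook for review"
    return 1
  fi
}

# Example invocation from a Jenkins or GitHub Actions job step:
#   run_gate ./run_wiiisdom_tests.sh Sales.twb
```

In Jenkins or GitHub Actions this would be one step in the job triggered by the push, with the pull request comment posted through the platform's own API.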
From your side, whatever application is available, we can find a way to connect and run the test. Perfect. Obviously we're both running this on Windows machines at the moment, but cost is going to be a big thing for people, and no developer is going to be happy having Wiiisdom running headless in the background on their machine. With cost in mind, are there other ways to host the tool that take away some of that operating-system license cost and also improve performance? Absolutely. We run on Windows, Mac, and Linux when it comes to running the code. Any server you're already using for other jobs can run these jobs as well. If you're running on different cloud-based servers that spin up at specific times of the day and run the jobs, whatever the process is, we can work with all of those setups. Perfect. We built within our Windows environment because we wanted the UI for building the tests, and then we moved what is predominantly just a JSON file and a bit of folder structure over to our Linux machine and said, run as much as you want. It performed well; we could leave that server alone and keep developing. I've popped a message in chat, so I'll give people a couple of minutes to ask a few questions. We're happy to discuss them on this call, and we can also come back to you separately. As Vicky mentioned, anyone who registered will get this recording, and we'll follow up with an email as well. While people are asking questions, I guess this one is from me to you, on the uptake of Wiiisdom.
What has been your clients' firsthand experience: what was their state of BI before, and what was it after? So, Ryan, when we saw the poll results earlier, they give some hints. Most people today are aware that they need to test, but it's very difficult to do that testing manually. Most people say: I recognize that there is a problem. That was very clear from the first poll you ran, and the results were very much in line with what we hear from our customers. We like to say it's the wild, wild west; it's like a zoo behind the scenes. People know there is a problem, and they don't necessarily trust the outputs of the dashboards they're going to share, despite all the testing done at the data level. So that's essentially where we are today. The change, the shift we're bringing to our customers starts from the consumer perspective: people want to know whether a dashboard was tested, and whether the result was positive or negative, green or red. That's what we do with dashboard certification. That's the first part. When it comes to the BI team, they no longer want to be reactive; they want to become proactive. They want to be notified when there is a problem, so they put continuous testing in place along with monitoring.
They're testing the content every day in production, they get notifications, and they're very happy with that. Another element I could add: the data team that works with the analytics team loves this. There's a lot of value for them, because they put a lot of love and effort into making sure their data is accurate and trustworthy by the time it reaches the BI team. Very often their frustration is: I've got the data right, but I hear from the consumers that whatever came out of the analytics is not accurate. Here, we make sure that all the transformations are checked as well. The last group that gets value is the auditors. Auditors want to know: tell me, Bruno, what was tested? Show me the testing results. That's something we can document, and that's what we do with Wiiisdom. So it's really a shift, and that's what we're delivering to our customers. Perfect. Yeah, that's great, Bruno. And back over to you, to put you on the spot: have you got any questions for me, from using the tool as a client and a partner? Yes. First, it was great working with you, and I've got two questions for you. The first is about services, which is not something we offer today. Say a customer comes along and says, you know what, I love Wiiisdom, I really want to put this in place. What type of services would you offer, Ryan? Yeah, it goes back to what we discussed at the beginning: we want to be involved, and we offer services, across the full BI stack.
Whether that's getting your data ready to put in front of a tool like Wiiisdom, or hosting Wiiisdom for you if you don't have the skill set to manage Linux servers or build the tests, we can offer that. We can set Wiiisdom up for you, integrate it, and show people how to use it. We can offer the Tableau Cloud migration. But I think the real value is that we can work with you on your BI journey: where are you now, where do you want to be, and how can we use Wiiisdom and those tests to get there? You might benefit straight away from performance testing, because that's all your end users complain about. So we can work with you, discuss it with your end users, and build a strategy around it: this is what we're going to do, we can host it for you, we'll build tests on the dashboards people care about, and we'll keep that CI/CD running. We can show you how to set it up with the likes of Jenkins and Jira. It's not in everyone's skill set, and it's not in everyone's interest. Some people, like me, love those integrations; some people just want something that works: give me the result today. So yes, we can offer services around all of that. Obviously we can resell Wiiisdom for you, but where we want to work with our clients is at the services level, because that's where we add the value: you add the value in the tool, and we add the value in the services that build that tool into the BI space. Those are the synergies we can offer, and people can go to our website and see everything. It's not just Tableau: there's Wiiisdom for Power BI, and we're also in the Power BI space and the SAP space, anything people need.
And Ryan, to us this is really important, because ultimately people are investing so much money into their data and analytics, so it's very important to make sure that whatever gets out, the visible part, is completely tested and validated. That's why, for all the enterprise-grade content, it makes a lot of sense to partner with you: you have a very good understanding of your customers' needs. You'll run audits to understand, I'm assuming, what is critical, what the critical business rules are, what you want to convey out of these dashboards. And it's very important in the end to make sure these are tested on a regular basis. A personal question I have for you: what was your experience with Wiiisdom? Was it complicated? What was easy? What are your three takeaways? Yeah. People may not know this, but I went through the guided evaluation that Wiiisdom runs for a new client, their onboarding to the tool. One, to see what that experience was like, so I could feed back to you and say, look, this is how it works in the real world. And two, I wanted to put myself in the mind of a client and ask, how would I integrate this into my system? So three key takeaways. The first was ease of setup: from having a license to downloading the tool and building a test took minutes. It's so quick to get started on that journey. That led into the second: when I first thought about this kind of testing, I wondered what you could actually test. Surely not much: you could check a load time, you could see if a visual is the same.
But it was the expansiveness of the features, and being able to work with that. I'm not a dashboard developer by trade; I would probably build some of the worst dashboards people have seen, ones that would fail all your tests, so it's probably a good job I don't. Instead I went and spoke to our analytics consultants who do build dashboards and said, show me a dashboard, show me how you built it and what you intended an end user to do with it. I then built the test live, based on what they were saying: they would apply this filter, they would click here, and this is what they would expect to see. Being able to work collaboratively with that person, to say, that's how you intended it to work, is that the way it is working? The third takeaway was something, again, that I'd not seen before. The CI/CD was fantastic, but the dashboard certification was the gold star for me. I've seen data source certification as part of the Data Management add-on, and being able to do that through code is great. But when people load dashboards, they're not always looking at the data source; they may not see that tiny pop-up that appears at the top, if it's been turned on. Whereas if that certification sits on your dashboards, in the same place on every dashboard as part of a best practice, an end user loads the dashboard and their eyes go straight to it. They're not looking at the dashboard; they're looking at the certification, because they don't want to waste their effort if it's wrong. Why should I go and filter and use those parameters and do the functional work Ace was showing if it's wrong anyway? I'm not going to waste twenty or thirty seconds doing that, especially across twenty dashboards. So that was the big thing I didn't expect.
And compared to when we met in 2023 and before, where the product is now is night and day. I am conscious of time, so thank you, everyone, for attending. I'd like to thank Vicky for hosting and opening this webinar, and thank you to Bruno and Ace, sorry, not Bruce, I've combined the two names, for sharing, for attending this webinar, and for working with InterWorks as well. If anyone is curious, you can reach out to me; we're all on LinkedIn, and our names are on the invite. You can reach Wiiisdom on their website and InterWorks on ours. We'll follow up with a recording and a post-webinar email as well. So thank you, everyone, for attending and for your input today.

In this webinar, Ryan Alexander, Platforms Architect at InterWorks, was joined by Ace, Solution Consultant, and Bruno Masic from Wiiisdom to discuss the importance of analytics governance and dashboard testing in modern BI environments. The session focused on the challenges organizations encounter in ensuring accuracy, performance, and secure access in Tableau dashboards, especially during migrations to Tableau Cloud. The speakers demonstrated how Wiiisdom automates functional, performance, and security tests, provides certification badges for trusted dashboards, and integrates with CI/CD pipelines for continuous validation. Real-world use cases illustrated how automated testing reduces manual effort, improves trust, and enables smooth upgrades, with actionable advice on implementing governance and validation best practices.




