Snowflake Data Journey

Transcript
Today, we're going to talk about the data journey as it pertains to Snowflake. There are natural milestones that you'll hit as you commence your usage and adoption of Snowflake and all the great things it can do for you. That might be "we're just at the start of considering it," or it might be "we've been using it for years, we've got a lot of adoption, we've got a lot of things happening on it." Our mission today is to help elucidate what that journey might look like and some of the big steps we've seen our customers take, as well as, in a perhaps obviously self-interested way, to tell you how we can help you, because we've been doing this for quite a long time, for a lot of people.

So without further delay, let's make some introductions. I am Robert Curtis, the Executive Director for Asia Pacific. I'm not wearing a blazer today; I'm wearing a hoodie, which is why I'm not going to put my camera on. I'm working from home. And my companion, cohort and colleague is Paul Middlewick, who is our Data Lead for Asia Pacific. Say hello, Paul? "Hello, Paul." No, I should probably go with something better than that. Hello, everyone. It's almost like we scripted that, but I promise we did not.

We'll be chatting today in a back-and-forth conversational style. While we are chatting, there is a Q&A button inside the Zoom webinar interface; please ask any questions that you've got. We will probably have some time at the end, so we'll certainly come back around and answer as many as we can. If there are questions we can't get to, we'll reach out and try to get you those answers via email.

Alrighty. A little bit about InterWorks: we are end-to-end data and analytics. That means we think about strategy; we think about the infrastructure and platforms these tools sit upon; we think about data itself, in terms of where we store it, how we leverage it, and the pipelines that take data from place to place; and we think about analytics, which could be self-service analytics, data science, reporting, all sorts of things. And then, obviously, we're really focused on the last component, which is building communities and user adoption, and making sure you're able to cultivate a data culture, which is really the secret to making all of these things massively successful.

InterWorks is global. I get to look after the red bits over here. I'm based in Melbourne; Paul is on the other side of the country, based in Perth, so there are little arrows pointed at us. One of the things that really differentiates us is that we are a private company with a single owner, and we've got locations everywhere. So regardless of how big you are, what vertical you're in, whether you're a multinational or you happen to be based out of a particular place here in Australia or Singapore, we can help.

I want to give you some context, because some of you have heard of us before. I see lots of familiar friends in the attendees list, and some names that are new to me. So let's introduce ourselves a little bit more. We as InterWorks have always aimed at being the very best at what we do, not the biggest. We could certainly have been acquired, or bought other companies and tripled or quadrupled our size over the 27 years we've been in business.
But we've always aimed to be very focused and very specialized in data and analytics. That's earned us a whole bunch of accolades, awards and distinctions. For instance, we were named a Forbes Small Giant in 2019: a small company that punches far above its weight. We're back-to-back winners of Snowflake's Asia Pacific Innovation Award within our region. Seventy-five of the Fortune 100 companies are InterWorks customers, so we have a tremendous amount of pedigree and experience dealing with the most complex data sets in the world.

Another thing that differentiates us is that we really try to get the most experienced thought leaders on our team. I worked this out earlier this year, so maybe the number is even higher now, but our team members have, on average, more than 16 years of experience. We don't lead with senior consultants who sell you the business and then step out so juniors can come in and try to deliver it; we're experienced from top to bottom. We also take pride in being a thought leader: our blog on data generates more than three million page views each year, which is tremendous. And some of the scope: eight thousand clients and almost two million hours of consulting helping our customers reach technical solutions.

So that's a little bit about us. But what we want to talk about is how we can help you with Snowflake, and more specifically how Snowflake can help you. When you take Snowflake and InterWorks together, we believe that is a peerless winning combination.

So let's talk about this data journey, and I promise I'm going to get Paul in here very quickly. If you think about the things you might do with your data and analytics, or specifically with Snowflake, it isn't necessarily a linear progression, but oftentimes there are things you must do in a certain sequence. That's what we're talking about here. You have to start up Snowflake. You then have to figure out what use cases you want to solve on Snowflake. Then, what's the ETL that will solve those use cases? And so on and so forth, as you start to reach real maturity with your platform. So whether you are just starting out and thinking about making a switch to Snowflake, or you're a little bit further along, or maybe you're pretty mature, our pitch to you, and the whole purpose of this webinar, is to show you that we can get you all the way up to the top and beyond: in terms of maturity, cost efficiency, user adoption, and expanding and exploring all sorts of downstream use cases for your data. We've got capabilities to help you with all of that. Paul, did you want to weigh in on this slide at all?

I think it's worth recognizing that everybody has a slightly different journey in front of them as well, Rob. Depending upon what industry you're in, what your current objectives are, or what pain points you're dealing with, the journey might look a little different for each organization, and you might do things in different steps. This is not one linear curve to the summit; each one is going to have a slightly different path. And what we also bring is the experience to customize the solutions we have, or to chart a completely new path to suit the business case that you have.
Your objective, of course, is to ensure that you reach business value, or effective data decision-making, as quickly as possible. Rob?

Great call out. And if we were to more accurately beat this metaphor to death, we might have multiple mountains and multiple summits, because you might be thinking about how you build your capabilities with Snowflake while, at the same time, data science, or big-pipe ETL, or building a community all incrementally progress. But regardless, let's talk about how we can help.

Paul nailed it in that everybody's journey is bespoke, which is why, whenever we start to think about helping somebody from the start line all the way to the finish line, we really want to talk about your goals: where do you want to go? We have a product we created called a Strategy, Vision and Roadmap, or SVR. It's very tight: two weeks of whiteboarding with different components of your business, from leadership to technical experts to the community, and so forth. We talk through all sorts of things we can figure out together: What are your strengths? What are your goals? What challenges are you going to have? We cross-section that across all the different stakeholders in the initiative, and then together we build a bespoke solution, exactly as Paul said. I think the key differentiator between what we do and what other folks might do is that we also give you very tactical steps, one, two, three, on how to start accomplishing some of those big goals. If you have any interest in any of this, we'll put some contact information in the chat so you can reach out to our sales team and they can see how best to help you. And we'll talk at the end of this presentation about how we're incentivizing this over the next couple of months, because we really want to accelerate your ability to start doing things with Snowflake and with data.

Another thing that we do is advisory services, which is helping you find the best tools to solve the problems or reach the goals in front of you. Paul?

Yeah. So there are a number of complementary solutions that a client will invariably need at some point on their data journey, having adopted the Snowflake data platform. Around Snowflake there is a very broad ecosystem of vendors with fantastic solutions to specific things you may want to do around the platform. Very early on, that might start with an ETL use case: I need to get my data into Snowflake; what's the appropriate tool? Is that batch ingestion or streaming, or something like that? Then, having made those selections and gotten onto that journey of landing data and leveraging it within your business, further down the track you might be looking for a data catalog, because you want to open up understanding across the business and visibility of key metrics. You might want to look at streaming, or data quality, or a variety of other things. Snowflake fosters this very broad ecosystem for exactly this reason: it recognizes that it cannot be a one-platform-does-everything approach, and each of these solutions is very focused on specific use cases. When it comes to these advisory services, what we can do is give you advice on the best solutions that we see in the market.
And that advice comes not from reading a Gartner report and reciting it back to you verbatim; it comes from the experience of our consultant team. We only work with solutions that we enjoy working with and that deliver value. If a tool cannot deliver value in an agile manner, then it's probably not something we'll have a great deal of enthusiasm for, and therefore not something we'd want to do much work with in the future. All of our team are focused on ensuring that we get our customers to that value position as quickly as possible. And that might occur at this early stage of a project, when we're selecting an ETL solution to go with a Snowflake rapid start, or it might occur much further along the journey. Rob?

There are a couple of things you called out, Paul, that are worth spending a little extra time on. That speed to value: why is it so important to get value from day one? Well, obviously, you want to impress your stakeholders and executives and the people approving this from a budgetary standpoint. That's one. The second is that I do a whole bunch of presentations all over Australia and Asia where I talk about data strategy, and one of the things my research, along with conversations with executives and leaders, has taught me is that generally the strategy people put in place for their data, or the tool selection they've made, has somewhere between a three- and four-year life cycle. That might be because a new CIO or CTO comes in and wants to implement his or her vision of what the data is going to do and the platform it's going to be on. Or it might be that the available platforms have gotten so good that it's actually worth doing the migration to capture a breakthrough technology, which is exactly what we saw with Snowflake four, five, six years ago: everyone went, "Oh my gosh, this is a game changer. I've got to get onto this platform." So if your plan is to take months and quarters to build steam and momentum and reach escape velocity from conceptualization to realization of value, you are going to miss the boat, and you're going to end up among the eighty percent of big data projects that fail. That's why we're so passionate about speed to value, agility and getting things going fast.

Each of these is a different product: the Strategy, Vision and Roadmap, the SVR, and the advisory services. We've done these numerous times, and we've got well-defined methodologies that can really be a guiding light when there's a whole bunch of buzzwords and every vendor is saying everything: what is actually going to cut through for me? What costs and budgets should I be thinking about with this tool, or that tool, or this combination? We will help guide you through all of that.

Now, as we start talking about how we can help you specifically on Snowflake, we've got to take a look at the platform. Snowflake is not just a data warehouse; it is far more than that. It's all of these things, plus everything that connects into it. Paul is our data architect and data lead for Asia Pacific, so you're probably going to want to hear more from him on this slide than from me.
I think it's worth recognizing, as you say, Rob, that the Snowflake platform is a data platform. It is not just a data warehouse; it owes its heritage to that, and it does it brilliantly. But over the last couple of years, what we're seeing is an expansion of the opportunities to utilize that information in different ways, and for different kinds of growth for a business.

A natural step on from a data warehouse is data science: the ability to augment your data and get from a position of knowing what your position was yesterday to predicting what your position may be in six months' time. The platform has been designed to leverage the data science solutions that are out there and, crucially, to scale as those data science initiatives need it. Snowflake is absolutely brilliant at giving you the power of compute when you need it. You can run things on relatively low compute capability to start with, and really ramp up when there is an activity that warrants the extra horsepower.

What has also come, inspired by that, is the data exchange: the ability to share information in a secure fashion between you, your clients, your vendors, whoever else, as either a business arrangement or a business process, for the sharing of that information, for some business advantage, or for monetization of your data. We see a lot of our customers looking for data exchange and sharing capabilities somewhere in the mid to higher end of their journey. They have managed the necessary reporting and analytics they need to run their business, and now they are looking to share that as a commodity that could be of use to their suppliers, their vendors or their customers, and they look at data exchanges as the way of doing that.

And the last is really where Snowflake differentiates: the platform can now host data applications. Any new development we do within the organization can be co-located with our warehouse, and therefore reduce to an absolute minimum the time it takes to get information from our application database to our analytics to our data science studios, as a result of them being co-located.

The platform has also been designed to run across multiple cloud environments. It can run on AWS, Azure or GCP, or a combination of those if you want full failover, and it can bring in any type of data source: your databases, your flat files, your semi-structured data, your unstructured data. It is a repository for all of them. Those in their own right are exciting things, but when you put them all together, the opportunity to grow into and leverage the platform to maximum advantage really is there. That's why so many people are gravitating towards the Snowflake platform; we've seen tremendous adoption over the last five years as a result of it being such a game changer. Rob?

Yep, all excellent points. We're talking about a best-in-breed data platform. The thing that's also great about that is that it's agnostic; it's able to be best-in-breed without being locked into a stack, whether that's your upstream stuff, meaning your ETL and your data sources, ERP and CRM systems, or downstream Power BI, Tableau, whatever, or the cloud that you have. Snowflake is ubiquitous, which is one of the reasons why it's so powerful.
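To put syntax to the elasticity Paul describes, here is a minimal sketch using Snowflake's Python connector (snowflake-connector-python). Everything in it is a hypothetical placeholder: the credentials, the ANALYTICS_WH warehouse and the RAW_EVENTS table are invented for illustration of the general resize-on-demand and semi-structured-query ideas, not taken from the webinar.

```python
# Minimal sketch: elastic compute and semi-structured data in Snowflake.
# All names and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder
    database="ANALYTICS",      # placeholder
    schema="RAW",              # placeholder
)
cur = conn.cursor()

# Start small and cheap: an XSMALL warehouse that suspends itself when idle.
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS ANALYTICS_WH
      WAREHOUSE_SIZE = 'XSMALL'
      AUTO_SUSPEND   = 60      -- seconds of inactivity before billing pauses
      AUTO_RESUME    = TRUE
""")

# Ramp up only while a heavy workload warrants the extra horsepower...
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'XLARGE'")
# ...then drop back down. Storage is unaffected either way, because
# storage and compute are billed and scaled independently.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'XSMALL'")

# Semi-structured JSON lands in a VARIANT column and is queryable directly.
cur.execute("""
    SELECT payload:sensor_id::STRING AS sensor_id,
           payload:reading::FLOAT    AS reading
    FROM   RAW_EVENTS
    LIMIT  10
""")
print(cur.fetchall())
```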
So if we're going to talk about the data journey, then obviously we have to start with strategy, framing the conversation and defining the use cases Snowflake could potentially solve for you. Let's consider that stage zero. The first gate is how we get started, and we do what we call rapid starts, which we run as a fixed-cost project. Now, depending on what you're trying to do, there may need to be a fixed cost for Snowflake, and depending on the ETL, we also might need to set that platform up for you as well. Paul, do you want to expand on this?

Yeah, sure. You have to think of this as a blank canvas in front of you. Snowflake is a fantastic platform, but it does come as an empty canvas to start with. So what we've done over the last five years, as part of our consultancy engagements, is build up a framework of things that should be done as standard to set the platform up to receive data, as well as an architecture for how data will transition through its different states. We come to you with, first of all, a set of recommendations that we work through with your security team to make sure the platform is ready to receive data: things like IP whitelisting, tying in with Active Directory or Okta, and the security of the data during transit, as well as once it's within Snowflake, are all considered. And we have a framework of what we call role-based access control, or RBAC. We know the different personas of usage, engineers, analysts, general consumers, and what parts of the platform each will need to utilize. The result is that we can just drop that in and get going as quickly as possible. Couple that with an ETL platform and its implementation, and you now have a full stack ready to receive data and bring it into the platform, ready for you to start doing your analysis on it. On the back end of this, we might tie into Tableau, Power BI, whatever else, to get that consuming usage going as well. What we do with these rapid starts is get you going very, very quickly: a matter of days rather than weeks to ensure that your platform is ready for whatever data you want to throw at it. Rob?

When we say "fixed-cost project from," that "from" is the little asterisk. What we're really saying is: if you need special things, for instance a higher level of security on your data, that adds complexity to the rapid start. So if you needed, say, Snowflake Business Critical edition, that's what the "from" is. We'll be very transparent about what we think you need, and we can start very modestly and expand; that's the thing that's great about Snowflake, it is very scalable. But I really want to emphasize this: we are committed to getting as many people as possible working with us and with Snowflake, so there are a lot of ways we can do MVP, POC or seed pricing. Again, we'll put sales contact information into the chat as we go through this webinar, and if there's an opportunity for us to work together and you've got ideas on how we can help, by all means reach out. Whether that means we start with strategy or start talking about a rapid start, we would love to get started immediately.
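To make the persona-based RBAC idea concrete, here is a minimal sketch of the kind of role scaffolding Paul refers to. The roles, database, schemas and warehouse are invented for illustration; this is a generic pattern, not InterWorks' actual rapid-start framework.

```python
# Minimal sketch of persona-based RBAC: one role per usage persona, each
# granted only what that persona needs. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="your_account", user="admin_user",
                                   password="your_password",
                                   role="SECURITYADMIN")  # placeholders
cur = conn.cursor()

statements = [
    # Personas: engineers build, analysts query, consumers read curated marts.
    "CREATE ROLE IF NOT EXISTS ENGINEER",
    "CREATE ROLE IF NOT EXISTS ANALYST",
    "CREATE ROLE IF NOT EXISTS CONSUMER",
    # Engineers can create and manage objects in the raw layer.
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ENGINEER",
    "GRANT ALL ON SCHEMA ANALYTICS.RAW TO ROLE ENGINEER",
    # Analysts can read everything in the curated layer.
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE ANALYST",
    # Consumers get the reporting warehouse only.
    "GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE CONSUMER",
    # Roles roll up, so privileges are inherited rather than duplicated.
    "GRANT ROLE CONSUMER TO ROLE ANALYST",
    "GRANT ROLE ANALYST TO ROLE ENGINEER",
]
for stmt in statements:
    cur.execute(stmt)
```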
Alright, so let's continue. Obviously, if we're going to have a data platform, we need data, and that's where your ETL, your ELT and all the different data use cases come in. I'll talk a little here and then hand off to Paul. We have built a partnership toolset for ingestion, transformation, reverse ETL and all the different sorts of use cases, hand-selected across the leading vendors, so that we can solve as many use cases as efficiently as possible. Efficiency meaning performance and speed to value, how quickly we can get these things implemented, as well as budgetary efficiency: we treat your money like it's our money, and we're going to make sure we're spending it wisely. So these are the tools we've got in our partnership tool belt at the moment: Matillion, Fivetran, dbt, Informatica and Coalesce. They all do things a little differently, and their strengths are all a little different, which is why we wanted such a broad view. Paul?

Yep. I think it's important to recognize that every client is going to have one, maybe two different ways they want to bring information into their system. Maybe they want to load information overnight from different systems; that's very much an ETL batch-orientated solution. Maybe they want to do things in real time and have feeds from IoT sensors or logs coming straight into the platform without any hesitation or latency. We built up this toolset because we recognize that every client's journey and requirements are going to be slightly different. Some will favor low-code solutions that do all of that pipeline construction for them and present it in a visual UI. Others are more, "Hand it to my data engineering team and we will code everything; we want to control every aspect." These solutions very much cover that gamut of scale as well as implementation type.

Fundamentally, though, this is about getting data from A to B. How do we get it out of the systems we have, or the systems we've subscribed to, or those our vendors run across the network, and into Snowflake? One of these solutions is invariably going to be the right solution for you. We've cherry-picked these from the multitude available because we see them win time and again, delivering that value and getting you from A to B as quickly as possible. A data project should not be about sourcing information; it shouldn't take months to get information into the platform. The data platform needs to be focused on what we do with that information as soon as it is available: what business decisions it drives, and what outcomes we can achieve as a result of making it available. Rob?
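As a concrete taste of that batch "A to B" pattern, here is a minimal sketch of staging extracts in cloud storage and bulk-loading them with COPY INTO. The bucket, credentials, stage and table names are all hypothetical placeholders, and the elided key values would of course need real secrets.

```python
# Minimal sketch of the batch-ingestion pattern: stage the files, then
# COPY them into a landing table. All names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="your_account", user="your_user",
                                   password="your_password",
                                   warehouse="LOAD_WH", database="ANALYTICS",
                                   schema="RAW")  # placeholders
cur = conn.cursor()

# An external stage pointing at the cloud storage where extracts land.
cur.execute("""
    CREATE STAGE IF NOT EXISTS SALES_STAGE
      URL = 's3://example-bucket/sales/'   -- hypothetical bucket
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
""")

# Bulk-load whatever new files have arrived. Snowflake skips files it has
# already loaded, so this statement is safe to run on a schedule.
cur.execute("""
    COPY INTO RAW_SALES
    FROM @SALES_STAGE
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
```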
And speaking of data, we also have a whole bunch of accelerators: pre-built, fully developed patterns to get that data in, in a way that is going to be useful, as fast as possible. Paul?

Yes, we do. Recognizing again that each of the solutions we just talked about is a blank canvas, what we want to do is ensure that they all operate in a uniform manner and follow an architectural standard within Snowflake, so that we can get data into the platform as efficiently as possible regardless of its structure, whether it's structured, semi-structured or unstructured. We want to get to a data architecture inside the platform that caters for all of these end-user needs: whether the data is coming out in an analytics tool, a Tableau, a Power BI or whatever else; whether it's going into a data science solution for augmentation; whether it's going on to applications; or whether it's being shared. Fundamentally, we want that architecture in place. And so, unlike a lot of consultancy competitors, we come in prepared with these accelerators. We want to ensure that the journey of getting the data in is as quick as possible, but also that it's monitored from the get-go: we know what is happening, when it is happening, and whether any exceptions in the data are making themselves evident. Change will always affect data pipelines, and we need to know as quickly as possible so that we can ensure the pipeline's health remains, well, healthy, essentially. Rob?

Yeah. And just to reiterate: when you work with InterWorks, you are not just getting people who press buttons. You're getting best practices, thought leadership and a lot of experience in what a great way forward looks like. We're bringing you our ideas as well as our ability to execute. Patterns for data lakes: again, these are additional accelerators, across different clouds. Paul?

Yeah, I'm not going to say too much on this one, just that unstructured and semi-structured data invariably ends up landing in a lake, essentially a big old file structure in the cloud, in order for it to be utilized by Snowflake and other platforms. You might put it into a lake because you don't intend to bring everything into Snowflake. With IoT sensors, for example, a whole sensor network might be registering readings every second, so there might be tens, hundreds or thousands of pieces of information a second being registered. Frankly, you're only really interested in the reading if something goes outside a normal, acceptable window of tolerance, or when something stops; those sorts of activities. You're looking at events rather than readings second by second. In that case, you might use a data lake pattern to ingest information just in time to support an aggregate or event-based notification, keeping the majority of data down at the lake level and only surfacing the events that would be of interest to your community in some way. So we've invested a lot of time making sure we know how to establish that structure for effective ingestion into Snowflake, and how to manage and secure it, so that we're not just stockpiling information in this layer. We make the right level of information available to meet our customers' requirements, and we also meet any compliance-related obligations that may be necessary: GDPR or other regulations that require you to stipulate compliance as to the level of information you store. Rob?
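To illustrate the event-surfacing lake pattern Paul describes, here is a minimal sketch that promotes only out-of-tolerance sensor readings from a raw landing table into an events table. The table names, the VARIANT payload layout and the tolerance band are all hypothetical.

```python
# Minimal sketch of the event-surfacing pattern: readings stay in the lake
# layer; only rows outside the tolerance band are promoted as events.
# All names and thresholds below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="your_account", user="your_user",
                                   password="your_password",
                                   warehouse="LOAD_WH", database="ANALYTICS",
                                   schema="IOT")  # placeholders
cur = conn.cursor()

# Scan the raw readings (e.g. an external table over the lake) and keep
# only the rows that fall outside the acceptable window of tolerance.
cur.execute("""
    INSERT INTO SENSOR_EVENTS (sensor_id, reading, read_at)
    SELECT payload:sensor_id::STRING,
           payload:reading::FLOAT,
           payload:read_at::TIMESTAMP_NTZ
    FROM   RAW_SENSOR_READINGS
    WHERE  payload:reading::FLOAT NOT BETWEEN 10.0 AND 90.0  -- tolerance band
""")
```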
And I think this is worth calling out for those folks who have never had a chance to look at Snowflake: one of the big differentiators is that storage and compute are untethered. You can store a whole bunch of information inside a Snowflake data lake, lakehouse, whatever, for dollars per terabyte. And because it's sitting there in the cloud, the ability to take it from cold storage to warm and hot, so that you can then do things with it, is extremely versatile, and it's cost-effective to store that data there given how accessible it is. So if you've got information where you think, "We're going to need this, but maybe we can't do something with it now," or maybe you want to use some sort of cataloging tool to scan and see what's actually in there and start to use it, having it in your lake or your lakehouse is the perfect way to get it on the menu and start cooking.

When we're talking about data warehouses, data lakes and data engineering, all those things are very related. This is honestly where most of the consulting time gets spent whenever you're talking to anyone about data: the data engineering. It's getting information from the lake into the house and doing all the transformation so that it can turn into amazing stuff for your user community. We've obviously had the privilege of running with many companies, but the one we're going to call out today, that we've done a joint presentation with a couple of times, is DHL. We're talking global DHL. We helped build what they call their data factory, which is basically an analytics repository. We took it off a legacy system that was not giving them enough compute time to run all of their essential queries, analyses and ideas, and put it onto Snowflake. They initially selected a community of a few hundred data champions from around the world and have now grown it significantly. We were looking at the number of countries involved, and there are more countries on this data platform than there are in the United Nations. It's insane. Twelve thousand users, almost a petabyte of information, and by next year it will be well over that. Paul, you were the chief architect and delivery lead on this. What do you want to say about it as it relates to data warehousing, lakes and engineering?

I think it's easy to say Snowflake can do things at scale; this absolutely demonstrates that capability. Global coverage, thousands of users and hundreds of terabytes of information all speak to something that can scale easily. It's important to note that this came from a very humble beginning. We started with hundreds of megabytes of data and fewer than a hundred users initially discovering the value of the platform. Things migrated over those two years, and we've now empowered twenty-four data engineering teams across the globe within DHL to bring information onto the platform for themselves. And that's a key part of our journey and our value, I believe: we are always looking to ensure that our clients can be self-sufficient. It's great to be able to roll off this client after almost three years of building, knowing they're in a self-sufficient place, able to take forward bigger engineering work, streaming work, data science work and data cataloging; they've already established their exchange.
They've really leveraged every aspect of Snowflake. It's great to be able to step away knowing they are self-sufficient, but also that they can call on us as the next need arises, or if they need something done in an agile manner. Rob?

And what's not on this Snowflake architecture slide is exactly that: community building, which we talked about at the beginning. Not only did we support them in growing their Snowflake and ETL capabilities, we also supported them in doing self-service across Power BI, Tableau and data science. So, again, we helped them build the community so that we could then step back as their consultant and let them be self-sufficient. Great call out, Paul.

Next is the data exchange, or, another way to think about it, the marketplace. You could be sharing your data, or you could be consuming data from outside your organization that enriches the analysis you're trying to do. We've been doing a lot of work with the AFL, which is obviously a league office, the AFL House, plus all the different clubs, which hold very unique and sometimes competitive information. So how do they collect all this great information and empower their clubs with great views of not just performance data, but membership, revenue and all the things the AFL, as the center, has a unique perspective on, and then bring that to the clubs so they can advance and enrich their own analyses? Paul?

Yeah. I'd say data exchange and data sharing start with an internal need. The AFL wants to share information with its clubs, and the clubs want to share information back to the AFL; a bidirectional share, done securely, powers that. You see this in other organizations as well. We're just finishing up with a conglomerate out of Asia Pacific with a number of companies; they have organizations in Hong Kong, in Singapore and across the rest of Asia Pacific, in various strengths and sizes. We started with the conglomerate, establishing an instance there; we've just gone into the first business unit, where we've established an instance; and then it's about establishing those shares. Eighty to ninety percent of the data is never shared; it's specific to their entity or their work patterns. But it's about that interchange of information. And the key differentiator between the way you may be working today and the way this works is that it is real time. As information is loaded by one of the AFL clubs, it can be available via a share to the AFL, and vice versa. The power of that shows itself time and again for business advantage: I can use information that's been shared with me to augment my own, or I can highlight to my partners something that needs attention. All the organizations we work with eventually find a way of utilizing this; it's just moving the mindset from "it's my data" to "this has advantage for my vendors or my customers if I share this information." Rob?
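To picture what such a share looks like in practice, here is a minimal sketch of a provider-side secure share, loosely modeled on the league-and-clubs scenario. All object and account names are hypothetical, and a share set up in the opposite direction is what would make the exchange bidirectional.

```python
# Minimal sketch of a provider-side secure share. No data is copied or
# moved, so the consumer sees new rows the moment they land on the
# provider side. All names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="league_account", user="admin_user",
                                   password="your_password",
                                   role="ACCOUNTADMIN")  # placeholders
cur = conn.cursor()

# Create the share and expose one curated table through it.
cur.execute("CREATE SHARE IF NOT EXISTS CLUB_PERFORMANCE_SHARE")
cur.execute("GRANT USAGE ON DATABASE ANALYTICS TO SHARE CLUB_PERFORMANCE_SHARE")
cur.execute("GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO SHARE CLUB_PERFORMANCE_SHARE")
cur.execute("""
    GRANT SELECT ON TABLE ANALYTICS.CURATED.MATCH_STATS
    TO SHARE CLUB_PERFORMANCE_SHARE
""")

# Invite a consumer account (e.g. one club). The exact account identifier
# format depends on your Snowflake organization setup.
cur.execute("ALTER SHARE CLUB_PERFORMANCE_SHARE ADD ACCOUNTS = ORG1.CLUB_ACCOUNT")
```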
Awesome. On the data applications front, there's a lot to share here. Obviously there's the opportunity for us to help you build your data applications, but we're also looking down the track at things that are currently in development. So in terms of our 2024 roadmap, Paul, would you like to hit some of the highlights?

Yeah, just the highlights, real quickly. We recognize that Snowflake as a data platform now supports your own application development and porting your existing applications across, and that support is becoming stronger by the day. We have some tools, built with our clients, that look advantageous to share with other clients. We have our own Curator platform, which is essentially one portal for all analytics content; it could blend Tableau, Power BI and ThoughtSpot content into one location, for example. That's something we're looking to host via the Snowflake platform in the future, so that we can offer it either as a service or as something of business value to our customers. Rob?

One thing worth calling out is that InterWorks has a real spirit of innovation and entrepreneurialism. We've been in business for 27 years, and during that time we have rolled out numerous products that we've passed on to our customers. As an example, there's Curator, an analytics portal that integrates with all of these tools. As another, we built a whole bunch of things to accelerate governance and oversight with Tableau; so successful were these that Tableau actually bought them from us, and they're now part of the core product in several places. So we have a real spirit for this. We can't share everything, because it's not ready to announce, but it is coming, and we're really excited about what it means for 2024.

Building adoption. Again, there's no real reason for data if it's not driving value with your user community, whether that's data science, operational reporting, self-service or real-time analytics. This is really our strength. Our BI business started on the analytics front, and we grew into data once something like Snowflake appeared; we got super excited about that community and grew our business to encompass it as well. So we're talking Tableau and Power BI on the analytics and operational reporting side; ThoughtSpot, which is a search-based, AI-driven user tool; and Dataiku for data science. We're really passionate about this type of stuff, and we've got twenty years of building communities and doing self-service. If you take a look at the awards we've won, we're talking something like twenty different Partner of the Year awards globally; there's no one that does self-service better than us. Paul, this is more my domain, but is there anything you want to add?

I would just say we've got accelerators for each of these. We know how to get a performant dashboard out of Tableau or Power BI utilizing Snowflake as the data platform. We know how to establish labs to ring-fence initiatives being driven out of data science. Those are all things we can drop in, or advise you on, so that you can make the most of these solutions. Rob?

What's not on here is the underpinning of governance, quality and observability: all the things that make sure we are using our data, that the right people are using our data, and that the data we're using is trustworthy. That's the governance underlayer, so I just drew it underneath.
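As one concrete taste of what that governance underlayer can look like inside Snowflake itself, here is a minimal sketch of a dynamic data masking policy, one of many possible controls. The role, table and column names are hypothetical.

```python
# Minimal sketch of one governance control in Snowflake: a dynamic data
# masking policy so that only an authorized role sees raw values.
# All names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="your_account", user="admin_user",
                                   password="your_password",
                                   warehouse="ADMIN_WH", database="ANALYTICS",
                                   schema="CURATED")  # placeholders
cur = conn.cursor()

# Anyone outside the PII_READER role sees a masked value instead of the email.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS EMAIL_MASK AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
           ELSE '*** masked ***'
      END
""")

# Attach the policy to the column; every query is masked from then on.
cur.execute("""
    ALTER TABLE CUSTOMERS
      MODIFY COLUMN email SET MASKING POLICY EMAIL_MASK
""")
```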
We have done several conferences and several BI study groups with leaders all over Australia, and we asked them: what are your biggest concerns? Overwhelmingly, whether in Perth or Melbourne or Sydney, across verticals and across organizations of different sizes, and to our surprise, the number one response was governance. "We are worried about governance." And it can come from multiple points of view. It could be, "We have data that we're worried about; if somebody gets their hands on this data, we've got real problems, and we're not confident it is secure." Or it might be, "We are producing so much data that we're worried we're not actually using it. Are we allowing a competitive advantage to slip through because we're just not looking at, observing and assessing what we could be using right now?" That discoverability, that master data management, is critical, which is why, as a newer part of our business in the last couple of years, we've partnered with Informatica, Alation and Atlan, having looked at all the other tools as well. We're really excited about what this can do to empower not just Snowflake, but how you think about and see data, and about blending AI on top of all of that. Paul?

Yeah. I would just say that we've looked at these solutions and decided to partner with them because governance is invariably something that is discussed within an organization and comes out in the form of a policy document that somebody needs to implement or adhere to. What these solutions do is enable you to apply those policies across the organization, maintain them and monitor them effectively, particularly with the advent of privacy laws. We've also seen the data catalog as a way of improving visibility of, and agreement on, key metrics. So these are things that we feel add the appropriate tooling to take something from a paper design or concept and actually implement it. Rob?

So let's jump over to data monetization. This is actually one of the huge talking points; whenever you go to a conference, there are two big ones right now, right? AI, and how do we monetize our data? I'll give you some stats, and I got these from the IDC Future Enterprise Resiliency survey from 2023, which was just recently released. They do an assessment of the marketplace, the industry, what's coming, trends, AI, all that kind of stuff. One of the findings from their survey was that sixty percent of CEOs are seeking to monetize their data, and seventy-five percent are seeking to monetize APIs, apps and software. So this is a critical differentiator for businesses going forward, particularly when you start to think about how you monetize data within the data exchange. There are several folks on this call that we have worked with through our strategy services to think about exactly this: could we monetize your data? How could we take the products you've got and create different tiers of subscription, based on whether you're giving people self-service, packaged reports or a data feed, and how would you deliver and execute that? This is a critical component, and again, it comes back to our thought leadership. Depending on the size of the workshops and the types of things you're trying to cover, these start at 20K.
That is a fraction of the cost you'll see from the big four. And again, we are very focused on data and analytics, so we are going to give you very tangible steps, versus just big aspirational goals and a "best of luck." Paul?

I'd just add again that the SVR is something you typically see at the start of the journey, but the opportunity exists at any point on that journey to look at monetization and data sharing opportunities. Invariably it pops up as an afterthought somewhere along the journey, and it's very quick and easy to extend the opportunity, start building a use case for it, jump to a proof of concept, and decide whether it has value, with a platform like Snowflake. Rob?

I did a roundtable discussion at the Data Analytics Summit in Melbourne, I think last week; time flies so quickly. We were talking about how eighty percent of data projects fail, and I challenged the group: what is failure? If failure is learning and failing fast, to me, that sounds like wins. So, exactly to Paul's point, with a platform like this, with the agility you get in cloud-based, modern data toolsets, you can POC, MVP and test these things quite quickly, fail fast, and then apply learnings. That is incredibly valuable as you start to assess what you actually want your long-term vision to be.

So when you take a look at the whole architecture across Snowflake and all the different things that make a successful Snowflake data platform, the reason we went through this exercise was to show you that we cover all of it, and we've got partners to help expand and broaden it. Whether it's ETL, governance, analytics or data science, we're really trying to be an end-to-end provider, so that whatever part you feel confident in, great, we can support you and validate what you're doing there; but where you need help, where you have big goals but maybe not enough hands, we can certainly augment that as well. Paul, anything to add to this slide?

Only to say that we bring a caliber of experience that you're not going to find in other consultancies. The two-time innovation award is the baseline; we as an organization, and as a team, want to be known for delivering innovative and agile solutions. Nothing matters more to us, and we spend a lot of time as a team debating different technologies and different aspects of use cases in order to find the most efficient and effective solution. That, combined with the experience we have, is the key differentiator that means we can land on those solutions quicker than our competition.

So if you take this architecture and go back to our data journey, it might look something like this. And as Paul mentioned, this is bespoke: your journey is going to be different from another company's. Your journey this year might have four or five steps, but next year your ambitions will necessarily be grander, or in different domains. Maybe you're not ready for governance this year, but you will be next year. But as you can see, we cover everything from strategy all the way to supporting your ETL pipelines with Keep Watch, or assessing how well your data platform is performing and auditing the queries.
Are we running efficient queries, so that we can maximize the budget and credits you've got? Across the various tools you've got, we have well-defined patterns, products and services that we know you'll find value in, because we crafted them in combination with all of our previous customers: these are the things they needed, and we built the answers. Now we're mass-producing and articulating them in a way that is replicable. Paul?

What I'd say is: let's establish a firm foundation together and ensure you're set up as an organization to leverage this platform to its capabilities, and that nothing holds you back. Then, when you have any new activities or anything you're considering, you can call us in on any of these dot points. "I want to expand data analytics and usage of this platform." Okay, that's an enablement activity; we can take the community, find people across the business and expand the usage of the system, and therefore the value you get from it. Or we could be called back in because there is a specific technology you haven't utilized up until now, something like streaming or data science: what are the patterns we need to put in place in order to best support those? So we see ourselves jumping in either at a foundational point, pointing the way up this mountain, or jumping in as you hit a point where you're just switchbacking, losing time and agility because something is taking you away from your key activities. We can drop in and give you the direction you're looking for, to point you to the top of the mountain. Rob?

I created this slide, so I'm going to tell on myself: that helicopter can't fly that high. Even though I thought, "This is a great metaphor, I'll do an InterWorks helicopter lifting the metaphor of data," the air's too thin at the top of Everest, so it actually can't fly up there. So please excuse it. I know there are probably people going, "Hey, that helicopter couldn't fly that high." I know. Call it part of the metaphor. My OCD wouldn't allow me to go past the slide without pointing that out. Paul, you didn't expect me to say that, did you?

I didn't. I thought you might suggest a Saint Bernard or something; that is, we'd be that rescue partner that people may need from time to time. There you go: we're a Saint Bernard piloting our helicopter.

But wait, there's more. We mentioned that we've got incentives, and we don't want anything to be a barrier to you getting started and testing this platform so you can experience it for yourself. I understand that the market is tight; that's true for us globally, and we see it play out in every country and vertical from North America to Europe to Asia Pacific. But we don't want that to be a blocker. So whether it's a platform rapid start for Snowflake or for the ETL tools, or it's "we don't know where we're going and we want to talk about it, we want some clear ideas," or "we want to pick tools," we will work with you. Now, again, there are variables that can make this a little more complex; for instance, "we've got extremely sensitive data," or "we have to meet a particular compliance regime," and that might actually add some cost. But for the base option of a rapid start or a strategy conversation, at least until March 2024, we really want to do these for free.
So if there is something that you need, please reach out to our sales team. Giovanni, who's our behind-the-scenes coordinator for these webinars, if you wouldn't mind, drop our sales email into the chat so that anybody who wants to reach out can. We have a lot of different options and programs we can use to get you going on this platform, if not for free then at a very low cost. But the best thing we can do is just start chatting about how we can help today. Of course, that's the money part, and that is all of it. So let's get started today. We've got the ability to help you up and down the Snowflake stack and all the surrounding peripheral technologies, whether that's ETL, accelerators for the cloud, governance, analytics or data science. And commercially, we are very flexible, to make sure you get value on day one.

There were no questions in the Q&A, so Paul, I'll give you the final thought.

Oh, putting me on the spot. Thank you, Rob. I would just say: the team really thrives on a challenge. If there is something you haven't been able to crack yourselves internally, or you have a business opportunity or pain point you're looking to overcome through data, we will thrive on those challenges. We would love to hear more. Rob?

And Giovanni has put Kathy McGregor's contact information into the chat. Kathy, who's listening in on this call, is our sales lead for Asia Pacific, so you're getting the big boss right off the bat. Awesome. Well, thank you so much for attending. If you do have any questions, please reach out; we're always happy to help. If there is a colleague you think would benefit from hearing this, it is recorded, and the plan is to share a link to the recording with everyone that registered once we get it posted. Thanks again, and let us know how we can help.

In this webinar, Robert Curtis, Executive Director for Asia Pacific, and Paul Middlewick, Data Lead for Asia Pacific, from InterWorks walked participants through the modern data journey with Snowflake. They explained key milestones for adoption, strategies for platform optimization, and the role of advisory services tailored to business objectives and industry needs. The session covered fixed-cost rapid starts, integrating ETL and ELT tools, cloud architecture accelerators, and best practices for governance, data sharing, and application development. Real-world case studies illustrated scalable analytics, data lakes, and successful transitions from legacy systems, highlighting how Snowflake enables secure, flexible, and cost-efficient data management while fostering communities and driving value from analytics investments.
