This series highlights how Snowflake excels across different data workloads, delivering a cloud data platform that organizations can trust and rely on as they move into the future.
Five years ago, in the distant past that still included large, boozy, in-person technology conferences, I met two guys from an upstart cloud data warehousing company called Snowflake. Walter Aldana and Uday Keshavdas were those two guys, and while they are both great and were passionate about Snowflake, I was unconvinced at the time. I left the conversation telling them I’d take a look at the product and get back to them. We were in the midst of peak Hadoop, although it was becoming obvious to the smart money that Hadoop wasn’t going to be it for the types of analytics use cases InterWorks was helping customers with. However, we didn’t yet know what the next thing would be. Knowing that something won’t work doesn’t always tell you what will, but it can keep you out of trouble. At the time, I thought there was still some life in monolithic, in-memory columnar stores. There was (for a time), but it was never a great fit for us.
The future hadn’t yet come completely into focus.
In running partnerships at InterWorks, I’ve never felt a lot of urgency to jump for the sake of jumping. No action is often the correct move. For a consultancy, getting on the wrong boat is worse than missing the boat. Being wrong about something doesn’t just cost time and money; it also costs client trust, which is exceedingly hard to earn in the first place.
First Impressions
I eventually kept my word to Walter and Uday and spun up my trial account with a very early version of Snowflake. I found it so radically different that it was disorienting at first. No COPY LOCAL? Why are so many concepts overloaded words that already have established meanings in data? Just what the heck is a stage? The tutorials were fine, and the product worked, but it was so far ahead of everything else that I am unashamed to admit I didn’t really get it. I understood that separating storage and compute was incredibly powerful, but many of the ways Snowflake would ultimately transform our customers’ world and usher in a second wave of cloud data warehousing were not yet obvious to me at the time.
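For anyone puzzling over the same first-run questions I had, here is a minimal sketch of the load pattern that takes the place of COPY LOCAL: files are first uploaded into a stage (a managed landing area for data files), then copied from the stage into a table. The table, stage and file names below are hypothetical, and the PUT command runs from a client such as SnowSQL rather than the web UI.

```sql
-- Hypothetical table and internal named stage (a landing area for data files)
CREATE TABLE trips (ride_id STRING, started_at TIMESTAMP, duration_min NUMBER);
CREATE STAGE trips_stage;

-- PUT uploads a local file into the stage (run from a client like SnowSQL)
PUT file:///tmp/trips.csv @trips_stage;

-- COPY INTO then loads the staged file into the table
COPY INTO trips FROM @trips_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```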
I left those early evaluation sessions in 2015 thinking, “Oh, this is neat, but not yet.” Had I really grasped a lot of the implications of what was already there, I would have had no choice but to shout it from the rooftops: HEY, THIS IS GOING TO CHANGE EVERYTHING! Instead, I kindly thanked Walter and Uday for their interest in InterWorks and told them to call me back when they had customers.
Timing Is Everything
Fortunately for me, Walter and Uday did call me back 18 months later, after they had found, as Bob Muglia called it, “product/market fit.” In early 2017, InterWorks became a Snowflake partner, and I began a campaign of annoying the hell out of everyone at InterWorks with how interesting Snowflake was. At the time, I remember feeling a sense of urgency, like we needed to make up for the lost ground my initial response to Snowflake had cost us. However, a lesson I learned in the early days of our relationship with Tableau was that there is early, and there is too early. We were painfully too early with Tableau; Dan Murray and I spent our first year of that partnership selling almost nothing. Fortunately, we hadn’t repeated that mistake with Snowflake, and we also hadn’t missed being a first mover. It wasn’t until the end of 2017 that we started seeing major successes explaining the transformative power of Snowflake to customers. Once it started happening in earnest, I was confident that this was what the future of the data management practice at InterWorks was going to look like.
Snowflake quickly became the foundation of our modern cloud data management solutions and rose to the level of our most strategic partnerships. We have seen them grow from a small company with a refreshing take on cloud-native databases into an absolute rocket ship of a company that is upending how the whole world thinks about storing, processing and querying data in the cloud. In addition to those fundamentals, Snowflake has introduced several capabilities that weren’t even on the radar in 2015, such as multi-cloud replication, public and private data exchanges that eliminate rote ETL and file transfer, and compute warehouses that automatically suspend and resume. They’ve done all of this in a platform that is easier for customers to own and manage, and it’s often significantly cheaper than the solutions we were discussing with our customers five years ago.
Where Are We Now?
It’s now the year 2020, and the world probably looks quite different than any of us expected. I, for one, didn’t have a Donald Trump presidency and a global pandemic in my “Where are you going to be in five years?” predictions. I doubt many of you did either. However, if there is anything that has been constant throughout my tenure at InterWorks, it is change. The world is always changing. So, too, has Snowflake changed. As additional components of the platform have emerged, Snowflake has grown from an exceptional data warehouse into a complete cloud data platform, enabling organizations to take on even broader use cases with a single platform.
In the following posts of this series, we are going to examine each of the major use cases that Snowflake is addressing in 2020 as a complete cloud data platform. Stay tuned.