Today is the day! Snowflake is now in preview on Google Cloud Platform (GCP) in the us-central1 region. This news is something Snowflake fans have been looking forward to for quite some time, but now that it’s finally here, I wanted to share why this is a big move in the right direction.
The GCP launch delivers on a vision Snowflake has been preaching for quite some time, and it’s exciting to see it come to fruition. With the ecosystem now expanded across all three major clouds, we can see the following benefits to moving workloads into Snowflake.
This is a key differentiator that Snowflake has talked about since its AWS-only days. With availability on AWS, Azure and now GCP, Snowflake is the only cloud-agnostic data warehouse delivered as a service. Redshift, BigQuery and Azure SQL Data Warehouse each tie you to a single cloud vendor, and each of those vendors sells its own tools for ETL, visualization and analysis.
Snowflake does not require a relationship with a cloud vendor, nor do they sell tools for ETL, visualization or analysis. They have one game, and that game is being the best data warehouse available. When you bring together the best data warehouse, the best ETL tool, the best visualization tool and the best analytic tool, you’re going to end up with the best solution. Snowflake’s cloud-agnostic approach makes that much more possible.
When I first heard about the idea of deploying on multiple cloud providers, I was concerned that one cloud would inevitably end up as the favored, best-supported platform. After some preliminary testing with Snowflake on Azure, I can confirm that the differences are minimal, if they exist at all.
At the core of Snowflake’s architecture, they are running virtual machines for compute and blobs for storage. Whether it is Snowflake on GCP, Snowflake on Azure or Snowflake on AWS, the architecture is going to be the same, and it is going to function more or less the same. If I’m running a query, I really don’t care whether it’s an EC2 or a GCE instance delivering it. Snowflake has figured out how to place these platforms on level playing fields, using the same resources in the same way to deliver the same product. This architecture will allow Snowflake customers to easily migrate workloads between the clouds if needed.
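To make the level-playing-field point concrete, here is a minimal sketch in Python. The account locators (`xy12345`) and region/cloud suffixes are hypothetical examples of Snowflake’s account-identifier format; the point is that client code is identical across clouds, and only the account identifier encodes which platform and region you are on.

```python
# Hypothetical account identifiers -- only this string changes per cloud.
ACCOUNTS = {
    "aws":   "xy12345",                  # AWS deployment (original region, no suffix)
    "azure": "xy12345.east-us-2.azure",  # Azure deployment
    "gcp":   "xy12345.us-central1.gcp",  # the new GCP preview region
}

def snowflake_host(account: str) -> str:
    """Every deployment is reached through the same hostname pattern."""
    return f"{account}.snowflakecomputing.com"

# The rest of the connection is cloud-agnostic. With the
# snowflake-connector-python package installed, it would look like:
#
#   import snowflake.connector
#   conn = snowflake.connector.connect(
#       account=ACCOUNTS["gcp"],  # swap "gcp" for "aws" or "azure" -- nothing else changes
#       user="...", password="...",
#       warehouse="MY_WH", database="MY_DB",
#   )

for cloud, account in ACCOUNTS.items():
    print(cloud, "->", snowflake_host(account))
```

Because nothing but that one identifier differs, repointing a workload at another cloud is a configuration change, not a code change.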
Integration with GCP Big Data Tools
I do not have much experience in the Google Cloud world, but every time I hear it mentioned, it is in a big data context. Big data appears to be the cornerstone Google is building its services around, and the announcement states that Snowflake on GCP will soon integrate with Google’s Big Data Analytics platform. This is exciting because that platform includes products like Alooma, Cloud Composer (managed Airflow) and Dataproc. I am personally very excited to get in and start playing with the new toys—I’m already bugging Tim Rhymer about getting us a sandbox.
At Snowflake Summit, there were a few mentions of Global Snowflake. Global Snowflake is the idea of a data warehouse that is truly cloud agnostic, offering cross-cloud and cross-regional replication that would provide unmatched availability, disaster recovery and other benefits. In this world, you are just a customer of Snowflake, and they orchestrate the interactions between Cloud Providers on your behalf.
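As a rough sketch of what that orchestration looks like to a customer, the sequence below shows database replication commands of the general shape Snowflake uses, held as SQL strings in Python. The database and account names (`prod_db`, `myorg.aws_primary`, `myorg.gcp_replica`) are hypothetical, and in practice each statement would be executed through a live connection on the appropriate account.

```python
# Step 1 -- on the primary (e.g. AWS) account: permit a GCP account to replicate.
ENABLE = "ALTER DATABASE prod_db ENABLE REPLICATION TO ACCOUNTS myorg.gcp_replica;"

# Step 2 -- on the secondary (GCP) account: create a local replica of the database.
CREATE_REPLICA = "CREATE DATABASE prod_db AS REPLICA OF myorg.aws_primary.prod_db;"

# Step 3 -- refresh the replica to pull over the latest changes.
REFRESH = "ALTER DATABASE prod_db REFRESH;"

def replication_plan() -> list:
    """Order matters: enable on the primary, then create and refresh the replica."""
    return [ENABLE, CREATE_REPLICA, REFRESH]

for stmt in replication_plan():
    print(stmt)
```

The appeal of Global Snowflake is that this is the whole surface area you see: a few SQL statements, with the cross-cloud plumbing handled by Snowflake.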
The launch on GCP pushes Snowflake closer to this goal, and I am excited to see these ideas come together. As Snowflake on GCP moves through preview, the future is bright.
This launch is exciting for Snowflake and for cloud data warehousing as a whole. Snowflake is placing all the power of your data in your own hands, and we couldn’t be more excited to continue partnering with them.
You can find the full press release from Snowflake on their website. Feel free to drop comments below and let me know what you think. As always, if you have any questions about cloud data warehousing, feel free to reach out to me on LinkedIn.