How to Clean up Temporary Matillion Stages in Snowflake



When using Matillion to load data into Snowflake, temporary stages are often created in Snowflake to store the data before it lands in the destination table. Often, this is not an issue as Matillion provides a handy load option called “Clean Staged Files,” which cleans up these stages. For example, this is a possible load option when using the Database Query component.

However, there may be occasions where the “Clean Staged Files” load option has been disabled and you have a large number of files building up in your Snowflake environment. Alternatively, you may find that you end up with a lot of empty stages that contain no files but still exist, leading to confusing bloat when administering your Snowflake environment. In this brief guide, we will walk through how to remove these stages.


Requirements

  • ACCOUNTADMIN privileges in your Snowflake environment, or equivalent privileges to query the SNOWFLAKE.ACCOUNT_USAGE.STAGES view in the shared SNOWFLAKE database for your account
  • MODIFY privileges on the stages created by Matillion, most easily achieved by having access to the role that Matillion uses or one of its parent roles
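If your current role is missing either of these, grants along the following lines can set them up. This is a sketch: MATILLION_ROLE and ADMIN_ROLE are placeholder role names, so substitute the roles actually used in your account.

```sql
-- Run as ACCOUNTADMIN (or a role that owns these objects).
-- IMPORTED PRIVILEGES on the shared SNOWFLAKE database grants
-- read access to the ACCOUNT_USAGE views, including STAGES.
GRANT IMPORTED PRIVILEGES ON DATABASE SNOWFLAKE TO ROLE ADMIN_ROLE;

-- Granting the role that Matillion uses to your admin role gives
-- you its privileges on the stages it created, including MODIFY.
GRANT ROLE MATILLION_ROLE TO ROLE ADMIN_ROLE;
```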

Dynamically Creating DROP Statements

The following SQL code can be executed in Snowflake to query the SNOWFLAKE.ACCOUNT_USAGE.STAGES view and generate a series of statements to DROP each temporary stage:
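A query along these lines does the job. Note the assumptions: the ILIKE pattern guesses at Matillion's stage-naming prefix, so inspect your own stage names and adjust the filter accordingly before running anything.

```sql
SELECT 'DROP STAGE IF EXISTS "' || STAGE_CATALOG || '"."'
         || STAGE_SCHEMA || '"."' || STAGE_NAME || '";'
       AS "-- Matillion Stage Cleanup"   -- alias starts with "--" so the copied header is a comment
FROM SNOWFLAKE.ACCOUNT_USAGE.STAGES
WHERE DELETED IS NULL                    -- ignore stages that have already been dropped
  AND STAGE_TYPE = 'Internal Named'      -- Matillion's temporary stages are internal named stages
  AND STAGE_NAME ILIKE 'MTLN%'           -- assumed Matillion naming prefix; adjust for your environment
;
```

Quoting each identifier guards against stage, schema or database names that contain lowercase or special characters.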


You may wish to add a few more conditions to the WHERE clause depending on your use case. The query returns one row per stage, each containing a ready-to-run DROP STAGE statement.

The Copy button above the results grid in the Snowflake UI allows you to quickly copy the output of your query. You can then paste it into a new worksheet and execute it. This will quickly and cleanly delete all of those temporary Matillion stages. We’ve even commented out the header record already by starting the column alias with two dashes.

And so concludes this short and sweet guide. I hope you find it useful!

More About the Author

Chris Hastie

Data Architect



