The rapid adoption of the Snowflake cloud data warehouse is due to many factors. Instant scalability, minimal administration, accessibility from anywhere, separation of storage and compute resources, usage-based cost management . . . all of these and many others have contributed to Snowflake’s meteoric rise as the de facto cloud-based data platform. But a feature often overlooked (and potentially underutilized) is one of Snowflake’s most compelling: the ability to securely share data with outside entities without actually sending or transferring it.
For years organizations have had to rely on secure FTP to safely transfer data to outside entities. This usually meant manually exporting system data to a file (invariably a delimited flat file or Excel spreadsheet), establishing a shared FTP location (which usually meant dealing with firewalls and other network security protocols), and scheduling the transfers to recur at specific intervals (in addition to scheduling the retrievals on the other end). The receiver of the data would then have to contend with integrating the file into their enterprise data platform, hoping that the formatting remained consistent with each new transfer. To describe this process as “clunky” would be putting it mildly.
Those days are now behind us thanks to Snowflake’s Secure Data Sharing feature. The unique cloud architecture of Snowflake is what enables this: no actual data is copied between accounts. Instead, Snowflake’s cloud services layer “shares” the metadata of the source data with the recipient as a read-only data instance. The provider can share entire databases, selected schemas within a database, or even individual tables and views, providing granular control over which data elements will be accessed by the outside consumer. If the consumer does not have a Snowflake account of its own, the provider can set up a “reader account” (with its own URL) and retains total control over consumer access to that account (including users and roles). The consumer can then query that database (or use it as a source for their own Snowflake data integrations) using their own Snowflake compute resources; the consumer pays no storage costs for the shared data, the provider does.
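As a sketch, the provider-side setup described above comes down to a handful of SQL statements. (The object names here — sales_db, public, orders, partner_share, and the account identifiers — are placeholders for illustration.)

```sql
-- Create an empty share, then grant it access to specific objects.
CREATE SHARE partner_share;

-- Granular grants: the database, one schema, and a single table.
GRANT USAGE ON DATABASE sales_db TO SHARE partner_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE partner_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE partner_share;

-- If the consumer has their own Snowflake account, add it directly.
ALTER SHARE partner_share ADD ACCOUNTS = partner_org.partner_account;

-- Otherwise, create a reader account (with its own URL) on their behalf.
CREATE MANAGED ACCOUNT partner_reader
  ADMIN_NAME = 'partner_admin',
  ADMIN_PASSWORD = '<choose-a-strong-password>',
  TYPE = READER;
```

Nothing in these statements moves any data; the grants simply make the listed objects visible (read-only) to whichever accounts are attached to the share.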
The real beauty of Secure Data Sharing is that it is live data. This is the true death knell of FTP . . . the consumer always has the most up-to-date version of the data. Once the consumer accesses the data share, they can create a database from that share and utilize it as if it were their own. The consumer never needs to worry about “updating” that shared data with new files or other forms of manual extraction that may render the data stale and obsolete before it is ever queried.
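On the consumer side, turning a share into a queryable database is similarly brief. (Again, the names are placeholders; the provider account and share name would come from the output of SHOW SHARES.)

```sql
-- List the shares available to this account.
SHOW SHARES;

-- Materialize the share as a read-only database; no data is copied.
CREATE DATABASE partner_data FROM SHARE provider_account.partner_share;

-- Query it like any other database, using the consumer's own compute.
SELECT COUNT(*) FROM partner_data.public.orders;
```

Because the database is backed by the provider’s live objects rather than a file extract, every query sees the current state of the data automatically.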
Once you have mastered Secure Data Sharing, you are ready for the next step: creating a Data Exchange. This is a sharable data hub where data instances are published and discovered by consumers. It is currently in preview from Snowflake . . . this and the Snowflake Data Marketplace will be a subject for a future blog post. Meanwhile, enjoy the freedom and security of painlessly sharing live data for better collaboration and insight among your customers, vendors and partners.
About the Author:
JOE CAPARULA is a Senior Consultant with Pandata Group who specializes in delivering data architecture and data integration services for clients across several industries.