Today’s businesses continue to strive for data maturity and to create a culture that is data-driven and data-literate. But the perception remains that starting such a journey is expensive and time-consuming. There is a belief that a combination of high upfront costs and lengthy implementation time prevents the business from seeing any near-future value in its data and analytics investment. The reality, however, is that today’s cloud-based data platform technologies, specifically the triumvirate of Snowflake (data storage and compute), Matillion (data loading and transformation), and ThoughtSpot (data analytics and insight), enable rapid analytic ability with minimum initial inv...
The time it takes from beginning a data cloud initiative to realizing any tangible value can be years.
After all, there is a lot to consider. First, you need to ensure that there is alignment between the business and IT on what the short-, mid-, and long-term objectives are. Second, there is the small task of assessing your entire data estate. It is important to figure out with each vendor which combination of modern data stack configuration, workload option, availability zone, and pricing plan is optimal for you.
Third, you need to build a migration plan, defining a strategy for each candidate analytical application and data set that will drive value to the business ...
One of the challenges that many Snowflake administrators face is the daily monitoring of user and resource activity in their environment. The advantages of Snowflake's consumption-based pricing model (instantly scalable compute sizing, usage-focused scheduling, etc.) are best employed when compute and storage activity is transparent to the business. Snowflake includes a view-based ACCOUNT_USAGE schema in the out-of-the-box SNOWFLAKE database that contains all the information related to account activity. This data can, of course, be directly queried like any other data, but wouldn't it be nice to have a single view of key metrics around data storage and activity that can be monitored day to ...
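As a sketch of the kind of day-to-day metric this schema makes available, the query below summarizes credit consumption per warehouse over the last week, using the standard SNOWFLAKE.ACCOUNT_USAGE views (the seven-day window is an illustrative choice):

```sql
-- Daily credits consumed per warehouse over the last 7 days,
-- sourced from the shared SNOWFLAKE database's ACCOUNT_USAGE schema.
-- Note: ACCOUNT_USAGE views can lag live activity by up to a few hours.
SELECT
    warehouse_name,
    DATE_TRUNC('day', start_time) AS usage_day,
    SUM(credits_used)             AS credits_used
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY warehouse_name, usage_day
ORDER BY usage_day, credits_used DESC;
```

Similar rollups against STORAGE_USAGE and LOGIN_HISTORY round out the storage and user-activity side of such a monitoring view.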
We have all seen how more and more companies are moving to the cloud for their data management platforms. Snowflake, Azure Synapse, AWS Redshift, and Google BigQuery are leading this charge towards low-admin, instantly scalable cloud database solutions. Accompanying this is a migration to cloud-hosted data integration and low-code ETL solutions like Matillion and Fivetran. It is tempting to assume that with all these low-overhead data management platforms, the concept of data modeling may be a thing of the past, relegated to the pile of on-premises databases that this brave new world is supplanting.
In reality, data modeling is more important than ever. A key to understanding this importance i...
If you are like many ETL developers, you’ve struggled to find an easy way to source cloud services data via REST API. Although standards are in place for REST API web services protocols, it seems that every vendor has its own variation of them, creating new challenges for each new source. Matillion’s cloud ELT product has long featured an API profile creator that sources from JSON files and creates RSD (Real Simple Discovery, an XML format) scripts for use with API query components. The effectiveness of this approach, however, is only as good as the quality of the JSON files provided by the vendor.
Now, with version 1.47, Matillion introduces much simpler functionality for extra...
Enterprises running SAP Business Warehouse (SAP BW, BW/4HANA and BW on HANA) are keeping a close eye on challengers like Snowflake. Is cloud data warehousing the answer to all the challenges for organisations with a large SAP footprint? And how does a cloud data warehouse fit into the data architecture? Should it replace SAP BW or is there still value in SAP BW?
Why do most enterprises with a large SAP footprint run SAP BW?
The dominant position of SAP BW is easy to explain from a historic perspective, but it would not do justice to SAP BW to ignore its current strengths as well. Let us start with the latter. SAP BW is still the only data warehouse platform which delivers all data warehouse f...
In 2015, I was fortunate enough to lead the sales development effort of a cloud-based supply chain visibility product. The product was geared toward the manufacturing sector to allow a more transparent and collaborative platform to enable information sharing. We leveraged the Salesforce cloud to build the application, and it was intended to support digital transformation efforts within the supply chain to optimize the flow of information, provide real-time data to the supplier, empower collaboration, and scale for adoption. Unfortunately, execution on the vision failed, and the failure was tied to one key component: delivering real-time data.
Let’s fast forward to...
The rise of personal computing in the 1980s and 90s led to a boom in business productivity that was transformative in its scope. Suddenly businesses had the power of what were formerly room-size computers on their desktops. This period saw the rise of the “knowledge worker” and the digitization of business.
But it was the impact of the internet that really drove business to the next level. All of those isolated desktop computers were now connected via the World Wide Web to enable communication, marketing, and commerce without barriers. An open exchange of innovation and ideas fostered rapid growth, collaboration, and the global marketplace. The inter...
The rapid growth in adoption of the Snowflake cloud data warehouse is due to many factors. The instant scalability, minimal administration, accessibility from anywhere, separation of storage and compute resources, usage-based cost management . . . all of these and many others have contributed to Snowflake’s meteoric rise as the de facto cloud-based data platform. But a feature often overlooked (and potentially underutilized) is one of Snowflake’s most compelling: the ability to securely share data with outside entities without actually sending or transferring it.
For years organizations have had to rely on secure FTP to safely transfer data to outside entities. This ...
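A minimal sketch of that sharing flow, using Snowflake's share DDL (the database, table, and account names below are hypothetical):

```sql
-- Provider side: create a share and expose one table through it.
CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_share;

-- Grant the consumer's Snowflake account access;
-- no data is copied, exported, or transferred.
ALTER SHARE sales_share ADD ACCOUNTS = partner_account;

-- Consumer side: mount the share as a read-only database.
CREATE DATABASE shared_sales FROM SHARE provider_account.sales_share;
```

The consumer queries the shared tables in place, against the provider's single copy of the data, which is what removes the FTP-style transfer step entirely.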
The data vault has long been viewed as a model best suited for historical and archival enterprise data. Its “insert only”, business-process approach to raw, unadulterated data is ideal for low-maintenance storage of all enterprise-generated information from all systems. Use cases for data vaults have traditionally revolved around historical tracking and auditing . . . however, the perception has largely been that it is ill-suited to analytics due to its many-to-many relationships and dispersed structure. In fact, data vaults are often used as a “lightly modelled stage” for traditional star-schema data warehouses.
But the data vault may be best suited for a use case that...
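To make the “insert only”, many-to-many structure concrete, here is a minimal sketch of the data vault's three core table types (hub, link, satellite); the entity and column names are illustrative, not from any particular implementation:

```sql
-- Hub: one row per unique business key (e.g., a customer).
CREATE TABLE hub_customer (
    customer_hk   VARCHAR   NOT NULL PRIMARY KEY,  -- hash of the business key
    customer_id   VARCHAR   NOT NULL,              -- the business key itself
    load_ts       TIMESTAMP NOT NULL,
    record_source VARCHAR   NOT NULL
);

-- Link: a many-to-many relationship between hubs
-- (here, customers and orders).
CREATE TABLE link_customer_order (
    link_hk       VARCHAR   NOT NULL PRIMARY KEY,
    customer_hk   VARCHAR   NOT NULL,
    order_hk      VARCHAR   NOT NULL,
    load_ts       TIMESTAMP NOT NULL,
    record_source VARCHAR   NOT NULL
);

-- Satellite: descriptive attributes, versioned over time by
-- inserting a new row per change -- never updating in place.
CREATE TABLE sat_customer_details (
    customer_hk   VARCHAR   NOT NULL,
    load_ts       TIMESTAMP NOT NULL,
    name          VARCHAR,
    region        VARCHAR,
    record_source VARCHAR   NOT NULL,
    PRIMARY KEY (customer_hk, load_ts)
);
```

Because every change lands as a new satellite row, the model carries its full history by construction, which is why it shines for auditing, and why analytics over it typically joins hubs through links rather than walking a star schema.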