The Warehouse

BLOG & NOTES

Mar 23
Where's the "T"? A Look at ETL vs. ELT

In a previous blog post, we examined the differences between traditional ETL (extract, transform and load) and ELT, where the “heavy lifting” of data transformation is handled by the robust and scalable (usually cloud-hosted) target platform. In a modern cloud data warehouse environment, ELT maximizes the speed at which data is staged and ingested, while leveraging massive computing power in the cloud to cleanse, aggregate and otherwise prepare that data for general consumption. But where is the best place to manage that transformation piece? Should it be handled by cloud-friendly ETL tools, or within the management consoles of the cloud DWs themselves?


A common perception a...
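To make the distinction concrete, here is a minimal sketch of the ELT pattern against a Snowflake target using the snowflake-connector-python client. The account, stage, and table names are placeholders for illustration only: raw data is copied into a staging table as-is, and the cleansing and aggregation then run inside the warehouse itself.

```python
# Minimal ELT sketch: land raw data in the warehouse first, then let the
# warehouse engine do the "T". Connection details, stage, and table names
# are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="...",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Load: copy raw files from a cloud stage into a staging table, unchanged.
cur.execute("""
    COPY INTO staging.orders_raw
    FROM @raw_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Transform: cleanse and aggregate inside the warehouse, where the compute lives.
cur.execute("""
    INSERT INTO marts.daily_order_totals (order_date, customer_id, total_amount)
    SELECT order_date, customer_id, SUM(order_amount)
    FROM staging.orders_raw
    WHERE order_amount IS NOT NULL
    GROUP BY order_date, customer_id
""")

conn.close()
```

In a traditional ETL flow, that second SELECT/GROUP BY step would instead run on a separate transformation server before the data ever reached the warehouse.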


May 29
Talend v. Matillion for Cloud Migration

The two leading ETL/ELT tools for cloud data migration are Talend and Matillion, and both are well-positioned for moving and transforming data into the modern data warehouse. So if you’re moving to any type of cloud-hosted DW, whether it is a cloud-dedicated warehouse such as Snowflake, or part of a larger cloud platform such as AWS Redshift, Azure SQL Data Warehouse or Google BigQuery, which tool should you use to move your existing on-prem data?


Both Talend and Matillion can source any kind of on-prem data and land it in a cloud-hosted data environment. They can also move data to and from AWS’s S3 cloud storage as well as Azure’s Blob storage (which can be used to s...
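As a rough illustration of that staging step, the sketch below uses boto3 to land an extracted file in an S3 bucket ahead of a warehouse load. The bucket, key, and file names are hypothetical, and in practice Talend or Matillion would handle this through their own S3 or Blob components.

```python
# Minimal sketch of staging an on-prem extract in S3 before a warehouse load.
import boto3

s3 = boto3.client("s3")

# Upload the extracted file to a hypothetical staging bucket.
s3.upload_file(
    Filename="/exports/orders_20200301.csv",  # local extract from the source system
    Bucket="acme-dw-staging",                 # hypothetical staging bucket
    Key="orders/orders_20200301.csv",
)

# A warehouse such as Redshift or Snowflake can then bulk-load directly
# from the staged object (e.g., via a COPY command).
```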


Mar 08
Data Integration Roadmap - Part One: The Logical Data Model

Our recent blog series on the data integration portfolio introduced a variety of new architectures that help the enterprise manage its data resources, including replication, virtualization and cloud data warehousing. Organizations are now able to integrate multiple data management solutions to address a variety of business sources and requirements. But it is important to understand that the foundation of any enterprise data management portfolio remains the same: a roadmap to data management must be created that is independent of the underlying technology. This series of blogs will examine the three main elements of the data integration roadmap: the logical data model, master data ma...


Sep 06
The Data Integration Portfolio - Part Four: Putting It All Together (In The Cloud)

This blog series has examined the hybrid data portfolio as a mix of technologies and approaches to a data foundation for the modern enterprise. We’ve examined a variety of strategies and technologies in data integration, including virtualization, replication and streaming data. We’ve shown that there is no “one size fits all” approach to an integrated data foundation, but instead have seen how a variety of disciplines that suit specific business and technical challenges can make up a cohesive data policy.


This final chapter puts it all together under the umbrella of “time-to-value" and its importance to the agile enterprise data platform. No matter what the techn...


Aug 15
The Data Integration Portfolio - Part Three: Streaming Data

In previous installments of this series, we examined recent trends in data integration, specifically data replication and synchronization, as well as data abstraction through virtualization. Taken individually, these approaches are suited to higher-latency requirements such as historical reporting and trend analysis. In this chapter, we look at real-time streaming data, and how it can complement high-latency data integration approaches to create a complete enterprise data foundation.


Streaming data delivery is often perceived to be the "holy grail" of data integration in that it provides users with immediate and actionable insight into current business operations. In reality, st...
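For a sense of what the streaming side of the portfolio looks like in practice, here is a minimal consumer sketch using the kafka-python client. The broker address, topic, and event fields are hypothetical, and Kafka is just one common transport for this pattern.

```python
# Minimal sketch of consuming a real-time stream alongside batch-loaded data.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders_events",                      # hypothetical topic of operational events
    bootstrap_servers="broker:9092",      # hypothetical broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="latest",           # only current activity matters here
)

for message in consumer:
    event = message.value
    # Push the event to a low-latency serving layer (dashboard, alert, etc.);
    # historical trending still comes from the batch-loaded warehouse tables.
    print(event["order_id"], event["status"])
```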


Feb 25
The Data Integration Portfolio - Part Two: Replication

In our previous installment on the hybrid data integration portfolio, we looked at the role of data virtualization in a unified, multi-platform approach to creating a managed enterprise data foundation. In this chapter, we examine data replication and synchronization, i.e. the ongoing copying of data (without massaging or transformation) from one physical location to another, usually in conjunction with change data capture (CDC).


Data replication is often considered ETL without the "T", though where ETL is usually a batch-based delivery process, replication is often driven by "update-upon-change". Through this process, the target database only updates when changes occur to the source. Often r...
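As a simplified illustration of update-upon-change, the sketch below polls a source table for rows modified since a stored watermark and upserts them into the target with no transformation applied. Table and column names are hypothetical, and real replication tools typically read the source's transaction log (true CDC) rather than polling.

```python
# Minimal "update-upon-change" replication sketch using a last-modified watermark.
import sqlite3

src = sqlite3.connect("source.db")
tgt = sqlite3.connect("target.db")

# Hypothetical source table, normally populated by the operational system.
src.execute(
    "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)"
)
tgt.execute(
    "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)"
)
tgt.execute("CREATE TABLE IF NOT EXISTS sync_state (last_sync TEXT)")

# High-water mark from the previous run (epoch start on the first run).
watermark = (
    tgt.execute("SELECT MAX(last_sync) FROM sync_state").fetchone()[0]
    or "1970-01-01 00:00:00"
)

# Copy only the rows that changed on the source since the last run.
changed = src.execute(
    "SELECT id, name, updated_at FROM customers WHERE updated_at > ?",
    (watermark,),
).fetchall()

for rec in changed:
    # Upsert: the target only updates when the source has changed.
    tgt.execute(
        "INSERT INTO customers (id, name, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name, updated_at = excluded.updated_at",
        rec,
    )

if changed:
    tgt.execute("INSERT INTO sync_state VALUES (?)", (max(r[2] for r in changed),))
tgt.commit()
```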


Pandata Group, LLC

Madison
316 W. Washington Avenue
Suite 525
Madison, WI 53704

Chicago
WeWork / Fulton Market
220 N. Green Street
Second Floor
Chicago, IL 60607

Pandata Group © 2020