Data Integration Roadmap - Part One: The Logical Data Model

Our recent blog series on the data integration portfolio introduced a variety of new architectures that help enterprises manage their data resources, including replication, virtualization, and cloud data warehousing. Organizations can now combine multiple data management solutions to address a variety of data sources and business requirements. But it is important to understand that the foundation of any enterprise data management portfolio remains the same: a roadmap to data management must be created that is independent of the underlying technology. This series of blogs will examine the three main elements of the data integration roadmap: the logical data model, master data management (including metadata management), and t...

ETL vs. ELT - What's The Difference and Does It Matter?

For most of data warehousing’s history, ETL (extract, transform, and load) has been the primary means of moving data between source systems and target data stores. Its dominance has coincided with the growth and maturity of on-premises physical data warehouses and the need to physically move and transform data in batch cycles to populate target tables efficiently and with minimal resource consumption. The “heavy lifting” of data transformation has been left to ETL tools that use caching and DDL processing to manage target loads.

However, the data warehouse landscape is changing, and it may be time to reconsider the ETL approach in the era of MPP appliances and cloud-hosted DWs. These architectures are characteriz...
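The ETL-versus-ELT distinction can be sketched in miniature. The following is an illustrative example only (not any particular vendor tool), using SQLite as a stand-in target and hypothetical order rows: ETL transforms the data in the integration layer before loading, while ELT loads raw data into a staging table and lets the target engine do the transformation in SQL.

```python
import sqlite3

# Illustrative rows from a hypothetical source system:
# (order_id, amount_cents, country)
source_rows = [(1, 1250, "us"), (2, 990, "de"), (3, 4200, "us")]

# --- ETL: transform in the integration layer, then load the target ---
etl_db = sqlite3.connect(":memory:")
etl_db.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, country TEXT)")
transformed = [(oid, cents / 100.0, c.upper()) for oid, cents, c in source_rows]
etl_db.executemany("INSERT INTO orders VALUES (?, ?, ?)", transformed)

# --- ELT: load raw data first, then transform inside the target engine ---
elt_db = sqlite3.connect(":memory:")
elt_db.execute("CREATE TABLE staging (order_id INTEGER, amount_cents INTEGER, country TEXT)")
elt_db.executemany("INSERT INTO staging VALUES (?, ?, ?)", source_rows)
elt_db.execute("""
    CREATE TABLE orders AS
    SELECT order_id, amount_cents / 100.0 AS amount, UPPER(country) AS country
    FROM staging
""")

# Both paths end in the same target table; what differs is where the
# transformation work runs (integration server vs. warehouse engine).
etl_total = etl_db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
elt_total = elt_db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(etl_total, elt_total)  # 64.4 64.4
```

In the ELT path, the transformation is pushed down to the database, which is exactly the property that makes ELT attractive on MPP and cloud warehouse platforms where the target engine has more compute than the integration layer.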

The Data Integration Portfolio - Part Three: Streaming Data

In previous installments of this series we examined recent trends in data integration, specifically data replication and synchronization, as well as data abstraction through virtualization. Each of these approaches is well suited to high-latency requirements such as historical reporting and trend analysis. In this chapter, we look at real-time streaming data and how it can complement these higher-latency integration approaches to create a complete enterprise data foundation.
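The batch-versus-streaming contrast can be sketched with a minimal example. The generator and event names below are illustrative stand-ins for a real streaming source such as a message queue: the point is that state is updated per event, so current totals are queryable immediately rather than after the next batch cycle completes.

```python
from collections import defaultdict

def event_stream():
    """Stand-in for a streaming source; yields (sensor_id, reading) events."""
    for event in [("s1", 10.0), ("s2", 3.5), ("s1", 12.0), ("s2", 4.0)]:
        yield event

# Streaming consumption: incorporate each event as it arrives.
running_totals = defaultdict(float)
for sensor_id, reading in event_stream():
    running_totals[sensor_id] += reading
    # At this point a dashboard could already query running_totals,
    # without waiting for an end-of-day batch load.

print(dict(running_totals))  # {'s1': 22.0, 's2': 7.5}
```

A batch pipeline would produce the same totals, but only after the full extract-and-load cycle, which is why streaming complements rather than replaces the high-latency approaches used for historical reporting.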

Streaming data delivery is often perceived to be the "holy grail" of data integration in that it provides users with immediate and actionable insight into current business operations. In reality, streaming has primarily been utilized in conjunc...



Can ETL Be Agile?

Business intelligence projects benefit greatly from an agile development approach. Since BI closely aligns IT with business, an iterative delivery model ensures that business stakeholders are always involved in the design process and that a constant dialogue is maintained. The objectives and benefits of agile project management include:

  • Response to rapidly changing requirements

  • High degree of customer involvement

  • Quick results

  • Progress measurement

  • Team motivation

This approach has traditionally been applied to the development of the presentation, or “customer-facing,” layer of BI. But how does an agile project manager make “upstream” processes like data architecture and ETL part of the iterative deliverable? Much of the “data plumbing” remains ...