PERSPECTIVES

BLOGS & NOTES

Mar 15
Data Integration Roadmap Series - Part Two: Master Data Management

When planning your integrated enterprise data environment, it is impossible to overstate the importance of master data management. Much has been written about MDM, and it encompasses a broad range of (mostly non-technical) disciplines that are beyond the scope of a single blog entry. Here we will provide a broad overview of the four main areas of MDM to start your journey toward enterprise data governance. We will also examine the relationship between MDM and recent developments in other enterprise management programs, such as Product Information Management.

What is master data management? Quite simply, it is the administrative oversight of organizational “data as an asset” to maintain its consistency and credibility. It ...


Mar 08
Data Integration Roadmap - Part One: The Logical Data Model

Our recent blog series on the data integration portfolio introduced a variety of new architectures that help the enterprise manage its data resources, including replication, virtualization and cloud data warehousing. Organizations are now able to integrate multiple data management solutions to address a variety of business sources and requirements. But it is important to understand that the foundation of any enterprise data management portfolio remains the same: a roadmap to data management must be created that is independent of the underlying technology. This series of blogs will examine the three main elements of the data integration roadmap: the logical data model, master data management (including metadata management), and t...


Oct 18
ETL vs. ELT - What's The Difference and Does It Matter?

For most of data warehousing’s history, ETL (extract, transform and load) has been the primary means of moving data between source systems and target data stores. Its dominance has coincided with the growth and maturity of on-premises physical data warehouses and the need to physically move and transform data in batch cycles to populate target tables efficiently and with minimal resource consumption. The “heavy lifting” of data transformation has been left to ETL tools that use caching and DDL processing to manage target loads.
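To make the pattern concrete, here is a minimal sketch of a batch ETL job in Python, using in-memory SQLite databases to stand in for the source system and the target warehouse; all table and column names are hypothetical, not any particular vendor's tooling:

```python
import sqlite3

# Set up a toy source system so the sketch runs end-to-end (hypothetical schema).
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (customer_id, amount, order_date)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, 19.991, "2018-03-01T08:00"), (2, None, "2018-03-02T09:30")])

# Extract: pull raw rows from the source in a batch cycle.
rows = source.execute("SELECT customer_id, amount, order_date FROM orders").fetchall()

# Transform: the "heavy lifting" happens here, in the ETL layer,
# before anything touches the target warehouse.
transformed = [(cid, round(amt, 2), dt[:10])   # normalize precision and dates
               for cid, amt, dt in rows
               if amt is not None]             # drop incomplete records

# Load: write the prepared rows to the target table in one batch,
# keeping work on the warehouse side to a minimum.
target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE fact_orders (customer_id, amount, order_date)")
target.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", transformed)
target.commit()
```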

However, the data warehouse landscape is changing, and it may be time to reconsider the ETL approach in the era of MPP appliances and cloud-hosted DWs. These architectures are characteriz...
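For contrast with the ETL sketch above, here is the same toy job restated as ELT, where raw data is landed first and the transformation is pushed down to the target engine itself. This is a hedged sketch with hypothetical names; SQLite stands in for an MPP appliance or cloud warehouse:

```python
import sqlite3

dw = sqlite3.connect(":memory:")  # stand-in for an MPP or cloud-hosted warehouse

# Extract + Load: land the raw data in a staging table, untouched.
dw.execute("CREATE TABLE stg_orders (customer_id, amount, order_date)")
dw.executemany("INSERT INTO stg_orders VALUES (?, ?, ?)",
               [(1, 19.991, "2018-03-01T08:00"), (2, None, "2018-03-02T09:30")])

# Transform: push the work down to the warehouse engine with set-based SQL,
# exploiting the parallelism these architectures are built for.
dw.execute("""
    CREATE TABLE fact_orders AS
    SELECT customer_id,
           ROUND(amount, 2)          AS amount,
           SUBSTR(order_date, 1, 10) AS order_date
    FROM stg_orders
    WHERE amount IS NOT NULL
""")
dw.commit()
```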


Sep 06
The Data Integration Portfolio - Part Four: Putting It All Together (In The Cloud)

This blog series has examined the hybrid data portfolio as a mix of technologies and approaches to a data foundation for the modern enterprise. We’ve examined a variety of strategies and technologies in data integration, including virtualization, replication and streaming data. We’ve shown that there is no “one size fits all” approach to an integrated data foundation, but instead have seen how a variety of disciplines that suit specific business and technical challenges can make up a cohesive data policy.

This final chapter puts it all together under the umbrella of “time-to-value” and its importance to the agile enterprise data platform. No matter what the technology, data strategies invariably involve movi...


Aug 15
The Data Integration Portfolio - Part Three: Streaming Data

In previous installments of this series, we examined recent trends in data integration, specifically data replication and synchronization, as well as data abstraction through virtualization. Taken individually, these approaches are suited to higher-latency requirements such as historical reporting and trend analysis. In this chapter, we look at real-time streaming data and how it can complement high-latency data integration approaches to create a complete enterprise data foundation.

Streaming data delivery is often perceived to be the "holy grail" of data integration in that it provides users with immediate and actionable insight into current business operations. In reality, streaming has primarily been utilized in conjunc...
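As a platform-neutral illustration of the streaming pattern, the toy sketch below processes each event as it arrives and keeps running totals current, rather than waiting for a batch window. The generator is a hypothetical stand-in for a message bus such as Kafka or Kinesis:

```python
import time
import random
from collections import defaultdict

def order_stream(n=10):
    """Toy event source standing in for a message bus (Kafka, Kinesis, etc.)."""
    for _ in range(n):
        yield {"region": random.choice(["east", "west"]),
               "amount": round(random.uniform(10, 100), 2),
               "ts": time.time()}

# Update state per event instead of per batch: running totals are current the
# moment an event lands, which is the "immediate, actionable insight" streaming
# promises.
running_totals = defaultdict(float)
for event in order_stream():
    running_totals[event["region"]] += event["amount"]
    print(f'{event["region"]}: running total = {running_totals[event["region"]]:.2f}')
```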


Feb 25
The Data Integration Portfolio - Part Two: Replication

In our previous installment on the hybrid data integration portfolio, we looked at the role of data virtualization in a unified, multi-platform approach to creating a managed enterprise data foundation. In this chapter, we examine data replication and synchronization, i.e. the ongoing copying of data (without massaging or transformation) from one physical location to another, usually in conjunction with change data capture (CDC).

Data replication is often considered ETL without the "T", though where ETL is usually a batch-based delivery process, replication is often driven by "update-upon-change". Through this process, the target database only updates when changes occur to the source. Often referred to as "just-in-time" data, this repres...
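A minimal sketch of the update-upon-change idea, using SQLite triggers as a stand-in for a CDC agent's change capture; the schema and names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # one connection standing in for both systems

# Source table plus a change log, as a CDC agent might maintain.
conn.executescript("""
    CREATE TABLE src_customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE changelog (id INTEGER, name TEXT);
    CREATE TABLE tgt_customers (id INTEGER PRIMARY KEY, name TEXT);

    -- Capture every insert and update on the source into the change log.
    CREATE TRIGGER cdc_ins AFTER INSERT ON src_customers
        BEGIN INSERT INTO changelog VALUES (NEW.id, NEW.name); END;
    CREATE TRIGGER cdc_upd AFTER UPDATE ON src_customers
        BEGIN INSERT INTO changelog VALUES (NEW.id, NEW.name); END;
""")

conn.execute("INSERT INTO src_customers VALUES (1, 'Acme')")
conn.execute("UPDATE src_customers SET name = 'Acme Corp' WHERE id = 1")

# Replicate: apply only the captured changes to the target, with no
# transformation ("ETL without the T"). The target sits idle until the
# source actually changes.
for row_id, name in conn.execute("SELECT id, name FROM changelog"):
    conn.execute("INSERT OR REPLACE INTO tgt_customers VALUES (?, ?)", (row_id, name))
conn.execute("DELETE FROM changelog")  # changes consumed
conn.commit()
```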


Jan 18
Enterprise Data and Analytics (EDA) for Health Plans

What can EDA do?

EDA is the governed integration of disparate data collection applications into a centralized “source of truth” that enhances business acumen to improve and optimize decisions and performance. EDA allows health plans to easily:

  • Analyze provider performance in order to optimize pay-for-performance

  • Evaluate treatment patterns across providers and better tailor care management programs

  • Identify underlying opportunities for health education and intervention with high-risk members

Because EDA support and overhead costs are largely fixed, an organization can continue to grow and improve on the same foundation, so the return on investment (ROI) compounds over time.

Why do you need an EDA?

  • How are your patient outcomes improving?

  • Are you doing the same things but expecting different results?

  • Have your past investments become your “legacy” systems?

  • Are you experiencing rising costs, long revenue cycles, and bad debt?

  • Who are the health care provider/vendors that are performing well?

  • What lost economic opportunity (LEO) have you missed?

  • Are you maximizing your ROI?

If you answered yes to any of the above questions, then you need EDA to give you the best assessment of your organization’s performance, so that you can make the best decisions for your organization. Even the smallest improvement in everyday decision-making can improve corporate perfo...


Jan 10
The Data Integration Portfolio - Part One: Virtualization

The challenges organizations face in integrating and making sense of the plethora of internal and external data continue to grow. Not only is there diversity in data sources, but the requirements from business units involve a mix of batch, real-time, on-demand and virtualized capabilities, often from the same source for different use cases. It is rare to find a "one-stop-shop" solution to these varied data integration needs, so organizations end up with a hodgepodge of redundant, overlapping products, or worse, rely on custom internal coding with no traceability or modularity.

In this series of blog posts, we will examine the "hybrid data integration portfolio" as a planned approach to handling multiple data integration requirements with a...
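As a rough, vendor-neutral sketch of what virtualization means in practice, the toy example below answers a query by joining two physical sources (a relational table and a flat file) at request time, without consolidating the data anywhere; the sources and names are invented for illustration:

```python
import csv
import io
import sqlite3

# Two "physical" sources: a relational table and a flat file (toy stand-ins).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])

orders_csv = io.StringIO("customer_id,amount\n1,250.00\n2,99.50\n")

# The virtual layer resolves the query on demand, joining across sources at
# request time; no copy of the combined data is ever materialized.
def virtual_orders_by_customer():
    names = dict(db.execute("SELECT id, name FROM customers"))
    for row in csv.DictReader(orders_csv):
        yield names[int(row["customer_id"])], float(row["amount"])

for name, amount in virtual_orders_by_customer():
    print(name, amount)
```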


Dec 28
Four Engagement Models Pandata Group Offers to Our Clients

Pandata Group works with clients all over the Midwest who have begun the journey of becoming data-driven. These clients each have different needs, require different resources, and expect different levels of control and responsibility in their relationships with us.

To meet the requirements of each unique client, Pandata Group offers four distinct engagement models, which are frameworks that define collaboration between us and our clients for projects such as data mart implementation or BI application development. We call these engagement models Extended Team, Specialized Team, Managed Product and Managed Services. All four models ensure the right mix of expertise and transparency during the development process.

We will describe Extended...