Having it all: The keys to comprehensive data capture

by: Peter Keers

Credit unions aiming to build Big Data & Analytics capabilities have a lot of decisions to make. One of the most fundamental decisions is how much source data to capture. The two dimensions of “how much” are depth and breadth.

Depth

Depth refers to the amount of historical data to be loaded into a data warehouse at inception. Loading only a small amount of historical data is an option, and it may make the Big Data & Analytics launch quicker. For most credit unions, however, trending data over time is a major requirement, and starting with minimal history means it will take years to accumulate enough data to support meaningful trend analysis.

The other option is to load as much historical data as possible. While this is the preferred approach for most credit unions, there are important factors to keep in mind.

First, at least some of the historical data will often be in an archival state, typically stored on tape or another offline medium. It can take significant effort to locate that data, restore it to a temporary repository, and then load it into the data warehouse.
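To make the final restore-and-load step concrete, here is a minimal sketch in Python, assuming the tape restore has already deposited extracts as CSV files in a staging directory. The staging path, the transaction_history table, and its column names are all hypothetical, and SQLite stands in for whatever warehouse platform the credit union actually runs.

```python
import csv
import sqlite3
from pathlib import Path

# Hypothetical staging directory where restored tape extracts land as CSVs.
STAGING_DIR = Path("/staging/restored_history")
WAREHOUSE_DB = "warehouse.db"  # SQLite as a stand-in for the real warehouse


def load_restored_extracts(staging_dir: Path, db_path: str) -> int:
    """Load each restored CSV extract into a history table; return row count."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS transaction_history (
               member_id   TEXT,
               posted_date TEXT,
               amount      REAL,
               source_file TEXT)"""
    )
    total_rows = 0
    for extract in sorted(staging_dir.glob("*.csv")):
        with extract.open(newline="") as f:
            reader = csv.DictReader(f)
            # Column names are assumptions about the extract layout.
            rows = [
                (r["member_id"], r["posted_date"], float(r["amount"]), extract.name)
                for r in reader
            ]
        # Tag each row with its source file so restored batches stay traceable.
        conn.executemany(
            "INSERT INTO transaction_history VALUES (?, ?, ?, ?)", rows
        )
        total_rows += len(rows)
    conn.commit()
    conn.close()
    return total_rows


if __name__ == "__main__":
    print(f"Loaded {load_restored_extracts(STAGING_DIR, WAREHOUSE_DB)} rows")
```

Tagging every row with its source file is a small design choice that pays off in this scenario: when an archival tape later turns out to be incomplete or corrupted, the affected batch can be identified and reloaded without disturbing the rest of the history.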
