As CIOs begin to grapple with the new world of “big data” – data arriving from new sources, in unprecedented volumes, and with usage patterns unlike those they are used to managing – some of them discover that big data can have unexpected fellow travelers: master data management (the discipline of managing the critical data that defines who and what a company works with, whether customers, suppliers, staff, products, or services) and data de-duplication.

These CIOs treat the need to handle the new, the unusual, and the unexpected as an opportunity to bring new levels of order to their traditional structured data. With new streams and types of data coming in, they realize it is more critical than ever to know what their data means and where the “systems of record” are for all of the important data. Master data management thus becomes a new priority and a key discipline. Clearly identifying the definitive sources of information in the data landscape can (among other things) help an organization store only what it needs and avoid unnecessarily replicating data across multiple systems.
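The system-of-record idea can be made concrete with a minimal sketch. The registry below is entirely hypothetical – the domain names and source systems are illustrative, not drawn from any particular product – but it shows the core discipline: one declared authoritative source per data domain, and every copy held elsewhere flagged as a replica.

```python
# Toy master-data registry: each data domain has exactly one
# designated system of record (all names here are hypothetical).
SYSTEM_OF_RECORD = {
    "customer": "crm",
    "supplier": "erp",
    "product":  "pim",
}

def authoritative_source(domain):
    """Return the system that owns the definitive copy of a domain's data."""
    try:
        return SYSTEM_OF_RECORD[domain]
    except KeyError:
        raise ValueError(f"no system of record declared for {domain!r}")

def is_redundant_copy(domain, system):
    """Any copy held outside the system of record is a replica candidate."""
    return system != authoritative_source(domain)

print(authoritative_source("customer"))                 # crm
print(is_redundant_copy("customer", "data_warehouse"))  # True
```

Even a table this simple gives an organization a basis for deciding which copies of data it actually needs to keep.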

These CIOs also convert the prospect of coping with new volumes of data into a jumping-off point for driving new efficiencies in management and provisioning. De-duplication is one key technology for making data center storage more efficient, and one that is getting a lot of attention from technology leaders eager to tighten their execution of enterprise storage management before the floodgates open on new sources and types of content.
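The principle behind storage de-duplication can be sketched in a few lines. This is a toy content-addressed store, not any vendor's implementation: data is split into fixed-size chunks, each chunk is identified by its SHA-256 digest, and identical chunks are physically stored only once, however many files reference them.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.chunks = {}   # sha256 digest -> chunk bytes (one copy each)
        self.files = {}    # file name -> ordered list of chunk digests

    def put(self, name, data):
        digests = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # store only if new
            digests.append(digest)
        self.files[name] = digests

    def get(self, name):
        """Reassemble a file from its chunk references."""
        return b"".join(self.chunks[d] for d in self.files[name])

    def stored_bytes(self):
        return sum(len(c) for c in self.chunks.values())

store = DedupStore(chunk_size=4)
store.put("a.txt", b"ABCDEFGHABCD")  # 12 bytes; chunk "ABCD" repeats
store.put("b.txt", b"ABCDEFGH")      # 8 bytes; shares both chunks with a.txt
print(store.stored_bytes())          # 8: only "ABCD" and "EFGH" are stored
```

Twenty logical bytes come to rest in eight physical bytes here; real systems apply the same idea (usually with variable-size chunking and much larger blocks) across entire storage arrays.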

Bottom line: Even before big data becomes a reality in a data center, a savvy CIO can use it as a goad to achieve improved order and efficiency. Both will serve IT well in coping with big data when it does arrive.