I can hear the groans now for another blog on Big Data. Well, let’s focus on the positive instead – the value it brings. Financial institutions can gather massive amounts of transaction information and process analytics to reveal opportunities within their customer base. Oil and gas producers can use predictive analytics to identify potential reserves. Telecom providers can better understand market penetration and forecast the potential of customer-facing products and services, and so on.

While we’re here, “not another blog about how difficult Hadoop is!” – instead, again, let’s look at the value derived from Hadoop. By now, unless your name is Robinson Crusoe, you’ll know that Big Data is something you should be looking at, and if not, you should be soon. Four key words for Big Data: Volume, Velocity, Variety, and Spread – how much is being created, how quickly, in what form, and where. The value from Hadoop (and wouldn’t the marketers amongst us love to have the word “spread” begin with a “v”) comes from its capability to address each of these keywords.

The attraction of Hadoop is the rumored lack of need to upgrade machines. Well, that’s another debate for another blog. Its real attraction lies in its use of clusters of machines, coordinating work across them – and these clusters can be scaled to the dizzying volumes of bytes that most enterprises either dream or sweat about, depending on who you are in the organization. At the same time, it can be used alongside existing business intelligence systems, again avoiding the need to mortgage the ranch.

Deploying Hadoop, however, does require change on many levels: establishing a starting point for determining the right plan, defining an appropriate architecture, instituting new operating models, and – very key – creating an executable plan that works. You need a roadmap! An executable roadmap is based on a cross-departmental, collaborative approach, with a prioritized list of activities that eases the transition and provides the “workable” bit, in that it should accelerate the Big Data analytics journey.

Get everyone together:

– establish primary stakeholder objectives

– define the business challenges

– identify risks, issues, and mitigation strategies. What should be the scale of the Hadoop deployment? How do you manage it? What are the key migration considerations?

– refine requirements. By identifying critical success factors, you create a point of reference to help direct and measure the Hadoop project

– confirm the key business drivers, which shape how your Hadoop installation is built and how the risks, issues, and mitigations are to be addressed

– develop the roadmap

The case for Hadoop is very powerful, and with your competitors already deploying, chances are you soon will be too – if you haven’t already.

Learn all about how to build a roadmap at HP DISCOVER.

#HPDiscover www.hp.com/services/bigdata