Thursday 31 May 2012

The Journey from Data Warehouses to Big Data

Data Warehouses were driven by the explosion of data within corporate ERP and CRM systems and the need for better management reports. Generally speaking, the standard reports from these packaged systems did not fit the needs of most corporations, so initially there was a huge demand to download data into Excel or Access. This approach spawned a huge end user computing industry, with report development left in the hands of power users within each business group. While at the time it certainly added value, left unchecked it created huge internal disconnects in management reporting processes. The advent of better replication and BI tools helped corral some of the wild west attitude that prevailed. In parallel, management reporting processes and integrated, user friendly ERP/CRM reports improved, but most importantly, the deployment of Data Warehouses aimed at delivering genuinely useful end user reporting started to gain traction.

If you were not around to see the buzz and hype that surrounded the planning and deployment of Data Warehouses, you might think it all happened quickly and that industry got real value from most deployments. The truth is that most Data Warehouse deployments in the early years struggled to gain traction for several reasons, and the first ten years saw many costly failures.

The data stored in Data Warehouses is primarily extracted from ERP and CRM systems, so it comes from known sources that are updated at regular intervals and whose ERD is under tight control. Big Data deployments typically add several new data sources that originate outside the control of the corporate IT group. The data structures, data provenance, data quality, timing of updates and several other factors are, in most cases, not within the control of the consumers. These sources can deliver data in several formats, including unstructured data, as the sketch below illustrates.
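
As a rough illustration of that contrast, here is a minimal Python sketch. The field names, schema and helper functions are hypothetical, chosen only to show the difference between loading a tightly controlled ERP extract and ingesting external data whose format is not known in advance.

import csv
import io
import json

# Hypothetical schema for a warehouse-style extract from a controlled
# ERP source: every field and its type is agreed in advance.
ERP_ORDER_FIELDS = {"order_id": int, "customer_id": int, "amount": float}

def load_erp_extract(csv_text):
    """Parse a controlled ERP extract; reject rows that break the schema."""
    rows = []
    for raw in csv.DictReader(io.StringIO(csv_text)):
        try:
            rows.append({k: cast(raw[k]) for k, cast in ERP_ORDER_FIELDS.items()})
        except (KeyError, ValueError) as err:
            raise ValueError(f"ERP extract violated the agreed schema: {err}")
    return rows

def ingest_external_payload(payload):
    """External sources arrive in whatever shape the provider chose:
    JSON, delimited text, or plain unstructured notes."""
    try:
        return {"format": "json", "data": json.loads(payload)}
    except json.JSONDecodeError:
        pass
    first_line = payload.splitlines()[0] if payload else ""
    if "," in first_line:
        return {"format": "csv", "data": list(csv.reader(io.StringIO(payload)))}
    # Fall back to keeping the raw text for later text mining or enrichment.
    return {"format": "unstructured", "data": payload}

if __name__ == "__main__":
    erp = load_erp_extract("order_id,customer_id,amount\n1,42,19.99\n")
    ext = ingest_external_payload('{"comment": "great service!", "lang": "en"}')
    print(erp, ext["format"])

The point of the sketch is that the first function can afford to fail loudly when the schema is broken, because the source is under corporate control; the second has to accept whatever arrives and record what it found.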

If we are to believe the vision that “data is the new oil” for the service enabled businesses of the future, then businesses must develop the staff, technology and processes that will unlock the value beneath the surface. The challenge is very different from deploying Data Warehouses, so the staff and approach need to be adjusted. The future vision of business and government functions powered by Big Data enabled services disrupts the existing paradigm of static processes that are only occasionally reviewed and changed. The new world business order will have even more nimble business processes: we will understand not just what happened, so we can report on that historical event, or what scenario might feasibly play out, but also the factors influencing these events, so we can build further models and react closer to real time. The organisational ability to react to change will be key; organisations that are better informed, with nimble processes, will be better positioned to react to downturns or take advantage of upturns as the opportunities arise. This is where the FuturICT Flagship www.futurict.eu hopes to play an important role in the development of this new Big Science.
