How next-generation technologies enable enterprises to challenge traditional data warehousing assumptions
As demand for responsive Business Intelligence (BI) and Business Performance Management (BPM) grows, global enterprises are still turning to data warehouses as their preferred source of data for analysis[1]. The principle of gathering corporate data into a single, consistent store remains perfectly valid, but as businesses are constantly changing, the practice of traditional data warehousing can prove complex, costly and prone to failure.
The fundamental problem is that traditional data warehousing methodology promotes stasis of the business model, but businesses thrive on change. The difficulty of reconciling these opposites is a major contributor to why four in every ten data warehouse implementations are expected to fail[2].
Conventional data warehousing wisdom says that you should plan for a lengthy and expensive implementation, that you will need an army of skilled project managers and technicians, and that you can forget about trying to reflect the changing state of your business: a data warehouse is static data in a static model, custom-built to meet fixed user requirements.
However, to adapt intelligently and at speed to new competitive challenges, business users need access to information that remains consistent however much their organisation changes. The cost and time overheads of re-coding a conventional data warehouse to track every change in the business are prohibitive, so reporting in such an environment will always be delayed or inaccurate, and business intelligence initiatives will not deliver actionable conclusions.
Leaders of responsive, ROI-conscious enterprises rightly observe that this is no way to support a business. Rather than moulding their business models to fit in with what data warehousing convention says is possible, major companies such as Royal Dutch/Shell Group, HBOS plc, and Unilever are breaking the rules, using next-generation tools and methodologies that make data warehousing responsive to their businesses, and highly cost-effective.
Next-generation data warehousing assumes that both the business model and reporting requirements are ever-changing. This enables businesses not only to obtain up-to-date business intelligence, but also to compare present, past and predicted performance, whatever the business structure is at any given time. As a result, business leaders can run truly adaptive enterprises, capitalising on opportunities and reacting to global events faster than the competition.
The conventional rules - and how to break them
By using a data warehousing application with a generic data structure, users can create customised data warehouses without the usual cost or time overheads.
With next-generation data warehousing, defining an end-point is no longer necessary, so business intelligence and performance management tools can be adapted to changing user requirements. The latest data warehousing techniques make it easier to define new data feeds and alter existing ones, because new star schemas can be generated automatically. Adding a new transaction data set, or modifying an existing one and then regenerating the star schema, is a point-and-click operation. Business users can also alter their own reporting and querying requirements by defining and managing their own data marts.
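To illustrate the idea of generating a star schema from a declarative definition rather than hand-coding it, here is a minimal sketch. This is not Kalido's actual implementation; the function, table and column names are invented for the example.

```python
# Hypothetical sketch: derive star-schema DDL from a declarative
# dataset definition, so adding or changing a data feed means editing
# metadata rather than hand-coding tables.

def build_star_schema(fact_name, measures, dimensions):
    """Return CREATE TABLE statements for one fact table and its dimensions."""
    ddl = []
    for dim in dimensions:
        ddl.append(
            f"CREATE TABLE dim_{dim} "
            f"({dim}_key INTEGER PRIMARY KEY, {dim}_name TEXT);"
        )
    dim_cols = ", ".join(f"{d}_key INTEGER REFERENCES dim_{d}" for d in dimensions)
    measure_cols = ", ".join(f"{m} NUMERIC" for m in measures)
    ddl.append(f"CREATE TABLE fact_{fact_name} ({dim_cols}, {measure_cols});")
    return ddl

# "Adding a new transaction data set" becomes a single metadata call:
statements = build_star_schema(
    "sales",
    measures=["quantity", "revenue"],
    dimensions=["product", "customer", "time"],
)
for s in statements:
    print(s)
```

Regenerating the schema after a change is then just a re-run of the generator against the updated definition, which is the kind of operation a point-and-click interface can sit on top of.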
Global enterprises may introduce new brands, acquire competitors or sell off under-performing business units on a daily basis, so freezing the business is an impractical proposition. By separating data from the business model, and allowing multiple models to co-exist, next-generation data warehousing enables the data warehouse to evolve at the same speed as the business, even during implementation.
Next-generation data warehouses provide a generic data structure that separates transaction and reference (business context) data from the current business model, and stores them all as separate entities. This makes it possible to view all of the organisation's collected data according to past, current or future business models. A clear view of data in current and future business models is particularly important during merger and acquisition activity, where it enables decision-makers to compare pre- and post-merger performance at high speed and low cost.
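The separation of transactions from the business model can be sketched as follows. This is purely illustrative, not the product's data structure: the product names, brand hierarchies and model versions are all invented.

```python
# Hypothetical sketch: raw transaction data is stored once and never
# rewritten; business models (here, a product-to-brand mapping) are
# versioned alongside it, so the same facts can be rolled up under a
# past or present model.

transactions = [  # the facts stay stable when the business restructures
    {"product": "SoapA", "revenue": 100},
    {"product": "SoapB", "revenue": 250},
]

# Two versions of the business model held side by side.
models = {
    "2002_pre_merger":  {"SoapA": "BrandX", "SoapB": "BrandY"},
    "2003_post_merger": {"SoapA": "BrandZ", "SoapB": "BrandZ"},
}

def revenue_by_brand(model_version):
    """Aggregate the same transactions under the chosen business model."""
    totals = {}
    model = models[model_version]
    for t in transactions:
        brand = model[t["product"]]
        totals[brand] = totals.get(brand, 0) + t["revenue"]
    return totals

print(revenue_by_brand("2002_pre_merger"))   # {'BrandX': 100, 'BrandY': 250}
print(revenue_by_brand("2003_post_merger"))  # {'BrandZ': 350}
```

Because the transactions are never re-coded, comparing pre- and post-merger performance is a matter of selecting which model version to aggregate under, not of rebuilding the warehouse.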
By storing data separately from its model, enterprises can support multiple business models across a federation with greater ease. Synchronisation can be handled automatically, with new business models distributed over the internet, and reporting controlled from a central point for maximal cost-effectiveness.
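The federated arrangement might look something like the sketch below, where a central corporate model is pulled down by each local warehouse and merged with its own localisations. Again this is an assumption-laden illustration, not the vendor's mechanism; the dimension and attribute names are invented.

```python
# Hypothetical sketch: a versioned corporate model is distributed from
# the centre; each local warehouse keeps its own attribute extensions
# on the shared dimensions while adopting the latest corporate version.

corporate_model = {
    "version": 7,
    "dimensions": {"customer": ["global_segment"], "product": ["global_brand"]},
}

def synchronise(local_model, central):
    """Adopt the latest corporate model, appending local attribute
    extensions to each shared dimension."""
    return {
        "version": central["version"],
        "dimensions": {
            dim: attrs + local_model["dimensions"].get(dim, [])
            for dim, attrs in central["dimensions"].items()
        },
    }

uk_local = {"version": 6, "dimensions": {"customer": ["uk_region"]}}
uk_model = synchronise(uk_local, corporate_model)
print(uk_model["version"])                  # 7
print(uk_model["dimensions"]["customer"])   # ['global_segment', 'uk_region']
```

A consistent top-down view falls out of the shared corporate attributes, while each operating unit reports on its own extensions locally.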
By using a pre-built data warehousing application that can quickly be adapted to suit the business, then managed by business users via a simple interface, enterprises can create and run data warehouses without the investment in programming skills normally required - and without needing a skilled database administrator for every local instance.
Enterprises that use data warehousing applications rather than building from scratch can expect much faster implementation at significantly reduced cost. Next-generation data warehousing software also gives enterprises the opportunity to change the structure and purpose of the data warehouse during the implementation cycle, reducing the need for exhaustive pre-planning and dramatically cutting the risk of project failure.
The next generation goes live
Next-generation data warehousing is not merely a blueprint for the future, but a reality in major enterprises around the world, where it is saving time and money, and delivering a clearer and more accurate view of performance throughout change.
Take for example Shell OP, the various Oil Products businesses within the Royal Dutch/Shell Group. Shell OP needed to accommodate independently-changing local, regional and global business models and data structures, while providing a standardised global view of business performance. According to the standard assumptions about data warehousing, the cost of designing, building and maintaining such a system would be astronomical, and the system would have a high chance of failure.
Challenging the rules, Shell OP successfully built a federation of over 60 data warehouses covering over 80 countries in just 18 months, a timescale that would have been inconceivable under the conventional rules of data warehousing. The solution brings together management information to support standardisation and segmentation, with global and local views of key business entities such as customers and products. The federative approach permits any number of localisations to co-exist with the common corporate data model, giving a consistent top-down view without forcing a structure on individual operating units.
Global FMCG giant Unilever regularly undertakes mergers and acquisitions, so it needed a data warehouse that would not require its multiple business models to remain static. The company also needed to be able to view historical brand performance, in order to measure the effects of restructuring initiatives. Unilever successfully broke through the constraints of conventional data warehousing, building a flexible and cost-effective solution that has delivered rapid results.
Using next-generation data warehousing technology, Unilever has succeeded in bringing together complex, time-variant data from numerous systems, and is using this data to deliver relevant and timely management information directly to business users. The company now has commonality across supply-chain, brand, customer and financial data, all cross-referenced by the same master reference data warehouse, ensuring greater consistency and accuracy of information.
The solution has made a substantial contribution to savings in procurement, and expanded Unilever's ability to view the historic and projected performance of global brands across financial and non-financial measures.
When Halifax and Bank of Scotland merged to form HBOS plc, the board wanted to integrate procurement data across the whole organisation in order to facilitate cost savings. Conventional wisdom dictated that a custom-built data warehouse would be needed, and that HBOS would need to define an end-point very carefully before starting any work. HBOS could not accept these constraints, because the nature of its ongoing business evolution meant that its organisational structures would be changing regularly. Furthermore, HBOS needed an operational data warehouse as quickly as possible, since the board of directors wanted to use the cost savings made within the first few months of the merger as proof of its success.
With conventional data warehousing methodology, this degree of flexibility would have been at worst unfeasible, and at best expensive and slow to build. HBOS used a data warehousing application to bring together data in different coding structures, and was able to give business users a clear view of the merged procurement information within just three months, without affecting its ability to view data according to the old business models.
Breaking free from constraints
Enterprise leaders seeking to improve the ROI of their management information initiatives no longer need to feel that data warehousing technology holds them back. As the above examples demonstrate, new software and methodologies make it possible to create highly responsive data warehouses that can be managed at low cost in rapidly changing business environments. These data warehouses can deliver a consistent view of the past and the present without requiring any costly changes to source systems, and automatically adapt to business change.
By challenging restrictive assumptions about data warehousing, enterprises can develop the flexibility they need, but without having to make unsustainable investments in technology. In a climate of cost-cutting, can any enterprise afford to ignore next-generation data warehousing?
[1] A Harte-Hanks information integration survey published in February 2003 found that 54 per cent of Global 2000 companies are implementing a data warehouse, and 27 per cent plan to do so in the next 12 months. The survey was commissioned by Kalido Group and was based on interviews with 154 respondents from the US, UK and the Netherlands.
[2] Cutter Consortium, Corporate Use of Data Warehousing and Enterprise Analytic Technologies, December 2002. According to the report, the addition of features during development is a primary reason for data warehouse project failures.
About the Author
Cliff Longman is Chief Technical Officer of Kalido Group, where he sets long-term strategic product and technology direction for the company's information integration portfolio. He has a strong track record in this field, having provided independent consulting services in Object Orientation, Information Management and Information Systems Strategy. Before becoming a consultant, Longman spent nine years at Oracle with responsibility for new technology, including the architecture of the Oracle Designer 2000 product set. He graduated with a First Class Honours degree in Computer Science from Coventry University. Email him at Cliff.Longman@kalido.com; telephone +44 (0) 207 934 3300.
Kalido (www.kalido.com) is a leading data warehouse lifecycle management (DWLM) software provider. The KALIDO® DWLM Suite automates data warehouse creation and modification to provide business users with timely and consolidated management information throughout organizational change. This ability empowers IT and the business to become more closely aligned to help accelerate financial reporting and enhance performance management of key business entities such as customers, brands, sales, marketing, supply and logistics.
The KALIDO DWLM Suite consists of adaptive data warehouse and master data management software. This application portfolio allows users to build and maintain an enterprise data warehouse and its master data rapidly, without the need for programming or operational system standardization. The suite is built on UK-patented technology that enables the business model within the data warehouse to be modified easily, keeping it continuously up to date with the business as it changes. According to an independently audited study, Kalido's software reduces the creation and lifecycle maintenance staffing costs of data warehouses by 55 per cent or more compared with custom-built solutions.
Longman, C., "Data Warehousing at the Speed of Business", DSSResources.COM, 11/05/2003.
Allison Parker, Imagio/J. Walter Thompson, provided permission to archive this article and feature it at DSSResources.COM on behalf of Kalido and Longman on Thursday, September 4, 2003. You can contact her at Ph. 206-625-0252, ext 3051 and email: firstname.lastname@example.org. This article was posted at DSSResources.COM on November 5, 2003.