Poor data quality is undermining the massive investments being made in enterprise resource planning (ERP) – and the failure of many companies to take this issue seriously is exposing them to certain systems failure.
The past five years have seen businesses large and small investing millions in the process – and all too often the pain – of implementing ERP systems: heavy-duty, headache-inducing undertakings that tie together an organisation's many data centres and make their data available to employees throughout the business. The cost of owning an ERP system is notoriously high, driven not only by the cost of the hardware but also by the ongoing upgrades needed to keep it running. A large ERP user, for instance, can realistically budget between R50 million and R100 million for hardware upgrades every three years.
Part of the need for such costly upgrades is the sheer size of the databases that support ERP systems, says Darryl Joubert, chairman of specialist data quality software supplier and service bureau Intimate Data. "These grow exponentially the longer they are in use. And the bigger they get, the more money and resources need to be thrown at them to ensure the ERP system continues to function adequately with the right response times," says Joubert.
Just as the hardware of an ERP system needs to be upgraded to keep abreast of technological developments, so companies need to pay attention to the lifeblood of any IT system: the actual data in it and, most importantly, the quality of that data. Dirty data – data that is redundant, outdated or inconsistent – is common in organisations with no clear data quality strategy. "Without clean data, management has no guidelines for making decisions that are crucial to the success of the business," says Joubert.
What is dirty data? Data becomes dirty when it no longer mirrors reality. A customer's address may be out of date, or a postal code may be wrong. The same piece of information may even be stored in two places that disagree – in which case at least one of them is dirty.
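The checks described above – stale or invalid fields, and conflicting copies of the same record – can be expressed as simple validation rules. The following is an illustrative sketch only, not anything from Intimate Data; every field name and rule in it is a hypothetical example:

```python
# Illustrative sketch: flagging "dirty" customer records.
# All field names and validation rules here are hypothetical examples.

def find_dirty_records(records):
    """Return (record, reason) pairs for records that fail basic quality checks."""
    problems = []
    seen = {}  # customer_id -> first record stored under that id
    for rec in records:
        # Rule 1: postal code must be a 4-digit numeric code.
        code = rec.get("postal_code", "")
        if not (code.isdigit() and len(code) == 4):
            problems.append((rec, "invalid postal code"))
        # Rule 2: two copies of the same customer must agree field-for-field;
        # if they differ, at least one of them is dirty.
        cid = rec.get("customer_id")
        if cid in seen and seen[cid] != rec:
            problems.append((rec, "conflicts with earlier copy"))
        seen.setdefault(cid, rec)
    return problems

records = [
    {"customer_id": 1, "postal_code": "8001", "address": "1 Long St"},
    {"customer_id": 1, "postal_code": "8001", "address": "99 Short St"},  # conflict
    {"customer_id": 2, "postal_code": "ABC", "address": "5 Main Rd"},     # bad code
]
for rec, reason in find_dirty_records(records):
    print(rec["customer_id"], reason)
```

Real data quality tools apply far richer rule sets, but the principle is the same: each record is tested against reality checks, and failures are reported for correction.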
Data problems worsen when a company merges two or more disparate information repositories to implement an ERP system.
""Effective data analysis and ongoing data quality management is critical if ERP is to be successful,"" says Joubert. Research has shown that failure to examine the content, structure and quality of data prior to integration is one of the root causes why about 80% of data integration projects cost two thirds more than the original budget, or collapse altogether.
""As a great deal of data relates to customers, and the percentage of data that relates to customers is likely to be 20% to 30% incorrect, the more the database is de-duplicated, the better the system is likely to perform, and the less the company will have to spend on upgrading hardware,"" says Joubert.
This is in addition to a range of other direct bottom-line benefits of improved data quality, such as substantial bulk postal rebates from the South African Post Office (SAPO) for mailings that qualify on clean address data, faster payments, improved cash flow, sharper customer targeting in marketing campaigns, and a sound return on investment.
Companies that invest time and resources in ensuring ongoing data quality are positioning themselves to secure maximum value from their ERP investment. At the same time they are building a foundation for any future business changes required to keep them ahead of the game.
""Organisations that maintain the quality and integrity of their data will protect their business and bottom line, assuring their position as a major player in an ever-changing business environment,"" concludes Joubert.