Good asset management practice is essential to running any utility safely, efficiently and in compliance with myriad demands from industry, regional and legislative bodies. But data deficiencies can easily undermine confidence in asset management processes and force ‘finger-in-the-air’ decision making and reporting. Introducing a data quality management process can quickly identify the issues, support their resolution and restore trust.
Physical assets dominate the balance sheets of most utility companies, with figures often shown in billions. Asset management is the process of optimizing these assets’ contribution to the business and its customers while meeting health, safety and environmental standards.
From an operational point of view, an effective asset management function needs to have answers to questions like: What do we own? When did we buy it? Where is it? What’s it made of? What are its exact specifications? When was it last maintained? When does it next need inspecting? When will it need replacing? There are also additional financial governance demands for facts such as: What did we pay for our assets? Do we still own them? What are they worth today?
Access to reliable data to answer these questions is essential to supporting the asset lifecycle from purchase and installation, through usage and maintenance, to end-of-life disposal and replacement. Accurate details are necessary to support capital expenditure (CAPEX) optimization and to control operating costs (OPEX). They are also essential to ensuring accurate asset reporting to local and industry bodies and compliance authorities.
While it’s crucial that utilities are confident in their asset data, cross-industry evidence suggests that such confidence could be misplaced.
According to Real Asset Management International, as much as 50 percent of assets on the books of multi-nationals are either poorly described or no longer in use and cannot be located during a physical audit. Furthermore, one asset management consulting firm judges from over a decade of experience that as much as 65 percent of fixed asset data is incomplete, inaccurate, or altogether missing.
Where assets on the register are no longer in operational use, or no longer exist, the business may be presenting false financial accounts and compliance reports. It may also be overpaying for asset insurance. Additionally, by applying depreciation costs to ‘ghost’ assets, a company could be under-calculating the profits it declares to taxation authorities. A lack of consistent, complete and accurate information could lead to assets operating beyond their scheduled lifetimes, missing regular maintenance or being used outside of their performance parameters. Unplanned and expensive maintenance, asset failures and brand-damaging service interruptions could result. In the worst cases, such as the catastrophic failure of a gas main or a water pumping station, health and safety rules could be breached, the public put at risk or the environment damaged.
The better the asset data, the better the visibility. Asset managers can communicate more clearly about the status of assets at each stage of their lifecycle, data-driven processes become more effective, and the right actions are taken at the right time more often. It also permits better financial and compliance reporting.
Three steps to achieving reliable asset data
Analyst firm Aberdeen reports that “Best in Class” organizations respond to the asset data quality challenge by implementing processes that allow for a better understanding of their assets.
The key to ensuring good data for asset management is to create a data quality compliance process. This must ensure that what is allowed to enter corporate systems and processes meets required standards for cleanliness, relevance and timeliness. There are three main data compliance steps:
The first step is to assess the quality of existing data and its degree of reliability and consistency for asset management. Get quantified insight into the quality of your data. Data profiling enables you to fully understand the issues in your data and determine what steps need to be taken to remedy them. Specialist data quality software automates this process, enabling you to incorporate your own rules, so the data is validated not only for quality but also for relevance to your utility’s specific needs.
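The article doesn’t prescribe a particular tool, but a data-profiling pass can be sketched in a few lines. In this illustrative example (field names, rules and sample records are all hypothetical), each field of a toy asset register is checked for missing values and rule violations, producing a count of issues per field:

```python
# Minimal data-profiling sketch: counts missing values and rule
# violations per field in a toy asset register. Field names and
# validation rules are illustrative, not from any specific product.
from datetime import date

assets = [
    {"asset_id": "A-001", "type": "pump", "install_date": "2015-03-01", "location": "Site 4"},
    {"asset_id": "A-002", "type": "", "install_date": "2031-01-01", "location": None},
    {"asset_id": "A-003", "type": "valve", "install_date": "2018-07-15", "location": "Site 9"},
]

def plausible_date(s):
    """An install date must parse as ISO format and not lie in the future."""
    try:
        return date.fromisoformat(s) <= date.today()
    except (TypeError, ValueError):
        return False

# Each rule returns True when the value is acceptable.
rules = {
    "type": lambda v: isinstance(v, str),
    "location": lambda v: isinstance(v, str),
    "install_date": plausible_date,
}

def profile(records, field_rules):
    """Return {field: number of records where the field is missing or invalid}."""
    issues = {field: 0 for field in field_rules}
    for rec in records:
        for field, is_valid in field_rules.items():
            value = rec.get(field)
            if value in (None, "") or not is_valid(value):
                issues[field] += 1
    return issues

report = profile(assets, rules)
print(report)  # {'type': 1, 'location': 1, 'install_date': 1}
```

A report like this turns a vague sense of “dirty data” into measurable counts that can be tracked as the remediation steps below are applied.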
The second step is for the data quality software to convert these rules into processes that transform and correct the data into a common format. Many utilities have multiple sources of asset data. A standardized and corrected asset record ensures it will match associated data coming via other channels and from historical legacy systems of data collection. This ensures that associated data is linked to the correct asset, along with any available reference data that further describes asset properties, such as supplier details.
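The transform-to-a-common-format step can be illustrated with a small sketch. Here, records for the same physical asset held in two hypothetical source systems (a GIS and a finance ledger, with invented field names) are mapped to shared field names and a normalized key, so they link up:

```python
# Standardization sketch: maps asset records from two hypothetical
# source systems into one common format so they match on a shared key.
# The systems, field names and mappings are illustrative assumptions.

def standardize(record, mapping):
    """Rename fields per the source-to-common mapping and normalize the key."""
    out = {common: record.get(src) for src, common in mapping.items()}
    # Normalize the identifier: trim, upper-case, use '-' as the separator.
    out["asset_id"] = out["asset_id"].strip().upper().replace(" ", "-")
    return out

# The same physical asset as seen by two different systems.
gis_record = {"AssetRef": "a 1042", "Material": "ductile iron"}
finance_record = {"tag_no": "A-1042", "book_value": 125000.0}

gis_std = standardize(gis_record, {"AssetRef": "asset_id", "Material": "material"})
fin_std = standardize(finance_record, {"tag_no": "asset_id", "book_value": "book_value"})

# After standardization the two views share one key and can be merged.
assert gis_std["asset_id"] == fin_std["asset_id"] == "A-1042"
merged = {**gis_std, **fin_std}
print(merged)
```

The point of the sketch is the ordering: normalization happens before matching, so records that describe the same asset but were keyed inconsistently still reconcile.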
Finally, the same repeatable process created for step two can also be embedded into your asset management-centric systems to automate the validation and correction of data at point of capture. Asset managers and supporting teams will all have a high level of data consistency, quality and reliability serving their specific business requirements without the latency and cost problems commonly associated with data reconciliation.
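To show what embedding validation at the point of capture might look like, here is a minimal sketch (record fields and rules are hypothetical): incoming records are corrected where possible and rejected where not, before they ever reach the register:

```python
# Point-of-capture validation sketch: the same kind of rules used in
# the audit are reapplied when a new record is entered, so bad data is
# corrected or rejected before it reaches the asset register.
# Field names and rules are illustrative assumptions.

register = []  # stand-in for the asset system of record

def capture(record):
    """Validate (and lightly correct) a record at the point of entry."""
    rec = dict(record)
    rec["asset_id"] = rec.get("asset_id", "").strip().upper()
    if not rec["asset_id"]:
        raise ValueError("asset_id is mandatory")
    if not rec.get("type"):
        raise ValueError("type is mandatory")
    register.append(rec)
    return rec

capture({"asset_id": " a-2001 ", "type": "transformer"})  # corrected, then stored
try:
    capture({"asset_id": "A-2002", "type": ""})  # rejected at capture
except ValueError as err:
    print("rejected:", err)

print(len(register))  # 1
```

Because the rules run once, at entry, downstream consumers see consistent data without a separate reconciliation pass, which is the latency and cost saving the article describes.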
Reap the benefits
By implementing an asset data quality process you will improve, and be able to prove, the reliability of the organization’s view of its assets. Moving from ‘best guesstimates’ to trusted asset information is essential to determining existing and lifetime asset values for accounting (CAPEX) purposes and for compliance reporting. It will provide an accurate view to maintenance operations, extending asset life, reducing costs (OPEX) and improving safety. Additionally, a clearer view of replacement cycles and a trusted insight into existing inventory will optimize procurement spend, reducing the frequency of unnecessary purchases. The time required for manual data analysis and reconciliation will fall.
Ed Wrazen is VP Marketing International with Harte-Hanks Trillium Software, a leading provider of total data quality solutions. Working with customers, partners and industry analysts, Ed is responsible for the international marketing operations of Trillium Software. With over 30 years’ experience in Information Management, Ed started his career as a computer programmer on retail banking and travel reservation systems. He has been heavily involved in database and data management technologies as a product developer, consultant and lecturer, specialising in data architecture, performance design, data integration and data quality. He is a regular speaker at industry events worldwide and an author on data management, data governance and data quality topics.
Edited by Brooke Neuman