“The perception or an assessment of the fitness of data to serve its purpose within a given context.”

Within an insurance organization, acceptable data quality is crucial to operational and transactional processes and to the reliability of business analytics and business intelligence (BI) reporting. Data quality is affected by the way data is entered, stored and managed. Data quality assurance (DQA) is the process of verifying the reliability and effectiveness of data.

Maintaining data quality requires going through the data periodically and cleaning it. Typically this involves updating, standardizing, and de-duplicating records to create a single view of the data, even when it is stored in multiple disparate systems.
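
As an illustration only, a minimal cleansing sketch along these lines might look like the following. The file name and the columns (name, postcode, policy_id, last_updated) are assumptions for the example, not a prescribed implementation:

```python
import pandas as pd

# Load a hypothetical policyholder extract (file and column names are assumed).
records = pd.read_csv("policyholders.csv")

# Standardize: trim whitespace and normalize case so equivalent values compare equal.
records["name"] = records["name"].str.strip().str.title()
records["postcode"] = records["postcode"].str.replace(" ", "", regex=False).str.upper()

# De-duplicate: keep the most recently updated record per policy to form a single view.
single_view = (
    records.sort_values("last_updated")
           .drop_duplicates(subset=["policy_id"], keep="last")
)

single_view.to_csv("policyholders_single_view.csv", index=False)
```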

Aspects of data quality include:

  • Accuracy
  • Completeness
  • Update status
  • Relevance
  • Consistency across data sources
  • Reliability
  • Appropriate presentation
  • Accessibility

Best Practices for Data Quality Management

1. Analyse Data Quality

To assess data quality, a complete analysis of the current state of your data needs to be performed on a regular basis. Information with errors, inconsistencies, duplicates or missing fields can often be difficult to identify and correct, largely because bad data can be buried deep within legacy systems or received from external sources such as brokers, cover-holders, third-party data providers, external applications and social media channels like Facebook and Twitter.

Conducting an independent analysis will provide your organization with an in-depth report containing accurate and detailed statistics about the quality of your organization’s data. A data quality management strategy tailored to your specific organizational needs can then be formulated or refined, and data governance policies can be developed to address specific data management requirements.
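
By way of illustration, a minimal profiling pass over an assumed policy extract might compute the kind of statistics such a report would contain. The file name, column names and rules below are hypothetical rather than a prescribed approach:

```python
import pandas as pd

policies = pd.read_csv("policies.csv")  # hypothetical extract

profile = {
    # Completeness: fraction of missing values in each column.
    "missing_fraction": policies.isna().mean().round(3).to_dict(),
    # Duplicates: rows sharing the same (assumed) natural key.
    "duplicate_policies": int(policies.duplicated(subset=["policy_number"]).sum()),
    # Validity: premiums are expected to be positive.
    "non_positive_premiums": int((policies["annual_premium"] <= 0).sum()),
    # Consistency: cover end date should not precede the start date.
    "inconsistent_dates": int(
        (pd.to_datetime(policies["end_date"]) < pd.to_datetime(policies["start_date"])).sum()
    ),
}

for check, result in profile.items():
    print(f"{check}: {result}")
```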

2. Build a Data Quality Firewall

Data is a strategic information asset, and the organization should treat it as such. Like any other corporate asset, the data contained within your organization’s information systems has financial value, and that value increases with the number of people who are able to use it. Feeding inaccurate data into your data warehouse or mastering systems will not only make it difficult to obtain accurate business insights and gather actionable information, it will also contaminate the good data you already hold.

A virtual data quality firewall detects and blocks bad data at the point it enters the environment, proactively preventing it from polluting enterprise information sources. A comprehensive data quality management solution that includes a data quality firewall will dynamically identify invalid or corrupt data as it is generated or as it flows in from external sources, based on pre-defined business rules.
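
A minimal sketch of such point-of-entry validation, with made-up field names and business rules standing in for whatever your organization defines, could look like this:

```python
from datetime import date

# Hypothetical business rules: each returns an error message, or None if the check passes.
RULES = [
    lambda r: "missing policy_number" if not r.get("policy_number") else None,
    lambda r: "premium must be positive" if r.get("annual_premium", 0) <= 0 else None,
    lambda r: "inception date is in the future"
        if r.get("inception_date", date.min) > date.today() else None,
]

def firewall(record: dict) -> tuple[bool, list[str]]:
    """Evaluate an incoming record against the rules before it reaches downstream systems."""
    errors = [msg for rule in RULES if (msg := rule(record)) is not None]
    return (not errors, errors)

# Records that fail are rejected or quarantined for remediation rather than
# being loaded into the warehouse or mastering systems.
ok, errors = firewall({"policy_number": "P-1001", "annual_premium": 450.0,
                       "inception_date": date(2024, 1, 1)})
print(ok, errors)
```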

3. Combine Data Management and Business Intelligence

Even the best data governance policies are not, on their own, enough to protect your data. The sheer volume of data that flows through enterprise systems can make it particularly challenging to maintain peak data quality at all times. It simply isn’t possible to manage quality record by record, or to attempt to govern every piece of data collected by your organization. The key to success is identifying and prioritising the type and volume of data requiring data governance.

Business intelligence (BI) solutions allow you to determine which data sets are most appropriate to utilise and target for quality management and governance. Data management processes can then be set up to collect that data (e.g. customer buying preferences or policy purchasing information) and move it to a repository for cleansing and analysis as a high priority.
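
As a rough sketch, with the priorities, data set names and staging step all assumed for illustration, this kind of BI-informed prioritisation might be expressed as:

```python
# Priorities assumed to come from BI analysis: higher values are cleansed first.
GOVERNANCE_PRIORITY = {
    "customer_buying_preferences": 3,
    "policy_purchases": 3,
    "claims_notes": 2,
    "marketing_clickstream": 1,
}

def cleansing_order(incoming: dict[str, list[dict]]) -> list[str]:
    """Order incoming data sets so high-priority data reaches the cleansing
    repository first; unrecognised data sets default to the lowest priority."""
    return sorted(incoming, key=lambda name: GOVERNANCE_PRIORITY.get(name, 0), reverse=True)

batches = {
    "marketing_clickstream": [{"page": "/quote"}],
    "policy_purchases": [{"policy_number": "P-1001", "product": "motor"}],
}
print(cleansing_order(batches))  # policy purchases are staged ahead of clickstream data
```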

4. Make Business Users Data Stewards

Business users need to take ownership of the data they help to create and feed into internal IT systems, which has prompted many insurance companies to create data governance roles that manage data quality from end to end.

The data governance director, selected from a business group, will be the primary focal point for all data-related needs within that group. Some organizations have multiple data governance roles to represent different areas of the business. These data overseers take a leadership role in resolving data integrity issues, and act as liaisons with the IT group that manages the underlying information management infrastructure.

The primary objective of instituting a data governance board is to mitigate the business risks that arise from highly data-driven decision-making processes and systems. These boards include business and IT users who are responsible for setting data policies and standards, ensuring that there is an effective mechanism for resolving data-related issues, facilitating and enforcing data quality improvement efforts, and taking proactive measures to stop data-related problems before they occur.

In Conclusion

Successful data governance starts with a solid, well-defined data management strategy, and relies upon the selection and implementation of a cutting-edge data quality management solution. The key to effective data quality management is to create data integrity teams comprising both IT staff and business users, with business users taking the lead and retaining primary ownership for preserving the quality of incoming data. While data integrity teams will drive the data quality management plan forward, it is also important to have a comprehensive data quality management solution in place. This makes the strategy more effective by enabling data governance professionals to profile, transform and standardize information.

Contact us to find out more about how we can help.