Data Quality

Data quality has been defined as "the quality of data's content and structure (according to varying criteria), plus the standard technology and business practices that improve data, such as name-and-address cleansing, matching, house-holding, de-duplication, standardization, and appending third-party data" (TDWI, 2006). In simpler terms, data quality means ensuring that the information consumed by end users aligns with their expectations.
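Two of the practices named in that definition, standardization and de-duplication, can be illustrated with a minimal sketch. The record layout, field names, and normalization rules below are illustrative assumptions, not a description of any particular tool:

```python
# Illustrative sketch of standardization and de-duplication.
# Field names ("name", "address") and the normalization rules are assumptions.

def standardize(record):
    """Normalize casing and whitespace so equivalent values compare equal."""
    return {k: " ".join(v.split()).upper() for k, v in record.items()}

def deduplicate(records):
    """Drop records that are duplicates after standardization, preserving order."""
    seen, unique = set(), []
    for rec in records:
        key = tuple(sorted(standardize(rec).items()))
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

customers = [
    {"name": "Jane Doe",  "address": "12 Oak St"},
    {"name": "jane  doe", "address": "12 oak st"},  # same person, messy entry
    {"name": "John Roe",  "address": "9 Elm Ave"},
]

print(len(deduplicate(customers)))  # 2 records remain
```

Real cleansing tools go further (fuzzy matching, reference data, householding), but the core idea is the same: normalize first, then compare.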

The old adage of "garbage in, garbage out" still holds true. The most sophisticated information platform is useless if the underlying data cannot be trusted. Inaccurate and incomplete data causes end users first to distrust, and then to abandon, centralized data environments.

Data Quality Assessment

As part of Edgewater Consulting's best practices, we conduct Data Quality Assessments to identify foundational data issues early. The assessment minimizes negative project impacts that would compromise our clients' ability to maximize return on investment (ROI).

Data Quality Roadmap

While the Data Quality Assessment focuses on identifying potential data quality issues, the Data Quality Roadmap outlines the plan for remediation and ongoing monitoring of overall data quality. This includes:

  • The organizational constructs that need to be in place for ongoing monitoring and remediation (e.g., Data Stewardship)

  • The processes and best practices for the detection and proactive prevention of data quality issues

  • The enabling technologies in both the data integration architecture and the monitoring capability
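The detection and monitoring called for above often takes the form of rule-based checks run against incoming data. The rules and field names in this sketch are illustrative assumptions; the point is the pattern of counting violations per rule so the counts can be tracked over time:

```python
# Illustrative sketch of rule-based data quality detection.
# Each rule flags rows that violate a quality expectation.
# Rule definitions and field names are assumptions, not a specific product's API.

import re

RULES = {
    "missing_name":     lambda row: not row.get("name"),
    "bad_zip":          lambda row: not re.fullmatch(r"\d{5}", row.get("zip", "")),
    "negative_balance": lambda row: row.get("balance", 0) < 0,
}

def assess(rows):
    """Count violations per rule -- a simple input to ongoing monitoring."""
    report = {rule: 0 for rule in RULES}
    for row in rows:
        for rule, violated in RULES.items():
            if violated(row):
                report[rule] += 1
    return report

sample = [
    {"name": "Jane Doe", "zip": "01230", "balance": 100.0},
    {"name": "",         "zip": "1230",  "balance": -5.0},
]

print(assess(sample))  # {'missing_name': 1, 'bad_zip': 1, 'negative_balance': 1}
```

Trending these counts from run to run turns one-off detection into the ongoing monitoring the roadmap describes.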

Implementing a comprehensive data quality platform is not done overnight. Edgewater's approach is to phase in the required organizational, process, and technology capabilities over time, in step with business priorities and the business capabilities being implemented.
