September 16, 2009

Improvements to Avoid Losses Through Poor Data Quality

Filed under: Data Quality — Olga Belokurskaya @ 5:45 am

And again about data quality. According to Gartner, although the adoption of data quality tools has increased, organizations still lose more than $8 million annually to poor data quality. Moreover, most organizations remain far from comprehensive data quality processes. The main reason is that data quality tools are used mostly by IT staff: they are too complex and difficult for non-IT users to understand, which slows adoption among the business users who are supposed to work with them.

Gartner offered some recommendations intended to help organizations improve data quality:

    The first is for vendors to make data quality tools simpler to use, so that not only IT staff but also the business people responsible for the data can use them and become more accurate with their data in terms of quality.

    “In particular, providing data profiling and visualization functionality (reporting and dashboarding of data quality metrics and exceptions) to a broader set of business users would increase awareness of data quality issues and facilitate data stewardship activities,” wrote Gartner analyst Ted Friedman in the report.

    The advice for organizations is to deploy pervasive data quality controls throughout the infrastructure, and to invest in technologies that apply data quality rules to data both at the point of capture and maintenance and downstream.
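To make the idea of "applying data quality rules at the point of capture" concrete, here is a minimal sketch in Python. The rule set and field names are hypothetical, not from the Gartner report; the point is only that records can be checked against declared rules before they propagate downstream.

```python
# Hypothetical illustration: simple data quality rules applied at capture,
# so bad records are flagged before they flow downstream.
import re

# Each rule maps a field name to a predicate that must hold for the value.
RULES = {
    "email": lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
    "age": lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def validate(record):
    """Return the list of fields that violate a rule (empty means clean)."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

# Capture-time check: flag each incoming record instead of silently storing it.
incoming = [
    {"email": "user@example.com", "age": 34},
    {"email": "not-an-email", "age": 250},
]
for rec in incoming:
    violations = validate(rec)
    print(rec, "->", "clean" if not violations else f"violations: {violations}")
```

The same rule set could also feed a simple dashboard (e.g., percentage of records with violations per field), which is the kind of data quality metric reporting the quote above describes.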
