As businesses offer more multi-channel opportunities for customer interactions, they increasingly rely on data to communicate with customers and provide business intelligence. However, as much as 94 percent of businesses surveyed say that there is some level of inaccuracy in their systems, according to “Data Quality and the Customer Experience: Today’s consumer and how contact data affects relationships,” a white paper from Experian QAS.
According to the survey results, fewer than 1 percent of businesses lack a data quality strategy, and data accuracy improved by an average of 8 percentage points from last year. The most common types of errors noted in the survey are incomplete or missing data, outdated information, and duplicate data; 92 percent of organizations said they have duplicate data within their systems.
“The fast-paced, omni-channel environment often results in siloed touch-points and reduced resources,” said Thomas Schutz, SVP, general manager of Experian QAS. “To ensure a positive customer experience, many businesses are implementing new strategies to better utilize consumer intelligence and analytics. To gain a high level of insight that can create a more personalized experience across channels, organizations need to improve accuracy and incorporate data quality into strategic planning.”
Human error, cited by 65 percent of organizations, is the main cause of data problems, followed distantly by a lack of internal manual resources, an inadequate data strategy and insufficient budget; just 14 percent cited inadequate senior management support, which the authors said illustrates the importance of data quality for the C-suite.
The omni-channel environment, the report said, is changing how companies message to consumers. To create meaningful interactions and customer experiences, companies need to make real-time, dynamic offers, and that requires demographic and behavioral details to better understand individual needs.
The report offers four steps to implementing real-time, relevant customer messages.
Duplicate data is now among the most common data quality issues: 92 percent of participants admit to having duplicate data as a result of human error, multiple points of entry, multiple databases and multiple business channels. The report also offered several techniques for removing existing duplicate records from databases.
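The report's specific techniques are not reproduced here, but one common deduplication approach is exact matching on normalized contact fields. A minimal sketch of that idea, in which the record fields (`name`, `email`) and sample data are illustrative assumptions rather than anything taken from the report:

```python
# Minimal sketch of exact-match deduplication on normalized contact fields.
# The field names ("name", "email") and the sample records are hypothetical
# examples, not drawn from the Experian QAS white paper.

def normalize(record):
    """Build a match key: lowercase the name, drop punctuation and spaces,
    and lowercase/trim the email address."""
    name = "".join(ch for ch in record["name"].lower() if ch.isalnum())
    email = record["email"].strip().lower()
    return (name, email)

def deduplicate(records):
    """Keep only the first record seen for each normalized match key."""
    seen = {}
    for rec in records:
        key = normalize(rec)
        if key not in seen:
            seen[key] = rec
    return list(seen.values())

customers = [
    {"name": "Jane Doe", "email": "jane@example.com"},
    {"name": "jane doe ", "email": "JANE@example.com"},  # same person, re-keyed
    {"name": "John Smith", "email": "john@example.com"},
]
unique = deduplicate(customers)  # the two Jane Doe entries collapse to one
```

Real-world merge/purge tools typically go further, using fuzzy matching and address standardization rather than exact keys, but the normalize-then-group pattern above is the core of most of them.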
There are several steps businesses can take to eliminate human error, the report said.
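One widely used safeguard against keying mistakes is validating contact data at the point of entry. A minimal sketch of that idea follows; the field rules (a simple email pattern, a 5-digit ZIP code) are illustrative assumptions, not recommendations from the report:

```python
import re

# Illustrative point-of-entry validation to catch common human errors.
# The rules below (basic email shape, 5-digit US ZIP) are assumptions made
# for this sketch, not requirements from the Experian QAS white paper.

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
ZIP_RE = re.compile(r"^\d{5}$")

def validate_contact(record):
    """Return a list of problems found in a contact record (empty if clean)."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("missing name")
    if not EMAIL_RE.match(record.get("email", "")):
        errors.append("invalid email")
    if not ZIP_RE.match(record.get("zip", "")):
        errors.append("invalid ZIP code")
    return errors

# A mistyped entry is flagged before it ever reaches the database.
problems = validate_contact({"name": "Jane Doe", "email": "jane@exmaple", "zip": "0210"})
```

Rejecting or flagging bad records at the form or import stage is far cheaper than cleansing them downstream, which is why entry-time validation is usually the first control put in place.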
The most common problem, according to those surveyed, is sending mailings to the wrong address, followed by sending multiple mailings to the same customer and staff inefficiencies. Thirty-two percent said inaccurate contact data negatively influences customer perception, and 29 percent said they had lost a customer because of inaccurate data input.