A lot of our customers engage us in some form of data cleansing when implementing Dynamics CRM. Why? Because a company is only as good as the data set it works from, and bad data can lead to inefficiencies and quality issues. Data cleansing and management is often a project in and of itself that requires dedicated attention and resources. There are many obvious advantages to going through this process BEFORE you move your data and begin using the new software, and it is highly recommended as a best practice.
Ongoing Data Quality Management
But what about after the implementation? Certain data is constantly changing and evolving, and if it isn't managed properly, you could end up with an unreliable data set despite your earlier efforts to clean it up. Shouldn't there be a strategy or plan for continuous Data Quality Management? We think so, and so does Microsoft, which is why CRM includes a few out-of-the-box features, including duplicate detection and auditing, to help with this dilemma. The auditing feature can audit records, groups of records, and individual fields. It also tells the user when auditing has been turned on or off for records and fields, and which user was responsible. These data quality management features certainly help an organization keep its data clean and may be sufficient for some companies, but other organizations need a more robust solution and plan to manage their data.
The out-of-the-box auditing function does not let the user take action on the audited records. The user needs more control over which fields get audited and, more specifically, over exactly what to do with the audit logs once they are gathered. In light of this limited functionality, we have developed a tool to help with the process.
The DQM Solution
Our Data Quality Management solution (DQM for short) leverages the existing CRM audit rules and workflows to let an organization continuously audit its data. It is customizable within CRM, allowing users to monitor only the data that is most important to them and to create "Rules" for auditing data as they see fit. These rules can apply to system or custom entities, and to system or custom fields within those entities. Rules can be grouped by user or team, and can be prioritized as well. Once the rules are in place, audit logs are created and the "Auditor" has the ability to review and perform actions on the record such as accept, revert, or pend.
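As a rough illustration of the idea, a rule can be thought of as a small record tying an entity and field to an owning user or team and a priority. Everything below is a hypothetical sketch (the class, field names, and example entities are illustrative assumptions, not the actual DQM schema):

```python
from dataclasses import dataclass

@dataclass
class AuditRule:
    """Illustrative model of a DQM rule; names are assumptions,
    not the solution's actual schema."""
    entity: str    # system or custom entity, e.g. "account"
    field: str     # system or custom field, e.g. "telephone1"
    owner: str     # user or team responsible for reviewing changes
    priority: int  # lower number = reviewed first

# Example: the data team watches phone numbers on accounts and
# email addresses on contacts
rules = [
    AuditRule(entity="contact", field="emailaddress1", owner="Data Team", priority=2),
    AuditRule(entity="account", field="telephone1", owner="Data Team", priority=1),
]

# Changes to watched fields would surface to the owner in priority order
for rule in sorted(rules, key=lambda r: r.priority):
    print(f"{rule.owner} audits {rule.entity}.{rule.field}")
```

The point of the priority field is simply that when many rules fire at once, the auditor's queue can be ordered by importance rather than arrival time.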
Management Commitment and Data Management Stewards
I mentioned that the solution is set up for an individual or team. That's because a team of specific users should be designated to be responsible for ongoing data management. Our customers that have teams to manage data quality tend to have fewer issues with their data. These do not have to be large teams with many members. A few select individuals who act as data gatekeepers and are given control over changes to your data are sufficient.
How Does the Solution Work
The basic setup is as follows:
Rules are created to audit specific fields
When those fields change, an audit log entry is created and put into a view for an auditor to review the changes and take action.
The auditor can choose to Ignore the field change, Revert the field to its value before the change, or Override the change and manually enter a field value.
When action is taken, there is still a visible record of the field's audit trail, including the name of the field, its value before the change, the approved value, and its current value.
The tool also records when the review took place and which user was doing the reviewing.
The “audit logs” view includes different filters so the auditor has control over which entries they are viewing.
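The steps above can be sketched in plain Python. This is only an illustrative model of the flow (the log-entry fields, action names, and `review` helper are all assumptions for the sake of the example, not the solution's actual implementation):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AuditLogEntry:
    """Hypothetical model of one audit log entry."""
    field_name: str
    old_value: str
    new_value: str
    action: Optional[str] = None          # "ignore", "revert", or "override"
    approved_value: Optional[str] = None  # value the auditor signed off on
    reviewed_by: Optional[str] = None     # who performed the review
    reviewed_at: Optional[datetime] = None  # when the review took place

def review(entry: AuditLogEntry, reviewer: str, action: str,
           override_value: Optional[str] = None) -> AuditLogEntry:
    """Apply an auditor's decision, recording who reviewed it and when."""
    entry.reviewed_by = reviewer
    entry.reviewed_at = datetime.now(timezone.utc)
    entry.action = action
    if action == "ignore":
        entry.approved_value = entry.new_value  # keep the change
    elif action == "revert":
        entry.approved_value = entry.old_value  # restore the prior value
    elif action == "override":
        entry.approved_value = override_value   # auditor supplies a value
    return entry

# A field change creates a log entry; the auditor reverts the bad edit.
entry = AuditLogEntry(field_name="telephone1",
                      old_value="555-0100", new_value="5550100x")
review(entry, reviewer="jdoe", action="revert")
```

Note that the entry keeps the full trail after the decision: the field name, the value before the change, the approved value, and the reviewer and timestamp all remain visible.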
We have deployed this solution successfully for several of our customers and have improved their ongoing data quality management efforts. Our solution, coupled with a data quality management strategy, will give any organization more control over, and more confidence in, its data set, ultimately leading to more efficient processes and better decision making.