
Preventing Duplicate Records in Salesforce.com

Duplicate Leads, Contacts, and Accounts are a major issue hampering the quality and productivity of Salesforce.com data. System Administrators would love to handle it through configuration changes, such as making a field Unique and Required. Unfortunately, it is not that easy: a record is rarely a duplicate based on a single custom field, and the Unique and Required properties are not available for standard fields. Many organizations choose to allow duplicates and remove them later through some kind of deduping process.
The solution I am proposing is based on a real-life situation where the client company had millions of contacts and accounts. Uniqueness was based on a combination of multiple fields, for example First Name, Last Name, State, Country, and Zip Code. Their developers had written Apex code in a before trigger to select all records matching these fields during insert or update and, if a match was found, to prevent the record from saving by throwing an error. This worked quite well in a test sandbox, but once deployed to production the performance took a heavy toll and the code was removed from the trigger logic. I was asked to look into the issue. The solution was simple and worked quite well without any noticeable performance degradation.

A custom text field called UniqueKey, long enough to hold the concatenated string of First Name, Last Name, State, Country, and Zip, was defined. The field was made Unique and Required. Trigger logic (before insert/update) was added to normalize these fields by converting each field to lower case and removing extra spaces and padding, if any. All these fields were then concatenated and assigned to UniqueKey. And that's it: Salesforce prevented duplicates very efficiently and effectively.
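A minimal sketch of that trigger logic, assuming a custom field named UniqueKey__c on Contact (the class, trigger, and field names here are illustrative, not the original client code):

```apex
// Hypothetical helper class; names are assumptions.
public class UniqueKeyHelper {
    // Lower-case, trim, and collapse runs of internal whitespace.
    public static String normalize(String s) {
        return s == null ? '' : s.trim().toLowerCase().replaceAll('\\s+', ' ');
    }

    // Build the concatenated key for each record in the trigger batch.
    public static void setUniqueKey(List<Contact> contacts) {
        for (Contact c : contacts) {
            c.UniqueKey__c = normalize(c.FirstName) + '|' +
                             normalize(c.LastName) + '|' +
                             normalize(c.MailingState) + '|' +
                             normalize(c.MailingCountry) + '|' +
                             normalize(c.MailingPostalCode);
        }
    }
}

trigger ContactUniqueKey on Contact (before insert, before update) {
    UniqueKeyHelper.setUniqueKey(Trigger.new);
}
```

Because the field is marked Unique, the database itself rejects a second record with the same key, so no SOQL query against existing records is needed; that is why this scales where the query-based trigger did not. A separator such as '|' keeps different field combinations from colliding (e.g. "ab" + "c" versus "a" + "bc").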

Naturally, a one-time batch process was written to populate UniqueKey for the existing records.
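That back-fill might be sketched as a batch Apex job like the following, assuming the before insert/update trigger logic described above recomputes the key on every save (the class name is an assumption):

```apex
// Hypothetical one-time batch to populate UniqueKey__c on existing Contacts.
global class UniqueKeyBackfill implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            'SELECT Id FROM Contact WHERE UniqueKey__c = null');
    }

    global void execute(Database.BatchableContext bc, List<SObject> scope) {
        // The before-update trigger recomputes UniqueKey__c on save.
        // allOrNone=false lets the batch continue past records that
        // collide with an already-populated key (i.e. pre-existing duplicates).
        Database.update(scope, false);
    }

    global void finish(Database.BatchableContext bc) {}
}
// Run once from Anonymous Apex, e.g.:
// Database.executeBatch(new UniqueKeyBackfill(), 200);
```

Note that pre-existing duplicates will fail the unique constraint during the back-fill; passing allOrNone=false leaves them unkeyed so they can be reviewed and merged separately.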



  1. vardhan gupta says:

    one can do the same thing by creating a hidden field (make that field an external ID) and then writing a workflow rule to concatenate all the required fields into that field.

    By making it an external ID, we can enforce uniqueness and at the same time make all the fields searchable from the sidebar search.

    One does not need to write a custom trigger to achieve the same.
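    For reference, the field-update formula for that hidden external-ID field might look like the following (field names are assumptions; LOWER and TRIM are standard formula functions, though note TRIM only strips leading and trailing spaces and does not collapse internal whitespace the way the trigger approach does):

    ```
    LOWER(TRIM(FirstName)) & "|" &
    LOWER(TRIM(LastName)) & "|" &
    LOWER(TRIM(MailingState)) & "|" &
    LOWER(TRIM(MailingCountry)) & "|" &
    LOWER(TRIM(MailingPostalCode))
    ```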

  2. Arun Gaikwad says:

    Workflows always run after the update, which makes this a little inefficient: the record is saved first and then rolled back if there is an error.

  3. Neha says:

    Thank you very much for sharing this. It's very helpful.
    Nice workaround.
    Thanks a lot.
    Keep sharing.

  4. Mark Emmett says:

    Interesting, but I am not seeing how this is going to help us, as surely this will only flag identical matches? Is there a function you can use to calculate the similarity with another record, i.e. how many of the characters or what percentage of the string is the same?

  5. Recently I attended our local Salesforce user group here in Boise, and this exact topic was discussed. As other posters have pointed out, there are multiple ways to handle duplicates in your system; however, I would like to propose something different based on our discussion. We were fortunate enough to have a representative from CRM Fusion who was kind enough to demonstrate the approach used by their company (products they produce and sell through the AppExchange). And just in case: no, I am not a paid rep, and I receive nothing for making this suggestion.

    Now, I realize this is a paid solution and may not be for everyone, but the flexibility of the tools, with matching and merging on multiple fields (and yes, you can change the definitions in case what you thought was unique was not), far outweighs the cost of duplicate marketing efforts, locking someone in a room for days to pull the duplicates out of the system, lost productivity for the sales and IT teams, and so on.

    My suggestion is to give it a shot. Having been in IT for over 32 years and responsible for terabytes of data over that time, I could not help but smile at the flexibility the tools offered, and I could not help but wonder how much time my teams and I would have had for more important things if only we had had tools like these years ago.

    Regardless of your approach, your de-duplication efforts should ideally be part of a monthly cycle in your business processes to reduce the impact of doing it all in one shot at the end of the year. Good luck and happy deduping.

    • Arun Gaikwad says:

      Thanks for the feedback.
      The problem with the AppExchange tools is that they expect clean data, such as names and addresses. I think it should be the tool's responsibility to guarantee clean data.

  6. Eric says:

    Is there a way to create a more legible error message? The current unique-field error message on the concatenated field is unclear.

  7. thetseng says:

    Sounds like a good solution for our problem. Can you please tell me where to find the code for the trigger? Thanks!
