Duplicate Lead Management

Stacks42
Participant

I have been merging contact records via the Data Quality tool, and now I have boatloads of contacts with multiple open lead records! What do I do?? Is there an easy way to clean up and manage these duplicate leads? Thanks.

0 Likes
1 Accepted solution
karstenkoehler
Solution
Hall of Famer | Solutions Partner

Hi @Stacks42,

 

As far as I know, this is currently not possible natively, and integrations like Koalify, Insycle, or Dedupely don't support it yet either.

 

I would definitely advise against flat-out bulk closing or disqualifying a large number of leads: some of them could currently be worked, and you can't be sure you're deleting the right half of each duplicate pair.

 

Depending on the volume, your best course of action is likely to create a filtered leads view: https://knowledge.hubspot.com/records/create-and-manage-saved-views

 

In this view, sort the leads by name (which should mostly "group" your duplicates) and display columns that help you judge which record to keep: activity dates, current stage, owner, etc.

 

Then select the checkbox at the beginning of each row to delete the ones you consider duplicates.
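If the volume is too high to eyeball in a view, the same triage logic can be scripted against an export of your open leads. A minimal sketch, assuming you've pulled each lead's associated contact ID and last activity date (the field names `contact_id`, `last_activity`, and `id` are illustrative, not HubSpot's actual property names):

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def split_keep_and_close(leads: List[Dict]) -> Tuple[List[Dict], List[Dict]]:
    """Group open leads by associated contact; keep the most recently
    active lead per contact and flag the rest as duplicate candidates."""
    by_contact = defaultdict(list)
    for lead in leads:
        by_contact[lead["contact_id"]].append(lead)

    keep, close = [], []
    for group in by_contact.values():
        # Most recent activity first (ISO dates sort correctly as strings)
        group.sort(key=lambda l: l["last_activity"], reverse=True)
        keep.append(group[0])    # this lead survives
        close.extend(group[1:])  # these are candidates to close/delete
    return keep, close
```

The output still deserves a manual spot check before deleting anything, in line with the caution above about bulk actions.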

 

As for solving the core issue: the cause of duplicates is typically not the contact merges themselves but the fact that HubSpot's default pipeline automation is creating leads while additional sources (workflows, manual creation) create leads on top of that. Using a custom report (data source: Lead), you can check where these leads originate.

 

Workflows that create leads, for example, can easily be set up to first check in a branch whether an open lead already exists, and only create a new lead record if there isn't one.
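Outside of workflows, the same guard applies to custom code or integrations: search for an open lead on the contact first, and only create one if the search comes back empty. A hedged sketch against HubSpot's CRM v3 search endpoint; the searchable property names here (`associations.contact`, `hs_pipeline_stage`) and the "closed" stage value are assumptions, so verify them for the leads object in your portal:

```python
import json
from urllib import request

def open_lead_search_payload(contact_id: str) -> dict:
    """Build a CRM search body for open leads tied to one contact.
    Property names and the stage value are assumptions -- check the
    searchable lead properties in your own portal."""
    return {
        "filterGroups": [{
            "filters": [
                {"propertyName": "associations.contact",
                 "operator": "EQ", "value": contact_id},
                {"propertyName": "hs_pipeline_stage",
                 "operator": "NEQ", "value": "closed"},
            ]
        }],
        "limit": 1,  # we only need to know whether any open lead exists
    }

def has_open_lead(contact_id: str, token: str) -> bool:
    """POST the search and report whether any open lead matched."""
    req = request.Request(
        "https://api.hubapi.com/crm/v3/objects/leads/search",
        data=json.dumps(open_lead_search_payload(contact_id)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=10) as resp:
        return json.load(resp).get("total", 0) > 0
```

Only when `has_open_lead(...)` returns `False` would the integration go on to create a new lead record.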

 

Best regards

Karsten Köhler
HubSpot Freelancer | RevOps & CRM Consultant | Community Hall of Famer

Schedule a consultation with Karsten

 

Did my post help answer your query? Help the community by marking it as a solution.

View solution in original post

2 Replies

Pankaj-Sharma
Participant

Hi,
This usually happens when contacts are merged but the associated lead records aren't consolidated. The easiest fix is to filter for leads where the same contact has multiple open records, keep one primary lead, and bulk update the rest to Closed or Disqualified. Going forward, review the duplicate rules in the Data Quality tool to prevent this from happening again.

Hope this helps.
Pankaj

0 Likes