
Profile Hanging on Duplicate Search when API code is included

I have an export file from our third-party fundraising website that places the home address in one set of columns (e.g., G-K) and the business address in another set of columns (e.g., M-Q). Within my IOM profile I've mapped G-K to Constituent Address fields and added the following code in the API:

Public Overrides Sub AfterDictionaries(ByVal Cancel As ImportOM.API.iCancel)

    'Set AddressType to Home
    If Import.Fields.GetByName("HomePreferredAddress").Value = "y" Then
        Import.Fields.GetByName("Address Type").Value = "Home"
    End If

    'Move data as needed for a business address
    If Import.Fields.GetByName("BusinessPreferredAddress").Value = "y" Then
        Import.Fields.GetByName("HomePreferredAddress").Value = "n"
        Import.Fields.GetByName("Address Type").Value = "Business"
        Import.Fields.GetByName("HomeAddressLine1").Value = Import.Fields.GetByName("BusinessAddressLine1").Value
        Import.Fields.GetByName("HomeAddressLine2").Value = Import.Fields.GetByName("BusinessAddressLine2").Value
        Import.Fields.GetByName("HomeAddressLine3").Value = Import.Fields.GetByName("BusinessAddressLine3").Value
        Import.Fields.GetByName("HomeAddressLine4").Value = Import.Fields.GetByName("BusinessAddressLine4").Value
        Import.Fields.GetByName("HomeCity").Value = Import.Fields.GetByName("BusinessCity").Value
        Import.Fields.GetByName("HomeProvince").Value = Import.Fields.GetByName("BusinessProvince").Value
        Import.Fields.GetByName("HomePostalCode").Value = Import.Fields.GetByName("BusinessPostalCode").Value
        Import.Fields.GetByName("HomePostPermission").Value = Import.Fields.GetByName("BusinessPostPermission").Value
    End If

End Sub


The code runs fine when I test it; however, when I run the actual import, it hangs on the first record that uses the code. I ran a performance log, and the record appears to hang during the "Duplicate Search".

Any suggestions why this might be?

Thanks for the help,


Hi Jonathan,

If you put this row into its own test file (just a header row and this one line of data), make those changes manually, and then comment out the code in that Sub, does the row import successfully? If not, I'm guessing the duplicate search might be timing out. If that is the case, what duplicate criteria are you using, and how many constituent records do you have in your database? Also, have you performed maintenance on the database recently to rebuild indexes, etc.?


Hey Jeff,

I have no trouble when I make the changes to the file manually and run the row without the code. That's what I have done in the past; I'm trying to add the code so I can run the file without so much intervention from myself or my users.





Something is clearly off. If this were a code problem, the performance log would not show it hanging at the duplicate search. Can you take the one row you know it hangs on and pass it into the code tester within the profile screen? That will run the AfterDictionaries event with your code, and from there we can see what the output will be prior to the duplicate search. My guess is that the translated row of data won't match what you would have done manually.

Hey Nic,

I ran the one row in the code tester and everything in the output looked correct. I even exported the file from the code tester and ran it through the import with the code commented out, and it worked fine. Yet when I try to run the import with the code to do exactly the same thing, RE hangs, and this is what I see in the performance log:

Import Started - 14:35:10
BeginProcessLoop - 14:35:10
--- Start Duplicate Search ---
Search 1 - 14:35:10
Search 2 - 14:35:10
Search 3 - 14:35:10
Search 4 - 14:35:10
Search 5 - 14:35:10
Search 6 - 14:35:10
Search 7 - 14:35:10

Sorry I am so late to this!

One thing to think about: after this code runs, both the home address and the business address data are filled in, since you never clear the business info. Maybe that's the problem?

Another test to try: import that one line, but manually move the business address data to the home address columns in the import file (i.e., recreate what the code does by hand). That might show whether it's the copied data itself that is causing the problem.

This code doesn't have any direct effect on the duplicate search, but it does change the data being used for the search, so that's where I would focus.
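
To illustrate the first suggestion, a sketch of what clearing the business columns could look like, placed after the copy inside the existing BusinessPreferredAddress block. This is only an illustration, not tested against your profile: it reuses the Import.Fields.GetByName call from your own code, and the field names are assumed to match the ones in your original post.

```vbnet
'Sketch only: after copying the business address into the home columns,
'blank out the business columns so only one address remains in the row.
'Field names are assumed to match the profile shown above.
If Import.Fields.GetByName("BusinessPreferredAddress").Value = "y" Then
    For Each fieldName As String In New String() { _
            "BusinessAddressLine1", "BusinessAddressLine2", _
            "BusinessAddressLine3", "BusinessAddressLine4", _
            "BusinessCity", "BusinessProvince", _
            "BusinessPostalCode", "BusinessPostPermission"}
        Import.Fields.GetByName(fieldName).Value = ""
    Next
End If
```

That way the row handed to the duplicate search carries a single populated address, matching what you would have produced by editing the file manually.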