Overview
You checked the logs and realized that the MUU_Ready_File_Check job has been running successfully lately, but it never kicked off the MUU_Insert_Exception job.
Information
The Daily Import Process
The Daily Import Process (MUU_Insert_Exception) is a highly custom process where the specifics differ for each instance, so it isn't possible to document all potential errors. This document provides a guide to help with the investigation, along with some specific errors that have been encountered.
Generally, the process involves these steps:
- Ready File Check - The process makes sure the file named ready.txt exists (this file is normally created alongside the import files). If the file does not exist, the job will stop at this point. If it does exist, the job will delete it and continue with subsequent steps.
- Data Load Steps - Run DTSX/SSIS packages to pull information from the import files into the CTExternal_Source database.
- Pre-processing Steps - perform data manipulation on the data in CTExternal_Source
- Disable Triggers – eliminates the potential for a trigger to update something we don't want to touch.
- Processing steps - run SPUploadExternalData to pull data from CTExternal_Source into the bank database
- Move Orphaned Contacts - some data updates may result in contact records being orphaned; this will move these contacts to the exceptions database.
- Automatching - link up contacts and activities where possible; any that can't be matched will go into unmatched and the customer can review and manually resolve.
- Post-processing - perform any data manipulation that is still needed in any of the databases.
- Enable Triggers – return to normal functionality
- Move Import Files - Move the import files to the "Processed" folder and append the date as part of the file name.
When an issue arises, you can view the job history for each step to see if a relevant error is logged and then proceed accordingly.
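For example, the step-level history can be pulled from the SQL Agent tables in msdb; the job name below is taken from this article and may differ in your instance:
SELECT j.name AS job_name, h.step_id, h.step_name, h.run_status, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
WHERE j.name = 'MUU_Insert_Exception'    -- substitute the actual job name for your instance
ORDER BY h.run_date DESC, h.run_time DESC, h.step_id;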
<supportagent>
As this is a custom process, some customers have steps to turn on xp_cmdshell at some point before the step that moves/renames the import files, and then turn it off afterward. Depending on the customer, you may see other steps in the job that aren't among those listed here. This is not surprising, as the process varies quite a bit by customer; for some customers the job may have 20 steps, while others might have over 100.
</supportagent>
Common Issues related to this process
Some of the quick solutions described in the following sections require running UPDATE queries against the database. Please make sure you back up your entire database before making any changes to the DB.
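For example, a straightforward full backup before making any changes (the database name and path below are placeholders):
-- Placeholders: substitute your actual bank database name and a valid backup location.
BACKUP DATABASE [YourBankDB]
TO DISK = N'D:\Backups\YourBankDB_BeforeImportFix.bak'
WITH INIT, STATS = 10;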
- Wrong import file.
One of the common issues is that the data in the import file is wrong, so if you get a report that incorrect data is coming through, it's best to get a specific example, find the appropriate import file, and check whether the data is correct there.
- If it is incorrect, the process that creates the import files needs to be checked again.
- If it is correct, then contact support for a thorough analysis.
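As an illustration, once you have a specific example you can compare the raw import file against what was staged; the table and key column below are examples only (DEPOSITS_SAVINGS and its ExternalKey appear elsewhere in this article) and will differ by instance:
-- Example only: substitute the staging table and key for the record that was reported.
SELECT *
FROM CTExternal_Source.dbo.DEPOSITS_SAVINGS
WHERE ExternalKey = '<reported key>';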
- Issues in the DTSX package.
There may be an issue in the DTSX package, and you might need to install Visual Studio to check these types of files and make sure the mapping is correct between the import file and the table in the CTExternal_Source database.
We have occasionally noticed a bug when saving DTSX files where the stored password is lost. If that happens, you can edit the file in Notepad and manually assign the password there.
- Import file's delimiters.
Sometimes the import file's delimiters have been changed, which prevents the DTSX from reading the file correctly. You can check the DTSX files in Notepad (if Visual Studio is not available) to see what delimiters they expect and then compare them to the import file. Some examples:
_x000D__x000A_ Represents CRLF
_x0009_ Represents Tab Character
_x007C_ Represents Vertical Bar (pipe) Character
_x002C_ Represents Comma
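For instance, if the DTSX expects _x007C_ (pipe) as the column delimiter and _x000D__x000A_ as the row delimiter, each line of the import file should look roughly like 12345|Smith|Checking (values here are purely illustrative); a file that arrives tab- or comma-delimited instead will not load correctly.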
- Matching sends new products to unmatched.
Sometimes when new products are added to the import, matching puts them all into unmatched. This may leave you with far too many rows to match manually. In this case, an auto-matching step may need to be added. It's best to use an existing auto-matching step as a template when building a new one; a rough sketch follows.
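The matching logic itself is customer-specific, so the following only illustrates the shape such a step can take; every table and column name here (UnmatchedProduct, Contact, ExternalContactKey, ContactID) is hypothetical and should be replaced with the names used by the existing auto-matching steps in your job:
-- Hypothetical sketch: link unmatched product rows to contacts on a shared external key.
UPDATE u
SET u.ContactID = c.ContactID
FROM dbo.UnmatchedProduct AS u            -- hypothetical unmatched table
JOIN dbo.Contact AS c                     -- hypothetical contact table
    ON c.ExternalContactKey = u.ExternalContactKey
WHERE u.ContactID IS NULL;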
- Invalid Object Name.
One issue we have seen at times is an Invalid Object Name error that references the AFI_375 database or the EX_AFI_375 database. There have been a few master releases with different bugs over the years that cause these errors; usually they are apparent immediately after an upgrade. At the time of this writing, this issue has been resolved in master.
If these errors do occur, look at these stored procedures and functions and change references of AFI_375 to the appropriate bank DB name (a query to locate lingering references follows this list):
- MatchContacts (stored proc)
- MatchActivities (stored proc)
- GetExContacts (function)
- CTADMIN
- spMoveOrphanedContacts
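One way to confirm which objects still reference the old name is to search the module definitions; this is a generic catalog query (run it in the bank database, and in CTADMIN if the listed objects live there):
-- Lists stored procedures and functions whose definition still mentions AFI_375.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id) AS object_name
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%AFI_375%';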
Also look for synonyms under the bank database that may point to EX_AFI_375. Script those out to a file, then provide the correct database and synonym names, and remove the EX_AFI_375 synonym.
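The synonyms can be listed with sys.synonyms; the synonym and object names in the DROP/CREATE lines below are placeholders for the values you scripted out:
-- Run in the bank database: find synonyms still pointing at EX_AFI_375.
SELECT name, base_object_name
FROM sys.synonyms
WHERE base_object_name LIKE '%EX_AFI_375%';
-- Placeholder names: recreate each synonym against the correct exceptions database, then drop the old one.
DROP SYNONYM dbo.SomeSynonym;
CREATE SYNONYM dbo.SomeSynonym FOR EX_YourBankDB.dbo.SomeObject;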
- Value is larger than the field definition.
Another issue that may arise is that an imported field's value is larger than the field definition in the DTSX, the CTExternal_Source table, or the bank database table. Normally, varchars in the DTSX and CTExternal_Source are defined as (255), but not in all cases. The resolution is situational: in some cases, truncating the data at import may be best; in others, the field definition may need to be increased (which may mean other changes are needed, depending on the field and how it is used).
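A quick way to size the problem, sketched with example names (DEPOSITS_SAVINGS is used elsewhere in this article; the column name is hypothetical):
-- See how long the incoming values actually are in the staging table.
SELECT MAX(LEN(AccountDescription)) AS max_len
FROM CTExternal_Source.dbo.DEPOSITS_SAVINGS;
-- If widening the field is the right fix for this case, adjust the staging definition to match.
ALTER TABLE CTExternal_Source.dbo.DEPOSITS_SAVINGS
    ALTER COLUMN AccountDescription varchar(500) NULL;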
- SQL update changed the way NULL values are handled.
One issue became apparent after a SQL update changed the way NULL values are handled when appending to a string. Previously, a NULL was treated as "" when appending/concatenating, but now it NULLs the entire variable assignment. A symptom of this was that no accounts were being imported. The stored procedure spGenerateActivityInsertSQL uses several variables to build components of a SQL INSERT statement, then appends them together and returns a full statement.
One of the variables was never assigned (it was NULL), and appending a NULL value makes the assigned value NULL, regardless of the values of the other variables in the append. This stored procedure gets deleted and rebuilt at upgrade; it was reported as a master bug, so it will not reoccur going forward.
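The behavior is easy to reproduce; with the default CONCAT_NULL_YIELDS_NULL setting, appending an unassigned (NULL) variable with + wipes out the whole expression, while CONCAT (SQL Server 2012+) treats NULL as an empty string (variable names below are illustrative):
DECLARE @strPart varchar(100) = 'INSERT INTO dbo.Activity ';
DECLARE @strNever varchar(100);          -- never assigned, so it is NULL
SELECT @strPart + @strNever;             -- returns NULL: the entire concatenation is lost
SELECT CONCAT(@strPart, @strNever);      -- returns 'INSERT INTO dbo.Activity '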
Adding this script to the stored procedure after the variable definitions will solve the issue:
-- @strSQL was NULL, never assigned a value.
-- This caused @strFullSQL to be NULL. Should contain an INSERT.
SET @strSQL = 'INSERT INTO ' + [dbo].[DB_QUALIFIED_NAME](@DBPrefix, @DB_Name) + '.dbo.Activity '
- Index Already Exists error.
You get an error like the following:
Msg 1913, Level 16, State 1, Line 1
The operation failed because an index or statistics with name
'idx_DEPOSITS_SAVINGS_ExternalKey_Unique' already exists on table
'CTExternal_Source.dbo.DEPOSITS_SAVINGS'.
This is caused by spProcessMapping including brackets in the index name search before dropping it, which causes it not to find the index to drop. It then tries to create the index, which throws the error above. Removing the brackets in spProcessMapping solves this issue. There were several places that had to be changed. For example:
--- brackets cause failure
--- IF EXISTS (SELECT name FROM CTExternal_Source.dbo.sysindexes WHERE name = '[idx_' + @MapTable + '_ExternalKey_Unique]')
IF EXISTS (SELECT name FROM CTExternal_Source.dbo.sysindexes WHERE name = 'idx_' + @MapTable + '_ExternalKey_Unique')
--- brackets cause failure
--- SET @DropIndexSQL = 'Use CTExternal_Source Drop Index ' + @MapTable + '.[idx_' + @MapTable + '_ExternalKey_Unique]'
SET @DropIndexSQL = 'Use CTExternal_Source Drop Index ' + @MapTable + '.idx_' + @MapTable + '_ExternalKey_Unique'
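After the change, you can confirm that the index name resolves without brackets, for example with the sys.indexes catalog view (sysindexes, used by the proc, is the older compatibility view):
SELECT i.name
FROM CTExternal_Source.sys.indexes AS i
WHERE i.name = 'idx_DEPOSITS_SAVINGS_ExternalKey_Unique';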
If you want support to execute the backend modifications on your behalf, please contact the Support team and provide them with the full information and description of the case. A customer support representative will contact you with the result of the execution of the aforementioned queries.