Insert Data and read all Foreign Keys/Constraints first

  • How can I read all the constraints first, based on the data in my SELECT, and have it tell me that I don't have matching data in TRACEY_VEND_TERMS?

    I just completed an exercise where one table had over 20 constraints; the insert kept failing until I had fixed all the data, constraint by constraint, and I thought there has to be another way to do this.

    Start: read record 1.

    Get all the constraints that apply to the data I pass in from the SELECT statement, go and check whether each value has a matching row in the constraint's referenced table, and if it does, move on.

    If a row is found in VEND_TERMS and it matches, go insert the data, and so on.

    Next record. (A catalog query that lists those constraints is sketched below.)
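
    A minimal sketch of that look-up step, using SQL Server's catalog views: it lists every foreign key defined on dbo.TRACEY_VEND together with the column it constrains and the table/column it references, so the referenced tables can be checked before inserting.

    -- Enumerate the FKs on dbo.TRACEY_VEND and what they reference.
    SELECT  fk.name                                                      AS constraint_name,
            COL_NAME(fkc.parent_object_id, fkc.parent_column_id)         AS constrained_column,
            OBJECT_NAME(fkc.referenced_object_id)                        AS referenced_table,
            COL_NAME(fkc.referenced_object_id, fkc.referenced_column_id) AS referenced_column
    FROM    sys.foreign_keys        AS fk
    JOIN    sys.foreign_key_columns AS fkc ON fkc.constraint_object_id = fk.object_id
    WHERE   fk.parent_object_id = OBJECT_ID(N'dbo.TRACEY_VEND');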

    CREATE TABLE [DBO].[TRACEY_VEND](
    [VEND_ID] [varchar](12) NOT NULL,
    [TERMS_DC] [varchar](15) NOT NULL,
    [S_AP_1099_TYPE_CD] [varchar](6) NULL,
    [AP_1099_TAX_ID] [varchar](20) NOT NULL,
    [AP_ACCTS_KEY] [int] NOT NULL,
    [CASH_ACCTS_KEY] [int] NOT NULL,
    [MODIFIED_BY] [varchar](20) NOT NULL,
    [TIME_STAMP] [datetime] NOT NULL,
    [COMPANY_ID] [varchar](10) NOT NULL,
    [CAGE_CD] [varchar](15) NULL,
    CONSTRAINT [PI_0383] PRIMARY KEY NONCLUSTERED
    (
    [VEND_ID] ASC,
    [COMPANY_ID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    ALTER TABLE [DBO].[TRACEY_VEND] WITH CHECK ADD CONSTRAINT [F07056] FOREIGN KEY([TERMS_DC])
    REFERENCES [DBO].[TRACEY_VEND_TERMS] ([TERMS_DC])
    GO

    CREATE TABLE [DBO].[TRACEY_VEND_TERMS](
    [TERMS_DC] [varchar](15) NOT NULL,
    [DISC_PCT_RT] [decimal](5, 4) NOT NULL,
    [DISC_DAYS_NO] [smallint] NOT NULL,
    [S_TERMS_BASIS_CD] [varchar](1) NOT NULL,
    [S_DUE_DATE_CD] [varchar](1) NOT NULL,
    [NO_DAYS_NO] [smallint] NOT NULL,
    [DAY_OF_MTH_DUE_NO] [smallint] NOT NULL,
    [MODIFIED_BY] [varchar](20) NOT NULL,
    [TIME_STAMP] [datetime] NOT NULL,
    [ROWVERSION] [int] NULL,
    CONSTRAINT [PI_0385] PRIMARY KEY NONCLUSTERED
    (
    [TERMS_DC] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    INSERT INTO TRACEY_VEND (
    [VEND_ID]
    ,[TERMS_DC]
    ,[S_AP_1099_TYPE_CD]
    ,[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY]
    ,[CASH_ACCTS_KEY]
    ,[MODIFIED_BY]
    ,[TIME_STAMP]
    ,[COMPANY_ID]
    ,[CAGE_CD])
    SELECT '1000','NET 0','AP','AB',1,1,'TRACEY',GETDATE(),'1','C'
    GO

    Msg 547, Level 16, State 0, Line 73

    The INSERT statement conflicted with the FOREIGN KEY constraint "F07056". The conflict occurred in database "CPDEV1", table "dbo.TRACEY_VEND_TERMS", column 'TERMS_DC'.

    The statement has been terminated.

  • If you're doing an insert where you don't want the constraints checked, you can script out the constraints on the table, then drop them, do the import, then add them back later. Most likely best done as part of an SSIS package that runs after hours.
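
    For example, a rough sketch using the F07056 constraint from the original post (re-adding it WITH CHECK will fail if the import loaded TERMS_DC values with no parent row, which is the "untrusted FK" point raised further down):

    -- Drop the FK, run the import, then put the FK back.
    ALTER TABLE dbo.TRACEY_VEND DROP CONSTRAINT F07056;

    -- ... bulk load / import here ...

    -- WITH CHECK validates the existing rows and keeps the constraint trusted.
    ALTER TABLE dbo.TRACEY_VEND WITH CHECK
        ADD CONSTRAINT F07056 FOREIGN KEY (TERMS_DC)
        REFERENCES dbo.TRACEY_VEND_TERMS (TERMS_DC);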

  • I don't want to turn off the constraints. What I am trying to do is read one row of data, and if there is a constraint (e.g. TERMS_DC) and that value of 'NET 0' isn't in the referenced table, then I don't want to insert it.

    Basically I want to know all my data issues before executing the code, rather than when I get the error message.

    Trying to see if I could script it ahead of time.

     

    Cheers.


  • There's no easy way to do this other than querying all the tables... or ensuring that the code that is generating your inserts is using appropriate data. The application shouldn't be allowing non-existent data to be part of the query.

    I wouldn't recommend trying to query every table. It's going to perform very poorly. Better to put the checks and constraints into the code to ensure that it's using the data and tables correctly.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Got it: if a column has a constraint, read that table as part of a join against your data, rather than trying to automatically code for them all. Thanks.
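
    A sketch of that join-based pre-check for the one FK in this thread, assuming the rows to be loaded sit in a staging table (called STAGE_VEND here, as the OP defines later in the thread): it lists every staged row whose TERMS_DC has no match in the parent table, before any INSERT is attempted.

    -- Report candidate rows that would violate FK F07056.
    SELECT  s.VEND_ID, s.COMPANY_ID, s.TERMS_DC
    FROM    dbo.STAGE_VEND AS s
    WHERE   NOT EXISTS (SELECT 1
                        FROM dbo.TRACEY_VEND_TERMS AS t
                        WHERE t.TERMS_DC = s.TERMS_DC);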

  • I think it would be more accurate and far less work to rely on the existing constraints.

    You should be able to use a BEGIN TRANSACTION and a TRY/CATCH block to test INSERTs.  The INSERTs would occur, but you would immediately roll them back (undo/cancel them).

    Something like this:

    BEGIN TRANSACTION
    BEGIN TRY
    INSERT INTO TRACEY_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD])
    SELECT '1000','NET 0','AP','AB',1,1,'TRACEY',GETDATE(),'1','C'
    COMMIT TRANSACTION
    /* or, if you don't want to do even a valid INSERT at this moment
    ROLLBACK TRANSACTION
    --write code to log key(s) of row that had valid INSERT
    */
    END TRY
    BEGIN CATCH
    --write code here to capture ERROR_ values
    IF XACT_STATE() <> 0
    ROLLBACK TRANSACTION
    --write code here to report errors
    --write code to log key(s) of row that had invalid INSERT
    END CATCH
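
    For the "--write code here to capture ERROR_ values" placeholder, the CATCH body could look something like this (a sketch only; the @Err... variable names are illustrative, and CONCAT assumes SQL Server 2012 or later):

    -- The ERROR_* functions keep their values for the whole CATCH block,
    -- so capture them before doing anything else.
    DECLARE @ErrNumber  int            = ERROR_NUMBER(),
            @ErrMessage nvarchar(4000) = ERROR_MESSAGE();

    IF XACT_STATE() <> 0
        ROLLBACK TRANSACTION;

    -- Report (or log) what went wrong for this row.
    PRINT CONCAT('Insert failed: error ', @ErrNumber, ' - ', @ErrMessage);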

     

    SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".

  • ScottPletcher wrote:

    I think it would be more accurate and far less work to rely on the existing constraints. 

    Yes, yes. Even if checks are performed in advance, there's no guarantee the insert will work; only actually inserting the row can confirm the transaction is successful. To separate the rows that insert from those that don't, you could wrap ScottPletcher's code in a procedure and call it from within a cursor loop (a sketch follows below). If the procedure's return value is assigned from within the CATCH block, meaning the row failed to insert, then insert its key into a 'failed_inserts' table. Otherwise the row will have been inserted successfully.
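
    A sketch of that shape, under the assumption that the rows to load sit in a staging table; dbo.STAGE_VEND, dbo.failed_inserts and dbo.TryInsertVend are illustrative names, not objects from the thread.

    -- Procedure: attempt the insert for one staged row; return 0 on success, 1 on failure.
    CREATE PROCEDURE dbo.TryInsertVend
        @VEND_ID    varchar(12),
        @COMPANY_ID varchar(10)
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRY
            BEGIN TRANSACTION;
            INSERT INTO dbo.TRACEY_VEND
                  (VEND_ID, TERMS_DC, S_AP_1099_TYPE_CD, AP_1099_TAX_ID,
                   AP_ACCTS_KEY, CASH_ACCTS_KEY, MODIFIED_BY, TIME_STAMP,
                   COMPANY_ID, CAGE_CD)
            SELECT VEND_ID, TERMS_DC, S_AP_1099_TYPE_CD, AP_1099_TAX_ID,
                   AP_ACCTS_KEY, CASH_ACCTS_KEY, MODIFIED_BY, TIME_STAMP,
                   COMPANY_ID, CAGE_CD
            FROM   dbo.STAGE_VEND
            WHERE  VEND_ID = @VEND_ID AND COMPANY_ID = @COMPANY_ID;
            COMMIT TRANSACTION;
            RETURN 0;
        END TRY
        BEGIN CATCH
            IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
            RETURN 1;   -- constraint violation or other failure
        END CATCH
    END
    GO

    -- Cursor loop: call the procedure per row and log the keys of rows that failed.
    DECLARE @VEND_ID varchar(12), @COMPANY_ID varchar(10), @rc int;
    DECLARE vend_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT VEND_ID, COMPANY_ID FROM dbo.STAGE_VEND;
    OPEN vend_cur;
    FETCH NEXT FROM vend_cur INTO @VEND_ID, @COMPANY_ID;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC @rc = dbo.TryInsertVend @VEND_ID, @COMPANY_ID;
        IF @rc <> 0
            INSERT INTO dbo.failed_inserts (VEND_ID, COMPANY_ID)
            VALUES (@VEND_ID, @COMPANY_ID);
        FETCH NEXT FROM vend_cur INTO @VEND_ID, @COMPANY_ID;
    END
    CLOSE vend_cur;
    DEALLOCATE vend_cur;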

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können ("No one shall be able to drive us out of the paradise that Cantor has created for us")

  • TRACEY-320982 wrote:

    what I am trying to do is read one row of data, and if there is a constraint (e.g. TERMS_DC) and that value of 'NET 0' isn't in the referenced table, then I don't want to insert it. Basically I want to know all my data issues before executing the code, rather than when I get the error message.

    What is the source of the data and how many rows are you dealing with?  If you are loading external data that you can't pre-check inside the application like Grant says, then maybe SSIS would be a better tool.  It can easily divert failed rows:

    https://docs.microsoft.com/en-us/sql/integration-services/lesson-4-3-adding-error-flow-redirection?view=sql-server-2016

     

    I will be dealing with 2000 tables, and approximately 1 million rows in some tables.

    It is a data migration from one system to another, with a mapping of old to new in between.

    Nice, Chris. I've not used SSIS but could look at it.


  • Scott, how would I know what record I am on here, at this point:

    --write code to log key(s) of row that had valid INSERT

    BEGIN TRANSACTION
    BEGIN TRY
    INSERT INTO TRACEY_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD])
    SELECT '1000','NET 0','AP','AB',1,1,'TRACEY',GETDATE(),'1','C'
    COMMIT TRANSACTION
    /* or, if you don't want to do even a valid INSERT at this moment
    ROLLBACK TRANSACTION
    --write code to log key(s) of row that had valid INSERT
    */
    END TRY
    BEGIN CATCH
    --write code here to capture ERROR_ values
    IF XACT_STATE() <> 0
    ROLLBACK TRANSACTION
    --write code here to report errors
    --write code to log key(s) of row that had invalid INSERT
    END CATCH
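
    One way to know which record you are on is the OUTPUT clause of the INSERT, which can capture the key values of whatever actually got inserted so they can be logged (a sketch; @inserted_keys is just an illustrative table variable):

    DECLARE @inserted_keys TABLE (VEND_ID varchar(12), COMPANY_ID varchar(10));

    INSERT INTO TRACEY_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD])
    OUTPUT inserted.VEND_ID, inserted.COMPANY_ID INTO @inserted_keys
    SELECT '1000','NET 0','AP','AB',1,1,'TRACEY',GETDATE(),'1','C';

    -- The key(s) of the row(s) that had a valid INSERT, ready to be logged.
    SELECT VEND_ID, COMPANY_ID FROM @inserted_keys;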

  • First, I hope you aren't doing those inserts one by one; if so, change it so you work with sets, not one record at a time.

    In order to determine whether a record can be inserted into a table with FKs, you need to check, as part of the insert, that the corresponding key exists in the parent table.

    INSERT INTO TRACEY_VEND .....
    SELECT t.*
    FROM (VALUES ('1000','NET 0','AP','AB',1,1,'TRACEY',GETDATE(),'1','C')) t (VEND_ID, TERMS_DC, S_AP_1099_TYPE_CD, AP_1099_TAX_ID, AP_ACCTS_KEY, CASH_ACCTS_KEY, MODIFIED_BY, TIME_STAMP, COMPANY_ID, CAGE_CD)
    WHERE EXISTS (SELECT 1 FROM TRACEY_VEND_TERMS fk WHERE fk.TERMS_DC = t.TERMS_DC) -- repeat the EXISTS for every FK column and its parent table

  • pietlinden wrote:

    If you're doing an insert where you don't want the constraints checked, you can script out the constraints on the table, then drop them, do the import, then add them back later. Most likely best done as part of an SSIS package that runs after hours.

    Ah, be careful now... Because you'll have to skip the CHECK when you recreate them (so that you can recreate them without getting FK violations thanks to the garbage data you allowed in while the constraints were dropped), you'll end up with "untrusted FKs" and the whole world of hurt that goes with them.

    https://www.google.com/search?&q=Untrusted+FKs+in+sql+server
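
    Untrusted FKs are flagged in the catalog and can be re-validated afterwards; a sketch:

    -- FKs that are enabled but whose existing data was never validated.
    SELECT OBJECT_NAME(parent_object_id) AS table_name,
           name                          AS constraint_name
    FROM   sys.foreign_keys
    WHERE  is_not_trusted = 1;

    -- Re-validate one of them: fails if orphaned rows exist,
    -- otherwise the FK becomes trusted again.
    ALTER TABLE dbo.TRACEY_VEND WITH CHECK CHECK CONSTRAINT F07056;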

     

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • TRACEY-320982 wrote:

    I will be dealing with 2000 tables, and approximately 1 million rows in some tables.

    It is a data migration from one system to another, with a mapping of old to new in between.

    Are the source and target both in SQL Server? Is this question related to the migration or the mapping? It might make sense to separate the two. You could try to get all of the mappings stored in tables before attempting the migration. As pietlinden said in the beginning, you could drop all constraints on the target tables and insert from the old tables joined to the mappings. Maybe it's necessary (temporarily) to add additional indexes to the source tables. After the rows are inserted, re-index and add the constraints back to the target tables. If the mappings are complete it should work smoothly (it never does the first time, but hey, it's still a valid approach).

    Of the 2000 tables, do some migrate without transformation? If so, a schema comparison tool like Redgate SQL Compare or ApexSQL Diff (and Data Diff) could be worth a look. These tools often save endless amounts of time.

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können ("No one shall be able to drive us out of the paradise that Cantor has created for us")

    I got this far: I trapped the error and wrote to a log table, TRACKER_VEND, so at least it didn't fail on the constraint.

    That is a step forward.

    I will also work on the next part, the FK-table read, in the code as well. Right now it's doing one row at a time, so I need to review the SSChampion advice on how to get the output if it fails on a record and log it.

    Created a tracker table:

    CREATE TABLE [dbo].[TRACKER_VEND](
    [KEY_ID] [int] IDENTITY(1,1) NOT NULL,
    [VEND_ID] [varchar](12) NOT NULL,
    [TERMS_DC] [varchar](15) NOT NULL,
    [S_AP_1099_TYPE_CD] [varchar](6) NULL,
    [AP_1099_TAX_ID] [varchar](20) NOT NULL,
    [AP_ACCTS_KEY] [int] NOT NULL,
    [CASH_ACCTS_KEY] [int] NOT NULL,
    [MODIFIED_BY] [varchar](20) NOT NULL,
    [TIME_STAMP] [datetime] NOT NULL,
    [COMPANY_ID] [varchar](10) NOT NULL,
    [CAGE_CD] [varchar](15) NULL,
    [ERROR_MSG] [varchar](20) NULL,
    [STATUS_DESCRIPTION] [varchar](max) NULL
    ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
    GO

    --Staging table that has the data.
    CREATE TABLE [dbo].[STAGE_VEND](
    [KEY_ID] [int] IDENTITY(1,1) NOT NULL,
    [VEND_ID] [varchar](12) NOT NULL,
    [TERMS_DC] [varchar](15) NOT NULL,
    [S_AP_1099_TYPE_CD] [varchar](6) NULL,
    [AP_1099_TAX_ID] [varchar](20) NOT NULL,
    [AP_ACCTS_KEY] [int] NOT NULL,
    [CASH_ACCTS_KEY] [int] NOT NULL,
    [MODIFIED_BY] [varchar](20) NOT NULL,
    [TIME_STAMP] [datetime] NOT NULL,
    [COMPANY_ID] [varchar](10) NOT NULL,
    [CAGE_CD] [varchar](15) NULL
    ) ON [PRIMARY]
    GO

     

     

    DECLARE @first AS INT = 1, @Last AS INT = 1, @STATUS_DESCRIPTION VARCHAR(100)
    SELECT @Last = COUNT(*) FROM STAGE_VEND
    WHILE (@first <= @Last)
    BEGIN
    BEGIN TRANSACTION
    BEGIN TRY
    INSERT INTO TRACEY_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD])
    SELECT [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD] FROM STAGE_VEND
    WHERE KEY_ID = @first
    COMMIT TRANSACTION
    SET @STATUS_DESCRIPTION = 'PASS'
    INSERT INTO TRACKER_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD], ERROR_MSG, STATUS_DESCRIPTION )
    SELECT [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD], @@ERROR, @STATUS_DESCRIPTION FROM STAGE_VEND
    WHERE KEY_ID = @first
    END TRY
    BEGIN CATCH
    IF XACT_STATE() <> 0
    ROLLBACK TRANSACTION
    SET @STATUS_DESCRIPTION = 'FAILED'
    INSERT INTO TRACKER_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD], ERROR_MSG, STATUS_DESCRIPTION )
    -- ERROR_NUMBER() rather than @@ERROR here: @@ERROR is reset by the
    -- statements above, so it would always log 0 in the CATCH block.
    SELECT [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD], ERROR_NUMBER(), @STATUS_DESCRIPTION FROM STAGE_VEND
    WHERE KEY_ID = @first
    END CATCH
    SET @first += 1
    END
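
    As frederico suggests above, the same load can also be done without the loop: one set-based INSERT of the rows that satisfy the FK and one set-based INSERT of the rejects into the tracker. A sketch against the tables defined in this post:

    -- Load every valid row in one statement...
    INSERT INTO TRACEY_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD])
    SELECT s.[VEND_ID],s.[TERMS_DC],s.[S_AP_1099_TYPE_CD],s.[AP_1099_TAX_ID]
    ,s.[AP_ACCTS_KEY],s.[CASH_ACCTS_KEY],s.[MODIFIED_BY],s.[TIME_STAMP]
    ,s.[COMPANY_ID],s.[CAGE_CD]
    FROM STAGE_VEND s
    WHERE EXISTS (SELECT 1 FROM TRACEY_VEND_TERMS t WHERE t.TERMS_DC = s.TERMS_DC);

    -- ...and log the rejects (no matching TERMS_DC) in one statement.
    INSERT INTO TRACKER_VEND (
    [VEND_ID],[TERMS_DC],[S_AP_1099_TYPE_CD],[AP_1099_TAX_ID]
    ,[AP_ACCTS_KEY],[CASH_ACCTS_KEY],[MODIFIED_BY],[TIME_STAMP]
    ,[COMPANY_ID],[CAGE_CD], ERROR_MSG, STATUS_DESCRIPTION)
    SELECT s.[VEND_ID],s.[TERMS_DC],s.[S_AP_1099_TYPE_CD],s.[AP_1099_TAX_ID]
    ,s.[AP_ACCTS_KEY],s.[CASH_ACCTS_KEY],s.[MODIFIED_BY],s.[TIME_STAMP]
    ,s.[COMPANY_ID],s.[CAGE_CD], 'FK F07056', 'FAILED'
    FROM STAGE_VEND s
    WHERE NOT EXISTS (SELECT 1 FROM TRACEY_VEND_TERMS t WHERE t.TERMS_DC = s.TERMS_DC);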

  • Hi sschasing, I'm doing a few inserts first, and then a larger conversion starts from source to target. I thought it would be simple to do a few inserts, but it is becoming harder due to the constraints.

    Having the code just error out wasn't good, so now I can use the BEGIN TRY and capture the error.

    Going forward, I want to speed this up by looking ahead, reading the constraints, and giving an overall output of all the errors before any inserts are executed.
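
    A sketch of what that look-ahead report could be for the tables in this thread: one query per rule, combined into a single list of problem rows, run before any INSERT.

    -- Validation pass over the staged rows, before any insert is executed.
    SELECT s.KEY_ID, s.VEND_ID, s.COMPANY_ID,
           'TERMS_DC ''' + s.TERMS_DC + ''' has no row in TRACEY_VEND_TERMS' AS issue
    FROM   STAGE_VEND s
    WHERE  NOT EXISTS (SELECT 1 FROM TRACEY_VEND_TERMS t WHERE t.TERMS_DC = s.TERMS_DC)
    UNION ALL
    SELECT s.KEY_ID, s.VEND_ID, s.COMPANY_ID,
           'duplicate of an existing TRACEY_VEND primary key'
    FROM   STAGE_VEND s
    WHERE  EXISTS (SELECT 1 FROM TRACEY_VEND v
                   WHERE v.VEND_ID = s.VEND_ID AND v.COMPANY_ID = s.COMPANY_ID);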
