INSERT avoiding duplicate error on multi-column unique index

  • Got the INSERT part going; just trying to figure out how to avoid the duplicate-key error.

    I have figured out how to avoid it for a single-column unique index, but how do I accomplish the same thing with a multi-column unique index?

    insert into Table1 (f1, f2, f3, f4, f5, f6)

    select f1, f2, f3, f4, f5, f6 from Table2 where f1 not in (select f1 from Table1)

    seems to work at avoiding duplicate errors, but how do I handle it if Table1 has a unique index on f1 (int), f3 (char(20)), f4 (decimal(12,5)), or whatever? Would I get better performance by setting IGNORE_DUP_KEY = ON and just doing the insert without the WHERE clause, or should I figure out how to use the WHERE clause with a multi-column index?
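    The NOT IN filter above only handles a single column; the usual way to generalize it to a multi-column unique key is a NOT EXISTS anti-join with one equality predicate per key column. A minimal sketch, assuming the column names above and a unique index on (f1, f3, f4):

    ```sql
    -- Insert only rows whose (f1, f3, f4) key is not already present in Table1.
    INSERT INTO Table1 (f1, f2, f3, f4, f5, f6)
    SELECT s.f1, s.f2, s.f3, s.f4, s.f5, s.f6
    FROM Table2 AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM Table1 AS t
        WHERE t.f1 = s.f1   -- one equality test
          AND t.f3 = s.f3   -- per column of the
          AND t.f4 = s.f4   -- multi-column unique index
    );
    ```

    Note that the index option that silently discards duplicate keys is IGNORE_DUP_KEY = ON; OFF is the default and is what raises the error.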

  • roy.tollison (11/1/2013)


    Got the INSERT part going; just trying to figure out how to avoid the duplicate-key error.

    I have figured out how to avoid it for a single-column unique index, but how do I accomplish the same thing with a multi-column unique index?

    insert into Table1 (f1, f2, f3, f4, f5, f6)

    select f1, f2, f3, f4, f5, f6 from Table2 where f1 not in (select f1 from Table1)

    seems to work at avoiding duplicate errors, but how do I handle it if Table1 has a unique index on f1 (int), f3 (char(20)), f4 (decimal(12,5)), or whatever? Would I get better performance by setting IGNORE_DUP_KEY = ON and just doing the insert without the WHERE clause, or should I figure out how to use the WHERE clause with a multi-column index?

    This would be a lot easier if you would stick to a single thread instead of starting new ones for each step.

    What exactly do you mean here by avoiding duplicates? Do you mean that you can have duplicates in Table2 but you want only the distinct values in Table1?

    Again, DDL and some sample data would go a LONG way toward making this easier.

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/

  • I guess what I am looking to do is INSERT but skip/ignore/not care about any duplicates.

    Some of the tables I am working with have multi-column unique indexes, and my program ran into the error during a test run. When I looked at the destination table, none of the rows from the source table were in it. These tables are created by another department, and they don't want to use the IGNORE_DUP_KEY option in their unique indexes. So I am trying to determine a way to mass-insert the matching fields from Table1 to Table2 without it stopping if there are duplicate rows.

  • I cannot find anywhere how to link fields together to match the index. These indexes range in size from 1 to 12 columns. I tried concatenating the columns with +, but that doesn't work correctly: f1(int) + f4(char(5)) + f5(smallint).

    Is there a way to match up on more than one column?

    Do I have to match up on columns, or is there a way to use the index names? 'where index1 not in (select index1 from table1)' ??

    DDL

    Table1:
    f1 int, f2 int, f3 char(50), f4 char(125), f5 int,
    f6 decimal(12,5), f7 int, f8 char(25), f9 tinyint, f10 int,
    Identity int;
    index1 clustered unique (f1, f3, f10);

    Table2:
    f1 int, f3 char(50), f4 char(125), f5 int, f7 int,
    f8 char(25), f9 tinyint, f10 int, f11 int, f12 decimal(15,8),
    Identity int;
    Index1 clustered unique (f1, f3, f10);

    Now Table1 has 100,000 rows that I need to INSERT into the matching fields of Table2.

    So the other dept. creates Table2, and they add 1-???? rows of data into it. Some of those rows have the same information that is in Table1. So when I issued the INSERT command with all the fields defined, it threw a duplicate-key exception and didn't insert any of the data from Table1 into Table2.

    They don't want to update the data in Table2, only insert the data from Table1, with duplicates skipped/ignored.
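    Given the column lists above, the same anti-join idea can be written as a LEFT JOIN on every column of the unique index. A sketch against the DDL as posted, assuming the shared columns are f1, f3, f4, f5, f7, f8, f9, f10 and the unique key is (f1, f3, f10):

    ```sql
    -- Copy rows from Table1 whose (f1, f3, f10) key is not already in Table2.
    INSERT INTO Table2 (f1, f3, f4, f5, f7, f8, f9, f10)
    SELECT t1.f1, t1.f3, t1.f4, t1.f5, t1.f7, t1.f8, t1.f9, t1.f10
    FROM Table1 AS t1
    LEFT JOIN Table2 AS t2
           ON  t2.f1  = t1.f1
           AND t2.f3  = t1.f3
           AND t2.f10 = t1.f10   -- join on every column of Index1
    WHERE t2.f1 IS NULL;         -- keep only rows with no match in Table2
    ```

    One caveat: if Table1 itself can contain duplicate (f1, f3, f10) keys, the insert can still fail, and the source would also need de-duplicating (e.g. with ROW_NUMBER() OVER the key columns) before the insert.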

  • This does not seem to be a difficult problem. But as was mentioned, the lack of DDL for the tables and indexes makes this very difficult to understand. This is an example of what the DDL for your tables and all the indexes should look like:

    CREATE TABLE Table1
    (
        [RIN] [int] IDENTITY(1,1) NOT FOR REPLICATION NOT NULL,
        [Year] [int] NOT NULL,
        [TRMasterRIN] [int] NOT NULL,
        [TRValueRIN] [int] NOT NULL,
        [AttributeRIN] [int] NOT NULL,
        [QualityRIN] [int] NOT NULL,
        [VisualRIN] [int] NOT NULL,
        [ErrStringRIN] [int] NOT NULL,
        [Sequence] [int] NOT NULL,
        [Amount] [int] NOT NULL,
        [MDateTime] [datetime] NOT NULL,
        [MUser] [varchar](128) NOT NULL,
        CONSTRAINT [APL_RINX] PRIMARY KEY CLUSTERED
        (
            [RIN] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]

    CREATE NONCLUSTERED INDEX [TBL1_TRMasterNdx] ON [dbo].[Table1]
    (
        [Year] DESC,
        [TRMasterRIN] ASC,
        [TRValueRIN] ASC,
        [Sequence] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

    Add some Insert statements for your tables and boom! Answers will show up!
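    The "Insert statements" being asked for are just a few rows of representative sample data for the example table. Something like this would do (values invented purely for illustration):

    ```sql
    -- Hypothetical sample rows for the example dbo.Table1 above.
    INSERT INTO dbo.Table1
        ([Year], [TRMasterRIN], [TRValueRIN], [AttributeRIN], [QualityRIN],
         [VisualRIN], [ErrStringRIN], [Sequence], [Amount], [MDateTime], [MUser])
    VALUES
        (2013, 101, 5001, 1, 1, 1, 0, 1, 2500, GETDATE(), 'sample_user'),
        (2013, 101, 5002, 1, 1, 1, 0, 2, 3750, GETDATE(), 'sample_user');
    ```

    With DDL plus a handful of rows like these for both tables, responders can reproduce the duplicate-key error and test a fix directly.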

    __________________________________________________________________________________________________________
    How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/
