Insert Into Where NOT Exists

  • Also, I have reformatted your code, with columns Fields5-20 treated as a single column:

    INSERT INTO
      [Tbl_ComputerInventory]
      (
        [ISOCountryAbb3]
        , [SiteNumber]
        , [ComputerName]
        , [SerialNumber]
        , [Field5-20]
      )
    SELECT DISTINCT
      LEFT([Computer Name], 3)             AS [ISOCountryAbb3]
      , RIGHT(LEFT([Computer Name], 8), 5) AS [SiteNumber]
      , [Computer Name]                    AS [ComputerName]
      , [Service Tag]                      AS [SerialNumber]
      , [Fields5-20]
    FROM
      [ImpTbl_ComputerInventory]
    WHERE
      NOT EXISTS
      ( SELECT
          1
        FROM
          [Tbl_ComputerInventory]
        WHERE
          (
            [ImpTbl_ComputerInventory].[Computer Name]   = [Tbl_ComputerInventory].[ComputerName]
            AND [ImpTbl_ComputerInventory].[Service Tag] = [Tbl_ComputerInventory].[SerialNumber]
          )
      );

  • Now with table aliases, which is what you should be using:

    INSERT INTO
      [Tbl_ComputerInventory]
      (
        [ISOCountryAbb3]
        , [SiteNumber]
        , [ComputerName]
        , [SerialNumber]
        , [Field5-20]
      )
    SELECT DISTINCT
      LEFT([Computer Name], 3)             AS [ISOCountryAbb3]
      , RIGHT(LEFT([Computer Name], 8), 5) AS [SiteNumber]
      , [Computer Name]                    AS [ComputerName]
      , [Service Tag]                      AS [SerialNumber]
      , [Fields5-20]
    FROM
      [ImpTbl_ComputerInventory] AS [ci1]
    WHERE
      NOT EXISTS
      ( SELECT
          1
        FROM
          [Tbl_ComputerInventory] AS [ci2]
        WHERE
          (
            [ci2].[ComputerName]     = [ci1].[Computer Name]
            AND [ci2].[SerialNumber] = [ci1].[Service Tag]
          )
      );

  • OK, I follow your table aliases now.

    Yes, Fields5-20 represents 15 other individual columns that I spared the forum from having to read.

    ImpTbl_ComputerInventory is the only table with data at this point;
    Tbl_ComputerInventory is empty because I don't want any garbage in it.

    So, not caring about Fields5-20 having unique data per row, how do I set a single record for Pkey_Tbl_ComputerInventory and have SQL skip the duplicates?

  • One option:
    Instead of DISTINCT

    GROUP BY <primary key> and then decide what you want to do with each of the other columns (MIN, MAX or other aggregation)
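
    A sketch of that idea against the tables in this thread (treating [Fields5-20] as one stand-in column; in practice you would apply MIN, MAX, or another aggregate to each of the 15 columns it represents):

    ```sql
    INSERT INTO
      [Tbl_ComputerInventory]
      (
        [ISOCountryAbb3]
        , [SiteNumber]
        , [ComputerName]
        , [SerialNumber]
        , [Field5-20]
      )
    SELECT
      LEFT([Computer Name], 3)             AS [ISOCountryAbb3]
      , RIGHT(LEFT([Computer Name], 8), 5) AS [SiteNumber]
      , [Computer Name]                    AS [ComputerName]
      , [Service Tag]                      AS [SerialNumber]
      , MIN([Fields5-20])                  AS [Fields5-20]  -- arbitrary but deterministic pick per group
    FROM
      [ImpTbl_ComputerInventory]
    GROUP BY
      [Computer Name], [Service Tag];
    ```

    The NOT EXISTS clause from the earlier posts can still be added if the target table might already contain rows.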

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • How about changing your primary key to a unique index with (ignore_dup_key = ON) ?
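
    For reference, a sketch of that change ([Pkey_Tbl_ComputerInventory] is the constraint name mentioned above, assuming it covers ([ComputerName], [SerialNumber]); the index name here is made up):

    ```sql
    -- Drop the existing primary key constraint...
    ALTER TABLE [Tbl_ComputerInventory]
      DROP CONSTRAINT [Pkey_Tbl_ComputerInventory];

    -- ...and replace it with a unique index that skips duplicate keys on insert.
    CREATE UNIQUE INDEX [UX_Tbl_ComputerInventory_Key]  -- hypothetical name
      ON [Tbl_ComputerInventory] ([ComputerName], [SerialNumber])
      WITH (IGNORE_DUP_KEY = ON);
    ```

    With IGNORE_DUP_KEY = ON, an INSERT that would violate the index raises the warning "Duplicate key was ignored." and skips the offending rows instead of failing the whole statement.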

  • Scott Coleman - Wednesday, January 3, 2018 12:58 PM

    How about changing your primary key to a unique index with (ignore_dup_key = ON) ?

    Thank you, Scott, for finding the easy fix!

  • GilaMonster - Wednesday, January 3, 2018 12:57 PM

    One option:
    Instead of DISTINCT

    GROUP BY <primary key> and then decide what you want to do with each of the other columns (MIN, MAX or other aggregation)

    Another option which I posted already here is to use ROW_NUMBER() OVER(PARTITION BY [your key columns here] ORDER BY [your key columns or the columns that determine which single row you want here]).  But in any of these solutions you are going to lose data.  How do you know which of the rows with a duplicate key column has the data you really want?
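
    Applied to this thread's tables, that ROW_NUMBER() pattern would look roughly like this (the ORDER BY inside the OVER clause expresses no preference here; replace it with whichever columns decide which duplicate should win):

    ```sql
    INSERT INTO
      [Tbl_ComputerInventory]
      (
        [ISOCountryAbb3]
        , [SiteNumber]
        , [ComputerName]
        , [SerialNumber]
        , [Field5-20]
      )
    SELECT
      [ISOCountryAbb3]
      , [SiteNumber]
      , [ComputerName]
      , [SerialNumber]
      , [Fields5-20]
    FROM
      ( SELECT
          LEFT([Computer Name], 3)             AS [ISOCountryAbb3]
          , RIGHT(LEFT([Computer Name], 8), 5) AS [SiteNumber]
          , [Computer Name]                    AS [ComputerName]
          , [Service Tag]                      AS [SerialNumber]
          , [Fields5-20]
          , ROW_NUMBER() OVER
              ( PARTITION BY [Computer Name], [Service Tag]
                ORDER BY (SELECT NULL) )       AS [rn]  -- no preference among duplicates
        FROM
          [ImpTbl_ComputerInventory]
      ) AS [src]
    WHERE
      [rn] = 1;
    ```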

  • shannonphillips - Wednesday, January 3, 2018 1:28 PM

    Scott Coleman - Wednesday, January 3, 2018 12:58 PM

    How about changing your primary key to a unique index with (ignore_dup_key = ON) ?

    Thank you, Scott, for finding the easy fix!

    Easy, but not a good one. What that does is silently discard data if it matches an existing key. You don't get to know what it discards, you aren't alerted that rows were discarded, and if there are rows with duplicate keys being inserted as a single insert, it's not defined which row is kept and which discarded.

    Easy? Yes, if all you care about is that you don't get errors.

  • shannonphillips,

    You have duplicates in newly loaded records.

    First, you need to find which records have duplicate PK values:

    SELECT
      [Computer Name] AS [ComputerName]
      , [Service Tag] AS [SerialNumber]
      , COUNT(*)      AS [Dup_Cnt]
    FROM
      [ImpTbl_ComputerInventory]
    GROUP BY
      [Computer Name], [Service Tag]
    HAVING
      COUNT(*) > 1;

    Then you may wish to see the data in those records:

    SELECT
      [I].*
    FROM
      [ImpTbl_ComputerInventory] AS [I]
      INNER JOIN
      ( SELECT
          [Computer Name] AS [ComputerName]
          , [Service Tag] AS [SerialNumber]
          , COUNT(*)      AS [Dup_Cnt]
        FROM
          [ImpTbl_ComputerInventory]
        GROUP BY
          [Computer Name], [Service Tag]
        HAVING
          COUNT(*) > 1
      ) AS [D]
        ON  [D].[ComputerName] = [I].[Computer Name]
        AND [D].[SerialNumber] = [I].[Service Tag]
    ORDER BY
      [I].[Computer Name], [I].[Service Tag];

    After that you may pass this record set to someone who's in charge of data handling and ask how to resolve the conflict you've just revealed.

    _____________
    Code for TallyGenerator

  • Thanks All,

    For this use case, I don't care which of the duplicate records gets inserted.

    I don't see the option on this forum page to tag this post as resolved.

  • shannonphillips - Thursday, January 4, 2018 5:46 AM

    Thanks All,

    For this use case, I don't care which of the duplicate records gets inserted.

    I don't see the option on this forum page to tag this post as resolved.

    There isn't really.  You could see this thread resurrected several years from now.
