Insert in batches using select coalesce option

  • Hi, I am trying to insert data from multiple tables into one table using inserts in batches and with select coalesce. I am getting an Ambiguous column (id) error. Can someone please help me resolve the issue. Thank you.

    ERROR MESSAGE:
    Msg 209, Level 16, State 1, Line 26
    Ambiguous column name 'id'.

    Line:26 - , ROW_NUMBER() over(order by [id]) as 'RowNbr'

    Original Query:

    --Keeping track of inserts
    declare @rc int;
    declare @tableRows int;
    declare @batchSize int;
    declare @start int;
    declare @end int;

    checkpoint;
    waitfor delay '00:00:01:00';

    set @rc = 1;
    select @tableRows = count(*) from [dbo].[Table];
    set @batchSize = 5000;
    set @start = 1;
    set @end = @start + @batchSize - 1;

    while @rc < @tableRows
    begin
        with cte([id], [name], [dept_id], [mail], [contact], RowNbr) as
        (
            select coalesce(a.id, b.id, c.id, d.id) as id
                 , max(a.name) as Name
                 , max(b.dept_id) as Dept_Id
                 , max(c.mail) as Mail
                 , max(d.contact) as Contact
                 , ROW_NUMBER() over(order by [id]) as 'RowNbr'
            from Table1 a
            full outer join Table2 b on a.id = b.id
            full outer join Table3 c on b.id = c.id
            full outer join Table4 d on c.id = d.id
            group by coalesce(a.id, b.id, c.id, d.id)
        )
        insert into [dbo].[Table]([id], [name], [dept_id], [mail], [contact])
        select [id], [name], [dept_id], [mail], [contact]
        from cte
        where RowNbr between @start and @end
        option (maxdop 1);

        set @rc += @batchSize;
        set @start = @end + 1;
        set @end = @start + @batchSize - 1;

        checkpoint;
        waitfor delay '00:00:01:00';
    end
  • The ambiguity is in the ROW_NUMBER function: in the ORDER BY clause, the bare [id] should be written as coalesce(a.id, b.id, c.id, d.id).

    Giorgio

  • You've got four columns called id in your query, and you haven't specified which one to use in your ROW_NUMBER call. Replace "id" with "a.id", "b.id", "c.id" or "d.id".

    John
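  • To put the two suggestions above together, here is a minimal sketch of the corrected CTE (using the table and column names from the original post). Repeating the coalesce expression inside the OVER clause removes the ambiguity and, unlike ordering by a.id alone, it also stays correct for rows where a.id is NULL because of the full outer joins:

    ```sql
    -- Sketch only: Table1..Table4 and columns as in the original post.
    with cte([id], [name], [dept_id], [mail], [contact], RowNbr) as
    (
        select coalesce(a.id, b.id, c.id, d.id) as id
             , max(a.name) as name
             , max(b.dept_id) as dept_id
             , max(c.mail) as mail
             , max(d.contact) as contact
             -- qualified expression instead of the ambiguous bare [id]
             , row_number() over (order by coalesce(a.id, b.id, c.id, d.id)) as RowNbr
        from Table1 a
        full outer join Table2 b on a.id = b.id
        full outer join Table3 c on b.id = c.id
        full outer join Table4 d on c.id = d.id
        group by coalesce(a.id, b.id, c.id, d.id)
    )
    select [id], [name], [dept_id], [mail], [contact], RowNbr
    from cte;
    ```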

  • It worked. Thanks a lot.

  • Please, please do NOT use cursors for this.

    The standard construct for a batched insert/update, where you have a key (or set of columns) that you can use to filter out what has already been processed, is as follows. This is a working example; you would need to replace the table and column names with your own.

    It may also be better to insert the result of the joins into a temp table before going directly to the final table.

    if object_id('tempdb..#t1') is not null
        drop table #t1;
    if object_id('tempdb..#t2') is not null
        drop table #t2;

    -- noformat
    WITH E1(N) AS (
        SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
        SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
        SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
    ) --10E+1 or 10 rows
    , E2(N) AS (SELECT 1 FROM E1 a, E1 b) --10E+2 or 100 rows
    , E4(N) AS (SELECT 1 FROM E2 a, E2 b) --10E+4 or 10,000 rows max
    -- endnoformat
    , num as
    (
        select row_number() over (order by (select null)) as rownum
        from e4
    )
    select rownum as id
         , rand() as value
    into #t1
    from num;

    select *
    into #t2
    from #t1
    where 1 = 0;

    declare @BatchSize bigint = 100;
    declare @RowCount bigint = 1;

    while @RowCount > 0
    begin
        begin transaction; -- optional - but can help reduce the size of the transaction log if the database is in simple recovery or if log backups are very frequent

        insert into #t2 (id, value)
        select top (@BatchSize) t1.id
             , t1.value
        from #t1 t1
        left outer join #t2 t2
            on t2.id = t1.id
        where t2.id is null;

        set @RowCount = @@rowcount;

        waitfor delay '00:00:01:00'; -- optional - to allow server resources to be used by other processes; may not be required and may cause further issues

        commit; -- optional
    end
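
    Adapted to the original question, the same keyed pattern might look like the sketch below. Note the names are assumptions: #src is a hypothetical temp table holding the combined full-outer-join result, and [dbo].[Target] stands in for the poster's real destination table.

    ```sql
    -- Sketch: the keyed batch pattern applied to the original problem.
    -- #src and [dbo].[Target] are assumed names, not from the original post.
    if object_id('tempdb..#src') is not null
        drop table #src;

    -- stage the joined result once, keyed by id
    select coalesce(a.id, b.id, c.id, d.id) as id
         , max(a.name)    as name
         , max(b.dept_id) as dept_id
         , max(c.mail)    as mail
         , max(d.contact) as contact
    into #src
    from Table1 a
    full outer join Table2 b on a.id = b.id
    full outer join Table3 c on b.id = c.id
    full outer join Table4 d on c.id = d.id
    group by coalesce(a.id, b.id, c.id, d.id);

    declare @BatchSize bigint = 5000;
    declare @RowCount  bigint = 1;

    while @RowCount > 0
    begin
        -- insert only rows whose key is not yet in the target
        insert into [dbo].[Target] (id, name, dept_id, mail, contact)
        select top (@BatchSize) s.id, s.name, s.dept_id, s.mail, s.contact
        from #src s
        left outer join [dbo].[Target] t
            on t.id = s.id
        where t.id is null;

        set @RowCount = @@rowcount;
    end
    ```

    No RowNbr bookkeeping is needed here: each pass picks up the next @BatchSize rows whose key is not yet in the target, and the loop stops on its own when nothing is left to insert.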


  • Hi frederico_fonseca,

    Thank you for the sample script. I'll try to implement it. Thanks again.
