Record Count from BulkInsert

  • Hi All

    Just wondering if anyone can advise how I would get a count of the records processed by the BulkInsert task.

    There doesn't seem to be a property for it like the DataPump task has.

    Thanks

    Phill Carter

    --------------------
    Colt 45 - the original point and click interface

  • I think @@rowcount should give you the number of records inserted.

    begin tran

    select * into authors2 from authors

    select @@rowcount

    commit tran

  • quote:


    I think @@rowcount should give you the number of records inserted.

    begin tran

    select * into authors2 from authors

    select @@rowcount

    commit tran


    That gives me how many rows ended up in the destination table. It doesn't give me how many rows were processed by BulkInsert. In comparison, the DataPump task has ProgressRowCount, RowsComplete and RowsInError. However, I don't want to use the DataPump task because I'm importing 2mil+ rows from a text file.

    Thanks

    Phill Carter

    --------------------
    Colt 45 - the original point and click interface

  • Have you tried using the BatchSize option? I think if you use this option with a specific size (1000) you can find out how many batches are processed and calculate the number of records processed.

  • quote:


    Have you tried using the BatchSize option? I think if you use this option with a specific size (1000) you can find out how many batches are processed and calculate the number of records processed.


    That may work, but it raises two more questions.

    1) How do you know how many "batches" have been processed?

    2) How do you get a count of the final partial batch?

    Thanks

    Phill Carter

    --------------------
    Colt 45 - the original point and click interface

  • What about @@rowcount along with the batch size?

  • quote:


    What about @@rowcount along with the batch size?


    Let me explain my situation.

    I have a file that has around 2.2mil records in it. This file will be loaded into a table via DTS using the BulkInsert task.

    I need to report a count of how many records were in the file, how many were successfully imported, and how many failed.

    Doing a @@ROWCOUNT on the destination table will give me how many records were imported. How do I get the rest of the info?

    When using the DataPump task I can easily get all these counts using a combination of the ProgressRowCount, RowsComplete and RowsInError properties. However, using the DataPump task is not an option due to the length of time it takes to import the file.

    I can write a short VBScript that'd open the file and step through it record by record, but that'd take ages as well. (A faster alternative is sketched below.)

    Thanks

    Phill Carter

    --------------------
    Colt 45 - the original point and click interface
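
    For what it's worth, a quicker way to count the records in the file (only a rough sketch; the scratch table name and file path below are made up) is to bulk load the raw lines into a one-column table. With a single column the row terminator ends the only field, so each whole line should land as one row, and a COUNT(*) then gives the number of records in the file:

    CREATE TABLE dbo.RawLines (LineText varchar(8000) NULL)  -- lines over 8000 chars would need another approach

    BULK INSERT dbo.RawLines
    FROM 'C:\Import\datafile.txt'
    WITH (ROWTERMINATOR = '\n', TABLOCK)  -- TABLOCK helps keep a 2.2mil row load quick

    SELECT COUNT(*) AS RowsInFile FROM dbo.RawLines

    It still means reading the file twice, but a single-column bulk load is normally far quicker than stepping through the file from VBScript.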

  • Try importing all rows into a staging table, do your validation on the database table, then only import the rows that passed. It should still be fast and you will have all of the SQL Server functions available to do your validation and reporting. A rough sketch of the pattern is below.
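
    Sketch only (the table and column names are invented for illustration): flag the staging rows that fail validation, move only the clean rows across, and report the counts from the flag.

    -- assume dbo.Staging holds the raw varchar columns plus an IsValid flag
    UPDATE dbo.Staging SET IsValid = 1

    UPDATE dbo.Staging
    SET    IsValid = 0
    WHERE  ISDATE(OrderDate) = 0     -- whatever validation rules apply
       OR  ISNUMERIC(Amount) = 0

    INSERT INTO dbo.Orders (OrderDate, Amount)
    SELECT CAST(OrderDate AS datetime), CAST(Amount AS money)
    FROM   dbo.Staging
    WHERE  IsValid = 1

    SELECT COUNT(*)                                     AS RowsStaged,
           SUM(CASE WHEN IsValid = 1 THEN 1 ELSE 0 END) AS RowsPassed,
           SUM(CASE WHEN IsValid = 0 THEN 1 ELSE 0 END) AS RowsFailed
    FROM   dbo.Staging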

  • quote:


    Try importing all rows into a staging table, do your validation on the database table, then only import the rows that passed. It should still be fast and you will have all of the SQL Server functions available to do your validation and reporting.


    That's what I do now. What I need to do is report that all the data from the text file made it into the staging table (see the reconciliation sketch below).

    Thanks

    Phill Carter

    --------------------
    Colt 45 - the original point and click interface
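
    One way to produce that report (again just a sketch; names and the path are placeholders, and it assumes the load runs as a T-SQL BULK INSERT from an Execute SQL task rather than the BulkInsert task): @@ROWCOUNT returns the rows affected by the last statement, so reading it straight after the BULK INSERT should give the number of rows that actually reached the staging table, which can then be compared with the record count taken from the file.

    DECLARE @RowsLoaded int

    BULK INSERT dbo.Staging
    FROM 'C:\Import\datafile.txt'
    WITH (
        FIELDTERMINATOR = ',',   -- adjust to the file's delimiter
        ROWTERMINATOR   = '\n',
        MAXERRORS       = 100,   -- let bad rows be skipped without aborting the load
        TABLOCK
    )

    SELECT @RowsLoaded = @@ROWCOUNT   -- rows BULK INSERT actually inserted

    -- dbo.RawLines is the one-column scratch table from the earlier sketch
    SELECT @RowsLoaded                                        AS RowsImported,
           (SELECT COUNT(*) FROM dbo.RawLines)                AS RowsInFile,
           (SELECT COUNT(*) FROM dbo.RawLines) - @RowsLoaded  AS RowsFailed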
