Table has 50 million records: how to move the data to a BCP file

  • Hi. My database table grows very fast, at approximately 50 million records per day, but my database does not have enough space, so I want to create a job that moves the data to a BCP file on a daily basis and then deletes the records from my table, so that I can keep the BCP file somewhere else for future use...

    To solve this problem I want to use the BCP command, which transfers data from a table to a file, but this enormous amount of data will hurt database performance during the transfer. I read an article that suggested partitioning the table first and then transferring the data, but I don't know how to set up partitioning and do the transfer...
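    This is roughly the command I was planning to run from the job (the server, database, table and file names here are just examples):

        bcp MyDb.dbo.BigTable out D:\archive\BigTable_20140101.bcp -n -S MYSERVER -T

    (-n writes the file in native format so it can be reloaded later with bcp in, -S names the server and -T uses a trusted connection.)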

    Please help

  • I suggest you start by reading up on the topic. There are plenty of resources out there, starting with BOL and TechNet.

    I found this article very helpful.

    This one is also very interesting and looks like it will fit your needs.

    Everything depends on what you want to achieve with partitioning: if you are planning to switch the data from the "expired" partition into a separate table and then dump that table to a file, this can be a good strategy to avoid locking on the production table, but it won't spare the database engine the resource consumption during the dump itself.
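    Just to sketch the switch step (all of the object names below are made up, and it assumes your table is already partitioned by date, with an empty staging table that has the same structure and sits on the same filegroup as the partition being switched out):

        -- Move the oldest partition out of the production table into
        -- the empty staging table. SWITCH is a metadata-only operation,
        -- so it completes almost instantly regardless of row count.
        ALTER TABLE dbo.BigTable
            SWITCH PARTITION 1 TO dbo.BigTable_Staging;

    From there the staging table holds the expired rows and can be dumped to file without touching the production table.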

    Another plus of this strategy would be the possibility to truncate the table you are dumping, instead of deleting from the production table.
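    For example, once the staging table has been dumped, the job could finish with something like this (assuming xp_cmdshell is enabled on your instance; a SQL Agent CmdExec step would do the same job):

        -- Dump the switched-out rows to file, then empty the staging
        -- table. TRUNCATE is minimally logged, unlike a DELETE against
        -- the 50-million-row production table.
        EXEC master.dbo.xp_cmdshell
            'bcp MyDb.dbo.BigTable_Staging out D:\archive\BigTable_20140101.bcp -n -S MYSERVER -T';
        TRUNCATE TABLE dbo.BigTable_Staging;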

    Maybe it would be worth taking SSIS into consideration instead of BCP, since it performs significantly better for operations such as table dumps.

    Hope this helps.

    Gianluca

    -- Gianluca Sartori
