March 2, 2015 at 1:18 am
Hi All,
Can anyone help me with the issue below?
I have an xml_logs_table containing 63,534,823 rows. I need to create a package that filters the rows older than two years; from that result set I need to move only 10,000 rows per run to another destination table and delete those 10,000 rows from the source table. This process needs to be scheduled to run every day.
Can anyone provide details on how to resolve this?
A screenshot of the solution would be helpful for understanding.
Thanks in advance.
March 2, 2015 at 2:07 am
In my opinion the easiest option would be to write a DELETE statement and then use the OUTPUT clause to put those rows into another table.
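A rough sketch of that approach, assuming an archive table named dbo.xml_logs_archive with the same columns as the source and a log_date column to filter on (both names are assumptions, adjust to the actual schema):

```sql
-- Sketch only: dbo.xml_logs_archive and log_date are assumed names.
DELETE TOP (10000)
FROM dbo.xml_logs_table
OUTPUT deleted.*              -- OUTPUT captures the rows being deleted...
INTO dbo.xml_logs_archive     -- ...and inserts them into the destination table
WHERE log_date < DATEADD(YEAR, -2, GETDATE());
```

Note that TOP without an ORDER BY deletes an arbitrary 10,000 of the qualifying rows, which is fine here since every qualifying row will eventually be processed. Scheduled daily (for example via a SQL Server Agent job), each run moves at most 10,000 rows older than two years in a single statement.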
OUTPUT clause in INSERT/UPDATE/DELETE statements
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 2, 2015 at 2:48 am
Make sure you've got a sufficiently-sized transaction log and appropriate indexes for this!
Regards
Lempster
March 2, 2015 at 2:49 am
Lempster (3/2/2015)
Make sure you've got a sufficiently-sized transaction log and appropriate indexes for this!
Regards
Lempster
For 10,000 rows the transaction log doesn't need to be that big.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 2, 2015 at 4:52 am
Hi,
Can you please let me know how to do this with SSIS packages?
March 2, 2015 at 4:54 am
Indexes are already present on the table to make searches fast on the reporting side...
March 2, 2015 at 5:04 am
sunil9372 (3/2/2015)
Hi, can you please let me know how to do this with SSIS packages?
You can use an Execute SQL Task to launch the DELETE statement.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP