The requirement is to export rows to a text file in a given path when they meet a condition. For example, when there is a transaction for a particular store, that row has to be written out as a text file. If there are 1000 rows for that particular store, I need 1000 text files created in that path, with only one row in each. Each text file should contain only the latest row inserted for that store. The text files created will be read by other software.
Bloody RBAR 3rd party software. Who comes up with this stuff?
Anyway, IMHO, the thing to do is to have a trigger that simply captures the PK of the table and writes it to a staging table. The staging table should have an IDENTITY column. Then, have a per-minute job that checks the staging table to see if any rows are present. If there are, process them in IDENTITY-column order, doing the xp_CmdShell thing once for each row and deleting each row from the staging table as soon as it has been processed.
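The above could be sketched roughly as follows. This is a minimal, untested sketch: the table dbo.Transactions, its columns (TransactionID, StoreID), the export path C:\Export\, and the server name are all hypothetical placeholders — substitute your real names, and note that xp_CmdShell must be enabled and the SQL Server service account needs rights to the export folder.

```sql
-- Staging table: the IDENTITY column preserves arrival order for processing.
CREATE TABLE dbo.ExportStaging
(
    StagingID     INT IDENTITY(1,1) PRIMARY KEY,
    TransactionID INT NOT NULL          -- PK of the row to export (hypothetical name)
);
GO

-- Trigger captures only the PK of each inserted row.
-- It is set-based, so multi-row inserts are handled correctly.
CREATE TRIGGER trg_Transactions_CaptureForExport
ON dbo.Transactions
AFTER INSERT
AS
SET NOCOUNT ON;
INSERT INTO dbo.ExportStaging (TransactionID)
SELECT TransactionID FROM inserted;
GO

-- Body of the per-minute job: process staged rows oldest-first,
-- writing each row to its own file via BCP, then delete the staged row.
DECLARE @StagingID INT, @TransactionID INT, @Cmd VARCHAR(8000);

WHILE 1 = 1
BEGIN
    SELECT TOP (1)
           @StagingID     = StagingID,
           @TransactionID = TransactionID
      FROM dbo.ExportStaging
     ORDER BY StagingID;

    IF @@ROWCOUNT = 0 BREAK;   -- staging table is empty; nothing to do

    -- One file per row; file name keyed by the PK so the 3rd-party reader
    -- always sees the latest row for that key.
    SELECT @Cmd = 'bcp "SELECT * FROM MyDatabase.dbo.Transactions '
                + 'WHERE TransactionID = ' + CONVERT(VARCHAR(10), @TransactionID)
                + '" queryout "C:\Export\Txn_'
                + CONVERT(VARCHAR(10), @TransactionID) + '.txt" -c -T';

    EXEC master.dbo.xp_CmdShell @Cmd, NO_OUTPUT;

    DELETE FROM dbo.ExportStaging
     WHERE StagingID = @StagingID;
END;
```

Yes, the job body is a loop, but the file-per-row requirement forces RBAR at the export step; the trigger itself stays set-based.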
If you need to keep track of which rows were sent, add a DateSent column to the original table and update it as you do the xp_CmdShell thing for each row in the staging table.
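A hedged sketch of that audit column, using the same hypothetical dbo.Transactions table and @TransactionID variable as before:

```sql
-- Add the audit column once.
ALTER TABLE dbo.Transactions ADD DateSent DATETIME NULL;
GO

-- Inside the job loop, right after the successful xp_CmdShell call,
-- stamp the row that was just exported.
UPDATE dbo.Transactions
   SET DateSent = GETDATE()
 WHERE TransactionID = @TransactionID;
```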
As for the per-minute job-history PITA that Sergiy talked about, create another job that scans the jobs to find any that run on a per-minute basis and deletes their job history automatically after, say, 24 hours or so.
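A simplified sketch of that cleanup for a single known job, using the documented msdb.dbo.sp_purge_jobhistory system procedure; the job name here is hypothetical, and a fuller version would scan msdb.dbo.sysjobschedules for minute-frequency schedules instead of hard-coding the name.

```sql
-- Purge history older than 24 hours for the per-minute export job.
DECLARE @Oldest DATETIME = DATEADD(HOUR, -24, GETDATE());

EXEC msdb.dbo.sp_purge_jobhistory
     @job_name    = 'ExportStagedRows',  -- hypothetical job name
     @oldest_date = @Oldest;
```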
RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".
First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
"Change is inevitable... change for the better is not".
"Dear Lord... I'm a DBA so please give me patience because, if you give me strength, I'm going to need bail money too!"
How to post code problems
How to Post Performance Problems
Create a Tally Function (fnTally)