Issues with SQL Trigger

  • I have created a SQL trigger as below.

    create trigger testtrigger on testdb
    for insert
    as
    declare @sql varchar(8000)
    SELECT @sql = 'bcp "select * from testdb" queryout c:/test.txt -c -t -T -S localhost'
    exec xp_cmdshell @sql
    go

    But when a row is inserted into testdb, the text file is created, but there is no content. Can someone please help me resolve this?

  • It's a terrible idea to bcp from inside of a trigger.

    Never import/export data, send emails, or do anything like that from trigger code.

    You may create a staging table, populate it with the data from the trigger, and use a job to output the data to a file.

    But never do it directly from a trigger.
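    The "use a job" half of that advice can be set up with SQL Server Agent. A minimal sketch, assuming a hypothetical job name and a hypothetical stored procedure (dbo.uspExportQueuedRows) that does the actual file writing:

    ```sql
    USE msdb;
    GO
    -- Create the job shell (all names here are illustrative assumptions).
    EXEC dbo.sp_add_job @job_name = N'ExportQueueToFiles';

    -- The step runs a stored procedure that reads the staging table and writes
    -- the files; the trigger itself never touches the file system.
    EXEC dbo.sp_add_jobstep
          @job_name      = N'ExportQueueToFiles'
        , @step_name     = N'Export pending rows'
        , @subsystem     = N'TSQL'
        , @database_name = N'testdb'
        , @command       = N'EXEC dbo.uspExportQueuedRows;';  -- hypothetical proc

    -- Run every 5 minutes, every day.
    EXEC dbo.sp_add_jobschedule
          @job_name             = N'ExportQueueToFiles'
        , @name                 = N'Every5Minutes'
        , @freq_type            = 4    -- daily
        , @freq_interval        = 1
        , @freq_subday_type     = 4    -- minute-based interval
        , @freq_subday_interval = 5;

    -- Attach the job to the local server so the agent will run it.
    EXEC dbo.sp_add_jobserver @job_name = N'ExportQueueToFiles', @server_name = N'(local)';
    ```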

  • Thank you.

    Can you please provide sample steps? I haven't done this before.

  • Are you able to provide a sample of how to do this? I need the text file created live, when the row is inserted.

  • Do you actually have a table called "TestDB"?  If so, which database is it in and what in your BCP command identifies that database?

    I also agree with Sergiy.  Having this kind of thing in a trigger is a form of "Death by SQL".  If something goes wrong, the trigger will hang forever and, at the very least, the table will be totally unusable.  It could affect the whole database.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".
    "Dear Lord... I'm a DBA so please give me patience because, if you give me strength, I'm going to need bail money too!"

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Yes, for my testing I have a database called testdb and I'm testing it with manual inserts. If this is not the right way of doing it, can you please advise on the recommended way of achieving my requirement?

    I need one text file for each row when the SQL meets the condition.

  • You may create a staging table, populate it with the data from a trigger and use a job to output the data to a file.

    I guess use the Inserted virtual table inside the trigger to append to a permanent table (maybe add a datetime column that you can update so you know which records have been "processed").

    CREATE TABLE permanent_table(recordID INT, col1, col2, col3,..., dateadded SMALLDATETIME DEFAULT GETDATE())

    then in the trigger,

    INSERT INTO permanent_table(recordID, col1, col2, col3)
    SELECT recordID, col1, col2, col3 FROM Inserted

    then you could just process the "permanent_table" data with a cursor (if you have to do row-wise actions).
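    The steps above can be sketched end-to-end. Table, trigger, and column names here are illustrative assumptions, not from the original post:

    ```sql
    -- Staging ("permanent") table that the trigger feeds and a job drains.
    CREATE TABLE dbo.ExportQueue
    (
        queueID   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        storeID   INT           NOT NULL,
        txAmount  DECIMAL(18,2) NOT NULL,
        dateAdded DATETIME      NOT NULL DEFAULT GETDATE(),
        processed BIT           NOT NULL DEFAULT 0   -- lets the job mark rows done
    );
    GO

    -- The trigger only copies rows from Inserted; no file I/O happens here,
    -- so an export failure can never block the original INSERT.
    CREATE TRIGGER dbo.trgQueueForExport ON dbo.StoreTransaction
    FOR INSERT
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.ExportQueue (storeID, txAmount)
        SELECT i.storeID, i.txAmount
        FROM Inserted AS i;
    END
    GO
    ```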

  • +1 for an interim table. I use a similar concept for sending emails; it was not feasible to do in the application, due to reasons I won't go into.

    If it's of any use to someone searching the forum, I use the following process to queue up and send emails. Originally it was based on just new case creations in a legal case management application, but I did it in a way that would allow it to be used for other updates/insertions on other tables by adding further "light touch" triggers.

    Disclaimer, not infinitely scalable for obvious reasons. It has worked without error or performance impact for just over 3 years, sending about 500-1000 emails a week, including after occasional set based updates and inserts.

    Create the "queue" table:

    CREATE TABLE dbo.udEmailQueue
    (
        emID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        emType NVARCHAR(20) NOT NULL,
        emObjType NVARCHAR(15) NOT NULL,
        emObjID BIGINT NOT NULL,
        emReqested DATETIME NOT NULL,
        emIsSent BIT NOT NULL,
        emIsFailed BIT NOT NULL
    )

    Example light-touch trigger: this one fires simply when a new case is created. Similar ones exist for other scenarios, but where possible I use sprocs which test for various conditions and insert records into the queue.

    CREATE TRIGGER [config].[tgrFileNumberGenerator] ON [config].[dbFile]
    FOR INSERT
    AS

    INSERT INTO udEmailQueue
    (
    emType
    , emObjType
    , emObjID
    , emReqested
    , emIsSent
    , emIsFailed
    )
    SELECT
    N'NBICreateNotify' --What sort of email are we sending
    , N'FILE' --
    , i.fileID
    , GETDATE()
    , 0
    , 0
    FROM Inserted AS i

    Create a logging table

    CREATE TABLE dbo.udEmailQueueLog
    (
        emqlID BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
        emqlObjID BIGINT NOT NULL,
        emqlObjType NVARCHAR(15) NOT NULL,
        emqlType NVARCHAR(20) NOT NULL,
        emqlDesc NVARCHAR(255) NOT NULL,
        emqlDate DATETIME NOT NULL,
        emqlRows INT NOT NULL
    )

    Create an sproc to send anything in the queue. I have had to severely redact this, so I hope it still makes sense. It is called by a job which runs every minute (suitable for our uses, and why I say it will not scale infinitely!). The other sprocs I mentioned, in place of the triggers, run in earlier steps of the job and populate the email queue; another send-mail sproc with different criteria runs after it.

    Successes and failures are logged and can be easily joined to the source table(s) if required.

    CREATE PROCEDURE [dbo].[udspNBIemailSendCreated]
    AS

    DECLARE
    @EmailType AS NVARCHAR(20) = 'NBICreateNotify'
    , @BodyText as varchar(max)
    , @SubjectText as varchar(200)
    , @OjectType as NVARCHAR(15)
    , @FileID AS BIGINT
    , @CaseIdentifier AS NVARCHAR(20)
    , @RecipientsText as nvarchar(1000)
    , @CCtext as nvarchar(1000)
    , @Created as NVARCHAR(30)
    , @Rows AS INT

    SELECT TOP 1
    @FileID = df.fileID
    , @OjectType = 'FILE'
    , @RecipientsText = ColumnValues
    , @CCtext = CASE
    WHEN tests
    THEN ColumnValue
    ELSE OtherColumnValue
    END
    , @SubjectText = CASE
    WHEN tests
    THEN 'One Subject ' +
    ColumnValues
    ELSE 'DifferentSubject ' +
    ColumnValues
    END
    , @BodyText = 'Your message ' + AnyColumnValues +
    ' more message text ' + AnyOtherColumnValues +
    'etc ' + '<br> <br>' + -- Any HTML formatting
    'more message text '
    FROM
    udEmailQueue AS ueq
    INNER JOIN dbFile AS df
    ON df.fileID = ueq.emObjID
    AND ueq.emObjType = 'FILE'
    /*
    Add any other join conditions required to populate variables in email body/subject/recipients
    */
    WHERE
    ueq.emIsSent = 0
    AND ueq.emType = @EmailType
    ORDER BY ueq.emIsFailed --will retry, but failed ones will not stop new ones being sent

    --------------------------------------------------------------------------------------------

    SET @Rows = @@ROWCOUNT

    BEGIN TRY
    IF @Rows > 0
    BEGIN
    EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'YourDBmailProfile'
    , @recipients = @RecipientsText
    , @copy_recipients = @CCtext
    , @subject = @SubjectText
    , @body = @BodyText
    , @body_format = 'HTML'
    , @importance = 'HIGH'
    , @from_address = 'DoNotReply@yourOrg'
    , @reply_to = 'DoNotReply@yourOrg'

    INSERT INTO udEmailQueueLog
    (
    emqlobjID
    , emqlObjType
    , emqlType
    , emqlDesc
    , emqlDate
    , emqlRows
    )
    VALUES (
    @FileID
    , N'FILE'
    , @EmailType
    , N'NBI Creation Notification email sent to ' + @RecipientsText
    , GETDATE()
    , @Rows
    );

    UPDATE udEmailQueue
    SET
    emIsSent = 1
    , emIsFailed = 0
    WHERE
    emObjID = @FileID
    AND emType = @EmailType;
    END
    END TRY

    BEGIN CATCH
    INSERT INTO udEmailQueueLog
    (
    emqlobjID
    , emqlObjType
    , emqlType
    , emqlDesc
    , emqlDate
    , emqlRows
    )
    VALUES (
    @FileID
    , N'FILE'
    , @EmailType
    , N'Failed to send NBI Creation notification email for ' + @CaseIdentifier
    , GETDATE()
    , @Rows
    )

    UPDATE udEmailQueue
    SET
    emIsFailed = 1
    WHERE
    emObjID = @FileID
    AND emType = @EmailType;
    END CATCH

    Please forgive any code smells! I know it could probably be more efficient.

    "Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it. When we enquire into any subject, the first thing we have to do is to know what books have treated of it. This leads us to look at catalogues, and at the backs of books in libraries."
    — Samuel Johnson
    I wonder, would the great Samuel Johnson have replaced that with "GIYF" now?

  • This is called by a job, which runs every minute

    "Every minute" jobs create quite a lot of trouble by filling up the job history in msdb.

    I found it more effective to dynamically launch such a job:

    1. Check if the "next run time" for the job is within 2 minutes.
    2. If yes, do nothing and return. If not, create a one-off schedule for the job to run at the beginning of the next minute.

    msdb is known for its inefficiency, so I shift the current time by 30 seconds, giving the job engine at least 30 seconds to complete all the relevant manipulations and not miss the schedule.
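    That check-and-reschedule pattern might look something like this. The job name is an illustrative assumption, and note that one-off schedules created this way accumulate in msdb and need periodic cleanup:

    ```sql
    -- Launch the queue-processing job on demand instead of every minute.
    DECLARE @next DATETIME;

    -- 1. Find the job's next scheduled run, if any.
    SELECT @next = msdb.dbo.agent_datetime(s.next_run_date, s.next_run_time)
    FROM msdb.dbo.sysjobs AS j
    JOIN msdb.dbo.sysjobschedules AS s ON s.job_id = j.job_id
    WHERE j.name = N'ProcessEmailQueue'       -- hypothetical job name
      AND s.next_run_date > 0;

    -- 2. Already due within 2 minutes? Do nothing.
    IF @next IS NULL OR @next > DATEADD(MINUTE, 2, GETDATE())
    BEGIN
        -- Shift by at least 30 seconds, then truncate to the start of that
        -- minute, so the agent has time to register the schedule before it fires.
        DECLARE @start DATETIME =
            DATEADD(MINUTE, DATEDIFF(MINUTE, 0, DATEADD(SECOND, 90, GETDATE())), 0);
        DECLARE @d INT = CONVERT(INT, CONVERT(CHAR(8), @start, 112));  -- yyyymmdd
        DECLARE @t INT = DATEPART(HOUR, @start) * 10000
                       + DATEPART(MINUTE, @start) * 100;               -- hhmmss

        EXEC msdb.dbo.sp_add_jobschedule
              @job_name          = N'ProcessEmailQueue'
            , @name              = N'OneOff'
            , @freq_type         = 1       -- run once
            , @active_start_date = @d
            , @active_start_time = @t;
    END
    ```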

  • n.subbuu wrote:

    Yes for my testing, I have database called tesdb and testing it by manual insert. So if this is not the right way of doing, can you please advise on what's the recommended way of achieving my requirement.

    I need one text file for each row when the sql meets the condition.

    If what you say is true, then your query won't work even if you run it from SSMS while you're in the testdb database.  There's no sense in getting into xp_CmdShell and email and triggers, etc, etc, if the query you wrote won't work without any of that other stuff.

    In other words, your query is broken because you're trying to select from a database and not from a table in the database.  Fix your broken query first.

    --Jeff Moden



  • To be honest, I think this whole task might be a mistake in the making.  It sounds like someone is trying to build a homegrown version of replication.  What is the real purpose of needing a text file for every row that's being inserted?  I ask because, if we knew the ultimate purpose behind all of this, we might actually be able to recommend a better way.

    --Jeff Moden



  • The requirement is to export rows to a text file in a path when they meet the condition. Say, when there is a transaction for a particular store, the row has to be written out as a text file. If there are 1000 rows for that particular store, I need 1000 text files to be created in a path, each containing only one row. The text file should have only the latest row inserted for that particular store. The text files created will be read by other software.
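    One way the job step could produce one file per row, sketched under the assumption of a hypothetical staging table (dbo.ExportQueue, keyed by queueID) that the trigger populates. xp_cmdshell must be enabled, and the SQL Server service account needs write access to the target folder:

    ```sql
    DECLARE @id INT, @cmd VARCHAR(4000);

    DECLARE rowcur CURSOR LOCAL FAST_FORWARD FOR
        SELECT queueID FROM dbo.ExportQueue WHERE processed = 0;

    OPEN rowcur;
    FETCH NEXT FROM rowcur INTO @id;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- One file per row. Note the query names a table (and filters to one
        -- row), unlike the original "select * from testdb".
        SET @cmd = 'bcp "SELECT * FROM testdb.dbo.ExportQueue WHERE queueID = '
                 + CAST(@id AS VARCHAR(10))
                 + '" queryout "c:\export\row_' + CAST(@id AS VARCHAR(10))
                 + '.txt" -c -t, -T -S localhost';
        EXEC master..xp_cmdshell @cmd;

        -- Mark the row done so it is never exported twice.
        UPDATE dbo.ExportQueue SET processed = 1 WHERE queueID = @id;

        FETCH NEXT FROM rowcur INTO @id;
    END
    CLOSE rowcur;
    DEALLOCATE rowcur;
    ```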

  • How do I move only the new data to the new table?

  • I don't get it.

    You say that if there are 1000 rows for a particular store, then you need to create 1000 files, one for each transaction.

    And then in the very next sentence you say that the file must have only the latest row for any particular store. The latest can only be one row, not 1000.

    Can you explain this, please?

  • n.subbuu wrote:

    How do I move only the new data to new table?

    what is “new data”?
