June 9, 2011 at 4:49 pm
Hi everybody,
Maybe my question will sound a little crazy, but I really need this.
My question: is there any way to read a table's data as it was before the current transaction began, without using a trigger?
Something like this:
Let's say Table1 contains:
Col1  Col2  Col3
----- ----- -----
1     9     10
BEGIN TRAN
Update Table1
SET Col1 = 9, Col2 = 2,Col3 = 3
WHERE Col1 = 1
-- Do a SELECT here that returns Table1's data as it was before this transaction began: {1, 9, 10}
COMMIT
June 9, 2011 at 5:07 pm
If you are using row-versioning isolation (SNAPSHOT ISOLATION or READ_COMMITTED_SNAPSHOT), readers will see the data as of the last committed transaction rather than the uncommitted changes of in-flight transactions.
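For example, a minimal sketch of the other-connection case (MyDb stands in for your database name, and switching READ_COMMITTED_SNAPSHOT on requires that no other connections are active in the database):

-- Once, at the database level:
ALTER DATABASE MyDb SET READ_COMMITTED_SNAPSHOT ON

-- Connection 1: start the update but do not commit yet
BEGIN TRAN
UPDATE Table1
SET Col1 = 9, Col2 = 2, Col3 = 3
WHERE Col1 = 1

-- Connection 2: returns the last committed row {1, 9, 10} instead of blocking
SELECT Col1, Col2, Col3
FROM Table1
WHERE Col1 = 1

-- Connection 1:
COMMIT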
June 9, 2011 at 5:39 pm
I tried it, but it doesn't work!!
SET TRANSACTION ISOLATION LEVEL READ COMMITTED
begin TRAN
UPDATE Table1
SET Col1 = 10
WHERE Col1 = 1
SELECT *
FROM Table1
WHERE Col1 = 1
Commit
Can you give me an example??
June 9, 2011 at 5:51 pm
My assumption was that you were trying to view the data from another connection.
You can use the OUTPUT clause of the UPDATE statement to get what you want.
begin TRAN
UPDATE Table1
SET Col1 = 10
OUTPUT deleted.Col1
WHERE Col1 = 1
Commit
June 11, 2011 at 2:57 am
Yes, I want to access the data through the same connection,
but I want to access the uncommitted table data through another SQL statement:
begin TRAN
UPDATE Table1
SET Col1 = 10
WHERE Col1 = 1

-- Option 1: a plain SELECT
SELECT *
FROM Table1
WHERE Col1 = 1

-- OR Option 2: a procedure that takes the old and new data, finds the difference, and saves it into another table
EXEC SPName

Commit
Thanks a lot.
June 11, 2011 at 6:55 am
Read up on the OUTPUT clause as someone previously suggested. It does exactly what you want to do.
It will enable you to store before/after data from every changed row into a table.
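For example, a rough sketch capturing both old and new values into a table variable (the variable and column names are just illustrative):

DECLARE @Changes TABLE (OldCol1 INT, NewCol1 INT);

BEGIN TRAN
UPDATE Table1
SET Col1 = 10
OUTPUT deleted.Col1, inserted.Col1 INTO @Changes (OldCol1, NewCol1)
WHERE Col1 = 1

-- The before/after values are now available on the same connection, inside the transaction
SELECT OldCol1, NewCol1 FROM @Changes
COMMIT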
__________________________________________________
Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
Stop, children, what's that sound? Everybody look what's going down. -- Stephen Stills
June 11, 2011 at 9:54 am
If it's the same connection, why do you have to do the update first?
June 11, 2011 at 3:44 pm
My idea is to call an SP that saves the old data and the new data into a "Trace Log (Audit Trail)" table. This SP takes parameters such as {Table Name, ID of data}, so I need to make a generic SP that handles the given table's data before and after the update.
June 11, 2011 at 4:31 pm
But you know what the new data is, so why can't you send in the audit first? Or select the OLD data and store it before the update?
If it doesn't need to always happen, then it doesn't need a trigger, but the OUTPUT clause is built for returning this data for use in auditing.
June 11, 2011 at 9:56 pm
I know the old data and the new data, but I need to build one stored procedure to handle any change to the data, whatever the table, and save the changes into a "TraceLog" table.
For example:
CREATE Procedure UpdateAddresses
@AddressID NUMERIC(8,0) = NULL,
@AddressDesc NVARCHAR(250) = NULL
AS
BEGIN
BEGIN TRAN
UPDATE Address
SET AddressDesc = @AddressDesc
WHERE AddressID = @AddressID
EXEC SaveTraceLog @TableName = 'Address', @ID = @AddressID
COMMIT
END
GO
CREATE Procedure UpdateContacts
@ContactID NUMERIC(8,0) = NULL,
@ContactFName NVARCHAR(50) = NULL,
@ContactSName NVARCHAR(50) = NULL,
@ContactTName NVARCHAR(50) = NULL,
@ContactLName NVARCHAR(50) = NULL,
@ContactHPhone NVARCHAR(20) = NULL,
@ContactMPhone NVARCHAR(20) = NULL,
@ContactWPhone NVARCHAR(20) = NULL,
@AddressID NUMERIC(8,0) = NULL
AS
BEGIN
BEGIN TRAN
UPDATE Contacts
SET ContactFName =@ContactFName,
ContactSName = @ContactSName,
ContactTName = @ContactTName,
ContactLName = @ContactLName,
ContactHPhone = @ContactHPhone,
ContactMPhone = @ContactMPhone,
ContactWPhone = @ContactWPhone,
AddressID = @AddressID
WHERE ContactID = @ContactID
EXEC SaveTraceLog @TableName = 'Contacts', @ID = @ContactID
COMMIT
END
GO
The SaveTraceLog stored procedure should take the given table's committed and uncommitted data, find the difference, and insert it into the TraceLog table. This is a simple example; I have a big system containing a lot of SPs that have to handle data changes. My problem now is: how do I handle the uncommitted data in the same connection?
June 12, 2011 at 7:26 am
Saif:
What we are hearing is that you want to audit the changes: logging the data before the update is committed along with the data after the update is committed. What we are trying to tell you is the best way to do this is simply to add an OUTPUT clause to your update statement.
If you will please just read up on it here http://msdn.microsoft.com/en-us/library/ms177564.aspx, you will find that it is precisely the tool you need. It gives you access to the pre-update data using column names with the DELETED prefix, and at the same time gives you access to the updated values using the INSERTED prefix. You can use it to insert directly into your TraceLog table, or to a worktable that you can then use as a source of data for an insertion into TraceLog.
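For instance, a rough sketch along the lines of your UpdateAddresses procedure (the TraceLog column names here are only a guess at your audit schema):

UPDATE Address
SET AddressDesc = @AddressDesc
OUTPUT 'Address', deleted.AddressID, deleted.AddressDesc, inserted.AddressDesc, GETDATE()
INTO TraceLog (TableName, RowID, OldValue, NewValue, ChangedAt)
WHERE AddressID = @AddressID

Note that the target of OUTPUT ... INTO cannot have enabled triggers or take part in foreign keys, so a simple standalone TraceLog table (or a worktable) is the usual target.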
While we want you to be able to accomplish your objective, we are not going to waste a lot of time trying to help you figure out how to do something the hard way. Please have a look at the link above, and then let us know if you have any questions. Good luck.
__________________________________________________
Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
Stop, children, what's that sound? Everybody look what's going down. -- Stephen Stills
June 12, 2011 at 8:00 am
I was trying to simplify the question from the beginning, and I was 100% sure I could find an answer here.
FYI, I searched a lot before writing this question.
Anyway, thanks for your cooperation.
June 12, 2011 at 8:22 am
Steve Jones - SSC Editor (6/11/2011)
but i need to build one Stored procedure to handle any changes on data whatever was the table
Ah... no. That would require some fairly complicated Dynamic SQL that would likely slow things down for you quite a bit. For the type of auditing you're talking about, I do actually recommend that you put a dedicated, hardcoded trigger on each table that you want to audit.
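For example, a bare-bones sketch of a dedicated trigger for the Address table (the TraceLog columns are assumptions about your audit schema):

CREATE TRIGGER trg_Address_Audit
ON Address
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- One audit row per updated Address row, with the old and new description side by side
    INSERT INTO TraceLog (TableName, RowID, OldValue, NewValue, ChangedAt)
    SELECT 'Address', d.AddressID, d.AddressDesc, i.AddressDesc, GETDATE()
    FROM deleted AS d
    JOIN inserted AS i ON i.AddressID = d.AddressID;
END
GO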
--Jeff Moden
Change is inevitable... Change for the better is not.
June 12, 2011 at 8:28 am
Thanks a lot.
I have almost finished my idea of handling it from another connection using "READ_COMMITTED_SNAPSHOT", which "SSCrazy" mentioned. I will test it and then decide.
I appreciate your help.
Thanks again to all,
June 12, 2011 at 9:39 am
If you are trying to log all data changes from any column in any table, you are going to make coding this very complicated.
You'd be better off building a template for your proc, and then building a script that reads a table's metadata and generates the code for that table's proc. Or for all tables, which is just a loop over the same thing (see the sketch at the end of this post).
Trying to do this in one sproc because you think it's more efficient will create a complex sproc that is more likely to have bugs.
Also, how are you planning to store the audit data? In XML? Have you considered the size of this data? If you are grabbing changes, you could easily end up with more audit data than actual data over time, which is why people tend not to blanket-audit everything that changes.
I would suggest that you start a new thread on how you plan to audit things with some examples of tables and get feedback there. Then you can work on a script to build sprocs to handle the auditing.
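For what it's worth, a very rough sketch of the generator idea (assumes SQL Server 2005 or later; @TableName and what you do with the generated column list are placeholders for your own template):

DECLARE @TableName SYSNAME;
DECLARE @ColumnList NVARCHAR(MAX);

SET @TableName = N'Contacts';

-- Build "deleted.[Col], inserted.[Col], ..." for every column of the table
SELECT @ColumnList = STUFF((
    SELECT ', deleted.' + QUOTENAME(c.name) + ', inserted.' + QUOTENAME(c.name)
    FROM sys.columns AS c
    WHERE c.object_id = OBJECT_ID(@TableName)
    ORDER BY c.column_id
    FOR XML PATH('')), 1, 2, '');

-- Splice @ColumnList into your proc template's OUTPUT clause, PRINT the full
-- generated CREATE PROCEDURE statement, review it, then run it.
PRINT @ColumnList;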