Reading SQL Server's Transaction Log

  • Thought so. Are they still doing data warehousing, or just the identity stuff?

  • Not sure. It was folded into IBM and I haven't seen any other information about their log reading tech.

  • Hi Kraig

    Thank you very much for the very useful article on transaction log reading. I have a question about the update and delete commands - how can we handle these operations?

    Thanks

  • TRACEY-320982 (11/21/2010)


    With this information would you be able to replicate the data to another database. Most of the software we use does not have primary keys so SQL Replication is not an option.

    Mirroring and snapshots are not an option, as the reporting database needs to be up 24/7 and cannot be down while snapshots run or log transactions are applied on the reporting database.

    Just curious

    Not a good idea to build production functionality on undocumented features.

  • fn_dblog changed slightly between SQL 2005 and SQL 2008. This was, of course, without notification or documentation. If I remember correctly, it changed from 2000 to 2005 as well. I don't know if it's changed from 2008 to 2008 R2, but it could have. Also haven't checked Denali (SQL v 11) yet.

    I've used it to test whether a log backup was needed on databases that have intermittently high activity amid long periods of no activity, comparing LSNs in the log to data in the backup history in msdb. Even then, I wouldn't use it in anything where an error message mattered much. Too much chance of it changing again.
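    That kind of check can be sketched roughly as below. This is a sketch only: fn_dblog is undocumented and its output shape has changed between versions, and the comparison step is deliberately left as a comment.

    ```sql
    -- Newest LSN currently in the active log (fn_dblog is undocumented;
    -- [Current LSN] is a fixed-width hex string, so MAX() orders correctly):
    SELECT MAX([Current LSN]) AS NewestLogLSN
    FROM fn_dblog(NULL, NULL);   -- NULL, NULL = entire active portion of the log

    -- Last LSN captured by a log backup, from the documented msdb history tables:
    SELECT TOP (1) last_lsn, backup_finish_date
    FROM msdb.dbo.backupset
    WHERE database_name = DB_NAME()
      AND type = 'L'             -- 'L' = transaction log backup
    ORDER BY backup_finish_date DESC;

    -- Note: [Current LSN] is a colon-separated hex string, while last_lsn is
    -- numeric(25,0); comparing the two means converting the hex parts to decimal.
    ```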

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • Mike C (11/22/2010)


    I'd like to stress a point the author mentioned in passing: this function is UNDOCUMENTED. What does that really mean? Well, to summarize:

    1) Microsoft does not support the function. If you run into issues in production with this function, the first solution you will probably get is to remove the dependency on this function. Apart from that you're probably pretty much on your own.

    2) It could have undesirable side effects. Blocking comes immediately to mind for this function.

    3) It may not be available/accessible in the next release. That is, it can be pulled at any time, including major release, minor release, or even service pack. Sp_makewebtask anyone?

    You mentioned Paul Randal, Kimberly Tripp and Isaac Kunen in your article, but I've never seen a recommendation from them to use this in production. Every time I've seen them use this it has been to demo one-off troubleshooting. You may want to clarify this before people start asking how they can use this to do things like create their own transaction log monitoring or home-baked custom replication systems in production environments. Nobody wants to waste money on a CSS call only to find out the code they've written using undocumented functionality is not supported.

    TRACEY-320982

    Mike and Phil's points about these commands being "Undocumented" and unsupported cannot be stressed enough, obviously. But a very popular blogger once said "It's a physical-record locator function! Undocumented and unsupported (obviously), but hey, some of the best features are :-)" -Paul S. Randal

    fn_dblog has changed over the different versions of SQL Server, but it also has survived all of the versions to date as well. It also is the basis for many of the third party applications available, so you be the judge.

    I've given you some tools that you can use for your exploration; what you do with them is limited only by your knowledge and imagination. If you've got the wherewithal to build a replication system or some other mechanism using it, then by all means do so. But do keep in mind there are risks to such an approach that can't be ignored.

    When questioned about his work to come up with the first light bulb, Thomas Edison replied, “I haven't failed, I've found 10,000 ways that don't work.”

  • From the perspective of a professional DBA and Dev, this is useless. First, it is troublesome. Second, it cannot guarantee pulling data correctly. Third, it does not give enough information for users.

    Usually, we need a tool to check online logs or backup logs to find changes made by users for specific tables. Currently, ApexSQL log is the right tool for it. It is around $1300. It provides detailed information for all insert, delete and update operations including schema changes. For example, for update operations, you will see each updated record's old and new data information together with user, time and so on. It is good enough for monitoring OLTP change and is very convenient to use.

  • fat wallet (11/22/2010)


    From the perspective of a professional DBA and Dev, this is useless. First, it is troublesome. Second, it cannot guarantee pulling data correctly. Third, it does not give enough information for users.

    Usually, we need a tool to check online logs or backup logs to find changes made by users for specific tables. Currently, ApexSQL log is the right tool for it. It is around $1300. It provides detailed information for all insert, delete and update operations including schema changes. For example, for update operations, you will see each updated record's old and new data information together with user, time and so on. It is good enough for monitoring OLTP change and is very convenient to use.

    You go too far, sir. This article is not "useless". Knowing how to read this function's output can help when troubleshooting specific issues. If you need enterprise-class monitoring or you need a regular "pull" of data then yes, you need a third-party tool. If you're trying to troubleshoot one specific issue or just poking around trying to learn a bit about SQL Server internals on the side, this article just saved you $1300.

    I wouldn't recommend building functionality on top of this function, for the reasons previously stated. Basically it is undocumented and therefore unsupported. It can also be dropped or its behavior/parameters/output changed without warning.

  • Oh, for that matter, it works with 2008 R2 and Denali CTP1.

    DBCC PAGE may be undocumented, but it is there for now--maybe not in the future, but today it works.


    This article is helpful and the general idea is that it could keep your caboose from falling off the tracks. (scil. it has saved my *** before.)
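    For reference, a typical invocation looks like this. Since DBCC PAGE is undocumented, treat this as a sketch: the database name and page numbers below are placeholders, not values from the article.

    ```sql
    DBCC TRACEON (3604);   -- route DBCC output to the client session instead of the error log
    -- Arguments: database name, file id, page number, print option
    -- (print option 3 dumps the page header plus per-row detail).
    DBCC PAGE ('AdventureWorks', 1, 153, 3);
    DBCC TRACEOFF (3604);
    ```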

  • Kraig Kerr (11/22/2010)


    Looks like TDWI took down that link. Here is another that points to the same article. http://findarticles.com/p/articles/mi_m0EIN/is_2000_Nov_8/ai_66696495/

    The current flavor of this works with SQL 2008, but with minor changes will work with SQL 2005. I also have a version for SQL 2000. I should have specified this in the article, thanks for the catch.

    After reading this, how did you make the jump to reading the transaction log as documented in your article? Are you trying to use the transaction log for some sort of transactional-replication-like data movement tool?

  • I was wondering the same thing when I saw the throughput -- 1,000 recs/min.

  • It's definitely not useless data. It's very useful. It just needs some caveats, which it has. It's a good article and obviously took a lot of work to put together.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • I don't believe it is useless, and I really appreciate the amount of work it has taken. But there are two caveats when it comes to using things like this: one, the average production DBA does not have time to research issues in such detail, which is why there are tools to do it (or calls to CSS); and two, a lot of people cannot and do not run scripts found on websites (no matter how well researched or well written) on production without some kind of trust or prior association with the author. No offence to this particular author, but there are a lot of scripts and a lot of authors who write very custom scripts - scripts that will work very well in some situations/environments but not so in others. So the statement that it saved someone $1300, or any amount of money, may not necessarily be true either.

  • dma-669038 (11/23/2010)


    No offence to this particular author, but there are a lot of scripts and a lot of authors who write very custom scripts - scripts that will work very well in some situations/environments but not so in others. So the statement that it saved someone $1300, or any amount of money, may not necessarily be true either.

    As I very clearly stated (and will probably do so again and again, I'm sure), if you are trying to (1) troubleshoot a one-off problem or (2) simply poke around SQL Server's internals trying to learn something, you can (A) spend the $1300 fat wallet mentioned for a tool you might use a few times or (B) poke around with fn_dblog for free. If you decide to poke around with fn_dblog for free, then you just saved fat wallet's $1300.
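    For anyone who does want to poke around, a minimal exploratory query might look like the one below. Hedged as usual: fn_dblog is undocumented, and these column and operation names are as they appear in SQL 2005/2008 - they may differ in other versions.

    ```sql
    SELECT [Current LSN],
           Operation,            -- e.g. LOP_INSERT_ROWS, LOP_MODIFY_ROW, LOP_DELETE_ROWS
           Context,
           [Transaction ID],
           AllocUnitName         -- the object the log record applies to
    FROM fn_dblog(NULL, NULL)    -- NULL, NULL = full active portion of the log
    WHERE Operation IN ('LOP_INSERT_ROWS', 'LOP_MODIFY_ROW', 'LOP_DELETE_ROWS');
    ```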

    Your argument about authors who write scripts that work in some situations/environments very well but not so in others doesn't seem to add anything to the argument. It goes without saying that the best way to protect yourself against malevolent scripts is to understand what the script is doing before you run it. The fact that the author is educating his readers about the functions he's using here provides a little more insight into what all those scripts you see out there are doing.

    I'm not sure I understand your assertions, unless you were just responding to a post without bothering to follow the thread to get a little bit of context.

  • The author himself providing clarifications on where the script works and where it doesn't is not enough for his/her scripts to be considered trustworthy enough to run in a production environment (at least where I work, my boss would not take that for an answer). I do not mean that as anything personal against the author, but such trust is built over time from blogging repeatedly (one way), and perhaps from prior experience using something the same author did.

    The situation (1) you describe is exactly what I was referring to: a one-off emergency may happen that requires the use of such scripts. The average DBA cannot predict such situations happening, and when they do happen, cannot just use a script he/she finds. We need third-party supported tools for exactly this reason. Thanks.

Viewing 15 posts - 16 through 30 (of 44 total)
