July 22, 2007 at 11:20 pm
Hi,
I am trying to use xp_readerrorlog, but it has one problem:
If a line of text is too long (greater than about 255 characters), the output is continued on the next line and the ContinuationRow field will contain a 1 rather than a 0. If you use the stored procedure to create your own archive, you'll need to adjust the data to allow for these anomalies.
When I try to filter out one day old data, I encounter lines like these:
.bak'}).,1
: {'msdb_00__512bb63a_60ca_49cd_a435_eaa428bc5d2c_'}).,1
_db_200707180100.BAK'}).,1
While expecting the line to start like these:
2007-07-09 11:58:57.53 backup Log restored: Database:
2007-07-09 12:02:39.40 backup Log restored: Database:
So my job fails.
Apart from deleting the bad lines from the temp table, are there workarounds for this limitation? For example, stitching the broken lines back together?
Thanks.
July 22, 2007 at 11:38 pm
I have faced this situation and handled it programmatically. All you need to do is check whether the value in the ContinuationRow column is 1; if it is, read the next line as part of the same entry, otherwise process the line as is.
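A minimal sketch of that idea in T-SQL, assuming the SQL Server 2000-style output of xp_readerrorlog (one text column plus the ContinuationRow flag); the table and column names here are illustrative, so adjust them to whatever the procedure returns on your build:

```sql
-- Capture the error log with an IDENTITY column to preserve line order.
CREATE TABLE #errlog (
    RowId           int IDENTITY(1,1) PRIMARY KEY,
    LogText         varchar(8000),
    ContinuationRow int
)

INSERT #errlog (LogText, ContinuationRow)
EXEC master.dbo.xp_readerrorlog

-- Repeatedly fold each continuation row into the nearest preceding
-- non-continuation row. Looping handles entries split across more
-- than two rows; note that concatenation is silently truncated at
-- the varchar limit on older versions.
DECLARE @merged int
SET @merged = 1

WHILE @merged > 0
BEGIN
    -- Append the text of a continuation row to its immediate
    -- predecessor when that predecessor is a real log line.
    UPDATE p
    SET    p.LogText = p.LogText + c.LogText
    FROM   #errlog p
    JOIN   #errlog c
      ON   c.RowId = (SELECT MIN(RowId) FROM #errlog WHERE RowId > p.RowId)
    WHERE  p.ContinuationRow = 0
      AND  c.ContinuationRow = 1

    -- Remove the continuation rows that were just merged.
    DELETE c
    FROM   #errlog c
    WHERE  c.ContinuationRow = 1
      AND  EXISTS (SELECT 1
                   FROM   #errlog p
                   WHERE  p.RowId = (SELECT MAX(RowId) FROM #errlog
                                     WHERE  RowId < c.RowId)
                     AND  p.ContinuationRow = 0)

    SET @merged = @@ROWCOUNT
END
```

After this, every row left in #errlog has ContinuationRow = 0 and carries the full text of its log entry, so your date filter should work against it.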
Cheers,
Sugeshkumar Rajendran
SQL Server MVP
http://sugeshkr.blogspot.com
July 22, 2007 at 11:41 pm
Is there T-SQL code that can do this?
July 23, 2007 at 12:24 am
It can be written. I do this to check whether any errors have been generated in the last hour and report them if so. If you want the code, I shall send it to you as a private message.
Cheers,
Sugeshkumar Rajendran
SQL Server MVP
http://sugeshkr.blogspot.com
July 23, 2007 at 1:17 am
I already have some code which filters the last week/day/hour out of the log, but I would appreciate the code that works around the broken-lines problem.
Thanks.
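For reference, the filtering I have is along these lines (a sketch only; it assumes the log lines have already been loaded into a temp table with an illustrative name and that each complete line starts with a 'yyyy-mm-dd hh:mm:ss.ss' timestamp, which is exactly what the broken continuation lines lack):

```sql
-- Keep only lines from the last day, skipping any line whose first
-- 22 characters do not parse as a date (e.g. a broken continuation
-- fragment like ".bak'}).,1" would fail ISDATE and be excluded).
SELECT LogText
FROM   #errlog
WHERE  ISDATE(LEFT(LogText, 22)) = 1
  AND  CONVERT(datetime, LEFT(LogText, 22)) >= DATEADD(day, -1, GETDATE())
```

That is why the continuation rows make my job fail: they carry no timestamp to filter on.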
July 23, 2007 at 2:09 am
Can you send me your code? I shall make the necessary changes and send it back.
Cheers,
Sugeshkumar Rajendran
SQL Server MVP
http://sugeshkr.blogspot.com
July 23, 2007 at 5:41 pm
Just sent the code.
Thanks.