The Transaction log for the database 'tempdb' is full, please help!
Posted Wednesday, October 3, 2012 9:03 AM
Forum Newbie
Group: General Forum Members
Last Login: Monday, February 18, 2013 8:33 AM
Points: 2, Visits: 30

Hi,

I just started using SQL Server Management Studio. I have a database and a table with 498,591 records.

I have a SQL script which removes duplicates.

Whilst running the script, 2.5 hours in, I got a popup saying that space on E:\ was running low.

At the end it said the query completed with errors: "The Transaction log for the database 'tempdb' is full".

I checked E:\ and it was full, with only 10 MB available; E:\ is also where the tempdb file is.

What would be the best course of action? I assume I need to rerun the script, as there are still 498,591 records.

Post #1367760
Posted Wednesday, October 3, 2012 9:17 AM
Old Hand
Group: General Forum Members
Last Login: 2 days ago @ 8:21 AM
Points: 318, Visits: 1,084
You could post the script - sounds a bit slow-running!
Post #1367779
Posted Wednesday, October 3, 2012 9:36 AM
Forum Newbie
Group: General Forum Members
Last Login: Monday, February 18, 2013 8:33 AM
Points: 2, Visits: 30
The script works, as it's been tested by our coding department. Is there anything else I can do?
Post #1367804
Posted Wednesday, October 3, 2012 4:07 PM
SSC-Dedicated
Group: General Forum Members
Last Login: Yesterday @ 8:51 PM
Points: 35,606, Visits: 32,190
rahim.shabbir (10/3/2012)
The script works, as it's been tested by our coding department. Is there anything else I can do?


Obviously your coding department didn't test it for scalability because it's not working for you. There's likely a "Triangular Join" in the script that's generating billions of internal rows that TempDB is trying to store.

Post the code and the table definition (CREATE TABLE) or we just won't be able to help. I recommend that you take a look at the second link in my signature line to get the best help possible on this problem.
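
For illustration only (the actual script hasn't been posted, so dbo.SomeTable and RowId below are made-up names), this is the general shape of a triangular join:

-- An inequality-only self-join pairs every row with every row at or
-- below it, producing roughly N*(N+1)/2 internal rows: about
-- 124 billion for 498,591 rows. Those intermediate rows are what get
-- spooled and sorted in tempdb.
SELECT t1.RowId,
       COUNT(*) AS RunningRowCount
FROM   dbo.SomeTable AS t1
JOIN   dbo.SomeTable AS t2
       ON t2.RowId <= t1.RowId      -- the "triangular" predicate
GROUP BY t1.RowId;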


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1368079
Posted Friday, October 5, 2012 8:01 AM
Grasshopper
Group: General Forum Members
Last Login: Yesterday @ 10:35 PM
Points: 12, Visits: 78
Either you need to rework your script, for example using a table variable that takes the top 5,000 records and deletes them, inside a WHILE loop until all the records are deleted (something like the sketch below); it seems that your script is trying to delete all the records in one go, forcing tempdb to fill.

Else you may want to add another tempdb log file on a different drive where ample space is present.
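
A rough sketch of the batching idea, using DELETE TOP rather than a table variable (dbo.SomeTable, the IsDuplicate condition, and the file path are assumptions, not the real objects):

-- Delete in small chunks so no single statement has to work through
-- all 498,591 rows (and their tempdb/log usage) at once.
WHILE 1 = 1
BEGIN
    DELETE TOP (5000)
    FROM dbo.SomeTable
    WHERE IsDuplicate = 1;          -- whatever marks a row as a duplicate

    IF @@ROWCOUNT = 0 BREAK;        -- stop when nothing is left to delete
END;

-- Or add a second tempdb log file on a drive with free space
-- (path and size are examples only):
ALTER DATABASE tempdb
ADD LOG FILE (NAME = templog2,
              FILENAME = 'F:\SQLLogs\templog2.ldf',
              SIZE = 1GB);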
Post #1369062
Posted Thursday, October 11, 2012 1:10 AM
SSCrazy
Group: General Forum Members
Last Login: Wednesday, November 19, 2014 5:03 AM
Points: 2,840, Visits: 3,975
rahim.shabbir (10/3/2012)
I have a database and a table with 498,591 records. I have a SQL script which removes duplicates.

Whenever you are dealing with a high volume of data you need to think about a batch approach (deal with the data in chunks), something like the sketch below.
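
For example, a minimal sketch of a chunked duplicate removal (dbo.SomeTable, RowId, Col1 and Col2 are assumptions; Col1/Col2 stand for whatever columns define a duplicate):

-- Number the rows within each group of duplicates and delete everything
-- after the first, 5,000 rows per loop iteration.
WHILE 1 = 1
BEGIN
    WITH Dups AS
    (
        SELECT Col1, Col2, RowId,
               ROW_NUMBER() OVER (PARTITION BY Col1, Col2
                                  ORDER BY RowId) AS rn
        FROM   dbo.SomeTable
    )
    DELETE TOP (5000) FROM Dups
    WHERE  rn > 1;                  -- keep the first row of each group

    IF @@ROWCOUNT = 0 BREAK;
END;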


-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done
Post #1371237