SQL statement in SSIS package is all of a sudden making tempdb grow exponentially

  • So we have a very large SSIS package that inserts records from one table in a database into the same table in another database. It is just a simple select query with one parameter that filters it (a rough sketch of the source query is at the bottom of this post). We have been running this package for weeks without a problem. Today we ran it against one database with only 20,000 records, and the tempdb files grew from 2.5 GB to 28 GB. When it failed it said it ran out of disk space for the sort. It is running in Fast Load mode. Why would it be sorting on a simple insert? Could indexes on the table we are inserting into be the issue? And if so, why wouldn't this have happened before?

    We are using SQL Server 2012 SP1 CU5.

    If anyone has any thoughts or needs more information, please let me know. I am extremely frustrated and not sure where to go next.
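
    To give a rough idea of the shape of the data flow, the source query is essentially the following (table, column, and parameter names here are made up for illustration, not the real objects):

        -- Hypothetical source query behind the OLE DB Source; names are placeholders.
        -- The single parameter is mapped in the SSIS package and simply filters the batch.
        SELECT  col1, col2, col3
        FROM    dbo.SourceTable
        WHERE   BatchId = ?;

    The OLE DB Destination then writes those rows to the same table in the target database using fast load.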

  • Take a look at the actual execution plan of the query to see if tempdb is used somewhere.

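    If the plan does not make it obvious, a DMV query along these lines (standard system views, nothing package-specific) can show which session is allocating tempdb pages while the package runs:

        -- Sessions ranked by tempdb allocations; internal objects cover sorts, hash joins, and spools.
        SELECT   s.session_id,
                 s.internal_objects_alloc_page_count * 8 / 1024 AS internal_alloc_mb,
                 s.user_objects_alloc_page_count     * 8 / 1024 AS user_alloc_mb
        FROM     sys.dm_db_session_space_usage AS s
        WHERE    s.session_id > 50          -- skip system sessions
        ORDER BY internal_alloc_mb DESC;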

  • I have looked at the execution plan. That is not the issue. I have attached a picture of the tasks.

  • So it looks like the issue is the actual bulk insert; it won't complete. But if you use an Execute SQL Task doing a regular insert, it works (something along the lines of the statement below). Any thoughts?
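
    For reference, the plain insert in the Execute SQL Task is essentially this (database, table, and column names are placeholders; the real statement may go across a linked server rather than a three-part name):

        -- Hypothetical plain INSERT ... SELECT used in the Execute SQL Task; names are placeholders.
        INSERT INTO TargetDb.dbo.MyTable (col1, col2, col3)
        SELECT  col1, col2, col3
        FROM    SourceDb.dbo.MyTable
        WHERE   BatchId = ?;   -- same single filter parameter, mapped in the task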

  • What if you disable check constraints and enable table lock in the fast load options?

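    On the OLE DB Destination those two settings map to the FastLoadOptions property (TABLOCK added, CHECK_CONSTRAINTS removed). In plain T-SQL terms the table lock option behaves roughly like a TABLOCK hint on the insert, which can allow a bulk-optimized load path; a sketch with placeholder names, not the exact rowset insert SSIS issues:

        -- Approximate T-SQL analogue of the 'Table lock' fast load option; names are placeholders.
        INSERT INTO TargetDb.dbo.MyTable WITH (TABLOCK) (col1, col2, col3)
        SELECT  col1, col2, col3
        FROM    SourceDb.dbo.MyTable
        WHERE   BatchId = ?;   -- parameter mapped in the package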

  • So it turns out it has something to do with the indexed views associated with the table. The workaround was to drop the views, do the bulk insert, and then recreate the views. The part that makes no sense is why wasn't this an issue from the beginning.
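
    In case anyone else runs into this, the workaround follows roughly the pattern below (view, table, and column names are placeholders; the real view definitions have to be scripted out and recreated exactly). Maintaining an indexed view during a bulk insert can add sort/aggregate operators to the plan, which would explain the sort spilling into tempdb.

        -- Placeholder names; script out the real indexed view definitions before dropping them.
        DROP VIEW dbo.vw_MyIndexedView;
        GO

        -- Bulk insert now runs with no indexed views to maintain.
        DECLARE @BatchId int = 12345;   -- placeholder filter value
        INSERT INTO dbo.MyTable WITH (TABLOCK) (col1, col2, col3)
        SELECT  col1, col2, col3
        FROM    SourceDb.dbo.MyTable
        WHERE   BatchId = @BatchId;
        GO

        -- Recreate the indexed view afterwards: schema binding plus a unique clustered index.
        CREATE VIEW dbo.vw_MyIndexedView
        WITH SCHEMABINDING
        AS
        SELECT  col1, COUNT_BIG(*) AS RowCnt
        FROM    dbo.MyTable
        GROUP BY col1;
        GO

        CREATE UNIQUE CLUSTERED INDEX IX_vw_MyIndexedView
            ON dbo.vw_MyIndexedView (col1);
        GO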
