SSIS: Insufficient disk space

  • Hi All,

    I am getting the below error while executing an SSIS package through a job:

    There may be error messages posted before this with more information about the failure. End Error
    Error: 2014-08-22 14:26:11.81 Code: 0x800705AA Source: IMPORTDETAILS Description: End Error
    Error: 2014-08-22 14:26:11.83 Code: 0xC004704B Source: IMPORTDETAILS Description: The buffer manager cannot write 8 bytes to file "T:\Windows\Temp\DTS{9448AF1C-04C5-4C33-8211-6FE6C9112EA8}.tmp". There was insufficient disk space or quota. End Error
    Error: 2014-08-22 14:26:12.39 Code: 0x800705AA Source: IMPORTDETAILS Description: End Error
    Error: 2014-08-22 14:26:12.39 Code: 0xC004704B Source: IMPORTDETAILS Description: The buffer manager cannot write 8 bytes to file "T:\Windows\Temp\DTS{C555DCA5-7131-41C9-844D-6700A5A32367}.tmp". There was insufficient disk space or quota. End Error
    Error: 2014-08-22 14:26:13.40 Code: 0x800705AA Source: IMPORTDETAILS Description: End Error
    Error: 2014-08-22 14:26:13.40 Code: 0xC004704B Source: IMPORTDETAILS Description: The buffer manager cannot write 8 bytes to file "T:\Windows\Temp\DTS{DF561118-A55C-4DF1-BF26-B28347E12F66}.tmp". There was insufficient disk space or quota. End Error
    Error: 2014-08-22 14:26:14.93 Code: 0xC020204B Source: IMPORTDETAILS Lookup 1 [1311] Description: A buffer could not be locked. The system is out of memory or the buffer manager has reached its quota. End Error
    Error: 2014-08-22 14:26:14.95 Code: 0xC004701A Source: IMPORTDETAILS SSIS.Pipeline Description: component "Lookup 1" (1311) failed the pre-execute phase and returned error code 0xC020204B. End Error
    DTExec: The package execution returned DTSER_FAILURE (1). Started: 2:23:13 PM Finished: 2:27:29 PM Elapsed: 255.748 seconds. The package execution failed. The step failed.

    I am using a Lookup transformation configured as Full Cache. Can anybody please help me resolve this issue?

    Thanks,

    Arooj

  • The error is exactly what it says on the tin: the system is out of disk space on the T: drive. To resolve this, either free up some space on that drive or change the lookup from Full Cache to Partial Cache or No Cache.
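
    If there is any doubt about how much space the SQL Server service account actually sees on that drive, here is a minimal sketch of a check (assuming the package runs on the database server itself and that xp_fixeddrives is available to you):

        -- Lists the fixed drives and their free space in MB as seen by the
        -- SQL Server service account; T: should appear here if it is a local drive.
        EXEC master.dbo.xp_fixeddrives;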

    😎

  • Hi Eirikur

    Thanks for the reply. I have plenty of space on the T: drive, almost 200 GB, and I have tried Partial Cache and No Cache, but it made no difference; I am still getting the same error. I have also changed the BLOB setting (BLOBTempStoragePath), but while executing, the package uses space on the C: drive instead of the T: drive.

  • arooj300 (8/25/2014)


    Hi Eirikur

    Thanks for the reply. I have plenty of space on the T: drive, almost 200 GB, and I have tried Partial Cache and No Cache, but it made no difference; I am still getting the same error. I have also changed the BLOB setting (BLOBTempStoragePath), but while executing, the package uses space on the C: drive instead of the T: drive.

    This certainly rings a bell, digging into grey cell memory... Quick questions: how large is the lookup set, what is the file system type (NTFS, FAT, exFAT etc.) on the T: drive, and is there any disk quota?

    😎

  • The T: drive is NTFS and the data will be 12 M.

  • arooj300 (8/25/2014)


    The T: drive is NTFS and the data will be 12 M.

    Just a quick thought, could this be a permission issue? Try creating a folder, give the appropriate permissions on that folder to the SSIS process user and see what happens.

    😎

  • This is not a permission issue, because the job has been running for a year and we did not have any problem. It has only started failing in the last two days because there are more records.

    We have 28 GB of free space on the C: drive. While the job is running, all 28 GB get reserved by the job; after the job completes it releases the reserved space.

    My concern is that we have one source and one destination. Is there any alternative to the Lookup transformation, because the Lookup takes all the drive space before it even starts copying the data?

    I want something that will not overload either the source or the destination server and will let us copy the data easily through the job.

    Thanks

  • If the lookup does not have enough memory, it will flush to disk.

    So you might have a memory issue.

    How did you configure the lookup? I hope you didn't use the dropdown menu...

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • arooj300 (8/26/2014)


    This is not a permission issue, because the job has been running for a year and we did not have any problem. It has only started failing in the last two days because there are more records.

    We have 28 GB of free space on the C: drive. While the job is running, all 28 GB get reserved by the job; after the job completes it releases the reserved space.

    My concern is that we have one source and one destination. Is there any alternative to the Lookup transformation, because the Lookup takes all the drive space before it even starts copying the data?

    I want something that will not overload either the source or the destination server and will let us copy the data easily through the job.

    Thanks

    Quick question: what has changed around the time the job started failing? It is hard to believe that there has been such a dramatic increase in the number of records.

    Can you test the package on another server/pc to verify that the package is the issue?

    😎

  • For a similar task, add a Row Count task to see how many records get processed daily. Also, unit test the package with 100 records and monitor the database file growth while the package is executing, along the lines of the sketch below.
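
    A minimal sketch of such a file-growth check (run it in the destination database while the package executes; no package-specific names are assumed):

        -- Reports the current size and used space of each file in the destination
        -- database; sizes are stored in 8 KB pages, hence the * 8 / 1024.
        SELECT  name,
                type_desc,
                size * 8 / 1024 AS size_mb,
                CAST(FILEPROPERTY(name, 'SpaceUsed') AS bigint) * 8 / 1024 AS used_mb
        FROM    sys.database_files;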

  • The source databases are production databases, so obviously the number of records will keep increasing, and whatever changes happen on the live server are applied to the destination server through the SSIS package for reporting purposes.

  • I'll just repeat it again.

    If the lookup does not have enough memory, it will flush to disk.

    So you might have a memory issue.

    How did you configure the lookup? I hope you didn't use the dropdown menu...

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • 1) In the General tab: cache mode - Full Cache; connection type - OLE DB connection manager; specify how to handle rows with no matching entries - Redirect rows to no match output.

    2) In the Connection tab: OLE DB connection manager - the destination server; use a table or a view - the table name.

    3) In the Columns tab: mapping the key columns.

  • arooj300 (8/27/2014)


    2) In the Connection tab: OLE DB connection manager - the destination server; use a table or a view - the table name.

    Write a query instead. Retrieve only the columns you actually need to do the lookup.
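
    As a rough sketch (the table and column names below are placeholders, not from the original package), the Connection page would use a query such as:

        -- Hypothetical lookup source: return only the join key (and any column
        -- you actually need back) instead of selecting the whole table.
        SELECT  CustomerKey
        FROM    dbo.DimCustomer;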

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Hi Koen,

    I have written a query and selected only the key column, but the problem is still the same.
