• kevin.kembel (4/16/2013)


    I have, and there's plenty of RAM free (8GB of 16GB free when it crashes), but is there another limitation that I'm running into? Something per-process?

    I read that there's a 2GB limit for each process, but as far as I can tell that's only for the 32-bit and I'm running 64. Or is there somewhere else I should be looking at for memory usage?

    Apologies Kevin, you did mention that in your post above and somehow I skimmed past it to get to the update in the second post.

    I don't have 2012 to experiment with, unfortunately, but some of these points apply regardless of version.

    The error itself seems more related to the Visual Studio environment than to the SSIS engine. Out of curiosity, did you run the original package outside of VS at all and get the same errors? For example, did you fire it up from a SQL Agent job? You mention you tried to execute while debugging; I'm just not sure where else you've run this that also produces the errors.

    Regarding your other questions, caching on the lookups just has to do with the lookup data. It'll apply some memory pressure, but usually nothing horrible unless you're loading it with a few thousand VARCHAR(100) fields or something equally large.

    What's the row size of the data in the flow? Are you working with BLOB fields in the flow?
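    Since row size drives SSIS buffer usage, a quick back-of-the-envelope estimate is worth doing. A minimal sketch (the column list and byte sizes here are hypothetical examples, not your schema; substitute the actual columns in your data flow):

    ```python
    # Rough row-size estimate from column data types (sizes in bytes).
    # The column definitions below are hypothetical, not your schema.
    COLUMN_BYTES = {
        "int": 4,
        "bigint": 8,
        "datetime": 8,
        "varchar(100)": 100,  # SSIS sizes strings by declared length, not actual data
    }

    # Example data flow: substitute your real column list here.
    columns = ["int", "bigint", "datetime", "varchar(100)", "varchar(100)"]
    row_size = sum(COLUMN_BYTES[c] for c in columns)
    print(row_size)  # 220 bytes per row
    ```

    Note that BLOB columns don't fit this model at all, which is why I'm asking about them separately.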

    If you really want to get deep into SSIS optimization, an interesting place to start:

    http://technet.microsoft.com/library/Cc966529

    In particular, that article includes the following information:

    Estimated Row Size – Estimated Row Size is not a specific SSIS setting. Rather, it is something that SSIS calculates based on the metadata that it collects about your source data at design time. You can shrink the row size by identifying the smallest possible data types for all of your columns as early in the data flow as possible. This is especially important for flat file data sources because each column is automatically read into SSIS as a string data type unless you configure the column’s data type explicitly.

    DefaultMaxBufferRows – DefaultMaxBufferRows is a configurable setting of the SSIS Data Flow task that is automatically set at 10,000 records. SSIS multiplies the Estimated Row Size by the DefaultMaxBufferRows to get a rough sense of your dataset size per 10,000 records. You should not configure this setting without understanding how it relates to DefaultMaxBufferSize.

    DefaultMaxBufferSize – DefaultMaxBufferSize is another configurable setting of the SSIS Data Flow task. The DefaultMaxBufferSize is automatically set to 10 MB by default. As you configure this setting, keep in mind that its upper bound is constrained by an internal SSIS parameter called MaxBufferSize which is set to 100 MB and can not be changed.

    MinBufferSize – While MinBufferSize is not configurable, it is still important to know about it because SSIS uses this internal parameter to gauge whether you have set the DefaultMaxBufferSize too low. MinBufferSize is defined by the granularity of your operating system’s virtual memory allocation functionality. Typically, this is set to 65,536 bytes, but it differs from machine to machine.
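    Putting those settings together, the rows-per-buffer logic can be sketched roughly like this. This is not SSIS's exact internal algorithm, just the documented constraints from the article above:

    ```python
    # Rough sketch of how the documented SSIS buffer settings interact;
    # not the engine's exact internals, just the constraints described above.
    MAX_BUFFER_SIZE = 100 * 1024 * 1024   # internal hard cap, 100 MB, not changeable
    MIN_BUFFER_SIZE = 65_536              # typical VM allocation granularity

    def rows_per_buffer(estimated_row_size,
                        default_max_buffer_rows=10_000,
                        default_max_buffer_size=10 * 1024 * 1024):
        """Estimate how many rows fit in one data-flow buffer."""
        # DefaultMaxBufferSize can never exceed the internal MaxBufferSize.
        buffer_size = min(default_max_buffer_size, MAX_BUFFER_SIZE)
        # If DefaultMaxBufferRows rows would overflow the buffer,
        # the row count is reduced to whatever fits.
        if estimated_row_size * default_max_buffer_rows > buffer_size:
            return buffer_size // estimated_row_size
        return default_max_buffer_rows

    # A 220-byte row easily fits 10,000 rows in a 10 MB buffer...
    print(rows_per_buffer(220))    # 10000
    # ...but a 5,000-byte row gets capped by the buffer size instead.
    print(rows_per_buffer(5000))   # 2097
    ```

    The takeaway: wide rows quietly shrink your effective rows-per-buffer, so trimming data types early in the flow pays off twice.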

    This is part of why I ask how large your row is. In 32-bit it's 2GB per process by default (4GB with the large-address-aware flag); in 64-bit it's... well, bigger than what you could probably physically insert into the boards. 🙂

    I realize I may be asking a few questions that are at 'is the thing plugged in' levels, but I just want to make sure I've got apples to apples going before we dive deeper... particularly with BLOB fields.


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

    For better assistance in answering your questions | Forum Netiquette
    For index/tuning help, follow these directions. | Tally Tables

    Twitter: @AnyWayDBA