Memory allocation for SSIS variable
Posted Tuesday, September 25, 2012 10:27 AM
SSC Rookie


Group: General Forum Members
Last Login: Thursday, July 24, 2014 8:37 AM
Points: 41, Visits: 363
I'm having a problem with an SSIS variable.

The variable is defined with data type Object. I'm writing an ADO recordset to this variable using a Recordset destination in a data flow. I'm writing one input column, varchar(50), to the variable, and the row count is 1,210.

My problem is that, when I watch the execution in the debugger, everything works except that it hangs while writing the recordset variable. All the other data flow tasks turn yellow, but the Recordset destination stays white. I've written to this variable successfully many times before. I'm wondering whether I'm running into restricted memory as the number of rows increases.

Is there a way to increase this with an SSIS setting? Is there a way to verify that memory is the problem? This isn't the biggest record set in the world so I don't understand why the task fails to complete.
Post #1364173
Posted Tuesday, September 25, 2012 1:02 PM


SSCoach


Group: General Forum Members
Last Login: Friday, June 27, 2014 12:43 PM
Points: 15,444, Visits: 9,596
That won't be a memory allocation problem, unless the server you're running it on is already at the limits of its swap file. You'd know that, because the server would crash instead of just not getting to a certain step in the SSIS package.

More likely, either the step is taking longer than you expect, possibly because the dataset is larger (and therefore the query runs longer), or something in the package is stopping it from getting to that step in the first place, such as a malformed process flow (conditions set incorrectly, that kind of thing).

Are you querying data from SQL Server to fill the dataset? If so, running a trace (Profiler or server-side), with begin and end steps logged, will tell you how long the query takes to run. That might be a good place to start.
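If Profiler isn't convenient, another rough baseline is to run the package's source query from a small standalone harness and time it. This is only a sketch; the connection string, query, and column name below are placeholders, not anything from the actual package:

// Standalone timing harness (sketch) -- substitute the package's real source query.
using System;
using System.Data.SqlClient;
using System.Diagnostics;

class SourceQueryTimer
{
    static void Main()
    {
        const string connectionString =
            "Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI";
        const string sourceQuery =
            "SELECT SomeVarcharColumn FROM dbo.SomeSourceTable";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sourceQuery, connection))
        {
            connection.Open();
            var stopwatch = Stopwatch.StartNew();

            int rowCount = 0;
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    rowCount++;   // drain the whole result set so the timing covers every row
                }
            }

            stopwatch.Stop();
            Console.WriteLine("{0} rows in {1} ms", rowCount, stopwatch.ElapsedMilliseconds);
        }
    }
}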


- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread

"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
Post #1364243
Posted Tuesday, September 25, 2012 1:10 PM
SSC Rookie


Group: General Forum Members
Last Login: Thursday, July 24, 2014 8:37 AM
Points: 41, Visits: 363
No, it is preceded by a multicast transformation. One leg of the data flow goes into an OLE DB destination and receives 1,210 rows. The other leg goes into the ADO Recordset destination, which should also receive 1,210 rows, albeit only one column from each row.

I see 1,210 rows going into the OLE DB destination. The ADO Recordset destination just hangs there; I've let it run for over an hour and it still doesn't complete.
Post #1364247
Posted Tuesday, September 25, 2012 3:13 PM
SSC Rookie


Group: General Forum Members
Last Login: Thursday, July 24, 2014 8:37 AM
Points: 41, Visits: 363
Just for chuckles, I changed my initial selection criteria so that only 100 rows, instead of 1,000 or so, end up in the ADO Recordset destination that writes to the object variable I'm having issues with. And it worked flawlessly. So the problem definitely seems to be related to volume.

But why? I still don't know.
Post #1364304
Posted Tuesday, September 25, 2012 3:38 PM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 2:51 AM
Points: 6,171, Visits: 7,238
rburko (9/25/2012)
Just for chuckles, I changed my initial selection criteria so that only 100 rows, instead of 1,000 or so, end up in the ADO Recordset destination that writes to the object variable I'm having issues with. And it worked flawlessly. So the problem definitely seems to be related to volume.

But why? I still don't know.


I don't have anything but anecdotal evidence for this. I've seen similar issues with Object variables holding recordsets for larger datasets, but since the only thing you can do with one after the fact is loop over it (other than via .NET), I don't normally put more than 100 items in my recordsets. I have, however, seen similar issues tied to the record count; width doesn't seem to matter much, if at all. It's inconsistent, though: in other tests I've simply created a script source and dumped 5,000 ints into the stream to a recordset object directly with no issue.

You might be better off dumping this to a .raw file, but what's the end goal for the recordset data? I can't tell you why it's occurring, but I might know a few workarounds if I know where the data is heading.

The fact that the exact same package works with fewer inputs rules out most of my usual troubleshooting.

GSquared above is absolutely correct; however, SSIS simply uses local Windows memory for whatever it needs and will grow to whatever it wants. It ignores SQL Server's cache completely (which means SSIS can be starved if it's on the same box as a server that leaves only 2 gigs for the OS, but that's a different problem). The object itself doesn't have any restrictions, AFAIK, unless these rows are ridiculously huge (4 gigs huge) and you're in 32-bit mode.

Here's the final curiosity, though: yellow/white/whatnot occurs during processing. Anything white hasn't even begun to fire yet, but the multicast should already be sending data to it (at least the first row).

What is listed in Execution Results for that task after you've stopped the process? Is it still validating? Does it prepare for execute? You get the drift. Once execute fires it should turn yellow in the interface, but that can get grouchy, as we all know.



- Craig Farrell

Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables

Twitter: @AnyWayDBA
Post #1364323
Posted Wednesday, September 26, 2012 6:15 AM


SSCoach


Group: General Forum Members
Last Login: Friday, June 27, 2014 12:43 PM
Points: 15,444, Visits: 9,596
rburko (9/25/2012)
No, it is preceded by a multicast transformation. One leg of the data flow goes into an OLE DB destination and receives 1,210 rows. The other leg goes into the ADO Recordset destination, which should also receive 1,210 rows, albeit only one column from each row.

I see 1,210 rows going into the OLE DB destination. The ADO Recordset destination just hangs there; I've let it run for over an hour and it still doesn't complete.


As Kraig mentioned, Not Yellow = Hasn't Started, which is different from Hasn't Finished. Or possibly the UI is balking at the update. Have you checked available system resources? Have you looked at the execution log to see if that step is even reached?


- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread

"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
Post #1364607
Posted Wednesday, September 26, 2012 6:46 AM
SSC Rookie


Group: General Forum Members
Last Login: Thursday, July 24, 2014 8:37 AM
Points: 41, Visits: 363
Evil Kraig F, GSquared, both good points.

I let the task run all night, came in this morning, and it still did not complete. However, I did notice this in the execution results:

[blah blah blah[1]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

So it looks to be a buffer error of some kind. I'm still researching a solution.
Post #1364625
Posted Wednesday, September 26, 2012 7:40 AM


SSCoach


Group: General Forum Members
Last Login: Friday, June 27, 2014 12:43 PM
Points: 15,444, Visits: 9,596
Not sure it's relevant, but I plugged that error into Google and got this as the top result: http://social.msdn.microsoft.com/Forums/en/sqlintegrationservices/thread/8a3558df-221a-45e6-8e08-c23c987c12a9

Seems to be a similar issue. Since you're going to an ADO recordset instead of a table, it might not be relevant.

On the other hand, you might want to check if you can recode the package to put the data into a table in tempdb or some such, instead of ADO, and see if you can make it do what you need that way.


- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread

"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
Post #1364679
Posted Wednesday, September 26, 2012 7:50 AM
SSC Rookie


Group: General Forum Members
Last Login: Thursday, July 24, 2014 8:37 AM
Points: 41, Visits: 363
I'm taking the recordset, saving it to a variable, and using it in a subsequent loop.

I'm sure I can cajigger the subsequent step to read from a temp table or something, but I wanted to use the loop. And you're right, the things I've seen about this particular error seem only obliquely related to what I'm trying to do.

If it's a buffer issue, I don't know where to change the settings to get it to work, if that's at all feasible.
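(For reference, here is a minimal sketch of how an Object variable filled by a Recordset destination is typically consumed from a Script Task. The variable name User::rsRows and the single string column are hypothetical, and a Foreach Loop container with the ADO enumerator does the same thing without any code.)

// Script Task sketch: read the Object variable into a DataTable and loop over it.
// "User::rsRows" is a placeholder and must be listed in the task's ReadOnlyVariables.
using System;
using System.Data;
using System.Data.OleDb;
using Microsoft.SqlServer.Dts.Runtime;

public void Main()
{
    var table = new DataTable();

    // OleDbDataAdapter.Fill accepts the ADO recordset stored in the Object variable.
    new OleDbDataAdapter().Fill(table, Dts.Variables["User::rsRows"].Value);

    foreach (DataRow row in table.Rows)
    {
        // The single varchar(50) column from the data flow (hypothetical, position 0).
        string value = row[0] == DBNull.Value ? null : (string)row[0];
        // ... per-row work goes here ...
    }

    Dts.TaskResult = (int)DTSExecResult.Success;
}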
Post #1364689
Posted Wednesday, September 26, 2012 11:10 AM


SSCertifiable


Group: General Forum Members
Last Login: Today @ 2:51 AM
Points: 6,171, Visits: 7,238
rburko (9/26/2012)
I'm taking the recordset, saving it to a variable, and using it in a subsequent loop.

I'm sure I can cajigger the subsequent step to read from a temp table or something, but I wanted to use the loop. And you're right, the things I've seen about this particular error seem only obliquely related to what I'm trying to do.

If it's a buffer issue, I don't know where to change the settings to get it to work, if that's at all feasible.


There's another way to deliver to a recordset, but it's awkward: use a Script Component destination and build out your own code to feed the recordset object. Worst case, you might at least get a usable error. I don't know how to do it offhand, but I know there are a few blog posts around the web on it.
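A rough sketch of that approach, in case it helps: the column name EmailAddress and the read-write variable User::rsRows below are placeholders, and the attributes and other designer-generated scaffolding are omitted. The component, configured as a destination, buffers incoming rows into a DataTable and hands it to the variable in PostExecute, after which a Foreach Loop with the ADO enumerator can iterate it just like the Recordset destination's output.

// Edit the designer-generated ScriptMain of a Script Component (destination).
// "User::rsRows" must be listed under the component's ReadWriteVariables.
using System;
using System.Data;

public class ScriptMain : UserComponent
{
    private DataTable _table;

    public override void PreExecute()
    {
        base.PreExecute();
        _table = new DataTable();
        _table.Columns.Add("EmailAddress", typeof(string));   // placeholder column name
    }

    // Called once per incoming row: buffer the value locally.
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        object value = Row.EmailAddress_IsNull ? (object)DBNull.Value : Row.EmailAddress;
        _table.Rows.Add(value);
    }

    public override void PostExecute()
    {
        base.PostExecute();
        // Hand the buffered rows to the package variable for a downstream Foreach loop.
        Variables.rsRows = _table;
    }
}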



- Craig Farrell

Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables

Twitter: @AnyWayDBA
Post #1364847