SSIS Buffer Error

  • sgmunson - Thursday, May 3, 2018 2:07 PM

    skaggs.andrew - Thursday, May 3, 2018 11:14 AM:
    The package says the execution completed. It acts as though the data flow completed without error. However, when I look at the links between the sources and targets within the data flow task, I can see that they have not all completed. It's as if everything stopped and didn't complete the data transfer from source to target.

    Phil Parkin - Thursday, May 3, 2018 11:25 AM:
    Sure, but that's not quite what I meant 🙂
    Is this just a simple data flow from A to B, or is there more complexity than that (e.g., lookups, aggregations, joins)?

    skaggs.andrew - Thursday, May 3, 2018 11:32 AM:
    Sorry. This is a very straightforward source-to-target load, no transformations involved. I have a data flow task, and inside it I have probably 10 source-to-target paths set up. This has been working for years, and now that I have migrated the package, I am getting this error.

    Phil Parkin - Thursday, May 3, 2018 12:45 PM:
    No problem.
    Was this a straight migration, or were any changes made to the packages? Specifically, values like Rows Per Batch and Maximum Insert Commit Size on the data flow destinations?
    Is the RAM on the 2017 machine the same as, or larger than, what you have on the 2008 instance?

    skaggs.andrew - Thursday, May 3, 2018 12:48 PM:
    I essentially just copied the packages over from the old environment. Maybe 10 packages in total. I have already tested about 6 of them with no issues until now. I did not make any changes to the package other than configuring the connection manager. I have the same amount of RAM on both machines.

    sgmunson - Thursday, May 3, 2018 1:07 PM:
    And how much RAM is that? Any chance you were on the edge of capacity in the older environment but just didn't know it? Is the new server under load yet, or is this still pre-production work? While there have been old boxes with 4 GB of RAM running SQL 2008 for years, boxes that small are often way under-powered, and even a new box with the same amount of RAM isn't likely to avoid issues that may have been hidden because they're edge cases. Sometimes just having much faster processors causes the RAM requirements to jump considerably, simply because the server can now do things so much faster. I rarely spec a SQL Server at much less than 32 GB of RAM, but that doesn't mean there aren't cases for slightly less...

    skaggs.andrew - Thursday, May 3, 2018 1:53 PM:
    The current (old) server has 16 GB of RAM; the new one also has 16 GB. Should I ask my admin to have it doubled?

    sgmunson - Thursday, May 3, 2018 2:07 PM:
    Maybe... but it might mean that you'd need to justify it. Do you have any stats hanging around that document RAM usage by SQL Server? It might be hard to justify without them. Alternatively, do you have any stats that document any kind of capacity limit being reached on the old server? It could be RAM, CPU, disk space, or even I/O waits.
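
    A minimal sketch of the kind of query that could document that RAM usage, assuming a point-in-time snapshot is enough (capturing the output on a schedule into a logging table, to show a trend, is left to the reader):

        -- Snapshot of SQL Server's own memory consumption vs. the whole box.
        -- sys.dm_os_process_memory reports what the SQL Server process is using;
        -- sys.dm_os_sys_memory reports the state of the server as a whole.
        -- Both DMVs ship with SQL Server 2008 and later; no setup required.
        SELECT pm.physical_memory_in_use_kb / 1024 AS sql_memory_in_use_mb,
               pm.memory_utilization_percentage,
               sm.available_physical_memory_kb / 1024 AS server_available_mb,
               sm.system_memory_state_desc
        FROM sys.dm_os_process_memory AS pm
        CROSS JOIN sys.dm_os_sys_memory AS sm;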

    Thanks, Steve.

    @OP, could you please query the system DMVs from SSMS to see how much memory is allocated to the buffers in SQL Server's main memory? I think this will help you sort out the issue.
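
    A minimal sketch of such a query, assuming the goal is to see how much of the buffer pool each database is holding (sys.dm_os_buffer_descriptors returns one row per 8 KB page currently in the buffer pool):

        -- Buffer pool usage per database, aggregated from 8 KB pages to MB.
        SELECT CASE database_id
                   WHEN 32767 THEN 'ResourceDb'
                   ELSE DB_NAME(database_id)
               END AS database_name,
               COUNT(*) * 8 / 1024 AS buffer_pool_mb
        FROM sys.dm_os_buffer_descriptors
        GROUP BY database_id
        ORDER BY buffer_pool_mb DESC;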

  • subramaniam.chandrasekar - Friday, May 4, 2018 5:28 AM

    How do you do that?  I have never done this before.  Thanks

  • skaggs.andrew - Friday, May 4, 2018 7:15 PM

    Kindly look at this article:

    https://www.red-gate.com/simple-talk/blogs/a-quick-look-at-sys-dm_os_buffer_descriptors/

    Also, try to find out the buffer sizes and, if needed, make the necessary changes to them.
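
    On the SSIS side, buffer sizing is controlled per Data Flow Task by the DefaultBufferSize and DefaultBufferMaxRows properties (the defaults are 10 MB and 10,000 rows). A sketch of overriding them at execution time with dtexec, where "MyPackage.dtsx" and "Data Flow Task" are placeholders for your own package and task names:

        REM Hypothetical values; 20971520 bytes = 20 MB per buffer.
        dtexec /F "MyPackage.dtsx" ^
            /Set "\Package\Data Flow Task.Properties[DefaultBufferSize];20971520" ^
            /Set "\Package\Data Flow Task.Properties[DefaultBufferMaxRows];50000"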

  • skaggs.andrew - Tuesday, May 8, 2018 12:53 PM

    I have added more power to the server, but I am still running into issues executing my package. When I run the package, it will partially load, then I get the message at the bottom saying that everything has executed successfully. However, I can see in my data flow that it hasn't actually completed loading everything. I searched the execution results, and there are no error messages. It's as if it gets caught up and freezes, but says that everything has completed. Any suggestions? I have tried running this multiple times now, moving things around, and I no longer get anything about memory or buffers.

  • skaggs.andrew - Tuesday, May 8, 2018 12:53 PM

    You'll have to be a lot more explicit about what you mean by "see in my data flow that it hasn't actually completed loading everything." We can't see your package or the results, so you'll have to provide a lot more detail. However, the fact that an increase in capacity (you said "added more power," which isn't exactly specific) stopped the errors from occurring suggests that the original problem has disappeared and that capacity may well have been an issue. Thus two questions arise: 1) What, specifically, does "added more power" mean? 2) What do you mean by "I can see in my data flow that it hasn't actually completed loading everything"?

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • sgmunson - Tuesday, May 8, 2018 3:21 PM

    8 CPUs and 24 GB of RAM is what I have on the server.

    I attached some screenshots of the control flow, the data flow, and the ending info from the results. Let me know your thoughts. Thanks.

  • skaggs.andrew - Tuesday, May 8, 2018 5:14 PM

    I get the feeling that there is a precedence constraint deciding when the package succeeds that relies on the INCIDENT_LU_CODE portion completing. That precedence constraint will have to change; alternatively, all of the other data flow tasks prior to it could go into their own container, with INCIDENT_LU_CODE depending on the completion of that container before it starts. Or remove the success precedence constraint altogether. Or post back with why it is in there. If you predicate package success on an element that can complete before other elements, this is exactly what can happen.

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • skaggs.andrew - Tuesday, May 8, 2018 5:14 PM

    Also, 24 GB is actually pretty small for running SSIS packages, and your package appears to do a lot of work in parallel, which generally tends to drive RAM usage higher. While you provided your current CPU and RAM specs, what were they before? How large is your largest database on that server? I'd guess you were most likely under-powered to do what this package does, in addition to whatever the remaining load on that server is. If you're expecting growth, I'd start looking at getting 32 GB, with eventual plans to double that as you grow.

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • sgmunson - Wednesday, May 9, 2018 6:31 AM

    I tried making updates to the precedence constraints to test out the package. It still executed in the same manner. However, I then manually ran just the first Data Flow Task in the container. That data flow task has only one source -> target flow. I attached screenshots. This makes me think it isn't the precedence constraint. Thoughts? Thanks.

  • sgmunson - Wednesday, May 9, 2018 6:41 AM

    I don't recall what the CPU was, but the RAM was 16 GB before. The largest DB on this server that my SSIS packages write to is 279 GB.

  • skaggs.andrew - Wednesday, May 9, 2018 8:53 AM

    Please be more specific. What was the precedence constraint for package success? How, exactly, did you change it? If you attached any new screenshots, I don't see them, and I've already looked at the ones you posted earlier. I still suspect a precedence and/or package success constraint that is misconfigured in some way. It seems to me that if there were a bug that could cause this kind of problem, someone else would already have found it. And I've seen this kind of problem before: if package success depends on only one thing, and that one thing can finish before the rest of the elements of the package, then the things that aren't done yet get abandoned and the package ends. You have to be rather careful when setting up package success constraints to ensure you DON'T run into this problem.

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • sgmunson - Wednesday, May 9, 2018 10:30 AM

    Sorry, here are the screenshots.

  • skaggs.andrew - Wednesday, May 9, 2018 10:36 AM

    In your Incident Tables container, I'd be looking to see the actual precedence constraints for ALL of the elements within it. I'm also interested in any package success constraints. Need the details...

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • sgmunson - Wednesday, May 9, 2018 11:05 AM

    As of now, I have all of the precedence constraints between data flow tasks set up as follows:

    Evaluation Operation = Constraint
    Value = Completion

    In the beginning, all of the values were set to "Success"; I switched them to see if that would fix it. It is stopping on the very first data flow task, which has just one source => target process.
    I don't see any constraints inside the data flow task. Are there constraints there as well?

  • skaggs.andrew - Wednesday, May 9, 2018 12:34 PM

    Just to test, I created an all-new package with one data flow task. Inside the task I created a source and a target: I'm pulling from my client's Oracle DB and doing a straight load into the Temp database in SQL Server. There is nothing else at all in this package. I ran it, and it stops at roughly the same number of rows loaded each time. It hasn't completed, but the message below says it's complete. I attached a file so you can see. Do you think maybe it's not the constraints? I'm lost as to why it just abruptly stops.
