Data flow doesn't finish processing records to OLE DB destination

  • Hi,
    I've got a fairly simple data flow that loads from a flat file to a SQL Server table. According to the log, it knows how many rows are in the source file, but it only processes a portion of the file and then just hangs after 15-20 minutes. This only happens when the file has over 1 million rows. I haven't found any errors or anything unusual in the logs.
    I've tried setting delay validation on both the connections and the input and output objects. I've tried running multiple files in the same package and also one at a time. I'm dropping and re-creating the destination table prior to loading, so I don't think it's hanging because of a truncation. I do have a redirect to an error file, so it should process every row to one destination or the other. My destination table is an all-character staging table.
    I would appreciate any light you can shed on this.
    Thanks
    Vicki

  • It may be possible that your database log file is running out of disk space. You might be able to use BULK INSERT with a format file instead...

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)
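
    As a sketch of the BULK INSERT approach Steve mentions: the table name, file paths, and option values below are placeholders, not taken from the thread. BATCHSIZE keeps each commit small (relevant to the log-space theory), and ERRORFILE plays roughly the role of the SSIS error-row redirect.

    ```sql
    -- Hypothetical example: load the flat file straight into the staging table,
    -- bypassing the SSIS data flow. All names and paths are placeholders.
    BULK INSERT dbo.StagingTable
    FROM 'C:\Loads\source_file.txt'
    WITH (
        FORMATFILE = 'C:\Loads\source_file.fmt',  -- describes the column layout
        FIRSTROW = 2,                             -- skip the header row
        BATCHSIZE = 100000,                       -- commit every 100k rows
        ERRORFILE = 'C:\Loads\source_file.err',   -- bad rows go here instead of failing the load
        MAXERRORS = 1000
    );
    ```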

  • Wouldn't there be clues in the log about something like that?

  • vicki.l.lucht - Monday, September 10, 2018 9:23 AM

    Hi,
    I've got a fairly simple data flow that loads from a flat file to a SQL Server table. According to the log, it knows how many rows are in the source file, but it only processes a portion of the file and then just hangs after 15-20 minutes. This only happens when the file has over 1 million rows. I haven't found any errors or anything unusual in the logs.
    I've tried setting delay validation on both the connections and the input and output objects. I've tried running multiple files in the same package and also one at a time. I'm dropping and re-creating the destination table prior to loading, so I don't think it's hanging because of a truncation. I do have a redirect to an error file, so it should process every row to one destination or the other. My destination table is an all-character staging table.
    I would appreciate any light you can shed on this.
    Thanks
    Vicki

    Do any rows get committed to the SQL Server table? Does it always occur at the same point in the file? If you run the import locally, does the same thing happen?

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • vicki.l.lucht - Monday, September 10, 2018 11:45 AM

    Wouldn't there be clues in the log about something like that?

    Not if the log is filling up because there's no more disk space available... That kind of thing will hang you out to dry... and leave you there...


  • I am running it locally since it's still in development; I haven't tried running it through SQL Agent. Yes, it always stops at the same spot. I've tried setting the rows per batch and maximum commit size, and rows are getting committed. It actually finished on some runs yesterday, but it only processed about 3/4 of the rows, so it didn't "hang" that time; it's still not getting all the rows. I had the DBA check the transaction log yesterday while it was running, and he said it was only at 0.2%. I did put a derived column step in yesterday, just as a pass-through, so that the redirected rows had data in the correct columns to be fixed, but I think I tried that earlier and it seemed like nothing went to the redirect.
    Vicki

  • vicki.l.lucht - Tuesday, September 11, 2018 8:52 AM

    I am running it locally since it's still in development; I haven't tried running it through SQL Agent. Yes, it always stops at the same spot. I've tried setting the rows per batch and maximum commit size, and rows are getting committed. It actually finished on some runs yesterday, but it only processed about 3/4 of the rows, so it didn't "hang" that time; it's still not getting all the rows. I had the DBA check the transaction log yesterday while it was running, and he said it was only at 0.2%. I did put a derived column step in yesterday, just as a pass-through, so that the redirected rows had data in the correct columns to be fixed, but I think I tried that earlier and it seemed like nothing went to the redirect.
    Vicki

    OK. Try splitting the file in half and importing each half separately. Do they both finish?
    This step is an attempt to establish whether the problem is with the data or with the overall process.
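
    The split test above can be done with a short script; a minimal sketch (file names are placeholders, and it assumes a single header row that both halves need):

    ```python
    def split_file(path, out1, out2):
        """Split a flat file into two halves, repeating the header row in both."""
        with open(path, "r", encoding="utf-8") as f:
            header = f.readline()
            rows = f.readlines()
        mid = len(rows) // 2
        for out_path, chunk in ((out1, rows[:mid]), (out2, rows[mid:])):
            with open(out_path, "w", encoding="utf-8") as out:
                out.write(header)
                out.writelines(chunk)
        return mid, len(rows) - mid

    # Demo with a small sample file (the real file and column names would differ):
    with open("sample.txt", "w", encoding="utf-8") as f:
        f.write("col1|col2\n")
        f.writelines(f"a{i}|b{i}\n" for i in range(10))

    first, second = split_file("sample.txt", "sample_1.txt", "sample_2.txt")
    print(first, second)  # 5 5
    ```

    If both halves load cleanly on their own, the problem is more likely buffer/process related than bad data; if one half stalls, the bad rows are in that half and the test can be repeated on it.
    
    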


