• Zach, you have this correct.

     

    To flesh out the scenario a bit, this is a conversion process from one application to another involving as much as 10-15 years' worth of data.  In the original application, every time a real-life event happened (which could be tens or hundreds of times per day) a report (or narrative) was written.  These narratives vary in length from a single 70-character line to as many as 1000 70-character lines.

     

    The database in the new application is designed differently, so these old narratives are converted to text documents and then “attached” to the converted event in the new application.  This allows users to see all the old information, especially when the old info doesn’t fit well into the new database.
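    To make the reassembly step concrete, here is a minimal sketch of how per-event narrative lines might be turned into one text document per event. Everything here is hypothetical (the `(event_id, line_no, text)` row shape and the `write_narratives` helper are illustrative stand-ins, not the actual conversion code), and each file is written and closed as soon as its lines arrive rather than held in memory:

    ```python
    # Hypothetical sketch: reassemble per-event narrative lines into one
    # text document per event. Rows are assumed to be (event_id, line_no,
    # text) tuples, sorted by (event_id, line_no).
    from itertools import groupby
    from pathlib import Path

    def write_narratives(rows, out_dir):
        """Write one .txt file per event; return the number of files written."""
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        count = 0
        for event_id, lines in groupby(rows, key=lambda r: r[0]):
            path = out / f"event_{event_id}.txt"
            with path.open("w", encoding="utf-8") as f:
                for _, _, text in lines:
                    f.write(text.rstrip() + "\n")
            count += 1
        return count

    # Tiny usage example with made-up rows:
    sample = [
        (101, 1, "Unit dispatched to scene."),
        (101, 2, "Arrived at 14:32."),
        (102, 1, "Report filed by operator."),
    ]
    written = write_narratives(sample, "narratives_out")
    print(written)  # 2
    ```

    The point of the streaming structure is that memory use stays bounded by the size of one narrative (at most ~1000 lines), not by the number of events.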

     

    This conversion process will happen when the customer goes live on the new application.  They will stop the old system, the data will be converted, and the customer will go live on the new system that now contains their converted data.  The text documents don’t have to be generated exactly within that window, but will come very shortly thereafter.  So we are attempting to create these documents as quickly as possible.

     

    For the particular customer I am currently helping, they have roughly 3 million events for which the narratives have to be converted to text files.

     

    Any thoughts/suggestions are welcome.

     

    I agree that this sounds like some sort of memory-pressure issue.  I'm working to diagnose 1) exactly what the cause is, and 2) how I can fix it/work around it.
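    One common workaround for memory pressure in bulk export jobs like this is to process the events in fixed-size batches rather than materializing all 3 million at once. This is just a generic sketch of that pattern, not a claim about what the actual cause or fix is; `handle_batch` stands in for whatever per-batch query-and-write the real conversion performs:

    ```python
    # Generic batching sketch: feed event IDs to a handler in fixed-size
    # chunks so that only one batch's worth of data is live at a time.
    def process_in_batches(event_ids, batch_size, handle_batch):
        """Call handle_batch on successive chunks of batch_size IDs."""
        batch = []
        for eid in event_ids:
            batch.append(eid)
            if len(batch) == batch_size:
                handle_batch(batch)
                batch = []
        if batch:
            handle_batch(batch)  # final partial batch

    # Usage example: 10 IDs in batches of 4 -> chunk sizes 4, 4, 2.
    chunks = []
    process_in_batches(range(10), 4, lambda b: chunks.append(list(b)))
    print([len(c) for c in chunks])  # [4, 4, 2]
    ```

    Tuning the batch size lets you trade throughput against peak memory, which also makes it easier to confirm whether memory pressure really is the culprit.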