• Shaun-884394 (7/2/2012)


    This is a nice solution. However, when we have a hundred other files for similar processes, it will eventually lock up a considerable amount of memory indefinitely.

    I was thinking of splitting the task into two:

    1. File watcher task (WMI)

    2. File processing task (Package)

    The watcher task could be a package or a Windows service that runs continuously. It would trigger the processing package when the file is in place.

    This way, fewer system resources would be consumed.

    The file watcher task would be singular for all the source files, i.e., it would keep looping through all the source directories and fire the appropriate package as and when a file arrives.

    Obviously it would need a mapping between file and package, stored either in SQL or in a config file.

    Thank you for the kind words, Shaun-884394. For discussion purposes, on my machine the simple package produced by the demo in the article occupied ~15MB of RAM while waiting for a file to arrive. I agree that having one hundred packages loaded concurrently could be a concern, depending on the complexity of the packages and how much RAM was available on the server.

    Although I felt that a scenario involving one hundred files was a bit out of scope for this particular article, it would be interesting to consider for a future one. If that were part of the requirements in a particular environment, the ideas you proposed involving a file-watching service or a parent/child package approach would definitely be worth exploring during the design phase.
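    To make the watcher-plus-mapping idea concrete, here is a rough sketch in Python of a single pass of such a watcher loop. The file names, package names, and the dtexec invocation shown in the comments are illustrative assumptions, not something from the article; a real service would run this on a timer or use OS file-change notifications.

    ```python
    import os

    # Hypothetical mapping between incoming file names and the SSIS
    # package that should process them. In practice this would be
    # stored in a SQL table or a config file, as suggested above.
    FILE_PACKAGE_MAP = {
        "sales.csv": "LoadSales.dtsx",
        "inventory.csv": "LoadInventory.dtsx",
    }

    def scan_for_arrivals(source_dir, mapping=FILE_PACKAGE_MAP):
        """One pass of the watcher loop: return (file, package) pairs
        for any mapped files found waiting in source_dir."""
        to_fire = []
        for name in os.listdir(source_dir):
            package = mapping.get(name)
            if package is not None:
                to_fire.append((name, package))
        return to_fire

    # A real watcher service would loop over all source directories,
    # call scan_for_arrivals on each, and launch each matched package,
    # e.g. with the dtexec utility:
    #   subprocess.run(["dtexec", "/f", package_path])
    ```

    Because only the single watcher process stays resident, the heavy processing packages are loaded into memory only while a file is actually being processed.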

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato