• So at this point your job completes successfully, loads the data, and archives the CSV files, but the job history does not include job step output information?

    If that is the case, the reason the second step does not produce job step output is the very basic threading implementation (I'm not sure I should even call it threading). In Run-SmoToCsvFile.ps1 there is a function called "LaunchThread" which uses the .NET System.Diagnostics.ProcessStartInfo class to start several new, independent PowerShell consoles, each running Write-SmoToCsvFile.ps1 (the number depends on your $maxthread setting). The problem with this approach is that the original PowerShell command executed from SQL Agent has no knowledge of those independent shells. It just starts them, much like sp_start_job in SQL Server: when you execute sp_start_job you only learn whether the job started successfully, not whether the job itself completed or failed.
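
    For reference, the launch pattern looks roughly like the sketch below. This is a minimal illustration only: the script path, the -ServerName parameter, and the exact powershell.exe arguments are assumptions for the example, not the actual Run-SmoToCsvFile.ps1 code.

    # Illustrative sketch of the LaunchThread approach (names and paths are hypothetical)
    function LaunchThread ($serverName)
    {
        # Build the command line for a brand-new, independent PowerShell console
        $psi = New-Object System.Diagnostics.ProcessStartInfo
        $psi.FileName  = "powershell.exe"
        $psi.Arguments = "-NoProfile -File C:\Scripts\Write-SmoToCsvFile.ps1 -ServerName $serverName"
        $psi.UseShellExecute = $false

        # Start the child process and return immediately; the parent session
        # never waits on it and never sees its output
        [System.Diagnostics.Process]::Start($psi) | Out-Null
    }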

    This causes a couple of problems. First, as you pointed out, the original session does not get back the information that would ultimately show up in the job step output. This is also why Write-SmoCsvToDb, which does not launch additional PowerShell consoles, does return job step output information. Second, Write-SmoCsvToDb can start before all of the independent PowerShell consoles running Write-SmoToCsvFile.ps1 have completed. To work around the second problem I had to include a WAIT step in the job after Run-SmoToCsvFile:

    WAITFOR DELAY '00:30'
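
    (WAITFOR DELAY uses hh:mm format, so '00:30' pauses the job for 30 minutes. The value has to be generous enough to cover the slowest Write-SmoToCsvFile.ps1 console, since nothing actually signals completion back to the job.)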

    For the first problem, I've tried writing output files rather than relying on the job step output.
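
    Purely as an illustration (the log folder, file name, and $ServerName variable below are assumptions, not the actual script), each spawned console could write its results and errors to its own log file along these lines:

    # Hypothetical logging inside Write-SmoToCsvFile.ps1; names are illustrative
    $logFile = Join-Path "C:\Logs" ("SmoToCsv_{0}_{1:yyyyMMdd_HHmmss}.log" -f $ServerName, (Get-Date))

    try
    {
        # ... SMO collection and CSV export work happens here ...
        "Completed $ServerName" | Out-File -FilePath $logFile -Append
    }
    catch
    {
        # Record the error text so failures are still visible somewhere
        $_ | Out-String | Out-File -FilePath $logFile -Append
    }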

    Admittedly, the solution of launching multiple Write-SmoToCsvFile.ps1 scripts from Run-SmoToCsvFile.ps1 is less than ideal. On the plus side, I'm able to reduce total runtime significantly by launching multiple processes instead of processing each server sequentially.