SQL Server PowerShell Extensions (SQLPSX) Part 2

  • Comments posted to this topic are about the item SQL Server PowerShell Extensions (SQLPSX) Part 2

  • This is a great way to collect the security state of my servers. It took a bit of tinkering to get it all working - I wanted the SQLPSX data in an existing metadata database that I have, so I wanted all the objects in their own schema. Lots of minor changes to the script, changing dbo. to SQLPSX., and we got there!

    I cannot (yet) get the SQL job to run via SQL Agent. The second step, "C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "C:\SQLPSX\run-SmoToCsvFile.ps1 2>&1"", fails with the error "Executed as user: . Unable to open Step output file. The step failed."

    When I run the same command from the command prompt it executes with no error, so I'm thinking it's permissions-based?

    Off to tweak the reports now.

    Many, many thanks for such a useful project

    Jonathan

  • Check the "Advanced" page under the job step "Run-SmoToCsvFile" and ensure the path to the SQLPSX.err output file exists. This is one difference between the command-line and SQL Agent execution where I'm using the output file option on job steps and probably explains why it works executing directly from command-line.

  • Yup, spotted that one pretty early on.

    Still not getting it to run from the job, though. 🙁

    SQLPSX_SMOtoCSV.log is empty

    SQLPSX_CSVtoDB.log has:

    Move-Item : Cannot create a file when that file already exists.

    At C:\SQLPSX\Write-SmoCsvToDb.ps1:75 char:10

    + Move-Item <<<< "$CsvDir*.log" "$arcDir$((Get-Date).ToString(`"yyyyMMdd`"))"

    Setting the directory C:\SQLPSX\Output\Archive\20081008\ to compress new files [OK]

    Compressing files in C:\SQLPSX\Output\Archive\20081008

    1 files within 2 directories were compressed.

    0 total bytes of data are stored in 0 bytes.

    The compression ratio is 1.0 to 1.

    SMOCSVTODB.log file has:

    2008-10-08 16:16 processCsv Login

    2008-10-08 16:16 processCsv ServerPermission

    2008-10-08 16:16 processCsv ServerRole

    2008-10-08 16:16 processCsv SqlLinkedServerLogin

    2008-10-08 16:16 processCsv SqlUser

    2008-10-08 16:16 processCsv DatabasePermission

    2008-10-08 16:16 processCsv ObjectPermssion

    2008-10-08 16:16 processCsv DatabaseRole

    2008-10-08 16:16 archiveCsv C:\SQLPSX\output\20081008

    and that's it.

    The job completes with success but moves zero rows of data.

    If I run "./run-smotocsvfile.ps1" it produces the expected data; I then run "./write-smocsvtodb.ps1" and SSRS shows the details perfectly.

    I cannot run the job step in SQLCmd mode, i.e.:

    "!!C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "run-SmoToCsvFile.ps1" >> C:\SQLPSX\SQLJobLog\SqlSec_Err.LOG"

    Do I need to give the full location of the run-smotocsvfile.ps1 file? --> c:\sqlpsx\run-smotocsvfile.ps1?

    Running this from the command prompt succeeds:

    "C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "C:\SQLPSX\run-SmoToCsvFile.ps1""

    Thanks.

  • I see a mistake in the article where I set the archive and data directory to the same directory. Here's what I have in my production script Write-SmoCsvToDb.ps1:

    $CsvDir = "$scriptRoot\Data\"

    $arcDir = "$scriptRoot\Data\Archive\"

    And here's what I have in my Write-SmoToCsvFile.ps1

    $dir = "$scriptRoot\Data\"

    You can use the $scriptRoot variable which is set at the beginning of each script or you can specify the path. Ensure the archive and data directories exist. The Write-SmoCsvToDb.ps1 script will create a subdirectory under \Data\Archive\ for each run and NTFS compress the subdirectory.

    Sorry about the error -- Let me know if this works and I'll correct the article.
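    If it helps, here's a minimal, hedged sketch of that directory setup - the Split-Path line is just one common way to derive $scriptRoot, and the New-Item guard is a suggestion rather than what the SQLPSX scripts actually do:

    # Derive the script's folder (one common approach; the SQLPSX scripts set
    # $scriptRoot at the top of each script in their own way).
    $scriptRoot = Split-Path -Parent $MyInvocation.MyCommand.Path

    $CsvDir = "$scriptRoot\Data\"
    $arcDir = "$scriptRoot\Data\Archive\"

    # Make sure both directories exist before the scripts try to write to them.
    foreach ($dir in $CsvDir, $arcDir)
    {
        if (-not (Test-Path $dir)) { New-Item -Path $dir -ItemType Directory | Out-Null }
    }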

  • It doesn't seem like you can run PowerShell through SQLCMD. I just tried running !!powershell -version and all I see are unprintable characters. I often run PowerShell through SQL Agent using the Operating System (CmdExec) type without issue. You do have to specify the full path of script files.

  • OK, SQLCMD isn't the big deal; I was trying to use it to debug why the job wasn't working...

    Any ideas on the reason the CSV files don't get generated, just the log files confirming that nothing has been done?!

    Thanks for the help so far.

    J

  • Nice series ...

  • Have you had a chance to try this fix?

    cmille19 (10/8/2008)


    I see a mistake in the article where I set the archive and data directory to the same directory. Here's what I have in my production script Write-SmoCsvToDb.ps1:

    $CsvDir = "$scriptRoot\Data\"

    $arcDir = "$scriptRoot\Data\Archive\"

    And here's what I have in my Write-SmoToCsvFile.ps1

    $dir = "$scriptRoot\Data\"

    You can use the $scriptRoot variable which is set at the beginning of each script or you can specify the path. Ensure the archive and data directories exist. The Write-SmoCsvToDb.ps1 script will create a subdirectory under \Data\Archive\ for each run and NTFS compress the subdirectory.

    Sorry about the error -- Let me know if this works and I'll correct the article.

  • Many apologies, I hadn't seen your two posts before mine.

    I downloaded the files and copied them to the server - I didn't spot that the article was different. I have them as:

    Write-SmoCsvToDb.ps1

    $CsvDir = "$ScriptRoot\output\" #"$scriptRoot\Data\"

    $arcDir = "$scriptRoot\Output\Archive\"#"$scriptRoot\Data\Archive\"

    Write-SmoToCsvFile.ps1

    $dir = "$scriptRoot\output\"#"$scriptRoot\Data\"

    The logging actions from these files work - that's to say the .log (.err in the original files) is created and then collected up and deposited in the data/archive/...date... folder. Its whole contents are:

    2008-10-09 15:11 processCsv Login

    2008-10-09 15:11 processCsv ServerPermission

    2008-10-09 15:11 processCsv ServerRole

    2008-10-09 15:11 processCsv SqlLinkedServerLogin

    2008-10-09 15:11 processCsv SqlUser

    2008-10-09 15:11 processCsv DatabasePermission

    2008-10-09 15:11 processCsv ObjectPermssion

    2008-10-09 15:11 processCsv DatabaseRole

    2008-10-09 15:11 archiveCsv C:\SQLPSX\output\20081009

    The log files created by the SQL job steps are created in a different folder (at the same level as Archive).

    SQLPSX_SMOtoCSV.log is empty

    SQLPSX_CSVtoDB.log contains

    Directory: Microsoft.PowerShell.Core\FileSystem::C:\SQLPSX\Output\Archive

    Mode LastWriteTime Length Name

    ---- ------------- ------ ----

    d---- 09/10/2008 15:11 20081009

    Setting the directory C:\SQLPSX\Output\Archive\20081009\ to compress new files [OK]

    Compressing files in C:\SQLPSX\Output\Archive\20081009

    smocsvtodb.log 398 : 398 = 1.0 to 1 [OK]

    2 files within 2 directories were compressed.

    398 total bytes of data are stored in 398 bytes.

    The compression ratio is 1.0 to 1.

    The SQL job reports success on all steps now. The history for step 2 (Run-SmoToCsvFile) shows:

    Message

    Executed as user: . The step did not generate any output. Process Exit Code 0. The step succeeded.

    I would have expected the output of the script creating the CSV files to show up in this message, at least in part if not in full.

    Sorry, I can't say it's all working yet...

  • So at this point your job completes, successfully loads the data, and archives the CSV files, but it does not include job step output information in the job history?

    If this is the case, the reason the second step's history does not include job step output information is the implementation of some very basic threading (I'm not sure I should even call it threading). In Run-SmoToCsvFile.ps1 there is a function called "LaunchThread" which uses the .NET System.Diagnostics.ProcessStartInfo class to start several new independent PowerShell consoles running Write-SmoToCsvFile.ps1 (the number depends on your $maxthread setting). The problem with this approach is that the original PowerShell command executed from SQL Agent has no knowledge of the independent shells. It just starts them, kind of like sp_start_job in SQL Server. If you execute sp_start_job you only receive success or failure of the job starting, not whether the job itself completed or failed.

    This causes a couple of problems. First, as you pointed out, the original session does not get back information which would ultimately show up in the job step output. This is also why Write-SmoCsvToDb, which does not launch additional PowerShell consoles, returns job step output information. The second problem is that Write-SmoCsvToDb can start before all of the independent PowerShell consoles running Write-SmoToCsvFile have completed. To overcome the second problem I had to include a WAIT step in the job after Run-SmoToCsvFile:

    WAITFOR DELAY '00:30'

    For the first problem I've tried to include output files rather than use the job step output.

    Admittedly, the solution of launching multiple Write-SmoToCsvFile.ps1 scripts from Run-SmoToCsvFile.ps1 is less than ideal. On the plus side, I'm able to reduce total runtime significantly by launching multiple processes instead of processing each server sequentially.
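    To make the pattern concrete, here is a minimal, hedged sketch of that launch approach - the function body, server list, throttling logic, and variable names are illustrative, not the exact code in Run-SmoToCsvFile.ps1:

    # Sketch only: launch independent powershell.exe processes via
    # System.Diagnostics.ProcessStartInfo. The parent session returns as soon
    # as each child is started, so it never sees the child's output -- which is
    # why the job step history stays empty for this step.
    $maxThread = 3
    $servers = @('Server1', 'Server2\Instance1')   # placeholder server list

    function LaunchThread ($server)
    {
        $psi = New-Object System.Diagnostics.ProcessStartInfo
        $psi.FileName = "$env:windir\system32\WindowsPowerShell\v1.0\powershell.exe"
        $psi.Arguments = "-command `"C:\SQLPSX\Write-SmoToCsvFile.ps1 '$server'`""
        $psi.UseShellExecute = $false
        [System.Diagnostics.Process]::Start($psi) | Out-Null
    }

    foreach ($server in $servers)
    {
        # Crude throttle: wait while too many powershell.exe processes are running.
        while (@(Get-Process powershell -ErrorAction SilentlyContinue).Count -gt $maxThread)
        {
            Start-Sleep -Seconds 5
        }
        LaunchThread $server
    }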

  • Unfortunately not; the job just generates the log files. There are no CSV files created or archived. Apparently the first PS script doesn't begin; just the SS job log files are created...

    I am wondering whether to create a .bat or .vbs file that will run the PS scripts and then start that off from the SS job ("the knee bone is connected to the ankle bone, the ankle bone is connected to the .... !!!")

  • This might seem obvious, but is your execution policy set to RemoteSigned? Also, you may want to just run Write-SmoToCsvFile.ps1 and pass in the server name instead of using Run-SmoToCsvFile.ps1, as a test:

    C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "C:\SQLPSX\Write-SmoToCsvFile.ps1 'MyServer'"
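    If you want to double-check the policy on the server itself (a generic PowerShell check, not part of SQLPSX), something like this works from a PowerShell prompt:

    # Show the current execution policy; it should report RemoteSigned (or less restrictive).
    Get-ExecutionPolicy
    # If not, an administrator can set it (machine-wide in PowerShell v1):
    Set-ExecutionPolicy RemoteSigned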

  • Yes - RemoteSigned.

    Tried the single server name in three styles -

    - SERVER\INSTANCE

    - SERVER

    - INSTANCE

    and there is no difference in the execution. A log file gets created and archived.

    No other work is done.

  • I am getting the same problem as Jonathan. Is there a workaround for this when using the Agent to run the job?
