SQL Server PowerShell Extensions (SQLPSX) Part 2
Posted Wednesday, October 8, 2008 12:38 AM
Comments posted to this topic are about the item SQL Server PowerShell Extensions (SQLPSX) Part 2


Post #582357
Posted Wednesday, October 8, 2008 4:51 AM
This is a great way to collect the security state of my servers. It took a bit of tinkering to get it all working: I wanted the SQLPSX data in an existing metadata database of mine, with all the objects in their own schema. Lots of minor changes to the script, changing dbo. to SQLPSX., and we got there!
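For anyone making the same change, a minimal sketch of that bulk edit (assuming the schema move really is a mechanical dbo. to SQLPSX. substitution; the path and file filter are illustrative, not from the article):

# Rewrite every dbo. reference to the SQLPSX schema across the install scripts.
Get-ChildItem 'C:\SQLPSX' -Filter *.sql | ForEach-Object {
    (Get-Content $_.FullName) -replace 'dbo\.', 'SQLPSX.' |
        Set-Content $_.FullName
}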

I cannot (yet) get the SQL job to run via SQL Agent. The second step, "C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "C:\SQLPSX\run-SmoToCsvFile.ps1 2>&1"", fails with the error "Executed as user: . Unable to open Step output file. The step failed."
When I run the same command from the command prompt it executes with no error, so I am thinking it's permissions-based?

Off to tweak the reports now.


Many, many thanks for such a useful project

Jonathan
Post #582442
Posted Wednesday, October 8, 2008 7:57 AM
Check the "Advanced" page under the job step "Run-SmoToCsvFile" and ensure the path to the SQLPSX.err output file exists. This is one difference between the command-line and SQL Agent execution where I'm using the output file option on job steps and probably explains why it works executing directly from command-line.


Post #582568
Posted Wednesday, October 8, 2008 9:35 AM
Yup, spotted that one pretty early on.

Still not getting it to run from the job, though.
SQLPSX_SMOtoCSV.log is empty
SQLPSX_CSVtoDB.log has:
Move-Item : Cannot create a file when that file already exists.
At C:\SQLPSX\Write-SmoCsvToDb.ps1:75 char:10
+ Move-Item <<<< "$CsvDir*.log" "$arcDir$((Get-Date).ToString(`"yyyyMMdd`"))"

Setting the directory C:\SQLPSX\Output\Archive\20081008\ to compress new files [OK]

Compressing files in C:\SQLPSX\Output\Archive\20081008\


1 files within 2 directories were compressed.
0 total bytes of data are stored in 0 bytes.
The compression ratio is 1.0 to 1.

SMOCSVTODB.log has:
2008-10-08 16:16 processCsv Login
2008-10-08 16:16 processCsv ServerPermission
2008-10-08 16:16 processCsv ServerRole
2008-10-08 16:16 processCsv SqlLinkedServerLogin
2008-10-08 16:16 processCsv SqlUser
2008-10-08 16:16 processCsv DatabasePermission
2008-10-08 16:16 processCsv ObjectPermssion
2008-10-08 16:16 processCsv DatabaseRole
2008-10-08 16:16 archiveCsv C:\SQLPSX\output\20081008

And that's it.

The job completes with success but moves zero rows of data.

If I run "./run-smotocsvfile.ps1" it ships the expected data; I then run "./write-smocsvtodb.ps1" and SSRS shows the details perfectly.

I cannot run the job step in SQLCmd mode, i.e.:
"!!C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "run-SmoToCsvFile.ps1" >> C:\SQLPSX\SQLJobLog\SqlSec_Err.LOG"

Do I need to give the full location of the run-smotocsvfile.ps1 file? --> c:\sqlpsx\run-smotocsvfile.ps1?

Running this from the command prompt succeeds:
"C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "C:\SQLPSX\run-SmoToCsvFile.ps1"
Thanks.
Post #582706
Posted Wednesday, October 8, 2008 10:08 AM
I see a mistake in the article: I set the archive and data directories to the same directory. Here's what I have in my production script Write-SmoCsvToDb.ps1:
$CsvDir = "$scriptRoot\Data\"
$arcDir = "$scriptRoot\Data\Archive\"

And here's what I have in my Write-SmoToCsvFile.ps1

$dir = "$scriptRoot\Data\"

You can use the $scriptRoot variable, which is set at the beginning of each script, or you can specify the path. Ensure the archive and data directories exist. The Write-SmoCsvToDb.ps1 script will create a subdirectory under \Data\Archive\ for each run and NTFS-compress the subdirectory.
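A short sketch to make sure both directories exist before the jobs run (using the same $scriptRoot convention as the scripts):

# Create the data and archive directories if they are missing
# ($scriptRoot is set at the top of each SQLPSX script).
foreach ($d in "$scriptRoot\Data", "$scriptRoot\Data\Archive") {
    if (-not (Test-Path $d)) { New-Item -ItemType Directory -Path $d | Out-Null }
}

With distinct data and archive directories, the Move-Item "Cannot create a file when that file already exists" error shown above should also disappear. Adding -Force to that Move-Item call would paper over a leftover same-named .log file, but separating the directories fixes the root cause.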

Sorry about the error -- Let me know if this works and I'll correct the article.




Post #582738
Posted Wednesday, October 8, 2008 11:11 AM
It doesn't seem like you can run PowerShell through SQLCMD; I just tried running !!powershell -version and all I see are unprintable characters. I often run PowerShell through SQL Agent using the Operating System (CmdExec) step type without issue, but you do have to specify the full path of script files.
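For reference, the working CmdExec step command quoted earlier in the thread has this shape, with the script path fully qualified (SQL Agent does not start the step in the script's folder, so a bare run-SmoToCsvFile.ps1 will not resolve):

C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.EXE -command "C:\SQLPSX\run-SmoToCsvFile.ps1 2>&1"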


Post #582786
Posted Wednesday, October 8, 2008 2:41 PM
OK, SQLCMD isn't the big deal; I was trying to use that to debug why the job wasn't working...

Any ideas on why the CSV files don't get generated, just the log files confirming that nothing has been done?!

Thanks for the help so far.

J
Post #582932
Posted Thursday, October 9, 2008 1:48 AM
Nice series ...


Post #583107
Posted Thursday, October 9, 2008 5:23 AM
Have you had a chance to try this fix?

cmille19 (10/8/2008)
I see a mistake in the article: I set the archive and data directories to the same directory. Here's what I have in my production script Write-SmoCsvToDb.ps1:
$CsvDir = "$scriptRoot\Data\"
$arcDir = "$scriptRoot\Data\Archive\"

And here's what I have in my Write-SmoToCsvFile.ps1

$dir = "$scriptRoot\Data\"

You can use the $scriptRoot variable, which is set at the beginning of each script, or you can specify the path. Ensure the archive and data directories exist. The Write-SmoCsvToDb.ps1 script will create a subdirectory under \Data\Archive\ for each run and NTFS-compress the subdirectory.

Sorry about the error -- Let me know if this works and I'll correct the article.




Post #583205
Posted Thursday, October 9, 2008 8:31 AM
Many apologies, I hadn't seen your two posts before mine.

I downloaded the files and copied them to the server; I didn't spot that the article was different. I have them as:
Write-SmoCsvToDb.ps1
$CsvDir = "$ScriptRoot\output\" #"$scriptRoot\Data\"
$arcDir = "$scriptRoot\Output\Archive\" #"$scriptRoot\Data\Archive\"


Write-SmoToCsvFile.ps1
$dir = "$scriptRoot\output\" #"$scriptRoot\Data\"

The logging actions from these files work; that's to say, the .log (.err in the original files) is created and then collected up and deposited in the data/archive/...date... folder. Its whole contents are:
2008-10-09 15:11 processCsv Login
2008-10-09 15:11 processCsv ServerPermission
2008-10-09 15:11 processCsv ServerRole
2008-10-09 15:11 processCsv SqlLinkedServerLogin
2008-10-09 15:11 processCsv SqlUser
2008-10-09 15:11 processCsv DatabasePermission
2008-10-09 15:11 processCsv ObjectPermssion
2008-10-09 15:11 processCsv DatabaseRole
2008-10-09 15:11 archiveCsv C:\SQLPSX\output\20081009

The log files created by the SQL job steps get created in a different folder (same level as Archive).
SQLPSX_SMOtoCSV.log is empty.
SQLPSX_CSVtoDB.log contains:
Directory: Microsoft.PowerShell.Core\FileSystem::C:\SQLPSX\Output\Archive


Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 09/10/2008 15:11 20081009

Setting the directory C:\SQLPSX\Output\Archive\20081009\ to compress new files [OK]

Compressing files in C:\SQLPSX\Output\Archive\20081009\

smocsvtodb.log 398 : 398 = 1.0 to 1 [OK]

2 files within 2 directories were compressed.
398 total bytes of data are stored in 398 bytes.
The compression ratio is 1.0 to 1.


The SQL job reports success on all steps now. The history for step 2 (Run-SmoToCsvFile) shows:
Message
Executed as user: . The step did not generate any output. Process Exit Code 0. The step succeeded.


I would have expected the output of the script creating the CSV files to run into this message, at least in part if not in full.

Sorry, I can't say it's all working yet...
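One way to make the step leave evidence behind (a sketch, not from the thread: it wraps the existing script call in a transcript so even a silent Agent run writes something; the transcript path is illustrative):

# Capture everything the script writes, including errors, into a log file.
Start-Transcript -Path 'C:\SQLPSX\SQLJobLog\Run-SmoToCsvFile.transcript.txt'
& 'C:\SQLPSX\run-SmoToCsvFile.ps1'
Stop-Transcript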
Post #583358