Converting Substring to DateTime - extra eyes please

  • I didn't mean to be flippant with my response; it just seemed like a lot of work was being put into what looked like relatively simple folder cleanup. So I gave a generic solution that should do what he was trying to do, without giving details about exactly where to run it from, as there are a number of options in that regard depending on exactly where this fits into the whole picture.

  • Jeff Moden wrote:

    David Burrows wrote:

    Jeff Moden wrote:

    Totally agree with the XP_CmdShell thing.  Of course, if you do that, you don't need PowerShell to begin with.  Just create the file deletion commands directly from MSDB backup history and Bob's your uncle.   Of course, that's not what I'm looking for from ZZartin.  I'm looking for how he'd run his PowerShell suggestion from a proc or a job along with being able to pass a parameter for the path to it.   As for SQL Agent having a "Powershell Type", let's see you pass a path to the job that runs that. 😀   And, of course, if a "fixed" directory is always used for this problem then, yes, an SQL Agent Job could be built to do it.      

    That's what I love about you Jeff, I answer a post and you slap me with a pork chop, a cold one at that 🙁 But seriously, I agree with you about xp_cmdshell. However, PowerShell does have the ability to accept command-line parameters. That doesn't help with the PowerShell step type in the Agent, though.

      Oh, man. I'm sorry it came across as a pork chop. Didn't mean to do that at all, David. I was offering alternatives to the PowerShell solution because I didn't see the need for PowerShell here at all. Equally so, I'm not trying to pork chop ZZartin, either. I just see a whole lot of PowerShell solutions/suggestions and wonder if they're running it from a batch file or what, and they almost never say "and here's how to run it in a proc or from an SQL Agent Job", which looked like the method really needed here. I also ask people how they would schedule their PowerShell batch, and I either end up with crickets or someone suggests (definitely not my first choice) using the Windows Scheduler. So, no slight to either of you, and I sure didn't mean to make it look like I was trying to turn you good folks into pork chop landing pads.

    No apologies necessary Jeff 🙂

    I should have qualified my answers a bit better, like you did. And I totally agree that people think PowerShell is a panacea for all their problems. It's like any tool: use it when you need to, not just because it's there and it's new.

     

    Far away is close at hand in the images of elsewhere.
    Anon.

  • ZZartin wrote:

    I didn't mean to be flippant with my response it just seemed like there was a lot of work being put into what seemed like relatively simple folder clean up.  So I gave a generic solution to that which should do what he was trying to do without giving details about exactly where to run it from as there's a number of options in that regard depending on exactly where this fits into the whole picture.

    You've been around for a very long time (more than 8 years) and you're definitely one of the good guys, ZZ.  Along with all of your posts over the years, your statement above is positive proof of that!   Thanks for helping folks out and I apologize if my question seemed like a high velocity pork chop.

     

    I'm still curious, though. Is there a way to call your good line of PowerShell from a stored proc OR a job other than using xp_CmdShell or a PoSh job step, and if not, is there a way to pass a parameter (to control which directory to use, for example) to the PoSh job step? I ask not only for others but because the code you posted actually DOES have a use for something that I'm doing.

     

    I also missed the fact that Brandie said she already has the "Copy Only" backups in an otherwise exclusive directory.

     

     

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • I'm still curious, though. Is there a way to call your good line of PowerShell from a stored proc OR a job other than using xp_CmdShell or a PoSh job step, and if not, is there a way to pass a parameter (to control which directory to use, for example) to the PoSh job step? I ask not only for others but because the code you posted actually DOES have a use for something that I'm doing.

    Hmm... if using xp_cmdshell is just not an option, then as far as I know there's no way to pass a parameter into a job step (I guess you could actually modify the job and then call it, but that seems kind of silly). What you could do is make an SSIS package that has a single command task in it and accepts a parameter, and call PowerShell from in there. That could be run from T-SQL if it's saved to the SSISDB. But that really depends on how dynamic the calls need to be.

    I might make a PowerShell script like this and put it in a directory SQL Server has access to:

    Param(
        [Parameter(Mandatory=$true)][string] $search_str,
        [int] $keep_count = 4
    )

    Get-ChildItem -Path $search_str |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -Skip $keep_count |
        ForEach-Object { Write-Output $_.FullName; $_.Delete() }

    Then, instead of having to run inline PowerShell code, the command becomes much simpler, for example powershell -File <path to script> -search_str "C:\Test\*.txt" in a command (normal command prompt, xp_cmdshell, etc.), or in a PowerShell job step just <path to script> -search_str "C:\Test\*.txt".

    Replace that C:\Test\*.txt with a variable/parameter or whatever if you don't want to hardcode it. Using a script file instead of doing it inline also makes it easier to add validation steps, which you might want; for example, C:\Test\ alone would just pull everything in the directory.
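The keep-newest-N-and-delete-the-rest logic of the script above, plus the kind of validation step just mentioned, can be sketched in another language for illustration. This is a Python analogue, not the script used in the thread; the wildcard check and function name are assumptions:

```python
import glob
import os

def purge_old_backups(search_str, keep_count=4):
    """Delete all files matching search_str except the keep_count
    most recently modified ones; returns the deleted paths.
    Illustrative analogue of the PowerShell script above."""
    # Validation step: a pattern with no wildcard (e.g. a bare
    # directory path) would match far too much, so refuse to proceed.
    if "*" not in search_str and "?" not in search_str:
        raise ValueError("search_str must contain a wildcard pattern")
    # Newest first, like Sort-Object LastWriteTime -Descending
    files = sorted(glob.glob(search_str),
                   key=os.path.getmtime, reverse=True)
    doomed = files[keep_count:]   # skip the newest keep_count files
    for path in doomed:
        print(path)               # echo the name, like Write-Output
        os.remove(path)
    return doomed
```

The same caveat applies as in the PowerShell version: whatever calls this decides the pattern, so validating it before deleting anything is the important part.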

     

    • This reply was modified 4 years, 11 months ago by  ZZartin.
  • Thanks, ZZ. What you wrote matches my findings, as well. I'm amazed that MS hasn't made PowerShell easier to use from SQL Server. I'm also amazed that they have done things like build a BULK EXPORT but haven't made either BCP or BULK INSERT auto-magically handle things like true CSV, or the comedic version that Excel creates when you export from Excel. Yes, I know there are workarounds (I do use PowerShell's Import-Csv followed immediately by Export-Csv to "normalize" such files prior to import), but it's a real shame that workarounds for such things are necessary.

     

    Brandie, thanks for letting us hi-jack your thread for a bit.

    --Jeff Moden



  • Jeff Moden wrote:

    IIRC, the file paths and the fact that they're "COPY ONLY" are stored in the MSDB database.  I think it would be a simple, very quick, and very accurate method of finding the backups you want to delete.   As a bit of a side bar, it would also be a lot simpler and safer if you did your "COPY ONLY" backups to a different directory rather than the same directory as the "real" backups.   You might also want to consider just doing PIT restores from the production backups because that would serve to prove that your backups are actually viable with no extra effort.

    My copy only backups actually are going to a different NAS share than the real backups. The real backups are protected from deletes for 30 days and are going to an encrypted NAS share. Of course, we're probably going to have to encrypt this one as well, but yes, we've already considered the "don't get them mixed up" issue.

    Thank you for the MSDB info. I think I might have seen that when previously trolling through the db tables, but I totally forgot it. You're right that it might be the fastest way of checking the date.

    Brandie Tarvin, MCITP Database Administrator
    LiveJournal Blog: http://brandietarvin.livejournal.com/
    On LinkedIn!, Google+, and Twitter.
    Freelance Writer: Shadowrun
    Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • No problem with the hijacking.

    FYI: My boss just figured out a way to pass PoSH parameters into a SQL job. So, it is a thing. Let me look into it and see how he did it. I'll get you a response as soon as I can.

    EDIT: We are currently calling powershell scripts via SSIS packages which are run via a SQL Server job and via a PoSH step in SQL Server, but each environment has its own PoSH script. So I don't think that's what you want.

    • This reply was modified 4 years, 11 months ago by  Brandie Tarvin.
    • This reply was modified 4 years, 11 months ago by  Brandie Tarvin. Reason: adding info

    Brandie Tarvin

  • Brandie Tarvin wrote:

    Thank you for the MSDB info. I think I might have seen that when previously trolling through the db tables, but I totally forgot it. You're right that it might be the fastest way of checking the date.

    heh, msdb was mentioned in the first reply to your post

    "get list of 4 last backups from appropriate server (msdb) - they are needed, the rest is not"

  • Yeah. I had so many things going on when that got posted I must have missed it. I was scanning the code responses and missed a lot of the rest of the convo. My bad.

    Brandie Tarvin

  • So I've changed my math to be just a DateDiff because, yay simple!

    SELECT dt.Subdirectory, DATEDIFF(hh,GETDATE(),bus.backup_finish_date) AS HrsAged
    INTO #PurgeBackups
    FROM #DirTree dt
    INNER JOIN LinkedSvr.msdb.dbo.backupmediafamily bmf
    ON dt.Subdirectory = REPLACE(bmf.physical_device_name,@PurgeFolder,'')
    INNER JOIN LinkedSvr.msdb.dbo.backupset bus
    ON bmf.media_set_id = bus.media_set_id
    WHERE bus.database_name = 'ABC1234' AND bus.is_copy_only = 1
    AND DATEDIFF(hh,GETDATE(),bus.backup_finish_date) < -12;

    Brandie Tarvin

  • btw, two things to consider:

    1. Reality vs. expectations

    When you analyse the list of files, you analyse things that actually exist on disk.

    When you analyse backup history, you analyse only historical information (records about backups).

    These records can be missing for several reasons: for example, when someone deletes a db from the GUI (SSMS), its backup history can also be deleted, or a regular backup-history cleanup job may have removed the records.

    2. Imported backup info

    When you restore a db from a backup made on another SQL Server, its backup information is imported into the backup history on the current SQL Server. So it's worth amending your query to analyse server_name and/or machine_name (table msdb..backupset).

     

  • So I haven't played with taking this to T-SQL yet, but my boss set up variables in a C# SSIS script task, using System.Management.Automation.Runspaces and adding a reference to System.Management.Automation.dll. He then called the PowerShell script within the script task and added the parameters via command.Parameters.Add(null, <variablename>). In the PoSH script, the $args[] array pulls the parameters in.

    As soon as I get a moment, I'm going to play, but I'm certain that $args can be used to pull in parameters pushed in by T-SQL. It's just a matter of figuring out how to stack it properly.

    Brandie Tarvin

  • Andrey, regarding your second point... this isn't really a restore solution. The backups are getting stored on a non-prod NAS for redundancy DR. Long story short, we needed a workaround for restoring an "in crisis" production database quickly to a non-prod server without having to load nearly 100 transaction logs along with the weekend's full backup.

    This is a tweak to existing DR practices that we're trying out to see if it's worth the effort and NAS space cost. We restore the first copy-only of the day to ensure the backups are still good, but the rest are just there for the "CYA, because our VP and executive staff had a concern."

    EDIT: Regarding the first point, I have an inner join from my temp table to the msdb tables precisely because I'm deleting files from the NAS share and only want data that matches. If someone's deleting stuff from the msdb tables, we have a problem. If someone's deleting the source database, we have a bigger issue than I'm trying to manage here, because that's our main production database. And if that goes away, we're all fired and won't have to worry about the fallout from one little NAS share / msdb tables not having the copy-only backup information.

    Brandie Tarvin

  • Brandie, thanks for clarification.

    My points were posted just to let you and others know that information in msdb can be either absent (deleted) or imported from another server (as a result of a restore).

    This is a common situation, and it's best to know whether or not it can happen in your environment.

    If it's not applicable to you, then it's fine.

    However, if this approach (analysing backup information in msdb) is used by anyone else in different environments/circumstances, knowing that can help avoid misunderstanding and confusion.

    Regards,

    Andrey.

  • If you want a quick "on disk" variant, you can take advantage of the fact your date stamps are sortable, i.e.

    Select SubDirectory From #DirTree
    Except
    Select SubDirectory From
        (Select Top (4) SubDirectory From #DirTree Order By SubDirectory Desc) X

    Which bypasses the whole date thing entirely and just keeps the 4 backups you're allowed to have.
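The EXCEPT trick works because the date-stamped names sort lexically in chronological order, so "newest 4" is just "lexically largest 4". The same sort-and-slice idea sketched in Python for illustration (the directory names are made up):

```python
def to_purge(subdirs, keep=4):
    """Everything except the `keep` lexically-largest (i.e. newest)
    names; mirrors SELECT ... EXCEPT SELECT TOP (4) ... ORDER BY DESC."""
    return sorted(subdirs, reverse=True)[keep:]
```

As in the T-SQL version, no date parsing is needed at all, which is the whole point: the naming convention carries the ordering.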
