SQLServerCentral is supported by Red Gate Software Ltd.
 
 
 
 
        



I'm trying to create a proc that will drop and create a function
Posted Tuesday, January 15, 2013 1:10 PM
Grasshopper


Group: General Forum Members
Last Login: Friday, February 7, 2014 2:31 PM
Points: 11, Visits: 74
I'd recommend using SSIS, as a previous poster suggested; that lets you run each SQL file individually.

I also recommend that someone stop pissing in Celko's Cheerios. Damn, that was harsh.
Post #1407449
Posted Tuesday, January 15, 2013 1:18 PM
Hall of Fame


Group: General Forum Members
Last Login: Friday, July 18, 2014 2:05 PM
Points: 3,943, Visits: 2,957
Agreed. Not everyone is obsessed with ISO standards.


Tally Tables - Performance Personified
String Splitting with True Performance
Best practices on how to ask questions
Post #1407454
Posted Tuesday, January 15, 2013 1:39 PM


Say Hey Kid


Group: General Forum Members
Last Login: Thursday, June 26, 2014 4:08 AM
Points: 708, Visits: 3,286
Haha... I made the effort to read some of Mr Celko's previous posts some time ago... Lol, he really lets loose sometimes, but you know, he does make some valid points. It's just that his style is so damn rude!!

Take it with a shovel of salt.

Anyway, as GSquared said, GO is not a T-SQL command; it's just a word that SSMS uses to know that you want to send a bunch of code as one batch.

You say you have 30 scripts and some are dependent on others, so you must also know the order they must run in, yes?

If so, then why not:

A) Open a new .sql file, copy and paste the contents of the first script, add a GO, then paste in the second script, and so on.

B) Schedule a SQL Agent job and place each script in its own step.

C) Create an SSIS package with a series of Execute SQL Task components.

D) Use a batch file.

Thanks.


---------------------------------------------------------


It takes a minimal capacity for rational thought to see that the corporate 'free press' is a structurally irrational and biased, and extremely violent, system of elite propaganda.
David Edwards - Media lens

Society has varying and conflicting interests; what is called objectivity is the disguise of one of these interests - that of neutrality. But neutrality is a fiction in an unneutral world. There are victims, there are executioners, and there are bystanders... and the 'objectivity' of the bystander calls for inaction while other heads fall.
Howard Zinn
Post #1407475
Posted Tuesday, January 15, 2013 2:17 PM


SSCoach


Group: General Forum Members
Last Login: Friday, June 27, 2014 12:43 PM
Points: 15,444, Visits: 9,596
Yeah, it's pretty safe to ignore the rabid weasel Joe in this instance. Nothing he wrote here actually has any value to the discussion. (Sometimes it does, and it's just poorly communicated. But not this time. Waste of magnetic dots on a platter this time.)

If you have 30 files to run, I don't see copying those into a single file as being a realistic option. Too cumbersome to copy-paste.

A batch file that runs the files in order, using sqlcmd, is probably the best bet in that case.
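A minimal sketch of that batch-file idea, written as POSIX shell for illustration (the same structure works as a Windows .bat; the server name, trusted-connection switch, and the `NN_*.sql` naming convention are all assumptions, not anything from this thread). It runs the numbered scripts in order and stops at the first failure, relying on sqlcmd's -b switch, which makes sqlcmd exit with a nonzero code when a T-SQL error occurs:

```shell
# Hypothetical sketch: run every NN_*.sql file in the current directory in
# order, stopping at the first failure so dependent scripts never run
# against a half-loaded database.
# SQLCMD is overridable (for testing, or to point at another tool); the
# default uses -S for the server, -E for a trusted connection, and -b so
# sqlcmd returns a nonzero exit code on a T-SQL error.
run_in_order() {
    for f in [0-9][0-9]_*.sql; do
        [ -e "$f" ] || { echo "no scripts found" >&2; return 1; }
        echo "running $f"
        # write each script's output to a matching .rpt file
        ${SQLCMD:-sqlcmd -S MyServer -E -b} -i "$f" -o "${f%.sql}.rpt" || {
            echo "FAILED on $f -- later scripts were NOT run" >&2
            return 1
        }
    done
    echo "all scripts completed"
}
```

The function's exit code can then drive whatever notification step you like (for instance, the success/failure email mentioned later in this thread).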

Are you in a position where you can take over a test server, run the scripts (by batch file or otherwise), and do a before/after check? If so, set up a test server just like the server you will eventually run the scripts on, back everything on it up, and run the scripts. Then you can fix/tweak them if they throw errors, then restore the test server and run them again.

Rinse ... repeat ...

Once they run without error, then compare what you end up with against your expected end result, and you'll know pretty well what to expect from your go-live.

If you can't test beforehand in that manner, just assume the whole thing will blow up in production, and go from there. Have triage and emergency crews on hand, expect problems, and make sure you have restorable backups before you run it. Plan for about 10 times the downtime that would be reasonable. Sometimes you just have to do a "damn the torpedoes, full speed ahead" type thing, and make sure your damage-control team is on the ball. But avoid that if you can at all do so, and do a dry run on a test server.


- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread

"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
Post #1407492
Posted Tuesday, January 15, 2013 3:25 PM
SSCommitted


Group: General Forum Members
Last Login: Today @ 10:27 AM
Points: 1,964, Visits: 2,900
The "sqlcmd" utility does recognize "GO", as well as many other things, including invoking OS commands.

I think you should be able to use "sqlcmd" to run the scripts.


SQL DBA,SQL Server MVP('07, '08, '09)
"In America, every man is innocent until proven broke!" Brant Parker
Post #1407508
Posted Wednesday, January 16, 2013 7:48 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Tuesday, July 22, 2014 7:56 AM
Points: 144, Visits: 355
1) sqlcmd.exe is disabled on all prod SQL instances due to security issues.

2) I can't add them all to one script because it's dropping and creating objects as well as loading data. Because the scripts are laden with "GO" throughout, if the portion before a GO fails, the portion after it will run regardless. So, for example: say the second batch is an insert that fails, and 400 batches later another insert loads a table from the initial table whose insert failed. That later insert would insert 0 records, and so on. I can't have that.

So the separation of the scripts is kind of a failsafe, to ensure the process doesn't get too far if something fails. Again, I didn't write this; just stating the facts. So combining all into one is not an option at this point. If file three fails, I need to rerun it until it succeeds before I run file four. Each script is written in a way that makes it self-healing, in a sense: if a script fails halfway, I can rerun the whole script and everything is dropped and recreated. Which is actually part of the problem. It takes around 18 hours to run all the files, and I have a short window of time to get it done once we do the final push.


The rub is you can't drop and create certain objects without a GO in between. As my original example states, you can't drop a function and then create a function without running them in separate batches. Unless I'm missing something, is there some other method other than using GO? To be honest, it shouldn't blindly drop and recreate. It should check for existence and create the object only if it doesn't exist, instead of blindly dropping and recreating.
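For reference, the existence-check version of that pattern looks something like this (the function name and body here are hypothetical, purely for illustration). Note that the GO is still unavoidable: CREATE FUNCTION must be the first statement in its batch.

```sql
-- Drop the function only if it already exists, then recreate it.
-- (DROP FUNCTION IF EXISTS and CREATE OR ALTER only arrived in later
-- SQL Server versions, hence the OBJECT_ID check.)
IF OBJECT_ID(N'dbo.ufn_Example', N'FN') IS NOT NULL
    DROP FUNCTION dbo.ufn_Example;
GO
CREATE FUNCTION dbo.ufn_Example (@a int, @b int)
RETURNS int
AS
BEGIN
    RETURN @a + @b;
END
GO
```

Checking first doesn't remove the batch separator, but it does make the script safe to rerun without destroying an object that's already in the desired state.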


It should have a drop-and-create-objects script, then a data load script.

Keep in mind, the current process works. It's a bit manual, but it works. So it's not imperative that I find a solution; it's just been on my mind lately. So, worst case, I kick one off, then 1.32 hours later, when I expect it's done, I manually check. If all is good, I kick off the next script. I was just hoping to automate it and have it send me success or failure emails.

Actually, at the very least I could have it send me a completion email at the end of each script. I think I'll add that now. Just thinking out loud. Again, I wouldn't waste too much mental energy on this, because I do have a working method.
Post #1407843
Posted Wednesday, January 16, 2013 8:03 AM


SSCoach


Group: General Forum Members
Last Login: Friday, June 27, 2014 12:43 PM
Points: 15,444, Visits: 9,596
There are ways to put more robust error handling into scripts like that, but I'm not suggesting them because they'd take a lot of editing, and you don't have time/resources to do that right now from what I gather.

There are ways to handle errors and rollbacks/commits even across batches. Have you used Red Gate SQL Compare? Take a look at how it does error handling in the scripts it generates and you'll see a working example. (If you haven't played with it, I think you can get a free trial copy, and you can take a look at it that way.) But that would require significant editing of your script files.
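One well-known T-SQL shape for cross-batch error handling (a sketch of the general technique, not necessarily what SQL Compare emits verbatim) uses SET NOEXEC: once any batch trips it, every remaining batch is parsed but not executed, so a failure early on can't let later inserts run against missing objects. The batch contents below are placeholders:

```sql
BEGIN TRANSACTION;
GO
-- ... batch 1: DDL / data load statements ...
-- @@ERROR only reflects the immediately preceding statement, so generated
-- scripts typically repeat this check after every statement.
IF @@ERROR <> 0 SET NOEXEC ON;
GO
-- ... batch 2: statements that depend on batch 1 ...
IF @@ERROR <> 0 SET NOEXEC ON;
GO
-- Skipped entirely if NOEXEC was switched on by an earlier failure.
IF @@TRANCOUNT > 0 COMMIT TRANSACTION;
GO
-- SET NOEXEC OFF is processed even while NOEXEC is on, which is how
-- execution gets re-enabled at the end of the script.
SET NOEXEC OFF;
GO
-- Only a failed run reaches this point with an open transaction.
IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
GO
```

As GSquared says, retrofitting this into 30 existing script files is a lot of editing, which is why it's usually generated rather than written by hand.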

This situation is really something that has to be planned for and worked into the solution right from the beginning. Not sure how to handle it, coming in when the scripts have already been written and your go-live deadline is looming.


- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread

"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
Post #1407849
Posted Wednesday, January 16, 2013 8:08 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Tuesday, July 22, 2014 7:56 AM
Points: 144, Visits: 355
Oh, and as far as Joe goes: as my dad used to say, "Every a-hole's got an opinion." I thought it was funny. I sent the post to all my friends. It was classic. He actually stated I was a detriment to society. That's good stuff.

Well, funny and disturbing at the same time. I looked him up. He knows his stuff; we can't deny that. The disturbing part is the fact that here's a guy at the top of his field, an expert SQL guy, and he has nothing better to do than troll the web searching for people to spew his elitist jargon upon. It's sad, really. I'd like to think I'd be fishing in Bermuda or diving somewhere, sipping a cocktail with my beautiful wife and chocolate lab.
Post #1407859
Posted Wednesday, January 16, 2013 9:41 AM
Grasshopper


Group: General Forum Members
Last Login: Friday, February 7, 2014 2:31 PM
Points: 11, Visits: 74
If you run SQLCMD via the command line, you can set it to run as your user account (or the one you plan to make changes with). I believe the poster who referenced SQLCMD meant it in this fashion, not via SSMS.

You still have a block against that?

Write a batch file with the following (assuming your BINN folder is in the path variable):
sqlcmd -U MyLogin -S <ComputerName>\<InstanceName> -i <MyScript.sql> -o <MyOutput.rpt>

Create a separate line for each script. The .rpt data will allow you to see the results of each step. I'm no batch champion, but I think you could even incorporate some error logic to get it to stop in case you have problems with one line. You have access to Google/Bing/Yahoo, so you can find that part out on your own.
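That error logic can be done with ERRORLEVEL checks. A hypothetical .bat sketch, using the same placeholder login/server/file names as the command line above, plus sqlcmd's -b switch so a T-SQL error actually sets ERRORLEVEL:

```bat
@echo off
rem Stop the run as soon as one script fails; -b makes sqlcmd
rem return a nonzero ERRORLEVEL when a T-SQL error occurs.
sqlcmd -U MyLogin -S MyServer\MyInstance -b -i MyScript1.sql -o MyOutput1.rpt
if errorlevel 1 goto failed
sqlcmd -U MyLogin -S MyServer\MyInstance -b -i MyScript2.sql -o MyOutput2.rpt
if errorlevel 1 goto failed
echo All scripts completed.
goto :eof
:failed
echo A script failed; later scripts were not run.
exit /b 1
```

Without -b, sqlcmd can exit 0 even when a script hit errors, so the ERRORLEVEL checks alone aren't enough.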

If you're using a different user account and it's Windows-based, I'd open your command line via that account (Shift + right-click, "Run as different user").

You could always test the technique by doing a simple select in your sql file...
Post #1407942
Posted Wednesday, January 16, 2013 11:11 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Tuesday, July 22, 2014 7:56 AM
Points: 144, Visits: 355
Actually, no. I shouldn't have any security issues running the sqlcmd prompt manually. I was thinking of running it in a script call or via a job. I'll give that a try in a lil while.



thanks
Post #1408009