Viewing 15 posts - 226 through 240 (of 518 total)
GilaMonster (10/5/2010)
Derrick Smith (10/5/2010)
(I'm just shy of MCDBA..which I should get, but I don't want to pay for it myself :))
Not any more... All the MCDBA exams (excluding maybe 1)...
October 5, 2010 at 2:53 pm
GilaMonster (10/5/2010)
sqlbuddy123 (10/5/2010)
My two cents..You can check these books to become a master
I'll disagree slightly...
Books alone will not make you a master. Books + experience + experimentation may, with sufficient...
October 5, 2010 at 2:48 pm
Or script out the schema, recreate it, and then do a data flow task for every table (you may need to drop all foreign keys, insert the data, then recreate them).
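As a rough illustration of that drop/reload/recreate pattern (the table and constraint names here are hypothetical, not from the thread):

ALTER TABLE dbo.Orders DROP CONSTRAINT FK_Orders_Customers;

-- load dbo.Customers and dbo.Orders here (data flow task, INSERT...SELECT, bcp, etc.)

ALTER TABLE dbo.Orders
    ADD CONSTRAINT FK_Orders_Customers
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);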
October 5, 2010 at 1:58 pm
See here: http://www.sqlservercentral.com/Forums/Topic672795-145-1.aspx
edit: disregard, same link that Shawn posted. Yay Google!
October 5, 2010 at 1:42 pm
davemcsheffrey (10/5/2010)
I will try that out thanks very much!
Ah, I was posting that as you replied with the extra criteria.
try this
;WITH CTE AS ( SELECT
RN = ROW_NUMBER() OVER (PARTITION BY...
October 5, 2010 at 1:21 pm
Something like this would just return the serial numbers ordered 2 or more times in the last 60 days from today's date. Without table schemas/data I can't completely verify it..but...
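For readers without the original query, a minimal sketch of that approach, using hypothetical dbo.Orders, SerialNumber, and OrderDate names in place of the poster's actual schema:

;WITH CTE AS
(
    SELECT
        SerialNumber,
        RN = ROW_NUMBER() OVER (PARTITION BY SerialNumber ORDER BY OrderDate DESC)
    FROM dbo.Orders
    WHERE OrderDate >= DATEADD(DAY, -60, GETDATE())
)
SELECT DISTINCT SerialNumber
FROM CTE
WHERE RN >= 2;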
October 5, 2010 at 1:07 pm
tosscrosby (10/5/2010)
Lowell (10/5/2010)
Derrick Smith (10/5/2010)
Lowell (10/5/2010)
bump, as I wanted to know if kras followed up on the reasoning behind wanting to drop the production database.
Wait, we need "reasons"...
October 5, 2010 at 12:56 pm
Getting closer..just need some specifics.
How do you want it to be displayed? Do you just want a list of serial numbers that have been ordered twice in the last 60...
October 5, 2010 at 12:54 pm
Yep.
Table structures, some sample data, expected output, and what you have written so far will be enough to answer 99% of the questions you will ever post.
October 5, 2010 at 12:31 pm
We'd need a bit more info from you to help out with an accurate query.
Refer to this post by the resident metal bar hater: http://www.sqlservercentral.com/articles/Best+Practices/61537/
October 5, 2010 at 12:26 pm
You can output failed rows to a file using -e c:\errorfile (or whatever path you'd like).
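Assuming the thread is about the bcp utility (its -e switch writes rows that fail to load out to an error file), a command might look like the following; the server, database, table, and paths are made up for illustration:

bcp TargetDB.dbo.MyTable in C:\data\mytable.dat -S MyServer -T -c -e C:\bcp_errors.txt -m 50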
October 5, 2010 at 12:21 pm
That's 2 terabytes, which, when talking about memory in MS products, is synonymous with 0 or Unlimited.
You'll see those three interchanged quite frequently, but they all mean the same thing...
October 5, 2010 at 12:16 pm
I prefer staging tables for the simple reason that if something fails, you can just select from it and see which data might be causing the issue instead of filtering...
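A minimal sketch of the staging-table idea, with hypothetical names and a loosely typed column so bad rows still load and can be inspected afterwards:

CREATE TABLE dbo.OrderStaging
(
    SerialNumber varchar(50) NULL,
    OrderDate    varchar(50) NULL  -- kept as text so malformed dates don't fail the load
);

-- bulk load the file into dbo.OrderStaging, then look at anything that won't convert:
SELECT * FROM dbo.OrderStaging WHERE ISDATE(OrderDate) = 0;

-- finally move only the clean rows into the real table:
INSERT INTO dbo.Orders (SerialNumber, OrderDate)
SELECT SerialNumber, CONVERT(datetime, OrderDate)
FROM dbo.OrderStaging
WHERE ISDATE(OrderDate) = 1;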
October 5, 2010 at 12:10 pm
Robert Hankin (9/28/2010)
October 5, 2010 at 12:07 pm