Are the posted questions getting worse?
Posted Wednesday, March 12, 2014 6:43 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 8:51 PM
Points: 35,606, Visits: 32,190
Luis Cazares (3/12/2014)
dwain.c (3/12/2014)
Luis Cazares (3/12/2014)
The Dixie Flatline (3/12/2014)
I have a database. It doesn't work. Please advise.

Thanks in advance.

P.S. Please don't ask for DDL and sample data as I've already said that it doesn't work.

I'm sure we can help. Here's a possible solution for you:
Solution to almost any problem


BWAHAHAHAHAHAA!

Nice one Luis!

I've been tempted to do something like this several times (I might have done it).

On another note, today I had a good experience that proved your quote about indexing and cat food right. A process with an insert and an update to the same table was taking almost 2 hours to finish, and the update accounted for 90 minutes of that. After trying to create an index that would work for the query, I proposed removing the update and including the value in the insert instead. The insert didn't take any longer, and we can now deliver the information an hour and a half earlier each day.
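
(If you want to see the shape of that change, here's a minimal sketch. The table and column names are placeholders, not Luis's actual schema.)

DECLARE @ReportDate date = '20140312';

-- Before (assumed shape): load the rows, then a second pass fills in a derived column.
INSERT INTO dbo.DailyReport (OrderID, CustomerID, Amount)
SELECT o.OrderID, o.CustomerID, o.Amount
FROM   dbo.Orders AS o
WHERE  o.OrderDate = @ReportDate;

UPDATE r                                   -- this second pass is the expensive step
SET    r.CustomerRegion = c.Region
FROM   dbo.DailyReport AS r
JOIN   dbo.Customers   AS c
  ON   c.CustomerID = r.CustomerID;

-- After: compute the value during the insert and drop the update entirely.
INSERT INTO dbo.DailyReport (OrderID, CustomerID, Amount, CustomerRegion)
SELECT o.OrderID, o.CustomerID, o.Amount, c.Region
FROM   dbo.Orders    AS o
JOIN   dbo.Customers AS c
  ON   c.CustomerID = o.CustomerID
WHERE  o.OrderDate = @ReportDate;

The join happens once during the load instead of requiring a second full pass over the table.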


But... but... but we've never done it THAT way before. We've always done it the other way.


--Jeff Moden
"RBAR is pronounced 'ree-bar' and is a 'Modenism' for Row-By-Agonizing-Row.

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #1550489
Posted Wednesday, March 12, 2014 6:44 PM


Hall of Fame


Group: General Forum Members
Last Login: Monday, November 24, 2014 4:53 AM
Points: 3,422, Visits: 5,368
Luis Cazares (3/12/2014)

On another note, today I had a good experience that proved your quote about indexing and cat food right. A process with an insert and an update to the same table was taking almost 2 hours to finish, and the update accounted for 90 minutes of that. After trying to create an index that would work for the query, I proposed removing the update and including the value in the insert instead. The insert didn't take any longer, and we can now deliver the information an hour and a half earlier each day.


That's a cool story, Luis. I just wish I could get that point across to some of the folks I work with.

We've got a customer with a high-transaction-volume database that many developers have touched and, through lack of knowledge, loaded up with lots and lots of indexes. I ran across an index on a really wide table (probably the widest in the entire application) whose key included something like 80% of the table's columns. They've been complaining that transaction processing in general is slow.

My advice: drop all the bloody indexes and start from scratch, adding back only what's needed one at a time. The blank look that I got was pretty impressive.



My mantra: No loops! No CURSORs! No RBAR! Hoo-uh!

My thought question: Have you ever been told that your query runs too fast?

My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.


Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.
Splitting strings based on patterns can be fast!
Post #1550490
Posted Wednesday, March 12, 2014 7:03 PM


SSCarpal Tunnel


Group: General Forum Members
Last Login: Friday, October 24, 2014 12:43 PM
Points: 4,126, Visits: 3,428
dwain.c (3/12/2014)
Luis Cazares (3/12/2014)

On another note, today I had a good experience that proved your quote about indexing and cat food right. A process with an insert and an update to the same table was taking almost 2 hours to finish, and the update accounted for 90 minutes of that. After trying to create an index that would work for the query, I proposed removing the update and including the value in the insert instead. The insert didn't take any longer, and we can now deliver the information an hour and a half earlier each day.


That's a cool story, Luis. I just wish I could get that point across to some of the folks I work with.

We've got a customer with a high-transaction-volume database that many developers have touched and, through lack of knowledge, loaded up with lots and lots of indexes. I ran across an index on a really wide table (probably the widest in the entire application) whose key included something like 80% of the table's columns. They've been complaining that transaction processing in general is slow.

My advice: drop all the bloody indexes and start from scratch, adding back only what's needed one at a time. The blank look that I got was pretty impressive.

People who think that they know, even if they don't, are always more dangerous than those who know their limitations and are trying to learn and get second opinions.
Post #1550493
Posted Wednesday, March 12, 2014 7:04 PM


Say Hey Kid


Group: General Forum Members
Last Login: Yesterday @ 1:54 PM
Points: 675, Visits: 6,815
dwain.c (3/12/2014)
Luis Cazares (3/12/2014)

On another note, today I had a good experience that proved your quote about indexing and cat food right. A process with an insert and an update to the same table was taking almost 2 hours to finish, and the update accounted for 90 minutes of that. After trying to create an index that would work for the query, I proposed removing the update and including the value in the insert instead. The insert didn't take any longer, and we can now deliver the information an hour and a half earlier each day.


That's a cool story, Luis. I just wish I could get that point across to some of the folks I work with.

We've got a customer with a high-transaction-volume database that many developers have touched and, through lack of knowledge, loaded up with lots and lots of indexes. I ran across an index on a really wide table (probably the widest in the entire application) whose key included something like 80% of the table's columns. They've been complaining that transaction processing in general is slow.

My advice: drop all the bloody indexes and start from scratch, adding back only what's needed one at a time. The blank look that I got was pretty impressive.


That is what the Experts call a 'covering index'.
Although it probably never gets used, it does do something.
It might be interesting to see which indexes are actually getting used.
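
The usual starting point for that is sys.dm_db_index_usage_stats. Something along these lines works, bearing in mind the counters reset when the instance restarts, so it only reflects activity since then:

SELECT  OBJECT_NAME(i.object_id)    AS table_name,
        i.name                      AS index_name,
        i.type_desc,
        ISNULL(us.user_seeks, 0)    AS user_seeks,
        ISNULL(us.user_scans, 0)    AS user_scans,
        ISNULL(us.user_lookups, 0)  AS user_lookups,
        ISNULL(us.user_updates, 0)  AS user_updates   -- the write cost of maintaining it
FROM    sys.indexes AS i
LEFT JOIN sys.dm_db_index_usage_stats AS us
       ON  us.object_id   = i.object_id
       AND us.index_id    = i.index_id
       AND us.database_id = DB_ID()
WHERE   OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
        AND i.index_id > 0                            -- skip heaps
ORDER BY ISNULL(us.user_seeks, 0) + ISNULL(us.user_scans, 0) + ISNULL(us.user_lookups, 0);

Indexes that sort to the top with zero reads but plenty of user_updates are the ones costing you on every write and giving nothing back.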
Post #1550494
Posted Wednesday, March 12, 2014 7:39 PM


Hall of Fame


Group: General Forum Members
Last Login: Monday, November 24, 2014 4:53 AM
Points: 3,422, Visits: 5,368
Greg Edwards-268690 (3/12/2014)
dwain.c (3/12/2014)
Luis Cazares (3/12/2014)

On another note, today I had a good experience that proved your quote about indexing and cat food right. A process with an insert and an update to the same table was taking almost 2 hours to finish, and the update accounted for 90 minutes of that. After trying to create an index that would work for the query, I proposed removing the update and including the value in the insert instead. The insert didn't take any longer, and we can now deliver the information an hour and a half earlier each day.


That's a cool story, Luis. I just wish I could get that point across to some of the folks I work with.

We've got a customer with a high-transaction-volume database that many developers have touched and, through lack of knowledge, loaded up with lots and lots of indexes. I ran across an index on a really wide table (probably the widest in the entire application) whose key included something like 80% of the table's columns. They've been complaining that transaction processing in general is slow.

My advice: drop all the bloody indexes and start from scratch, adding back only what's needed one at a time. The blank look that I got was pretty impressive.


That is what the Experts call a 'covering index'.
Although it probably never gets used, it does do something.
It might be interesting to see which indexes are actually getting used.


If that index was really needed for something (which I personally doubt), it would be a little better done with most of the columns in an INCLUDE.
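
Purely as an illustration of the difference (placeholder table and column names, since the real ones never made it into the thread):

-- Everything in the key: the whole lot is sorted and compared all the way up the B-tree,
-- and almost any write to the table has to maintain it.
CREATE INDEX IX_BigWideTable_WideKey
    ON dbo.BigWideTable (Col1, Col2, Col3, Col4, Col5, Col6, Col7, Col8);

-- Narrow key for seeking/sorting, with the rest carried only at the leaf level to cover the query.
CREATE INDEX IX_BigWideTable_NarrowCovering
    ON dbo.BigWideTable (Col1, Col2)
    INCLUDE (Col3, Col4, Col5, Col6, Col7, Col8);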



My mantra: No loops! No CURSORs! No RBAR! Hoo-uh!

My thought question: Have you ever been told that your query runs too fast?

My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.


Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.
Splitting strings based on patterns can be fast!
Post #1550504
Posted Thursday, March 13, 2014 5:40 AM
SSCarpal Tunnel


Group: General Forum Members
Last Login: Yesterday @ 11:35 AM
Points: 4,482, Visits: 3,938
dwain.c (3/12/2014)
Greg Edwards-268690 (3/12/2014)
dwain.c (3/12/2014)
Luis Cazares (3/12/2014)

On another note, today I had a good experience that proved your quote about indexing and cat food right. A process with an insert and an update to the same table was taking almost 2 hours to finish, and the update accounted for 90 minutes of that. After trying to create an index that would work for the query, I proposed removing the update and including the value in the insert instead. The insert didn't take any longer, and we can now deliver the information an hour and a half earlier each day.


That's a cool story, Luis. I just wish I could get that point across to some of the folks I work with.

We've got a customer with a high-transaction-volume database that many developers have touched and, through lack of knowledge, loaded up with lots and lots of indexes. I ran across an index on a really wide table (probably the widest in the entire application) whose key included something like 80% of the table's columns. They've been complaining that transaction processing in general is slow.

My advice: drop all the bloody indexes and start from scratch, adding back only what's needed one at a time. The blank look that I got was pretty impressive.


That is what the Experts call a 'covering index'.
Although it probably never gets used, it does do something.
It might be interesting to see which indexes are actually getting used.


If that index was really needed for something (which I personally doubt), it would be a little better done with most of the columns in an INCLUDE.

You're using INCLUDEd columns? <sarcasm>And here I thought we were just supposed to keep creating more and more redundant indexes on the same combination of columns over and over again until SQL recognized one and started using it. It doesn't matter how many there are...just keep creating more and SQL will eventually catch up. You mean that's not right? Could index maintenance possibly be important? How could this be? </sarcasm>
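
For anyone who inherits a pile of lookalike indexes, a rough duplicate-key check could look something like the sketch below. It deliberately ignores INCLUDE columns, filters and uniqueness, all of which need checking before anything gets dropped:

;WITH IndexKeys AS
(
    SELECT  i.object_id,
            i.index_id,
            i.name AS index_name,
            -- build the ordered key column list for each index
            STUFF((SELECT ', ' + c.name
                   FROM   sys.index_columns AS ic
                   JOIN   sys.columns       AS c
                     ON   c.object_id = ic.object_id
                     AND  c.column_id = ic.column_id
                   WHERE  ic.object_id = i.object_id
                     AND  ic.index_id  = i.index_id
                     AND  ic.is_included_column = 0
                   ORDER BY ic.key_ordinal
                   FOR XML PATH('')), 1, 2, '') AS key_columns
    FROM    sys.indexes AS i
    WHERE   i.index_id > 0
      AND   OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
)
SELECT  OBJECT_NAME(k1.object_id) AS table_name,
        k1.index_name,
        k1.key_columns
FROM    IndexKeys AS k1
WHERE   EXISTS (SELECT 1                      -- another index on the same table, same key list
                FROM   IndexKeys AS k2
                WHERE  k2.object_id   = k1.object_id
                  AND  k2.key_columns = k1.key_columns
                  AND  k2.index_id   <> k1.index_id)
ORDER BY table_name, k1.key_columns, k1.index_name;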



Tally Tables - Performance Personified
String Splitting with True Performance
Best practices on how to ask questions
Post #1550638
Posted Thursday, March 13, 2014 6:35 AM
SSCarpal Tunnel


Group: General Forum Members
Last Login: Yesterday @ 11:35 AM
Points: 4,482, Visits: 3,938
Jeff Moden (3/12/2014)

But... but... but we've never done it THAT way before. We've always done it the other way.

If only I could count how many times I've heard that before.
If the way we've always done it doesn't work, then FIX IT!



Tally Tables - Performance Personified
String Splitting with True Performance
Best practices on how to ask questions
Post #1550658
Posted Thursday, March 13, 2014 7:03 AM


Old Hand


Group: General Forum Members
Last Login: Today @ 3:29 AM
Points: 312, Visits: 3,533
The Dixie Flatline (3/12/2014)
SQLRNNR (3/12/2014)
The Dixie Flatline (3/12/2014)
I have a database. It doesn't work. Please advise.

Thanks in advance.

P.S. Please don't ask for DDL and sample data as I've already said that it doesn't work.


Dear expert, you should drop your database.


Okay, I dropped the database but it still doesn't work. Someone on another website says I should do a backup, but wouldn't tell me how. Would you agree that this is a good idea? If so, could you give me instructions? If you don't agree, could you talk to him about this for me?

This is entirely correct. You need to back it up From Disk, from the drive you store your data on. The command to back up From Disk is just an abbreviation of From Disk: FDISK. So, to back up your database files on D:\, just open a DOS command window and run FDISK D.


I'm a DBA.
I'm not paid to solve problems. I'm paid to prevent them.
Post #1550678
Posted Thursday, March 13, 2014 7:23 AM


Ten Centuries


Group: General Forum Members
Last Login: Tuesday, October 28, 2014 12:50 PM
Points: 1,061, Visits: 2,580
GilaMonster (3/11/2014)
Koen Verbeeck (3/11/2014)
Grant Fritchey (3/11/2014)
By the gods, people get hung up on the weirdest darned things. They can focus like a laser on stuff that's just a waste of time. It's weird.


I think we're talking about the same topic


Yup.


GilaMonster (3/11/2014)
Don't make weird, non-standard design decisions based on an unfounded expectation of some unknown performance improvement.


This one is going on the cubicle wall!


Jason Wolfkill
Blog: SQLSouth
Twitter: @SQLSouth
Post #1550688
Posted Thursday, March 13, 2014 8:14 AM


SSCrazy


Group: General Forum Members
Last Login: Sunday, August 17, 2014 3:10 PM
Points: 2,787, Visits: 6,098
andrew gothard (3/13/2014)
The Dixie Flatline (3/12/2014)
SQLRNNR (3/12/2014)
The Dixie Flatline (3/12/2014)
I have a database. It doesn't work. Please advise.

Thanks in advance.

P.S. Please don't ask for DDL and sample data as I've already said that it doesn't work.


Dear expert, you should drop your database.


Okay, I dropped the database but it still doesn't work. Someone on another website says I should do a backup, but wouldn't tell me how. Would you agree that this is a good idea? If so, could you give me instructions? If you don't agree, could you talk to him about this for me?



This is entirely correct. You need to back it up From Disk, from the drive you store your data on. The command to back up From Disk is just an abbreviation of From Disk: FDISK. So, to back up your database files on D:\, just open a DOS command window and run FDISK D.



Sorry to be so long in replying. Had problems with my laptop after backing up both my D: and C: drives per your instructions. While a technician is looking at it, a co-worker is letting me borrow their laptop so I can send this. Where do the FDISK backups reside? Do I still need to do a FDISK backup of the Z drive?

My database still isn't working.

Thanks for all the constructive suggestions!! Although my boss is unhappy that we can't do business, I've at least been able to explain to him that some of the best SQL experts are working on our problem, but that these things take time.


__________________________________________________

Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
Stop, children, what's that sound? -- Stephen Stills
Post #1550735