Guest Editorial: On writing SQL
Posted Thursday, December 11, 2008 9:37 PM


Mr or Mrs. 500


Group: General Forum Members
Last Login: 2 days ago @ 11:26 AM
Points: 561, Visits: 2,415
Comments posted to this topic are about the item Guest Editorial: On writing SQL


Best wishes,

Phil Factor
Simple Talk
Post #618365
Posted Friday, December 12, 2008 4:02 AM


SSChampion


Group: General Forum Members
Last Login: Today @ 3:30 PM
Points: 14,802, Visits: 27,275
I'm actually not entirely sure what you're asking. Do you mean something like "always join on the key, the whole key"? Or are you looking for tricks with XML execution plans?
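
To illustrate that rule of thumb with a minimal sketch (the tables and their composite key are hypothetical): every column of the key belongs in the ON clause, or the join multiplies rows:

    -- Hypothetical tables sharing a composite key (OrderID, LineNumber).
    SELECT d.OrderID, d.LineNumber, a.ChangeDate, h.OrderDate
    FROM dbo.OrderDetail AS d
    JOIN dbo.OrderDetailAudit AS a
        ON  a.OrderID = d.OrderID
        AND a.LineNumber = d.LineNumber  -- the whole key; drop this line and rows multiply
    JOIN dbo.OrderHeader AS h
        ON  h.OrderID = d.OrderID;       -- OrderHeader's key is just OrderID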

----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #618477
Posted Friday, December 12, 2008 4:13 AM


Mr or Mrs. 500


Group: General Forum Members
Last Login: 2 days ago @ 11:26 AM
Points: 561, Visits: 2,415
The question I get asked is much higher-level than that. It is more like 'How do you actually go about writing a stored procedure?' I always tend to start out on the same tack as you, but I soon find I'm not answering the question. It is more about how you go about debugging code, changing sections of it, refactoring it, or quite simply how you check on the behavior of the system without breakpoints or sophisticated debugging.

You have developed a stored procedure, maybe, and you want to check on the parameters passed to it, or perhaps the value of a variable, or the contents of a temporary table when the system is actually running. How would you construct a test harness for it? How do you go about profiling the performance of a stored procedure to find where the performance problems are?

What sort of advice would you give?



Best wishes,

Phil Factor
Simple Talk
Post #618481
Posted Friday, December 12, 2008 4:43 AM


SSChampion


Group: General Forum Members
Last Login: Today @ 3:30 PM
Points: 14,802, Visits: 27,275
If I understand the question, I guess it's something I do every day (all day, sometimes). First, I need a set of working parameters; preferably some of the most painful ones, those that cause the most work or return the most data. Assuming I have them (I can get them by using Profiler, but let's not go there for the moment), I always start with an execution plan (shocking, I'm sure). In addition, I get the STATISTICS IO and STATISTICS TIME output. Usually those three bits of information are enough. But if you get into really tough procedures, you might go back to Profiler and set up a trace that captures statement executions (I've been doing a bunch of that this week on a particularly ugly query I was tuning). That will show which statement within the query is causing pain. If you're really stuck, you should also capture wait states to see what specifically is causing the slowdown (this week it was I/O latches caused by fragmented indexes; I'm beating, uh, I mean talking to, the admin team about the missing defrag job).
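
A minimal sketch of that measurement setup (the procedure and its parameter values are hypothetical; the plan itself comes from SSMS's Include Actual Execution Plan option):

    SET STATISTICS IO ON;    -- logical and physical reads, per table
    SET STATISTICS TIME ON;  -- compile and execution CPU/elapsed times

    EXEC dbo.uspGetOrders    -- hypothetical procedure under test
         @CustomerID = 42,   -- one of the "painful" parameter sets
         @FromDate = '20080101';

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;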

Is that the sort of thing you're looking for? Obviously I'm skipping hundreds of details on every step.


----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #618502
Posted Friday, December 12, 2008 4:45 AM


SSChampion


Group: General Forum Members
Last Login: Today @ 3:30 PM
Points: 14,802, Visits: 27,275
Thinking about it some more... how do you write a procedure...

Well, I assume that you know first what you want to get, what data set you're looking for. Armed with that, you have to look at the database to see if it contains that information, and in what form. From there it gets really tricky.

This is a hard one.


----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2012 Query Performance Tuning
SQL Server 2008 Query Performance Tuning Distilled
and
SQL Server Execution Plans

Product Evangelist for Red Gate Software
Post #618503
Posted Friday, December 12, 2008 5:08 AM


Mr or Mrs. 500


Group: General Forum Members
Last Login: 2 days ago @ 11:26 AM
Points: 561, Visits: 2,415
I was getting into an argument with some Agile XP people who were making out that they were the first to see the central importance of unit-testing when developing software. OK, I thought, I've been writing functions, views and procedures in TSQL for years, and I've always had that focus. You sit down at the keyboard and you say to yourself, 'How in the blazes am I going to test this routine I haven't yet written?' Once you've worked out how you can test it, then you write it. Surely the principles of Test Driven Development (TDD) have been around for years?

It is the same with performance profiling. I'd have thought it is always better to design routines in a way that makes it easy to monitor performance issues than to wade in retrospectively. (I use a debugging log to do this, switched in or out as required.)
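
A minimal sketch of that sort of switchable debugging log (the table, procedure and parameter names are all hypothetical):

    -- Hypothetical debug log, written to only when the @Debug switch is on.
    CREATE TABLE dbo.DebugLog
        (LogID INT IDENTITY(1,1) PRIMARY KEY,
         LoggedAt DATETIME NOT NULL DEFAULT GETDATE(),
         Routine SYSNAME NOT NULL,
         Message VARCHAR(2000) NOT NULL);
    GO
    CREATE PROCEDURE dbo.uspProcessOrders
        @CustomerID INT,
        @Debug BIT = 0          -- switched in or out as required
    AS
    BEGIN
        IF @Debug = 1
            INSERT dbo.DebugLog (Routine, Message)
            VALUES ('uspProcessOrders',
                    'Called with @CustomerID = '
                    + ISNULL(CONVERT(VARCHAR(12), @CustomerID), 'NULL'));

        -- ...the routine's real work, logging variable values at key points...
    END;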

I've worked with many other SQL developers who have done it a similar way, even to the point of using test harnesses in VB or C++/C#, but there are a whole lot of other ways of writing TSQL, especially now that we have the Profiler, can read execution plans (with Grant's book there, open at the right page), and have all sorts of other tools to make life easier.

I had the idea a while back of writing something about developing SQL Code using very simple test harnesses. I was then gripped with panic, thinking that it was possibly a completely unnecessary skill, now that the information we can get about the dynamic behaviour of SQL routines is so sophisticated, and I'd make a complete idiot of myself describing it. But then, the Agile people got me thinking again....



Best wishes,

Phil Factor
Simple Talk
Post #618512
Posted Friday, December 12, 2008 5:35 AM
SSC-Addicted


Group: General Forum Members
Last Login: Monday, March 31, 2014 10:11 AM
Points: 431, Visits: 604
I spend more of my time on the development side than the DBA side. To help with overall testing, I've taken to creating a set of test data that I can load into the QA environment for our unit-testing pass. This gives me predictable results from a known starting point, which I also use in development.

Any time a "new" bug bites me, I add a sampling of the data that caused it to the test set.
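
A minimal sketch of that habit (table, values, and the bug itself are hypothetical): the load script is repeatable, and each new bug appends its own regression rows:

    -- Repeatable QA load: a known starting point gives predictable results.
    TRUNCATE TABLE dbo.Customer;
    INSERT dbo.Customer (CustomerID, Name, Region) VALUES (1, 'Alpha Ltd', 'North');
    INSERT dbo.Customer (CustomerID, Name, Region) VALUES (2, 'Beta Inc', 'South');

    -- Regression case: a NULL Region once caused a bug, so a sample stays in the set.
    INSERT dbo.Customer (CustomerID, Name, Region) VALUES (3, 'Gamma GmbH', NULL);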


Regards,
Michael Lato
Post #618525
Posted Friday, December 12, 2008 5:53 AM
Grasshopper


Group: General Forum Members
Last Login: Saturday, April 05, 2014 3:15 PM
Points: 23, Visits: 145
I'll bet no-one's going to like this answer... somebody may even call me an idiot, LOL. And maybe my answer is situational... it may not work for everyone:

I single thread my queries: a series of SELECT INTOs, with one JOIN apiece.

And I did not come to this style to ease debugging: I came to it based on performance observations.

I suppose SQL is intended to be smart enough to handle multiple joins in one pass with just as much efficiency as single joins... but it's not. (Well, maybe 2008 is -- I haven't tested that one yet.)

In our environment, where I'm often joining tables with millions of records apiece... it has often given a query a ten-fold performance gain, to start with a SELECT INTO on *ONE* table, with *NO* joins... and then, maybe, do multiple joins on the subsequent SELECT INTOs... but keeping each pass pretty simple.

NOW, back to the topic: this also happens to make debugging very simple: you can look at the resulting tables, to see what data was picked up on each pass.
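
A minimal sketch of that single-threaded style (tables and filters hypothetical); each pass lands in its own table, so every intermediate result can be inspected:

    -- Pass 1: ONE table, NO joins; just cut the big table down to the rows of interest.
    SELECT OrderID, CustomerID, OrderDate
    INTO #Pass1
    FROM dbo.BigOrders
    WHERE OrderDate >= '20080101';

    -- Pass 2: one join, against the first pass.
    SELECT p.OrderID, p.OrderDate, c.CustomerName
    INTO #Pass2
    FROM #Pass1 AS p
    JOIN dbo.Customers AS c ON c.CustomerID = p.CustomerID;

    -- Debugging: look at what each pass picked up.
    SELECT TOP 100 * FROM #Pass1;
    SELECT TOP 100 * FROM #Pass2;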

(This approach doesn't make as much use of a graphical query designer... but 2005's graphical interface is worse than Enterprise Manager's was, anyway.)

Again, the above may be situational: if you've got better indices than mine, or your tables aren't millions of rows long, or... there may be a variety of reasons why your performance is just fine, with complicated JOIN setups.

Good luck, and be well,
-- Doug Ivison
Post #618533
Posted Friday, December 12, 2008 5:59 AM


UDP Broadcaster


Group: General Forum Members
Last Login: Today @ 12:35 PM
Points: 1,496, Visits: 2,634
On a high level, I consider the stored procedure to be what in electronics is called a black box - you define the inputs and the expected outputs.

All of the following would be done in the development environment using a recent backup from production.

Begin a new query, DECLARE the input parameters and SET those parameters to sample values. Write some T-SQL using those parameters to produce the desired results.

For INSERT, UPDATE or DELETE sp's I would also insert BEGIN TRAN and ROLLBACK TRAN with a few "SELECT *" statements before and after the T-SQL so I can verify the data changes.

At this point you have a working batch, and you can determine whether the results and performance meet the objectives.

Once the batch is working, comment out the DECLAREs and SETs and add the CREATE PROC statement at the top of the code. Create and then execute the sp with various combinations of parameters to verify that the outputs are still correct. I would also run it with some inappropriate input values to see if I can break it.

After all that, you can remove any TRAN and SELECT * statements and the sp is ready for QA testing.
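
A minimal sketch of that workflow (object names and sample values hypothetical). The development batch looks like this before it is wrapped into a procedure:

    -- Parameters are local variables until the proc wrapper goes on.
    DECLARE @CustomerID INT;
    SET @CustomerID = 42;        -- sample value

    BEGIN TRAN;

    SELECT * FROM dbo.Orders WHERE CustomerID = @CustomerID;  -- before

    UPDATE dbo.Orders
    SET OrderStatus = 'Shipped'
    WHERE CustomerID = @CustomerID
      AND OrderStatus = 'Packed';

    SELECT * FROM dbo.Orders WHERE CustomerID = @CustomerID;  -- after

    ROLLBACK TRAN;               -- nothing persists while testing

    -- Once it works, comment out the DECLARE/SET and wrap it:
    -- CREATE PROC dbo.uspShipOrders @CustomerID INT AS ...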
Post #618535
Posted Friday, December 12, 2008 6:32 AM
Grasshopper


Group: General Forum Members
Last Login: Monday, April 14, 2014 9:08 AM
Points: 23, Visits: 91
I think the important thing is to "establish my interface" first. Exactly what am I required to provide? What must the data look like? Do I need one result set or two? A scalar or a result set? How many rows per record set? What does the aggregation or nullability require of me? Etc.

Once I have that, I try to determine the logical order in which to get that data ("store this data in a temp table, then..."). What are the major building blocks as I assemble the data, and what order must they happen in?

From there I start at the "inside" of the first query and, iteratively, work outward. Then the next query, etc.

At least that's what I remember through the creative fog. :)
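
A minimal sketch of that inside-out, iterative style (objects hypothetical): the innermost query gets written and tested on its own, then wrapped:

    -- Iteration 1: the "inside"; run it alone and eyeball the results.
    -- SELECT CustomerID, SUM(Amount) AS Total FROM dbo.Orders GROUP BY CustomerID

    -- Iteration 2: work outward, wrapping the tested inner query.
    SELECT c.CustomerName, t.Total
    FROM (SELECT CustomerID, SUM(Amount) AS Total
          FROM dbo.Orders
          GROUP BY CustomerID) AS t
    JOIN dbo.Customers AS c ON c.CustomerID = t.CustomerID
    ORDER BY t.Total DESC;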


Tom Powell
http://philergia.wordpress.com/
Post #618543