Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/mcurnutt/usingstoredproceduresforsqlserverdml.asp
A few comments on the article, which I wholeheartedly agree with. With SQL Server it is all too easy for developers to write the database side of an application, yet much of the time they lack the understanding of performance, scalability and manageability that you mention.
My comments relate to two points. The first is your line "It is impossible for SQL Server to reuse query plans created by dynamic stored procedures, and so each time a new query plan must be created." This is not 100% correct.
Using the EXECUTE statement causes the server to try to auto-parameterize the query, in the same manner as if the query had been submitted by any other client (e.g. VB). There is some level of reuse, but it is known that this process does not cope well when the query is complex.
More importantly, the correct use of sp_executesql allows for complete reuse of the plans that have been produced. I find this method essential for performing searches where the criteria are optional. If the stored procedure route is taken, then dynamic SQL is not an option, unless you want an SP for each permutation of variables (not very manageable).
One drawback is that sp_executesql accepts only a 4000-character Unicode string (nvarchar(4000)) as the SQL, whereas the EXECUTE statement allows a string of any length.
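To make the optional-criteria idea concrete, here is a minimal T-SQL sketch of the approach described above. The table and column names (Customers, City) are hypothetical, used for illustration only; the key point is that the parameter markers stay constant, so each distinct shape of the query reuses its cached plan.

```sql
-- Hypothetical Customers table and City column, for illustration only.
DECLARE @sql  nvarchar(4000)
DECLARE @city nvarchar(50)

SET @sql = N'SELECT CustomerID, CompanyName FROM Customers WHERE 1 = 1'

-- Append criteria only when they are supplied; the parameter marker
-- stays constant, so the generated statements share cached plans.
IF @city IS NOT NULL
    SET @sql = @sql + N' AND City = @pCity'

EXEC sp_executesql @sql, N'@pCity nvarchar(50)', @pCity = @city
```

Each permutation of supplied criteria produces one distinct statement text, and subsequent calls with the same permutation hit the same plan, which is exactly the reuse that a plain EXECUTE of concatenated literal values would lose.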
The second point is in relation to the paragraph
"One additional note: While it may take more code, it is worthwhile to create parameters for your stored procedure calls explicitly in your code instead of using the .Refresh ADO method. Doing so eliminates back and forth trips to the server to provide a list of parameters and will speed things up even more."
I would go even further and say that the .Refresh method should not be used anywhere in a production system that requires scalability. I recently reviewed a system that used it and found that 50% of the database activity was due to .Refresh calls.
Additionally, be aware that the creation of parameters in ADO is very expensive because they are COM objects. I have found that using parameters in ADO is only of benefit when the query is complex; with SPs the call is very basic, and so the auto-parameterization step does not impact SQL Server.
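For readers who have only ever used .Refresh, a minimal VB/ADO sketch of the explicit alternative may help. The stored procedure name (usp_GetCustomer) and its parameter are hypothetical, and cn is assumed to be an already-open ADODB.Connection:

```vb
' Hypothetical SP "usp_GetCustomer" with one input parameter, for illustration.
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command

With cmd
    .ActiveConnection = cn            ' an open ADODB.Connection
    .CommandText = "usp_GetCustomer"
    .CommandType = adCmdStoredProc
    ' Define the parameter explicitly instead of calling .Parameters.Refresh,
    ' which would cost an extra round trip to the server on every call.
    .Parameters.Append .CreateParameter("@CustomerID", adInteger, adParamInput, , 42)
End With

Dim rs As ADODB.Recordset
Set rs = cmd.Execute
```

The explicit version avoids the round trip that .Refresh makes to fetch parameter metadata, at the cost of having to keep the parameter definitions in sync with the SP by hand.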
There is a very good webcast on plan reuse and auto parameterization on the microsoft website http://support.microsoft.com/servicedesks/webcasts/wc050101/wcblurb050101.asp.
Simon, it is interesting that (I must admit) I did not even know about sp_executesql until a few days after I finished writing this article. I was in a class about SQL Server Performance Tuning and it was mentioned. It can also be used to help direct SQL Server to use the right 'version' of a plan in some cases. In fact, I think a good article might be one on all of the different ways to use sp_executesql for performance advantages. I would think that most developers out there don't know about this.
On the second note -
I wasn't aware that .Refresh was quite as costly as you have pointed out. Wow!
Thank you for your comments!
Hey Simon (& Mindy)
My only concern about NOT using .Refresh is that you're trading "scalability" for design flexibility. It's easy enough to generate code that creates the parameters collection for you, but it's a hassle to re-generate it if you change your params. Something I like to do is use .Refresh during design; once I'm stable, I replace it with a defined set of parameters.
Yes, it can be tedious during development, and I see your point. However (especially in larger teams), I would predict that most of the time the .Refresh method would not get replaced prior to going to production. If the project is large, there would be quite a bit of code to search and alter, and it is difficult for a company to justify taking the time to fix something that already 'works.'
I would like to play the devil's advocate.
I don't really think there is any doubt that the "best way" to use SQL Server is to do all data updates through stored procedures, but there is an important trade-off between flexibility, development time and the "best way".
A lot depends on the precise situation and objectives: scalability is a major issue, as is return on investment.
I am actually doing a lot of development work which just breaks every rule I've seen on all the discussion forums I've been following in the last few years. Dynamic client side SQL, bound controls (e.g. combo boxes which get populated at run time from SQL tables), form/subforms in Access clients, and so on. On the whole, things are working well and productivity is high.
I can remember how stupefied I was the first time I opened the stored procedure window in SQL Enterprise Manager and got this "type your code in here" window without any kind of wizard or graphical tool for even the simplest DML operation (this was a shock for me; I'm used to working in Access). Access 2000 doesn't provide anything better for working with SQL SPs, which was a big disappointment; I hope to find something in Access XP.
I have recently used the freeware version of Lockwood's Proc Blaster, and it has certainly reduced my reluctance to use SPs, but I still find that using bound controls in Access forms is the most productive solution, and I can sure as hell write code that handles updates through recordsets much faster than I can write code for handling SPs (I just can't believe how complicated managing the ADO Command object is).
OK I am talking about low numbers of users and low volumes of traffic, and a fairly relaxed security situation. So why am I using SQL and not just Access? The benefits are incredible, even for small-medium sized operations, where the problem is often not the number of users or transactions, but the number of applications.
I think your article is very well written and I will certainly refer to it and make the developers who work for me read it, and I will certainly be trying out the other tool that you referred to in your article. Don't you agree that we need more tools for creating SPs (and the VB code to use them) semi-automatically?
On the topic of environments for creating stored procedures: the .NET development environment is a great improvement on preceding environments. Currently I have a database that is being written using just PowerDesigner as the CASE tool and the .NET IDE as the SP, trigger, view and test-data writer. You can have a solution with nicely organised folders, each with its own DB connection, all under source control. Right-click to run the code, move to the Server Explorer and debug the procedure. (You can use the database as the source of files and have them under source control, but I find it is better to use files outside of the DB.)
It still lacks some of the features of Query Analyzer, but you can see where Microsoft are going.
You can also see that they are not going to implement a totally new IDE for writing T-SQL when Yukon will be out in 18 months' time, at which point I imagine you will be able to use the complete IDE, as your SPs can be written in any .NET language.
It is very hard to argue with an application that users are satisfied with and that "works". If you are able to maintain it and everyone is happy with the performance, and you don't see that there is any possibility of the user load or database size to increase drastically, then you have a perfectly suitable solution in my opinion. If it's not broke, why fix it?
One of my earliest database experiences was with Access 2.0. At the time, I felt that program was the best thing I had ever seen! When I was first introduced to VB I had a real problem figuring out combo boxes and forms and how to get them to work. I was so used to Access! Eventually, I saw the power and the advantages of VB. Again, when I started using ASP, I could not get how to make a form, and the whole thing seemed so arduous! But over time I "got it" and saw the strength and weaknesses when compared to VB or Access.
Access is a good tool, but is often misused. I have seen many production systems bogged down by not 10s, but 100s of "linked table" type connections on user machines all over the company. I also found an application secretly written in Access that itself was burdening the SQL Server system. We were unable to change schemas, passwords or anything else on the production box because we were being held hostage by all of these "mission critical" users and applications that had attached and grown secretly over time.
I do agree that MS needs to make it easier for stored procedures to be used. I think that this would prevent a lot of the inefficient code style that gets used. Unfortunately, the easiest ways are often the most inefficient. So, you trade ease for efficiency.
On the other hand, once I really dug in and learned how to code efficiently, it really isn't as hard as it looks. It just takes a little more in depth understanding and not so much dependence on GUI tools to which we all tend to gravitate.
In writing this article I was hoping that it would increase the awareness of the efficiency that is lost by using some of the easier or demonstrated coding methods. I realize that these can't be used in all cases, especially once you are already so far down a particular path. Also, it depends upon your development team, and their skillset and tool productivity.
Thanks for your comments! It is very interesting for me to hear what others have to say about this subject.
I have not had the opportunity yet to use any of the .NET programming "stuff" that you are talking about. It sounds very interesting! Would you email me a screen shot of the interface?
I do not understand how you could write a stored procedure in any language. What do you mean by that? Will the language in turn write the T-SQL and compile it onto the server?
If anyone wants a screen shot I can send you one.
If you have ever used the InterDev database project you will see some similarities. The right-hand side is my solution, with files in folders for each database object type. The left-hand side is a Server Explorer, which displays the solution's database connections and combines features of Enterprise Manager, Computer Management and the Query Analyzer object browser.
One other point to note is that the IDE identifies any SQL block, outlines it and allows you to right-click on it and design it, which opens a query designer window.
Still no autocomplete, which would be very useful given the deferred name resolution from SQL 7 onwards.
Having to fully test each route through an SP is the bane of my life.
On the different-language point: you will write the SP in C#, VB.NET or Cobol.NET etc., and it will run as any .NET language does, using the Common Language Runtime, which is common to all .NET languages. If you read the transcript of Bill Gates' keynote at TechEd 2001, he shows a demo of this by going to edit an SP where the language is C#. This is for Yukon, which will have more and more support for the .NET Common Language Runtime.
Thank you, simonsabin,
I am going to have to make some time to look into this stuff further.
You have been very helpful!