Passing a Table to a Stored Procedure

  • I have been building DB systems on SQL Server since the days when Microsoft and Sybase were the same thing. My code is tightly structured and reusable. Read that as "full use of subroutines". Temp tables provide a simple and direct way of passing a table. I do it all the time.

    The XML solution seems to be an unnecessary layer of complexity. I have found temp table performance to be excellent. The code is direct and more native to the environment. In fact, when dealing with very LARGE interim results, I have found temp tables to outperform table variables.
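
    For illustration, a minimal sketch of the temp-table hand-off (the procedure and table names are made up):

    -- Caller creates and fills the temp table; any proc it calls can see it.
    CREATE TABLE #OrderLines (ProductId int NOT NULL, Qty int NOT NULL);

    INSERT INTO #OrderLines (ProductId, Qty) VALUES (101, 2);
    INSERT INTO #OrderLines (ProductId, Qty) VALUES (205, 1);

    EXEC dbo.SummariseOrderLines;
    -- and inside dbo.SummariseOrderLines, simply:
    --     SELECT ProductId, SUM(Qty) AS TotalQty FROM #OrderLines GROUP BY ProductId;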

    While on the subject of table variables, Microsoft's implementation of them is quite lame. Especially annoying, apart from not being able to pass them as arguments, is the inability to SELECT INTO them: we are constrained to defining them in advance and then inserting. That is denormalized design: we must spell out the schema and then repeat it in the code, instead of inheriting it as SELECT INTO does.
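
    To illustrate the difference (dbo.Orders is a made-up table):

    -- Temp table: the schema is inherited from the query
    SELECT CustomerId, SUM(Amount) AS Total
    INTO   #CustomerTotals
    FROM   dbo.Orders
    GROUP BY CustomerId;

    -- Table variable: the schema must be declared up front, then populated
    DECLARE @CustomerTotals TABLE (CustomerId int NOT NULL, Total money NOT NULL);

    INSERT INTO @CustomerTotals (CustomerId, Total)
    SELECT CustomerId, SUM(Amount)
    FROM   dbo.Orders
    GROUP BY CustomerId;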

    When Borland designed ObjectPAL tableCursor variables, they had the right idea: they could be handled just like any other variable, and could be manipulated just like a physical table. Hooray for Borland. Hooray for the ghost.

    - - Herb

  • Sergiy, too funny. Thanks for the laugh. I think you made one good point, and missed about 3 of my points. Oh well.

    For me, I've *only* found XML to be a decent workaround for the inability to pass table variables FROM A CLIENT application. I've never found it useful inside the database tier, and I hope others do not either. I may be stupid, but I don't see how temp tables solve the client-to-server data-passing problem at all.

    Re. temp tables versus table variables: you make them out to behave identically. They have similarities, but they do not work identically. On large data-warehousing projects I have done extensive load testing and found that the exact same code behaves quite differently in performance when #tables are swapped for @tables. On small recordsets table variables always won out. On very large sets, temp tables worked better (usually because we can index them like true tables, while with table variables you have to hack composite keys to "fake" an index). These results also varied by server configuration (disk subsystem, memory, etc.).
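
    For example (a hypothetical table, the point being the indexing options):

    -- Temp table: can be indexed after the fact, like a real table
    CREATE TABLE #Fact (CustomerId int NOT NULL, OrderDate datetime NOT NULL, Amount money NOT NULL);
    CREATE INDEX IX_Fact_Customer ON #Fact (CustomerId, OrderDate);

    -- Table variable: no CREATE INDEX; the only "index" you get is whatever
    -- can be expressed as a PRIMARY KEY or UNIQUE constraint at declaration time
    DECLARE @Fact TABLE (
        CustomerId int NOT NULL,
        OrderDate  datetime NOT NULL,
        Amount     money NOT NULL,
        PRIMARY KEY (CustomerId, OrderDate)   -- the composite-key workaround
    );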

    You have to balance BOL reading with real-world testing, I'm afraid. And, by the way, SQL's winning out was much more due to ANSI standardization than any technical superiority. That's fun historical reading as well.

    Cheers.

    You're welcome. It's good that I bring some good mood to some good people.

    OK, I've got your point about the client application.

    But what kind of application is supposed to send tables to a server?

    Is it a database? What are tables doing there in the first place?

    By holding disconnected tables in memory on the client's side you break data integrity and, in fact, create many separate databases with additional problems at the synchronization point.

    The impossibility of sending tables to an SP stops you from designing wrong systems. And now you are inventing a bad workaround to stick with a wrong design.

    Good luck with this!

    When you fail (no "if"s here!) you may call me or Jeff. For a reasonably enormous amount of money we would probably put you back on track.

    P.S. I do a lot of real-world testing. And other people test my systems. And performance is one of the key requirements.

    When I broke 1 big SELECT into 3 (2 of them populating #tables, and the 3rd joining those 2 #tables with static tables in the database), the total execution time dropped from 8 s to 1-1.5 s for the same dataset returned.
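
    For illustration, the general shape of that approach looks like this (made-up table names, not the actual system):

    -- Steps 1 and 2: materialise the two expensive intermediate sets
    SELECT CustomerId, SUM(Amount) AS Total
    INTO   #Sales
    FROM   dbo.SalesDetail
    GROUP BY CustomerId;

    SELECT CustomerId, MAX(PaymentDate) AS LastPayment
    INTO   #Payments
    FROM   dbo.Payments
    GROUP BY CustomerId;

    -- Step 3: the final SELECT joins the two small temp tables to the static tables
    SELECT c.CustomerName, s.Total, p.LastPayment
    FROM   dbo.Customers c
    JOIN   #Sales    s ON s.CustomerId = c.CustomerId
    JOIN   #Payments p ON p.CustomerId = c.CustomerId;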

    And, by the way, only technically superior approaches become standards.

    _____________
    Code for TallyGenerator

  • '..only technically superior approaches become standards'

    Are you kidding me?

    I'm not saying you do or don't know SQL and/or RDBMS well, but I can tell you're not familiar with marketing. Sadly enough, marketing normally trumps technical superiority every time. Classic examples would be the heyday of Betamax versus VHS, or CD vs MiniDisc. And in the not too distant future you'll probably see the more technically superior format in the hi-def DVD war turn out to be the loser.

    It's not about who's better or has the better product but, sadly enough, about who can make theirs look and sound better during the sale.

    Ed

    Kindest Regards,

    Just say No to Facebook!
    Use the right tool for the job. XML was designed as a replacement for EDI (Electronic Data Interchange). It was designed to transfer data between disparate systems. Just because you can do a lot of "cool" things with it doesn't mean you should. XML is one of the most inefficient data storage mechanisms available, mainly because it must be "human readable."

    Why anyone would design a system around an internal XML usage such as this article proposes is beyond me, unless the system is a throw-away or only for small jobs. As has been duly noted, parsing XML is slow and does not scale well.

    Anyone who has done systems design for any length of time will tell you that a system almost always grows larger and more complex over time, and using XML for something like this will inevitably come back and bite you in the rear.

    Actually, one of the other things I am an expert at is EDI (specifically ANSI X12 healthcare datasets). That mechanism is INCREDIBLY compact, efficient, flexible and well documented. Systems have been around for a LONG time that work VERY well with the data. Now, you can't just add new elements and have the schema tell you what is in them like you can with XML, so you lose that bit of flexibility. But if you did that you would have to change code to handle whatever new stuff came along anyway, so the gain is limited.

    I have always chafed at the way BizTalk does the XML conversion stuff when chewing through EDI. Just sooooo inefficient! But I am borderline obsessive-compulsive when it comes to efficiency . . . :-))

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service

    Thank you Jacob, you have explained things really well... Your articles are really very useful for me... I am eagerly waiting for your next ideas.

    Thanks, Jacob, Sergiy and others in this nice discussion.

    The solution with triggers was indeed over my head, despite the fact that I have created views with INSTEAD OF triggers.

    A practical use of the XML approach (I've used it a few times, also a comma-delimited list once) is when we pass a very small amount of data to the server. In my most recent use of it I pass a list of selected goods to a stored procedure.

    The user fills in an application, chooses goods that he would like to buy, and I need to present him with a list of options for a payment schedule. The goods are not yet written to the database and their number is usually 1-3 items. This list is only one of the parameters; there are a few more, and it is most practical to do this filtering of possibilities on the database layer. At the application server some parameters are calculated for these options and the list is presented to the user.

    I think the XML approach is a valid one here.
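
    For illustration, the shape of such a call might be (element and object names are made up, not the real ones):

    CREATE PROCEDURE dbo.GetPaymentOptions
        @SelectedGoods xml
    AS
    BEGIN
        SET NOCOUNT ON;

        -- shred the small XML list into rows, then filter/join as needed
        SELECT x.n.value('@id',  'int') AS ProductId,
               x.n.value('@qty', 'int') AS Qty
        FROM   @SelectedGoods.nodes('/goods/item') AS x(n);
    END
    GO

    -- called with something like:
    -- EXEC dbo.GetPaymentOptions
    --      @SelectedGoods = N'<goods><item id="101" qty="2"/><item id="205" qty="1"/></goods>';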

    Hi Jacob, thanks for the clear writings.

    Would this be an alternative?

    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    -- =============================================
    -- Author:      Neil Osman
    -- Create date: June 4th 2007
    -- Description: gets an object name by its id
    -- =============================================
    CREATE PROCEDURE GetObjNameById
        @colId    nvarchar(50),   -- name of the id column
        @colName  nvarchar(50),   -- name of the column to return
        @tblName  nvarchar(50),   -- table to query
        @objValue int             -- id value to look up
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @sqlStr varchar(255)

        -- build the dynamic statement
        SELECT @sqlStr = 'select ' + @colName + ' from ' + @tblName
                       + ' where ' + @colId + ' = ' + CAST(@objValue AS varchar(12))

        EXEC master..xp_cmdshell @sqlStr
    END
    GO

  • Bojidar,

    If your user is creating an order, and the order number is to be generated by some algorithm, how can you make sure 2 users creating 2 orders at the same time are not going to get identical numbers?

    Right, by reserving a number for an order prototype.

    For this you MUST have a roundtrip to the database and create an order prototype.

    Since the order is created anyway, it's logical to continue with roundtrips for each line (or each value, depending on which model you choose), saving values in lines as soon as the user has finished with them.

    Then, when the user presses "submit", the prototype just becomes an order.

    And users will really appreciate the possibility of continuing with an unfinished order after their desktop crashed for some reason, or they just pressed the wrong button and closed the application.
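
    A rough sketch of that flow, with tables and columns invented just for the example:

    CREATE TABLE dbo.Orders (
        OrderId   int IDENTITY(1,1) PRIMARY KEY,   -- the reserved number
        Status    char(1)  NOT NULL DEFAULT 'P',   -- P = prototype, S = submitted
        CreatedAt datetime NOT NULL DEFAULT GETDATE()
    );
    GO

    DECLARE @OrderId int;

    -- 1. User starts an order: create the prototype; the identity is the reserved number.
    INSERT INTO dbo.Orders DEFAULT VALUES;
    SET @OrderId = SCOPE_IDENTITY();

    -- 2. Each finished line is saved immediately against that number
    --    (an OrderLines child table is assumed here).
    -- INSERT INTO dbo.OrderLines (OrderId, ProductId, Qty) VALUES (@OrderId, 101, 2);

    -- 3. "Submit" just flips the prototype into a real order.
    UPDATE dbo.Orders SET Status = 'S' WHERE OrderId = @OrderId;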

    _____________
    Code for TallyGenerator

  • Hi Neil,

    I see that @sqlStr is a TSQL statement. So I am wondering why you execute it using xp_cmdshell.

    .

    Thank you everyone for reading the article and participating in the discussion. A lot of great ideas were discussed in the forum and I am pretty sure they must have helped many people.

    I believe the best solution for a given problem cannot be suggested unless we study that specific problem. So it would not be wise to say DON'T DO THAT and DO THIS. There are general programming rules and practices that are to be considered basic guidelines. Based on the specific problem you are working on, the environment and the requirements, you should override some of the basic rules as and when required. We de-normalize tables many times even though normalized tables are what the rules recommend.

    I see SQLServerCentral as a great platform where many approaches/ideas for a given problem can be discussed. However, based on various factors related to the problem, each one of us needs to decide the best approach for our own solution. If anyone who does not really understand your specific problem 100% suggests a specific solution, I don't think that would be correct.

    I suppose at least some of you here would agree with me on this 🙂

    Cheers!

    .

    Thanks for your opinion, Sergiy, it is valid, but I want to clarify the situation. It is a web-based application with separate database, application server and web server. Indeed, at the beginning of filling in the application I write it to the DB and get a number.

    It's a long form, so it is split across a few pages (wizard style), and after each page change (if there is new data) it is written to the database, exactly so the user can continue filling in the form if something breaks (it's over the Internet anyway). There are many partial page postbacks (AJAX) to the web server while editing a page, to render different options or validate something, but it does not make sense to make round trips to the database for every single changed field. The values are kept in a hierarchy of objects on the web server side and are saved to the database only when it's worth it.

    The database is usually the bottleneck, right?

    So why not just do a simple query to the database to give me a list of 4-5 options, supplying a theoretically unlimited but in reality 1-3 item list of goods?

    Anyway, I tried to show a valid use case of an XML-supplied list as a parameter in a working production environment.

    > The database is usually the bottleneck, right? [Wink]

    Not really. Only if it's designed badly.

    In my last project, a test with 400 users simultaneously accessing the database revealed that Internet traffic was the bottleneck.

    > So why not just do a simple query to the database to give me a list of 4-5 options, supplying a theoretically unlimited but in reality 1-3 item list of goods? [Smile]

    As many people mentioned here, XML is very (no, extremely) bad in terms of scalability. So don't even think about the "theoretically unlimited" thing.

    And a query recording a single line is going to be much simpler than one recording the whole document, would you agree?

    Don't forget, users would rather take small pauses after committing each line (knowing that this line is now saved) than wait a long time for the whole document to be processed, fearing they are going to lose the whole thing if something goes wrong.

    That limitation is not a curse, it's actually a hint.

    It's trying to stop you from doing wrong things.

    Do you really believe you're the first one who came up with that "brilliant" idea? Believe me, it was on the agenda even before SQL Server as we know it.

    But it's still not there. And the products that do have it are not really around themselves.

    Think about the reason: why?

    _____________
    Code for TallyGenerator

  • All,

    Just got the PASS Community Connector - June 11, 2007

    newsletter today. Looks like Microsoft finally got around to the task of making this a feature in the upcoming version of SQL Server 2008 (10 years too late?).

    Guess the holy war will now have to drift to a new topic.

    • Table-Valued Parameters

    • In many customer scenarios, it is necessary to pass a set of table-structured values (rows) to a stored procedure/function on the server. These values may be used for populating/updating a table directly or for more complex manipulation of data for business-logic purposes. Table-valued parameters will provide an easier way to define a table type, as well as allow applications to create, populate and pass table-structured parameters to stored procedures and functions.
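
    Based on that description, the SQL Server 2008 syntax should look roughly like this (object names invented; treat it as a sketch until the product actually ships):

    -- define the table type once
    CREATE TYPE dbo.OrderLineList AS TABLE (
        ProductId int NOT NULL,
        Qty       int NOT NULL
    );
    GO

    -- a procedure taking the table type as a (read-only) parameter
    CREATE PROCEDURE dbo.ProcessOrderLines
        @Lines dbo.OrderLineList READONLY
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT ProductId, SUM(Qty) AS TotalQty
        FROM   @Lines
        GROUP BY ProductId;
    END
    GO

    -- T-SQL caller
    DECLARE @Lines dbo.OrderLineList;
    INSERT INTO @Lines (ProductId, Qty) VALUES (101, 2);
    INSERT INTO @Lines (ProductId, Qty) VALUES (205, 1);
    EXEC dbo.ProcessOrderLines @Lines = @Lines;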

    enjoy!

