The ARRAY In SQL Server 2000

  • Comments posted to this topic are about the content posted at

  • Nice article, by the way. I would like to correct some comments in it, though.

    >> A similar process is required for TV, but all operations are conducted 100% in memory <<

    That statement is not completely correct.

     FYI: see Microsoft Knowledge Base article 305977 (kb;en-us;305977).


    * Noel

  • Interesting reading.  Thank you for taking the time to share the ideas.

  • Good article.

    Why do I get a variable declaration error with this statement -

    Select * from @tbl INNER JOIN Products p ON @tbl.ItemName = Products.ProductName


  • because @tbl must be a declared table variable.

  • Good article, here is a variation that I use ...

    Declare @SQL   Varchar(200),
            @Array Varchar(200),
            @Tic   Varchar(8),
            @Field Varchar(25)

    Set @Array  = ''

    --Set @Field  = 'SupplierId'

    --Set @Tic    = '' -- for numbers

    Set @Field  = 'ContactName'

    Set @Tic    = '''' -- for chars

    Select  @Array = @Array + @Tic + fldName + @Tic + ','

     From (Select cast(ContactName as Varchar(50)) as fldName -- enter @Field name

       FROM Suppliers WHERE ContactName LIKE 'c%') as s

     Group by fldName

    --Print @Array

    If len(@Array) > 0
    Begin

      Set @Array = left(@Array, len(@Array) - 1)

      Set @SQL = 'Select CompanyName from Suppliers where ' + @Field + ' In (' + @Array + ')'

      --Print @SQL

      Exec (@SQL) -- run the dynamic statement

    End

  • In reply to Remi, here is the full code. I still get an error.

    Declare @tbl table (RowId int identity(1,1), ItemName varchar(100))

    Insert @tbl

      Select ProductName

       FROM Products

        WHERE ProductName like 'M%'

    Select * from @tbl INNER JOIN Products p ON @tbl.ItemName = Products.ProductName

  • Select * from @tbl T INNER JOIN Products p ON T.ItemName = p.ProductName
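    For anyone following along, here is a complete, self-contained version of the fix (this assumes the Northwind sample database's Products table, as in the thread above). The key point is that a table variable cannot be referenced by its @name inside a join; it must be given an alias, and both sides of the join condition should be qualified with their aliases:

    ```sql
    -- Assumes the Northwind sample database's Products table.
    Declare @tbl table (RowId int identity(1,1), ItemName varchar(100))

    Insert @tbl (ItemName)
      Select ProductName
        FROM Products
       WHERE ProductName like 'M%'

    -- Alias the table variable (T) and qualify both join columns.
    Select *
      from @tbl T
     INNER JOIN Products p ON T.ItemName = p.ProductName
    ```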

  • I'd be interested to see some actual performance results.  In my mind, it doesn't make sense that a table variable used in this fashion is faster than a cursor.  A table variable adds the overhead of having to create the table with an additional column and then select into it before you can even get started.

    I do see the value of a table variable in being able to pass it to a stored procedure, and in its more limited scope and cleanup, but I still use temp tables (mainly because it is easier to use "insert into #mytemp   exec my_sp").


  • From memory, many cursors (depending on how they are declared) use temp tables to provide the forward/backward functionality anyway.

    Also, table variables are not always entirely in memory - they too use tempdb, but have a much more tightly defined scope and so can be managed by SQL Server more efficiently and may be able to stay in memory - as you've said.

    I sometimes still use temp tables in a stored proc when it returns several result sets, or performs many calculations, all relying on the one temp table and accessing/joining on several of its columns - it is more efficient to be able to create a clustered index and other indices on the temp table and suffer its recompilation overhead than it is to have slow-performing queries using table scans.   I've only made use of it for this reason a couple of times - I usually stick to a table variable and then ALWAYS check the query plan for anything that is more than a couple of lines (I have cut running times from several minutes to several seconds by switching from a table variable to a temp table).

    As an aside, if I need to pass many values (e.g. an array of ints) to a stored proc from my application, I pass them in a comma-separated varchar(8000) - seems to work a treat.   I have a UDF to parse the string and return a table.  I suppose that XML would also work, but it seems a bit too much overhead for such a simple task!

    My 2c - cheers!  Ian

    PS - nice article - I like the ones that generate plenty of discussion!
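    For readers curious about the comma-separated-parameter approach Ian describes, here is one common way such a split UDF is written in SQL Server 2000. The function and column names (dbo.fn_SplitInts, @List, Value) are illustrative, not from the original post:

    ```sql
    -- Illustrative sketch: split a comma-separated list of ints into a table.
    CREATE FUNCTION dbo.fn_SplitInts (@List varchar(8000))
    RETURNS @Result TABLE (Value int)
    AS
    BEGIN
      DECLARE @Pos int, @Item varchar(20)
      SET @List = @List + ','              -- sentinel delimiter at the end
      SET @Pos  = CHARINDEX(',', @List)
      WHILE @Pos > 0
      BEGIN
        SET @Item = LTRIM(RTRIM(LEFT(@List, @Pos - 1)))
        IF @Item <> ''
          INSERT @Result (Value) VALUES (CAST(@Item AS int))
        SET @List = SUBSTRING(@List, @Pos + 1, 8000)
        SET @Pos  = CHARINDEX(',', @List)
      END
      RETURN
    END
    ```

    Usage would then look like: SELECT Value FROM dbo.fn_SplitInts('1, 5, 42'), typically joined against the target table in place of a dynamic IN list.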

  • Again, an article suggesting replacing a cursor with a one-row-at-a-time processing loop. That's not a more efficient solution. It would be nice if the author had tested the solution before claiming its superiority.

  • I have been told that using temp tables (TT) is more efficient than using table variables (TV) when building dynamic T-SQL queries. Does anyone have any opinions on the validity of this statement?



  • Great article, and a very good idea to simulate ARRAY functionality. As for the CURSOR simulation, that has already been discussed earlier and different users have had different experiences with the approach; still, I prefer the WHILE construct since, to me, it works better than a CURSOR construct.
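    As a sketch of the WHILE-loop pattern being discussed (the table and column names are illustrative, following the Northwind examples earlier in the thread):

    ```sql
    -- Illustrative WHILE loop over a table variable, in place of a cursor.
    Declare @tbl table (RowId int identity(1,1), ItemName varchar(100))
    Declare @i int, @max int, @name varchar(100)

    Insert @tbl (ItemName)
      Select ProductName FROM Products WHERE ProductName like 'M%'

    Select @i = 1, @max = Max(RowId) From @tbl
    While @i <= @max
    Begin
      Select @name = ItemName From @tbl Where RowId = @i
      -- ... per-row work goes here ...
      Set @i = @i + 1
    End
    ```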

  • I just spent two days determining if the table variable approach was faster than the CURSOR approach.  It was a no-brainer.  The CURSOR approach was faster.  And it became even faster when I created my CURSOR with LOCAL FAST_FORWARD.

    With the TV approach your reads go up significantly.  The duration might go down a little, but not enough to make up for the increase in reads.  And it was very inconsistent - sometimes it went up and sometimes it went down.

    With LOCAL FAST_FORWARD added to my CURSORs, my reads and duration dropped significantly.

    Here are my results on the CURSOR test:

    CPU      -- 172

    READS    -- 10216

    DURATION -- 283

    The DURATION went from 750 to 283!!!!!

    I spent the rest of the day changing all of our CURSORs to LOCAL FAST_FORWARD!
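    For reference, this is the general shape of the declaration being described (the cursor name and query are illustrative, not from the original post):

    ```sql
    -- Illustrative LOCAL FAST_FORWARD cursor declaration.
    Declare @name varchar(100)

    Declare cur CURSOR LOCAL FAST_FORWARD FOR
      Select ProductName FROM Products WHERE ProductName like 'M%'

    Open cur
    Fetch Next From cur Into @name
    While @@FETCH_STATUS = 0
    Begin
      -- ... per-row work goes here ...
      Fetch Next From cur Into @name
    End
    Close cur
    Deallocate cur
    ```

    LOCAL scopes the cursor to the batch or procedure, and FAST_FORWARD makes it forward-only and read-only, which is what allows SQL Server to optimize the fetches.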

  • Great article. In my shop it is punishable by being forced to turn on the overhead light in your office if you use a cursor. I tend to use WHILE loops and temp tables. I will definitely give this a try, though.
