# The Elusive Conditional WHERE Clause

• I'll save all of you a bit of trouble. Here's my "standard" million row hammer for things like this. I'll let you determine what the indexing should be. Change it to suit your needs, but do let it be a need, because until the code speaks, everyone is guessing, even if they may be highly educated guesses. :-) As my good friend Sergiy says, "A Developer must not guess. A Developer must KNOW!" :-)

```sql
-- DROP TABLE dbo.JBMTest;
GO
--===== Create and populate a 1,000,000 row test table.
     -- "SomeID" has a range of 1 to 1,000,000 unique numbers
     -- "SomeInt" has a range of 1 to 50,000 numbers
     -- "SomeLetters2" has a range of "AA" to "ZZ"
     -- "SomeMoney" has a range of 10.00 to 100.00 numbers
     -- "SomeDate" has a range of >=01/01/2010 & <01/01/2020 whole dates
     -- "SomeDateTime" has a range of >=01/01/2010 & <01/01/2020 Date/Times
     -- "SomeRand" contains the value of RAND just to show it can be done
     -- without a loop (DEMO-3).
 SELECT TOP 1000000
        SomeID       = IDENTITY(INT,1,1),
        SomeInt      = ABS(CHECKSUM(NEWID())) % 50000 + 1,
        SomeLetters2 = CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65)
                     + CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65),
        SomeMoney    = CAST(RAND(CHECKSUM(NEWID())) * 90 + 10 AS DECIMAL(9,2)),
        SomeDate     = DATEADD(dd,ABS(CHECKSUM(NEWID()))
                     % DATEDIFF(dd,'2010','2020'),'2010'),
        SomeDateTime = DATEADD(dd,DATEDIFF(dd,0,'2010'),
                       RAND(CHECKSUM(NEWID())) * DATEDIFF(dd,'2010','2020')),
        SomeRand     = RAND(CHECKSUM(NEWID()))
   INTO dbo.JBMTest
   FROM      sys.all_columns ac1 --Cross Join forms up to a 16 million row
  CROSS JOIN sys.all_columns ac2 --Pseudo Cursor
;
```
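If you need a starting point for that indexing, something like the following is a reasonable first cut for date-range tests against this table. This is purely an assumption on my part (Jeff deliberately leaves the indexing to you):

```sql
--===== Example indexing ONLY - an assumption, not part of the original post.
     -- Adjust to match the WHERE clauses you are actually testing.
ALTER TABLE dbo.JBMTest
  ADD CONSTRAINT PK_JBMTest PRIMARY KEY CLUSTERED (SomeID);

CREATE NONCLUSTERED INDEX IX_JBMTest_SomeDate
    ON dbo.JBMTest (SomeDate)
    INCLUDE (SomeInt, SomeMoney);
```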

--Jeff Moden

RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

Change is inevitable... Change for the better is not.

How to post code problems
How to Post Performance Problems
Create a Tally Function (fnTally)

• John Hick-456673 (4/18/2015)

spaghettidba (4/18/2015)

John Hick-456673 (4/17/2015)

spaghettidba (4/17/2015)

craig 81366 (4/18/2013)

John Hick-456673 (4/18/2013)

I have tested it against large tables (1 billion plus rows) and performance is much better than with dynamic SQL.

I'm going to be blunt here.

I find that extremely hard to believe.

+1

Dynamic SQL is difficult to beat in this case.

I would go as far as to say that suggesting this technique is a bad advice.

You are two years behind the eight-ball, there. But, thank you for playing. :-)

You might have missed that your article is featuring on the main page of SQL Server Central.

That might be - but after the vicious attacks from the first go-round, I am not feeling particularly merciful. Therefore, I presume that you have never written for SSC. I was informed a few weeks ago that the article was to be republished. I braced myself for another round of flames. Surprisingly, there have been only a couple of you who have had the bad grace to treat this as a first publishing - and that as though I am maliciously guiding people astray.

But, maybe I should embrace triangular joins and cursors - no, wait! NESTED CURSORS with extraneous function calls within! THAT'S MY NEW PARADIGM! RBAR, HERE I COME!

HEY, EVERYBODY! THIS IS HOW YOU CAN BRING DOWN A SYSTEM! JUST LISTEN TO ME!

Really, is this what you want? Someone writes an article in the spirit of trying to help and the community wants to burn him at the stake? Really? OK, I get it. Bye!

jhick signing out on SQL Server Central, permanently. I'm done. I used to sign in at least a couple of times a week - but this was the last.

Nah. Don't do that. You'd be cheating a whole lot of people that desperately need help out of your good knowledge and experience.

And this is nothing. If you really want to see someone get flamed, visit my last article on running totals. You haven't seen flaming until you've seen that little gem and I even had proof in code! Keep writing. Even if it turns out that you're wrong, people learn a fantastic amount from the discussions that follow articles like this one... heh... even if it's "just" learning how to post without flaming or how to grow a really thick skin that will benefit you and those around you in the future. It has certainly helped me in all walks of life. :-) And, the flaming also helps me anticipate what I need to do on the next article to keep me from getting flamed. That has REALLY helped me sell code and design ideas in real life.

As it turned out on that article and thanks to my patience with the "ring knockers" that attacked my methods in a seriously ad hominem manner, a REALLY important improvement came out that supported my method making it even better.

Hang in there. Like "Red Green" says, "We're all in this together and I'm pullin' for ya".

Shifting gears, I believe that a good fellow and friend of mine is in the process of taking me up on the challenge of proof even as I write this. I'm also pretty sure that he's only going to let the code do the talking. Hang around and see what happens.

As to the larger point, thank you for taking the time to write the article. At the very least, which is still tremendously important, it brings guns to bear on a very common problem. So, please, don't go away. You understand what RBAR means and that makes you a "witch" that people will sometimes attempt to burn at the stake. Put on your asbestos underwear and enjoy the "warmth" and learning even if the lessons aren't what you expected. :-)

--Jeff Moden

RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

Change is inevitable... Change for the better is not.

How to post code problems
How to Post Performance Problems
Create a Tally Function (fnTally)

• This is probably the only way to perform this (here's another challenge for you: can you do this WITHOUT using a CTE?).

Why not use the tally table technique? I use a tally table to replace almost every loop I find myself writing. I have a table called Number with one column, Value, containing the integers 1 - 1,000,000 (Value being a clustered primary key). The following code, I believe, would produce the same output as your code (not sure of the performance, but a quick test showed a tiny improvement over the code someone posted earlier using the MAXRECURSION option):
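For anyone following along who doesn't already have that table, here's a sketch of building it the set-based way. The name, column, and key are my reading of the description above, so adjust as needed:

```sql
--===== Build the Number table described above: integers 1 to 1,000,000,
     -- with Value as the clustered primary key.
 SELECT TOP 1000000
        Value = IDENTITY(INT,1,1)
   INTO dbo.Number
   FROM      sys.all_columns ac1
  CROSS JOIN sys.all_columns ac2;

ALTER TABLE dbo.Number
  ADD CONSTRAINT PK_Number PRIMARY KEY CLUSTERED (Value);
```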

```sql
DECLARE
@StartDate DATETIME = '20150201'
, @EndDate DATETIME = CAST(GETDATE() AS DATE)

SELECT
CAST(Value AS DATETIME) [Date]
, 'W' + CAST(DATEPART(wk, Value) - DATEPART(wk, @StartDate) + 1 AS VARCHAR) + 'D' + CAST(DATEPART(dw, Value) AS VARCHAR) [Wk]
, DATEPART(dw, Value) [DOW]
FROM Number
WHERE Value >= @StartDate
AND Value <= @EndDate
```

Just noticed the crossover of years messes up the week number, but I'm sure you could sort that out. Have to do some work now tho :-)

Just tested this on 1 million rows: it completes in 10.05 seconds vs. 17.99 for the CTE solution.
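One possible fix for the year-crossover problem mentioned above (a sketch, not from the original post): number the weeks with DATEDIFF(wk, ...) from @StartDate, which keeps counting week boundaries across 31 December instead of resetting the way DATEPART(wk, ...) does:

```sql
DECLARE
@StartDate DATETIME = '20150201'
, @EndDate DATETIME = CAST(GETDATE() AS DATE)

SELECT
CAST(Value AS DATETIME) [Date]
-- DATEDIFF(wk,...) counts week boundaries crossed, so it survives a year change
, 'W' + CAST(DATEDIFF(wk, @StartDate, Value) + 1 AS VARCHAR) + 'D' + CAST(DATEPART(dw, Value) AS VARCHAR) [Wk]
, DATEPART(dw, Value) [DOW]
FROM Number
WHERE Value >= @StartDate
AND Value <= @EndDate
```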

I'd always thought that creating this table in the database improved performance, but using a CTE instead (see below) is just as quick. Mind you, it requires more code:

```sql
; WITH Number
AS
(
SELECT TOP 1000000 ROW_NUMBER() OVER (ORDER BY c1.column_id) Value
FROM sys.columns c1, sys.columns c2
)
-- ...followed by the same SELECT ... FROM Number query as above
```

• Jeff Moden (4/19/2015)


+1

Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events

• Thanks for the info. I've used this on a limited basis.

• Thank you, Jeff, and you, Jason. I've learned more from lurking on SSC than I did in my CS classes back in the day.

You two, Dwain Camps, Paul White, a few more - all bring so much to this forum. Maybe I will try my hand at another flame magnet.

Again, thanks.

• +1

• John Hick-456673 (4/20/2015)

Thank you, Jeff, and you, Jason. I've learned more from lurking on SSC than I did in my CS classes back in the day.

You two, Dwain Camps, Paul White, a few more - all bring so much to this forum. Maybe I will try my hand at another flame magnet.

Again, thanks.

We would be looking forward to it.

Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events

• John Hick-456673 (4/20/2015)

Thank you, Jeff, and you, Jason. I've learned more from lurking on SSC than I did in my CS classes back in the day.

You two, Dwain Camps, Paul White, a few more - all bring so much to this forum. Maybe I will try my hand at another flame magnet.

Again, thanks.

Go for it, John.

βWrite the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.β - Gail Shaw

Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden

• Thanks, Chris.

• Even with some of the harsh feedback that comes with publishing, I feel the need to write at least one article myself.

Certainly I owe many of the previous writers that kept me progressing at a pace!

First, I've got to verify that my topic hasn't been covered by someone else before (I suspect not; hard to be sure).

And exclude the risk of having a trick that has been written about in a book (that sure would give me unwanted flak).

• peter-757102 (4/21/2015)

Even with some of the harsh feedback that comes with publishing I feel the need to write at least one article myself.

Certainly I owe many of the previous writers that kept me progressing at a pace!

First, I got to verify my topic is not covered by someone else before (I suspect not, hard to be sure).

And exclude the risk of having a trick that has been written about in a book (that sure would give me unwanted flak).

PM a few folks. I'll volunteer, I'm sure Dwain & Jeff will too.

βWrite the query the simplest way. If through testing it becomes clear that the performance is inadequate, consider alternative query forms.β - Gail Shaw

Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden

• peter-757102 (4/21/2015)

Even with some of the harsh feedback that comes with publishing I feel the need to write at least one article myself.

Certainly I owe many of the previous writers that kept me progressing at a pace!

First, I got to verify my topic is not covered by someone else before (I suspect not, hard to be sure).

And exclude the risk of having a trick that has been written about in a book (that sure would give me unwanted flak).

Except for my second article on "Hierarchies On Steroids", EVERYTHING I've written about in my 33 articles HAS been written about by someone else right here on SSC and certainly in the wider world of such forums and blogs. For example, the article I wrote on the Tally/Numbers table was written about thousands of times before I wrote mine. I just took a different slant on it to try to make it easier to understand how to use it. The same goes for my articles on Cross Tabs and Pivots. Certainly neither was a revolutionary concept, as Cross Tabs have been used since the beginning of Sybase and the concept of Pivot is as old as Lotus 1-2-3. I just wrote something to make it easier to understand for neophytes and alumni that may have struggled with the concept in the past.

What you have to do is think like the "BASF" company... they don't necessarily invent the things they work with... they just find a way to make them better. Sometimes, it's not even making something better. Rather, it may be that you've taken a slew of information about a given subject and concentrated it into a single article or two.

You also have to be a bit skilled at setting expectations in the Introduction to your article. Since both of the articles I spoke of were actually "old hat" and I didn't want a whole bunch of people whining about them being "nothing new" for them personally, I included the reason why I was writing new articles about old things, which was either a flurry of forum posts that had a need that the article solved or explained, or just that I wanted to try to make a difficult concept easier to understand.

It's worked pretty well so far and the articles have done what I never dreamed they would do. For example, use of the Tally/Numbers Table was previously considered by many to be a rather advanced technique. I can't speak of other forums but it's mostly considered to be a household word/technique on SSC now.

An article doesn't have to be on a new subject and, considering that only 1 out of the 33 articles I've written is actually about something new, I wouldn't let "something new" be the criterion for writing an article. Increasing understanding of something that already exists or making a concept easier to understand is one of the best things that an article can do.

So... in the article...

1. Identify the need and the purpose.

2. Set expectations and identify what will be taught.

3. Teach.

4. Tell them what they've just been taught.

Here are a couple of survival tips. Learn how to generate lots of data to support your teaching. If you make a claim of performance or lower resource usage and don't prove it with a million row test table, people will eat you alive. If you have spelling errors, the spelling police will pick your bones apart. And although I've had to suffer through some folks who think you must write using the King's English, be yourself. I write like I'm trying to teach someone one on one or telling a story to a group. And for goodness sake, don't use a While Loop or Recursive CTE to generate the data, especially if you're writing about a performance related subject.

In case you don't know how to easily generate a shedload of data that your readers can reproduce simply by running your code, please take a look at the following two articles and then use your imagination.

http://www.sqlservercentral.com/articles/Data+Generation/87901/

http://www.sqlservercentral.com/articles/Test+Data/88964/

Just as an example of how easy it is to generate a million rows of data to support what you're teaching, here's my "standard" million row test table, which demonstrates some of the methods in the two articles. With a little imagination, you can do ANYTHING.

```sql
--===== Create and populate a 1,000,000 row test table.
     -- "SomeID" has a range of 1 to 1,000,000 unique numbers
     -- "SomeInt" has a range of 1 to 50,000 numbers
     -- "SomeLetters2" has a range of "AA" to "ZZ"
     -- "SomeMoney" has a range of 10.00 to 100.00 numbers
     -- "SomeDate" has a range of >=01/01/2010 & <01/01/2020 whole dates
     -- "SomeDateTime" has a range of >=01/01/2010 & <01/01/2020 Date/Times
     -- "SomeRand" contains the value of RAND just to show it can be done
     -- without a loop (DEMO-3).
 SELECT TOP 1000000
        SomeID       = IDENTITY(INT,1,1),
        SomeInt      = ABS(CHECKSUM(NEWID())) % 50000 + 1,
        SomeLetters2 = CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65)
                     + CHAR(ABS(CHECKSUM(NEWID())) % 26 + 65),
        SomeMoney    = CAST(RAND(CHECKSUM(NEWID())) * 90 + 10 AS DECIMAL(9,2)),
        SomeDate     = DATEADD(dd,ABS(CHECKSUM(NEWID()))
                     % DATEDIFF(dd,'2010','2020'),'2010'),
        SomeDateTime = DATEADD(dd,DATEDIFF(dd,0,'2010'),
                       RAND(CHECKSUM(NEWID())) * DATEDIFF(dd,'2010','2020')),
        SomeRand     = RAND(CHECKSUM(NEWID()))
   INTO dbo.JBMTest
   FROM      sys.all_columns ac1 --Cross Join forms at least a 16 million row
  CROSS JOIN sys.all_columns ac2 --Pseudo Cursor
;
```

Last but not least, live to learn from the harsh feedback. You have absolutely no idea how much you can learn from it even if it's just about how to handle nasty ol' ring-knockers, zealous "best practices" bible-thumpers, lynch-mobs, and outright trolls. You also find a wealth of support when you handle them correctly. If you respond in kind, then you're no better than they are. Admittedly, it's sometimes difficult to keep your cool in their presence but it's outstanding practice in how to deal with such people when you meet the same types in real life.

I've found that the greatest path to success in such articles is to let the code do the talking because that takes "opinion" out of the picture. In most cases, if you don't have code and enough test data (or links to someone's article that does) to support your position, then your article is based on "opinion" and those with opinions of their own will eat you alive even if their opinions suck. :-)

Also remember to embrace those that submit alternatives or improvements. That's actually what the whole thing is about. I've even modified the intros to some of my articles to point them to the post in the discussion where someone found a better way. My article on converting multiple adjacent spaces to a single space is a great example of that.

--Jeff Moden

RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
First step towards the paradigm shift of writing Set Based code:
________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

Change is inevitable... Change for the better is not.

How to post code problems
How to Post Performance Problems
Create a Tally Function (fnTally)

• Thanks Jeff,

I also entered a PM session with ChrisM and he brought up a related problem that I think I just solved as well :).

If you want to know details, please PM me as I do not want to hi-jack this thread!

• Sorry for the delay; work got in the way and, of course, I got a little carried away too:

Having read the article and already looking into a similar problem, I felt compelled to take on Jeff's challenge. I took one of my transactional test set generators and applied a somewhat simplified version of a more realistic problem, in order to establish the POC of the stated functionality.

The test set is a "transaction table" look-alike of ACME Inc. online and outlet trading. To conform to the problem in the article, I added a week day number to the set and six applicable indices. :-)

Test set

`USE tempdb;`

`GO`

`SET NOCOUNT ON;`

`/* Drop the dbo.TBL_SAMPLE_TRANSACTION test set table rather`

` than truncating it, makes it easier to alter the structure.`

`*/`

`IF OBJECT_ID(N'dbo.TBL_SAMPLE_TRANSACTION') IS NOT NULL DROP TABLE dbo.TBL_SAMPLE_TRANSACTION;`

`/* View wrapper for using the NEWID() function`

` within a table value function`

`*/`

`IF OBJECT_ID(N'dbo.VNEWID') IS NULL`

`BEGIN`

` DECLARE @CREATE_VIEW NVARCHAR(MAX) = N'`

` CREATE VIEW dbo.VNEWID`

` AS`

` SELECT NEWID() AS NID;`

` '`

` EXEC (@CREATE_VIEW);`

`END`

`/* Test set generator, inspired by Lynn Pettis's random`

` string function:`

` http://www.sqlservercentral.com/blogs/lynnpettis/2009/04/04/a-variable-length-random-string/`

`*/`

`IF OBJECT_ID(N'dbo.ITVFN_DO_SHAKESPEARE') IS NULL`

`BEGIN`

` DECLARE @CREATE_FUNCTION NVARCHAR(MAX) = N'`

`/*`

` Sample text set generator, having an infinite number of code`

` monkeys calling this function for infinite number of times`

` explains the name ;-)`

` 2015-01-18`

` Eirikur Eiriksson`

`*/`

`CREATE FUNCTION dbo.ITVFN_DO_SHAKESPEARE`

`(`

` @BASE_LENGTH INT`

` ,@BASE_VARIANCE INT`

` ,@WORD_LENGTH INT`

` ,@WORD_VARIANCE INT`

` ,@ROWCOUNT INT`

` ,@DELIMITER CHAR(1)`

`)`

`RETURNS TABLE`

`AS`

`RETURN`

` WITH T(N) AS (SELECT N FROM (VALUES (NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL)) AS X(N))`

` ,NUMS(N) AS (SELECT TOP (@BASE_LENGTH + @BASE_VARIANCE) ROW_NUMBER() OVER (ORDER BY (SELECT NULL) DESC) AS N FROM T T1,T T2,T T3,T T4)`

` ,RNUM(R) AS (SELECT TOP (@ROWCOUNT) ROW_NUMBER() OVER (ORDER BY (SELECT NULL) DESC) AS N FROM T T1,T T2,T T3,T T4,T T5,T T6,T T7,T T8,T T9)`

`SELECT`

` RN.R`

` ,((SELECT TOP(@BASE_LENGTH + ((SELECT CHECKSUM(NID) FROM dbo.VNEWID) % @BASE_VARIANCE))`

` CASE`

` WHEN (NM.N + RN.R + (CHECKSUM(X.NID) % @WORD_LENGTH)) % @WORD_VARIANCE = 0 THEN @DELIMITER`

` ELSE CHAR(65 + (ABS(CHECKSUM(X.NID)) % 26))`

` END`

` FROM NUMS NM`

` CROSS APPLY dbo.VNEWID X`

` FOR XML PATH(''''), TYPE).value(''.[1]'',''VARCHAR(8000)'')) AS RND_TXT`

`FROM RNUM RN;`

`';`

` EXEC (@CREATE_FUNCTION);`

`END`

`/* Test set parameters`

` Minimum value for @SAMPLE_SIZE is 500 without changing any of the`

` other parameters.`

`*/`

`DECLARE @SAMPLE_SIZE INT = 1000000; -- Number of "Transactions"`

`DECLARE @OUTLET_COUNT INT = @SAMPLE_SIZE / 200; -- Number of "Outlets"`

`DECLARE @BASE_DATE DATE = CONVERT(DATE,'2014-01-01',126); -- Base Date, all dates are based on this.`

`DECLARE @ZERO_DATE DATE = CONVERT(DATE,'1900-01-01',126); -- Monday 1st. January 1900.`

`DECLARE @DATE_RANGE INT = 1096; -- +/- 3 Years`

`DECLARE @MAX_PAY_DAYS INT = 90; -- Pay by date offset`

`DECLARE @MAX_ITEMS INT = 20; -- Maximum number of Items`

`DECLARE @ACT_PAY_DAYS INT = 99; -- "Actual" Pay Date`

`DECLARE @AVG_PER_GROUP INT = 500; -- Additional Group Identifier Parameter`

`DECLARE @GROUP_COUNT INT = @SAMPLE_SIZE / @AVG_PER_GROUP; -- Number of Groups`

`DECLARE @CUSTOMER_COUNT INT = @SAMPLE_SIZE / 4; -- Number of Customers`

`/* Random text generation for "customer details" */`

`DECLARE @BASE_LENGTH INT = 50 ;`

`DECLARE @BASE_VARIANCE INT = 49 ;`

`DECLARE @WORD_LENGTH INT = 7 ;`

`DECLARE @WORD_VARIANCE INT = 6 ;`

`/* Get few nulls in the detail column be having slightly fewer`

` entries than possible customer_id`

`*/`

`DECLARE @ROWCOUNT INT = @CUSTOMER_COUNT - 100 ;`

`DECLARE @DELIMITER CHAR(1) = CHAR(32);`

`/* "customer details" */`

`IF OBJECT_ID(N'dbo.TBL_SAMPLE_STRING') IS NOT NULL DROP TABLE dbo.TBL_SAMPLE_STRING;`

`CREATE TABLE dbo.TBL_SAMPLE_STRING`

`(`

` SST_ID INT NOT NULL CONSTRAINT PK_DBO_TBL_SAMPLE_STRING_SST_ID PRIMARY KEY CLUSTERED`

` ,SST_VALUE VARCHAR(500) NOT NULL`

`);`

`/* Create "Customer Details" */`

`INSERT INTO dbo.TBL_SAMPLE_STRING (SST_ID, SST_VALUE)`

`SELECT`

` X.R`

` ,X.RND_TXT`

`FROM dbo.ITVFN_DO_SHAKESPEARE(@BASE_LENGTH,@BASE_VARIANCE,@WORD_LENGTH,@WORD_VARIANCE,@ROWCOUNT,@DELIMITER) AS X;`

`/* Inline Tally Table`

` 20^7 = 1,280,000,000 Max`

`*/`

`;WITH T(N) AS (SELECT N FROM (VALUES (NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL)`

` ,(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL)) AS X(N))`

`, NUMS(N) AS (SELECT TOP(@SAMPLE_SIZE) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N FROM T T1,T T2,T T3,T T4,T T5,T T6,T T7)`

`,SAMPLE_DATA AS`

`(`

` SELECT`

` NM.N AS TRAN_ID`

` ,DATEADD(DAY,CHECKSUM(NEWID()) % @DATE_RANGE,@BASE_DATE) AS TRAN_DATE`

` ,(ABS(CHECKSUM(NEWID())) % @OUTLET_COUNT) + 1 AS OUTLET_ID`

` ,(ABS(CHECKSUM(NEWID())) % @GROUP_COUNT) + 1 AS GROUP_ID`

` ,(ABS(CHECKSUM(NEWID())) % @CUSTOMER_COUNT) + 1 AS CUSTOMER_ID`

` ,(ABS(CHECKSUM(NEWID())) % @AVG_PER_GROUP) + 1 AS DETAIL_ID`

` ,CHAR(65 + ( ABS(CHECKSUM(NEWID())) % 26 ))`

` --+ CHAR(65 + ( ABS(CHECKSUM(NEWID())) % 26 ))`

` + CHAR(45) + CONVERT(VARCHAR(6),ABS(CHECKSUM(NEWID())) % 10,0)`

` + CHAR(58) + CONVERT(VARCHAR(6),ABS(CHECKSUM(NEWID())) % 100,0) AS PROD_NO`

` ,CONVERT(NUMERIC(12,2),SQRT(ABS(CHECKSUM(NEWID())) + 2),0) AS TOTAL_AMOUNT`

` ,(ABS(CHECKSUM(NEWID())) % @MAX_PAY_DAYS) + 1 AS PAY_BY_DAYS`

` ,(ABS(CHECKSUM(NEWID())) % @ACT_PAY_DAYS) + 1 AS ACT_PAY_DAYS`

` ,(ABS(CHECKSUM(NEWID())) % @MAX_ITEMS) + 1 AS ITEM_COUNT`

` --,ASCII(':')`

` FROM NUMS NM`

`)`

`SELECT`

` ISNULL(SD.TRAN_ID,1) AS TRAN_ID`

` ,ISNULL(SD.TRAN_DATE ,@BASE_DATE) AS TRAN_DATE`

` ,ISNULL((DATEDIFF(DAY,@ZERO_DATE,SD.TRAN_DATE) % 7) + 1 ,0) AS WEEK_DAY`

` ,ISNULL(DATEADD(DAY,SD.PAY_BY_DAYS,SD.TRAN_DATE) ,@BASE_DATE) AS PAY_BY_DATE`

` ,ISNULL(DATEADD(DAY,SD.ACT_PAY_DAYS,SD.TRAN_DATE),@BASE_DATE) AS ACT_PAY_DATE`

` ,ISNULL(DATEADD(DAY`

` ,FLOOR((SD.PAY_BY_DAYS + SD.ACT_PAY_DAYS) / 2)`

` ,SD.TRAN_DATE),@BASE_DATE) AS DELIVERY_DATE`

` ,CHAR(65 + ( SD.OUTLET_ID % 26 ))`

` + CHAR(65 + ( SD.OUTLET_ID % 20 )) AS LOCATION_CODE`

` ,ISNULL(CHAR(65 + ( SD.ACT_PAY_DAYS % 26 ))`

` + CHAR(65 + ( SD.ITEM_COUNT % 20 ))`

` + RIGHT(CONVERT(VARCHAR(8),1000000`

` + (SD.ACT_PAY_DAYS * SD.ITEM_COUNT),0),6),'ZZ999999') AS EMP_ID`

` ,ISNULL(SD.OUTLET_ID ,1) AS OUTLET_ID`

` ,ISNULL(CONVERT(`

` TINYINT,1 - SIGN(3 - (SD.OUTLET_ID & 0x03)),0) ,1) AS IS_ONLINE`

` ,ISNULL(CONVERT(`

` TINYINT,1 - SIGN(7 - (SD.OUTLET_ID & 0x07)),0) ,1) AS IS_PICKUP`

` ,NULLIF(CHAR((68 +`

` (CHECKSUM(SD.PROD_NO,SD.TOTAL_AMOUNT,SD.ACT_PAY_DAYS) % 4))`

` * (1 - SIGN(3 - (SD.OUTLET_ID & 0x03)))),'') AS ONLCSR`

` ,NULLIF(CHAR((68 +`

` (CHECKSUM(SD.PROD_NO,SD.TOTAL_AMOUNT,SD.ACT_PAY_DAYS) % 4))`

` * (SIGN(3 - (SD.OUTLET_ID & 0x03)))`

` * SIGN(CHECKSUM(CD.SST_VALUE))),'') AS OFFLCSR`

` ,ISNULL(SD.CUSTOMER_ID ,1) AS CUSTOMER_ID`

` ,ISNULL(SD.GROUP_ID ,1) AS GROUP_ID`

` ,ISNULL(SD.DETAIL_ID ,1) AS DETAIL_ID`

` ,ISNULL(SD.PROD_NO,'ZZ-9:99') AS PROD_NO`

` ,ISNULL(SD.TOTAL_AMOUNT,99.99) AS TOTAL_AMOUNT`

` ,ISNULL(SD.ITEM_COUNT,1) AS ITEM_COUNT`

` ,ISNULL(CONVERT(NUMERIC(9,2),(0.1 * SD.TOTAL_AMOUNT),0),0) AS TAX_AMOUNT`

` ,ISNULL(CONVERT(NUMERIC(9,2)`

` ,(0.9 * SD.TOTAL_AMOUNT / ITEM_COUNT),0),0) AS UNIT_PRICE`

` ,CD.SST_VALUE AS CUSTOMER_DETAIL`

`INTO dbo.TBL_SAMPLE_TRANSACTION`

`FROM SAMPLE_DATA SD`

`LEFT OUTER JOIN dbo.TBL_SAMPLE_STRING CD`

`ON SD.CUSTOMER_ID = CD.SST_ID;`

`/* Jeff Moden's ISNULL trick for not null columns doesn't work on all`

` data types so let's put some constraints in place`

`*/`

`ALTER TABLE dbo.TBL_SAMPLE_TRANSACTION ADD CONSTRAINT PK_DBO_SAMPLE_TRANSACTION_TRAN_ID PRIMARY KEY CLUSTERED (TRAN_ID ASC);`

`ALTER TABLE dbo.TBL_SAMPLE_TRANSACTION ALTER COLUMN TRAN_DATE DATE NOT NULL;`

`ALTER TABLE dbo.TBL_SAMPLE_TRANSACTION ALTER COLUMN PAY_BY_DATE DATE NOT NULL;`

`ALTER TABLE dbo.TBL_SAMPLE_TRANSACTION ALTER COLUMN ACT_PAY_DATE DATE NOT NULL;`

`ALTER TABLE dbo.TBL_SAMPLE_TRANSACTION ALTER COLUMN DELIVERY_DATE DATE NOT NULL;`

`ALTER TABLE dbo.TBL_SAMPLE_TRANSACTION ALTER COLUMN LOCATION_CODE CHAR(2) NOT NULL;`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_WEEK_DAY_FLTR_1_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(WEEK_DAY ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID)`

`WHERE WEEK_DAY = 1;`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_FLTR_WEEK_DAY_IN_6_7_IS_ONLINE_1_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(WEEK_DAY ASC, IS_ONLINE ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID)`

`WHERE WEEK_DAY = 1 AND IS_ONLINE IN ( 6 , 7 );`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_FLTR_WEEK_DAY_IN_6_7_IS_ONLINE_1_IS_PICKUP_1_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(WEEK_DAY ASC, IS_ONLINE ASC, IS_PICKUP ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID)`

`WHERE WEEK_DAY = 1 AND IS_ONLINE IN ( 6 , 7 ) AND IS_PICKUP = 1;`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_FLTR_WEEK_DAY_IN_6_7_IS_ONLINE_1_IS_PICKUP_1_ONLCSR_A_B_C_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(WEEK_DAY ASC, IS_ONLINE ASC, IS_PICKUP ASC, ONLCSR ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID)`

`WHERE WEEK_DAY = 1 AND IS_ONLINE IN ( 6 , 7 ) AND IS_PICKUP = 1 AND ONLCSR IN ( 'A' , 'B' , 'C' );`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_TRAN_DATE_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(TRAN_DATE ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID);`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_TRAN_DATE_FLTR_ONLCSR_A_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(TRAN_DATE ASC, ONLCSR ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID)`

`WHERE ONLCSR = 'A';`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_WEEK_DAY_IS_ONLINE_IS_PICKUP_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(WEEK_DAY ASC, IS_ONLINE ASC, IS_PICKUP ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID);`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_TRAN_DATE_IS_ONLINE_IS_PICKUP_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(TRAN_DATE ASC, IS_ONLINE ASC, IS_PICKUP ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID);`

`CREATE NONCLUSTERED INDEX NCLIDX_DBO_SAMPLE_TRANSACTION_TRAN_DATE_ONLCSR_INCL_TRAN_ID_CUSTOMER_ID ON dbo.TBL_SAMPLE_TRANSACTION`

`(TRAN_DATE ASC, ONLCSR ASC)`

`INCLUDE (TRAN_ID,CUSTOMER_ID)`

`WHERE ONLCSR IS NOT NULL;`
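Before timing anything, it's worth confirming the filtered indexes actually get picked up. A minimal sanity check (my sketch, not part of the original post) is to run one branch's predicate on its own with I/O statistics on and verify in the actual execution plan that the seek operator names the matching filtered index:

```sql
-- Sketch: sanity-check that the weekend/online filtered index is used.
-- Run with "Include Actual Execution Plan" enabled and check the index
-- name on the seek/scan operator.
USE tempdb;
GO
SET STATISTICS IO, TIME ON;

SELECT ST.TRAN_ID, ST.CUSTOMER_ID
FROM dbo.TBL_SAMPLE_TRANSACTION ST
WHERE ST.WEEK_DAY IN ( 6 , 7 )
  AND ST.IS_ONLINE = 1;  -- predicate matches the filtered index definition

SET STATISTICS IO, TIME OFF;
```

If the plan shows a scan of the clustered index instead, the filter predicate and the query predicate don't line up exactly, and the comparison below is measuring the wrong thing.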

Static search procedure

`USE tempdb;`

`GO`

`/**********************************************************`

` Static Multi Parameter Search Procedure`

` The dbo.TBL_SAMPLE_TRANSACTION table has filtered indexes`

` that satisfy all the @pUseCondition options.`

`**********************************************************/`

`CREATE OR ALTER PROCEDURE dbo.USP_STATIC_MULTI_SEARCH`

`(`

` @pUseCondition INT = 0`

` ,@TRAN_DATE DATE = NULL`

`)`

`AS`

`DECLARE @BIGINT_BUCKET BIGINT = 0;`

`DECLARE @INT_BUCKET INT = 0;`

`SELECT`

` @BIGINT_BUCKET = ST.TRAN_ID`

` ,@INT_BUCKET = ST.CUSTOMER_ID`

`FROM dbo.TBL_SAMPLE_TRANSACTION ST`

`WHERE ( 0 = @pUseCondition )`

`OR ( 1 = @pUseCondition AND ST.WEEK_DAY = 1 )`

`OR ( 2 = @pUseCondition AND ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 )`

`OR ( 3 = @pUseCondition AND ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 AND ST.IS_PICKUP = 1 )`

`OR ( 4 = @pUseCondition AND ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 AND ST.IS_PICKUP = 1 AND ST.ONLCSR IN ( 'A' , 'B' , 'C' ) )`

`OR ( 5 = @pUseCondition AND ST.TRAN_DATE = @TRAN_DATE )`

`OR ( 6 = @pUseCondition AND ST.TRAN_DATE >= @TRAN_DATE AND ST.ONLCSR = 'A' )`

`;`

`GO`
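The static procedure above compiles one plan that must stay valid for every `@pUseCondition` value, which is exactly why it can't seek the narrow filtered indexes. A commonly used middle ground, sketched below as an assumption (it is not part of the post, and `USP_RECOMPILE_MULTI_SEARCH` is a name I've made up), keeps the catch-all shape but adds `OPTION (RECOMPILE)`, so the optimizer sees the parameter values at each execution and prunes the dead `OR` branches:

```sql
-- Sketch: same catch-all predicate as USP_STATIC_MULTI_SEARCH, but
-- OPTION (RECOMPILE) lets the optimizer treat @pUseCondition as a known
-- constant per call and eliminate the branches that can't be true.
CREATE OR ALTER PROCEDURE dbo.USP_RECOMPILE_MULTI_SEARCH
(
     @pUseCondition INT  = 0
    ,@TRAN_DATE     DATE = NULL
)
AS
DECLARE @BIGINT_BUCKET BIGINT = 0;
DECLARE @INT_BUCKET    INT    = 0;
SELECT
     @BIGINT_BUCKET = ST.TRAN_ID
    ,@INT_BUCKET    = ST.CUSTOMER_ID
FROM dbo.TBL_SAMPLE_TRANSACTION ST
WHERE ( 0 = @pUseCondition )
   OR ( 1 = @pUseCondition AND ST.WEEK_DAY = 1 )
   OR ( 2 = @pUseCondition AND ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 )
   OR ( 3 = @pUseCondition AND ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 AND ST.IS_PICKUP = 1 )
   OR ( 4 = @pUseCondition AND ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 AND ST.IS_PICKUP = 1 AND ST.ONLCSR IN ( 'A' , 'B' , 'C' ) )
   OR ( 5 = @pUseCondition AND ST.TRAN_DATE = @TRAN_DATE )
   OR ( 6 = @pUseCondition AND ST.TRAN_DATE >= @TRAN_DATE AND ST.ONLCSR = 'A' )
OPTION ( RECOMPILE );
GO
```

The trade-off is a full compile on every call; the dynamic SQL version below pays the compile once per branch and then reuses a cached plan for each distinct statement.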

Dynamic SQL search procedure

`USE tempdb;`

`GO`

`/**********************************************************`

` Dynamic SQL Multi Parameter Search Procedure`

` The dbo.TBL_SAMPLE_TRANSACTION table has filtered indexes`

` that satisfy all the @pUseCondition options.`

`**********************************************************/`

`CREATE OR ALTER PROCEDURE dbo.USP_DYNAMIC_SQL_MULTI_SEARCH`

`(`

` @pUseCondition INT = 0`

` ,@TRAN_DATE DATE = NULL`

`)`

`AS`

`DECLARE @NL NCHAR(2) = NCHAR(13) + NCHAR(10);`

`DECLARE @PARAM_STR NVARCHAR(15) = N'@TRAN_DATE DATE';`

`DECLARE @SQL_STR NVARCHAR(MAX) = N'`

`DECLARE @BIGINT_BUCKET BIGINT = 0;`

`DECLARE @INT_BUCKET INT = 0;`

`SELECT`

` @BIGINT_BUCKET = ST.TRAN_ID`

` ,@INT_BUCKET = ST.CUSTOMER_ID`

`FROM dbo.TBL_SAMPLE_TRANSACTION ST' + @NL +`

`CASE`

` WHEN @pUseCondition = 1 THEN N'WHERE ST.WEEK_DAY = 1'`

` WHEN @pUseCondition = 2 THEN N'WHERE ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1'`

` WHEN @pUseCondition = 3 THEN N'WHERE ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 AND ST.IS_PICKUP = 1'`

` WHEN @pUseCondition = 4 THEN N'WHERE ST.WEEK_DAY IN ( 6 , 7 ) AND ST.IS_ONLINE = 1 AND ST.IS_PICKUP = 1 AND ST.ONLCSR IN ( ''A'' , ''B'' , ''C'' )'`

` WHEN @pUseCondition = 5 THEN N'WHERE ST.TRAN_DATE = @TRAN_DATE'`

` WHEN @pUseCondition = 6 THEN N'WHERE ST.TRAN_DATE >= @TRAN_DATE AND ST.ONLCSR = ''A'''`

` ELSE N''`

`END + @NL + N';';`

`EXEC sys.sp_executesql @SQL_STR, @PARAM_STR, @TRAN_DATE;`

`GO`

Test harness

`USE tempdb;`

`GO`

`SET NOCOUNT ON;`

`DECLARE @TRAN_DATE DATE = CONVERT(DATE,'2015-05-01',126);`

`DECLARE @timer TABLE (T_TEXT VARCHAR(100) NOT NULL, T_TS DATETIME2(7) NOT NULL DEFAULT(SYSDATETIME()));`

`DECLARE @BIGINT_BUCKET BIGINT = 0;`

`DECLARE @INT_BUCKET INT = 0;`

`INSERT INTO @timer(T_TEXT) VALUES('Dry run');`

`SELECT`

` @BIGINT_BUCKET = ST.TRAN_ID`

` ,@INT_BUCKET = ST.CUSTOMER_ID`

`FROM dbo.TBL_SAMPLE_TRANSACTION ST;`

`INSERT INTO @timer(T_TEXT) VALUES('Dry run');`

`INSERT INTO @timer(T_TEXT) VALUES('Static 0');`

`EXEC dbo.USP_STATIC_MULTI_SEARCH 0, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Static 0');`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 0');`

`EXEC dbo.USP_DYNAMIC_SQL_MULTI_SEARCH 0, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 0');`

`INSERT INTO @timer(T_TEXT) VALUES('Static 1');`

`EXEC dbo.USP_STATIC_MULTI_SEARCH 1, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Static 1');`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 1');`

`EXEC dbo.USP_DYNAMIC_SQL_MULTI_SEARCH 1, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 1');`

`INSERT INTO @timer(T_TEXT) VALUES('Static 2');`

`EXEC dbo.USP_STATIC_MULTI_SEARCH 2, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Static 2');`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 2');`

`EXEC dbo.USP_DYNAMIC_SQL_MULTI_SEARCH 2, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 2');`

`INSERT INTO @timer(T_TEXT) VALUES('Static 3');`

`EXEC dbo.USP_STATIC_MULTI_SEARCH 3, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Static 3');`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 3');`

`EXEC dbo.USP_DYNAMIC_SQL_MULTI_SEARCH 3, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 3');`

`INSERT INTO @timer(T_TEXT) VALUES('Static 4');`

`EXEC dbo.USP_STATIC_MULTI_SEARCH 4, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Static 4');`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 4');`

`EXEC dbo.USP_DYNAMIC_SQL_MULTI_SEARCH 4, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 4');`

`INSERT INTO @timer(T_TEXT) VALUES('Static 5');`

`EXEC dbo.USP_STATIC_MULTI_SEARCH 5, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Static 5');`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 5');`

`EXEC dbo.USP_DYNAMIC_SQL_MULTI_SEARCH 5, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 5');`

`INSERT INTO @timer(T_TEXT) VALUES('Static 6');`

`EXEC dbo.USP_STATIC_MULTI_SEARCH 6, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Static 6');`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 6');`

`EXEC dbo.USP_DYNAMIC_SQL_MULTI_SEARCH 6, @TRAN_DATE;`

`INSERT INTO @timer(T_TEXT) VALUES('Dynamic 6');`

`SELECT`

` T.T_TEXT`

` ,DATEDIFF(MICROSECOND,MIN(T.T_TS),MAX(T.T_TS)) AS DURATION`

`FROM @timer T`

`GROUP BY T.T_TEXT`

`ORDER BY DURATION;`

Results

`T_TEXT DURATION`

`------------- ---------`

`Dynamic 5 2000`

`Dynamic 6 3000`

`Dynamic 1 23002`

`Dynamic 2 97006`

`Dynamic 3 100005`

`Dynamic 4 102006`

`Dynamic 0 143008`

`Dry run 151008`

`Static 1 286016`

`Static 5 304018`

`Static 2 324018`

`Static 6 325018`

`Static 3 328019`

`Static 4 348020`

`Static 0 640037`
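One way to see why the dynamic version wins (a sketch of mine, not from the post) is to look in the plan cache after running the harness: each generated statement should appear as its own entry with its own plan, while the static procedure shows a single entry reused for every `@pUseCondition`. This assumes VIEW SERVER STATE permission:

```sql
-- Sketch: list cached statements touching the test table, one row per
-- distinct SQL text. Each dynamic SQL branch should show up separately.
SELECT
     qs.execution_count
    ,qs.total_worker_time
    ,st.text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
WHERE st.text LIKE N'%TBL_SAMPLE_TRANSACTION%'
ORDER BY qs.creation_time;
```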
