January 6, 2009 at 8:01 am
peter (1/6/2009)
First impression... a lot of code (and a lot of work). I do not have time to read the lengthy article in full, but as the OP rightly articulated, the problem is well known and has a myriad of (often half) solutions. I made one for myself not too long ago that supports some extra twists, does not look as complicated, and performs well for the datasets it was meant for.
I will make a proper contributing post and share the code tomorrow, right now I have a deadline to meet, so stay tuned!
I know this article is quite a mammoth :). I tried to trim the article as much as possible, but there is just too much information to cover. Thank you for taking the time to look over the article.
I look forward to reading your post.
Thanks,
Adam
January 6, 2009 at 8:15 am
"Row_Number makes pagination a breeze..."
Um.
With all due respect to the author (and I truly mean that; it's a great article), was this statement a joke? I wouldn't exactly call this a "breeze", but maybe that's because I'm nowhere near the expert that this guy is.
January 6, 2009 at 8:27 am
I have a question about the dynamic SQL block of code you have included in your article.
So the way you have constructed your dynamic SQL, you took @FirstName and @LastName, converted them to their actual values, and included them in your @sql variable.
Shouldn't you rather have used @FirstName and @LastName as parameters in @sql so that the query execution plan will be reused for all combinations of @FirstName and @LastName? Otherwise, each time a new value is passed for @FirstName and @LastName, a new execution plan will be generated.
What I mean is this..
Instead of :
Set @sql = 'select....where FirstName=' + QuoteName(@FirstName,'''') + ' and LastName=' + QuoteName(@LastName,'''')
Use:
Set @sql = 'Select ... Where FirstName=@FirstName And LastName=@LastName'
and pass the parameters for @FirstName and @LastName
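For readers following the thread, the pattern being described can be sketched as below. The Contacts table and column names are assumptions borrowed from the article's examples; this is a sketch of the technique, not the article's actual code.

```sql
-- Parameterized dynamic SQL: @FirstName/@LastName stay real parameters,
-- so one cached plan can be reused for any combination of values.
DECLARE @sql NVARCHAR(4000)

SET @sql = N'SELECT ContactId, FirstName, LastName
             FROM Contacts
             WHERE FirstName = @FirstName AND LastName = @LastName'

EXEC sp_executesql
    @sql,
    N'@FirstName VARCHAR(25), @LastName VARCHAR(25)',  -- parameter declarations
    @FirstName = 'John',
    @LastName  = 'Smith'
```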
January 6, 2009 at 8:29 am
CREATE PROCEDURE [dbo].[Sp_Customer_Paging]
@PageSize INT = 50,
@PageNumber INT = 1
AS
DECLARE
@sql VARCHAR(4000),
@FirstRow INT,
@LastRow INT

SELECT @FirstRow = (@PageNumber - 1) * @PageSize + 1,
       @LastRow  = (@PageNumber - 1) * @PageSize + @PageSize;

SET @sql = 'WITH Members AS
( SELECT Cust_No, CardNo, CustType,
         ROW_NUMBER() OVER (ORDER BY vw_customers.Cust_No) AS RowNumber
  FROM vw_customers WHERE 1=1 ),
ListRecordCount AS
( SELECT Cust_No, CardNo, CustType, RowNumber,
         (SELECT MAX(RowNumber) FROM Members) AS RCount
  FROM Members )
SELECT Cust_No, CardNo, CustType, RowNumber, RCount
FROM ListRecordCount
WHERE RowNumber BETWEEN ' + CAST(@FirstRow AS VARCHAR(10)) + ' AND ' + CAST(@LastRow AS VARCHAR(10)) + '
ORDER BY RowNumber ASC'

EXEC(@sql)
-- Sample call: EXEC Sp_Customer_Paging 100, 1
-- Returns the first 100 rows with RCount; you can derive the number of pages from RCount/100.
-- RCount = 500 means you have 5 pages; EXEC Sp_Customer_Paging 100, 3 returns rows 201 through 300, etc.
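The row-window arithmetic in the proc above can be checked in isolation; the batch below is just the same formulas pulled out with sample values (a standalone sketch, not part of the original proc):

```sql
DECLARE @PageSize INT, @PageNumber INT, @RCount INT
SELECT @PageSize = 100, @PageNumber = 3, @RCount = 500

SELECT (@PageNumber - 1) * @PageSize + 1          AS FirstRow,  -- 201
       (@PageNumber - 1) * @PageSize + @PageSize  AS LastRow,   -- 300
       CEILING(@RCount / (@PageSize * 1.0))       AS NbrPages   -- 500 rows / 100 per page = 5
```

The `* 1.0` forces decimal division so CEILING rounds a partial final page up instead of integer division silently truncating it.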
January 6, 2009 at 8:40 am
I prefer keeping the notion of paging in the client space. In my opinion it is a cleaner API to use something like System.IO.StreamReader and provide input params defining the start row and number of rows; treat the db basically like a stream or buffer. You won't find an API in the platform that has inputs numItemsPerPage, numPages.
January 6, 2009 at 8:40 am
rakhi.bhatia (1/6/2009)
I have a question about the dynamic SQL block of code you have included in your article. So the way you have constructed your dynamic SQL, you took @FirstName and @LastName, converted them to their actual values, and included them in your @sql variable.
Shouldn't you rather have used @FirstName and @LastName as parameters in @sql so that the query execution plan will be reused for all combinations of @FirstName and @LastName? Otherwise, each time a new value is passed for @FirstName and @LastName, a new execution plan will be generated.
What I mean is this..
Instead of :
Set @sql = 'select....where FirstName=' + QuoteName(@FirstName,'''') + ' and LastName=' + QuoteName(@LastName,'''')
Use:
Set @sql = 'Select ... Where FirstName=@FirstName And LastName=@LastName'
and pass the parameters for @FirstName and @LastName
The short answer is yes. But the long answer is that you end up making the code very complex. I originally created a version of the code that did just this; however, the problem is you have to use control-flow logic around the sp_executesql command. You have to do this because you need to create a parameter to be used in scope with the dynamic SQL, and you cannot declare an sp_executesql parameter unless it is used in the command text. The article was quite long already and I did not want to introduce an intimidating example; however, because you asked, I will display the code here.
I would also like to point out that I didn't see any huge performance gains in my tests, but that doesn't mean there are none. The benefit here is query plan reuse.
Below is a sample of what the code will look like:
ALTER PROCEDURE usp_ContactPaging
(
@SortCol VARCHAR(25)='ContactId ASC',
@FirstName VARCHAR(25)=NULL,
@LastName VARCHAR(25)=NULL,
@pgSize INT=25, @pgNbr INT=1
)
AS
BEGIN
DECLARE @sql nvarchar(max),
@predicate nvarchar(500),
@params nvarchar(500),
@NbrPages INT
IF @FirstName IS NULL AND @LastName IS NULL
BEGIN
SET @predicate = N''
SET @params = N'@size int, @nbr int, @Pages INT, @sort VARCHAR(25)'
SELECT @NbrPages =
CEILING(count(*)/(@pgSize*1.0))
FROM Contacts
END
IF @FirstName IS NOT NULL AND @LastName IS NULL
BEGIN
SET @predicate = N'WHERE FirstName LIKE @FN + ' + N'''%'''
SET @params = N'@FN VARCHAR(25), @size int, @nbr int, @Pages INT, @sort VARCHAR(25)'
SELECT @NbrPages =
CEILING(count(*)/(@pgSize*1.0))
FROM Contacts
WHERE [FirstName] LIKE @FirstName + '%'
END
IF @FirstName IS NULL AND @LastName IS NOT NULL
BEGIN
SET @predicate = N'WHERE LastName LIKE @LN + ' + N'''%'''
SET @params = N'@LN VARCHAR(25), @size int, @nbr int, @Pages INT, @sort VARCHAR(25)'
SELECT @NbrPages =
CEILING(count(*)/(@pgSize*1.0))
FROM Contacts
WHERE [LastName] LIKE @LastName + '%'
END
IF @FirstName IS NOT NULL AND @LastName IS NOT NULL
BEGIN
SET @predicate = N'WHERE FirstName LIKE @FN + ' + N'''%''' + N' AND ' +
N'LastName LIKE @LN + ' + N'''%'''
SET @params = N'@FN VARCHAR(25), @LN VARCHAR(25), @size int, @nbr int, @Pages INT, @sort VARCHAR(25)'
SELECT @NbrPages =
CEILING(count(*)/(@pgSize*1.0))
FROM Contacts
WHERE [FirstName] LIKE @FirstName + '%' AND
[LastName] LIKE @LastName + '%'
END
SET @sql = N'
;WITH PagingCTE (Row_ID,ContactId,FirstName,LastName)
AS
(
SELECT
ROW_NUMBER()
OVER(ORDER BY
CASE WHEN @sort=''FirstName DESC'' THEN FirstName END DESC,
CASE WHEN @sort=''FirstName ASC'' THEN FirstName END ASC,
CASE WHEN @sort=''LastName ASC'' THEN LastName END ASC,
CASE WHEN @sort=''LastName DESC'' THEN LastName END DESC,
CASE WHEN @sort=''ContactID ASC'' THEN ContactId END ASC,
CASE WHEN @sort=''ContactID DESC'' THEN ContactId END DESC
) AS [Row_ID],
ContactId,
FirstName,
LastName
FROM Contacts' + CHAR(13) +
@predicate + '
)
SELECT
Row_ID,
ContactId,
FirstName,
LastName,
@Pages AS NbrPages
FROM PagingCTE
WHERE Row_ID >= (@size * @nbr) - (@size -1) AND
Row_ID <= @size * @nbr
'
--PRINT @sql
IF @FirstName IS NULL AND @LastName IS NULL
BEGIN
EXEC sp_executesql
@sql,
@params,
@size = @pgSize,
@nbr = @pgNbr,
@Pages = @NbrPages,
@sort = @SortCol
END
IF @FirstName IS NOT NULL AND @LastName IS NULL
BEGIN
EXEC sp_executesql
@sql,
@params,
@FN = @FirstName,
@size = @pgSize,
@nbr = @pgNbr,
@Pages = @NbrPages,
@sort = @SortCol
END
IF @FirstName IS NULL AND @LastName IS NOT NULL
BEGIN
EXEC sp_executesql
@sql,
@params,
@LN = @LastName,
@size = @pgSize,
@nbr = @pgNbr,
@Pages = @NbrPages,
@sort = @SortCol
END
IF @FirstName IS NOT NULL AND @LastName IS NOT NULL
BEGIN
EXEC sp_executesql
@sql,
@params,
@FN = @FirstName,
@LN = @LastName,
@size = @pgSize,
@nbr = @pgNbr,
@Pages = @NbrPages,
@sort = @SortCol
END
END
GO
January 6, 2009 at 8:43 am
Thanks Steve.
January 6, 2009 at 8:45 am
Adam, why don't you use an ORDER BY in the final select (... ORDER BY Row_ID)?
Otherwise there is no guarantee that the records are always sorted correctly. What do you think?
January 6, 2009 at 8:54 am
alessad (1/6/2009)
Adam, why don't you use an ORDER BY in the final select (... ORDER BY Row_ID)? Otherwise there is no guarantee that the records are always sorted correctly. What do you think?
I think I was waiting for this to arise :). Yes, I agree an ORDER BY clause is necessary to guarantee ordering. I missed this in my code post and didn't realize it until after I submitted. The good news is that ROW_NUMBER in a CTE sorts the data before it is returned, the sort being controlled by the ORDER BY clause of the ROW_NUMBER function. While the ordering will usually come back correct for that reason, it is not guaranteed, so I would still recommend that an ORDER BY clause be used. You can add an ORDER BY on the sequence (Row_ID) column to guarantee correctness.
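Concretely, the fix under discussion is one extra line on the final SELECT of the paging query (column and parameter names as in the code posted earlier in this thread):

```sql
SELECT Row_ID, ContactId, FirstName, LastName
FROM PagingCTE
WHERE Row_ID >= (@size * @nbr) - (@size - 1)
  AND Row_ID <= @size * @nbr
ORDER BY Row_ID  -- guarantees the page is returned in sequence order
```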
January 6, 2009 at 9:16 am
This is another example of great free content delivered on SQLServerCentral. Thank you, Mr. Hines.
My current position is at a large manufacturing firm with a web front end. Currently our data tier has some pretty high contention, so I try to push off some work to the app tier. This approach has worked very well for me. I would love to have you guys shoot some holes in it (I will fire back).
1. If I can support all permutations of a given search, i.e. fewer than 24 procs (4 search inputs), tune/create an individual proc for each search.
2. The "search" proc returns back only a surrogate key in sorted order, and that is persisted by the app tier.
3. The app tier then calls individual procs that are tuned for a known set of page sizes (i.e. usp_GetData25, usp_GetData50, usp_GetData100); the input is only the surrogate key.
Pros:
*Efficient in the data tier. I can get an excellent plan for every search.
*Very minimal data to persist for the session as I am only sending back surrogate keys
*App only fetches data that is used, so the physical paging is very efficient
Cons:
*Too many procs
*Limited use, as many screens have more than 4 searchable fields. In that case I go to what the OP has presented. I still have the web/app tier try to add some value to the equation by moving the code forks into their code (i.e. I never like one big proc).
Excited to get any feedback!
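A rough sketch of steps 2 and 3 above; the proc, table, and column names are illustrative guesses rather than the poster's actual code, and dbo.fn_SplitInts is a hypothetical helper that splits a delimited string into an integer table (a common 2005-era workaround, since table-valued parameters did not exist yet):

```sql
-- Step 2: the search proc returns only surrogate keys, in sorted order.
CREATE PROCEDURE usp_SearchContacts
    @LastName VARCHAR(25)
AS
    SELECT ContactId                 -- surrogate key only; the app tier persists this list
    FROM Contacts
    WHERE LastName LIKE @LastName + '%'
    ORDER BY LastName, ContactId
GO

-- Step 3: a fetch proc tuned for one page size takes a slice of those keys
-- (up to 25 here) and returns the full rows for just that page.
CREATE PROCEDURE usp_GetData25
    @Keys VARCHAR(4000)              -- delimited key list for one page
AS
    SELECT c.ContactId, c.FirstName, c.LastName
    FROM Contacts AS c
    JOIN dbo.fn_SplitInts(@Keys) AS k   -- hypothetical string-splitting helper
        ON k.Value = c.ContactId
GO
```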
January 6, 2009 at 9:29 am
2 points to add:
1) If you are going with dynamic sql, then go completely with it. Constructs such as this:
CASE WHEN @sort=''FirstName DESC'' THEN FirstName END DESC,
CASE WHEN @sort=''FirstName ASC'' THEN FirstName END ASC,
CASE WHEN @sort=''LastName ASC'' THEN LastName END ASC,
CASE WHEN @sort=''LastName DESC'' THEN LastName END DESC,
CASE WHEN @sort=''ContactID ASC'' THEN ContactId END ASC,
CASE WHEN @sort=''ContactID DESC'' THEN ContactId END DESC
can be boiled down to the explicit single statement based on the ACTUAL input value of @sort.
2) More importantly, consider situations where putting all of the output data into the CTE can actually hinder performance. I have seen it work much more efficiently (especially with larger numbers of columns/data types) to simply output the necessary key values for each table involved and then select the fields from those tables, hitting the tables a second time on what will be very efficient index seeks/bookmark lookups.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
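A sketch of the keys-only variant described in point 2 above, against the Contacts example used elsewhere in this thread (the schema is an assumption): the CTE carries only the key and row number, and the wide columns are fetched by joining back to the base table.

```sql
;WITH PagingKeys AS
(
    SELECT ContactId,
           ROW_NUMBER() OVER (ORDER BY LastName, ContactId) AS Row_ID
    FROM Contacts
)
SELECT c.ContactId, c.FirstName, c.LastName, k.Row_ID
FROM PagingKeys AS k
JOIN Contacts AS c
    ON c.ContactId = k.ContactId   -- per-page-row seeks on the key
WHERE k.Row_ID BETWEEN (@size * @nbr) - (@size - 1) AND (@size * @nbr)
ORDER BY k.Row_ID
```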
January 6, 2009 at 9:33 am
Your point #2 is always where I get my best performance. Even if I persist the output to a physical file on the web server. I have not found a way to beat this in Enterprise level searches. Would love to find a way to not persist physically.
January 6, 2009 at 9:48 am
e.linthorst (1/6/2009)
Created this one a while back and still use it. You might want to give it a try. It ain't as fancy as yours, but it will save you a lot of maintenance.
Since the user most of the time dictates the sort order and the where clause, I prefer passing these in from the outside.
You can simply leave them blank where you don't need them.
CREATE procedure [dbo].[proc_GetSortedPage]
(
@TableName VARCHAR(500),
@SortClause VARCHAR(4000),
@WhereClause VARCHAR(4000),
@Pagesize int,
@Pageindex int
)
as
-- @TableName: name of table or view
-- @SortClause: the sort clause without the ORDER BY statement, but including ASC or DESC
. . .
I see where you are going (or have gone ..?!) with this. It is easy(ier) than Adam's solution in some regards.
I certainly will implement this sproc in my list of search queries, thank you.
The challenge this presents in the realm of .NET developers (I among them) having a preference for strongly-typed datasets is that this sproc does not return a strongly-typed dataset.
The overarching problem with pagination is an element I will visit in a separate reply.
Thx,
GaryN
January 6, 2009 at 10:06 am
I noted in an earlier reply on this article:
The challenge this (reply to the article) presents in the realm of .NET developers (I among them) is a preference for strongly-typed datasets, which the sproc (not the article's sproc, but a sproc in a reply) does not return.
The overarching problem with pagination is an element I will visit in a separate reply.
So, 'overarching' is perhaps a stronger word than intended.
As noted, a decent percentage of .NET developers (I among them) have a preference for strongly-typed datasets. With Microsoft's built-in GridView Paging, 2 calls to the server are needed, which requires 2 separate sprocs, yet both using the same search query. This I loathe.
I've looked, though perhaps not high/low enough, for a methodology (in this madness) to use a single sproc that does return the desired resultset along with the total count which could easily be used in a .NET GridView or DataView and using the built-in paging.
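For what it's worth, one SQL Server 2005-era pattern that returns the page and the total match count in a single resultset (so a grid can bind the rows and set its virtual row count from one call) is a windowed COUNT. This is a sketch against the Contacts example from the article, not a tested GridView integration:

```sql
;WITH PagingCTE AS
(
    SELECT ContactId, FirstName, LastName,
           ROW_NUMBER() OVER (ORDER BY ContactId) AS Row_ID,
           COUNT(*) OVER ()                       AS TotalRows  -- total count, same pass
    FROM Contacts
)
SELECT ContactId, FirstName, LastName, TotalRows
FROM PagingCTE
WHERE Row_ID BETWEEN (@size * @nbr) - (@size - 1) AND (@size * @nbr)
ORDER BY Row_ID
```

Every returned row carries the same TotalRows value, which the caller reads once for its pager arithmetic.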
Yes, my reply here is a mix of SQL Server and .NET, which may be beyond the scope of this specific article and/or its readers. Yet (here comes my justification for posting my rant [er, reply]), this article (and good it is; don't think me stating otherwise) has a general focus toward potential solutions to a problem where two separate products by the same company (MS) present a duplication (or more) of effort to arrive at a solution [e.g. the left hand not knowing what the right hand is doing, or that there even is a right hand . . .].
GaryN
January 6, 2009 at 10:25 am
TheSQLGuru (1/6/2009)
2 points to add: 1) If you are going with dynamic sql, then go completely with it. Constructs such as this:
CASE WHEN @sort=''FirstName DESC'' THEN FirstName END DESC,
CASE WHEN @sort=''FirstName ASC'' THEN FirstName END ASC,
CASE WHEN @sort=''LastName ASC'' THEN LastName END ASC,
CASE WHEN @sort=''LastName DESC'' THEN LastName END DESC,
CASE WHEN @sort=''ContactID ASC'' THEN ContactId END ASC,
CASE WHEN @sort=''ContactID DESC'' THEN ContactId END DESC
can be boiled down to the explicit single statement based on the ACTUAL input value of @sort.
2) more importantly, consider situations where putting all of the output data into the CTE can actually hinder performance. I have seen it work much more efficiently (especially with larger numbers of columns/data types) to simply output the necessary key values for each table involved and then select out the fields from those tables hitting the tables a second time on what will be very efficient index seeks/bookmark lookups.
SQLGuru,
Thanks for taking the time to read through the article. I have a few responses to your statements.
1) I proposed the CASE expression in the ORDER BY for one reason: to parameterize the dynamic SQL. This method does have limitations, but it does a better job of allowing query plan reuse and makes the code a little more secure. I agree that it is easier to do this in one statement; you just have to weigh security vs. maintainability.
2) I agree with this statement. The article depicted a core method for pagination, using SQL Server 2005. There are a lot of different ways to performance tune the supplied code. It is my hope that those who read this article will take away a fundamental approach to achieving a pagination solution, which does not depend on canned application code.