
Generating a Range
 Posted Wednesday, April 05, 2006 12:27 PM
Post #271364
 Posted Wednesday, April 12, 2006 1:28 AM
Hello, personally I would use statements along the lines of the ones below (taken from an article by I. Ben-Gan on www.sqlmag.com; search Google for the words CREATE TABLE NUMS). The idea is to populate the table exponentially, doubling the number of records at each iteration. To get a specific range, populate the table with @max = (@end - @start + 1), where @start and @end denote the desired range, and then issue an update statement like

    UPDATE Nums SET n = n - 1 + @start

I wonder how this solution would compare to the others in terms of performance...

    CREATE TABLE Nums(n INT NOT NULL);

    DECLARE @max AS INT, @rc AS INT;
    SET @max = 1000; -- revise max number according to your needs
    SET @rc = 1;

    BEGIN TRAN
      INSERT INTO Nums VALUES(1);
      WHILE @rc * 2 <= @max
      BEGIN
        INSERT INTO Nums SELECT n + @rc FROM Nums;
        SET @rc = @rc * 2;
      END
      INSERT INTO Nums
        SELECT n + @rc FROM Nums WHERE n + @rc <= @max;
    COMMIT TRAN

    ALTER TABLE Nums ADD CONSTRAINT PK_Nums PRIMARY KEY(n);

Kindest Regards,
Emmanuel Nanchen
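The range shift described above can be sketched as follows; the variable values are illustrative, and this assumes Nums has just been populated with 1..@max by the doubling script:

```sql
-- Sketch: produce the range @start..@end from a freshly populated Nums table.
-- Populate Nums with 1..@max where @max = @end - @start + 1, then shift.
DECLARE @start INT, @end INT;
SET @start = 50;    -- illustrative values
SET @end = 149;
-- (run the population script with @max = @end - @start + 1 = 100 first)
UPDATE Nums SET n = n - 1 + @start;   -- Nums now holds 50..149
SELECT MIN(n) AS first_n, MAX(n) AS last_n FROM Nums;
```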
Post #272598
 Posted Wednesday, April 12, 2006 2:05 AM
I have been dealing with number ranges for a long time, and I have tried approaches similar to the ones described in this article. Sadly, the only really fast solution here is the dumbest one: create a table NUMS, fill it with as many numbers as needed, and add an index or primary key. Then use this pre-filled table in any SQL statement that requires a sequential range of numbers. This is really fast; it takes less than a second in my application.
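A minimal sketch of the static-table approach described above (table name and fill size are illustrative):

```sql
-- Create and fill the static numbers table once, up front.
CREATE TABLE Nums (n INT NOT NULL PRIMARY KEY);

DECLARE @i INT;
SET @i = 1;
WHILE @i <= 100000      -- as many numbers as the application will ever need
BEGIN
    INSERT INTO Nums VALUES (@i);
    SET @i = @i + 1;
END

-- Any query needing a sequential range then just filters the pre-filled table:
SELECT n FROM Nums WHERE n BETWEEN 5 AND 500;
```

With the primary key in place, range predicates like the BETWEEN above resolve with an index seek, which is why this approach is so fast at query time.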
Post #272603
 Posted Wednesday, April 12, 2006 4:26 AM
Hi everybody, I ran some tests of the exponential approach a while back and posted the results on SSC; it was faster than any other approach (we are talking about large result sets, not those with 1,000 records). I agree with aka's opinion that having a static table is the fastest way, but it has limitations too: you don't always know the maximum number of records in advance, and the table has to be populated somehow at the beginning. So, if you ask me, the really useful solution is a table-valued UDF that returns a recordset it populates internally using the exponential approach. I tested with UDFs I posted on SSC a few years ago: the final cross-join code completed in 4 minutes, while the UDF completed in 13 seconds. In both cases, I executed

    select count(*) from the_UDF_name (5, 567720)

Regards,
Goce.
Post #272633
 Posted Wednesday, April 12, 2006 6:32 AM
Hi, echoing the post above: the algorithm is by Itzik Ben-Gan, and it is the one I regularly use. After a brief performance test against the best-performing query in the article, I am getting average results of fnCrossJoinRange2(5, 567720): 14.5 s and fn_nums(5, 567720): 11.2 s. I have taken Itzik's algorithm and added the min and max logic.

    CREATE FUNCTION dbo.fn_nums(@min int, @max int)
    RETURNS @Nums TABLE(n int NOT NULL)
    AS
    BEGIN
      IF @max >= @min
      BEGIN
        DECLARE @factor AS int, @count AS int
        SET @factor = 1
        SET @count = @max - @min + 1  -- size of the range, not its upper bound
        INSERT INTO @Nums VALUES(@min)
        WHILE @factor * 2 <= @count
        BEGIN
          INSERT INTO @Nums SELECT n + @factor FROM @Nums
          SET @factor = @factor * 2
        END
        INSERT INTO @Nums
          SELECT n + @factor FROM @Nums WHERE n + @factor <= @max
      END
      RETURN
    END

Thanks,
Alex Weatherall
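For completeness, the function is invoked like any table-valued function; the values below are illustrative:

```sql
-- Small range, easy to eyeball: the integers 5 through 20.
SELECT n
FROM dbo.fn_nums(5, 20)
ORDER BY n;

-- Large range, as in the timing test mentioned above.
SELECT COUNT(*) FROM dbo.fn_nums(5, 567720);
```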
Post #272656
 Posted Wednesday, April 12, 2006 7:00 AM
I can't find the article by Itzik Ben-Gan; can someone please post a link? Any idea when he 'came up with' the algorithm? I ask because it is suspiciously like the one I posted here in September 2002! http://www.sqlservercentral.com/scripts/viewscript.asp?scriptid=321

Ryan Randall
Solutions are easy. Understanding the problem, now, that's the hard part.
Post #272667
 Posted Wednesday, April 12, 2006 7:22 AM
Re: "he 'came up with' the algorithm" -- Hi, those quotes are yours. I'm not trying to imply that it is *his* algorithm (even though that is what I said), but that he has promoted it as an efficient way of generating numbers on the fly. Yes, your script does perform the generation in a similar manner, but it is conceivable that you've both independently come up with the same idea.

Cheers,
Alex
Post #272676
 Posted Wednesday, April 12, 2006 7:29 AM
Here's one link: scroll down about two-thirds of the page, or search for Itzik Ben-Gan... and you know what they say about great minds...

**ASCII stupid question, get a stupid ANSI !!!**
Post #272678
 Posted Wednesday, April 12, 2006 7:42 AM
If a very specific range is required, a mixed approach can be used: the predefined NUMS table serves as the source, with one or two outer SELECTs that multiply, add to, or otherwise transform the source range.
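A sketch of the mixed approach described above (my own illustration, assuming a Nums table pre-filled with 1..N as discussed earlier in the thread):

```sql
-- Derive a custom range arithmetically from the pre-filled Nums table.
-- Example: the even numbers from 1000 to 1100, without storing them anywhere.
SELECT 998 + n * 2 AS even_n
FROM Nums
WHERE 998 + n * 2 <= 1100;    -- n = 1 yields 1000, n = 51 yields 1100
```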
Post #272686
 Posted Wednesday, April 12, 2006 8:44 AM
I disagree with David's conclusion that "However, I found that for small ranges, say under 500 records, there was little difference in performance between the procedural and set based versions. Once beyond the 500 threshold the set based version won hands down." I am working with a result set of 500 rows, but my initial value is a large number, and on my test machine the set-based approach takes 6 seconds to run while the procedural approach takes less than a second:

    SELECT * FROM [dbo].[fnCrossJoinRange2](1001000, 1001500) -- run time: 6 sec
    SELECT * FROM [dbo].[fnRange](1001000, 1001500)           -- run time: 0 sec

I like David's approach, but it does not perform as well for large initial values.
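One way to sidestep the large-initial-value cost (my own suggestion, not something tested in this thread) is to generate a small range starting from 1 and add the offset afterwards, since the set-based function only seems expensive when asked for a large upper bound:

```sql
-- Generate 501 rows starting at 1 (cheap for the set-based function),
-- then shift them up to the large starting value in an outer expression.
SELECT n + 1000999 AS n
FROM dbo.fnCrossJoinRange2(1, 501);   -- yields 1001000 .. 1001500
```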
Post #272712
