SQLServerCentral.com | Latest Posts (InstantForum.NET v2.9.0)
http://www.sqlservercentral.com/Forums/
notifications@sqlservercentral.com | Thu, 26 Nov 2015 09:53:22 GMT

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
This is about the simplest method I can think of. It runs for about 1.5 seconds to get all primes below 1,000,000.
[code]
CREATE TABLE #Numbers
(
    Prime INT NOT NULL,
    Number BIGINT PRIMARY KEY CLUSTERED
);

DECLARE @Max INT = 1000000;

WITH n0(p) AS ( SELECT 1 UNION ALL SELECT 1 ),
     n1(p) AS ( SELECT 1 FROM n0 AS a CROSS JOIN n0 AS b ),
     n2(p) AS ( SELECT 1 FROM n1 AS a CROSS JOIN n1 AS b ),
     n3(p) AS ( SELECT 1 FROM n2 AS a CROSS JOIN n2 AS b ),
     n4(p) AS ( SELECT 1 FROM n3 AS a CROSS JOIN n3 AS b ),
     n5(p) AS ( SELECT 1 FROM n4 AS a CROSS JOIN n4 AS b )
INSERT #Numbers ( Prime, Number )
SELECT f.Prime, f.Prime * f.Prime AS Number
FROM (
        SELECT TOP (1 + @Max / 30) 30 * ROW_NUMBER() OVER (ORDER BY p)
        FROM n5
     ) AS v(Value)
CROSS APPLY (
        VALUES (v.Value - 23), (v.Value - 19), (v.Value - 17), (v.Value - 13),
               (v.Value - 11), (v.Value - 7), (v.Value - 1), (v.Value + 1)
     ) AS f(Prime)
WHERE f.Prime <= @Max;

SELECT Prime
FROM ( VALUES (2), (3), (5) ) AS v(Prime)
WHERE Prime <= @Max
UNION ALL
SELECT n.Prime
FROM #Numbers AS n
WHERE NOT EXISTS ( SELECT *
                   FROM #Numbers AS p
                   WHERE p.Number <= n.Prime
                     AND n.Prime % p.Prime = 0 );

DROP TABLE #Numbers;
[/code]
Thu, 14 Mar 2013 08:13:39 GMT | SwePeso

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
The problem with the first method is that you are assuming a sequence table is already there, and that removing entries from it isn't an issue.
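SwePeso's CROSS APPLY above enumerates the eight residue classes mod 30 that can contain a prime greater than 5. As a cross-check of that wheel and of its NOT EXISTS trial-division test, here is a minimal Python sketch (an illustration with names of my own choosing, not part of any submission):

```python
# Sketch of the mod-30 wheel: every prime > 5 is congruent to one of
# 1, 7, 11, 13, 17, 19, 23, 29 (mod 30), so each block of 30 contributes
# only eight candidates -- the same offsets as the CROSS APPLY list.
def wheel_primes(limit):
    candidates = [n for v in range(30, limit + 30, 30)
                    for n in (v - 23, v - 19, v - 17, v - 13,
                              v - 11, v - 7, v - 1, v + 1)
                    if n <= limit]
    primes = [p for p in (2, 3, 5) if p <= limit]
    for n in candidates:
        # trial-divide only by smaller candidates whose square is <= n,
        # mirroring WHERE p.Number <= n.Prime AND n.Prime % p.Prime = 0
        if all(n % p != 0 for p in candidates if p * p <= n):
            primes.append(n)
    return primes
```

Candidate generation alone cuts the search space to 8/30 of the integers, which is much of the reason the T-SQL version stays fast.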
If the sequence table already exists, it can be assumed it should be left as-is, or at least reconstructed at the end; otherwise it wouldn't have existed in the first place. Also, we ended up going with 10,000,000 as the figure to reach, because lower figures were too fast to compare.
Mon, 24 Aug 2009 15:27:47 GMT | pleitch

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
I decided to use the Sequence table only. I have two implementations of the same algorithm: one uses set operations, while the other uses a WHILE loop. This is the algorithm: remove 1 and, starting from 2, remove all multiples of each number; finally, select the remaining numbers.
[code]
-- Method 1: Using WHILE LOOP
DECLARE @n int, @i int
SELECT @n = FLOOR(SQRT(MAX(Seq))) FROM sequence;
DELETE sequence WHERE Seq = 1;
SET @i = 2;
WHILE @i <= @n
BEGIN
    DELETE sequence WHERE seq % @i = 0 AND seq >= @i*@i;
    SELECT @i = MIN(seq) FROM sequence WHERE seq > @i;
END
SELECT * FROM sequence;

-- Method 2: SET Based Delete
DECLARE @n int;
SELECT @n = FLOOR(SQRT(MAX(Seq))) FROM sequence;
DELETE sequence WHERE seq = 1;
DELETE b
FROM sequence a
INNER JOIN sequence b
    ON a.seq <= FLOOR(SQRT(b.seq))
   AND b.seq % a.seq = 0;
SELECT * FROM sequence;
[/code]
When I compared both methods for 100,000 numbers, each ran in under 1.5 seconds. The first method was slightly faster (by around 100 milliseconds). Even though I expected the first method to use more resources due to the multiple deletes, it had a total of 7,887 logical reads, while the second method had 663,523 logical reads and also created a worktable. The reason is that the first method effectively removes all non-prime numbers. Note: all measurements cover only the code submitted here (creating the sequence table and populating its data are excluded). [b]I will go for the first method.
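The delete-multiples logic in both methods above can be sketched outside SQL; this short Python version (purely illustrative, with names of my own choosing) makes the WHILE-loop control flow explicit:

```python
# Sketch of Method 1: delete 1, then repeatedly delete the multiples of
# each surviving number i (starting at i*i) until i exceeds sqrt(limit).
def sieve_by_deletion(limit):
    seq = set(range(2, limit + 1))          # DELETE ... WHERE Seq = 1 done up front
    i = 2
    while i * i <= limit:
        # DELETE sequence WHERE seq % @i = 0 AND seq >= @i*@i
        seq.difference_update(range(i * i, limit + 1, i))
        # SELECT @i = MIN(seq) FROM sequence WHERE seq > @i
        i = min(x for x in seq if x > i)
    return sorted(seq)
```

Starting each deletion pass at i*i rather than 2*i is what keeps the number of deleted rows, and hence the logical reads, low: every composite below i*i has already been removed by a smaller factor.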
[/b]
Mon, 24 Aug 2009 11:45:49 GMT | G.R.Prithiviraj Kulasingham

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
We made it into a newsletter editorial [url]http://www.sqlservercentral.com/articles/Editorial/67817/[/url] to give the competition a bit of publicity! Don't tell me you don't subscribe to the newsletter!
Tue, 11 Aug 2009 07:54:16 GMT | Phil Factor

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Hey! How is it going? When will we have any interesting news? :-)
Tue, 11 Aug 2009 06:06:33 GMT | andriy.zabavskyy

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
We're aiming for Monday. That's the date we originally promised, not quite realizing how many entries there would be!
Thu, 06 Aug 2009 01:47:55 GMT | Phil Factor

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Just curious if there's any word yet on when the winner will be announced...
Steve (aka smunson) :-):-):-)
Wed, 05 Aug 2009 16:59:51 GMT | sgmunson

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Mike Ross (8/3/2009)[/b][hr][quote][b]Jeff Moden (8/3/2009)[/b][hr]Heh... What side effects? The function is the one with "side effects". You can't raise the proper error if you need it. :-P The function also has the side effect of being slower, even if only by a bit. The in-memory solution isn't actually faster as you stated... not even on your box. :-D
Whatever. Thanks for running the test, Mike. :-)[/quote]
Ha ha. Fire off a bunch of disk-intensive tasks and then try your "test" again. The function is faster where it counts: on our production system. Even if it weren't, the reduced dependencies would make it worth it.
A "side effect" is not one method running less than 10 ms faster under a very contrived and unrealistic test. Nor is it one of the many design limitations in SQL2000. A side effect, and a dependency, is having one more completely unnecessary table to script, track, and manage. Haven't you ever had a DBA or developer change the contents of a table on you, for very good-seeming reasons? Or use an undocumented table? Our functions are handled very well by source control. Further, if someone changes the function code, the interface and results stay the same -- no side effects. I suppose we could try to put table [i]contents[/i] under source control, but that seems absurd and has not been necessary for anything else.[/quote]
Heh... you and I certainly have different ideas as to what a side effect may be. We also have different ideas about what a DBA is or is not allowed to do. We also keep source control for all such reference/tool tables and for the code that generates their initial values.
You never did say... which RDBMS are you using on the non-MS servers?
Mon, 03 Aug 2009 19:34:15 GMT | Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Thanks, Steve!
And thanks for all of your hard work on this too...
Mon, 03 Aug 2009 15:32:56 GMT | RBarryYoung

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Here are the results for your code after I added start and end variables, computed the difference, and tacked a PRINT for it onto the end, to make it look like all my other testing.

Begin: 2009-08-03 13:29:31.187
Made #Sequence: 2009-08-03 13:29:32.233
Primes7toRt4: 2009-08-03 13:29:32.303
Primes7toRt2: 2009-08-03 13:29:32.313
Candidates: 2009-08-03 13:29:35.050
rotation count: 3163
Rotations: 2009-08-03 13:29:35.053
Composites: 2009-08-03 13:29:42.933
PrimesRt2toN: 2009-08-03 13:29:46.013
#Primes: 2009-08-03 13:29:46.257
#Displayed: 2009-08-03 13:29:50.433
Total Duration: 19.246 seconds

I then decided to see how your code would fare if it didn't need to write to a temp table, and here are those results:

Begin: 2009-08-03 13:33:13.460
Made #Sequence: 2009-08-03 13:33:14.550
Primes7toRt4: 2009-08-03 13:33:14.623
Primes7toRt2: 2009-08-03 13:33:14.633
Candidates: 2009-08-03 13:33:17.370
rotation count: 3163
Rotations: 2009-08-03 13:33:17.373
Composites: 2009-08-03 13:33:25.153
PrimesRt2toN: 2009-08-03 13:33:28.170
#Primes: 2009-08-03 13:33:32.257
#Displayed: 2009-08-03 13:33:32.257
Total Duration: 18.796 seconds

Both ways it worked beautifully, and produced correct results against the list of primes from the site provided by Mr. Celko. SWEET!!!
Steve (aka smunson) :-):-):-)
[quote][b]RBarryYoung (8/3/2009)[/b][hr][quote][b]Phil Factor (8/3/2009)[/b][hr]Thanks for doing the 'final submissions'. It was a great help, as we were faced with the daunting task of having to wade through 22 pages and a lot of attachments to try to work out the definitive version of each entry. There are quite a few other criteria that we will judge the entries by, as well as out-and-out performance. For this reason, we are not going to either favour or rule out tally-table-based solutions.
We'll try to stick with the rules that Joe gave in his article. Thanks once again for all the entries. We've been amazed by the energy and the ingenuity of the entries. Some pioneering work has been done, and we've learned a lot about set-based algorithms, SQL Server and SQL itself. We hope we can do justice to all the entrants, and to the huge number of man-hours that have gone into this task in the past week. This is the SQL Server Central forums at their best.[/quote]
Phil & Joe: Sorry for the late correction, but Peter Larsson pointed out an obscure bug in my submission that I was not able to correct until this morning. Hope you will accept this. :-) ...And this is my one-and-only definitive Final Submission.[/quote]
Mon, 03 Aug 2009 11:35:18 GMT | sgmunson

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Phil Factor (8/3/2009)[/b][hr]Thanks for doing the 'final submissions'. It was a great help, as we were faced with the daunting task of having to wade through 22 pages and a lot of attachments to try to work out the definitive version of each entry. There are quite a few other criteria that we will judge the entries by, as well as out-and-out performance. For this reason, we are not going to either favour or rule out tally-table-based solutions. We'll try to stick with the rules that Joe gave in his article. Thanks once again for all the entries. We've been amazed by the energy and the ingenuity of the entries. Some pioneering work has been done, and we've learned a lot about set-based algorithms, SQL Server and SQL itself. We hope we can do justice to all the entrants, and to the huge number of man-hours that have gone into this task in the past week. This is the SQL Server Central forums at their best.[/quote]
Phil & Joe: Sorry for the late correction, but Peter Larsson pointed out an obscure bug in my submission that I was not able to correct until this morning.
Hope you will accept this. :-) ...And this is my one-and-only definitive Final Submission.
Mon, 03 Aug 2009 08:33:32 GMT | RBarryYoung

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Mike Ross (8/2/2009)[/b][hr][quote][b]RBarryYoung (8/2/2009)[/b][hr]Because of the incessant carping about Numbers tables ...[/quote]
Jeez. OK, I'm willing to learn. Can anyone here [b][u]honestly[/u][/b] say that they had a Tally table in [i][b]money-making[/b][/i] use (not random, recreational math experiments) of 1 million or more rows, [b][i]before[/i][/b] this contest? If so, why? I have never needed more than 65,000 rows in a tally table for business reasons, and 2,000 rows is typical. What use am I missing?[/quote]
Yes. I used to have a 5-million-row numbers table (same concept as Tally/Sequence/whatever) for a production database that had to deal with multi-million-name address lists. For performance purposes, I had three tables: SmallNumbers (0-255), Numbers (0-10,000), BigNumbers (0-5,000,000). The procs would pick which one was appropriate, and call sub-procs based on that decision. (Different sub-procs kept the cached solutions appropriate.) Built those in 2003, for very specific purposes pertinent to the company I worked for at that time.
I don't currently keep anything more than 0-10,000 in a table, because a CTE with a cross join and the ROW_NUMBER() function is faster once you get up beyond 5 digits (I've done speed tests), and the business I'm currently working for doesn't need the huge numbers for anything I do here anyway. But I used to have a very good use for such a huge table.
Mon, 03 Aug 2009 08:22:01 GMT | GSquared

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Jeff Moden (8/3/2009)[/b][hr]Heh... What side effects? The function is the one with "side effects". You can't raise the proper error if you need it.
:-P The function also has the side effect of being slower, even if only by a bit. The in-memory solution isn't actually faster as you stated... not even on your box. :-D
Whatever. Thanks for running the test, Mike. :-)[/quote]
Ha ha. Fire off a bunch of disk-intensive tasks and then try your "test" again. The function is faster where it counts: on our production system. Even if it weren't, the reduced dependencies would make it worth it.
A "side effect" is not one method running less than 10 ms faster under a very contrived and unrealistic test. Nor is it one of the many design limitations in SQL2000. A side effect, and a dependency, is having one more completely unnecessary table to script, track, and manage. Haven't you ever had a DBA or developer change the contents of a table on you, for very good-seeming reasons? Or use an undocumented table? Our functions are handled very well by source control. Further, if someone changes the function code, the interface and results stay the same -- no side effects. I suppose we could try to put table [i]contents[/i] under source control, but that seems absurd and has not been necessary for anything else.
Mon, 03 Aug 2009 02:48:26 GMT | miker-212697

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Joe Celko (7/31/2009)[/b][hr]Phil and I need this weekend to "grade papers" on this puzzle. Can we get everyone to post a final version of their work? I think this might be worth an article on SQL problem solving -- brute force, small clever tricks, researching Atkin's algorithm, etc. make a good progression for teaching.[/quote]
I am sorry - I wasn't able to access the internet over the weekend.
So my final version is as it actually appears in my last post:
[code]
if exists (select * from sysobjects where id = object_id(N'Sequence') and type in (N'U'))
    drop table Sequence
if exists (select * from sysobjects where id = object_id(N'Primes') and type in (N'U'))
    drop table Primes
if object_id('tempdb..#digits') is not null
    drop table #digits

create table Sequence (seq INTEGER not null primary key check(seq > 0));
create table Primes (p INTEGER not null primary key check(p > 1));

declare @limit integer
set @limit = 10000;

select i into #digits
from (select 1 union select 2 union select 3 union select 4 union select 5 union
      select 6 union select 7 union select 8 union select 9 union select 0) as X(i)

-- populate a Sequence table which will be used for "iterations"
insert into Sequence (seq)
select (D4.i * 10000 + D3.i * 1000 + D2.i * 100 + D1.i * 10 + D0.i + 1) as seq
from #digits as D0, #digits as D1, #digits as D2, #digits as D3, #digits as D4
where (D4.i * 10000 + D3.i * 1000 + D2.i * 100 + D1.i * 10 + D0.i + 1) <= sqrt(@limit)

-- basic Primes initialization
insert into Primes select 2 union select 3

-- put in candidate primes:
-- integers which have an odd number of representations by certain quadratic forms
insert into Primes
select N.c
from (select case T.num
             when 1 then (select case when N.x <= @limit and (N.x % 12 = 1 or N.x % 12 = 5)
                                 then N.x end
                          from (select (4 * i.seq * i.seq + j.seq * j.seq)) N(x))
             when 2 then (select case when N.x <= @limit and N.x % 12 = 7
                                 then N.x end
                          from (select (3 * i.seq * i.seq + j.seq * j.seq)) N(x))
             when 3 then (select case when N.x <= @limit and i.seq > j.seq and N.x % 12 = 11
                                 then N.x end
                          from (select (3 * i.seq * i.seq - j.seq * j.seq)) N(x))
             end
      from Sequence i
      cross join Sequence j
      cross join (select 1 union select 2 union select 3) T(num)
      where i.seq <= sqrt(@limit)
        and j.seq <= sqrt(@limit)
     ) N(c)
where N.c is not null
group by N.c
having count(*) % 2 <> 0

-- eliminate composites by sieving; casting to bigint for big number cases
delete from Primes
where p in (select cast(P.p as bigint) * cast(P.p as bigint) * cast(s.seq as bigint)
            from Primes P
            cross join (select top(@limit/25) row_number() over(order by s1.seq) rn
                        from Sequence s1 cross join Sequence s2) S(seq)
            where P.p >= 5
              and P.p <= sqrt(@limit)
              and cast(P.p as bigint) * cast(P.p as bigint) * cast(s.seq as bigint) <= @limit)

select * from Primes
[/code]
I'd also like to include a cross join variant, which is a bit slower. It is identical to the listing above except for the final composite-elimination step:
[code]
-- (setup, Sequence population, and candidate-prime steps as in the listing above)

-- eliminate composites by sieving.
-- casting to bigint for big number cases
delete from Primes
from Primes
cross join Primes p2
where p2.p >= 5
  and p2.p < sqrt(@limit)
  and Primes.p % (p2.p * p2.p) = 0

select * from Primes
[/code]
Sorry for this late post.
Best regards,
Andriy Zabavskyy
Mon, 03 Aug 2009 02:31:41 GMT | andriy.zabavskyy

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Thanks for doing the 'final submissions'. It was a great help, as we were faced with the daunting task of having to wade through 22 pages and a lot of attachments to try to work out the definitive version of each entry. There are quite a few other criteria that we will judge the entries by, as well as out-and-out performance. For this reason, we are not going to either favour or rule out tally-table-based solutions. We'll try to stick with the rules that Joe gave in his article. Thanks once again for all the entries. We've been amazed by the energy and the ingenuity of the entries. Some pioneering work has been done, and we've learned a lot about set-based algorithms, SQL Server and SQL itself. We hope we can do justice to all the entrants, and to the huge number of man-hours that have gone into this task in the past week. This is the SQL Server Central forums at their best.
Mon, 03 Aug 2009 02:30:23 GMT | Phil Factor

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Oh... almost forgot to ask you, Mike. You said you were "almost SQL Server less"... what are you using for the non-SQL Server RDBMSs?
Mon, 03 Aug 2009 01:24:11 GMT | Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Heh... What side effects? The function is the one with "side effects". You can't raise the proper error if you need it.
:-P The function also has the side effect of being slower, even if only by a bit. The in-memory solution isn't actually faster as you stated... not even on your box. :-D
Whatever. Thanks for running the test, Mike. :-)
Mon, 03 Aug 2009 01:19:03 GMT | Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Jeff Moden (8/2/2009)[/b][hr]By the way, you can (kind of) raise an error in a function...
[code="sql"]
IF @iNumRows > 1000 OR @iNumRows < 1
    SELECT @iNumRows = 1/0
[/code][/quote]
AARRRGH! No, no, NO! That's one reason why we are less and less a Microsoft organization -- cryptic errors that have no logical relation to the cause. We won't introduce one on purpose!
[quote]Once a Tally table is used once, it gets cached and becomes a memory-only solution...[/quote]
That "GO 10" doesn't seem to work on SQL2000. Here are the results of that code in a WHILE loop:
[size="1"][i]In every pass, CPU time and parse-and-compile time were 0 ms for both queries, and each SELECT returned 1000 rows. Elapsed times:

Pass  Tally  FUNCTION
  1    1 ms     9 ms
  2   13 ms    35 ms
  3    7 ms    10 ms
  4    7 ms    13 ms
  5    5 ms    12 ms
  6    6 ms    10 ms
  7   11 ms     9 ms
  8    8 ms    13 ms
  9    6 ms    10 ms[/i][/size]
As you can see, the difference is negligible. Looking at the notes in Subversion, tests -- on the project at the time -- showed that the function was faster in that implementation. More importantly, the function reduces dependencies and the potential for side effects.
It's worked well enough and has not been an expensive part of any procedure where it's been used.
Sun, 02 Aug 2009 22:58:43 GMT | miker-212697

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Once a Tally table is used once, it gets cached and becomes a memory-only solution...
[code="sql"]
DECLARE @iNumRows INT
SET @iNumRows = 1000

PRINT '========== Tally =========='
SET STATISTICS TIME ON
SELECT N FROM dbo.Tally WHERE N <= 1000
SET STATISTICS TIME OFF

PRINT '========== FUNCTION =========='
SET STATISTICS TIME ON
SELECT * FROM dbo.tGetUpto1000_SequentialNumbers (@iNumRows)
SET STATISTICS TIME OFF

PRINT REPLICATE('=',20)
GO 10
[/code]
Here's the output from that (well, on my humble box, anyway :-P)...
Beginning execution loop. In every pass, the Tally query's CPU time and parse-and-compile time were 0 ms, and each SELECT returned 1000 rows:

Pass  Tally elapsed  FUNCTION CPU  FUNCTION elapsed
  1       1 ms          16 ms          69 ms
  2       6 ms          16 ms          69 ms
  3       5 ms          15 ms          84 ms
  4       3 ms          16 ms          21 ms
  5       4 ms          31 ms          20 ms
  6       8 ms          16 ms          21 ms
  7       7 ms          15 ms          21 ms
  8       8 ms          16 ms          23 ms
  9      47 ms          16 ms          20 ms
 10       8 ms          15 ms          20 ms

Batch execution completed 10 times.
Sun, 02 Aug 2009 22:14:54 GMT | Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Ack... the surprise is on me. I believe you said that you only had 2k installations. I'll be right back with a 2k solution.
Sun, 02 Aug 2009 22:07:17 GMT | Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Mike Ross (8/2/2009)[/b][hr]Here's some typical code. I will be consolidating and streamlining this code based on lessons learned/rediscovered during this contest...[/quote]
Thanks, Mike. I'll check it out. By the way, you can (kind of) raise an error in a function...
[code="sql"]
--------------------------------------------------------------------------------------
-- FUNCTION tGetUpto1000_SequentialNumbers
--------------------------------------------------------------------------------------
-- This function returns a table of integers from 1 to N.
--
-- Inputs: iNumRows -- How many rows? If more than 1000 needed,
--         use one of the other "tGetUpto" functions.
--
-- Output: A table of ints from 1 to @iNumRows.
--------------------------------------------------------------------------------------CREATE FUNCTION dbo.tGetUpto1000_SequentialNumbers (@iNumRows int) RETURNS @tSeqTable TABLE (iNum INT PRIMARY KEY NOT NULL)ASBEGIN IF @iNumRows > 1000 OR @iNumRows < 1 SELECT @iNumRows = 1/0 DECLARE @Digits TABLE (K INTEGER NOT NULL PRIMARY KEY CHECK (K >= 0)); INSERT INTO @Digits (K) SELECT 0 UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9; INSERT INTO @tSeqTable (iNum) SELECT 100*hundreds.K + 10*tens.K + units.K + 1 AS iNum FROM @Digits hundreds, @Digits tens, @Digits units WHERE (100*hundreds.K + 10*tens.K + units.K + 1) <= @iNumRows RETURNEND -- FUNCTION tGetUpto1000_SequentialNumbers[/code]I also have a surprise for you if your machine isn't too fast... :-P[code="sql"]DECLARE @iNumRows INT SET @iNumRows = 1000 SELECT TOP (@iNumRows) ROW_NUMBER() OVER (ORDER BY ac1.Object_ID) FROM Master.sys.All_Columns ac1 PRINT REPLICATE('-',20) SELECT * FROM dbo.tGetUpto1000_SequentialNumbers (@iNumRows) PRINT REPLICATE('=',20)GO 10[/code]Sun, 02 Aug 2009 22:01:03 GMTJeff ModenRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx[quote][b]Jeff Moden (8/2/2009)[/b][hr]... I suggest that you narrow it down to what you feel are your best 2 pieces of code and resubmit them to this thread with the words "Final submission for [i]yournamehere[/i]" just to make it easy on the judges.[/quote]Capital idea! I love the notion of "Final submission" (Uh, not like that. AS in: "Walk away from the madness.")The "Final submission from Mike Ross" is attached.[quote][b]Jeff Moden (8/2/2009)[/b][hr][quote][b]Mike Ross (8/1/2009)[/b][hr] ... except for small ones that were generated, in memory, because it was faster than reading from disk.[/quote]Ummmm... got code? ;-)[/quote]Here's some typical code. 
I will be consolidating and streamlining this code based on lessons learned/rediscovered during this contest...[code="sql"]-------------------------------------------------------------------------------------- FUNCTION tGetUpto1000_SequentialNumbers ----------------------------------------------------------------------------------------- This function returns a table of integers from 1 to N. ------ ----------------------------------------------------------------------------------------- Inputs: iNumRows -- How many rows? If more than 1000 needed, ------ use one of the other "tGetUpto" functions. ------ ------ Output: A table of ints from 1 to @iNumRows. --------------------------------------------------------------------------------------CREATE FUNCTION dbo.tGetUpto1000_SequentialNumbers (@iNumRows int) RETURNS @tSeqTable TABLE (iNum INT PRIMARY KEY NOT NULL)ASBEGIN /*-- WARNING! RAISERROR does not work in SQL Server functions!! IF @iNumRows > 1000 OR @iNumRows < 1 RAISERROR ('tGetUpto1000_SequentialNumbers valid input is 1 to 1000.', 10, 86) */ DECLARE @Digits TABLE (K INTEGER NOT NULL PRIMARY KEY CHECK (K >= 0)); INSERT INTO @Digits (K) SELECT 0 UNION SELECT 1 UNION SELECT 2 UNION SELECT 3 UNION SELECT 4 UNION SELECT 5 UNION SELECT 6 UNION SELECT 7 UNION SELECT 8 UNION SELECT 9; INSERT INTO @tSeqTable (iNum) SELECT 100*hundreds.K + 10*tens.K + units.K + 1 AS iNum FROM @Digits hundreds, @Digits tens, @Digits units WHERE (100*hundreds.K + 10*tens.K + units.K + 1) <= @iNumRows RETURNEND -- FUNCTION tGetUpto1000_SequentialNumbers[/code][quote][b]smunson (8/2/2009)[/b][hr]... I'm wondering how you came to the 20% faster conclusion. Was it for only 1 million ?[/quote]No, I eventually tested at 10M and my code was still faster (See a previous post): 88 versus 99 seconds. The 20% was based on coarsely measured runs at 1M (8 sec versus 11 sec). It takes too long to test above 1M on my machine.YMMV, quite a bit, apparently. 
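The base-10 cross-join trick in the function above is language-neutral. As an illustration (a hedged Python sketch of the same idea, not the author's T-SQL), three copies of the digits 0-9 are combined to yield 1 through 1000:

```python
# Hedged Python sketch (not the author's T-SQL) of the digit cross-join
# idea in tGetUpto1000_SequentialNumbers: cross-join three copies of the
# digits 0-9 and compute 100*h + 10*t + u + 1 to cover 1..1000.
from itertools import product

def sequential_numbers(n_rows):
    # The T-SQL version forces an error with SELECT @iNumRows = 1/0
    # (since RAISERROR is not allowed in a function); in Python we raise.
    if not 1 <= n_rows <= 1000:
        raise ValueError("valid input is 1 to 1000")
    digits = range(10)
    return sorted(100 * h + 10 * t + u + 1
                  for h, t, u in product(digits, digits, digits)
                  if 100 * h + 10 * t + u + 1 <= n_rows)

print(sequential_numbers(5))   # [1, 2, 3, 4, 5]
```

Adding a fourth copy of the digit table would extend the same shape to 10,000 rows, which is presumably how the larger "tGetUpto" variants mentioned in the comments scale.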
:( Here's hoping that Joe and Phil have old, slow, SQL2000, single-core machines! :-DEDIT: Corrected math.Sun, 02 Aug 2009 21:43:00 GMTmiker-212697RE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx[quote][b]Peso (8/2/2009)[/b][hr]Much faster implementation of my "squarefree" solution here...//Peso[/quote]This is some pretty neat stuff, Peso. Heh, you've probably just wasted my whole week as I try to figure out all of the math behind this. :w00t:Sun, 02 Aug 2009 21:08:00 GMTRBarryYoungRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspxFinal submission for smunson:Good suggestion, Jeff. My two solutions are here, along with the result times on my machine for each power of 10 from 10,000 to 10,000,000. One uses my Tally Table, which has 10 million records in it with values starting at 1, and the other generates the values for such a table. Both use the same basic concept: generate candidates whose right-most digit is 1, 3, 7, or 9, and then limit those candidates to the ones that match the form of 6 times N plus or minus 1. The candidate table is then joined to itself to validate each candidate prime for potential factors up to that candidate's square root, which was calculated as part of creating the candidate table and included therein. A closer examination of the original article offering up this challenge shows that it never actually mentions scalability; it only refers to primes up to 10 thousand, in a statement at the very beginning, above the actual article itself, that asks to "improve on his solution to calculating the prime numbers between 1 and 10000." The article itself then only refers to 1000 as a convenient number.
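The candidate scheme smunson describes (last digit in 1, 3, 7, 9; form 6N ± 1; trial division by smaller candidates up to the square root) can be shown compactly outside T-SQL. This Python sketch is an illustrative analogue of the concept, not the submitted code:

```python
# Illustrative analogue (Python, not the submitted T-SQL) of smunson's
# candidate scheme: keep numbers ending in 1, 3, 7 or 9 that are also of
# the form 6N +/- 1, then trial-divide each survivor by smaller
# candidates up to its square root.
import math

def primes_by_candidates(max_prime):
    candidates = [n for n in range(7, max_prime + 1)
                  if n % 10 in (1, 3, 7, 9) and n % 6 in (1, 5)]
    primes = [p for p in (2, 3, 5) if p <= max_prime]
    for n in candidates:
        sr = math.isqrt(n)   # integer square root, like the SR column
        if all(n % d for d in candidates if d <= sr):
            primes.append(n)
    return primes

print(primes_by_candidates(50))
# [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47]
```

The two filters are safe because every prime above 5 ends in 1, 3, 7 or 9 and is congruent to ±1 mod 6; any composite that survives both filters has a smallest prime factor of at least 7, which is itself a candidate no larger than the square root, so trial division catches it.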
Many of us realized that these small numbers might make it difficult to determine actual performance levels, and thus scaled the number up to as high as 10 million, where differences in performance become a lot more obvious. Additionally, there's nothing in the article that actually says a table needs to be generated - only that the primes need to be generated, and a result set, at least technically, would qualify as the primes having been generated. Thus I'll show my solutions' timings both with and without creating a temp table. It's really amazing to me the difference in performance between the two.For the sake of comparison to other results, my machine is as follows: SQL Server 2005 Developer Edition 64-bit, w/SP3 and no additional hotfixes, running on Vista Ultimate 64-bit w/SP2 and all current hotfixes posted via Microsoft Update. The machine uses an Intel Q9550 Quad-Core CPU at 2.83 GHz, and has 8GB of RAM.Here are the final solution timings for the tally table code, which creates a table with the result set:
----10,000: Total Duration: 0.046 seconds
---100,000: Total Duration: 0.356 seconds
-1,000,000: Total Duration: 5.046 seconds
10,000,000: Total Duration: 73.100 seconds
Adjusting the code to just generate a result set and not create the table gets the following results:
----10,000: Total Duration: 0.040 seconds
---100,000: Total Duration: 0.223 seconds
-1,000,000: Total Duration: 1.973 seconds
10,000,000: Total Duration: 26.260 seconds
Here's the code showing the commenting out of the table creation for the result set, and set for 10 million:[code]--SET SHOWPLAN_XML ON--GOIF OBJECT_ID('tempdb..#UNITS') IS NOT NULL DROP TABLE #UNITSIF OBJECT_ID('tempdb..#DIGITS') IS NOT NULL DROP TABLE #DIGITSIF OBJECT_ID('tempdb..#PRIMES') IS NOT NULL DROP TABLE #PRIMESIF OBJECT_ID('tempdb..#NUMS') IS NOT NULL DROP TABLE #NUMSSET NOCOUNT ONDECLARE @START AS DATETIME, @END AS DATETIME, @MAX_PRIME AS INTSET @START = GETDATE()SET @MAX_PRIME = 10000000/*CREATE TABLE #PRIMES (
PRIME INT NOT NULL PRIMARY KEY CLUSTERED) */CREATE TABLE #NUMS ( N INT NOT NULL PRIMARY KEY CLUSTERED,-- SR AS CAST(ROUND(SQRT(N),0,1) AS INT) --PERSISTED SR INT NOT NULL)INSERT INTO #NUMS(N, SR)SELECT CAST(3 AS INT), CAST(ROUND(SQRT(3),0,1) AS INT)UNION ALLSELECT CAST(5 AS INT), CAST(ROUND(SQRT(5),0,1) AS INT)UNION ALLSELECT N, CAST(ROUND(SQRT(N),0,1) AS INT) AS SRFROM master.dbo.TallyWHERE N > 6 AND N < @MAX_PRIME AND RIGHT(CAST(N AS varchar(7)),1) IN ('1','3','7','9') AND N % 6 IN (1, 5)--INSERT INTO #PRIMES(PRIME) SELECT 2 AS PRIME UNION ALL SELECT N1.N FROM #NUMS AS N1 WHERE NOT EXISTS ( SELECT 1 FROM #NUMS AS N2 WHERE N2.N <= N1.SR AND N1.N % N2.N = 0 )--SELECT *--FROM #PRIMES--ORDER BY PRIMESET @END = GETDATE()PRINT 'Total Duration: ' + CAST(CAST(DATEDIFF(ms, @START, @END)/1000. AS decimal(7,3)) AS varchar(15)) + ' seconds'GO--SET SHOWPLAN_XML OFF--GO[/code]Here are the final solution timings for the NON tally table code which creates a table with the result set:
----10,000: Total Duration: 0.046 seconds
---100,000: Total Duration: 0.393 seconds
-1,000,000: Total Duration: 4.780 seconds
10,000,000: Total Duration: 75.613 seconds
Adjusting the code to just generate a result set and not create the table gets the following results:
----10,000: Total Duration: 0.036 seconds
---100,000: Total Duration: 0.236 seconds
-1,000,000: Total Duration: 2.376 seconds
10,000,000: Total Duration: 28.840 seconds
Here's the code showing the commenting out of the table creation for the result set and set for 10 million:[code]--SET SHOWPLAN_XML ON--GOIF OBJECT_ID('tempdb..#UNITS') IS NOT NULL DROP TABLE #UNITSIF OBJECT_ID('tempdb..#TENS') IS NOT NULL DROP TABLE #TENSIF OBJECT_ID('tempdb..#HUNDREDS') IS NOT NULL DROP TABLE #HUNDREDSIF OBJECT_ID('tempdb..#THOUSANDS') IS NOT NULL DROP TABLE #THOUSANDSIF OBJECT_ID('tempdb..#TEN_THOUSANDS') IS NOT NULL DROP TABLE #TEN_THOUSANDSIF OBJECT_ID('tempdb..#HUNDRED_THOUSANDS') IS NOT NULL DROP TABLE #HUNDRED_THOUSANDSIF OBJECT_ID('tempdb..#MILLIONS') IS
NOT NULL DROP TABLE #MILLIONSIF OBJECT_ID('tempdb..#PRIMES') IS NOT NULL DROP TABLE #PRIMESIF OBJECT_ID('tempdb..#NUMS') IS NOT NULL DROP TABLE #NUMSSET NOCOUNT ONDECLARE @START AS DATETIME, @END AS DATETIMESET @START = GETDATE()CREATE TABLE #UNITS ( Num INT NOT NULL PRIMARY KEY,)INSERT INTO #UNITSSELECT 1 UNION ALLSELECT 3 UNION ALLSELECT 7 UNION ALLSELECT 9CREATE TABLE #TENS ( Num INT NOT NULL PRIMARY KEY,)INSERT INTO #TENSSELECT 0 UNION ALLSELECT 10 UNION ALLSELECT 20 UNION ALLSELECT 30 UNION ALLSELECT 40 UNION ALLSELECT 50 UNION ALLSELECT 60 UNION ALLSELECT 70 UNION ALLSELECT 80 UNION ALLSELECT 90CREATE TABLE #HUNDREDS ( Num INT NOT NULL PRIMARY KEY,)INSERT INTO #HUNDREDSSELECT 0 UNION ALLSELECT 100 UNION ALLSELECT 200 UNION ALLSELECT 300 UNION ALLSELECT 400 UNION ALLSELECT 500 UNION ALLSELECT 600 UNION ALLSELECT 700 UNION ALLSELECT 800 UNION ALLSELECT 900CREATE TABLE #THOUSANDS ( Num INT NOT NULL PRIMARY KEY,)INSERT INTO #THOUSANDSSELECT 0 UNION ALLSELECT 1000 UNION ALLSELECT 2000 UNION ALLSELECT 3000 UNION ALLSELECT 4000 UNION ALLSELECT 5000 UNION ALLSELECT 6000 UNION ALLSELECT 7000 UNION ALLSELECT 8000 UNION ALLSELECT 9000CREATE TABLE #TEN_THOUSANDS ( Num INT NOT NULL PRIMARY KEY,)INSERT INTO #TEN_THOUSANDSSELECT 0 UNION ALLSELECT 10000 UNION ALLSELECT 20000 UNION ALLSELECT 30000 UNION ALLSELECT 40000 UNION ALLSELECT 50000 UNION ALLSELECT 60000 UNION ALLSELECT 70000 UNION ALLSELECT 80000 UNION ALLSELECT 90000CREATE TABLE #HUNDRED_THOUSANDS ( Num INT NOT NULL PRIMARY KEY,)INSERT INTO #HUNDRED_THOUSANDSSELECT 0 UNION ALLSELECT 100000 UNION ALLSELECT 200000 UNION ALLSELECT 300000 UNION ALLSELECT 400000 UNION ALLSELECT 500000 UNION ALLSELECT 600000 UNION ALLSELECT 700000 UNION ALLSELECT 800000 UNION ALLSELECT 900000CREATE TABLE #MILLIONS ( Num INT NOT NULL PRIMARY KEY,)INSERT INTO #MILLIONSSELECT 0 UNION ALLSELECT 1000000 UNION ALLSELECT 2000000 UNION ALLSELECT 3000000 UNION ALLSELECT 4000000 UNION ALLSELECT 5000000 UNION ALLSELECT 6000000 UNION ALLSELECT 
7000000 UNION ALLSELECT 8000000 UNION ALLSELECT 9000000/*CREATE TABLE #PRIMES ( PRIME INT NOT NULL PRIMARY KEY CLUSTERED) */CREATE TABLE #NUMS ( N INT NOT NULL PRIMARY KEY CLUSTERED, SR INT NOT NULL)INSERT INTO #NUMS(N, SR)SELECT N, CAST(ROUND(SQRT(N),0,1) AS INT) AS SRFROM ( SELECT m.Num + ht.Num + tt.Num + T.Num + h.Num + t.Num + u.Num AS N FROM #MILLIONS AS m, #HUNDRED_THOUSANDS AS ht, #TEN_THOUSANDS AS tt, #THOUSANDS AS T, #HUNDREDS AS h, #TENS AS t, #UNITS AS u ) AS XWHERE N > 6 AND N % 6 IN (1, 5)--INSERT INTO #PRIMES(PRIME) SELECT 2 AS PRIME UNION ALL SELECT 3 AS PRIME UNION ALL SELECT 5 AS PRIME UNION ALL SELECT N1.N FROM #NUMS AS N1 WHERE NOT EXISTS ( SELECT 1 FROM #NUMS AS N2 WHERE N2.N <= N1.SR AND N1.N % N2.N = 0 )--SELECT *--FROM #PRIMES--ORDER BY PRIMESET @END = GETDATE()PRINT 'Total Duration: ' + CAST(CAST(DATEDIFF(ms, @START, @END)/1000. AS decimal(7,3)) AS varchar(15)) + ' seconds'GO--SET SHOWPLAN_XML OFF--GO[/code]Steve(aka smunson):-):-):-)Sun, 02 Aug 2009 20:16:43 GMTsgmunsonRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx[quote][b]Jeff Moden (8/2/2009)[/b][hr][quote][b]RBarryYoung (7/30/2009)[/b][hr]Plain Ol' C, on the other hand, is so optimized it's practically assembler!For reference, it takes about 3 seconds to do N=10,000,000. If anybody can beat that in just SQL (possible since it's doing all kinds of extra work), I will be truly impressed.[/quote]I would very much enjoy seeing your VB and C code for this, Barry. Thanks.[/quote]Of course, you might have to register for StackOverflow.com to get to that page. So I'll copy it over here for you:[code="vb"]Imports System.MathPublic Class TotientSerialCalculator 'Implements an extremely efficient Serial Totient(phi) calculator ' ' This implements an optimized windowed Sieve of Eratosthenes. 
The' ' window size is set at Sqrt(N) both to optimize collecting and ' ' applying all of the Primes below Sqrt(N), and to minimize ' ' window-turning overhead. ' ' ' ' CPU complexity is O( N * Log(Log(N)) ), which is virtually linear.' ' ' ' MEM Complexity is O( Sqrt(N) ). ' ' ' ' This is probably the ideal combination, as any attempt to further ' 'reduce memory will almost certainly result in disproportionate increases' 'in CPU complexity, and vice-versa. ' Structure NumberFactors Dim UnFactored As Long 'the part of the number that still needs to be factored' Dim Phi As Long 'the totient value progressively calculated' ' (equals total numbers less than N that are CoPrime to N)' 'MEM = 8 bytes each' End Structure Private ReportInterval As Long Private PrevLast As Long 'the last value in the previous window' Private FirstValue As Long 'the first value in this windows range' Private WindowSize As Long Private LastValue As Long 'the last value in this windows range' Private NextFirst As Long 'the first value in the next window' 'Array that stores all of the NumberFactors in the current window.' ' this is the primary memory consumption for the class and it' ' is 16 * Sqrt(N) Bytes, which is O(Sqrt(N)).' Public Numbers() As NumberFactors ' For N=10^12 (1 trillion), this will be 16MB, which should be bearable anywhere.' '(note that the Primes() array is a secondary memory consumer' ' at O(pi(Sqrt(N)), which will be within 10x of O(Sqrt(N)))' Public Event EmitTotientPair(ByVal k As Long, ByVal Phi As Long) '===== The Routine To Call: ========================' Public Sub EmitTotientPairsToN(ByVal N As Long) 'Routine to Emit Totient pairs {k, Phi(k)} for k = 1 to N' ' 2009-07-14, RBarryYoung, Created.' Dim i As Long Dim k As Long 'the current number being factored' Dim p As Long 'the current prime factor' 'Establish the Window frame:' ' note: WindowSize is the critical value that controls both memory' ' usage and CPU consumption and must be SQRT(N) for it to work optimally.'
WindowSize = Ceiling(Sqrt(CDbl(N))) ReDim Numbers(0 To WindowSize - 1) 'Initialize the first window:' MapWindow(1) Dim IsFirstWindow As Boolean = True 'adjust this to control how often results are shown' ReportInterval = N / 100 'Allocate the primes array to hold the primes list:' ' Only primes <= SQRT(N) are needed for factoring' ' PiMax(X) is a Max estimate of the number of primes <= X' Dim Primes() As Long, PrimeIndex As Long, NextPrime As Long 'init the primes list and its pointers' ReDim Primes(0 To PiMax(WindowSize) - 1) Primes(0) = 2 '"prime" the primes list with the first prime' NextPrime = 1 'Map (and Remap) the window with Sqrt(N) numbers, Sqrt(N) times to' ' sequentially map all of the numbers <= N.' Do 'Sieve the primes across the current window' PrimeIndex = 0 'note: cant use enumerator for the loop below because NextPrime' ' changes during the first window as new primes <= SQRT(N) are accumulated' Do While PrimeIndex < NextPrime 'get the next prime in the list' p = Primes(PrimeIndex) 'find the first multiple of (p) in the current window range' k = PrevLast + p - (PrevLast Mod p) Do With Numbers(k - FirstValue) .UnFactored = .UnFactored \ p 'always works the first time' .Phi = .Phi * (p - 1) 'Phi = PRODUCT( (Pi-1)*Pi^(Ei-1) )' 'The loop test that follows is probably the central CPU overhead' ' I believe that it is O(N*Log(Log(N)), which is virtually O(N)' ' ( for instance at N = 10^12, Log(Log(N)) = 3.3 )' Do While (.UnFactored Mod p) = 0 .UnFactored = .UnFactored \ p .Phi = .Phi * p Loop End With 'skip ahead to the next multiple of p: ' '(this is what makes it so fast, never have to try prime factors that dont apply)' k += p 'repeat until we step out of the current window:' Loop While k < NextFirst 'if this is the first window, then scan ahead for primes' If IsFirstWindow Then For i = Primes(NextPrime - 1) + 1 To p ^ 2 - 1 'the range of possible new primes' 'Dont go beyond the first window' If i >= WindowSize Then Exit For If Numbers(i -
FirstValue).UnFactored = i Then 'this is a prime less than SQRT(N), so add it to the list.' Primes(NextPrime) = i NextPrime += 1 End If Next End If PrimeIndex += 1 'move to the next prime' Loop 'Now Finish & Emit each one' For k = FirstValue To LastValue With Numbers(k - FirstValue) 'Primes larger than Sqrt(N) will not be finished: ' If .UnFactored > 1 Then 'Not done factoring, must be a large prime factor remaining: ' .Phi = .Phi * (.UnFactored - 1) .UnFactored = 1 End If 'Emit the value pair: (k, Phi(k)) ' EmitPhi(k, .Phi) End With Next 're-Map to the next window ' IsFirstWindow = False MapWindow(NextFirst) Loop While FirstValue <= N End Sub Sub EmitPhi(ByVal k As Long, ByVal Phi As Long) 'just a placeholder for now, that raises an event to the display form' ' periodically for reporting purposes. Change this to do the actual' ' emitting.' If (k Mod ReportInterval) = 0 Then RaiseEvent EmitTotientPair(k, Phi) End If End Sub Public Sub MapWindow(ByVal FirstVal As Long) 'Efficiently reset the window so that we do not have to re-allocate it.' 'init all of the boundary values' FirstValue = FirstVal PrevLast = FirstValue - 1 NextFirst = FirstValue + WindowSize LastValue = NextFirst - 1 'Initialize the Numbers prime factor arrays' Dim i As Long For i = 0 To WindowSize - 1 With Numbers(i) .UnFactored = i + FirstValue 'initially equal to the number itself' .Phi = 1 'starts at multiplicative identity(1)' End With Next End Sub Function PiMax(ByVal x As Long) As Long 'estimate of pi(n) == {primes <= (n)} that is never less' ' than the actual number of primes. (from P. Dusart, 1999)' Return (x / Log(x)) * (1.0 + 1.2762 / Log(x)) End FunctionEnd Class[/code]Sun, 02 Aug 2009 19:57:10 GMTRBarryYoungRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspxI'll provide two reasons why I still maintain that the tally table should be excluded:1.
Any argument along the lines of "you should have one anyway" can be applied to a table of prime numbers. In the field I work in, I have had a table of prime numbers for nearly a decade (and t-values, and chi values). Therefore, joining to a pre-created table is as good as pre-creating the prime table!!! It's simply false logic. If you didn't like the house example, then maybe a car example: It's like saying "All cars I've seen have four wheels, so let's assume all cars do." But but but... not all cars DO have four wheels. Some popular micro cars had three, and there are some neat cars out of Europe with three. Similarly, you can't assume that a table exists just because you think it should. It doesn't exist in a standard install, and even if the user did create it there is no guarantee that they have the same format and names as you have used!!!2. The code becomes so basic AND so fast BUT only because it "hides" the performance hit of generating the numbers to be reduced. If it is agreed that the "tally" table could/should exist prior to the prime filtering, then the same can be said for any "Sequence" table generation.If the judges insist that use of a tally table is okay, then here is my submission, which outstrips any code I can create that is a total, encapsulated answer (it also strips out any higher order thought patterns). It runs in half the time of my next entry. For this I've created one of your "tally" tables - it just so happens I'm going to call mine "SillyTally" (rename to whatever you call yours) with numbers from 2-10,000,000.[code]if object_id('Primes') is not null drop table Primes declare @Start as datetime, @End as datetimedelete from sillytally where num = 1 --If not already done...
and if you want put it back in at the end.set @start = getdate()select numInto Primes from sillytallywherenum >3163and not exists(select num from(select num from sillytallywhere sillytally.num < 3163 andnot exists (select others.num from sillytally as otherswhere others.num < 3163 andothers.num <= sqrt(sillytally.num) andsillytally.num % others.num = 0)) as [Raw]where sillytally.num % [raw].num = 0)union allselect num from sillytallywhere sillytally.num <= 3163 andnot exists (select others.num from sillytally as otherswhere others.num <= 3163 andothers.num <= sqrt(sillytally.num) andsillytally.num % others.num = 0)SET @END = GETDATE() PRINT 'Total Duration: ' + CAST(CAST(DATEDIFF(ms, @START, @END) / 1000. AS decimal(12,3)) AS varchar(12)) + ' seconds'[/code]However, my "serious" entry is more of a joint entry as it is only a very slight modification of a version posted by Steve Munson:[code]DROP TABLE ##PrimesDROP TABLE ##Sequencego/* Peter Larsson Ryan Randall w/scale-out updates by Steve MunsonFinal edits by Philip Leitch*/SET NOCOUNT ON-- Initialize user supplied parameterDECLARE @MaxPrime INT, @START AS DateTime, @END AS DateTimeSET @START = GETDATE() SET @MaxPrime = 10000000 -- Prepare helper variablesDECLARE @MaxNumber INT SET @MaxNumber = SQRT(@MaxPrime) -- Create staging tablesCREATE TABLE ##Primes ( Prime INT PRIMARY KEY CLUSTERED ) CREATE TABLE ##Sequence ( Seq INT PRIMARY KEY CLUSTERED, Seq2 AS CAST(Seq AS BIGINT) * CAST(Seq AS BIGINT) --PERSISTED ) BEGIN TRANSACTION INSERT ##Sequence ( Seq )SELECT 1000 * s.Digit + 100 * h.Digit + 10 * t.Digit + u.Digit + 1FROM ( SELECT 0 AS Digit UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 ) AS uCROSS JOIN ( SELECT 0 AS Digit UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 ) 
AS tCROSS JOIN ( SELECT 0 AS Digit UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 ) AS hCROSS JOIN ( SELECT 0 AS Digit UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6 UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9 ) AS s INSERT ##Primes ( Prime )SELECT kFROM ( SELECT k FROM ( SELECT 4 * a.Seq2 + b.Seq2 AS k FROM ##Sequence AS a CROSS JOIN ##Sequence AS b WHERE a.Seq <= @MaxNumber AND b.Seq <= @MaxNumber ) AS c WHERE k <= @MaxPrime AND k % 12 IN (1, 5) UNION ALL SELECT k FROM ( SELECT 3 * a.Seq2 + b.Seq2 AS k FROM ##Sequence AS a CROSS JOIN ##Sequence AS b WHERE a.Seq <= @MaxNumber AND b.Seq <= @MaxNumber ) AS c WHERE k <= @MaxPrime AND k % 12 = 7 UNION ALL SELECT k FROM ( SELECT 3 * a.Seq2 - b.Seq2 AS k FROM ##Sequence AS a INNER JOIN ##Sequence AS b ON b.Seq < a.Seq WHERE a.Seq <= @MaxNumber ) AS c WHERE k <= @MaxPrime AND k % 12 = 11 ) AS dWHERE k % 10 IN (1, 3, 7, 9) and k % 7 > 0 and k % 11 > 0 and k % 13 > 0GROUP BY kHAVING COUNT(k) IN (1, 3) DECLARE @i INT SET @i = 17 WHILE @i < @MaxNumber BEGIN DELETE ##Primes WHERE Prime % @i = 0 and prime >= @i * @i SELECT @i = MIN(Prime) FROM ##Primes WHERE Prime > @i END insert into ##Primes (Prime)select digit from (Select 2 Digit UNION ALL SELECT 3 UNION ALL SELECT 5 UNION ALL SELECT 7 UNION ALL SELECT 11 UNION ALL SELECT 13) as W COMMIT TRANSACTION SELECT PrimeFROM ##PrimesORDER BY Prime SET @END = GETDATE() PRINT 'Total Duration: ' + CAST(CAST(DATEDIFF(ms, @START, @END) / 1000. 
AS decimal(12,3)) AS varchar(12)) + ' seconds'[/code]Sun, 02 Aug 2009 19:56:06 GMTpleitchRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx[quote][b]Jeff Moden (8/2/2009)[/b][hr][quote][b]RBarryYoung (7/30/2009)[/b][hr]Plain Ol' C, on the other hand, is so optimized it's practically assembler!For reference, it takes about 3 seconds to do N=10,000,000. If anybody can beat that in just SQL (possible since it's doing all kinds of extra work), I will be truly impressed.[/quote]I would very much enjoy seeing your VB and C code for this, Barry. Thanks.[/quote]It's just VB code, but it shouldn't be too hard to convert. Anyway you can find it [url=http://stackoverflow.com/questions/1024640/calculating-phik-for-1kn/1134851#1134851]here[/url].Sun, 02 Aug 2009 19:53:49 GMTRBarryYoungRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspxFolks, I have a suggestion to prevent any hard feelings and possible omissions. Many of you have submitted just a pot wad of code and the hour of decision draws near. I suggest that you narrow it down to what you feel are your best 2 pieces of code and resubmit them to this thread with the words "Final submission for [i]yournamehere[/i]" just to make it easy on the judges. The hard feelings I'm talking about would occur if the judges couldn't figure out which of your many posts is the best one that you submitted.All together now...Sun, 02 Aug 2009 18:37:18 GMTJeff ModenRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx[quote][b]pleitch (8/2/2009)[/b][hr]In other words, the time test should be capable of running on a vanilla SQL Server instance. Every operation that needs to be issued must count towards the cost of generating the list of primes.[/quote]For the record and in general, I disagree with that notion (heh, regardless of what type of house you're buying :-P).
If this contest were about the performance of finite split functions, it would be ridiculous to penalize someone for creating a tool to do it. The tool would be created just one time, not every time the function is called.The same holds true for this contest to an extent (I'll explain why not in a minute). No matter how many times you run the prime number algorithm, you would only need to create the Tally table (a very common and widely recognized performance tool) once.The only reason why I suggest that a Tally table isn't appropriate here is because of the scalability requirement. That is, scalability is required by the rules of the contest and no limits were stated as to how far that scalability should go. I would say that most folks who know of Tally tables keep a Tally table in their databases at all times. I'll also say that most folks don't keep a Tally table larger than a million rows. If the rules stated that scalability were limited to prime numbers <= 1 million, then the time to create a million row Tally table should not be included in the total time it takes the code to run.Of course, that's just my opinion... thankfully for a lot of us (especially me :hehe:), I'm not a judge for this contest.Sun, 02 Aug 2009 18:31:48 GMTJeff ModenRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspxI respectfully disagree. I like the idea of a numbers source - and for my money a date source table for the same reasons - being built into SQL Server. However, it isn't there "out of the box".Therefore I strongly disagree that we should consider the table "there" because it should be. Regardless of whether a feature SHOULD be in SQL Server, we can't consider it "there" if it isn't actually there. In other words, the time test should be capable of running on a vanilla SQL Server instance.
Every operation that needs to be issued must count towards the cost of generating the list of primes.It would be like the old practice of buying a house. You see a sticker price, but then when you go to finalise the purchase you find out there are stamp duty fees, transfer fees, mortgage insurance fees and so on so that the final price you pay is different to the advertised price. Same here, you go to run the search for primes because it is marked as "the lowest cost", then find out that there is another resource penalty you have to pay first, which means the final time cost is nothing like the one advertised.Using the "it should be there so we don't have to count it" logic, a table of primes "SHOULD" be included in SQL Server, because it has so many uses (well, for me anyway). In that case the optimal solution is to do nothing - the list of primes can already be counted as being constructed.[quote][b]RBarryYoung (7/31/2009)[/b][hr]I don't think that you'll find many people here who would agree with you Mike. IMHO, an adequately sized and performant Numbers source should be a standard addition for any SQL database. They are useful for so many applications and performance problems that there really is no good excuse NOT to have one already built.There's no reason that using something that we should already have and that should be pre-built should count against you for a challenge like this. 
Plus, the Puzzle Poser, Joe Celko, has already stipulated it as being available.[/quote]Sun, 02 Aug 2009 17:20:56 GMTpleitchRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspxWhile I'm at it, here's the tally table solution at a similar performance point:Total Duration: 29.560 seconds[code]--SET SHOWPLAN_XML ON--GOIF OBJECT_ID('tempdb..#UNITS') IS NOT NULL DROP TABLE #UNITSIF OBJECT_ID('tempdb..#DIGITS') IS NOT NULL DROP TABLE #DIGITSIF OBJECT_ID('tempdb..#PRIMES') IS NOT NULL DROP TABLE #PRIMESIF OBJECT_ID('tempdb..#NUMS') IS NOT NULL DROP TABLE #NUMSSET NOCOUNT ONDECLARE @START AS DATETIME, @END AS DATETIMESET @START = GETDATE()/*CREATE TABLE #PRIMES ( PRIME INT NOT NULL PRIMARY KEY CLUSTERED) */CREATE TABLE #NUMS ( N INT NOT NULL PRIMARY KEY CLUSTERED,-- SR AS CAST(ROUND(SQRT(N),0,1) AS INT) --PERSISTED SR INT NOT NULL)INSERT INTO #NUMS(N, SR)SELECT CAST(3 AS INT), CAST(ROUND(SQRT(3),0,1) AS INT)UNION ALLSELECT CAST(5 AS INT), CAST(ROUND(SQRT(5),0,1) AS INT)UNION ALLSELECT N, CAST(ROUND(SQRT(N),0,1) AS INT) AS SRFROM master.dbo.TallyWHERE N > 6 AND RIGHT(CAST(N AS varchar(10)),1) IN ('1','3','7','9') AND N % 6 IN (1, 5)--INSERT INTO #PRIMES(PRIME) SELECT 2 AS PRIME UNION ALL SELECT N1.N FROM #NUMS AS N1 WHERE NOT EXISTS ( SELECT 1 FROM #NUMS AS N2 WHERE N2.N <= N1.SR AND N1.N % N2.N = 0 )--SELECT *--FROM #PRIMES--ORDER BY PRIMESET @END = GETDATE()PRINT 'Total Duration: ' + CAST(CAST(DATEDIFF(ms, @START, @END)/1000. AS decimal(7,3)) AS varchar(15)) + ' seconds'GO--SET SHOWPLAN_XML OFF--GO[/code]Steve(aka smunson):-):-):-)Sun, 02 Aug 2009 16:59:45 GMTsgmunsonRE: Celko's Summer SQL Stumpers: Prime numbershttp://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspxGiven RBarryYoung's significant achievement at less than 18 seconds, I'm wondering if we can just produce the result-set and not worry about getting the data into a table? 
Anyway, here's the code that produces this result:

Total Duration: 27.740 seconds

[code]
USE TEST
--SET SHOWPLAN_XML ON
--GO
IF OBJECT_ID('tempdb..#UNITS') IS NOT NULL DROP TABLE #UNITS
IF OBJECT_ID('tempdb..#TENS') IS NOT NULL DROP TABLE #TENS
IF OBJECT_ID('tempdb..#HUNDREDS') IS NOT NULL DROP TABLE #HUNDREDS
IF OBJECT_ID('tempdb..#THOUSANDS') IS NOT NULL DROP TABLE #THOUSANDS
IF OBJECT_ID('tempdb..#TEN_THOUSANDS') IS NOT NULL DROP TABLE #TEN_THOUSANDS
IF OBJECT_ID('tempdb..#HUNDRED_THOUSANDS') IS NOT NULL DROP TABLE #HUNDRED_THOUSANDS
IF OBJECT_ID('tempdb..#MILLIONS') IS NOT NULL DROP TABLE #MILLIONS
IF OBJECT_ID('tempdb..#PRIMES') IS NOT NULL DROP TABLE #PRIMES
IF OBJECT_ID('tempdb..#NUMS') IS NOT NULL DROP TABLE #NUMS

SET NOCOUNT ON

DECLARE @START AS DATETIME, @END AS DATETIME
SET @START = GETDATE()

CREATE TABLE #UNITS (
    Num INT NOT NULL PRIMARY KEY
)
INSERT INTO #UNITS
SELECT 1 UNION ALL
SELECT 3 UNION ALL
SELECT 7 UNION ALL
SELECT 9

CREATE TABLE #TENS (
    Num INT NOT NULL PRIMARY KEY
)
INSERT INTO #TENS
SELECT 0 UNION ALL
SELECT 10 UNION ALL
SELECT 20 UNION ALL
SELECT 30 UNION ALL
SELECT 40 UNION ALL
SELECT 50 UNION ALL
SELECT 60 UNION ALL
SELECT 70 UNION ALL
SELECT 80 UNION ALL
SELECT 90

CREATE TABLE #HUNDREDS (
    Num INT NOT NULL PRIMARY KEY
)
INSERT INTO #HUNDREDS
SELECT 0 UNION ALL
SELECT 100 UNION ALL
SELECT 200 UNION ALL
SELECT 300 UNION ALL
SELECT 400 UNION ALL
SELECT 500 UNION ALL
SELECT 600 UNION ALL
SELECT 700 UNION ALL
SELECT 800 UNION ALL
SELECT 900

CREATE TABLE #THOUSANDS (
    Num INT NOT NULL PRIMARY KEY
)
INSERT INTO #THOUSANDS
SELECT 0 UNION ALL
SELECT 1000 UNION ALL
SELECT 2000 UNION ALL
SELECT 3000 UNION ALL
SELECT 4000 UNION ALL
SELECT 5000 UNION ALL
SELECT 6000 UNION ALL
SELECT 7000 UNION ALL
SELECT 8000 UNION ALL
SELECT 9000

CREATE TABLE #TEN_THOUSANDS (
    Num INT NOT NULL PRIMARY KEY
)
INSERT INTO #TEN_THOUSANDS
SELECT 0 UNION ALL
SELECT 10000 UNION ALL
SELECT 20000 UNION ALL
SELECT 30000 UNION ALL
SELECT 40000 UNION ALL
SELECT 50000 UNION ALL
SELECT 60000 UNION ALL
SELECT 70000 UNION ALL
SELECT 80000 UNION ALL
SELECT 90000

CREATE TABLE #HUNDRED_THOUSANDS (
    Num INT NOT NULL PRIMARY KEY
)
INSERT INTO #HUNDRED_THOUSANDS
SELECT 0 UNION ALL
SELECT 100000 UNION ALL
SELECT 200000 UNION ALL
SELECT 300000 UNION ALL
SELECT 400000 UNION ALL
SELECT 500000 UNION ALL
SELECT 600000 UNION ALL
SELECT 700000 UNION ALL
SELECT 800000 UNION ALL
SELECT 900000

CREATE TABLE #MILLIONS (
    Num INT NOT NULL PRIMARY KEY
)
INSERT INTO #MILLIONS
SELECT 0 UNION ALL
SELECT 1000000 UNION ALL
SELECT 2000000 UNION ALL
SELECT 3000000 UNION ALL
SELECT 4000000 UNION ALL
SELECT 5000000 UNION ALL
SELECT 6000000 UNION ALL
SELECT 7000000 UNION ALL
SELECT 8000000 UNION ALL
SELECT 9000000

/*
CREATE TABLE #PRIMES (
    PRIME INT NOT NULL PRIMARY KEY CLUSTERED
)
*/
CREATE TABLE #NUMS (
    N INT NOT NULL PRIMARY KEY CLUSTERED,
    SR INT NOT NULL
)

INSERT INTO #NUMS(N, SR)
SELECT CAST(3 AS INT), CAST(ROUND(SQRT(3),0,1) AS INT)
UNION ALL
SELECT CAST(5 AS INT), CAST(ROUND(SQRT(5),0,1) AS INT)
UNION ALL
SELECT N, CAST(ROUND(SQRT(N),0,1) AS INT) AS SR
FROM (
    SELECT m.Num + ht.Num + tt.Num + T.Num + h.Num + t.Num + u.Num AS N
    FROM #MILLIONS AS m,
         #HUNDRED_THOUSANDS AS ht,
         #TEN_THOUSANDS AS tt,
         #THOUSANDS AS T,
         #HUNDREDS AS h,
         #TENS AS t,
         #UNITS AS u
) AS X
WHERE N > 6
    AND N % 6 IN (1, 5)

--INSERT INTO #PRIMES(PRIME)
SELECT 2 AS PRIME
UNION ALL
SELECT N1.N
FROM #NUMS AS N1
WHERE NOT EXISTS (
    SELECT 1
    FROM #NUMS AS N2
    WHERE N2.N <= N1.SR
        AND N1.N % N2.N = 0
)

--SELECT *
--FROM #PRIMES
--ORDER BY PRIME

SET @END = GETDATE()
PRINT 'Total Duration: ' + CAST(CAST(DATEDIFF(ms, @START, @END)/1000. AS decimal(7,3)) AS varchar(15)) + ' seconds'
GO
--SET SHOWPLAN_XML OFF
--GO
[/code]

A benefit of the overall design is that it scales up quite easily: just add another table for each digit, with values having one more zero than the previous digit's, then add that table to the join list.
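The place-value cross join is easy to model outside T-SQL as well. Here is a small illustrative Python sketch (names are mine, not from the post) that builds every candidate below 1,000 from one value per digit table and applies the same N > 6, N % 6 IN (1, 5) filter:

```python
from itertools import product

# One list per decimal place, mirroring #UNITS, #TENS and #HUNDREDS;
# the units place is restricted to 1, 3, 7, 9, so every generated
# number already ends in a digit a prime greater than 5 can end in.
units = [1, 3, 7, 9]
tens = [t * 10 for t in range(10)]
hundreds = [h * 100 for h in range(10)]

# The CROSS JOIN: pick one value from each place and sum them,
# then keep only N > 6 with N % 6 in (1, 5), as the T-SQL does.
candidates = sorted(h + t + u
                    for h, t, u in product(hundreds, tens, units))
candidates = [n for n in candidates if n > 6 and n % 6 in (1, 5)]
```

Adding another place (say, `thousands = [k * 1000 for k in range(10)]`, then summing four values per tuple) scales the range by a factor of ten, which is exactly the scaling property described above.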
I had no idea having to place the data in a table would incur such incredible overhead relative to the overall duration of the query.

Steve (aka smunson) :-):-):-)

Sun, 02 Aug 2009 16:39:04 GMT sgmunson

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Thank you. I never said it was the fastest, but probably the simplest. Here are the two final versions of my suggestions: the Sieve of Atkin implementation and the SquareFree implementation.

Here are the stats for the SquareFree implementation on my laptop. It scales well with respect to reads, but not CPU and Duration.

[code]
MaxPrime -> 1,000   10,000   100,000   1,000,000
Reads          550    5,384    53,500     533,867
CPU              0      125     2,465      59,436
Duration         8      239     2,609      64,250
Writes           0        0         1           1
[/code]

And with that, good night and good luck to you all! This is my final submission.

//Peter

Sun, 02 Aug 2009 16:26:51 GMT SwePeso

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Peso, I was able to scale up your original Square-Free solution to 10 million records by adding a couple more sets of values that were 16 times larger than the previous set, and propagating that through as needed, and it ran in 7:32. I then limited the last of the extra number sets to a maximum value of just over 2 million, and it ran in 5:42. I only had to add one set to your latest update, with the same limit as before, and it ran in 5:27. Correct results were verified.

Steve (aka smunson) :-):-):-)

Sun, 02 Aug 2009 16:06:29 GMT sgmunson

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]smunson (8/2/2009)[/b][hr]I've had 10 million rows ever since I installed SQL Server[/quote]Understood, and I appreciate the feedback... but that's on a home machine.
Mike's question was referring to real live production machines.

Sun, 02 Aug 2009 15:58:05 GMT Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
I've had 10 million rows ever since I installed SQL Server back in November of last year, and I use it to help generate large test data sets when I come across interesting problems that lend themselves to large-volume testing. Mind you, this is a home machine, and it's not even a server, but it does a rather nice whiz-bang job on large amounts of data when I can get all 4 cores working on something in parallel. Now if I could just find 4GB sticks of RAM that are FBDIMMs and non-ECC, and at a reasonable cost, I could get up from 8GB of RAM to 16GB.

Steve (aka smunson) :-):-):-)

[quote][b]Jeff Moden (8/2/2009)[/b][hr]Like Matt Miller, R. Barry Young, and several others on this forum, I can see the utility of having a million-row Tally table, especially if you're running SQL Server 2005 or above. But no one has yet answered Mike's question...[quote][b]Mike Ross (8/2/2009)[/b][hr]Can anyone here [b][u]honestly[/u][/b] say that they had a Tally table in [i][b]money-making[/b][/i] use (not random, recreational math experiments) of 1 million or more rows, [b][i]before[/i][/b] this contest? If so, why?[/quote]Even though I'm a bit of a Tally table zealot, I can't see generally having a Tally table of more than a million rows. I might be able to see it for some special purposes if used on a regular basis, but not in most DBs in general. So the question I have is the same as Mike's... Who can honestly say they have a Tally table of more than a million rows on a real live production box, and why? :blink: Heh... "Enquiring minds want to know." :-P[/quote]

Sun, 02 Aug 2009 15:44:58 GMT sgmunson

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Like Matt Miller, R. Barry Young, and several others on this forum, I can see the utility of having a million-row Tally table, especially if you're running SQL Server 2005 or above. But no one has yet answered Mike's question...

[quote][b]Mike Ross (8/2/2009)[/b][hr]Can anyone here [b][u]honestly[/u][/b] say that they had a Tally table in [i][b]money-making[/b][/i] use (not random, recreational math experiments) of 1 million or more rows, [b][i]before[/i][/b] this contest? If so, why?[/quote]

Even though I'm a bit of a Tally table zealot, I can't see generally having a Tally table of more than a million rows. I might be able to see it for some special purposes if used on a regular basis, but not in most DBs in general. So the question I have is the same as Mike's... Who can honestly say they have a Tally table of more than a million rows on a real live production box, and why? :blink: Heh... "Enquiring minds want to know." :-P

Sun, 02 Aug 2009 14:59:50 GMT Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Peso (8/2/2009)[/b][hr]Much faster implementation of my "squarefree" solution here...//Peso[/quote]Interesting. Thanks, Peter.

By the way, in the spirit of all that's been said in this discussion, here's my one-line way of returning all of the prime numbers some people will ever need...

[code]
SELECT * FROM dbo.PrimeNumber ORDER BY N
[/code]

...
and here's the code for all the prime numbers that I'll likely ever need...

[code]
SELECT * FROM dbo.PrimeNumber ORDER BY N

Msg 208, Level 16, State 1, Line 1
Invalid object name 'PrimeNumber'.
[/code]

;-)

Sun, 02 Aug 2009 14:49:37 GMT Jeff Moden

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
Much faster implementation of my "squarefree" solution here...//Peso

Sun, 02 Aug 2009 14:22:27 GMT SwePeso

RE: Celko's Summer SQL Stumpers: Prime numbers
http://www.sqlservercentral.com/Forums/Topic759697-1604-1.aspx
[quote][b]Mike Ross (8/2/2009)[/b][hr][quote][b]RBarryYoung (8/2/2009)[/b][hr]It gets more interesting the higher you go: at 20,000,000, the improved RBAR takes 153 sec and mine takes 78 sec. In short, mine is running almost twice as fast.[/quote]Interesting. At 10M on my machine, your [i]first[/i] run is faster, but after that yours averages 99 seconds and mine averages 88 seconds. You're probably also running on SQL 2005?[/quote]

You've hit the fallacy of such a code contest right square on the head. Identical code takes different times and has different efficiencies based on the machine it is executed on.

Phil Factor and I had a bit of a contest between ourselves on splitting all of the words in the text of "Moby Dick". On my machine, the Tally table split easily beat his WHILE loop. On his machine, just the opposite was true.

That also means that the myth of portability is just that... a performance myth. Code must be tweaked for whatever machine and RDBMS it is to be run on. While you're at it, you might as well use some of the advanced features that are not ANSI or ISO compliant to get the best performance possible.

My recommendation would be for everyone to stop bickering about how fast or slow someone's code for this contest is on [i]their[/i] machine. It will only matter how fast the code runs on the test machine that Celko and Factor elect to use. ;-)

Sun, 02 Aug 2009 14:21:50 GMT Jeff Moden