Viewing 15 posts - 91 through 105 (of 245 total)
SQL 2000 does not have a perfect query optimizer.
My query is against a terabyte-plus of data and the joins in question contain millions of rows. I have...
December 28, 2006 at 4:54 pm
thanks
or
select *
from table1
join (table2 join table3
      on table2.col = table3.col)
on table2.col = table1.col
December 28, 2006 at 9:43 am
trying to improve performance of query
several multi-million row tables joined together; I want to experiment with the join order structure.
I thought you could pre-join certain chunks of the FROM clause,
something...
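One way to sketch this (table and column names here are placeholders, not from the original query): SQL Server lets you group joins with parentheses, and the FORCE ORDER hint tells the optimizer to keep the joins in the order they are written, which is the usual way to experiment with join order rather than relying on syntax alone.

```sql
-- Hypothetical tables and columns. FORCE ORDER makes the optimizer
-- honour the join order as written instead of reordering it itself.
select t1.*
from table2
join table3
    on table2.col = table3.col   -- the "pre-joined" chunk
join table1 t1
    on t1.col = table2.col
option (force order)
```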
December 28, 2006 at 9:24 am
for what it's worth, Frank's function runs fastest in my SQL Express installation on my creaking laptop
(11000 row(s) affected)
(100000 row(s) affected)
--===== Frank's function
DBCC execution completed. If DBCC printed...
December 15, 2006 at 6:54 am
if you're using SQL 2005, which I suspect you might be given your name! You should use TRY...CATCH blocks, which are much neater and much more efficient, as you don't...
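For reference, the SQL 2005 pattern being suggested looks like this (the table name is a placeholder):

```sql
begin try
    -- hypothetical statement that may fail
    insert into dbo.orders (order_id) values (1)
end try
begin catch
    -- ERROR_NUMBER()/ERROR_MESSAGE() are only meaningful inside the CATCH block
    select error_number() as err_no, error_message() as err_msg
end catch
```

Unlike checking @@ERROR after every statement, the CATCH block fires once for any error in the TRY block, which is where the tidiness comes from.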
December 14, 2006 at 11:07 am
your database design isn't right. You shouldn't be storing the same data in two different tables because you are likely to have problems with consistency.
why not just add a bit...
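The suggestion appears to be a bit flag on the single table instead of a duplicated copy of the rows; a minimal sketch, with guessed table and column names:

```sql
-- Instead of copying rows into a second table, flag them in place.
alter table dbo.customers
    add is_archived bit not null default 0

-- Mark a row rather than duplicating it (hypothetical row)
update dbo.customers
set is_archived = 1
where customer_id = 42
```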
December 14, 2006 at 11:02 am
'so prioritise those in a certain order.' Not sure what you mean by this. This code will run fast, much, much faster than a cursor.
December 8, 2006 at 3:46 am
The version above only distinguishes between transactions on different days. The version below will do it down to milliseconds, which is probably what you want.
---DDL--
if
December 7, 2006 at 10:11 am
---DDL--
if
object_id('test_donations') is not null
December 7, 2006 at 10:07 am
select min(transaction_date), customer_id
from
(
select
transaction_date, communication_id, customer_id
from
test_donations
union
December 7, 2006 at 9:07 am
---DDL--
if
object_id('test_donations') is not null drop...
December 7, 2006 at 6:08 am
Read this. The platform they are using is MySQL, but the theory is identical for all RDBMSs, and it has a nice worked example:
http://dev.mysql.com/tech-resources/articles/intro-to-normalization.html
This is a bit more advanced
December 7, 2006 at 5:31 am
--try this. note that the table variable definition must fit the select statement; I have guessed at some of the data types.
declare
@start int
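The table-variable approach being hinted at, sketched with guessed names and data types (a common SQL 2000-era paging pattern: number the rows with an IDENTITY column, then select a range):

```sql
declare @start int, @end int
set @start = 11
set @end = 20

-- Guessed column names/types; adjust to fit your select statement.
declare @paged table
(
    row_no int identity(1,1) primary key,
    customer_id int,
    transaction_date datetime
)

insert into @paged (customer_id, transaction_date)
select customer_id, transaction_date
from test_donations
order by transaction_date   -- ORDER BY here fixes the identity numbering

select customer_id, transaction_date
from @paged
where row_no between @start and @end
```

On SQL 2005 and later, ROW_NUMBER() makes the intermediate table unnecessary, but the table-variable version works on 2000.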
November 29, 2006 at 7:49 am
post the select statement that returns the record set you want paged into chunks.
November 29, 2006 at 7:35 am
why not use a union
so
select * from table1 join table2 on table1.col = yourcondition
union
select * from table1 join table3 on table1.col = yourcondition
November 29, 2006 at 7:26 am