September 21, 2007 at 4:15 am
Hi,
I know there is no recursive SQL in SQL Server 2000.
My problem is that I have this table:
matricul;company;Activity;date_strat;date_end;number_of_day
1;AA;20070926;20070928;3
1;AA;20071001;20071006;6
1;AA;20071007;20071007;1
I would like to write a SQL query that returns:
1;AA;20070926;20070928;3
1;AA;20071001;20071007;7
If there is only one day between date_end and the next line's date_strat for two consecutive lines, I have to sum number_of_day; otherwise I don't.
Thanks a lot.
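For anyone following along, here is the sample data as a runnable script. Table and column names are taken from the post, with date_strat normalized to date_start to match the scripts in the replies below; datetime columns are assumed, and Activity is omitted because the data rows carry no value for it:

--hypothetical repro script for the posted sample data
create table matric (matricul int, company char(2), date_start datetime, date_end datetime, number_of_day int)

insert matric values (1, 'AA', '20070926', '20070928', 3)
insert matric values (1, 'AA', '20071001', '20071006', 6)
insert matric values (1, 'AA', '20071007', '20071007', 1)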
September 21, 2007 at 7:05 am
So... if there are, say, 7 rows where the start date of each row is 1 greater than the end date of the previous row, like this...
1;AA;20070921;20070922;10
1;AA;20070922;20070923;20
1;AA;20070923;20070924;30
1;AA;20070924;20070925;40
1;AA;20070925;20070926;50
1;AA;20070926;20070927;60
1;AA;20070927;20070928;70
... would the result be a single row that looks like this?
...
1;AA;20070921;20070928;280
--Jeff Moden
Change is inevitable... Change for the better is not.
September 21, 2007 at 7:14 am
Hi, not exactly...
I have to sum only if the current line and the next line are one day apart. If there is more than one day between them, they are not part of the same sum, as in my example:
1;AA;20070921;20070922;10
1;AA;20070922;20070923;20
1;AA;20070924;20070924;40
1;AA;20070926;20070927;60
1;AA;20070927;20070928;70
The result must be :
1;AA;20070921;20070923;30
1;AA;20070924;20070924;40
1;AA;20070926;20070928;130
Thanks
September 21, 2007 at 8:14 am
This should be a functional one, though not a well-performing one on large sets (because of the nested subquery). But it does get you out of the recursive business.
It uses an additional table. I didn't define indexes, but performance will be greatly enhanced by indexing date_start and date_end (on both tables).
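For instance, something along these lines (the index names are mine; run these after the create table statements below):

create index ix_matric_start on matric (date_start)
create index ix_matric_end on matric (date_end)
create index ix_endmatric_start on endmatric (date_start)
create index ix_endmatric_end on endmatric (date_end)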
--set up process
create table matric (matricul int, company char(2), date_start datetime, date_end datetime, number_of_day int)
create table endmatric (matricul int, company char(2), date_start datetime, date_end datetime, number_of_day int)

--Insert your testing data here

delete from endmatric --just to not have garbage from before

--find all the breaks in days first
insert endmatric (matricul, company, date_start)
select m.matricul, m.company, m.date_start
from matric m
left outer join matric m1 on m.date_start = m1.date_end
where m1.matricul is null

--here are the results.
select e.matricul, e.company, e.date_start, max(m.date_end), sum(m.number_of_day)
from matric m, endmatric e
where e.date_start = (select max(date_start) from endmatric e1 where e1.date_start <= m.date_start) --this is going to SUCK performance-wise
group by e.matricul, e.company, e.date_start
order by e.date_start
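A quick way to exercise it, using the repro rows from the first post. One thing to watch (my note, not from the post): the m.date_start = m1.date_end join links rows that share a boundary day, as in the second set of sample data; the first sample chains rows by date_start = date_end + 1, so for that data the join would need m.date_start = dateadd(day, 1, m1.date_end):

--load the first post's sample, then run the delete/insert/select above
delete from matric
insert matric values (1, 'AA', '20070926', '20070928', 3)
insert matric values (1, 'AA', '20071001', '20071006', 6)
insert matric values (1, 'AA', '20071007', '20071007', 1)
--expected output (with the dateadd tweak): 1, AA, 20070926, 20070928, 3
--                                     and: 1, AA, 20071001, 20071007, 7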
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
September 21, 2007 at 8:48 am
Second version: should do better on large sets, as long as you don't have incredibly long "unbroken streaks". Same remark re: indexes.
This one is iterative in nature, doing n update operations (where n = the length of the longest chain). So if your longest chain is very long, expect a correspondingly large number of passes through the loop.
delete from endmatric

--seed endmatric with the chain-starting rows, carrying all columns this time
insert endmatric (matricul, company, date_start, date_end, number_of_day)
select m.matricul, m.company, m.date_start, m.date_end, m.number_of_day
from matric m
left outer join matric m1 on m.date_start = m1.date_end
where m1.matricul is null

declare @rows int
set @rows = 1

--each pass extends every chain by one link; stop when nothing changes
WHILE @rows > 0
Begin
    update endmatric
    set date_end = matric.date_end,
        number_of_day = endmatric.number_of_day + matric.number_of_day
    from endmatric
    inner join matric on endmatric.date_end = matric.date_start
    set @rows = @@rowcount
End

select * from endmatric order by date_start
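One caveat (my observation, not from the post): a row whose date_start equals its own date_end, like 1;AA;20070924;20070924;40 in the second sample, keeps satisfying endmatric.date_end = matric.date_start after it has been absorbed, so the update keeps matching it, @rows never reaches 0, and number_of_day keeps growing. Here is a sketch of an update step that strictly advances date_end on every pass, assuming the date_start = date_end + 1 adjacency of the first sample (the seeding insert's left join would need the same dateadd):

--hypothetical terminating variant of the loop body
update endmatric
set date_end = matric.date_end,
    number_of_day = endmatric.number_of_day + matric.number_of_day
from endmatric
inner join matric
    on matric.date_start = dateadd(day, 1, endmatric.date_end) --always moves date_end forward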
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
September 21, 2007 at 9:32 am
I think I was able to do this as a set-based operation by joining the table against itself.
Take a look at the sample data and my results, and tell me if this is doing what you expect.
--because you need to add a day to DATE_END and compare it to DATE_STRAT to determine whether rows are consecutive,
--these must be datetime or int values... if they are varchars, you designed your table wrong. Correct it.
DECLARE @SAMPLE TABLE(MATRICUL INT ,COMPANY VARCHAR(30),ACTIVITY VARCHAR(30),DATE_STRAT DATETIME,DATE_END DATETIME, NUMBER_OF_DAY int)
INSERT INTO @SAMPLE (MATRICUL,COMPANY,DATE_STRAT,DATE_END,NUMBER_OF_DAY)
SELECT 1,'AA','20070926','20070928',3 UNION
SELECT 1,'AA','20071001','20071006',6 UNION
SELECT 1,'AA','20071007','20071007',1 UNION
SELECT 1,'AA','20070101','20070212',3 UNION
SELECT 1,'AA','20070214','20070215',6 UNION
SELECT 1,'AA','20070216','20070221',6
SELECT
MIN(START.DATE_STRAT),
COALESCE(MAX(ENDER.DATE_END),MAX(START.DATE_END))
FROM @SAMPLE START
LEFT OUTER JOIN @SAMPLE ENDER ON START.DATE_END + 1 = ENDER.DATE_STRAT
WHERE ENDER.DATE_STRAT =START.DATE_END + 1 OR ENDER.DATE_STRAT IS NULL
GROUP BY START.DATE_STRAT
Results (start date, end date):
This kind of requires that DATE_STRAT is unique; also note that rows in the middle of a chain still come back as their own groups (2007-02-16 and 2007-10-07 below), which is kind of misleading.
| 2007-01-01 00:00:00.000 | 2007-02-12 00:00:00.000 |
| 2007-02-14 00:00:00.000 | 2007-02-21 00:00:00.000 |
| 2007-02-16 00:00:00.000 | 2007-02-21 00:00:00.000 |
| 2007-09-26 00:00:00.000 | 2007-09-28 00:00:00.000 |
| 2007-10-01 00:00:00.000 | 2007-10-07 00:00:00.000 |
| 2007-10-07 00:00:00.000 | 2007-10-07 00:00:00.000 |
Lowell
September 21, 2007 at 10:36 am
The above also does not take into consideration three or more rows that should condense into a single time slot; you'd need to handle that differently, most likely with Jeff Moden's famous Tally table, but I haven't envisioned that solution yet, as your sample data was limited for the example.
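For completeness, here is one set-based sketch of the any-length condensation, reusing the @SAMPLE table from the query above. This is the classic start-of-chain / end-of-chain pattern rather than a Tally-table solution; it assumes the DATE_END + 1 = DATE_STRAT adjacency, unique DATE_STRAT values, and a single MATRICUL/COMPANY group, as in the query above:

--chain starts: no row ends the day before; chain ends: no row starts the day after
SELECT grp.DATE_STRAT, grp.DATE_END, SUM(d.NUMBER_OF_DAY) AS NUMBER_OF_DAY
FROM (
    SELECT s.DATE_STRAT, MIN(e.DATE_END) AS DATE_END
    FROM @SAMPLE s
    INNER JOIN @SAMPLE e
        ON e.DATE_END >= s.DATE_STRAT
       AND NOT EXISTS (SELECT * FROM @SAMPLE n WHERE n.DATE_STRAT = e.DATE_END + 1)
    WHERE NOT EXISTS (SELECT * FROM @SAMPLE p WHERE p.DATE_END + 1 = s.DATE_STRAT)
    GROUP BY s.DATE_STRAT
) grp
INNER JOIN @SAMPLE d
    ON d.DATE_STRAT BETWEEN grp.DATE_STRAT AND grp.DATE_END
GROUP BY grp.DATE_STRAT, grp.DATE_END

On the sample above this folds 20071001-20071006 and 20071007-20071007 into a single 20071001-20071007 row with NUMBER_OF_DAY = 7, and chains of three or more rows condense the same way.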
Lowell
September 21, 2007 at 6:05 pm
Thank you for the feedback, and I have a very high-speed solution in mind... but I need to know what you want done with the data that I posted. I understand what you want done with the data that you posted, but I need to cover the contingency for the data I posted, please.
--Jeff Moden
Change is inevitable... Change for the better is not.