July 29, 2013 at 12:14 pm
Hi All,
I need help writing T-SQL to create comma-separated values for a table, based on a batch size.
For example, I have a table tblA with 10 records:
ID
1
2
3
..
10
Now I want to produce a column of comma-separated values, with 2 values each if I pass a batch size of '2', i.e.
1,2
3,4
..
9,10
Similarly, if I pass a batch size of 5, there will be 2 sets of records:
1,2,3,4,5
6,7,8,9,10
Please let me know how to implement this logic.
Thanks in advance.
Sam
July 29, 2013 at 1:10 pm
I did it using the following code, against the Northwind database:
WITH mycte AS
(
    SELECT NTILE(15) OVER (ORDER BY customerid) AS grp1, customerid
    FROM dbo.Customers
),
mycte2 AS
(
    SELECT DISTINCT grp1 FROM mycte
)
SELECT grp1,
       customerid = STUFF((
           SELECT ',' + customerid
           FROM mycte
           WHERE grp1 = mycte2.grp1
           ORDER BY customerid
           FOR XML PATH(''), TYPE).value('.', 'varchar(100)'), 1, 1, '')
FROM mycte2;
With NTILE I'm breaking the records into the required number of groups.
Thanks
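One caveat with this approach: NTILE(15) fixes the number of groups rather than the size of each group, so the effective batch size drifts as the row count changes. A sketch of deriving the bucket count from a batch-size parameter instead (still against Northwind's dbo.Customers; variable names here are illustrative):

```sql
-- Sketch: compute the NTILE bucket count from the desired batch size.
-- Note NTILE balances rows across buckets, so when the row count does not
-- divide evenly the earlier buckets get one extra row, rather than every
-- bucket but the last being full.
DECLARE @BatchSize INT = 5;
DECLARE @Groups INT;

SELECT @Groups = (COUNT(*) + @BatchSize - 1) / @BatchSize  -- ceiling division
FROM dbo.Customers;

SELECT NTILE(@Groups) OVER (ORDER BY customerid) AS grp1, customerid
FROM dbo.Customers;
```

If an exact batch size matters, the ROW_NUMBER-division approach in the later posts is the safer choice.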
July 31, 2013 at 3:17 pm
-- Prepare sample data
DECLARE @Sample TABLE
(
    Value INT NOT NULL
);

INSERT @Sample (Value)
SELECT Number
FROM master.dbo.spt_values
WHERE [Type] = 'P';

-- SwePeso
DECLARE @BatchSize TINYINT = 5;

WITH cteSource(Value, theGrp)
AS (
    SELECT Value,
           (ROW_NUMBER() OVER (ORDER BY Value) - 1) / @BatchSize AS theGrp
    FROM @Sample
)
SELECT d.theGrp,
       STUFF(f.Data, 1, 1, '') AS Data
FROM (
    SELECT Number AS theGrp
    FROM master.dbo.spt_values
    WHERE [Type] = 'P'
      AND Number <= (SELECT COUNT(*) FROM @Sample) / @BatchSize
) AS d
CROSS APPLY (
    SELECT ',' + CAST(x.Value AS VARCHAR(12))
    FROM cteSource AS x
    WHERE x.theGrp = d.theGrp
    ORDER BY x.Value
    FOR XML PATH('')
) AS f(Data);
N 56°04'39.16"
E 12°55'05.25"
August 6, 2013 at 8:49 pm
How about this?
To change the batch size, change the value of the @batch variable.
DECLARE @batch AS SMALLINT;
SET @batch = 6;

IF OBJECT_ID('tempdb..#sample') IS NOT NULL
    DROP TABLE #sample;

SELECT TOP 20
       ABS(id) AS number
INTO #sample
FROM master.dbo.sysobjects;

SELECT * FROM #sample;

;WITH batched AS
(
    SELECT
        batch_id = (ROW_NUMBER() OVER (ORDER BY number) - 1) / @batch,
        number
    FROM #sample
), sorted AS
(
    SELECT
        batch_id,
        sort_id = ROW_NUMBER() OVER (PARTITION BY batch_id ORDER BY number),
        number
    FROM batched
), GatherAll AS
(
    SELECT batch_id, sort_id, CAST(number AS VARCHAR(MAX)) AS Gathered
    FROM sorted
    WHERE sort_id = 1
    UNION ALL
    SELECT s.batch_id, s.sort_id,
           CAST(RTRIM(g.Gathered) + ',' + CAST(s.number AS VARCHAR(50)) AS VARCHAR(MAX))
    FROM sorted s
    INNER JOIN GatherAll g ON g.batch_id = s.batch_id AND s.sort_id = g.sort_id + 1
)
SELECT batch_id, MAX(Gathered)
FROM GatherAll
GROUP BY batch_id;
Happy Coding!!!
~~ CK
August 6, 2013 at 11:05 pm
ck9663 (8/6/2013)
How about this? To change the batch size, change the value of the @batch variable. [recursive CTE solution quoted in full above]
Please see the following article for why you might not want to use a recursive CTE to count.
Hidden RBAR: Counting with Recursive CTEs
--Jeff Moden
Change is inevitable... Change for the better is not.
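For later readers: on SQL Server 2017 and above (which postdates this thread), the same batching can be done without FOR XML PATH or a recursive CTE by using STRING_AGG. A self-contained sketch against the original poster's tblA-style data (names are illustrative):

```sql
-- Sketch only: STRING_AGG requires SQL Server 2017 or later.
DECLARE @BatchSize INT = 2;

DECLARE @tblA TABLE (ID INT NOT NULL);
INSERT @tblA (ID)
VALUES (1), (2), (3), (4), (5), (6), (7), (8), (9), (10);

WITH numbered AS
(
    -- Same grouping idea as the earlier answers: integer division of the
    -- zero-based row number by the batch size.
    SELECT ID,
           (ROW_NUMBER() OVER (ORDER BY ID) - 1) / @BatchSize AS BatchID
    FROM @tblA
)
SELECT BatchID,
       STRING_AGG(CAST(ID AS VARCHAR(12)), ',')
           WITHIN GROUP (ORDER BY ID) AS CsvValues
FROM numbered
GROUP BY BatchID;
-- With a batch size of 2 this returns the rows 1,2 / 3,4 / ... / 9,10
```

STRING_AGG aggregates per group in a single set-based pass, so it avoids the row-by-row growth Jeff's article describes for recursive CTE concatenation.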