SQLServerCentral
column to calculate the balance stock dynamically
Posted Tuesday, March 3, 2009 9:20 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 8:58 PM
Points: 36,794, Visits: 31,253
beezell (3/3/2009)
Hmmm... there are very few scenarios where I've seen cursors faster than CTE or non-RBAR (row by agonizing row) methods. Just curious how many rows you're running through?

Cheers,
Brian


That's because what looked like set-based code is actually RBAR on steroids known as a "Triangular Join", and it will cripple a server even at rowcounts as low as 20,000 rows. Read all about it in the following link...

http://www.sqlservercentral.com/articles/T-SQL/61539/
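The triangular join in question can be sketched like this (a hypothetical illustration only; the Sales table here is the one built later in this thread, not the original poster's schema):

```sql
-- Hypothetical sketch of a "Triangular Join" running total.
-- The correlated subquery re-reads every preceding row for each outer row,
-- so N rows cost roughly N*(N+1)/2 internal row reads: RBAR in disguise.
SELECT s1.DayCount,
       (SELECT SUM(s2.Sales)
          FROM Sales s2
         WHERE s2.DayCount <= s1.DayCount) AS RunningTotal
  FROM Sales s1
 ORDER BY s1.DayCount
```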


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #667817
Posted Tuesday, March 3, 2009 9:26 PM


SSC-Insane


Group: General Forum Members
Last Login: Yesterday @ 7:45 PM
Points: 23,081, Visits: 31,620
Jeff Moden (3/3/2009)
Lynn Pettis (3/3/2009)
Oh, wow. Looks like I may have beat Jeff. But of course, he is probably setting up a million row example to demonstrate his code, where I just took what was presented and went from there. Jeff's code will also be heavily commented as well.


Heh... you know me all too well, Brother Lynn! :D Although Mathew did a very good job of attaching sample data, the column names are quite a bit different from those posted, and I also wanted to show a shorter way to post sample data. And, as you so very well put it, I was playing with my typical million row example and then saw you beat me to answering this. The only thing you forgot was the TABLOCKX to keep people out of the table while the update is occurring. :)

If anyone wants to see the million row update to make a running total, you'll just have to wait until I can publish the rewritten article.


You can also find the same code in my article, but it lacks all the detail behind it, as my work was based on Jeff's. (Sorry, shameless plugs.)



Lynn Pettis

For better assistance in answering your questions, click here
For tips to get better help with Performance Problems, click here
For Running Totals and its variations, click here or when working with partitioned tables
For more about Tally Tables, click here
For more about Cross Tabs and Pivots, click here and here
Managing Transaction Logs

SQL Musings from the Desert Fountain Valley SQL (My Mirror Blog)
Post #667819
Posted Tuesday, March 3, 2009 9:56 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 8:58 PM
Points: 36,794, Visits: 31,253
Samuel Vella (2/26/2009)
mh, your solution is fine but when you test it with a lot of data against a cursor, the cursor is orders of magnitude faster


Man, you have absolutely the correct idea... you put your money where your mouth was with code! Well done!

Just to share a few tricks with someone who actually tests code for performance, here's your test code generator...
DROP TABLE Sales
CREATE TABLE Sales (DayCount smallint, channel varchar(50), Sales money)
CREATE CLUSTERED INDEX ndx_DayCount ON Sales (DayCount)
GO

SET NOCOUNT ON

DECLARE @DayCount smallint, @Sales money, @channel varchar(50)
SET @DayCount = 0
SET @Sales = 10
SET @channel = 'a'

WHILE @DayCount < 10000
BEGIN
    INSERT Sales VALUES (@DayCount, @channel, @Sales)
    SET @DayCount = @DayCount + 1
    SET @Sales = @Sales + 15
    SET @channel = CASE
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 0 THEN 'a'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 1 THEN 'b'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 2 THEN 'c'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 3 THEN 'd'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 4 THEN 'e'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 5 THEN 'f'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 6 THEN 'g'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 7 THEN 'h'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 8 THEN 'i'
                       WHEN RIGHT(CAST(@DayCount AS varchar), 1) = 9 THEN 'j'
                   END
END

It does a fine job of making 10,000 rows in somewhere just over 3 seconds on my 6-year-old box... but what if you wanted a million rows of that same example just to make sure? The following code makes the same kind of data example, but it makes 100 times more rows in only about 4-5 seconds more...

--===== Conditionally drop the test table
IF OBJECT_ID('TempDB..#Sales','U') IS NOT NULL
DROP TABLE #Sales

--===== Populate the table with a million rows of data similar to yours.
-- This ISNULL is to make the resulting DayCount column NOT NULL so
-- we can put a primary key on it later.
SELECT TOP 1000000
ISNULL(CAST(ROW_NUMBER() OVER (ORDER BY sc1.ID) - 1 AS INT),0) AS DayCount,
CAST(ROW_NUMBER() OVER (ORDER BY sc1.ID) *15 AS MONEY) AS Sales,
CHAR(CAST(ROW_NUMBER() OVER (ORDER BY sc1.ID) - 1 AS INT)%10+ASCII('a')) AS Channel
INTO dbo.#Sales
FROM Master.sys.SysColumns sc1
CROSS JOIN Master.sys.SysColumns sc2

--===== Add the primary key
-- I don't name these on temp tables because they must be unique.
ALTER TABLE dbo.#Sales
ADD PRIMARY KEY CLUSTERED (DayCount)

Like I said, well done on the testing... now you have a way to do some really heavy duty testing that no one will be able to argue with.


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #667829
Posted Tuesday, March 3, 2009 9:57 PM


SSC-Dedicated


Group: General Forum Members
Last Login: Yesterday @ 8:58 PM
Points: 36,794, Visits: 31,253
Lynn Pettis (3/3/2009)
...
You can also find the same code in my article, but it lacks all the detail behind it, as my work was based on Jeff's. (Sorry, shameless plugs.)


Shameless or not, it's a great article... people should go take a look!


--Jeff Moden
"RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #667832
Posted Monday, September 10, 2012 2:56 PM
Forum Newbie


Group: General Forum Members
Last Login: Friday, June 6, 2014 1:15 PM
Points: 6, Visits: 30
Hey, just wanted to thank you guys for this thread. I have been working on a weighted average cost calculation for two or three days, and stumbled across this link. I've been trying to avoid doing a cursor calculation as I have over 300,000 different inventory items I have to walk through (over 73 million rows total). As you can see from the comments I left in, I shamelessly copied Jeff's code and modified it to fit my needs. Thanks a million.

Randy Stone

CREATE TABLE #WACTran(
[TranOrder] INT PRIMARY KEY CLUSTERED,
[TranType] VARCHAR(3) NULL,
[Qty] INT NULL,
[UnitCost] DECIMAL(14, 4) NULL,
[ExtendedCost] DECIMAL(14, 4) NULL,
[QtyOnHand] INT
)




INSERT INTO [#WACTran](TranOrder, TranType, Qty, UnitCost, ExtendedCost)
VALUES
(0 ,'BB',100 ,1.5000, 150.0000 )
,(1 ,'S',-50 ,0.0000, 0.0000 )
,(2 ,'RCT',15 ,0.0000, 0.0000 )
,(3 ,'S',-5 ,0.0000, 0.0000 )
,(4 ,'VC',15 ,1.6000, 0.0000 )
,(5 ,'S',-5 ,0.0000, 0.0000 )
,(6 ,'TO',-4 ,0.0000, 0.0000 )
,(7 ,'TI',2 ,0.0000, 0.0000 )
,(8 ,'VTI',2 ,1.6500, 0.0000 )
,(9 ,'RCT',10 ,0.0000, 0.0000 )
,(10,'S',-5 ,0.0000, 0.0000 )
,(11,'VC',10 ,1.7500, 0.0000 )
,(12,'S',-5 ,0.0000, 0.0000 )
,(13,'TO',-4 ,0.0000, 0.0000 )
,(14,'TI',2 ,0.0000, 0.0000 );



--===== Declare some obviously named variables
DECLARE @PrevTranOrder INT,
@PrevQtyOnHand INT,
@PrevUnitCost DECIMAL(14,4),
@PrevExtendedCost DECIMAL(14,4)
--===== Do the "quirky" update using a very high speed pseudo-cursor,
-- This is very similar to what you would do in a language like "C" except the
-- "Read a row/Write a row" is built into the update.
UPDATE #WACTran
SET @PrevQtyOnHand = QtyOnHand = CASE
WHEN TranType = 'BB'
THEN Qty
WHEN TranType IN ('VTI','VC')
THEN @PrevQtyOnHand
ELSE @PrevQtyOnHand + Qty
END,
@PrevUnitCost = UnitCost = CASE
WHEN TranType = 'BB'
THEN UnitCost
WHEN TranType IN ('VTI','VC')
THEN (((@PrevQtyOnHand - Qty)*@PrevUnitCost)+(Qty*UnitCost))/@PrevQtyOnHand
ELSE @PrevUnitCost
END,
@PrevExtendedCost = ExtendedCost = CASE
WHEN TranType = 'BB'
THEN ExtendedCost
ELSE @PrevUnitCost*@PrevQtyOnHand
END,
@PrevTranOrder = TranOrder --Just an "anchor", but gotta have it to guarantee things.
FROM [#WACTran] WITH (INDEX(0), TABLOCKX)

--===== Display the results
SELECT *
FROM [#WACTran]
ORDER BY TranOrder

DROP TABLE #WACTran;


Post #1357024
Posted Tuesday, September 11, 2012 1:14 PM


SSCommitted


Group: General Forum Members
Last Login: Yesterday @ 7:55 AM
Points: 1,945, Visits: 2,863
The short answer is that this is a running total problem, which can be done with SUM() OVER (PARTITION BY .. ORDER BY .. ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW).

The better answer is that the DDL you attached is full of fundamental design errors. I also loved the leading commas, which I use as a symptom to diagnose bad SQL. Your mindset is still stuck in punch cards and sequential magnetic tape files. Trust me; I have been at this for a few decades.

Why is your vague “<nothing in particular>_qty” a FLOAT? Is it the on hand quantity, the back order quantity, the sold quantity? A fifty character “<nothing in particular>_status”? Invite crap data like this and it will come. You have no keys, so this is not a table at all.

CREATE TABLE Stockroom
(item_name VARCHAR(25) NOT NULL,
 trans_date DATE DEFAULT CURRENT_TIMESTAMP NOT NULL,
 transaction_qty INTEGER NOT NULL
    CHECK (transaction_qty <> 0));

The “id” and “seq” were obviously redundant, caused by an attempt to mimic a mag tape record number in SQL. But so is the transaction type. Issuing something is a negative quantity; stocking it is positive. Use the sign of the quantity to put that fact into one column.

INSERT INTO Stockroom
VALUES
('ACTIFED', '2004-01-01', +100),
('ACTIFED', '2004-03-28', -5),
('ACTIFED', '2004-04-01', -5),
('ACTIFED', '2004-05-05', -5),
('ACTIFED', '2004-05-10', -5),
('AGIOLAX', '2004-01-01', +50),
('AGIOLAX', '2004-02-12', -10),
('AGIOLAX', '2004-03-28', +50),
('AGIOLAX', '2004-05-05', -25),
('OTRIVIN CHILD 0.5% NDrops', '2004-01-01', +200),
('OTRIVIN CHILD 0.5% NDrops', '2004-03-28', +50),
('OTRIVIN CHILD 0.5% NDrops', '2004-04-01', -15),
('OTRIVIN CHILD 0.5% NDrops', '2004-05-05', -10);

SELECT item_name, trans_date,
       SUM(transaction_qty)
         OVER (PARTITION BY item_name ORDER BY trans_date
               ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)
         AS onhand_qty
  FROM Stockroom;


Books in Celko Series for Morgan-Kaufmann Publishing
Analytics and OLAP in SQL
Data and Databases: Concepts in Practice
Data, Measurements and Standards in SQL
SQL for Smarties
SQL Programming Style
SQL Puzzles and Answers
Thinking in Sets
Trees and Hierarchies in SQL
Post #1357649
Posted Tuesday, September 11, 2012 1:27 PM


SSC-Insane


Group: General Forum Members
Last Login: Yesterday @ 7:45 PM
Points: 23,081, Visits: 31,620
CELKO (9/11/2012)
The short answer is that this is a running total problem, which can be done with SUM() OVER (PARTITION BY .. ORDER BY .. ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW).
...


Quite true, Mr. Celko, IF (and the operative word here is IF) we were using SQL Server 2012. This is a SQL Server 2005 forum, so I would suggest, sir, that you keep your solutions to something that will work in the version of SQL Server that the individual is currently using. It might be nice to show someone what they are missing, but it doesn't help solve the current problem, now does it?



Lynn Pettis

For better assistance in answering your questions, click here
For tips to get better help with Performance Problems, click here
For Running Totals and its variations, click here or when working with partitioned tables
For more about Tally Tables, click here
For more about Cross Tabs and Pivots, click here and here
Managing Transaction Logs

SQL Musings from the Desert Fountain Valley SQL (My Mirror Blog)
Post #1357660
Posted Thursday, September 13, 2012 8:11 AM
Forum Newbie


Group: General Forum Members
Last Login: Friday, June 6, 2014 1:15 PM
Points: 6, Visits: 30
Actually, I am using 2012; but I'm not doing a simple running total calculation. I'm doing a weighted average cost calculation, and I have to keep track of three aggregate values that are inter-related but do not always change at the same time:

QtyOnHand, which is the running total of Qty; this changes on all TranTypes except Verifications (VTI, VC).

UnitCost, which is an average based on the equation ((Qty_1 * UnitCost_1) + ... + (Qty_n * UnitCost_n)) / SUM(Qty); this changes with Verifications, and may also change on other incoming TranTypes if the UnitCost is known (sometimes it is, sometimes it isn't).

ExtendedCost, which is QtyOnHand * UnitCost (but remember that UnitCost is being calculated at the same time).

The code I posted above doesn't include my group by, as I'm still working on that one. My parts are grouped by:

Branch (store branch)
PartNo (part number)
SKey (a subset of the part number inventory that is tagged and put aside for an order); Inventory not currently tagged has an SKey of 0.
And everything is ordered by date.

The fun part comes in because even though the WAC (Weighted Average Cost) is calculated at the Branch level, sometimes SKeys include more than one branch's running totals - for example, Branch 12 may order parts from Branches 11 and 13 to fill an order, and the unit cost on parts from Branch 11 can be different from the unit cost for parts from Branch 13. So far, so good, but wait - there's more.

Just to make things interesting, the unit cost of the parts from Branch 13 may change AFTER they were sent, because their original unit cost had not been verified yet. (We receive parts on an ongoing basis from our vendors, and may not know about a price increase until we get the invoice at the end of the month). I'm still working on how to get this change into my calculations in the proper order.
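For what it's worth, the plain QtyOnHand portion could be sketched on SQL Server 2012 with a windowed SUM. This is only a sketch: the table and column names (dbo.PartTran, Branch, PartNo, SKey, TranDate, Qty) are assumptions taken from the description above, not the real schema, and the inter-related UnitCost average still can't be expressed as a single window aggregate because each row's value depends on a previously computed result:

```sql
-- Sketch only: all object and column names here are assumptions
-- based on the prose above, not an actual schema.
SELECT Branch, PartNo, SKey, TranDate, Qty,
       SUM(Qty) OVER (PARTITION BY Branch, PartNo, SKey
                      ORDER BY TranDate
                      ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS QtyOnHand
  FROM dbo.PartTran
 ORDER BY Branch, PartNo, SKey, TranDate
```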

Post #1358590
Posted Thursday, September 13, 2012 8:25 AM


SSC-Insane


Group: General Forum Members
Last Login: Yesterday @ 7:45 PM
Points: 23,081, Visits: 31,620
rc_stone_1 (9/13/2012)
Actually, I am using 2012; ...



May I suggest that you post your question in a more appropriate forum in the future? You did post this in a SQL Server 2005 forum, so you are more likely to receive a solution that will work in SQL Server 2005. Using SQL Server 2012 opens doors to other options to help solve your problem that could be more efficient.



Lynn Pettis

For better assistance in answering your questions, click here
For tips to get better help with Performance Problems, click here
For Running Totals and its variations, click here or when working with partitioned tables
For more about Tally Tables, click here
For more about Cross Tabs and Pivots, click here and here
Managing Transaction Logs

SQL Musings from the Desert Fountain Valley SQL (My Mirror Blog)
Post #1358602
Posted Thursday, September 13, 2012 8:37 AM
Forum Newbie

Forum NewbieForum NewbieForum NewbieForum NewbieForum NewbieForum NewbieForum NewbieForum Newbie

Group: General Forum Members
Last Login: Friday, June 6, 2014 1:15 PM
Points: 6, Visits: 30
Been there, done that; Celko's solution works with a simple running total, but I wasn't able to make it work correctly with the WAC calculation unless I ran it three separate times - once for QtyOnHand, once to calculate the weighted average of UnitCost, and once to multiply those together to get the ExtendedCost. I just wanted to express my thanks for giving me a more elegant solution.
Post #1358616