Query Help

  • Hi, I have an issue; maybe one of you can help me.
    I have the amount table below:

    date         amount
    1/1/2016     1000
    20/5/2016    1000
    20/7/2016    3000

    I need to write a query that returns the accumulated amount table:

    1/1/2016     1000
    20/5/2016    2000
    20/7/2016    5000

  • RazLinky - Sunday, July 8, 2018 2:56 AM

    SELECT [Date], SUM(Amount) OVER (ORDER BY [Date]) AS [Sum] FROM [Table];
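
    Applied to the sample data above, that pattern looks like this (a minimal sketch; the table name dbo.Amounts and the column names are placeholders, not the OP's actual schema):

    CREATE TABLE dbo.Amounts ([Date] DATE NOT NULL, Amount INT NOT NULL);

    INSERT INTO dbo.Amounts ([Date], Amount)
    VALUES ('20160101', 1000), ('20160520', 1000), ('20160720', 3000);

    -- Running total in date order
    SELECT
      [Date]
     ,SUM(Amount) OVER (ORDER BY [Date]) AS RunningTotal
    FROM dbo.Amounts;

    -- Returns 1000, 2000 and 5000, matching the accumulated table the OP asked for.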

  • Here is an example; it should be enough to get you over this hurdle.
    😎

    Feel free to ping back if you need any further assistance.


    USE TEEST;  -- scratch database used for the example
    GO
    SET NOCOUNT ON;
    --
    -- Sample data: two series (TID) of dated amounts
    ;WITH SAMPLE_DATA(TID,TDATE,TVALUE) AS
    (
      SELECT 1,CONVERT(DATE,'20160101',120), 1000 UNION ALL
      SELECT 1,CONVERT(DATE,'20160520',120), 1000 UNION ALL
      SELECT 1,CONVERT(DATE,'20160720',120), 3000 UNION ALL
      SELECT 2,CONVERT(DATE,'20160101',120), 3000 UNION ALL
      SELECT 2,CONVERT(DATE,'20160520',120), 5000 UNION ALL
      SELECT 2,CONVERT(DATE,'20160720',120), 2000
    )
    -- Running total per TID, in date order, with the frame stated explicitly
    SELECT
      SD.TID
     ,SD.TDATE
     ,SD.TVALUE
     ,SUM(SD.TVALUE) OVER
       (
        PARTITION BY SD.TID
        ORDER BY SD.TDATE ASC
        ROWS BETWEEN UNBOUNDED PRECEDING
          AND CURRENT ROW
       ) AS ACCUM_SUM
    FROM SAMPLE_DATA SD;

    Output

    TID  TDATE       TVALUE  ACCUM_SUM
    1    2016-01-01  1000    1000
    1    2016-05-20  1000    2000
    1    2016-07-20  3000    5000
    2    2016-01-01  3000    3000
    2    2016-05-20  5000    8000
    2    2016-07-20  2000    10000

  • RAJIVR67 - Sunday, July 8, 2018 3:15 AM

    You can try this method as well, alongside Eirikur's method.

    ;WITH SAMPLE_DATA(TID,TDATE,TVALUE) AS
    (
      SELECT 1,CONVERT(DATE,'20160101',120), 1000 UNION ALL
      SELECT 1,CONVERT(DATE,'20160520',120), 1000 UNION ALL
      SELECT 1,CONVERT(DATE,'20160720',120), 3000 UNION ALL
      SELECT 2,CONVERT(DATE,'20160101',120), 3000 UNION ALL
      SELECT 2,CONVERT(DATE,'20160520',120), 5000 UNION ALL
      SELECT 2,CONVERT(DATE,'20160720',120), 2000
    )
    SELECT
      SD.TID
     ,SD.TDATE
     ,SD.TVALUE
     ,SUM(SD.TVALUE) OVER (PARTITION BY SD.TID ORDER BY SD.TID, SD.TDATE) AS ACCUM_SUM
    FROM SAMPLE_DATA SD
    ORDER BY SD.TID, SD.TDATE;

    Saravanan

  • saravanatn - Sunday, July 8, 2018 7:19 AM

    Be careful here. First of all, using the partitioning column in the ORDER BY clause is futile, as there will only be one value of it within each partition. Secondly, relying on the default framing rather than defining the framing explicitly may produce unexpected results. Best practice when using the windowing functions is to define all parameters fully rather than depending on the defaults.
    😎
    As we know from painful experience, default values and SQL Server have proven to be a poisonous mixture 😉
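
    For reference, a corrected version of the query above (same sample data, trimmed to one TID; this simply restates the framing from the earlier example rather than introducing a new method):

    ;WITH SAMPLE_DATA(TID,TDATE,TVALUE) AS
    (
      SELECT 1,CONVERT(DATE,'20160101',120), 1000 UNION ALL
      SELECT 1,CONVERT(DATE,'20160520',120), 1000 UNION ALL
      SELECT 1,CONVERT(DATE,'20160720',120), 3000
    )
    SELECT
      SD.TID
     ,SD.TDATE
     ,SD.TVALUE
     ,SUM(SD.TVALUE) OVER
       (
        PARTITION BY SD.TID        -- TID belongs here only
        ORDER BY SD.TDATE ASC      -- no need to repeat TID
        ROWS BETWEEN UNBOUNDED PRECEDING
          AND CURRENT ROW          -- frame made explicit
       ) AS ACCUM_SUM
    FROM SAMPLE_DATA SD
    ORDER BY SD.TID, SD.TDATE;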

  • Eirikur Eiriksson - Sunday, July 8, 2018 7:32 AM

    Agreed, Eirikur. That's why I asked the OP to use your method. Partitioning and ordering by the same column can be painful because of RANGE framing. I would like to make the OP aware that a problem can be solved by more than one method 🙂.

    Saravanan

  • saravanatn - Sunday, July 8, 2018 7:42 AM

    The thing that needs to be added here is that for these methods to perform properly, a POC (Partitioning, Ordering, Covering) index should be in place on the source table.
    😎
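
    For illustration, if the sample data lived in a permanent table, a POC index would look something like this (the table and index names are hypothetical):

    CREATE TABLE dbo.SAMPLE_DATA
    (
      TID    INT  NOT NULL
     ,TDATE  DATE NOT NULL
     ,TVALUE INT  NOT NULL
    );

    -- POC index: Partitioning column(s) first, then Ordering column(s),
    -- with the aggregated column INCLUDEd to make the index covering.
    CREATE NONCLUSTERED INDEX IX_SAMPLE_DATA_POC
      ON dbo.SAMPLE_DATA (TID, TDATE)
      INCLUDE (TVALUE);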

    Thanks, Saravanan, for expanding the solution domain here; this is exactly how this community works!

  • It should be noted here that the default frame is RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW (probably because it is deterministic), but this performs much worse than ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW. The reason is the way the two frames handle ties: RANGE includes all tied records in the total, and therefore potentially needs to store intermediate results until all ties have been accounted for, whereas ROWS breaks ties arbitrarily and only ever needs the previous total and the current amount.
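
    A minimal sketch of that difference (hypothetical values, not from the thread): with a tie on the ordering column, the default RANGE frame gives every tied row the combined total, while an explicit ROWS frame produces strictly row-by-row totals.

    ;WITH TIES(TDATE, TVALUE) AS
    (
      SELECT CONVERT(DATE,'20160101',120), 1000 UNION ALL
      SELECT CONVERT(DATE,'20160101',120), 2000 UNION ALL   -- tied on TDATE
      SELECT CONVERT(DATE,'20160520',120), 3000
    )
    SELECT
      TDATE
     ,TVALUE
     ,SUM(TVALUE) OVER (ORDER BY TDATE) AS RANGE_SUM        -- default frame: RANGE
     ,SUM(TVALUE) OVER (ORDER BY TDATE
        ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS ROWS_SUM
    FROM TIES;

    Both 2016-01-01 rows return RANGE_SUM = 3000, whereas ROWS_SUM climbs through the ties one row at a time (in an arbitrary tie-break order) before both columns agree at 6000.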

    Drew

    J. Drew Allen
    Business Intelligence Analyst
    Philadelphia, PA
