Integer convert to percent

Having trouble with the formula below. I want to display it as a percentage with two decimal places, but when I run it I get more than ten digits to the right of the decimal. Please help; seems like a simple thing, but I am off with something.

    ((CAST(COLUMNA as decimal(10,2)))/CAST(COLUMNB as decimal(10,2))) *100 as "PERCENT"

    It gives me results such as 0.6666666666666666666666

    I want it to be either .66 or 66.6%

  • CAST the whole thing as decimal(10,2).  That'll give 0.67; if you really need 0.66 then CAST to varchar to chop the end off, and CAST back to decimal.  Alternatively, you may be able to use FLOOR to round down.
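The round-vs-truncate distinction here can be sketched outside T-SQL. This is an illustrative Python snippet (not SQL Server code) using the standard `decimal` module: `ROUND_HALF_UP` approximates what `CAST(... AS decimal(10,2))` does to a positive value, and `ROUND_DOWN` approximates the FLOOR-style truncation.

```python
from decimal import Decimal, ROUND_DOWN, ROUND_HALF_UP

# Illustration only: the two behaviours described above, applied to 2/3.
ratio = Decimal(2) / Decimal(3)   # 0.666666...

# CAST(... AS decimal(10,2)) rounds -> 0.67
rounded = ratio.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# FLOOR-style truncation chops the extra digits -> 0.66
truncated = ratio.quantize(Decimal("0.01"), rounding=ROUND_DOWN)

print(rounded, truncated)   # 0.67 0.66
```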

    John

I understand, kinda, what you are saying, but it looks like I am not doing it right because I am getting an error now.

    (CAST(ABC/XYZ) as decimal(10,2)) as "PERCENT2"

  • I think it is close.  This formula gives me the correct format but now anything past the decimal is 0.  How do I convert this formula to show the numbers past the decimal?

    CONVERT(DECIMAL (10,2),CONVERT(DECIMAL(10,2),ABC)/CONVERT(DECIMAL (10,2),XYZ)) *100

For instance, instead of getting 16.333 I will get 16.0.
Multiply by 100 before doing your last convert. Your division (for example) gives you 0.66666666666666...; convert that to decimal(10,2) and the number is now 0.67, and multiplying that by 100 gives you 67.00. BUT if you take 0.6666666666, multiply it by 100 to get 66.666666666..., and then convert to decimal(10,2), you get 66.67.

    TL;DR - your *100 should happen before your last convert to Decimal(10,2)
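The order-of-operations point above can be demonstrated with a quick sketch. This is Python's `decimal` module standing in for T-SQL (an illustration, not SQL Server itself); `quantize` with `ROUND_HALF_UP` plays the role of `CONVERT(DECIMAL(10,2), ...)` for positive values.

```python
from decimal import Decimal, ROUND_HALF_UP

def to_dec_10_2(x):
    # Mimic CONVERT(DECIMAL(10,2), x): round to two decimal places.
    return x.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

ratio = Decimal(2) / Decimal(3)              # 0.666666...

round_then_scale = to_dec_10_2(ratio) * 100  # 0.67 * 100 = 67.00
scale_then_round = to_dec_10_2(ratio * 100)  # round(66.6666...) = 66.67

print(round_then_scale, scale_then_round)    # 67.00 66.67
```

Rounding first throws away everything past the second decimal place, which is exactly the precision you wanted to keep once the value is scaled to a percentage.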

    The above is all just my opinion on what you should do. 
As with all advice you find on a random internet forum - you shouldn't blindly follow it.  Always test on a test server to see if there are negative side effects before making changes to live!
    I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.

  • nymets86 wrote:

Having trouble with the formula below. I want to display it as a percentage with two decimal places, but when I run it I get more than ten digits to the right of the decimal. Please help; seems like a simple thing, but I am off with something.

    ((CAST(COLUMNA as decimal(10,2)))/CAST(COLUMNB as decimal(10,2))) *100 as "PERCENT"

    It gives me results such as 0.6666666666666666666666

I want it to be either .66 or 66.6%

    Try this...

     CAST(ColumnA*100.0/ColumnB AS DECIMAL(10,2))

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Your suggestion worked.  Thanks.  I am just getting the hang of this but will post the working formula even though there is probably a better way to do this.

    CONVERT(DECIMAL (10,2),100 * CONVERT(DECIMAL(10,2),ABC)/CONVERT(DECIMAL (10,2),XYZ)) as "PERCENT"

  • nymets86 wrote:

    Your suggestion worked.  Thanks.  I am just getting the hang of this but will post the working formula even though there is probably a better way to do this.

    CONVERT(DECIMAL (10,2),100 * CONVERT(DECIMAL(10,2),ABC)/CONVERT(DECIMAL (10,2),XYZ)) as "PERCENT"

    There is... see my previous post just above yours. 😉

     

    --Jeff Moden



An alternative is to use the FORMAT function to convert it to a percentage.  Note that I'm only casting FirstIntegerValue to a decimal(18,10) because SecondIntegerValue is implicitly converted to decimal when the division is performed.

    DECLARE @MyInts TABLE(FirstIntegerValue INT NOT NULL, SecondIntegerValue INT NOT NULL)

    INSERT @MyInts(FirstIntegerValue, SecondIntegerValue)
    VALUES
    (2, 3)


    SELECT *
    ,cast(FirstIntegerValue as decimal(18,10) ) / SecondIntegerValue as 'Unformatted'
    ,format(cast(FirstIntegerValue as decimal(18,10) ) / SecondIntegerValue, '0.0%') AS 'Percent'
    FROM @MyInts
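For comparison, Python's format mini-language has the same kind of percent presentation as the `'0.0%'` pattern used above (a rough analogue, shown here only as an illustration): the `%` type multiplies by 100, rounds to the requested precision, and appends a percent sign.

```python
# Rough Python analogue of FORMAT(value, '0.0%'):
# ".1%" scales by 100, rounds to one decimal place, and adds "%".
value = 2 / 3
formatted = f"{value:.1%}"
print(formatted)   # 66.7%
```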
  • I'd recommend against implicit conversions wherever possible.  There are a lot of articles online about implicit conversion and even SQL Server, via the execution plan, will give a warning about implicit conversions.

    If I see a check engine light come on in my car, I get it looked at.  If I see a warning in an execution plan while tuning a query, I look into how to fix it.

    Brent Ozar has a good writeup on implicit conversions here - https://www.brentozar.com/blitz/implicit-conversion/.  It is an extreme example, but it is good advice.


  • williamskevinjames wrote:

An alternative is to use the FORMAT function to convert it to a percentage.  Note that I'm only casting FirstIntegerValue to a decimal(18,10) because SecondIntegerValue is implicitly converted to decimal when the division is performed.

    DECLARE @MyInts TABLE(FirstIntegerValue INT NOT NULL, SecondIntegerValue INT NOT NULL)

    INSERT @MyInts(FirstIntegerValue, SecondIntegerValue)
    VALUES
    (2, 3)


    SELECT *
    ,cast(FirstIntegerValue as decimal(18,10) ) / SecondIntegerValue as 'Unformatted'
    ,format(cast(FirstIntegerValue as decimal(18,10) ) / SecondIntegerValue, '0.0%') AS 'Percent'
    FROM @MyInts

    Using the FORMAT function is typically 43 times slower than using CAST or CONVERT even on some of the more complicated formatting.

    --Jeff Moden



  • Mr. Brian Gale wrote:

    I'd recommend against implicit conversions wherever possible.  There are a lot of articles online about implicit conversion and even SQL Server, via the execution plan, will give a warning about implicit conversions.

    If I see a check engine light come on in my car, I get it looked at.  If I see a warning in an execution plan while tuning a query, I look into how to fix it.

    Brent Ozar has a good writeup on implicit conversions here - https://www.brentozar.com/blitz/implicit-conversion/.  It is an extreme example, but it is good advice.

I agree... that's an extreme example designed to produce a given problem.  I'll also agree that implicit conversions are real killers when they occur in JOIN ... ON conditions and WHERE clauses.

    To be sure, though, are you talking about things like the implicit conversion induced by the *100.0 in the following code if it's used in the SELECT list?

     SELECT CAST(ColumnA*100.0/ColumnB AS DECIMAL(10,2))
    FROM dbo.SomeTable

    That won't actually produce any warnings in the execution plan.  If you dig into the operator blocks in the execution plan, it will show that implicit conversions took place but not as a warning.

    Here's some test code...

    PRINT '--===== Building and populating the test table..;'
    ;
DROP TABLE IF EXISTS #TestTable

    CREATE TABLE #TestTable
    (
    RowNum INT IDENTITY(1,1) PRIMARY KEY CLUSTERED
    ,ColumnA INT NOT NULL
    ,ColumnB INT NOT NULL
    )
    ;
    INSERT INTO #TestTable WITH (TABLOCK)
    (ColumnA, ColumnB)
    SELECT TOP (10000000)
    ColumnA = ABS(CHECKSUM(NEWID())%100)+1
    ,ColumnB = ABS(CHECKSUM(NEWID())%100)+1
    FROM sys.all_columns ac1
    CROSS JOIN sys.all_columns ac2
    ;
    CHECKPOINT
    ;
    PRINT REPLICATE('=',119); --Just a separator to make the output easier to read
    GO
    PRINT '
--===== Demo with implicit conversions using an explicit conversion
    -- only to format the final output (which usually shouldn''t be done).'
    DECLARE @BitBucket DECIMAL(10,2);
    SET STATISTICS TIME ON
    SELECT @BitBucket = CAST(ColumnA*100.0/ColumnB AS DECIMAL(10,2))
    FROM #TestTable;
    SET STATISTICS TIME OFF
    ;
    GO
    PRINT REPLICATE('=',119); --Just a separator to make the output easier to read
    GO
    PRINT '--===== Demo with everything explicitly cast.'
    DECLARE @BitBucket DECIMAL(10,2);
    SET STATISTICS TIME ON
    SELECT @BitBucket = CAST(CAST(ColumnA AS DECIMAL(10,2))*CAST(100 AS DECIMAL(10,2))/CAST(ColumnB AS DECIMAL(10,2)) AS DECIMAL(10,2))
    FROM #TestTable;
    SET STATISTICS TIME OFF
    ;

    Here are the results... it shows (to me, anyway) that SQL Server made some better choices for performance using the implicit conversions than we did with the explicit conversions.

    --===== Building and populating the test table..;

    (10000000 rows affected)

    =======================================================================================================================

--===== Demo with implicit conversions using an explicit conversion

    -- only to format the final output (which usually shouldn't be done).

    SQL Server parse and compile time:

    CPU time = 0 ms, elapsed time = 0 ms.

    SQL Server Execution Times:

    CPU time = 2469 ms, elapsed time = 2458 ms.

    =======================================================================================================================

    --===== Demo with everything explicitly cast.

    SQL Server parse and compile time:

    CPU time = 0 ms, elapsed time = 0 ms.

    SQL Server Execution Times:

    CPU time = 2922 ms, elapsed time = 2916 ms.

Now, I agree that the difference in performance is less than 500 ms or so across 10 million rows, but (to me, anyway, and because of the serious number of calculations we do on hundreds of millions of rows, sometimes thousands of times each day) milliseconds do matter.

    What about accuracy?  Let's check the differences between the two methods (in this case)...

    PRINT '
--===== Demo with implicit conversions using an explicit conversion
    -- only to format the final output (which usually shouldn''t be done).'
    SET STATISTICS TIME ON
    SELECT RowNum, Result = CAST(ColumnA*100.0/ColumnB AS DECIMAL(10,2))
    INTO #ResultImplicit
    FROM #TestTable;
    SET STATISTICS TIME OFF
    ;
    GO
    PRINT REPLICATE('=',119); --Just a separator to make the output easier to read
    GO
    PRINT '--===== Demo with everything explicitly cast.'
    SET STATISTICS TIME ON
    SELECT RowNum, Result = CAST(CAST(ColumnA AS DECIMAL(10,2))*CAST(100 AS DECIMAL(10,2))/CAST(ColumnB AS DECIMAL(10,2)) AS DECIMAL(10,2))
    INTO #ResultExplicit
    FROM #TestTable;
    SET STATISTICS TIME OFF
    ;
    GO
    PRINT '--===== Check for differences in results both forwards and backwards'
    SELECT * FROM #ResultImplicit
    EXCEPT
    SELECT * FROM #ResultExplicit
    ;
    SELECT * FROM #ResultExplicit
    EXCEPT
    SELECT * FROM #ResultImplicit
    ;

Here are the results from that code.  Again, implicit conversions won the foot race (about 15% faster and 12% less CPU) and there are no differences in the results.

--===== Demo with implicit conversions using an explicit conversion

    -- only to format the final output (which usually shouldn't be done).

    SQL Server Execution Times:

    CPU time = 8515 ms, elapsed time = 1025 ms.

    (10000000 rows affected)

    =======================================================================================================================

    --===== Demo with everything explicitly cast.

    SQL Server Execution Times:

    CPU time = 9532 ms, elapsed time = 1186 ms.

    (10000000 rows affected)

    --===== Check for differences in results both forwards and backwards

    (0 rows affected)

    (0 rows affected)

To be sure, I try to avoid any and all conversions (implicit or otherwise) used as criteria that require data from a column to be converted before it can be compared (non-SARGable queries... I would never compare an NVARCHAR() value against a VARCHAR() column, for example), and I certainly make sure that implicit conversions aren't going to cause improper calculations.  But, like all else in SQL Server and T-SQL, "It Depends".  We just saw that implicit conversions can pretty easily save on resource usage and duration, and something > 10% savings isn't something to ignore or intentionally defeat.

    --Jeff Moden


