I am trying to find documentation or an explanation of the behaviour when dividing one integer by another.
Take my code as an example:
declare @WeightInGrams int
set @WeightInGrams = 164

SELECT
    @WeightInGrams / 1000 AS [NoCast],
    CAST(@WeightInGrams AS decimal(18,3)) / 1000 AS [CastColumnOnly],
    @WeightInGrams / CAST(1000 AS decimal(18,3)) AS [CastValueOnly],
    CAST(@WeightInGrams AS decimal(18,3)) / CAST(1000 AS decimal(18,3)) AS [CastBoth],
    CAST( CAST(@WeightInGrams AS decimal(18,3)) / CAST(1000 AS decimal(18,3)) AS decimal(18,3) ) AS [DoubleCast]
The result of [NoCast] I think I understand: both operands are integers, so the part after the decimal point is dropped and 0.164 becomes 0.
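To illustrate what I mean by the truncation, here is a minimal check with plain integer literals (no variables involved; the aliases are just illustrative):

-- integer / integer keeps only the whole part; the fraction is discarded
SELECT 164 / 1000 AS [WholeOnly],      -- returns 0, not 0.164
       1640 / 1000 AS [AlsoWholeOnly]  -- returns 1, not 1.64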
The results of [CastColumnOnly], [CastValueOnly] and [CastBoth] I do not understand: they vary in the number of decimal places (one has 19 places, another 20, and another 8), and none of them seems to have honoured my cast to decimal(18,3), which I expected after reading about Data Type Precedence (where decimal takes precedence over int).
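For reference, the exact type SQL Server assigns to each expression can be inspected with SQL_VARIANT_PROPERTY (a sketch for one of the columns; the aliases are just illustrative):

declare @WeightInGrams int
set @WeightInGrams = 164

-- Wrap the result in sql_variant so the engine reports the decimal type it actually chose
SELECT
    SQL_VARIANT_PROPERTY(CAST(CAST(@WeightInGrams AS decimal(18,3)) / 1000 AS sql_variant), 'BaseType')  AS [CastColumnOnly_Type],
    SQL_VARIANT_PROPERTY(CAST(CAST(@WeightInGrams AS decimal(18,3)) / 1000 AS sql_variant), 'Precision') AS [CastColumnOnly_Precision],
    SQL_VARIANT_PROPERTY(CAST(CAST(@WeightInGrams AS decimal(18,3)) / 1000 AS sql_variant), 'Scale')     AS [CastColumnOnly_Scale]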
I am trying to get a decimal with 3 decimal places when dividing one integer by another. Is the [DoubleCast] approach the only way I can achieve this?