• Frank Hamersley (1/4/2012)


    Whoever the bright sparks were that decided an arbitrary 6-digit minimum is to be forced on the result without any other consideration except adding up the number of significant digits on the input side... I am still gobsmacked that anyone would think this is viable. Slack in my book.

    Two points: first, DECIMAL(77,40) (the exact type of the product of two DECIMAL(38,20) values) simply will not fit into the maximum precision of 38. Second, a strongly-typed language requires a type with a defined precision and scale before the computation occurs. As for the decision to choose 6, here's a quote from the SQL Server Programmability Team:

    [font="Verdana"]...we try to avoid truncating the integral part of the value by reducing the scale (thus truncating the decimal part of the value instead). How much scale should be sacrificed? There is no right answer. If we preserve too much, and the result of the multiplication of large numbers will be way off. If we preserve too little, multiplication of small numbers becomes an issue.[/font]

    http://blogs.msdn.com/b/sqlprogrammability/archive/2006/03/29/564110.aspx
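
    To make the arithmetic concrete, here is a minimal T-SQL sketch (the variable values are illustrative) showing the rule in action. The exact product type of two DECIMAL(38,20) values is DECIMAL(77,40), which exceeds the 38-digit precision cap, so SQL Server reduces the scale to fit, but never below the minimum of 6:

    [code="sql"]
    -- The exact product of two DECIMAL(38,20) values is DECIMAL(77,40):
    -- precision p1 + p2 + 1 = 77, scale s1 + s2 = 40. That exceeds the
    -- maximum precision of 38, so the scale is reduced to fit, subject
    -- to the minimum of 6, giving a result type of DECIMAL(38,6).
    DECLARE @a DECIMAL(38,20) = 1.0,
            @b DECIMAL(38,20) = 1.0;

    SELECT SQL_VARIANT_PROPERTY(@a * @b, 'Precision') AS ResultPrecision, -- 38
           SQL_VARIANT_PROPERTY(@a * @b, 'Scale')     AS ResultScale;     -- 6
    [/code]

    Run on SQL Server 2008 or later, both properties confirm that the forty digits of scale in the exact result have been cut back to six.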