May 2, 2006 at 12:43 pm
Other than storage size savings, are there reasons to use integer rather than decimal(n,0) for numeric columns? Are there some useful functions that would require casting to int from dec(n,0)?
May 3, 2006 at 7:22 am
Integers will perform better than decimal because a CPU can't do arithmetic on decimal values natively. They have to be converted to a scaled-integer representation for use in expressions, then converted back to decimal for storage. So whether you convert them explicitly or not, the conversions are happening.
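You can see a hint of this in how SQL Server types the results (a sketch; `SQL_VARIANT_PROPERTY` reports the type the expression is actually computed in, and the variable names here are made up):

```sql
DECLARE @i int = 5;
DECLARE @d decimal(10, 0) = 5;

-- Integer arithmetic stays int: a single native CPU multiply.
SELECT SQL_VARIANT_PROPERTY(@i * 3, 'BaseType') AS int_result_type;   -- int

-- Decimal arithmetic goes through SQL Server's scaled-integer decimal
-- routines, and the result is widened per the decimal precision rules.
SELECT SQL_VARIANT_PROPERTY(@d * 3, 'BaseType') AS dec_result_type;   -- decimal
```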
On the other hand, the CPU time to operate on decimal values is minuscule compared to the time required to move data pages around. CPU load may be slightly higher while the query runs, but elapsed time might not show a difference.
I think the size difference is the biggest issue. Using a larger decimal form to store integer values makes rows slightly bigger, which may mean fewer rows per page, which may mean more I/O to move pages around. This is a small effect, and if there is one decimal value per row you may never see a difference. But unless you need to store values that are too big to fit in a bigint I don't see any advantage to using decimal.
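For illustration, `DATALENGTH` shows the per-value storage cost directly (a sketch; the literal value is arbitrary, and decimal storage depends only on the declared precision, not the stored value):

```sql
SELECT
    DATALENGTH(CAST(123456 AS int))           AS int_bytes,    -- 4
    DATALENGTH(CAST(123456 AS bigint))        AS bigint_bytes, -- 8
    DATALENGTH(CAST(123456 AS decimal(18,0))) AS dec18_bytes,  -- 9
    DATALENGTH(CAST(123456 AS decimal(38,0))) AS dec38_bytes;  -- 17
```

Even the smallest decimal precisions (1 through 9 digits) take 5 bytes, so a decimal column storing integer values is always at least a byte wider than an int.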