October 6, 2008 at 2:53 pm
Hi
I have a table which has a column, say tokens.
Tokens has been defined as an integer, and the entries for tokens are:
3
4
5
6
Now, when I execute the command
select avg(tokens) from table
it gives the result floored to an integer. How do I get the exact decimal value of the average?
I don't want tokens to be defined as double.
October 6, 2008 at 3:11 pm
Try casting tokens as a decimal before taking the average.
Example:
select avg(cast(tokens as decimal)) from table
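To see the difference side by side, here is a minimal sketch using a table variable loaded with the sample values from the question (the @t name is just illustrative):

declare @t table (tokens int);
insert into @t values (3);
insert into @t values (4);
insert into @t values (5);
insert into @t values (6);

-- integer average: the exact value 4.5 is truncated to 4
select avg(tokens) from @t;

-- cast to decimal before averaging: returns the exact average, 4.5
select avg(cast(tokens as decimal)) from @t;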
For best practices on asking questions, please read the following article: Forum Etiquette: How to post data/code on a forum to get the best help
October 6, 2008 at 9:44 pm
In SQL Server, the result of an arithmetic operation on integers is always an integer. SQL Server returns a value as the data type having the highest precedence among the data types involved in the operation. In this case, since all values are integers, the result will also be an integer.
As Alvin mentioned, you can fix this by applying a cast operation on the integer value.
This document explains the precedence of each data type.
http://msdn.microsoft.com/en-us/library/ms190309.aspx
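As a quick illustration of the precedence rule (the literal values below are made up just for the example):

select 9 / 2;     -- both operands are int, so the result is int: 4
select 9 / 2.0;   -- 2.0 is a decimal literal with higher precedence, so the result is decimal: 4.500000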
October 7, 2008 at 9:10 am
jacob sebastian (10/6/2008)
In SQL Server, the result of an arithmetic operation on integers is always an integer. SQL Server returns a value as the data type having the highest precedence among the data types involved in the operation. In this case, since all values are integers, the result will also be an integer. As Alvin mentioned, you can fix this by applying a cast operation on the integer value.
This document explains the precedence of each data type.
Or:
select avg(tokens * 1.0) from table
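This works for the same reason: multiplying the integer column by the decimal literal 1.0 implicitly converts it to decimal before the average is taken. Against the illustrative @t table variable from the earlier reply:

select avg(tokens * 1.0) from @t;   -- tokens * 1.0 is decimal, so the result is 4.5 rather than 4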
Failing to plan is Planning to fail