• I got the answer:

    Beginning with SQL Server 2005, the server reports the duration of an event in microseconds (one millionth, or 10⁻⁶, of a second) and the amount of CPU time used by the event in milliseconds (one thousandth, or 10⁻³, of a second). In SQL Server 2005 and later, the SQL Server Profiler graphical user interface displays the Duration column in milliseconds by default, but when a trace is saved to either a file or a database table, the Duration column value is written in microseconds.

    http://msdn.microsoft.com/en-us/library/ms175848.aspx
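    So if you read a saved trace yourself (for example with fn_trace_gettable) instead of viewing it in Profiler, you have to divide Duration by 1,000 to get milliseconds, while CPU needs no conversion. A minimal sketch, assuming the trace was saved to a .trc file at a made-up path:

        -- Read a saved trace file; Duration is stored in microseconds,
        -- CPU is already reported in milliseconds.
        -- The file path is hypothetical; point it at your own trace file.
        SELECT
            TextData,
            Duration / 1000 AS DurationMs,  -- convert microseconds -> milliseconds
            CPU          AS CpuMs           -- already milliseconds, no conversion
        FROM fn_trace_gettable('C:\Traces\MyTrace.trc', DEFAULT)
        WHERE Duration IS NOT NULL;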