Not very happy with this question.
The first answer option is very clearly wrong - no idea where that 0.12 could possibly come from.
The other two options are arithmetically the same. There is no difference between 123, 123.00000, 123.0, or 123.000000000000000000000000000. The only difference is display, and how values are displayed is determined by the client program used, not by SQL Server itself (unless you explicitly ask SQL Server to convert the value to a string).
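To illustrate this outside SQL Server, here is a small sketch using Python's decimal module as a stand-in for SQL Server's numeric type. The two values compare equal; only their string representations differ:

```python
from decimal import Decimal

# Same numeric value, different declared scale
a = Decimal("123")
b = Decimal("123.000000")

assert a == b        # arithmetically identical
print(str(a))        # 123
print(str(b))        # 123.000000  <- only the display differs
```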
To reply to the question asked above - the only difference between the two is the data type. The value is the same, and the data type is used by the client to govern how the value is displayed.
Query 1 converts two values to numeric(5,0), then divides them. This results in a numeric(11,6) (see http://msdn.microsoft.com/en-us/library/ms190476.aspx). This is multiplied by 100, an integer; this integer is first converted to numeric(3,0) (see http://msdn.microsoft.com/en-us/library/ms190309.aspx), and the result of the multiplication is then numeric(15,6) (same source as for division).
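The precision and scale rules from the linked MSDN page can be sketched in a few lines of Python (this ignores SQL Server's capping of precision at 38, which does not come into play at these sizes):

```python
def divide(p1, s1, p2, s2):
    # Division of numeric(p1,s1) by numeric(p2,s2):
    #   result scale     = max(6, s1 + p2 + 1)
    #   result precision = p1 - s1 + s2 + result scale
    scale = max(6, s1 + p2 + 1)
    return (p1 - s1 + s2 + scale, scale)

def multiply(p1, s1, p2, s2):
    # Multiplication of numeric(p1,s1) by numeric(p2,s2):
    #   result precision = p1 + p2 + 1; result scale = s1 + s2
    return (p1 + p2 + 1, s1 + s2)

# Query 1: numeric(5,0) / numeric(5,0)
print(divide(5, 0, 5, 0))      # (11, 6) -> numeric(11,6)
# ... then * 100, with 100 converted to numeric(3,0)
print(multiply(11, 6, 3, 0))   # (15, 6) -> numeric(15,6)
```

You can verify the actual types in SQL Server itself with SQL_VARIANT_PROPERTY(expression, 'BaseType'), 'Precision', and 'Scale'.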
Query 2 converts two values to float, divides them (resulting in float), converts 100 to float and multiplies (again resulting in float), and then invokes the ROUND function, which also returns float. So the end result here is float.
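The float pipeline behaves analogously in other languages. A rough Python sketch (with hypothetical inputs, since the question's actual values are not shown here) produces a float, and Python's default float formatting keeps a single trailing zero, much like some SQL Server clients do:

```python
# Hypothetical inputs chosen so the result is 123
a, b = 123.0, 100.0

# Divide, multiply by 100, then round - every step stays a float
result = round(a / b * 100.0, 6)

assert isinstance(result, float)
print(result)    # 123.0 - default float display keeps one trailing zero
```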
The different answers are then caused by how the client chooses to format the different data types. Using SSMS will indeed result in the answer marked as "correct" (at least on my system and with default settings - this is probably influenced by locale settings, Windows settings, and maybe SSMS settings as well).
When using sqlcmd.exe, I get different results - 123.000000 and 123.0. When creating these two queries as pass-through queries in Access 2010, I get 123 for both queries. I did not try other client programs, but maybe someone else can; ideas to try are a default DataGrid in a custom .NET application, Excel, Query Analyzer, or third-party tools.