This is a great question, I really, really enjoyed it. The most difficult part was figuring out that the line:
set @result = cast (@@rowcount as varchar)
does two things:
1. it sets the value of the @result variable to '1' because @@rowcount was equal to 1 as a result of the last insert;
2. it resets @@rowcount to 1, because a simple variable assignment always sets @@rowcount to 1.
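To see the second effect in isolation, here is a minimal sketch (the table and variable names here are made up for illustration, they are not from the puzzle):

```sql
declare @x varchar(10);
create table #t (id int);
insert into #t values (1), (2), (3);  -- @@rowcount is now 3

set @x = cast(@@rowcount as varchar); -- @x = '3', and the assignment itself resets @@rowcount to 1
print @@rowcount;                     -- prints 1, not 3
```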
The @@rowcount then stays unchanged (still equal to 1) while the
select @result = @result + cast (@@rowcount as varchar) from #n;
statement pads @result, because @@rowcount is not updated until the statement completes. The moment the statement completes, @@rowcount is set to the number of affected records, which happens to be 3.
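Putting it together, my understanding of the flow can be sketched like this. This is a reconstruction rather than the original script; I am assuming #n holds 3 rows and that the last insert before the assignment affected a single row:

```sql
declare @result varchar(20);
create table #n (i int);
insert into #n values (1);
insert into #n values (2);
insert into #n values (3); -- the last single-row insert leaves @@rowcount = 1

set @result = cast(@@rowcount as varchar);
-- @result = '1'; the assignment resets @@rowcount back to 1

select @result = @result + cast(@@rowcount as varchar) from #n;
-- @@rowcount stays 1 for each of the 3 rows, so @result becomes '1111';
-- once the statement completes, @@rowcount becomes 3

select @result + cast(@@rowcount as varchar); -- 11113
```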
The commented line reading set nocount off; was there just for demonstration purposes, I suppose, meaning that it could have been any other set statement, such as set ansi_nulls on; or set transaction isolation level read uncommitted; or whatever other set option. Any such set statement, when uncommented, resets @@rowcount to 0, changing the final answer from 11113 to 10003.
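The zeroing effect is easy to demonstrate on its own (again a made-up sketch, not the puzzle script):

```sql
declare @t table (i int);
insert into @t values (1); -- @@rowcount = 1
set ansi_nulls on;         -- any set option statement resets @@rowcount to 0
select @@rowcount;         -- returns 0
```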
I really hope that what I figured out is correct, and that my answer was not just a lucky guess.
Thanks again.
Oleg