I have to agree. Every time I look at things like this my brain starts repeating the following mantra:
"Code to enforce Business Rules should not be in SQL!"
Validation of user interface data is best done by application functions or subroutines, not DML.
The trouble with that approach is that when misinterpreted by certain people it leads to stupidities like "uniqueness should be enforced by application code, not database constraints" and "since all normalisation beyond 1NF is about enforcing business rules through schema structure, no normalisation beyond 1NF should ever be done". I've heard both of those too often from people who have misunderstood the perfectly simple rule that the application should ensure the data being fed to the database is correct, and think it means that the database should not also do so.
The countervailing mantra is that everything, including the database, should validate every parameter it is fed, and should also ensure that everything it passes to another component is valid. Ideally most of this is done by a decent type system that allows strong domain constraints to be specified. Unfortunately (T-)SQL doesn't (yet) have such a type system, so we get badly thrown together band-aids (like IsNumeric) to patch it up with. TRY_CONVERT is a big improvement, but not perfect. There may also be performance considerations working against a nice clean test that delivers 1 for "can convert and is not null", 2 for "can convert and is null", and 0 for "can't convert" (the last should include the case where the destination type doesn't permit null but the parameter is null, but SQL support for not-null types outside of tables is non-existent): if the test delivers a non-zero result, the next thing to happen will probably be the conversion itself, which was probably already done once just to perform the test. A type system catering properly for pairs would of course allow something both efficient and clean, but we don't have even that much of a decent type system.
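As a rough sketch of the three-way test described above, here is one way it might be written today with TRY_CONVERT (SQL Server 2012 or later). The status codes and the target type `decimal(10,2)` are just illustrative assumptions; note how the value would typically be converted once for the test and again for the actual use, which is exactly the performance concern mentioned:

```sql
-- Hypothetical three-way convertibility test using TRY_CONVERT.
-- Codes follow the scheme in the text:
--   1 = can convert and is not null
--   2 = can convert and is null
--   0 = can't convert
DECLARE @p varchar(20) = '123.45';

SELECT CASE
         WHEN @p IS NULL THEN 2   -- trivially convertible, but null
         WHEN TRY_CONVERT(decimal(10,2), @p) IS NOT NULL THEN 1
         ELSE 0                   -- TRY_CONVERT returned NULL for non-null input
       END AS convert_status;

-- Contrast with IsNumeric, which accepts strings such as '$' or '1e4'
-- that may still fail a subsequent CONVERT to the intended type.
```

A caller getting a non-zero status would then run the conversion again for real, so without a type system that can return the (status, converted value) pair in one go, the duplicated work is hard to avoid.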