I don't really understand the explanation. In what way is anything "crazy" about the handling of negatives? Whatever the precedence, you end up dividing a negative by a negative, which must always be positive. The only question to decide is whether multiplication or division is executed first.
What's "crazy" isn't the final sign. It's that SQL doesn't recognize unary minus. In math, and in every ordinary programming language I've ever encountered, unary operators have higher precedence than anything except parentheses. SQL doesn't recognize unary minus. If it's not a unary minus, SQL sees subtraction. Subtraction occurs after multiplication and division. So SQL treats it as if there's a set of parentheses after the division.
The minus signs still cancel, as you say, but the implied parentheses trump the normal left-to-right processing of multiplication and division, and you wind up with the unexpected result of 0.1.
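For concreteness, here's a minimal sketch of what that looks like. The original expression isn't quoted in this thread, so the literals below are my assumption, and I'm assuming SQL Server, which documents unary minus at the same precedence level as subtraction:

```sql
-- Assuming SQL Server (T-SQL); the literal values are illustrative.
-- Unary minus sits at subtraction's precedence level, below * and /,
-- so the second minus grabs the entire product to its right:
SELECT -100.0 / -100.0 * 10;
-- Evaluated as -100.0 / -(100.0 * 10) = -100.0 / -1000.0 = 0.1
```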
If unary minus were a thing, as it is in math and ordinary programming languages, the unary minus operator would act first.
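Using the same assumed literals, that precedence would group the expression like this:

```sql
-- Hypothetical math-style grouping: negation binds tightest,
-- then / and * proceed left to right.
((-100.0) / (-100.0)) * 10
```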
The minus signs still cancel, but now the multiplication and division are the only operations remaining. They operate left to right, and give us a result of 10.
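That's also the practical workaround here: write the parentheses yourself and the grouping is forced back to the intended one. A quick check, under the same assumptions as above:

```sql
SELECT (-100.0) / (-100.0) * 10;   -- explicit parentheses restore the expected result: 10.0
```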