OK, this is probably going to show my ignorance, but the mention of hex and varbinary brings up something that has confused me for a while. As I understand it, binary is just 0s and 1s, so the byte for the letter "a" is 01100001. Yet when you run a command in SQL Server like:
DECLARE @n AS varchar(20) = 'abcdefg';
SELECT CAST(@n AS varbinary);
you get a result like:
0x61626364656667
Why does SQL Server spit the result out in this hex format rather than in straight binary, like this:
01100001 01100010 01100011 01100100 01100101 01100110 01100111
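To double-check that I wasn't misreading the output, I verified outside SQL Server that the two forms really are the same bytes written in different bases (Python here purely for illustration, not anything SQL Server runs):

```python
# Each byte is exactly two hex digits (16 * 16 = 256 values),
# so 0x61 and 0b01100001 are the same number in different bases.
data = "abcdefg".encode("ascii")

hex_form = "0x" + data.hex()                       # what SQL Server shows
bin_form = " ".join(f"{b:08b}" for b in data)      # the "straight binary" view

print(hex_form)   # 0x61626364656667
print(bin_form)   # 01100001 01100010 01100011 01100100 01100101 01100110 01100111
```

So the hex output and the binary string above are just two renderings of the same seven bytes.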
I apologize if this is a really basic question.
Thanks,
George