Jeff,

    I don't disagree with you that I should, and will, do my own testing when I need to. Unfortunately I don't have the time to run tests on every article I come across, so I asked.

    Back in the days of SQL Server 6.5 (still the Sybase codebase), char performed better than varchar because there was no need to check the length. The problem with char was that if the value was actually shorter than the declared length, you had to right-trim it. That is, if you had a string defined as char(10) and the value was 'abc', querying for its length wouldn't return three (the length of the value) but 10 (the declared length). To get len(string) to return three, you needed to right-trim the string.
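
    To illustrate the padding, here's a quick sketch on a current SQL Server (the variable name is made up, and I'm using DATALENGTH only to show the stored size):

        -- char(10) pads 'abc' with spaces out to the declared length,
        -- so the trimmed and untrimmed values differ in size.
        DECLARE @c char(10) = 'abc';
        SELECT DATALENGTH(@c)        AS stored_bytes,   -- 10: padded to the declared length
               DATALENGTH(RTRIM(@c)) AS trimmed_bytes;  -- 3: right-trim removes the padding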

    I believe the performance issue went away with SQL Server 7 (I may be wrong on the version), and since then I have switched to using varchar for all strings. Since varchar stores only the actual characters (plus a length prefix) rather than padding out to the declared length, whether two columns were defined as varchar(9) or varchar(10) was not an issue.
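
    A quick sketch of what I mean (variable names made up): with varchar the declared maximum doesn't change what is stored for a short value:

        -- varchar(9) and varchar(10) store the same three bytes for 'abc'.
        DECLARE @v9  varchar(9)  = 'abc';
        DECLARE @v10 varchar(10) = 'abc';
        SELECT DATALENGTH(@v9) AS bytes_v9, DATALENGTH(@v10) AS bytes_v10;  -- both return 3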

    With SQL Server 2012, if you define a string as char(10), populate it with 'abc', and query for the length, it returns three rather than the declared length. I had the impression that defining a string as char(10) or varchar(10) would therefore yield the same results, so I was surprised to see that the performance of char was still a problem. I'm confident that varchar doesn't pose a problem: I'm sure mismatched length definitions for varchar are prevalent in many of the databases I have come across over the last 10+ years (even though I have not checked that specifically), and I have never run into performance issues from it. That is why I wondered whether anyone had experience with defining the strings as varchar.
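
    For what it's worth, this is the behaviour I was referring to on SQL Server 2012 (a minimal sketch, variable name made up): LEN ignores the trailing pad spaces of a char value, while DATALENGTH still shows the full ten bytes of storage:

        DECLARE @c char(10) = 'abc';
        SELECT LEN(@c)        AS len_result,    -- 3: trailing pad spaces are ignored
               DATALENGTH(@c) AS data_length;   -- 10: still stored at the full declared length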