Got it: by default the length is taken as 1 if the size is not declared. Thank you.
However, I still don't see the difference in length between a normal declaration and CAST/CONVERT. Please find the queries below:
SELECT LEN('xyz') AS 'Declared'
     , LEN(CONVERT(VARCHAR, 'xyz')) AS 'Converted'
     , LEN(CAST('xyz' AS VARCHAR)) AS 'Cast'
Ans: 3, 3, 3
DECLARE @a VARCHAR = 'xyz'
SELECT LEN(@a) AS 'Declared'
     , LEN(CONVERT(VARCHAR, @a)) AS 'Converted'
     , LEN(CAST(@a AS VARCHAR)) AS 'Cast'
Ans: 1, 1, 1
I did not get the default of 30 (which you mentioned in your answer for CAST/CONVERT) in whatever way I tried. Can you please explain more on this?
You're changing the question with your examples here. It's about what type the resulting value ends up as in each case: putting @a into every expression fundamentally changes what's being asked, as does not using @a at all. It's only when you mix them, as in the original question, that you get the behaviour described.
If you explicitly declare a variable as just 'VARCHAR', it has a default length of 1 character. As such:
DECLARE @a VARCHAR = 'xyz'
actually creates @a as a varchar(1) which contains only the character 'x'.
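You can see the silent truncation by selecting the variable back out; a quick sketch (the column aliases are just illustrative):

DECLARE @a VARCHAR = 'xyz'
SELECT @a AS Value, LEN(@a) AS Length
-- Value = 'x', Length = 1: everything past the first character was silently dropped at assignment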
Casting or converting behaves differently: CAST and CONVERT to VARCHAR without a length default to 30 characters, so when you
CAST('xyz' AS VARCHAR)
you get a varchar(30) holding only three characters. The key point for your question is that LEN returns the number of characters actually stored (ignoring trailing spaces), never the declared maximum, so the LEN call returns 3 and you will never see the 30 from LEN alone.
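The way to make the hidden 30 visible is to cast something longer than 30 characters and watch it get truncated; for example (the test string is arbitrary):

SELECT LEN(CAST('abcdefghijklmnopqrstuvwxyz0123456789' AS VARCHAR)) AS CastLen
-- the 36-character string is cut down to the default varchar(30), so this returns 30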
In your first version you call
LEN('xyz')
which takes the length of a string literal. A literal is typed as a varchar sized to its contents (a varchar(3) here), and since LEN counts the characters present, all three columns show 3. It's only when you declare a plain VARCHAR explicitly that you get a length of 1, and that's what happens in your second version: @a is a varchar(1) that already holds just 'x', so every CAST or CONVERT of it still has only one character to measure and all three columns return 1.
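If you'd rather read the declared type directly instead of inferring it from truncation, SQL_VARIANT_PROPERTY exposes the metadata; something like this should show all three cases (a sketch; the aliases are just illustrative):

SELECT SQL_VARIANT_PROPERTY('xyz', 'MaxLength') AS LiteralMax                -- 3: a literal is sized to its contents
     , SQL_VARIANT_PROPERTY(CAST('xyz' AS VARCHAR), 'MaxLength') AS CastMax  -- 30: the CAST/CONVERT default
DECLARE @a VARCHAR = 'xyz'
SELECT SQL_VARIANT_PROPERTY(@a, 'MaxLength') AS DeclaredMax                  -- 1: the plain DECLARE default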
Hope that's clearer than it feels...