Chris Howarth (7/1/2009)
Another new feature of SQL Server 2008 is an extension to the CONVERT function whereby a binary value can be converted to a string representation.
SELECT CONVERT(VARCHAR(MAX), 0x484920544845524521, 1)
This removes the need for a 'BinToHexStr' function.
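For completeness, the style argument controls the leading 0x: style 1 keeps it, style 2 drops it (both require SQL Server 2008 or later):

```sql
-- Style 1 keeps the 0x prefix; style 2 omits it (SQL Server 2008+).
SELECT CONVERT(VARCHAR(MAX), 0x484920544845524521, 1); -- 0x484920544845524521
SELECT CONVERT(VARCHAR(MAX), 0x484920544845524521, 2); -- 484920544845524521
```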
Thank you very much for pointing it out, Chris. I wish this extension to the CONVERT function were available in SQL Server 2005, but it is not. For example, running the following query:
select isnull(convert(varchar(max), [Document], 1), 'null') blob from Production.Document;
does not return the data in the desired format (it just prints a few unreadable characters per value). On the other hand, the master.dbo.fn_varbintohexstr function has a very interesting limitation which I overlooked. The function was available in SQL Server 2000, but because NVARCHAR was limited to 4000 characters there, it returned NULL for any argument longer than 1999 bytes. Since the 2005 version returns NVARCHAR(MAX), I assumed that this limitation had been removed. Unfortunately, it was not.
Examining the function's definition shows that calling master.dbo.fn_varbintohexstr with its @pbinin varbinary(max) parameter delegates the work to sys.fn_varbintohexsubstring, passing 1, @pbinin, 1, 0.
The following check inside sys.fn_varbintohexsubstring shows that the 2000 version's limitation is still there:
if ( ((@cbytesin * 2) + 2 > 4000) or ((@cbytesin * 2) + 2 < 1) )
return NULL -- @cbytesin is the datalength of the varbinary argument
This is very unfortunate. The return type was changed from NVARCHAR(4000) to NVARCHAR(MAX), but the size check was not removed.
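A quick way to confirm the limit on SQL Server 2005 is the following sketch; the threshold follows from the check above, since a hex string of (@cbytesin * 2) + 2 characters must fit in 4000, i.e. at most 1999 bytes of input:

```sql
-- Sketch for SQL Server 2005: 1999 bytes is the last length that works.
DECLARE @small varbinary(max), @large varbinary(max);
SET @small = CONVERT(varbinary(max), REPLICATE('A', 1999)); -- 1999 bytes
SET @large = CONVERT(varbinary(max), REPLICATE('A', 2000)); -- 2000 bytes
SELECT DATALENGTH(@small) AS small_bytes,                   -- 1999
       DATALENGTH(@large) AS large_bytes,                   -- 2000
       master.dbo.fn_varbintohexstr(@small) AS small_hex,   -- hex string
       master.dbo.fn_varbintohexstr(@large) AS large_hex;   -- NULL
```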
I am sure that the need to generate INSERT statements for tables which include binary columns with data (JPEG images, songs, MS Word documents) is limited, but if someone does have to script such data then they might consider creating a copy of sys.fn_varbintohexsubstring with the aforementioned limit on the data size removed. Be warned that removing this limitation will make the function deadly slow 🙁
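Alternatively, anyone stuck on 2005 could roll their own conversion. Here is a minimal sketch (the function name dbo.ufn_bin_to_hex is my own invention) that has no size limit, at the cost of the byte-by-byte loop that makes this approach so slow:

```sql
-- Hypothetical helper for SQL Server 2005: converts varbinary(max)
-- to a '0x...'-prefixed hex string with no length limit.
CREATE FUNCTION dbo.ufn_bin_to_hex (@bin varbinary(max))
RETURNS varchar(max)
AS
BEGIN
    DECLARE @i int, @len int, @byte int,
            @result varchar(max), @digits char(16);
    SELECT @digits = '0123456789ABCDEF',
           @len = DATALENGTH(@bin), @i = 1, @result = '0x';
    WHILE @i <= @len
    BEGIN
        -- Take one byte and append its two hex digits.
        SET @byte = CONVERT(int, SUBSTRING(@bin, @i, 1));
        SET @result = @result
                    + SUBSTRING(@digits, (@byte / 16) + 1, 1)
                    + SUBSTRING(@digits, (@byte % 16) + 1, 1);
        SET @i = @i + 1;
    END
    RETURN @result;
END
```

The WHILE loop runs once per byte, so scripting a multi-megabyte document this way will take a while; it is only viable when, as noted below, performance is not a concern.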
Theoretically, if the performance of the script that generates catalog inserts for deployment is not an issue even when large binary data is present, then removing the limitation and adjusting the originally posted procedure still allows the task to be accomplished.
There is a whitepaper titled "To BLOB or Not To BLOB" published by Microsoft Research in 2006. Its verdict, stating that "objects smaller than 256K are best stored in a database while objects larger than 1M are best stored in the filesystem", helps my case, I guess.