Writing strings with chars 128-160 to a varchar column

  • I am trying to write a string containing chars in the range 128-160 to a varchar column.

    Chars in that range, however, are converted to char 63 (the question mark).

    How can I write chars in the range 128-160 to the db?

    I guess this shouldn't be too difficult; you just have to know how.

    Thanks in advance for your help!

    Martin

  • Please post your code.

  • Might I suggest you read this from Books Online:

    ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/udb9/html/e123fb74-022e-4a62-9639-b2053a9b4d7e.htm

    Is it possible for you to change your column's data type from varchar to nvarchar (Unicode character data)?

    If everything seems to be going well, you have obviously overlooked something.

    Ron


  • You shouldn't have a problem using varchar and CHAR for the characters you are talking about... see the following...

    Running this code...

    DECLARE @i int
    DECLARE @STR varchar(255) /* 223 characters (codes 33-255) get appended, so varchar(222) would silently truncate the last one */

    SET @i = 33 /* Start at 33 to skip SPACE and non-printable characters */
    SET @STR = ''

    WHILE @i < 256
    BEGIN
        SET @STR = @STR + CHAR(@i) /* Append the character for the current code */
        SET @i = @i + 1
    END

    PRINT @STR

    Returns the following...

    !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖ×ØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõö÷øùúûüýþÿ

  • I attached the C# code of the test application, including the SQL command to create a test table.

    It works if I use an nchar column, but I know it is possible to store the data in a varchar column without loss of data, because this is done somehow from a C++ application using ODBC.

    So I want to know how to do it while using a varchar column (a sketch of the kind of test involved is below).

    Thanks,

    Martin
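
    For reference, the test boils down to something like the sketch below. Only the two @TestColumn parameter lines match the ones quoted in the reply that follows; the table, connection string, and class name are illustrative assumptions.

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Text;

    class VarcharTest
    {
        static void Main()
        {
            // Build a string containing the problem range 128-160.
            StringBuilder sb = new StringBuilder();
            for (int i = 128; i <= 160; i++)
                sb.Append((char)i);
            string str = sb.ToString();

            // Assumes a table created with:
            //   CREATE TABLE TestTable (TestColumn varchar(50))
            using (SqlConnection conn = new SqlConnection(
                "Data Source=.;Initial Catalog=Test;Integrated Security=SSPI"))
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO TestTable (TestColumn) VALUES (@TestColumn)", conn))
            {
                SqlParameterCollection parameters = cmd.Parameters;
                parameters.Add("@TestColumn", SqlDbType.NVarChar);
                parameters["@TestColumn"].Value = str;

                conn.Open();
                cmd.ExecuteNonQuery();
                // Reading the row back shows char 63 ('?') wherever a
                // character could not be mapped to the column's code page.
            }
        }
    }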

  • I think the problem is the Unicode-to-code-page conversion that is happening somewhere along the way. C# strings are always 16-bit Unicode.

    Try changing

    parameters.Add("@TestColumn", SqlDbType.NVarChar);
    parameters["@TestColumn"].Value = str;

    to

    parameters.Add("@TestColumn", SqlDbType.VarBinary);
    parameters["@TestColumn"].Value = Encoding.GetEncoding(28591).GetBytes(str); // 28591 = ISO 8859-1
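
    In context, the whole workaround might look like the sketch below; the helper and table names are illustrative assumptions, and 28591 is the ISO 8859-1 code page, which maps the characters 0-255 one-to-one to bytes.

    using System.Data;
    using System.Data.SqlClient;
    using System.Text;

    class Workaround
    {
        // Encode the string to bytes on the client and send it as varbinary;
        // the server converts the bytes to varchar, so no client-side
        // code-page conversion can turn chars 128-160 into '?'.
        static void InsertAsVarchar(SqlConnection conn, string str)
        {
            using (SqlCommand cmd = new SqlCommand(
                "INSERT INTO TestTable (TestColumn) VALUES (@TestColumn)", conn))
            {
                cmd.Parameters.Add("@TestColumn", SqlDbType.VarBinary);
                cmd.Parameters["@TestColumn"].Value =
                    Encoding.GetEncoding(28591).GetBytes(str);
                cmd.ExecuteNonQuery();
            }
        }

        // To check the bytes on the way back out, select the column cast to
        // varbinary and decode with the same code page.
        static string ReadBack(SqlConnection conn)
        {
            using (SqlCommand cmd = new SqlCommand(
                "SELECT TOP 1 CAST(TestColumn AS varbinary(50)) FROM TestTable", conn))
            {
                byte[] raw = (byte[])cmd.ExecuteScalar();
                return Encoding.GetEncoding(28591).GetString(raw);
            }
        }
    }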

    ____________________________________________________

    Deja View - The strange feeling that somewhere, sometime you've optimised this query before

    How to get the best help on a forum

    http://www.sqlservercentral.com/articles/Best+Practices/61537
