Altering table to change datatype

  • Need to change some columns' datatype from nvarchar to varchar.

    Just need to drop the indexes that are on that column.

    Do we get any problems from changing nvarchar to varchar?

  • As long as you do your due diligence and check to make sure you won't have any data conversion errors, you'd be OK.

    Here's a quick example: if your adaptation of my example returns any rows, you have an issue; if no rows are returned, you should not have any problems, I think.

    CREATE TABLE Example (ExampleID int, SomeString nvarchar(30));

    INSERT INTO Example
    SELECT 1, N'??' UNION ALL
    SELECT 2, N'Test String';

    SELECT *, CONVERT(varchar(30), SomeString)
    FROM Example
    WHERE SomeString <> CONVERT(varchar(30), SomeString);
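    To run the same round-trip check against your own table, the pattern can be adapted like this (a sketch: dbo.YourTable and YourColumn are placeholder names, and the varchar length should match the column's declared length):

    ```sql
    -- Find rows whose nvarchar values would not survive conversion to varchar.
    -- dbo.YourTable and YourColumn are placeholders for your actual table and
    -- nvarchar column; any rows returned contain characters that would be lost.
    SELECT YourColumn,
           CONVERT(varchar(4000), YourColumn) AS ConvertedValue
    FROM dbo.YourTable
    WHERE YourColumn <> CONVERT(varchar(4000), YourColumn);
    ```

    If this returns nothing, the column's data fits in the target varchar codepage and the ALTER should be safe from a data-loss standpoint.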

    Lowell



  • ramana3327 (1/21/2015)


    Need to change some columns' datatype from nvarchar to varchar.

    Just need to drop the indexes that are on that column.

    Do we get any problems from changing nvarchar to varchar?

    Here is a function you can use to check the existing character data

    😎

    CREATE FUNCTION dbo.TVFN_GET_HIGH_CHARCODE
    (
        @INPUT_STRING NVARCHAR(4000)
    )
    RETURNS TABLE WITH SCHEMABINDING AS
    RETURN
    WITH T(N) AS (SELECT N FROM (VALUES (NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL),(NULL)) AS X(N))
    ,NUMS(N) AS (SELECT TOP (LEN(ISNULL(@INPUT_STRING,1))) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS N FROM T T1, T T2, T T3, T T4)
    SELECT
        MAX(UNICODE(SUBSTRING(@INPUT_STRING, NM.N, 1))) AS HIGH_CHARCODE
    FROM NUMS NM;

    Usage example

    SELECT
        X.HIGH_CHARCODE
       ,COUNT(*) AS INSTANCE_COUNT
    FROM dbo.TBL_SAMPLE_STRING SS
    CROSS APPLY dbo.TVFN_GET_HIGH_CHARCODE(SS.SST_VALUE) AS X
    WHERE X.HIGH_CHARCODE > 255
    GROUP BY X.HIGH_CHARCODE;

  • ramana3327 (1/21/2015)


    Need to change some columns' datatype from nvarchar to varchar.

    Just need to drop the indexes that are on that column.

    Do we get any problems from changing nvarchar to varchar?

    Yes, you will first need to drop any indexes that reference the column you're altering.

    If this is a large table, then another issue will be page splits resulting from changing the column to a datatype of a different width. This will result in page fragmentation and transaction logging, so plan to perform this operation during a maintenance window, and ensure there is sufficient free space (about double the size of the table) on both the data drive and the log drive.

    In the example below, I've created a table with an nvarchar column and populated it with sample data. Its initial space reserved is 20,880 KB.

    CREATE TABLE Example
    (ExampleID int not null identity(1,1) primary key,
     SomeString nvarchar(500));

    insert into Example ( SomeString )
    select description from sys.sysmessages;
    -- (98692 row(s) affected)

    exec sp_spaceused 'Example';

    Here I'm altering the column's datatype to varchar. Its space reserved has now doubled to 41,552 KB. Varchar generally requires less page storage than nvarchar; however, the issue at this point has to do with clustered index fragmentation.

    alter table Example alter column SomeString varchar(500);

    exec sp_spaceused 'Example';

    Finally, after running DBCC DBREINDEX, the space reserved is down to 11,512 KB, which is about half the original, after reallocating all the pages.

    dbcc dbreindex ('Example');

    exec sp_spaceused 'Example';
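    Worth noting: DBCC DBREINDEX is deprecated in current SQL Server versions in favor of ALTER INDEX, so the same rebuild step can also be written as:

    ```sql
    -- Equivalent rebuild using the ALTER INDEX syntax that replaces
    -- the deprecated DBCC DBREINDEX; rebuilds all indexes on the table.
    ALTER INDEX ALL ON dbo.Example REBUILD;
    ```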

    "Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
