Converting Large Object Data Types

  • ohack

    SSCrazy

    Points: 2043

    Comments posted to this topic are about the item Converting Large Object Data Types

  • RichB

    SSCrazy Eights

    Points: 9651

    Is the change to (max) a biggie or just metadata? On say a 50GB table is it going to plough away at it, blocking the table for ages while the log grows out of control?

    Is it possible to get the 'move' effect a different way? Surely that is itself going to be an enormously intensive operation?

    A helpful article, but something that sounds like it needs several days of downtime to achieve on a big database.

  • SQLDoubleG

    Hall of Fame

    Points: 3090

    Good stuff,

but I'm afraid you missed something important: the NULLability of the columns.

If you have a column which is text NOT NULL, your script will generate the following statement:

    ALTER TABLE [dbo].
    ALTER COLUMN [colText] varchar(max)

Changing from NOT NULL to NULL is a simple task, but going back to NOT NULL afterwards can be a really expensive operation, depending on the number of rows.

    From BOL (http://technet.microsoft.com/en-gb/library/ms190273(v=sql.105).aspx)

    • ANSI_NULL defaults are always on for ALTER COLUMN; if not specified, the column is nullable.

That can be easily fixed using IS_NULLABLE within a CASE:

    (CASE WHEN c.IS_NULLABLE = 'NO' THEN ' NOT' ELSE '' END) + ' NULL'

    Cheers
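Putting the two pieces together, the generated statement would come out like this (the table and column names here are hypothetical stand-ins, since the original output above has the table name truncated):

```sql
-- Hypothetical example: [dbo].[Documents] / [colText] stand in for your own objects.
-- Without the CASE fix, the generated ALTER omits NOT NULL and the column becomes nullable.
ALTER TABLE [dbo].[Documents]
ALTER COLUMN [colText] varchar(max) NOT NULL;
```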

  • maddogs

    SSCrazy

    Points: 2364

    Good stuff. Will remember this when the shop I work for ever moves off SQL 2000!


    maddog

  • Mike Brockington

    SSC Eights!

    Points: 874

    Nice little script, but what about the benefits (or lack) of them? Does the server run better or worse after this?

Apart from the NULLable thing, are there any other potential gotchas?

    Throw away your pocket calculators; visit www.calcResult.com
  • Mike Dougherty-384281

    SSCrazy

    Points: 2764

    Good article.

    The comments always find little details about your "generic" solution to make it a little more "generic" and flexible. 🙂

    As far as sharing your use-case (telling a story) and taking us through the process of writing code to do the job AND thinking how others might benefit from it: that's really valuable. Thanks for illustrating how you can spend an extra few minutes to turn a solution "for today" into something that you [or others] might reuse. In my experience it is difficult to teach that skill.

  • ohack

    SSCrazy

    Points: 2043

    Thanks for all the positive feedback so far, in particular raulggonzalez for covering off the NULL issue I'd overlooked.

For the less confident: raulggonzalez's line needs to be inserted into the script after the first END and before "+ ';'", with an extra " + " before it.

    Or you can change:

    + ';' + CHAR(13) + CHAR(10)

    To:

    + (CASE WHEN c.IS_NULLABLE = 'NO' THEN ' NOT' ELSE '' END) + ' NULL;' + CHAR(13) + CHAR(10)

    RichB - I don't think this is something you'd want to run in a big production database without thorough testing - the initial conversion wouldn't be the problem but the unfiltered UPDATE statement could certainly wreak havoc - this was a dev scenario.
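For context, the ALTER itself is largely a metadata change: existing values stay in LOB storage until they are rewritten. The "unfiltered UPDATE" referred to above is presumably a self-assignment along these lines (table and column names hypothetical), which is what forces every row to be rewritten, and why it is so log- and lock-intensive on a big table:

```sql
-- Sketch only: rewriting each value lets SQL Server move data <= 8,000 bytes in-row
-- (assuming the 'large value types out of row' table option is off).
UPDATE [dbo].[Documents]
SET [colText] = [colText];
```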

Mike Brockington - as mentioned, there should be a performance gain in the common case where most of the rows actually contain 8 KB or less of data.
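The article's script isn't reproduced in this thread, but a minimal generator incorporating the NULLability fix might look something like this. This is a sketch under assumptions, not the article's exact code; it reuses the INFORMATION_SCHEMA alias `c` from the snippets above:

```sql
-- Sketch: build one ALTER statement per legacy LOB column, preserving NULLability.
SELECT 'ALTER TABLE [' + c.TABLE_SCHEMA + '].[' + c.TABLE_NAME + ']'
     + ' ALTER COLUMN [' + c.COLUMN_NAME + '] '
     + CASE c.DATA_TYPE
           WHEN 'text'  THEN 'varchar(max)'
           WHEN 'ntext' THEN 'nvarchar(max)'
           WHEN 'image' THEN 'varbinary(max)'
       END
     + (CASE WHEN c.IS_NULLABLE = 'NO' THEN ' NOT' ELSE '' END) + ' NULL;'
     + CHAR(13) + CHAR(10)
FROM INFORMATION_SCHEMA.COLUMNS AS c
WHERE c.DATA_TYPE IN ('text', 'ntext', 'image');
```

Run the SELECT first and review the generated statements before executing any of them, especially outside a dev environment.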
