SQLServerCentral is supported by Red Gate Software Ltd.

Slow performance with ALTER COLUMN
Posted Thursday, May 29, 2014 5:02 AM
SSC Eights!


Group: General Forum Members
Last Login: Today @ 7:09 AM
Points: 878, Visits: 2,395
I am working for a company that has a 3rd-party vendor application.

The table that has ALTER COLUMN run against it is a 300-column heap table (import only).

The vendor's application issues a number of ALTER TABLE ALTER COLUMN statements (sometimes 50-60 separate columns) for each import. The change is to the data type length, e.g. DECIMAL(18,2) to DECIMAL(30,10).

These seem to take a long time to execute, especially when the table has around 4GB of data in it.

Is it possible that the problem is caused because the data requires extra pages, so the engine is spooling the existing table into tempdb and back again once the space has been allocated, or is it a metadata-only operation?
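The statements look something like this (table and column names invented for illustration):

```sql
-- Hypothetical example of the vendor's per-column ALTERs (names made up).
ALTER TABLE dbo.ImportStaging ALTER COLUMN Amount01 decimal(30,10) NULL;
ALTER TABLE dbo.ImportStaging ALTER COLUMN Amount02 decimal(30,10) NULL;
-- ... repeated for 50-60 columns, each issued as a separate statement
```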


_________________________________________________________________________
SSC Guide to Posting and Best Practices
Post #1575514
Posted Thursday, May 29, 2014 5:28 AM


SSCertifiable


Group: General Forum Members
Last Login: Monday, September 15, 2014 7:23 AM
Points: 5,074, Visits: 11,852
Jason-299789 (5/29/2014)
I am working for a company that has a 3rd-party vendor application.

The table that has ALTER COLUMN run against it is a 300-column heap table (import only).

The vendor's application issues a number of ALTER TABLE ALTER COLUMN statements (sometimes 50-60 separate columns) for each import. The change is to the data type length, e.g. DECIMAL(18,2) to DECIMAL(30,10).

These seem to take a long time to execute, especially when the table has around 4GB of data in it.

Is it possible that the problem is caused because the data requires extra pages, so the engine is spooling the existing table into tempdb and back again once the space has been allocated, or is it a metadata-only operation?


Not really answering your question, but surely altering the table with data in it is liable to failure, if any columns ever shrink?

If this is an 'import table', I would have expected it to be empty immediately preceding each import ...

But I guess the vendor has reasons.

If the columns are only ever getting bigger, why not make them all huge right now and do away with the ALTER issue?
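If it came to that, something like this (untested sketch; the table name is assumed) would generate the one-off widening statements straight from the catalog views:

```sql
-- Untested sketch: build an ALTER statement for every decimal column of a
-- hypothetical import table, widening each to DECIMAL(30,10) in one pass.
SELECT 'ALTER TABLE dbo.ImportStaging ALTER COLUMN ' + QUOTENAME(c.name)
     + ' decimal(30,10)'
     + CASE WHEN c.is_nullable = 1 THEN ' NULL;' ELSE ' NOT NULL;' END
FROM sys.columns AS c
JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.ImportStaging')
  AND t.name = 'decimal';
```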



Help us to help you. For better, quicker and more-focused answers to your questions, consider following the advice in this link.

When you ask a question (and please do ask a question: "My T-SQL does not work" just doesn't cut it), please provide enough information for us to understand its context.
Post #1575521
Posted Thursday, May 29, 2014 5:41 AM
SSC Eights!


Group: General Forum Members
Last Login: Today @ 7:09 AM
Points: 878, Visits: 2,395
Thanks for the reply Phil. I would agree about setting them to a maximum size and leaving it, but we have no control over the database. It seems the 3rd-party application has a number of templates for importing from Excel/CSV, and these vary depending on the supplier of the original template, hence the changes to columns.

I personally wouldn't have developed it that way; I'd process the data after load, then truncate the table and start on the next import format.

From what I've read, the DECIMAL data type is fixed-length, which makes sense, so there is likely to be a lot of disk I/O when increasing a column's size, as data after the column has to be shuffled; on the decrease side, a scan is performed to ensure the data will fit into the new scale and precision.

It doesn't help that 40+ columns can change, and each is run as a separate ALTER TABLE ALTER COLUMN.
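A quick way to see the size change (decimal storage is banded by precision: 1-9 digits take 5 bytes, 10-19 take 9, 20-28 take 13, 29-38 take 17):

```sql
-- DECIMAL(18,2) and DECIMAL(30,10) fall in different storage bands, so the
-- widening grows every value on disk -- it cannot be metadata-only.
DECLARE @before decimal(18,2)  = 12345.67,
        @after  decimal(30,10) = 12345.67;
SELECT DATALENGTH(@before) AS bytes_18_2,   -- should be 9 (precision band 10-19)
       DATALENGTH(@after)  AS bytes_30_10;  -- should be 17 (precision band 29-38)
```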



Post #1575527
Posted Thursday, May 29, 2014 5:51 AM


SSCertifiable


Group: General Forum Members
Last Login: Monday, September 15, 2014 7:23 AM
Points: 5,074, Visits: 11,852
Jason-299789 (5/29/2014)
Thanks for the reply Phil. I would agree about setting them to a maximum size and leaving it, but we have no control over the database. It seems the 3rd-party application has a number of templates for importing from Excel/CSV, and these vary depending on the supplier of the original template, hence the changes to columns.

I personally wouldn't have developed it that way; I'd process the data after load, then truncate the table and start on the next import format.

From what I've read, the DECIMAL data type is fixed-length, which makes sense, so there is likely to be a lot of disk I/O when increasing a column's size, as data after the column has to be shuffled; on the decrease side, a scan is performed to ensure the data will fit into the new scale and precision.

It doesn't help that 40+ columns can change, and each is run as a separate ALTER TABLE ALTER COLUMN.



It seems that you have no control whatsoever over the schema of this beast. So maybe instead you can do something about the data which is in it at the time that the ALTERs get executed?
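For example (sketch only; object names assumed, and it presumes the process has a window in which the table can be emptied), park the rows, empty the table so the ALTERs run against no data, then put the rows back:

```sql
-- Sketch, assuming dbo.ImportStaging is the vendor's table.
SELECT * INTO dbo.ImportStaging_Hold FROM dbo.ImportStaging;  -- park the data
TRUNCATE TABLE dbo.ImportStaging;                             -- ALTERs on an empty table are fast
-- ... vendor application runs its ALTER TABLE ALTER COLUMN statements ...
INSERT INTO dbo.ImportStaging SELECT * FROM dbo.ImportStaging_Hold;
DROP TABLE dbo.ImportStaging_Hold;
```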



Post #1575532
Posted Thursday, May 29, 2014 6:01 AM
SSC Eights!


Group: General Forum Members
Last Login: Today @ 7:09 AM
Points: 878, Visits: 2,395
Thanks Phil, I think that might have to be the solution. I'll have a chat with the manager to check on the process they follow when importing data and see if it can be amended to make it more efficient.

Ultimately the import application should truncate the table before each load and then run the scripts, so that's another avenue to explore.



Post #1575533