• Thanks for the script. It seems to cover most cases easily. I do have a suggestion. I recently worked on importing data from a similar script and found that SSMS could not open the file because it was too large. I had to open the file in Notepad and copy and paste large blocks of INSERT statements into SSMS. Most of the data in the file was the column list, not the VALUES list.

    Your script adds the complete column list for every line, which will produce the same problem:

    INSERT INTO TableName (Column1, Column2) VALUES (...)

    Eliminating the redundant column list by using a multi-row VALUES clause would reduce the size of the output file:

    INSERT INTO TableName (Column1, Column2)
    SELECT Column1, Column2
    FROM (VALUES
        (1, 'Apples'),
        (2, 'Bananas'),
        (3, 'Oranges')
    ) AS x (Column1, Column2)

    I tried your script on one of my narrower tables, and the column list was 334 characters while the VALUES list was 176 characters; the DML was about twice as long as the values being manipulated. For my table of 1000 rows, the original script would generate an output file of 510,000 characters (510 characters per statement, 1000 statements). Eliminating the repeated column list would produce an output file of 176,668 characters (176 characters per row for 1000 rows, plus the 334-character column list twice, once in the INSERT clause and once in the derived-table alias). That is about a 65% size reduction, meaning we could output roughly three times as much data before SSMS ran into a size constraint on the script file. Of course, the ratio of column-list length to data length will vary from table to table, but it is something to consider.
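
    In case it helps, here is a quick T-SQL sketch of the arithmetic above; the 334/176/1000 figures are just the ones from my table, so substitute your own:

    -- Rough size comparison for per-row INSERTs vs. a single multi-row VALUES statement
    DECLARE @ColumnList int = 334,  -- characters in the column list
            @ValuesList int = 176,  -- characters in one VALUES row
            @Rows       int = 1000; -- rows being scripted out

    SELECT
        OriginalSize = (@ColumnList + @ValuesList) * @Rows,        -- 510,000
        CompactSize  = (@ValuesList * @Rows) + (@ColumnList * 2),  -- 176,668
        ReductionPct = 100 - 100.0 * ((@ValuesList * @Rows) + (@ColumnList * 2))
                             / ((@ColumnList + @ValuesList) * @Rows); -- ~65%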