
Script all data of a table
Posted Tuesday, January 05, 2010 5:41 PM


Forum Newbie


Group: General Forum Members
Last Login: Monday, February 08, 2010 5:40 PM
Points: 3, Visits: 12
Nice script, thanks for sharing. As for the SQL Server Management Studio data limitations, can you not change the settings under "Options -> Query Results -> Results to Grid" and "Results to Text"?

_______________________
Giammarco Schisani
Volpet Software - Table Diff
Post #842508
Posted Tuesday, January 05, 2010 6:29 PM
Forum Newbie


Group: General Forum Members
Last Login: Friday, February 12, 2010 8:34 AM
Points: 2, Visits: 6
You could change that option, but Results to Text caps the amount of data returned per column at 8,192 characters, and the default length SQL Server Management Studio returns per column in text mode is only 256, so you will likely truncate your data pretty quickly when building the complete insert statement for each row. Even at 8,192, a row of data can be 8,192 characters on its own, before counting the extra text wrapped around each returned row (INSERT INTO ... VALUES ...).

If you use the grid, a column can return 65,535 characters by default, which should handle ALL instances of the INSERT statement, the columns and the data associated with each column.
Post #842516
Posted Monday, February 08, 2010 5:38 PM
Old Hand


Group: General Forum Members
Last Login: Today @ 11:42 AM
Points: 303, Visits: 511
RH says:

[65,535 chars] which should handle ALL


Hmm, no, I wish. Not for columns of XML type holding XML documents.

In any case, I'm not able to figure out a way to write substring logic (something like udf_varbintohexstr_big) to chunk up large XML documents properly, or anything similar. Sending results to a file doesn't work either, or maybe I'm doing something wrong.

Anybody have a clue? I agree this is a great script in general, and Florian has obviously figured out a way to handle large blobs for varbinaries. Casting or converting XML to a MAX size variable should be fine, also, so I think it's just a question of figuring out the right way to receive the results.
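The closest I've gotten is something roughly like this, chunking the cast XML with SUBSTRING (table and column names here are made up, and the chunk size is kept at 4,000 because PRINT truncates NVARCHAR beyond that):

DECLARE @doc NVARCHAR(MAX)
DECLARE @pos INT
DECLARE @chunk_size INT

-- Hypothetical table/column: cast the XML to NVARCHAR(MAX) first
SELECT @doc = CONVERT(NVARCHAR(MAX), XmlColumn)
FROM dbo.MyXmlTable
WHERE Id = 1

SET @chunk_size = 4000
SET @pos = 1

-- PRINT each slice; PRINT cuts NVARCHAR off at 4,000 characters, so keep chunks at or below that
WHILE @pos <= LEN(@doc)
BEGIN
    PRINT SUBSTRING(@doc, @pos, @chunk_size)
    SET @pos = @pos + @chunk_size
END

The catch is that PRINT adds a line break after every chunk, so the reassembled document isn't byte-for-byte identical.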

Anybody ?

Thanks,

>L<
Post #862143
Posted Wednesday, February 10, 2010 3:46 AM


SSCrazy


Group: General Forum Members
Last Login: Sunday, November 04, 2012 12:23 PM
Points: 2,087, Visits: 3,932
Hi MSzI

I'm currently working on an updated version with configurable handling of IDENTITY inserts.
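Until then, a simple workaround is to wrap the generated INSERT statements yourself (the table name below is just the example from the script header, and the INSERTs need an explicit column list, i.e. @column_names = 1):

-- Allow explicit values in the identity column while loading the scripted rows
SET IDENTITY_INSERT dbo.tblUserProfile ON;

-- ... generated INSERT statements go here ...

SET IDENTITY_INSERT dbo.tblUserProfile OFF;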




Hi Lisa

I don't think you can use SSMS to handle BLOBs like XML files. I'd suggest using SQLCMD instead, which should have no such restrictions. If even SQLCMD has restrictions on result row lengths, you can use a PowerShell script or any simple .NET application.
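For example, something along these lines should do it (server, database, and file names are placeholders; -y 0 lifts the display-width limit for variable-length columns):

sqlcmd -S MyServer -d MyDatabase -E -i generate_inserts.sql -o inserts.sql -y 0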

Greets
Flo



The more I learn, the more I know what I do not know
Blog: Things about Software Architecture, .NET development and T-SQL

How to Post Data/Code to get the best Help | How to Post Performance Problems
Post #863060
Posted Wednesday, February 10, 2010 10:00 AM
Old Hand


Group: General Forum Members
Last Login: Today @ 11:42 AM
Points: 303, Visits: 511
Thanks Flo. That's what I thought. It is not a serious restriction, because your method is sound. I'm just pointing out that any length limitation -- whether 64K or 256 characters -- is going to remain a problem. >L<
Post #863422
Posted Tuesday, October 19, 2010 7:04 PM
Grasshopper


Group: General Forum Members
Last Login: Thursday, October 31, 2013 9:20 PM
Points: 18, Visits: 98
Nice script, thanks! It fails when the table has a lot of columns, though, as the @insert variable overflows the limit of NVARCHAR(MAX). I have amended it to use a table variable to get around this; the same should probably be done for the @select variable, although it is less likely to reach the limit.

/******************************************************************

Author
======
Florian Reischl

Summary
=======
Script to create a SELECT statement to script all data of a specified table

Usage
=====
Open the script in the database you want to script data from.

* Configure the name of the table to be scripted (@table_name)
* Configure whether udf_varbintohexstr_big is available (only needed for IMAGE and VARBINARY data longer than 3998 bytes)
* Configure whether the column names shall be scripted for the destination database. This brings more flexibility in case the destination table has more columns than the source table or the column order is different. Leave the column names out to save space.
* Execute the script once.
* Take the execution result as the statement to script your data (you may want to change something within it)
* Execute the result of the first execution
* Take the scripted data to insert it into another database/server.

Parameters
==========

@table_name
The name of the table to be scripted

@handle_big_binary
If set to 1 the user defined function udf_varbintohexstr_big will be used
to convert BINARY, VARBINARY and IMAGE data. For further information see remarks.

@column_names
If set to 0, only the values to be inserted will be scripted; the column names won't.
This saves memory, but the destination table needs exactly the same columns in the
same order.
If set to 1, the names of the columns to insert the values into will also be scripted.

Remarks
=======
Attention:
In case of columns of type BINARY, VARBINARY or IMAGE
you either need the user defined function udf_varbintohexstr_big
and option @handle_big_binary set to 1, or you risk a loss of data
if the data in a cell is larger than 3998 bytes.

Data type sql_variant is not supported.

History
=======
V01.00.00.00 (2009-01-15)
* Initial release
V01.01.00.00 (2009-01-25)
* Added support for IMAGE columns with user defined function udf_varbintohexstr_big
V01.01.01.00 (2009-02-04)
* Fixed bug for NTEXT and XML
V01.02.00.00 (2009-02-21)
* Added possibility to script column names
V01.02.01.00 (2010-10-20) R Crawford
* Fixed bug where too many columns overflow NVARCHAR(MAX)

******************************************************************/

SET NOCOUNT ON

DECLARE @table_name SYSNAME
DECLARE @handle_big_binary BIT
DECLARE @column_names BIT

-- ////////////////////
-- -> Configuration
SET @table_name = 'dbo.tblUserProfile'
SET @handle_big_binary = 1
SET @column_names = 1
-- <- Configuration
-- ////////////////////

DECLARE @object_id INT
DECLARE @schema_id INT

--SELECT * FROM sys.all_objects
SELECT @object_id = object_id, @schema_id = schema_id
FROM sys.tables
WHERE object_id = OBJECT_ID(@table_name)


DECLARE @columns TABLE (column_name SYSNAME, ordinal_position INT, data_type SYSNAME, data_length INT, is_nullable BIT)

-- Get all column information
INSERT INTO @columns
SELECT column_name, ordinal_position, data_type, character_maximum_length, CASE WHEN is_nullable = 'YES' THEN 1 ELSE 0 END
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = SCHEMA_NAME(@schema_id)
AND TABLE_NAME = OBJECT_NAME(@object_id)

DECLARE @select VARCHAR(MAX)
DECLARE @insert VARCHAR(MAX)
DECLARE @insert_tbl TABLE ([insert] VARCHAR(MAX))
DECLARE @crlf CHAR(2)
DECLARE @sql VARCHAR(MAX)
DECLARE @first BIT
DECLARE @pos INT
SET @pos = 1

SET @crlf = CHAR(13) + CHAR(10)

WHILE EXISTS (SELECT TOP 1 * FROM @columns WHERE ordinal_position >= @pos)
BEGIN
DECLARE @column_name SYSNAME
DECLARE @data_type SYSNAME
DECLARE @data_length INT
DECLARE @is_nullable BIT

-- Get information for the current column
SELECT @column_name = column_name, @data_type = data_type, @data_length = data_length, @is_nullable = is_nullable
FROM @columns
WHERE ordinal_position = @pos

-- Create column select information to script the name of the source/destination column if configured
IF (@select IS NULL)
SET @select = ' ''' + QUOTENAME(@column_name)
ELSE
SET @select = @select + ','' + ' + @crlf + ' ''' + QUOTENAME(@column_name)

-- Handle NULL values
SET @sql = ' '
SET @sql = @sql + 'CASE WHEN ' + @table_name + '.' + QUOTENAME(@column_name) + ' IS NULL THEN ''NULL'' ELSE '

-- Handle the different data types
IF (@data_type IN ('bigint', 'bit', 'decimal', 'float', 'int', 'money', 'numeric',
'real', 'smallint', 'smallmoney', 'tinyint'))
BEGIN
SET @sql = @sql + 'CONVERT(VARCHAR(40), ' + @table_name + '.' + QUOTENAME(@column_name) + ')'
END
ELSE IF (@data_type IN ('char', 'nchar', 'nvarchar', 'varchar'))
BEGIN
SET @sql = @sql + ''''''''' + REPLACE(' + @table_name + '.' + QUOTENAME(@column_name) + ', '''''''', '''''''''''') + '''''''''
END
ELSE IF (@data_type = 'date')
BEGIN
SET @sql = @sql + '''CONVERT(DATE, '' + master.sys.fn_varbintohexstr (CONVERT(BINARY(3), ' + @table_name + '.' + QUOTENAME(@column_name) + ')) + '')'''
END
ELSE IF (@data_type = 'time')
BEGIN
SET @sql = @sql + '''CONVERT(TIME, '' + master.sys.fn_varbintohexstr (CONVERT(BINARY(5), ' + @table_name + '.' + QUOTENAME(@column_name) + ')) + '')'''
END
ELSE IF (@data_type = 'datetime')
BEGIN
SET @sql = @sql + '''CONVERT(DATETIME, '' + master.sys.fn_varbintohexstr (CONVERT(BINARY(8), ' + @table_name + '.' + QUOTENAME(@column_name) + ')) + '')'''
END
ELSE IF (@data_type = 'datetime2')
BEGIN
SET @sql = @sql + '''CONVERT(DATETIME2, '' + master.sys.fn_varbintohexstr (CONVERT(BINARY(8), ' + @table_name + '.' + QUOTENAME(@column_name) + ')) + '')'''
END
ELSE IF (@data_type = 'smalldatetime')
BEGIN
SET @sql = @sql + '''CONVERT(SMALLDATETIME, '' + master.sys.fn_varbintohexstr (CONVERT(BINARY(4), ' + @table_name + '.' + QUOTENAME(@column_name) + ')) + '')'''
END
ELSE IF (@data_type = 'text')
BEGIN
SET @sql = @sql + ''''''''' + REPLACE(CONVERT(VARCHAR(MAX), ' + @table_name + '.' + QUOTENAME(@column_name) + '), '''''''', '''''''''''') + '''''''''
END
ELSE IF (@data_type IN ('ntext', 'xml'))
BEGIN
SET @sql = @sql + ''''''''' + REPLACE(CONVERT(NVARCHAR(MAX), ' + @table_name + '.' + QUOTENAME(@column_name) + '), '''''''', '''''''''''') + '''''''''
END
ELSE IF (@data_type IN ('binary', 'varbinary'))
BEGIN
-- Use udf_varbintohexstr_big if available to avoid truncated binary data
IF (@handle_big_binary = 1)
SET @sql = @sql + ' dbo.udf_varbintohexstr_big (' + @table_name + '.' + QUOTENAME(@column_name) + ')'
ELSE
SET @sql = @sql + ' master.sys.fn_varbintohexstr (' + @table_name + '.' + QUOTENAME(@column_name) + ')'
END
ELSE IF (@data_type = 'timestamp')
BEGIN
SET @sql = @sql + '''CONVERT(TIMESTAMP, '' + master.sys.fn_varbintohexstr (CONVERT(BINARY(8), ' + @table_name + '.' + QUOTENAME(@column_name) + ')) + '')'''
END
ELSE IF (@data_type = 'uniqueidentifier')
BEGIN
SET @sql = @sql + '''CONVERT(UNIQUEIDENTIFIER, '' + master.sys.fn_varbintohexstr (CONVERT(BINARY(16), ' + @table_name + '.' + QUOTENAME(@column_name) + ')) + '')'''
END
ELSE IF (@data_type = 'image')
BEGIN
-- Use udf_varbintohexstr_big if available to avoid truncated binary data
IF (@handle_big_binary = 1)
SET @sql = @sql + ' dbo.udf_varbintohexstr_big (CONVERT(VARBINARY(MAX), ' + @table_name + '.' + QUOTENAME(@column_name) + '))'
ELSE
SET @sql = @sql + ' master.sys.fn_varbintohexstr (CONVERT(VARBINARY(MAX), ' + @table_name + '.' + QUOTENAME(@column_name) + '))'
END
ELSE
BEGIN
PRINT 'ERROR: Not supported data type: ' + @data_type
RETURN
END

SET @sql = @sql + ' END'

-- Script line end for finish or next column
IF EXISTS (SELECT TOP 1 * FROM @columns WHERE ordinal_position > @pos)
SET @sql = @sql + ' + '', '' +'
ELSE
SET @sql = @sql + ' + '

-- Remember the data script
INSERT @insert_tbl ([insert]) VALUES (@sql)
SET @pos = @pos + 1
END

-- Close the column names select
SET @select = @select + ''' +'

-- Print the INSERT INTO part
PRINT 'SELECT ''INSERT INTO ' + @table_name + ''' + '

-- Print the column names if configured
IF (@column_names = 1)
BEGIN
PRINT ' ''('' + '
PRINT @select
PRINT ' '')'' + '
END

PRINT ' ''VALUES ('' +'

-- Print the data scripting
DECLARE Insert_Cursor CURSOR FOR
SELECT [insert] FROM @insert_tbl
OPEN Insert_Cursor
FETCH NEXT FROM Insert_Cursor INTO @insert;
WHILE (@@FETCH_STATUS = 0)
BEGIN
PRINT @insert
FETCH NEXT FROM Insert_Cursor INTO @insert;
END

CLOSE Insert_Cursor
DEALLOCATE Insert_Cursor

-- Script the end of the statement
PRINT ' '')'''
PRINT ' FROM ' + @table_name


Regards

ROSCO
Post #1007406
Posted Tuesday, February 22, 2011 8:44 AM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Monday, April 01, 2013 10:16 AM
Points: 117, Visits: 180
I think I'm the only rider on the small schoolbus for this one... but I'm missing something. Can someone give some examples of when you would need to use this script?

Thanks!
Post #1067654
Posted Wednesday, February 23, 2011 10:31 AM
Grasshopper


Group: General Forum Members
Last Login: Friday, June 15, 2012 2:49 PM
Points: 10, Visits: 61
This is good, except I needed to change the data type names to lower case in the declaration statements. I suspect this is because my collation is case-sensitive.
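For example, under a case-sensitive collation the uppercase declaration fails because the alias type is catalogued as lowercase sysname (built-in keyword types like INT or VARCHAR are unaffected):

-- Fails under a case-sensitive collation
DECLARE @table_name SYSNAME
GO
-- Works regardless of collation
DECLARE @table_name sysname
GO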
Post #1068415
Posted Tuesday, May 17, 2011 8:15 AM
Valued Member


Group: General Forum Members
Last Login: Thursday, June 20, 2013 8:56 AM
Points: 55, Visits: 327
Thanks for sharing this script. It works great and is going to be a big help in the future.
Post #1110249
Posted Tuesday, May 17, 2011 4:36 PM
SSC-Enthusiastic


Group: General Forum Members
Last Login: Yesterday @ 9:07 AM
Points: 106, Visits: 356
Thanks for the script. It seems to cover most cases easily. I do have a suggestion. I recently worked on importing data from a similar script and found that SSMS could not open the file because it was too large. I had to open the file in Notepad and copy and paste large blocks of insert statements into SSMS. Most of the data in the file was the column list, not the values list.

Your script adds the complete column list for every line, which will produce the same problem:
INSERT INTO TableName (Column1, Column2) VALUES (...)



Eliminating the redundant column list by using a multi-row VALUES clause would reduce the size of the output file:
INSERT INTO TableName (Column1, Column2)
SELECT Column1, Column2
FROM (VALUES
(1, 'Apples'),
(2, 'Bananas'),
(3, 'Oranges')
) AS x (Column1, Column2)

I tried your script on one of my narrower tables, and the column list was 334 characters while the VALUES list was 176 characters, so the DML was twice as long as the values being manipulated. For my table of 1,000 rows, the original script would generate an output file of 510,000 characters (510 characters per statement, 1,000 statements). Eliminating the repeated column list would produce an output file of 176,668 characters (176 characters per line for 1,000 statements, plus the 334-character column list twice). That is about a 65% size reduction, meaning we could output roughly three times as much data before SSMS ran into a size constraint on the script file. Of course, the ratio of column-name length to data length will differ for each table, but it is something to consider.
Post #1110643