I found myself using DTS fairly frequently to transfer trivial amounts of data from one database to another - typically small volumes, such as new additions to coding tables, or restoring the one or two records that someone had accidentally deleted. I do make use of tools like Red Gate's SQL Data Compare for more complex data transfers, and we have an in-house engine that can keep tables synchronized between servers, but I have found this little script very useful for smaller, more "ad hoc" tasks.
The spScriptInsert Procedure
From this came a stored procedure to write a collection of INSERTs for the data (together with a couple of helper functions):
-- Function to format the output value for inclusion in the VALUES list
CREATE FUNCTION spVal(@str nvarchar(4000)) RETURNS nvarchar(4000) AS
BEGIN
    DECLARE @res nvarchar(4000)
    IF @str IS NULL
        SET @res = 'NULL'
    ELSE
        SET @res = 'N''' + REPLACE(@str, '''', '''''') + ''''
    RETURN(@res)
END
GO
--======================================================================
-- Function for the special case of formatting a "money" value for
-- inclusion in the VALUES list
CREATE FUNCTION spMVal(@val money) RETURNS nvarchar(4000) AS
BEGIN
    DECLARE @res nvarchar(4000)
    IF @val IS NULL
        SET @res = 'NULL'
    ELSE
        SET @res = CONVERT(varchar(20), @val, 2)
    RETURN(@res)
END
GO
--======================================================================
-- Create a script for inserting data into the specified table, based
-- on the optional condition
CREATE PROC spScriptInsert(@table varchar(80), @condition varchar(80) = '1=1') AS
DECLARE @fields nvarchar(4000), @values nvarchar(4000), @SQL nvarchar(4000)
DECLARE @colorder int, @fieldname nvarchar(128), @type varchar(40)

SELECT TOP 100 colorder, syscolumns.name, systypes.name AS type
INTO #fields
FROM syscolumns
JOIN sysobjects ON sysobjects.id = syscolumns.id
JOIN systypes ON systypes.xusertype = syscolumns.xusertype
WHERE sysobjects.name = @table AND systypes.name <> 'text'
ORDER BY colorder

DECLARE fieldscursor CURSOR FOR
    SELECT colorder, name, type
    FROM #fields
    ORDER BY colorder

OPEN fieldscursor
FETCH NEXT FROM fieldscursor INTO @colorder, @fieldname, @type
SET @fields = ''
SET @values = ''
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @fields = @fields + ',[' + @fieldname + ']'
    IF @type = 'money'
        -- Special case for "money" type
        SET @values = @values + '+'',''+dbo.spMVal([' + @fieldname + '])'
    ELSE
        SET @values = @values + '+'',''+dbo.spVal([' + @fieldname + '])'
    FETCH NEXT FROM fieldscursor INTO @colorder, @fieldname, @type
END
CLOSE fieldscursor
DEALLOCATE fieldscursor

SET @SQL = 'SELECT ''INSERT INTO ' + @table + '(' + SUBSTRING(@fields, 2, 2000) +
           ') VALUES (''+ ' + SUBSTRING(@values, 6, 1000) + '+'')'' FROM ' +
           @table + ' WHERE ' + @condition
EXEC sp_executeSQL @SQL
GO
This allows you to specify a table and, optionally, a condition - for example, in "Northwind":
spScriptInsert 'Products', 'ProductID BETWEEN 70 AND 73'
produces as its results the INSERT statements:
INSERT INTO Products([ProductID],...) VALUES ('70','...
INSERT INTO Products([ProductID],...) VALUES ('71','...
INSERT INTO Products([ProductID],...) VALUES ('72','...
INSERT INTO Products([ProductID],...) VALUES ('73','...
You can then copy the results set and paste it into another query window in SQL Server Management Studio or Query Analyzer connected to the intended target database.
How it works
The script makes liberal use of the system tables, a cursor, and some dynamic SQL at the end. It uses the "old" format system tables, so it works on both SQL Server 2000 and 2005 (no opportunity to try 2008 at the moment!).
The first job is to assemble the field list for the table into the temporary table "#fields", including the type of each field. The cursor "fieldscursor" then iterates through this table to construct "@fields" - a list of the field names (each surrounded by brackets "[" and "]" in case a field name happens to be a reserved word) - and "@values" - the expression to select from the table in order to build the VALUES list of each INSERT statement. Note the use of the "spVal" function to return either a quoted version of the column contents or "NULL" as appropriate, and the "spMVal" function for the special case of the "money" data type, which needs a CONVERT. These functions sit inside the dynamic SQL under construction, so they are executed for every single value returned.
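For illustration, the behaviour of the two helper functions can be sketched outside SQL Server. This is a minimal Python rendering of the same quoting rules, not part of the script itself; the money formatting here is simplified to two decimal places, which differs in detail from CONVERT(varchar(20), @val, 2):

```python
def sp_val(value):
    """Python sketch of spVal: quote a value for a T-SQL VALUES list."""
    if value is None:
        return "NULL"                                       # NULLs pass through unquoted
    return "N'" + str(value).replace("'", "''") + "'"       # double any embedded quotes

def sp_mval(value):
    """Python sketch of spMVal: money values are emitted without quotes."""
    if value is None:
        return "NULL"
    return f"{value:.2f}"  # simplified stand-in for the T-SQL CONVERT

print(sp_val("O'Brien"))   # N'O''Brien'
print(sp_mval(19.5))       # 19.50
```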
The fields list and the values list are then used to construct the SELECT statement in the @SQL variable, which is executed to return one INSERT statement for each record in the target table matching the condition (defaulting to "1=1", i.e. all records, if no condition is supplied).
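To make the string assembly concrete, here is a sketch in Python (illustration only - the real work happens in T-SQL) of how the cursor loop accumulates @fields and @values for an invented three-column table, and how the SUBSTRING calls trim the leading "," and "+','+" separators before the final SELECT is put together:

```python
# Invented column list: (name, type) pairs as the cursor would return them.
columns = [("ProductID", "int"), ("ProductName", "nvarchar"), ("UnitPrice", "money")]

fields, values = "", ""
for name, coltype in columns:
    fields += ",[" + name + "]"
    helper = "spMVal" if coltype == "money" else "spVal"   # "money" special case
    values += "+','+dbo." + helper + "([" + name + "])"

# SUBSTRING(@fields, 2, 2000) drops the leading ","; SUBSTRING(@values, 6, 1000)
# drops the leading "+','+" (five characters) - fields[1:] and values[5:] here.
sql = ("SELECT 'INSERT INTO Products(" + fields[1:] + ") VALUES ('+"
       + values[5:] + "+')' FROM Products WHERE 1=1")
print(sql)
```

Running this prints the dynamic SELECT that spScriptInsert would execute; each row of that SELECT's result set is one complete INSERT statement.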
It is a bit "quick and dirty", but it has proved useful in a number of cases to do some little ad-hoc data transfers with the minimum of effort. For production updates, I would use a tool such as Red Gate's SQL Data Compare to roll out from test via a staging environment, but just occasionally, this can be useful to script up some basic coding table additions. It's perhaps more useful for copying data between test environments.
Another use I have found for it is when a particular record has been deleted in error from a production database, and you have a log-shipped copy of the database, or access to a backup which still has the record in question. This can be used to script an INSERT for the missing record from the backup version of the database - and of course two invocations will script both master and detail record inserts.
Some caveats, and possibilities for improvement:
- For tables with a large number of fields, the generated SQL may overflow the available length of the variable receiving the SQL to be executed (the helper functions have short names for this reason!)
- If the table has an IDENTITY field (as indeed the "Products" table in "Northwind" does) it is up to you to enable inserts for it in the target database (i.e. "SET IDENTITY_INSERT Products ON") and then off again when you have finished.
- It does not attempt to copy "text" fields.
- As you can see from the code, it is necessary to have a special case for the "money" type.
- For tables with a lot of fields, the variables used in the spScriptInsert procedure overflow, and it can fail with a fairly cryptic error message, due to the syntax errors produced in the constructed SQL. It would be possible to detect this and produce a more "friendly" error message.
- It uses the "old" system tables; this is what makes it work with both SQL Server 2000 and 2005, but these compatibility views are deprecated in later versions.
These restrictions aside, I hope others find this as useful as I do!