At most software development companies, there are separate database instances for the development, QA, staging and production environments. During the development life cycle, developers may insert new records or update existing records in a particular table on the development database. These records can be configuration settings or data relevant to a business requirement. Once the changes are done, they may need to be promoted to the other environments for testing and validation. It is often the case that the data must be extracted from the tables because there is no original script for it, or the updates were made directly to the database and the script was never updated. There are ways to automate the export of these records and the generation of the insert/update scripts, which can then be used in the deployment process.
This article describes a simple script that facilitates scripting the records that need to be imported into other database environments, using the SQL-DMO (Distributed Management Objects) object model with VBScript. This article does not provide detailed information about SQL-DMO.
Solution and Usage
The solution is a simple VBScript which executes a SQL query on the target database. It performs the following tasks:
- Logs in to the target server/database using Windows Authentication (it can be enhanced to use SQL logins)
- Executes a query (the where clause may select, for example, records with an ID greater than a particular value, or records created after a given date)
- Based on the query, it determines the target table
- Based on the query, it determines which fields should be exported (with some exceptions; see next item)
- Determines which fields should not be exported (identity, timestamp and computed columns)
- Creates an insert or update statement for each record
- Writes the scripts to a text file
The script can be executed from a batch file using the following syntax:
WScript 0g_exportRecord [servername] [database] [query] [filename] [command]
- [servername] server location/ip address/instance name
- [database] database name/initial catalog
- [query] a valid SQL statement with no joins
- [filename] full path where the file should be created (folder should already exist)
- [command] 0 to create an insert script, 1 to create an update script (numeric value)
WScript 0g_exportRecord devdb catalog "select * from setting where id in (102,103)" "c:\my files\setting.sql" 0
The script supports any valid SQL statement with no joins. The concept is to script the data from each individual table and create the corresponding insert/update statements for that single table. The following are examples of queries that can be used:
- select * from [MY_TABLE] where id = 1 or id = 2
- select col1, col2, col3 from [MY_TABLE] where id in (1,2)
- select * from [MY_TABLE] (NOLOCK) where col2 like ('text%')
- select * from [MY_TABLE] where created_date > '9/21/2009'
The script has a main function and five helper functions. The entry point validates the arguments in the WScript.Arguments collection, looking for a minimum of five arguments. The main function creates an instance of the SQLDMO.SQLServer object, which is used to connect to the database. Once the connection succeeds, it obtains a reference to a SQLDMO.Database object from the server's Databases collection. The database reference has a helper method named ExecuteWithResults. This method executes a T-SQL query and returns a result set in a SQLDMO.QueryResults object.
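The connection sequence can be sketched as follows (a minimal illustration assuming Windows Authentication; the variable names are not from the original script, and SQL-DMO must be installed on the machine):

```vbscript
' Sketch: connect to the server with Windows Authentication.
Set oServer = CreateObject("SQLDMO.SQLServer")
oServer.LoginSecure = True            ' use Windows Authentication
oServer.Connect sServerName           ' e.g. "devdb"

' Get the target database and execute the query.
Set oDatabase = oServer.Databases(sDatabaseName)
Set oResults = oDatabase.ExecuteWithResults(sQuery)

WScript.Echo "Rows returned: " & oResults.Rows

oServer.DisConnect
```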
The query argument is parsed by the GetTableName function to get the table name in the query. The table name is then used by the GetColsToIgnore function, which uses the SQLDMO.Database object to get a SQLDMO.Table reference. This is necessary because the script must read the columns of that table to determine the identity, timestamp and computed fields, which should not be scripted. We should note that in some cases the identity values need to be scripted as well; in that case the corresponding insert script uses the SET IDENTITY_INSERT statement to allow explicit values to be inserted into the identity column.
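For example, when identity values must be preserved, the generated inserts can be wrapped like this (the column names here are hypothetical, based on the setting table from the usage example above):

```sql
SET IDENTITY_INSERT [setting] ON
INSERT [setting] (id, name, value) VALUES (102, 'MaxRetries', '5')
INSERT [setting] (id, name, value) VALUES (103, 'TimeoutSec', '30')
SET IDENTITY_INSERT [setting] OFF
```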
The next step is to get the list of columns that are available on the QueryResults columns collection by calling the GetColumns function. This allows us to construct the column list part of the insert/update statement. The column names that match any item in the columns to ignore list are not included in the statement. At this point, the script would have constructed a SQL expression with this format:
For the query SELECT COL1, COL2 FROM MYTABLE WHERE ID = 10:

Insert statement format:
INSERT MYTABLE (COL1, COL2) VALUES ([COL1], [COL2])

Update statement format:
UPDATE MYTABLE SET COL1 = [COL1], COL2 = [COL2]
The values with square brackets are tokens that will be replaced by the actual column value.
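For instance, assuming a row where COL1 is the number 5 and COL2 is the text John's entry, the finished statements would look like this (an illustration; the where clause for the update is built in a later step):

```sql
INSERT MYTABLE (COL1, COL2) VALUES (5, 'John''s entry')
UPDATE MYTABLE SET COL1 = 5, COL2 = 'John''s entry' WHERE ID = 10
```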
We now need to iterate through the QueryResults rows collection to obtain all the column values for each row. The ColumnName and ColumnType properties and the GetColumnString method of the QueryResults object are used to obtain the name, data type and value of each field. The helper function GetColumnValue determines how to format each value. For example, if the data type is VARCHAR and the data contains single quotes, the script needs to escape those characters to avoid breaking the SQL syntax. The square-bracket tokens are then replaced by their corresponding values.
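The quote handling can be sketched like this (FormatSqlValue is an illustrative stand-in for the logic inside GetColumnValue, not the original function):

```vbscript
' Sketch of value formatting (not the original GetColumnValue).
' Text values get embedded single quotes doubled and are wrapped in
' quotes; NULLs become the NULL keyword; other values pass through.
Function FormatSqlValue(vValue, bIsText)
    If IsNull(vValue) Then
        FormatSqlValue = "NULL"
    ElseIf bIsText Then
        FormatSqlValue = "'" & Replace(vValue, "'", "''") & "'"
    Else
        FormatSqlValue = CStr(vValue)
    End If
End Function

' FormatSqlValue("John's entry", True) returns 'John''s entry'
```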
The last part of the SQL statement is the where clause, which is only applicable to an update statement. A call to the GetWhereClause helper function returns this segment. Once all records have been processed, the script writes the statements to the file specified by the filename argument. If the file already exists, the new statements are appended to the end of the file. This works well if you want to create a single SQL file for all your statements.
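The append behavior can be achieved with the FileSystemObject, roughly as follows (a sketch; sFileName and sScript are illustrative names, not the original script's variables):

```vbscript
' Sketch: append the generated statements to the output file.
Const ForAppending = 8
Set oFso = CreateObject("Scripting.FileSystemObject")
Set oFile = oFso.OpenTextFile(sFileName, ForAppending, True)  ' True = create if missing
oFile.WriteLine sScript
oFile.Close
```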
If you prefer to make your updates directly in SQL Server, this script can be used to publish the latest changes to a SQL file. Those scripts can then be kept under version control and incorporated into your deployment process. This would definitely eliminate the need to restore a SQL Server backup after your changes mysteriously/magically disappear.
Limitations

- Does not export identity fields. This could be added for the insert statement.
- Does not support SQL logins (Windows Authentication only)
- Could be enhanced to read an XML configuration file or a SQL table in order to connect to different servers/databases and execute many queries.