Ever tried to simulate a production environment in a development one, only to realize you couldn't easily start 500 executions simultaneously?
In a development environment we certainly don't have the tools to simulate production load, which is why some companies set up a pre-production (or staging) machine just to have an "as close as possible" test environment.
One hard task is to simulate real parallelism between executions of the same script, checking how SQL Server will handle transaction isolation levels, locks and deadlocks, performance issues, or even hardware bottlenecks.
There are stress-test scripts and tools that can push your CPU to 100% and max out memory and disk I/O, but they don't test how your code reacts to multiple concurrent executions of itself.
A single line can provide the parallelism and data required for your testing environment:
C:\>FOR /L %i in (1,1,100) do start /b osql -Syourserver -Uyouruser -Pyourpass -q"yourSQLscripts" < C:\EXIT
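If your script lives in a .sql file rather than inline, the same one-liner works with osql's -i switch instead of -q (the file name below is just a placeholder):

```
C:\>FOR /L %i in (1,1,100) do start /b osql -Syourserver -Uyouruser -Pyourpass -i C:\yourscript.sql < C:\EXIT
```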
I encapsulated it within this procedure:
@executions INT -- times your script will be executed
, @sql VARCHAR(8000) -- the script to be executed
, @server VARCHAR(255) -- connection info: server name or IP address
, @user VARCHAR(255) -- connection info: username
, @pass VARCHAR(255) -- connection info: password
, @debug INT = 0 -- turns on extra prints and resultset display
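Only the parameter header is reproduced above, not the procedure body. Purely as a hedged sketch (the variable names and EXIT-file handling here are my assumptions, pieced together from the notes that follow), the body might be assembled like this:

```sql
-- Hypothetical sketch only: the real procedure body is not shown in the
-- article. Assumes xp_cmdshell is enabled and C:\ is writable.
CREATE PROCEDURE sp_multiple_execute
      @executions INT            -- times your script will be executed
    , @sql        VARCHAR(8000)  -- the script to be executed
    , @server     VARCHAR(255)   -- connection info: server name or IP address
    , @user       VARCHAR(255)   -- connection info: username
    , @pass       VARCHAR(255)   -- connection info: password
    , @debug      INT = 0        -- turns on extra prints and resultset display
AS
BEGIN
    DECLARE @cmd VARCHAR(8000)

    -- create the empty EXIT file that forces each cmd session to close
    EXEC master..xp_cmdshell 'echo. > C:\EXIT', no_output

    -- build the FOR /L + START /B + osql one-liner shown earlier
    SET @cmd = 'for /L %i in (1,1,' + CAST(@executions AS VARCHAR(10)) + ') do '
             + 'start /b osql -S' + @server + ' -U' + @user + ' -P' + @pass
             + ' -q"' + @sql + '" < C:\EXIT'

    IF @debug = 1 PRINT @cmd

    EXEC master..xp_cmdshell @cmd
END
```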
- I used the shell version of FOR to keep things simple for the OS to handle.
- I used the START command with /b to generate the desired parallelism, but in a single "window".
- I used the osql command to execute the desired SQL script, passing the connection info with the -S, -U and -P parameters.
- I had to use the "< C:\EXIT" redirection because, for some reason, even after returning the prompt, in some tests the cmd sessions never closed on their own and I had to kill them manually on the server.
- I didn't use any 2005/2008 features like VARCHAR(MAX), so this script can be used on SQL Server 2000; obviously, though, the script you're trying to run is what determines the server requirements. This can cause problems with large resultsets because of the 8000-character limit of the #temp tables used, but feel free to change it to MAX to get enough space for your resultset.
- You will need permission to read/write C:\ (or any other directory); change this path on lines 20, 32 and 43 of the procedure to the final destination of the temp file named "EXIT".
- You will also need permission to run master..xp_cmdshell; check the sp_configure options to learn how to enable it.
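On SQL Server 2005/2008, xp_cmdshell is disabled by default and can be enabled with sp_configure as shown below (on SQL Server 2000 it is governed by permissions on the extended procedure instead); this requires sysadmin rights:

```sql
-- Enable xp_cmdshell (SQL Server 2005/2008)
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE
```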
This code starts 100 near-simultaneous executions of a script that inserts its start time into a test table, waits 5 seconds, and then updates that row with the finish time.
CREATE TABLE master..test ( id INT IDENTITY(1,1), start DATETIME, finish DATETIME )
DECLARE @sql varchar(1024)
SET @sql = ' insert into master..test(start) values ( getdate() )' + char(13)
+ ' waitfor delay ''00:00:05''' + char(13)
+ ' update master..test set finish = getdate() where id = @@identity'
EXEC sp_multiple_execute 100, @sql, 'smonreal', 'sa', '', 0
SELECT * FROM master..test ORDER BY 1
DROP TABLE master..test
id  | start                   | finish
----|-------------------------|------------------------
1   | 2009-09-16 12:12:00.373 | 2009-09-16 12:12:05.420
2   | 2009-09-16 12:12:00.390 | 2009-09-16 12:12:05.420
3   | 2009-09-16 12:12:00.390 | 2009-09-16 12:12:05.437
4   | 2009-09-16 12:12:00.390 | 2009-09-16 12:12:05.437
... | ...                     | ...
100 | 2009-09-16 12:12:03.030 | 2009-09-16 12:12:08.047
If the @debug parameter is set to 0, only the total duration is recorded; by total I mean up to the moment the last of the 100 sessions returned the prompt to the main window.
Otherwise, the executed script is returned along with all resultsets and the duration.
And happy testing!