701: There is insufficient system memory to run this query

  • We are running a .sql script, created by tablediff.exe, using SQLCMD.

    The script execution fails with the exception '701: There is insufficient system memory to run this query.'

    The following is the SQL Server configuration:

    * Operating System: Windows Server 2003

    * Database: SQL Server 2005 Standard Edition SP2

    * Main Memory: 4 GB

    * Virtual Memory: 6 GB

    * Processors: 2 × Intel Xeon 1.6 GHz

    Since we are replicating a database from a staging database, we are using tablediff.exe.

    The script file created by tablediff can be up to 30 MB in size.

    These are the SQL Server memory settings we are using now:

    * Min Server Memory (MB): 0

    * Max Server Memory (MB): 2147483647

    * Min Memory Per Query (KB): 2048

    * AWE Enabled: No

    Any ideas on where the issue is?
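
    As a quick sanity check of what the engine actually has in effect, the running values can be read back from sys.configurations (a minimal sketch; the view is available in SQL Server 2005):

    -- value_in_use shows the setting the running instance is honouring
    SELECT name, value, value_in_use
    FROM sys.configurations
    WHERE name IN ('min server memory (MB)', 'max server memory (MB)',
                   'min memory per query (KB)', 'awe enabled')
    ORDER BY name;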

    Here is my stored procedure, which executes the script:

    USE [ODS]
    GO

    /****** Object: StoredProcedure [dbo].[sp_TableDiff4] Script Date: 11/12/2007 05:58:32 ******/
    SET ANSI_NULLS ON
    GO

    SET QUOTED_IDENTIFIER ON
    GO

    ALTER PROCEDURE [dbo].[sp_TableDiff4]
        -- Parameters: source and target table names, and the path for the generated fix-up script
        (@sourceTable varchar(100),
         @targetTable varchar(100),
         @filepath    varchar(100))
    AS
    BEGIN
        DECLARE @sTableDiff nvarchar(1000)
        DECLARE @sPath      nvarchar(100)
        DECLARE @sTableName nvarchar(40)
        DECLARE @a          nvarchar(200) -- wide enough for 'del ' plus the full file path
        DECLARE @sSQLCMD    nvarchar(500) -- wide enough for the full sqlcmd command line

        -- Build and run the tablediff command; -f writes a fix-up script to @filepath
        SET @sTableDiff = ' "C:\Program Files\Microsoft SQL Server\90\COM\tablediff" -sourceserver GRSRV4 -sourceuser username -sourcepassword pwd -sourcedatabase ODS-STG -sourcetable ' + @sourceTable + ' -destinationserver GRSRV4 -destinationuser username -destinationpassword password -destinationdatabase ODS -destinationtable ' + @targetTable + ' -f ' + @filepath
        PRINT @sTableDiff
        EXEC xp_cmdshell @sTableDiff

        SET @sTableName = @sourceTable
        PRINT @sTableName

        -- The generated fix-up script lives at @filepath
        SET @sPath = @filepath

        -- Run the generated script; -S (capital) is the sqlcmd server option
        -- (lowercase -s sets the column separator)
        SET @sSQLCMD = 'sqlcmd -S GRSRV4\SQL -U username -P pwd -d ODS -i ' + @sPath
        PRINT @sSQLCMD
        EXEC xp_cmdshell @sSQLCMD

        -- Delete the generated script file
        SET @a = 'del ' + @sPath
        PRINT @a
        EXEC xp_cmdshell @a
    END
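
    For completeness: the procedure depends on xp_cmdshell, which is disabled by default in SQL Server 2005, so it was presumably enabled with something like this (a sketch; requires sysadmin rights, and many sites forbid it on security grounds):

    -- Enable xp_cmdshell (off by default in SQL Server 2005)
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1;
    RECONFIGURE;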

  • Hi

    I presume the script is so big because this is your initial setup of all the tables?

    Or do you expect it to be this big every time you move from staging to production?

    Is it not possible to get the script text, break it down into smaller pieces, and run them separately? You would only need to do that this time; for subsequent updates the script should be smaller (see the sketch after this post).

    Otherwise, have a look at SQL Packager from Red-Gate. It works pretty well, and it does more than just tables. (I'm not a rep for Red-Gate; I just appreciate their products.)
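
    To illustrate that batching idea (a hypothetical excerpt; the table, column, and key values are invented): if the generated script arrives as one long run of statements, inserting GO separators every few thousand statements lets sqlcmd send each batch on its own, so the server can free parse and compile memory between batches:

    -- Hypothetical excerpt of a split fix-up script. Each GO ends a batch;
    -- sqlcmd submits batches one at a time, so memory used to parse and
    -- compile one batch can be released before the next begins.
    UPDATE [dbo].[Customers] SET [City] = N'Chennai' WHERE [CustomerID] = 10001
    UPDATE [dbo].[Customers] SET [City] = N'Mumbai' WHERE [CustomerID] = 10002
    GO
    UPDATE [dbo].[Customers] SET [City] = N'Delhi' WHERE [CustomerID] = 10003
    GO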

  • Yes, I have tried that too. I broke the script into batches of 10,000 records and ran the script files one after another, but it didn't solve my problem; the same error occurs.

    My client doesn't approve of third-party libraries/tools, so that route doesn't help me.

    Any suggestions on how to release the memory periodically?
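
    For what it's worth, one blunt way to force memory to be released between script runs is to flush the caches (a sketch only; both commands act instance-wide and everything recompiles afterwards, so expect a performance hit):

    -- Flush cached query plans, then the remaining system caches.
    -- Instance-wide and disruptive; use only between maintenance runs.
    DBCC FREEPROCCACHE;
    DBCC FREESYSTEMCACHE ('ALL');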

  • Have you installed the latest service packs?

    This error was addressed in Service Pack 1, but I'm not sure whether that will help with a 30 MB query.

    I also found an article by Pinal Dave on this error.

    What is the maximum number of records you can process without getting this error?

    Have you considered using replication?

  • Assuming the service pack doesn't work, have you thought about taking a different route?

    And fair warning: I'm not an expert on replication. I know just enough to be dangerous.

    Have you considered just taking and applying a new snapshot?

    Or, if you are just working with one or two tables, using bcp or some other method to export the data and completely overlay the existing data? Again using bcp or BULK COPY or something like that (see the sketch after this post).

    I wouldn't recommend either if it weren't for the large number of changes you seem to be making already.

    Kenneth Fisher
    I was once offered a wizard's hat but it got in the way of my dunce cap.
    For better, quicker answers on T-SQL questions: http://www.sqlservercentral.com/articles/Best+Practices/61537/
    For better answers on performance questions: http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
    Blog: www.SQLStudies.com
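
    A rough sketch of that bcp-and-overlay route, reusing the server name from the thread with invented table and file names (assumes xp_cmdshell is available and no foreign keys block the TRUNCATE):

    -- Export the staging table in native format (names are illustrative)
    EXEC xp_cmdshell 'bcp "[ODS-STG].dbo.Customers" out D:\exports\Customers.dat -n -S GRSRV4 -U username -P pwd';

    -- Overlay the production copy with the exported file
    TRUNCATE TABLE ODS.dbo.Customers;

    BULK INSERT ODS.dbo.Customers
    FROM 'D:\exports\Customers.dat'
    WITH (DATAFILETYPE = 'native', TABLOCK);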
