701: There is insufficient system memory to run this query
Posted Tuesday, November 13, 2007 5:01 AM
Forum Newbie

We are running a .sql script, created by tablediff.exe, using SQLCMD.

The script execution fails with the error: '701: There is insufficient system memory to run this query.'

Here is the SQL Server configuration:

* Operating System: Windows Server 2003
* Database: SQL Server 2005 Standard Edition SP2
* Main Memory: 4 GB
* Virtual Memory: 6 GB
* Processors: 2 × Intel Xeon 1.6 GHz

Since we are replicating a database from a staging database, we are using tablediff.exe.

The script file created by tablediff can be up to 30 MB in size.

These are the SQL Server settings we are currently using:

* Min Server Memory: 0
* Max Server Memory: 2147483647
* Min Memory Per Query: 2048 (KB)
* AWE Enabled: No


Any ideas on where the issue is?

Here is the stored procedure that executes the script:

USE [ODS]
GO
/****** Object: StoredProcedure [dbo].[sp_TableDiff4] Script Date: 11/12/2007 05:58:32 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[sp_TableDiff4]
-- Parameters for the stored procedure
(@sourceTable varchar(100),
@targetTable varchar(100),
@filepath varchar(100))
AS
BEGIN
DECLARE @sTableDiff nvarchar(1000)
DECLARE @sPath nvarchar(100)
DECLARE @sTableName nvarchar(40)
DECLARE @a nvarchar(50)
DECLARE @sSQLCMD nvarchar(200)
-- Build the tablediff command line
SET @sTableDiff = '"C:\Program Files\Microsoft SQL Server\90\COM\tablediff" -sourceserver GRSRV4 -sourceuser username -sourcepassword pwd -sourcedatabase ODS-STG -sourcetable ' + @sourceTable + ' -destinationserver GRSRV4 -destinationuser username -destinationpassword password -destinationdatabase ODS -destinationtable ' + @targetTable + ' -f ' + @filepath
PRINT @sTableDiff
EXEC xp_cmdshell @sTableDiff
SET @sTableName = @sourceTable
PRINT @sTableName
SET @sPath = @filepath
-- Note: -S (uppercase) is sqlcmd's server option; lowercase -s sets the column separator
SET @sSQLCMD = 'sqlcmd -S GRSRV4\SQL -U username -P pwd -d ODS -i ' + @sPath
PRINT @sSQLCMD
EXEC xp_cmdshell @sSQLCMD
-- Remove the generated script file
SET @a = 'del ' + @sPath
PRINT @a
EXEC xp_cmdshell @a
END
Post #421477
Posted Tuesday, November 13, 2007 5:56 AM
SSC Journeyman

Hi

I presume the script is so big because this is your initial setup of all the tables?

Or do you expect it to be this big every time you move from staging to production?

Is it not possible to take the script text, break it into smaller pieces, and run them separately, just this once? Subsequent updates should produce a smaller script.
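One way to try that batching approach — a sketch of my own, not something from the thread — is a small script that splits the tablediff output into chunks at the GO separators, so each chunk can be fed to sqlcmd as its own, much smaller batch (the function name and chunk size are my own choices):

```python
def split_sql_script(text, batches_per_file=100):
    """Split a T-SQL script into chunks of at most batches_per_file GO-batches."""
    batches, current = [], []
    for line in text.splitlines():
        if line.strip().upper() == "GO":
            # End of one batch: collect it and start the next
            batches.append("\n".join(current))
            current = []
        else:
            current.append(line)
    if current:  # trailing statements with no final GO
        batches.append("\n".join(current))
    # Regroup batches into larger chunks, re-inserting GO between them
    return [
        "\nGO\n".join(batches[i:i + batches_per_file]) + "\nGO"
        for i in range(0, len(batches), batches_per_file)
    ]
```

Each chunk could then be written to its own .sql file and run with `sqlcmd -i`, so the server never has to compile one enormous statement in a single batch.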

Otherwise, have a look at SQL Packager from Red Gate. It works pretty well, and it does more than just tables. (I'm not a rep for Red Gate; I just appreciate their products.)

Post #421522
Posted Tuesday, November 13, 2007 7:50 AM
Forum Newbie

Yes, I tried that too. I broke the script into files of 10,000 records each and ran them one after another, but the same error still occurs.

My client doesn't allow third-party libraries/tools, so that route doesn't help me.

Any suggestions for releasing the memory periodically?
Post #421603
Posted Tuesday, November 13, 2007 8:47 AM
SSC Journeyman

Have you installed the latest service packs?

This error was addressed in Service Pack 1, but I'm not sure it will help with a 30 MB query.


I also found this article by Pinal Dave

What is the maximum amount of records you can process without getting this error?

Have you considered using replication?
Post #421639
Posted Wednesday, November 14, 2007 9:28 AM


Hall of Fame

Assuming the service pack doesn't work, have you thought about taking a different route?

Fair warning: I'm not an expert on replication. I know just enough to be dangerous.

Have you considered just taking and applying a new snapshot?

Or, if you are only working with one or two tables, using BCP or some other bulk-copy method to export the data and completely overlay the existing data?

I wouldn't recommend either if it weren't for the large number of changes you seem to be making already.
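As a rough illustration of the BCP route, in the same xp_cmdshell style as the original procedure — the table and file names here are hypothetical, and you'd want to truncate the destination table first if you really mean to overlay it:

```sql
-- Hypothetical sketch: export a table from staging and bulk-load it into
-- production. -n uses native format; -b commits every 10000 rows, which
-- keeps each transaction (and its memory footprint) small.
DECLARE @cmd nvarchar(500)
SET @cmd = 'bcp "ODS-STG.dbo.MyTable" out "d:\MyTable.dat" -S GRSRV4 -U username -P pwd -n'
EXEC xp_cmdshell @cmd
SET @cmd = 'bcp "ODS.dbo.MyTable" in "d:\MyTable.dat" -S GRSRV4 -U username -P pwd -n -b 10000'
EXEC xp_cmdshell @cmd
```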


Kenneth Fisher
I strive to live in a world where a chicken can cross the road without being questioned about its motives.
--------------------------------------------------------------------------------
For better, quicker answers on T-SQL questions, click on the following...
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/

Link to my Blog Post --> www.SQLStudies.com
Post #422165