"There is insufficient system memory in resource pool 'internal' to run this query"

  • If you are getting a 503 Server Unavailable error, it's not a DB issue; it is actually being thrown by the web server. Check the web server log, or debug, to confirm whether it really comes from the DB. Most of the time this happens because the web server has a limit on the number of user connections; when that limit is exceeded it starts throwing exceptions. Try increasing the limit on the server.

  • We've recently experienced this problem and found a solution that appears to have resolved the issue.

    We were running SQL Server 2008 (10.0.4000), but the database in question was in compatibility mode 80 (SQL Server 2000). Changing the compatibility mode to 100 (SQL Server 2008) has so far prevented the issue from recurring.
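    In case it helps anyone trying the same fix, this is how the compatibility level can be checked and changed ('MyDb' stands in for the actual database name):

    ```sql
    -- Check the current compatibility level of the database
    SELECT name, compatibility_level
    FROM sys.databases
    WHERE name = 'MyDb';

    -- Raise it to SQL Server 2008 (level 100)
    ALTER DATABASE MyDb SET COMPATIBILITY_LEVEL = 100;
    ```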

    -Phineas

  • Was this problem solved? I found this too:

    http://support.microsoft.com/kb/982854

    If possible, would you post how you made out?

    Thank you!

  • This is quite stupid, but I think it applies: when I used AWE on Windows Server 2003 Standard on a machine with 16 GB, I was sure SQL Server was using AWE memory, and it was. The problem is that on Server 2003 Standard the /PAE switch will NOT give access to memory beyond 4 GB. According to this:

    http://msdn.microsoft.com/en-us/windows/hardware/gg487503

    So memory is, in fact, effectively locked in memory because of the AWE switch (instead of using memory beyond 4 GB, it locks memory below 4 GB), and at the point where SQL Server runs out of memory (around 3.2 GB) it starts giving this message, and it will not go away until you restart it, because it thinks it can use up to 28 GB and behaves unhappily.

    The solution was to stop using AWE on Standard edition, leaving PAE on for an extra couple hundred MB, depending on the system.
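    For anyone wanting to verify or undo this, AWE is exposed through sp_configure as an advanced option; something like this should do it (the change takes effect after a SQL Server restart):

    ```sql
    -- AWE is an advanced option, so expose advanced options first
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;

    -- Disable AWE (requires a SQL Server service restart to take effect)
    EXEC sp_configure 'awe enabled', 0;
    RECONFIGURE;
    ```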

  • Hi All,

    I am very new to databases, and especially to SQL Server.

    I have developed an application using Core Java (1.5) and SQL Server 2008 Enterprise Edition.

    My application has been running for the last 5 months. There are 10 Java services running and storing data in my DB. Each day (at 12 am) a new table is created and values are stored in it.

    But after 4 months it started throwing the exception "Not enough memory in resource pool 'default' to run this query".

    After facing this problem I would restart my DB services and it would work again. Then, exactly one week later, I faced the problem again.

    I searched for it on the net and increased the CPU and memory allocation from 40% to 100%. It then started giving this exception after two and a half weeks.

    I checked the DB connections from my Java services; I was not closing some connections, so I closed them as well.

    But still there was no improvement.

    Then I closed all ResultSet and Statement handles in my Java services, but still nothing happened.

    I would like to mention that I am using the same username and password for all services, and my GUI also uses the same.

    To check the used memory of the resource pool, I run this query:

    select * from sys.dm_resource_governor_resource_pools where name='default'

    which currently shows:

    max_memory_kb = 1555928
    used_memory_kb = 1135664
    max CPU and memory = 100

    and then I run this query:

    DBCC FREESYSTEMCACHE ('ALL','default')

    That reduced my used memory. I could write a service that checks the used memory and, if it exceeds my threshold, runs the free-system-cache query. But is that the right and optimized way, or am I missing something?
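    For reference, the check I have in mind would look something like this (the 1,000,000 KB threshold is just an example value, not a recommendation):

    ```sql
    -- Hypothetical threshold check: free the cache only when used memory is high
    IF (SELECT used_memory_kb
        FROM sys.dm_resource_governor_resource_pools
        WHERE name = 'default') > 1000000
    BEGIN
        DBCC FREESYSTEMCACHE ('ALL', 'default');
    END
    ```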

    But I am not sure whether this will work or not, and I have no idea why my used memory keeps increasing day by day.

    Please help me with this in any way you can.

    Thanking You All

  • Hi sharda,

    This seems to be a problem in your application. I suggest you do not run the free-memory-cache command in SQL Server again and again, and do not create a service to clear it.

    As you said, some connections were not getting closed in the application. Because of this, memory in SQL Server was also not being cleared, causing data to be held in memory for a long time. So when you tried to connect with new connections, you got this error.

    Temporarily you can keep using the free-memory-cache command in SQL Server, as you are doing right now. But I suggest you fix the issue in the application by closing all the connections properly.
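    To see whether connections are piling up, a generic query against sys.dm_exec_sessions along these lines can help spot sessions that have been sitting idle (the 30-minute cutoff is just an example):

    ```sql
    -- List user sessions idle for more than 30 minutes (possible leaked connections)
    SELECT session_id, login_name, program_name,
           login_time, last_request_end_time
    FROM sys.dm_exec_sessions
    WHERE is_user_process = 1
      AND last_request_end_time < DATEADD(MINUTE, -30, GETDATE());
    ```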

  • Thanks for your reply.

    As I mentioned, initially my DB connections were not being closed properly, but now I am closing all the connections.

    I am using the query below to get the number of open connections:

    select s.program_name, w.name as workload_group, p.name as pool_name, COUNT(*) AS total
    FROM sys.dm_exec_sessions AS s
    JOIN sys.resource_governor_workload_groups AS w
        ON s.group_id = w.group_id
    JOIN sys.resource_governor_resource_pools AS p
        ON w.pool_id = p.pool_id
    WHERE p.name like '%default%'
    GROUP BY s.program_name, w.name, p.name

    If this is not the proper query, please let me know, as I have only very basic knowledge of databases.

    Thanks

  • Check the max server memory assigned to SQL Server using sp_configure. You may need to enable advanced options to view this. Reduce max server memory by 1 GB or 512 MB and check whether that resolves your issue.

    You will have to change the value by trial and error ;-). Changing this setting flushes the memory pools; by reducing max server memory, memory occupied by SQL Server will be released to the OS, and your backups should succeed. If they still fail, try reducing it by a further 512 MB and check again. Also keep in mind that this may slightly degrade application performance, so once you have changed the setting, check with your app buddies on performance.
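    The steps above would look something like this (2048 MB is just an example value; tune it by trial and error as described):

    ```sql
    -- 'max server memory' is an advanced option, so expose it first
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;

    -- Cap SQL Server's memory at 2 GB (example value)
    EXEC sp_configure 'max server memory (MB)', 2048;
    RECONFIGURE;
    ```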

  • Hi Sandy! I see that you are new to the SSC.com forums. Welcome! Please note that we try not to post to old threads.

    Best,
    Kevin G. Boles
    SQL Server Consultant
    SQL MVP 2007-2012
    TheSQLGuru on googles mail service
