Out of memory error

  • Hello,

    I’m developing a report in SSRS 2008. When I preview the report, it runs for a while and then gives me an error saying:

    An error occurred during report processing. Exception of type 'System.OutOfMemoryException' was thrown.

    The report has nearly 200k rows, but my system has more than 30 GB of free space.

    Can anyone help me out with it?

  • Hello,

    Before suggesting any solution, I want to clarify one thing.

    You say "my system has more than 30 GB of free space". I suspect you are talking about disk space there, and that your actual issue is that little or no physical memory (RAM) is available for SSRS.

    So, assuming I am correct in my thinking, you can try the following steps.

    If the SQL Server database that is the source of your report data is on the same server, it might be eating up all the RAM, leaving none available for your report processing. You can restrict SQL Server's physical memory usage through sp_configure. Check out this link:

    http://www.sqlservercentral.com/blogs/glennberry/archive/2009/10/29/suggested-max-memory-settings-for-sql-server-2005_2F00_2008.aspx
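
    For example, a rough sketch of capping SQL Server's memory with sp_configure (the 24576 MB value below is purely illustrative; pick a number appropriate for your server):

        -- Allow the advanced sp_configure options to be changed
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;

        -- Cap SQL Server's memory usage (value is in MB; 24576 MB = 24 GB, illustrative only)
        EXEC sp_configure 'max server memory (MB)', 24576;
        RECONFIGURE;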

    Also, you can reserve memory for SSRS by adding a <WorkingSetMaximum> element (the amount of memory, in KB, that you want to reserve for SSRS) to the SSRS configuration file, RSReportServer.config. Search the config file for the <MemoryThreshold>90</MemoryThreshold> setting and put the newly added element just above it.

    To be on the safe side, keep a copy of the config file before making any changes so that you can revert if required.
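
    A rough sketch of how the change might look in RSReportServer.config (the 4194304 KB value, i.e. 4 GB, is only an illustration, not a recommendation):

        <WorkingSetMaximum>4194304</WorkingSetMaximum>
        <MemoryThreshold>90</MemoryThreshold>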

    Thanks.

    I'm guessing that your SSRS instance is not on the same physical machine as your source database. If it is, then some of this won't apply, but the same concepts will...

    What happens to the 200k rows when they get to your SSRS server? I'm hoping they get condensed, aggregated, etc. into a much smaller report. If that is indeed the case, why not make your database server, rather than SSRS, do the heavy lifting? Try doing some of your aggregation in your dataset (query, stored procedure, whatever it is); there is a sketch of the idea after the two points below. This will help you in two ways:

    1) You won't starve your SSRS instance of memory.

    2) Your report should render much faster. As things stand, your server needs to wait while the 200k rows are transferred over the network (which, depending on latency, can be the greatest bottleneck), and then SSRS needs to aggregate it all down for your report. While SSRS does this reasonably well, your SQL Server will probably do it better (faster) given appropriate indexes, good code, etc.
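
    To illustrate the idea, a minimal sketch (the table and column names here are hypothetical, just to show the shape of the change):

        -- Instead of pulling every detail row across the network and letting SSRS group it...
        SELECT Region, Product, OrderDate, Quantity, Amount
        FROM dbo.Sales;

        -- ...let the database return only the pre-aggregated rows the report actually displays.
        SELECT Region, Product, SUM(Amount) AS TotalAmount, COUNT(*) AS OrderCount
        FROM dbo.Sales
        GROUP BY Region, Product;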

    If you need help rewriting your query, please see the first post in my signature to make sure you provide the appropriate information to help us help you.

    -Luke.

    To help us help you, read this. For better help with performance problems, please read this.

    SSRS is a great tool, but it does not handle large datasets well. It often does not paginate correctly, and if you export large reports to Excel, SSRS eats up a ton of CPU and RAM and sometimes cannot complete the export within the timeout window.

    If you have any row-level expressions in the report definition itself (NOT a good idea...), then any time you run the report, be prepared for SSRS to chew through CPU and memory like it's candy.

    I have encountered these reasons and plenty more, and they all say: keep the result set that is fed to an SSRS report SMALL. Aggregate at the database layer and let SSRS do only the absolute bare minimum of processing and work. If you need to export a large dataset, do your server a favor and use SSIS...

  • Hello,

    My SSRS instance is not on the same physical system as my database.

    I’m developing a drill-down report and I need all the rows. I have already done all my calculations at the database level; the report is just displaying the rows.

    Thanks

    Why not have the report display just the aggregates, create a subreport that has the details, and link to the subreport through an action? That way, if the end user DOES need the details, instead of using the drilldown functionality they just link over to the subreport. The initial report stays small and will likely not run into memory issues, and if the user really does need to see the details, you will still likely not be returning all 200,000 rows, just a detail subset of them.

    I need to develop a report which has 4 levels of drill-down, so instead of creating 4 subreports I thought of using toggles and developing it as a single report.

    Spending a few extra minutes or hours developing the other (extra) reports (since you'll probably be reusing the same general format) will, over the lifetime of the report, cost your company less in processing time than pulling those 200k records each and every time. It's not SSRS that's failing on large result sets; it's more the network resources. Fast Ethernet or even Gigabit Ethernet is way, way slower than the memory, disk I/O, and CPU paths on the servers, particularly if you're pulling over very wide records (i.e., 5, 10, 15 or more columns) and then condensing/aggregating them into very narrow (3-4 or fewer columns) result sets.

    Most times you're bringing back large char or varchar columns just to count them, returning a very small int column. If you do that at the database instead, returning a few bytes or kilobytes of int data rather than megabytes of varchar columns, you end up with much faster execution times. It doesn't matter which reporting solution you use.

    -Luke.

    To help us help you, read this. For better help with performance problems, please read this.

    Even if I develop 4 subreports, when my user wants to see the 4th level, which again has 200k rows, I think it will be the same issue again. Please let me know if I’m wrong.

    This issue occurs only when I run the report locally on my system. It runs fine when I run it from Report Manager.

    I developed this report in Crystal Reports and it works fine when retrieving 200k rows.

    Also, if it runs fine on the server but not on your development machine, your dev machine probably needs more RAM.

    But this is what I was trying to say earlier:

    when my user wants to see the 4th level which has again 200k rows

    When they expand it out to the fourth level, do they really need to see all 200k rows? They certainly won't all fit in the same display area, so there will be a lot of pages, right? You should be able to structure the report to drill into a subset of the whole 200k without having to display the entire set, unless they are then exporting it to Excel or a text file either a) to import into another system (which SSIS or bcp will most likely do a much better job of) or b) to do additional manipulations on the data in Excel, in which case your report design is flawed. The idea behind a report is that users get the data as they need it, not that they have to do additional manipulations on it later. A rough sketch of a subset-style detail query is below.
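
    Something along these lines, for example (the table, column, and parameter names are hypothetical, just to show the pattern of returning only the slice the user drilled into):

        -- Detail dataset for the lowest drill level: return only the rows for the
        -- group the user selected, not all 200k detail rows.
        SELECT OrderDate, CustomerName, Quantity, Amount
        FROM dbo.Sales
        WHERE Region = @Region        -- report parameters passed in via the
          AND Product = @Product      -- drill-through action or subreport link
        ORDER BY OrderDate;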

    -Luke.

    To help us help you, read this. For better help with performance problems, please read this.

    Also remember that when you have drilldown, especially 4 levels of drilldown, SSRS has to churn over the entire dataset to determine which record falls under which grouping, even if you are doing the grouping in SQL Server. When you are churning a 200,000-row report, there is an overhead associated with that.

    Breaking up your report to serve up only subsets of those 200,000 rows is going to spare you a lot of headache down the road. Either that, or add more RAM...
