Configuration :: IIS Memory Usage Is Extreme - Is That Normal?
Jan 6, 2011
I have an ASP.NET 3.5 app that collects data from a handful of external pages, parses the relevant bits and displays them in a table. Total data retrieved is 3-4MB and the resulting page is about 1MB. I am using synchronous WebRequest GetResponse for the retrieval, but the same problem existed using an asynchronous BeginGetResponse/EndGetResponse process.
There is no database access and no session storage, but there is an in-memory list of about 100 objects (roughly 1 MB of data), plus a good amount of AJAX (AjaxControlToolkit). This issue appears on the very first run of the app, even after restarting IIS.
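The retrieval is essentially this shape (a reconstructed sketch, not the exact code; it assumes HttpWebRequest and that every response, stream and reader is disposed, since leaked responses are a common way for this kind of app to balloon):

[code]
using System.IO;
using System.Net;

public static class PageFetcher
{
    // Fetches one external page; all IDisposable pieces are in using
    // blocks so their buffers can be reclaimed after each request.
    public static string FetchPage(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(stream))
        {
            return reader.ReadToEnd();
        }
    }
}
[/code]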
The issue:
When I run the app on my dev computer, the maximum commit charge is about 1.5GB. The biggest user, measured by Task Manager's VM Size, is WebDev.WebServer.exe (600MB). The app runs perfectly.
When I run it on my rent-a-server (IIS 7.5, 1GB RAM), the maximum commit charge is over 3.8GB. The biggest user is w3wp.exe at 2.7GB. IIS grinds to a halt and spits out a timed-out error page.
Given my limited server budget and the hope of having multiple simultaneous users, I'm kind of in a panic. Is this normal? If I bump the server RAM up to 4GB, will that be enough?
I have an UpdatePanel combined with a GridView that has sorting and paging. I go into Task Manager to monitor the memory usage of the worker process (w3wp), and what I do is just click on the sort buttons rapidly. With each click the memory of the process increases by about 2 MB, so I go from 30 MB of memory usage to about 90 MB. Then it stops and remains there; no memory is freed up. I am not using caching or session/application state.
What can be causing this? Is there a setting in IIS to reduce the memory usage?
I also used a .NET profiler to examine my app's memory usage: 4 MB. So what are the other 86 MB used for? Even though the profiler reports 4 MB, Task Manager says 90 MB, which leads me to believe that the rest is unmanaged memory that must be used by IIS in some way.
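The 4 MB vs. 90 MB split can be seen from inside the process itself; a small sketch (an illustration, not from the original post):

[code]
using System;
using System.Diagnostics;

public static class MemorySnapshot
{
    public static void Report()
    {
        // Managed heap only: roughly what a .NET profiler reports (the 4 MB).
        long managed = GC.GetTotalMemory(true);

        using (Process p = Process.GetCurrentProcess())
        {
            // Whole worker process: CLR, JITted code, loaded assemblies,
            // native buffers. Roughly what Task Manager shows (the 90 MB).
            Console.WriteLine("Managed heap:  {0:N0} bytes", managed);
            Console.WriteLine("Working set:   {0:N0} bytes", p.WorkingSet64);
            Console.WriteLine("Private bytes: {0:N0} bytes", p.PrivateMemorySize64);
        }
    }
}
[/code]

The difference between the two numbers is the unmanaged overhead of the CLR and IIS plumbing, which is why the profiler and Task Manager disagree.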
I have really tried to Google it but only articles about how to troubleshoot memory issues come up. Before I start to troubleshoot, I would like to know if my web site's memory usage is really abnormal or not.
It is an ASP.NET MVC 2 website that runs on IIS 7.5 in production. I guess normal memory usage depends on traffic, so here are the numbers for an average day:
300 unique visitors, 400 visits, 3,000 page views.
I would be really happy to get some idea of what normal memory usage is for this traffic. I would also be curious to know how memory usage typically increases as traffic grows.
On an ASP.NET site at my place of work, the following chunk of code is responsible for handling file downloads (note: Response.TransmitFile is not used here because the contents of the download are being streamed from a zip file):
[code]....
I've just read about the 'buffer' property of the Response object. If I set that to false, will that prevent the Response.BinaryWrite() calls from buffering the data in memory? In general, what is a good way to limit memory usage in this situation? Perhaps I should stream from the zip to a temporary file, then call Response.TransmitFile()?
EDIT: In addition to possible solutions, I'm very interested in explanations of the memory usage issue present in the code above. Why would this consume far more than 1MB, even though Response.Flush is called on every loop iteration? Is it just the unnecessary heap allocation that occurs on every loop iteration (and doesn't get GC'd right away), or is there something else at work?
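For comparison, here is a hedged sketch of the unbuffered approach (not the original code; the names and chunk size are made up). With BufferOutput set to false, each write goes to the client instead of accumulating in memory, so only one chunk's worth is ever held at a time:

[code]
using System.IO;
using System.Web;

public static class DownloadStreamer
{
    // Streams an already-open source stream (e.g. a zip entry) to the
    // client in fixed-size chunks.
    public static void StreamToClient(Stream source, HttpResponse response,
                                      string fileName)
    {
        response.BufferOutput = false;  // do not buffer writes in memory
        response.ContentType = "application/octet-stream";
        response.AppendHeader("Content-Disposition",
            "attachment; filename=" + fileName);

        byte[] buffer = new byte[64 * 1024];
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            if (!response.IsClientConnected)
                break;  // stop wasting work if the user cancelled
            response.OutputStream.Write(buffer, 0, read);
        }
    }
}
[/code]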
I am hosting a solution with an outside company, so getting them to troubleshoot or send me log files is not working.
Is there a way, from code or the Global.asax file, to trap memory usage in my web application when it reaches a certain amount, and then figure out what is using all the memory?
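One possibility is a sketch like the following (the 500 MB threshold and the logging call are placeholders, not anything from the hosted setup): check the worker process's memory on each request in Global.asax.

[code]
// Global.asax.cs
using System;
using System.Diagnostics;
using System.Web;

public class Global : HttpApplication
{
    private const long ThresholdBytes = 500L * 1024 * 1024; // hypothetical

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        using (Process p = Process.GetCurrentProcess())
        {
            if (p.PrivateMemorySize64 > ThresholdBytes)
            {
                // Swap in whatever logging is available on the host.
                Trace.WriteLine(string.Format(
                    "High memory: {0:N0} private bytes, {1:N0} managed bytes",
                    p.PrivateMemorySize64, GC.GetTotalMemory(false)));
            }
        }
    }
}
[/code]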
I would like to use output caching with WCF Data Services and although there's nothing specifically built in to support caching, there is an OnStartProcessingRequest method that allows me to hook in and set the cacheability of the request using normal ASP.NET mechanisms.
But I am worried about the worker process getting recycled due to excessive memory consumption if large responses are cached. Is there a way to specify an upper limit for the ASP.NET output cache so that if this limit is exceeded, items in the cache will be discarded?
I've seen the caching configuration settings but I get the impression from the documentation that this is for explicit caching via the Cache object since there is a separate outputCacheSettings which has no memory-related attributes.
Here's a code snippet from Scott Hanselman's post that shows how I'm setting the cacheability of the request.
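(The snippet itself did not survive the copy; the following is a reconstruction along the lines of that post, with the 60-second lifetime as an example value and MyEntities as a placeholder for the actual data context:)

[code]
using System.Data.Services;
using System.Web;

public class MyService : DataService<MyEntities>
{
    protected override void OnStartProcessingRequest(ProcessRequestArgs args)
    {
        base.OnStartProcessingRequest(args);

        // Route the response through the standard ASP.NET output cache.
        HttpContext context = HttpContext.Current;
        HttpCachePolicy c = context.Response.Cache;
        c.SetCacheability(HttpCacheability.ServerAndPrivate);
        c.SetExpires(context.Timestamp.AddSeconds(60));
        c.VaryByHeaders["Accept"] = true;
        c.VaryByHeaders["Accept-Charset"] = true;
        c.VaryByHeaders["Accept-Encoding"] = true;
        c.VaryByParams["*"] = true;
    }
}
[/code]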
I'm using ASP.NET and Chrome, and the page becomes unresponsive after a while. When I look at the Chrome debugger, I see that memory usage increases to about 80 MB, and Chrome pops up a request to kill the page. The error counter spins and has generated about 30,000 errors by the time the popup appears. What triggers the errors is a call to an UpdatePanel. The error as displayed in Chrome is "Failed to load resource".
The UpdatePanel: the user clicks on a button and the panel is refreshed. There's only one UpdatePanel on the page.
The errors compound on every postback: 1, 4, 11, 26, 57, 120, 247... and eventually Chrome kills the page. When I put a breakpoint in the function, it stops the code just after I press the button; then, after it hits the __doPostBack line, it starts the GetNewDate function again several times by going back to the line of the click event, apparently executing it the number of times shown above.
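One classic cause consistent with counts that compound like this is a client handler being re-registered on every partial postback; a hypothetical reproduction (not the actual page code):

[code]
using System;
using System.Web.UI;

public partial class SomePage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // BUG: this runs on full AND partial postbacks, so each refresh
        // of the UpdatePanel attaches one more GetNewDate handler, and
        // the handlers multiply with every click.
        ScriptManager.RegisterStartupScript(this, GetType(), "hookDate",
            "Sys.WebForms.PageRequestManager.getInstance()" +
            ".add_endRequest(GetNewDate);", true);
    }
}
[/code]

Registering the script only once (or detaching with remove_endRequest before re-attaching) stops the stacking.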
Is there a way to get load information on the application server, such as how much memory or CPU is being used at a given point? I want to either 1) limit users to specific functionality of the ASP.NET 3.5 application, or 2) deny users access to the application with a message like "Server is busy at the moment".
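A sketch of the measuring half (the counter names are the standard Windows ones; the thresholds are invented):

[code]
using System.Diagnostics;
using System.Threading;

public static class LoadGate
{
    private static readonly PerformanceCounter Cpu =
        new PerformanceCounter("Processor", "% Processor Time", "_Total");
    private static readonly PerformanceCounter FreeMem =
        new PerformanceCounter("Memory", "Available MBytes");

    public static bool IsBusy()
    {
        Cpu.NextValue();       // the first sample of a rate counter is 0
        Thread.Sleep(500);     // sample on a background timer in real code
        float cpuPercent = Cpu.NextValue();
        float freeMb = FreeMem.NextValue();
        return cpuPercent > 90f || freeMb < 100f;  // hypothetical limits
    }
}
[/code]

A check like this in Application_BeginRequest could then end the response early with the "Server is busy at the moment" message.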
Is there a way to check the memory usage (consumption) of individual controls on a web form shown in a browser, such as a Repeater control or a multiline TextBox? The reason I ask is that I am putting the Repeater control in Session and checking the status of the controls, and taking further actions based on that.
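There is no per-control number the browser can report (the memory lives in the worker process), but a rough server-side estimate is possible; a sketch, with GetData() as a stand-in for the real data source:

[code]
using System;
using System.Web.UI.WebControls;

public static class ControlFootprint
{
    public static long EstimateRepeaterBytes()
    {
        long before = GC.GetTotalMemory(true);  // collect, then snapshot

        Repeater repeater = new Repeater();
        repeater.DataSource = GetData();        // hypothetical data source
        repeater.DataBind();

        long after = GC.GetTotalMemory(true);
        GC.KeepAlive(repeater);                 // keep it alive until measured
        return after - before;                  // approximate managed bytes
    }

    private static string[] GetData()
    {
        return new[] { "row1", "row2", "row3" };
    }
}
[/code]

The same delta technique applies to whatever object graph is being put into Session.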
Environment.WorkingSet incorrectly reports the memory usage for a web site that runs on Windows Server 2003. (OS version: Microsoft Windows NT 5.2.3790 Service Pack 2; .NET version: 2.0.50727.3607.)
It reports the memory as Working Set (Physical Mem.): 1952 MB (2047468061).
The same web site runs locally on Windows Vista with a Working Set (Physical Mem.) of 49 MB (51924992).
I have limited access to the server, and support is very limited.
So I computed the total memory by traversing the address space with VirtualQuery.
The total of pages with state MEM_FREE is 1300 MB.
(I guess the server has 4 GB of RAM and PAE is not enabled; the max user-mode virtual address is 0x7fff0000.)
So I know the working set is not only about virtual memory. But is it normal to have such a high working set while it is very low on another machine?
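For reference, the VirtualQuery walk can be done from managed code with P/Invoke; a sketch of summing the MEM_FREE regions (under the 32-bit user-mode assumptions described above):

[code]
using System;
using System.Runtime.InteropServices;

public static class AddressSpaceWalker
{
    [StructLayout(LayoutKind.Sequential)]
    private struct MEMORY_BASIC_INFORMATION
    {
        public IntPtr BaseAddress;
        public IntPtr AllocationBase;
        public uint AllocationProtect;
        public IntPtr RegionSize;
        public uint State;
        public uint Protect;
        public uint Type;
    }

    private const uint MEM_FREE = 0x10000;

    [DllImport("kernel32.dll")]
    private static extern IntPtr VirtualQuery(IntPtr lpAddress,
        out MEMORY_BASIC_INFORMATION lpBuffer, IntPtr dwLength);

    // Walks the process's user-mode address space and totals the free regions.
    public static long TotalFreeBytes()
    {
        long free = 0;
        IntPtr address = IntPtr.Zero;
        MEMORY_BASIC_INFORMATION mbi;
        IntPtr size = (IntPtr)Marshal.SizeOf(typeof(MEMORY_BASIC_INFORMATION));

        while (VirtualQuery(address, out mbi, size) != IntPtr.Zero)
        {
            if (mbi.State == MEM_FREE)
                free += mbi.RegionSize.ToInt64();
            address = (IntPtr)(mbi.BaseAddress.ToInt64() + mbi.RegionSize.ToInt64());
        }
        return free;
    }
}
[/code]

As for the question itself: the working set is the OS's choice of which pages to keep resident, so a machine under different memory pressure (or one that has simply been running longer) can legitimately show a very different number for the same site.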
We have a web application running with around 100 users logged in, and all clients are connected to the server using WebSync. I had a requirement to keep the session always alive, so I regenerate the session when it is about to expire.
But after 3 or 4 days, I found the CPU had reached 100% and the application had locked up, and we had to restart the server to get it working again.
I would like to know if there is a possible way to test my website in a simulated environment, mimicking the event of it being used by around 100 users at the same time. I would just like to know how my website will perform in real situations.
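A crude simulation can be done with a handful of threads (the URL and counts are placeholders; real tools such as WCAT or JMeter measure far more):

[code]
using System;
using System.Net;
using System.Threading;

public class MiniLoadTest
{
    private const string Url = "http://localhost/mysite/"; // placeholder
    private const int Users = 100;
    private const int RequestsPerUser = 10;

    public static void Main()
    {
        // Allow 100 concurrent connections to the same host.
        ServicePointManager.DefaultConnectionLimit = Users;

        Thread[] threads = new Thread[Users];
        for (int i = 0; i < Users; i++)
        {
            threads[i] = new Thread(delegate()
            {
                for (int r = 0; r < RequestsPerUser; r++)
                {
                    using (WebClient client = new WebClient())
                    {
                        client.DownloadString(Url);
                    }
                }
            });
            threads[i].Start();
        }
        foreach (Thread t in threads) t.Join();
        Console.WriteLine("Done.");
    }
}
[/code]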
I have a web application that, when I deploy it on a local computer with Windows XP, needs about 100-140 MB of RAM for aspnet_wp. When I deploy it to the web server, it gets reset by the administrator because the RAM needed exceeds 250 MB. I have noticed that on the local PC there is only one instance of aspnet_wp for all the connected users, while on the web server there are several instances, all over 100 MB.
1) Is there a programmatic way to reduce the memory used by aspnet_wp?
2) Is 140 MB of RAM an extremely large amount for a medium web application?
3) Is it possible to configure the web server to run only one instance of aspnet_wp for all connected users?
I created an application that accesses some external APIs (the Google Earth API). When I run it in VS, everything works. When I run my application on another computer, I get the error:
By default in my web.config I have set <customErrors mode="On">; at runtime I just want to change it to mode="Off" in memory. I do not want to save the changes to web.config.
Basically, we need to see the description of the runtime error when required.
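For what it's worth, the section can at least be read at runtime; the instance ASP.NET hands back is read-only, which is why a pure in-memory switch of the mode is not supported through this API (a sketch):

[code]
using System.Web.Configuration;

public static class ErrorModeInspector
{
    public static bool CustomErrorsAreOn()
    {
        CustomErrorsSection section = (CustomErrorsSection)
            WebConfigurationManager.GetSection("system.web/customErrors");
        // Assigning section.Mode here would throw, because the runtime
        // configuration object is read-only.
        return section != null && section.Mode != CustomErrorsMode.Off;
    }
}
[/code]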
The application pool randomly shuts down. I am assuming this is due to Rapid Fail Protection and that errors are occurring that trigger it. However, there are no errors being reported in the event logs. So my next inclination is to consider the possibility of a memory leak. How do you test for memory leaks on a website running locally?
I am building an ASP.NET application, using IIS 6 on Windows Server 2003 (VPS hosting).
I am confronted with an error I didn't receive on my development machine (Windows 7, IIS 7.5, 64-bit).
When my WCF service tries launching my query against a local SQL Server, this is the error I receive:
Memory gates checking failed because the free memory (43732992 bytes) is less than 5% of total memory. As a result, the service will not be available for incoming requests. To resolve this, either reduce the load on the machine or adjust the value of minFreeMemoryPercentageToActivateService on the serviceHostingEnvironment config element.
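The error message itself points at the fix; a sketch of the config change (the value 1 is just an example, not a recommendation):

[code]
<configuration>
  <system.serviceModel>
    <!-- Lower the default 5% free-memory gate so activation succeeds
         on a memory-constrained server. -->
    <serviceHostingEnvironment minFreeMemoryPercentageToActivateService="1" />
  </system.serviceModel>
</configuration>
[/code]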
I have both VS 2005 and 2008 installed on my machine. 2005 is fine. With 2008, literally any ASP.NET project I try to create gets this error. I try stepping into the code, and the error occurs apparently before anything I can trap is loaded. There is no information written to the event log. I have tried this with a "Hello World" web page with nothing else going on. It seems unique to my Windows Server 2003 machine.
I am getting a weird error in ASP.NET while using the LEADTOOLS imaging API. Here's the stack trace:
System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
   at SetThreadData(_THREADDATA* )
   at Leadtools.Codecs.CodecsOptions.Use()
   at Leadtools.Codecs.RasterCodecs.DoSave(SaveParams saveParams)
   at Leadtools.Codecs.RasterCodecs.Save(RasterImage image, Stream stream, RasterImageFormat format, Int32 bitsPerPixel) ...
I am maintaining C# .NET code written by somebody else, and a few times I get the following exception: "Attempted to read or write protected memory. This is often an indication that other memory is corrupt." The code structure where I get the above exception is something like this. It uses ref variables in the following sequence; Variable-1 and Variable-2 are local variables in App1:
- App1: func1() passes these variables by reference to func2().
- App1: func2() passes the same variable references via .NET Remoting to another application (App2).
- App2: passes the same received references on through two further calls.
- The exception occurs while returning from App2.
(App1:func1(ref Var1, ref Var2) --> App1:func2(ref Var1, ref Var2) --[.NET Remoting from App1 to App2]--> App2:func3(ref Var1, ref Var2) --> App2:func4(ref Var1, ref Var2) --> the variables get updated and the functions return to the original caller.)
My doubts are: is passing reference variables in such a chain correct? Will it cause such an exception? Does .NET support ref variable calls like this directly?
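For reference, .NET Remoting does support ref parameters: the arguments are serialized to the server and the updated values are serialized back on return, so the types involved must be serializable. A minimal sketch of the shape of the chain (names invented):

[code]
using System;

[Serializable]
public class Payload
{
    public int Value;
}

// The remote type lives in App2 and is accessed over a remoting channel.
public class RemoteWorker : MarshalByRefObject
{
    public void Func3(ref Payload a, ref Payload b)
    {
        Func4(ref a, ref b);
    }

    private void Func4(ref Payload a, ref Payload b)
    {
        a.Value++;  // these updates are serialized back to the caller
        b.Value++;  // when the remoting call returns
    }
}
[/code]

The chain itself is legal; an AccessViolationException usually points at unmanaged code (native interop, a corrupted native heap) rather than at the ref passing.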
I am working on a web app that will rely heavily on configuration. The configuration will also be written by another process or by a human. I am looking for responses on best practices in .NET 3.5 for how to implement this case. I had used the configuration section of an early version of the Enterprise Library application blocks. I really liked working with it, but from what I hear it has been discontinued in current versions. Hence the question... I need to be able to serialize collections and to pick up file-write events so I can reload a new instance of the config into memory.
In my web site, which is basically a monitoring application, I have to keep a configuration file which contains some web server names, the names of the web sites hosted on each web server, URLs, port numbers, etc.
Can anyone please explain the benefits of treating this configuration file as a custom configuration file of my web application and reading it using ConfigurationSection or IConfigurationSectionHandler, rather than treating it as a normal XML file and reading it using XmlDocument, XmlTextReader or XLINQ? Treating it as plain XML would also save me from having to register the custom configuration file with an entry in web.config.
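To make the comparison concrete, a typed section is only a little code; a sketch assuming a hypothetical <monitoredServers> section:

[code]
using System.Configuration;

public class ServerElement : ConfigurationElement
{
    [ConfigurationProperty("name", IsRequired = true, IsKey = true)]
    public string Name { get { return (string)this["name"]; } }

    [ConfigurationProperty("url", IsRequired = true)]
    public string Url { get { return (string)this["url"]; } }
}

public class ServerCollection : ConfigurationElementCollection
{
    protected override ConfigurationElement CreateNewElement()
    {
        return new ServerElement();
    }

    protected override object GetElementKey(ConfigurationElement element)
    {
        return ((ServerElement)element).Name;
    }
}

public class MonitoredServersSection : ConfigurationSection
{
    [ConfigurationProperty("", IsDefaultCollection = true)]
    public ServerCollection Servers
    {
        get { return (ServerCollection)this[""]; }
    }
}

// Registration in web.config (section name is hypothetical):
//   <configSections>
//     <section name="monitoredServers"
//              type="MonitoredServersSection, MyAssembly" />
//   </configSections>
[/code]

The main benefits over XmlDocument and friends are typed, validated access and the fact that the runtime caches the parsed section; the main cost is exactly the web.config registration mentioned above.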
I have a memory leak somewhere that I can't find. Every few days my server crashes, and just before that I log a ton of SQL errors stating that it is "out of memory".
I can't find it anywhere; all of my connections are being disposed like so:
[Code].... Then I call the connection from my pages like so:[Code]....
That is all pretty straightforward. The connection is disposed because it is wrapped in a using block, and I am opening the connection in my connection-manager class, not where it is being utilized. Or could the problem be in the method below that I am using to populate a SqlDataReader? [Code]....
Now, at first it appears as though this could be the problem, because the connection isn't part of a using block. However, doesn't CommandBehavior.CloseConnection pretty much do the same thing? It makes sure that the connection is closed when the reader is closed, right? Here is how I call the above reader from my login page code-behind: [Code]....
So the DataReader will get closed even without an explicit .Close() because it is in a using block, and the connection should get closed because I specified it in the ExecuteReader parameters, right?
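For reference, a sketch of the pattern in question (reconstructed, since the original [Code] blocks did not survive): the connection is deliberately not in a using block, and CommandBehavior.CloseConnection ties its lifetime to the reader instead.

[code]
using System.Data;
using System.Data.SqlClient;

public static class Db
{
    public static SqlDataReader ExecuteReader(string connectionString,
                                              string sql)
    {
        SqlConnection connection = new SqlConnection(connectionString);
        try
        {
            connection.Open();
            SqlCommand command = new SqlCommand(sql, connection);
            // Closing/disposing the reader closes the connection with it.
            return command.ExecuteReader(CommandBehavior.CloseConnection);
        }
        catch
        {
            connection.Dispose();  // guard against a leak if Open/Execute throws
            throw;
        }
    }
}

// Caller:
// using (SqlDataReader reader = Db.ExecuteReader(connStr, "SELECT ..."))
// {
//     while (reader.Read()) { /* ... */ }
// } // reader disposed -> connection closed, even on an exception
[/code]

One leak to watch for in the original shape: if Open() or ExecuteReader() throws before the reader exists, nothing disposes the connection, which is what the try/catch above covers.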