C# - How To Exceed The 60% Memory Limit Of IIS7 In Caching Application
Jun 11, 2010
Pardon if this is more serverfault vs. stackoverflow. It seems to be on the border.
We have an application that caches a large amount of product data for an e-commerce application using ASP.NET caching. This is a dictionary object with 65K elements, and our calculations put the object's size at ~10GB. Problem:
The amount of memory the object consumes seems to be far in excess of our 10GB calculation. BIGGEST CONCERN: We can't seem to use over 60% of the 32GB in the server.
What we've tried so far:
In machine.config, under system.web:

    <processModel autoConfig="true" memoryLimit="80" />

In web.config, under system.web/caching:

    <cache privateBytesLimit="20000000000" percentagePhysicalMemoryUsedLimit="90" />

(privateBytesLimit was also tried at 0, the default, of course.)
Environment:
Windows 2008R2 x64
32GB RAM
IIS7
Nothing seems to allow us to exceed the 60% value.
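For what it's worth, the limits ASP.NET actually settled on after reading machine.config/web.config can be read back at runtime (requires .NET 3.5 or later); a minimal sketch:

    using System;
    using System.Web;

    public static class CacheDiagnostics
    {
        // Returns the effective cache limits ASP.NET computed from
        // machine.config / web.config (available since .NET 3.5).
        public static string DescribeCacheLimits()
        {
            long bytesLimit = HttpRuntime.Cache.EffectivePrivateBytesLimit;
            int physicalPct = HttpRuntime.Cache.EffectivePercentagePhysicalMemoryLimit;
            return string.Format(
                "EffectivePrivateBytesLimit: {0:N0} bytes, " +
                "EffectivePercentagePhysicalMemoryLimit: {1}%",
                bytesLimit, physicalPct);
        }
    }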
As far as I know, by default the cache is stored in memory and on disk at the same time (DiskCacheable=true). When a cached response is removed from the in-memory output cache due to memory pressure, it still remains on disk, allowing a much larger set of pages to be cached. In addition, disk-cached pages survive application restarts. And this is already in ASP.NET 2.0. What I don't know is in which order entries are removed from memory and read from disk instead. I would like to achieve that entries with the least traffic, or those least recently used, are removed from memory first. Is there some setting to do that, or does it already work that way by default?
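For reference, disk-backed output caching is declared per page; a one-line sketch, assuming the DiskCacheable attribute mentioned above is available in your ASP.NET version:

    <%@ OutputCache Duration="3600" VaryByParam="none" DiskCacheable="true" %>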
I have set a private memory limit of 200 MB in IIS 7 for an application pool. The private working set (Task Manager) for the application is always below 125 MB, but since setting the limit the number of page faults has increased a lot and the application cache is getting cleared frequently.

I haven't set any limit on virtual memory. Why is the cache being cleared even when the private memory used is below the allocated limit?
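One way to see why entries vanish is to register a removal callback at insert time; CacheItemRemovedReason.Underused indicates trimming under memory pressure rather than expiry. A minimal sketch:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class CacheProbe
    {
        public static void InsertWithLogging(string key, object value)
        {
            HttpRuntime.Cache.Insert(
                key, value, null,
                Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
                CacheItemPriority.Default,
                OnRemoved);
        }

        // Underused = trimmed under memory pressure; Expired and Removed
        // cover the other, non-pressure-related cases.
        private static void OnRemoved(string key, object value,
                                      CacheItemRemovedReason reason)
        {
            System.Diagnostics.Trace.WriteLine(
                "Cache entry '" + key + "' removed: " + reason);
        }
    }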
I am new to the IIS7 server. I have an ASHX generic handler in ASP.NET that processes some mathematical data depending on user input (which is hard to predict). The mathematical model can use a lot of memory and may cause IIS to put all other tasks on hold. I am trying to limit the memory the ASP.NET process can take and to set up a timeout limit for the maximum execution time of this ASHX request. I would be glad if anyone could point me in the right direction to resolve this.
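For the timeout half, a request can cap its own execution time via Server.ScriptTimeout (the memory half is best handled with the application pool's memory limits). A minimal sketch, with the handler name hypothetical:

    using System.Web;

    public class MathHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // Cap this request's execution time at 60 seconds. Note the
            // timeout is only enforced when compilation debug="false".
            context.Server.ScriptTimeout = 60;

            // ... run the mathematical model and write the response ...
            context.Response.ContentType = "text/plain";
            context.Response.Write("done");
        }

        public bool IsReusable { get { return false; } }
    }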
I am currently undertaking load testing of an ASP.NET 4.0 web application hosted on a 64-bit 2008 server (IIS 7.5).
The purpose of the load testing is to determine the maximum memory usage by the web application if every page is cached simultaneously.
To evaluate this I set the output cache duration of the pages to 900 seconds, then request each publicly accessible URL via Xenu Link Sleuth. This effectively requests 20,000 or so pages.
To monitor memory usage I am using both Windows Performance Monitor and Red Gate Memory Profiler 7.0.
I have run the test twice: test 1 with the physical memory limit set to the default of 0, and test 2 with the physical memory limit set to 921,600 KB (900 MB).
Here is what I have observed:

- In both tests the application pool is never recycled.
- In test 1, worker process memory usage grows to 1,300 MB (above the memory limit of test 2).
- In test 2, memory usage grows to 720 MB.
- In test 1, the unused memory allocated to .NET grows to 700 MB; in test 2 it grows to 150 MB.

Does setting the physical memory limit in IIS 7.5 cause the garbage collector to operate more aggressively?
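One way to test that hypothesis is to sample GC collection counts during both runs and compare; a minimal sketch:

    using System;
    using System.Diagnostics;
    using System.Threading;

    public static class GcMonitor
    {
        private static Timer _timer;  // held in a field so it isn't collected

        // Logs Gen 0/1/2 collection counts once a second; a sharp rise in
        // Gen 2 collections in the limited run would point to the GC (or
        // cache trimming) working harder under the 900 MB limit.
        public static void Start()
        {
            _timer = new Timer(_ => Trace.WriteLine(string.Format(
                "gen0={0} gen1={1} gen2={2} heapBytes={3:N0}",
                GC.CollectionCount(0),
                GC.CollectionCount(1),
                GC.CollectionCount(2),
                GC.GetTotalMemory(false))),
                null, TimeSpan.Zero, TimeSpan.FromSeconds(1));
        }
    }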
I would like to use output caching with WCF Data Services and although there's nothing specifically built in to support caching, there is an OnStartProcessingRequest method that allows me to hook in and set the cacheability of the request using normal ASP.NET mechanisms.
But I am worried about the worker process getting recycled due to excessive memory consumption if large responses are cached. Is there a way to specify an upper limit for the ASP.NET output cache so that if this limit is exceeded, items in the cache will be discarded?
I've seen the caching configuration settings, but I get the impression from the documentation that they apply to explicit caching via the Cache object, since there is a separate outputCacheSettings section which has no memory-related attributes.
Here's a code snippet from Scott Hanselman's post that shows how I'm setting the cacheability of the request.
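(The snippet itself didn't survive this copy; below is a reconstruction along the lines of that post, with the service and context names hypothetical.)

    using System;
    using System.Data.Services;
    using System.Web;

    public class MyDataContext { }  // stand-in for your real data context

    public class ProductsService : DataService<MyDataContext>
    {
        protected override void OnStartProcessingRequest(ProcessRequestArgs args)
        {
            base.OnStartProcessingRequest(args);

            // Apply a normal ASP.NET output-cache policy to the response.
            HttpContext context = HttpContext.Current;
            HttpCachePolicy cache = context.Response.Cache;
            cache.SetCacheability(HttpCacheability.ServerAndPrivate);
            cache.SetExpires(context.Timestamp.AddSeconds(60));
            cache.VaryByHeaders["Accept"] = true;
            cache.VaryByHeaders["Accept-Charset"] = true;
            cache.VaryByHeaders["Accept-Encoding"] = true;
            cache.VaryByParams["*"] = true;
        }
    }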
I have a web.config file which is quite large in my current solution running on IIS7.
It works perfectly on my dev server; however, I encounter error 0x80070032, "Config Error: Cannot read configuration file because it exceeds the maximum file size".
The architecture of my CMS application requires a large number of configuration settings.
Is there some way to extend this size limit, or can I split the web.config file into smaller files?
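On the splitting side, most configuration sections accept a configSource attribute that moves the section body into a separate file; a sketch:

    <!-- web.config: point a bulky section at an external file -->
    <connectionStrings configSource="connectionStrings.config" />

    <!-- connectionStrings.config -->
    <connectionStrings>
      <add name="Main" connectionString="..." />
    </connectionStrings>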
I have an ASP.NET master page which references a #include file as follows:
<!--#include virtual="/includes/scripts.inc"-->
I have modified the file /includes/scripts.inc, but the changes do not show up in pages. What needs to be done so modifications are reflected? I need to avoid the following:

- restarting the server
- restarting IIS
- modifying web.config (doesn't appear to have any effect anyway)
- pretty much anything that causes the app domain to restart

Any other options? Is there a setting which affects how long IIS caches #include files?
I have a problem with large responses and IIS7: the server runs out of memory. I've written the test code below, which works pretty much like my real code. When I start to download the file I can see memory usage rise until it hits 100%, and Firefox complains about a lost connection to the server; it looks like IIS7 does not release the cache or something. It works in IIS6, by the way.
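(The test code is missing from this copy. For reference, the usual way to keep a large download from accumulating in worker-process memory is to disable buffering and stream in chunks; a sketch, with the path hypothetical:)

    using System.IO;
    using System.Web;

    public class DownloadHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.BufferOutput = false;   // don't accumulate the response
            context.Response.ContentType = "application/octet-stream";

            using (FileStream fs = File.OpenRead(@"C:\data\bigfile.bin"))
            {
                byte[] buffer = new byte[64 * 1024];
                int read;
                while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                {
                    if (!context.Response.IsClientConnected) break;
                    context.Response.OutputStream.Write(buffer, 0, read);
                    context.Response.Flush();        // push each chunk to the client
                }
            }
        }

        public bool IsReusable { get { return true; } }
    }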
I have a need to run an application in classic mode for backwards compatibility with a specific application, and am trying to understand what kind of impact that will have on the performance of an MVC application that is running on the site. If we put a few static file maps (for .js, .css, .png, etc) above the ASP.NET wildcard map to reduce the amount of processing by the ASP.NET handler, will we be approaching the integrated mode in terms of performance?
What I'm primarily concerned with is any effect this might have on output caching. I understand that integrated mode might (?) allow the output cache to handle non-ASP.NET content, but that isn't really a concern; we're more interested in ensuring that the MVC application has full use of the output cache. Empirically I've found that the two configurations perform on par when things go well, but if a page references resources that are not available, integrated mode tends to fail much more quickly than classic mode (e.g. 500 ms vs 10 seconds), reducing 'hang time' on the page load.
Multiple groups of users interact through the browser. In each group, users work with some data (objects) loaded from the database. During the interaction, users need a quick, synchronized view of the same data, so it must be kept in memory. The data changes during the interaction, but only the final result needs to be saved back to the DB.
My Current Solution
Load the data (objects) into Global.asax and manipulate it there. But for this solution I have a few questions.
How can I make sure the web application has ONLY ONE instance (configured in IIS, in the application pool)? Because the web application can restart by itself, all data in application state would be lost; how can I avoid a restart that isn't managed by me? In the IIS application pool settings I can set the recycle time; is it guaranteed that no application restart will happen outside that schedule? Or is there an event that fires before an application restart, so I can save the current application state and load it back again afterwards?
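For the single-instance part: a static field is unique per worker process, so it behaves as one instance as long as the application pool runs a single worker process (Maximum Worker Processes = 1, i.e. no web garden). A minimal sketch, with the key type assumed:

    using System.Collections.Concurrent;

    // One CLR process per application (single worker process, no web
    // garden) gives exactly one instance of this store.
    public static class SharedObjects
    {
        // Keyed by group id (assumption); values are whatever the groups edit.
        public static readonly ConcurrentDictionary<string, object> Groups =
            new ConcurrentDictionary<string, object>();
    }

Application_End in Global.asax fires on a graceful shutdown or recycle, which is a natural place to persist the store (it will not fire if the process crashes).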
I have an ASP.NET application in C# which uses MS SQL Server 2005. I find that the CPU usage of my server reaches 100% whenever I run the application. I was told that I may have a memory leak in my application. How can I trace whether there are any memory leaks in it? Note: I am not the programmer of this application, but I have the complete source code, as I am doing maintenance and enhancements on it.
I'm trying to determine the memory used by my ASP.NET MVC application. My host imposes a 100 MB application pool memory restriction. However, from my tests, an empty ASP.NET MVC application uses 30 MB of memory, so I have absolutely no hope with an application that actually does something.
I can't find any benchmarks on what a "standard" MVC app should be using (I assume MS has some somewhere?).
My application is not that special. I use StructureMap for DI and EF Code First. The model is equivalent to that of a blog. If I completely remove my data access code then my memory usage drops to around 40 MB (of course, in this state it's pretty useless).
Bit of a weird title, but here's the deal. For a web site I'm working on, we need to generate quasi-3D images on the fly. Basically, it's for an art site, and we need to show a 3D representation of a canvas given a 2D image (jpg). (See here for some context.) The approach we're taking is to leverage the WPF 3D API: create a Viewport3D in code, add a bunch of points to it with the correct dimensions, and then apply the textures from the original jpg appropriately. I tested it the whole time in a sandbox environment, using the built-in Cassini web server in Visual Studio. When I migrated it over to the actual code repository and tested it there, it stopped working. The image that's pumped out is the correct size but completely blank; it's totally black. After hours of banging my head against the wall, I figured out that it's an IIS issue. I created a simple sample app to demonstrate the issue (it doesn't add images, it just paints all sides green). Since most of the code is largely irrelevant to the question, I won't put it here; rather, I'll post a link to it:
[URL]
If you do download it and want to run it, in the code behind of Default.aspx you'll see this:
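(The snippet is missing from this copy; presumably something like the following, with the path hypothetical:)

    // Hypothetical reconstruction of the missing line: where the sample
    // saves its rendered image.
    string outputPath = @"C:\Temp\output.png";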
Feel free to change that path to whatever, and make sure that the correct permissions are in place, as it'll try to save a file there. If you try that sample in Visual Studio with Cassini, it'll work fine, and you should get a new file called "output.png" containing a green 3D cube. If you try it in IIS, you'll get a blank image. A few points: before anyone asks, yes, I gave all the proper folders the correct permissions. I also do the actual 3D generation and image saving on a separate thread with the apartment state set to STA. I know this is a bit of an unusual case of fusing WPF and ASP.NET, but by some chance, is there some setting in IIS that I need to change? Is there some limitation of the WPF API that won't allow it to run under IIS?
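For reference, here's a reduced sketch of the render-on-an-STA-thread pattern the sample uses (scene construction omitted; WPF's RenderTargetBitmap and PngBitmapEncoder do the off-screen capture):

    using System;
    using System.IO;
    using System.Threading;
    using System.Windows;
    using System.Windows.Controls;
    using System.Windows.Media;
    using System.Windows.Media.Imaging;

    public static class SceneRenderer
    {
        public static void RenderToPng(string path, int width, int height)
        {
            var thread = new Thread(() =>
            {
                var viewport = new Viewport3D();
                // ... add camera, lights, geometry, and textures here ...

                // Force layout so the viewport has a size to render at.
                viewport.Measure(new Size(width, height));
                viewport.Arrange(new Rect(0, 0, width, height));

                var bitmap = new RenderTargetBitmap(
                    width, height, 96, 96, PixelFormats.Pbgra32);
                bitmap.Render(viewport);

                var encoder = new PngBitmapEncoder();
                encoder.Frames.Add(BitmapFrame.Create(bitmap));
                using (var fs = File.Create(path))
                    encoder.Save(fs);
            });
            thread.SetApartmentState(ApartmentState.STA);  // WPF requires STA
            thread.Start();
            thread.Join();
        }
    }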
I am trying to find a solution to control the number of logins on an ASP.NET application. I need to install the application on the client's server and set the number of licences, e.g. only 10 users are allowed to access the app.
Every time someone tries to log in, I need to check how many users are logged in, compare that with the total allowed, and then authorize the user to proceed.
I tried with certificates, but I couldn't see where to match the number of logged-in users against the maximum number of allowed users.
Also, I would like to use the IP address as the identifier, so that if I open 3 browser windows, it counts as only one logged-in user.
Basically, this web application will be sold by licence. We need to control logins per computer, not per user, and block logins when the limit is reached.
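A minimal sketch of the seat-counting idea, with the limit value assumed (the check-then-act here is not strictly race-free under simultaneous logins, but it shows the shape):

    using System;
    using System.Collections.Concurrent;

    // Counts one seat per client IP, so several browser windows from the
    // same machine consume a single licence.
    public static class LicenceGate
    {
        private const int MaxSeats = 10;  // assumption: licences sold
        private static readonly ConcurrentDictionary<string, DateTime> Seats =
            new ConcurrentDictionary<string, DateTime>();

        public static bool TryEnter(string clientIp)
        {
            // Refresh an existing seat, or claim a new one under the limit.
            if (Seats.ContainsKey(clientIp) || Seats.Count < MaxSeats)
            {
                Seats[clientIp] = DateTime.UtcNow;
                return true;
            }
            return false;
        }

        public static void Leave(string clientIp)
        {
            DateTime ignored;
            Seats.TryRemove(clientIp, out ignored);
        }
    }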
I have an application written in ASP.NET/C# and hosted on IIS6. There is an 'Image' folder with 3 subfolders. These folders contain images of type jpg, gif, png, etc. The images are very large in size and number. Every page of the application has to open/show some images, and due to their size my application gets slower and slower.
I know there are techniques to cache image folders on the client machine, among other methods. I have no access to IIS, so I need an HttpHandler or HttpModule which provides something like image caching.
I tried this: codeproject.com/KB/aspnet/CachingImagesInASPNET.aspx. The problem is that I have to add an extra '.ashx' extension to every image link. There is no problem with images that load dynamically, but it is trouble for images that need a hand-written link. Also, this works fine only up to around 1,000 images.
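A sketch of the handler idea: serve the image bytes with explicit client-cache headers so browsers stop re-fetching on every page (folder and query-string names assumed; a real handler must also validate the file name against path traversal):

    using System;
    using System.IO;
    using System.Web;

    public class ImageHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string file = context.Request.QueryString["img"];        // e.g. photo1.jpg
            string path = context.Server.MapPath("~/Image/" + file); // folder assumed

            // Ask the browser (and proxies) to keep a copy for a day so
            // the image is fetched once, not on every page view.
            context.Response.Cache.SetCacheability(HttpCacheability.Public);
            context.Response.Cache.SetExpires(DateTime.UtcNow.AddDays(1));
            context.Response.Cache.SetLastModified(File.GetLastWriteTimeUtc(path));

            context.Response.ContentType = "image/jpeg";
            context.Response.WriteFile(path);
        }

        public bool IsReusable { get { return true; } }
    }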
Running an ASP.NET application in its own app pool on Windows Server 08 / IIS 7.
The server keeps hitting 97% memory usage and sticking there, and we are trying to work out if that is our application's fault.
My main question is: does all of the memory used by an application get displayed as the working set of its associated w3wp.exe process? According to IIS7 and the worker process in Task Manager, our application is using less than 350 MB. Is it possible for our application to be using 5 GB of memory while only showing 350 MB for the process?
I'd like to describe a strange issue I've noticed while analyzing my ASP.NET application in production, and ask for advice or opinions on the following matter. The application usually runs with a memory footprint of some 80-90 MB. This seems stable, since no memory leaks have been detected so far: no slight increase in memory usage over time. Yet a problem occurs when the application pool recycles (I'm on shared hosting, and judging by the logs it happens either when the app is idle for 20 minutes or every ~30 hours, something like that). The issue is that used memory almost doubles for some period after the recycle: it goes to some 160-170 MB without any explanation. This is confusing, since the common claim is that recycling should purge memory and all other resources; at least that's how I understand it. The system holds this amount of memory for some 7-8 hours, and then usage drops to its usual level of 90-100 MB, again with no apparent reason (at least none known to me).
My web application's app pool configuration is: PeriodicRestartMemory: 512000, PeriodicRestartPrivateMemory: 196608.
Although the virtual memory limit is higher than the private memory limit, the app pool is recycled with "virtual memory limit exceeded" errors in the event log (instead of private memory).
What is the reason for this? How could it exceed the virtual memory limit before exceeding the private memory limit? It seems that the system's other allocations in virtual memory push it over the limit before the application's private allocations do, but what are those system allocations? Or what is the root cause of this?
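The two limits measure different things, which a quick snapshot makes visible: virtual size counts every reserved address range (loaded DLLs, thread stacks, the GC's reserved segments), while private bytes counts only committed process-private memory. A small sketch:

    using System;
    using System.Diagnostics;

    public static class MemorySnapshot
    {
        public static string Describe()
        {
            Process p = Process.GetCurrentProcess();
            // VirtualMemorySize64 counts every reserved address range;
            // PrivateMemorySize64 counts only committed private bytes,
            // which is why the virtual figure is always far larger.
            return string.Format(
                "virtual={0:N0} private={1:N0} workingSet={2:N0}",
                p.VirtualMemorySize64, p.PrivateMemorySize64, p.WorkingSet64);
        }
    }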
What's the best way to cache web site user data in ASP.NET 4.0? I have a table of user settings that tracks all kinds of user- or session-specific stuff, like the state of UI elements (open/closed), preferences, whether some dialog has been dismissed, and so on. Since these don't change very often (for each user, anyway) but are looked up frequently, it seems sensible to cache them. What's the best way? These are the options I've identified:

- Store them in HttpContext.Current.Session directly (e.g. Session["setting_name"])
- Store them in HttpContext.Current.Cache
- Use a global static dictionary, e.g. static ConcurrentDictionary<string, string>, where the key is a unique userID + setting-name value
- Store a dictionary object for each session in Session or Cache

What's the most sensible way to do this? How does Session differ from Cache from a practical standpoint? Would it ever make sense to store a dictionary as a single session/cache object versus just adding lots of values directly? I would think lookups might be faster, but updates would be slower, since I'd have to re-store the entire dictionary whenever it changed.
What problems or benefits might there be to using a global static cache? It seems this would be the fastest, but I'd have to manage the size. I could just flush it periodically if it hits a certain size, or keep a cross-reference queue and remove the oldest entries first when it reaches a certain size. Does this make any sense, or is it just trying too hard?
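A middle ground between the Cache and per-user-dictionary options: a mutable dictionary per user held in Cache with a sliding expiration, so ASP.NET manages the total size and an update mutates the dictionary in place instead of re-storing it. A sketch, with the loader hypothetical:

    using System;
    using System.Collections.Concurrent;
    using System.Web;
    using System.Web.Caching;

    public static class UserSettingsCache
    {
        // One dictionary per user, held in HttpRuntime.Cache so ASP.NET
        // can evict idle users under memory pressure (20-minute slide).
        public static ConcurrentDictionary<string, string> For(string userId)
        {
            string key = "settings:" + userId;
            var settings =
                HttpRuntime.Cache[key] as ConcurrentDictionary<string, string>;
            if (settings == null)
            {
                settings = LoadFromDatabase(userId);  // hypothetical loader
                HttpRuntime.Cache.Insert(key, settings, null,
                    Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
            }
            return settings;
        }

        private static ConcurrentDictionary<string, string> LoadFromDatabase(
            string userId)
        {
            return new ConcurrentDictionary<string, string>();  // stub
        }
    }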
I have a web application that will be distributed over 2 servers, and the database will be on a server other than those 2, so the application on each server accesses a database on another machine. I am using caching in the application, and when data changes the cache is cleared. The problem now is: how will each server learn of cache changes on the other server?
I have tried SqlCacheDependency before, and I have heard about Memcached, Velocity, and Enterprise cache; which one will be more efficient and optimal for this case? Another solution would be to create a web-service method on each application, so that when the cache is cleared in one application it calls the web-service method on the other application to clear the cache there.
Which technique is more efficient and optimal in performance and security? And is SqlCacheDependency tied to SQL Server only, or can it be applied to other databases like Oracle?
Note: I am using a load balancer to distribute incoming requests between the 2 servers.
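For the SqlCacheDependency route specifically, both servers can insert with a dependency on the same table, so either server's write invalidates both caches through SQL Server with no server-to-server calls. A sketch (the config names are hypothetical: "MyDb" must match a sqlCacheDependency database entry in web.config, and the table must first be enabled with aspnet_regsql):

    using System.Web;
    using System.Web.Caching;

    public static class ProductCache
    {
        // A change to the Products table invalidates this entry on every
        // server that inserted with the same dependency.
        public static void StoreProducts(object products)
        {
            HttpRuntime.Cache.Insert(
                "products", products,
                new SqlCacheDependency("MyDb", "Products"));
        }
    }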
Is there a technical/performance limit on the number of roles an application handles? I'm in the process of designing an application which I foresee having a lot of roles, to achieve the degree of granularity the system needs (e.g., permissions per project, where there could be a lot of projects).
Would you recommend another approach other than using roles for this kind of granularity?
We need to extract data from the Facebook API for 50k companies per day. When we run the application, we get an error like "Application request limit reached". Is there any way around this, to increase the request limit? I am thinking maybe dynamically changing the IP address, pausing the thread for some time, etc.
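Short of getting a higher limit from Facebook, a common workaround is to pace requests and back off when the limit error appears, rather than rotate IPs; a sketch, with the error detection assumed:

    using System;
    using System.Threading;

    public static class ThrottledFetcher
    {
        // Retries with exponential backoff when the API reports
        // "Application request limit reached" (detection logic assumed).
        public static T WithBackoff<T>(Func<T> call, int maxAttempts)
        {
            TimeSpan delay = TimeSpan.FromSeconds(30);
            for (int attempt = 1; ; attempt++)
            {
                try { return call(); }
                catch (Exception ex)
                {
                    if (attempt >= maxAttempts || !IsRateLimit(ex)) throw;
                    Thread.Sleep(delay);
                    delay = TimeSpan.FromTicks(delay.Ticks * 2);  // double the wait
                }
            }
        }

        private static bool IsRateLimit(Exception ex)
        {
            return ex.Message.Contains("request limit reached");  // assumption
        }
    }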
I am hosting a solution with an outside company, so getting them to troubleshoot or send me log files is not working.
Is there a way, from code or the Global.asax file, to trap my web application's memory usage when it reaches a certain amount, and then figure out what is using all the memory?
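A sketch of that idea, with the threshold assumed: a timer started in Application_Start samples the worker process's private bytes and logs when it crosses the line. (Figuring out what is holding the memory still takes a dump tool such as DebugDiag, but this at least captures when it happens.)

    using System;
    using System.Diagnostics;
    using System.Threading;
    using System.Web;

    public class Global : HttpApplication
    {
        private static Timer _watchdog;
        private const long ThresholdBytes = 800L * 1024 * 1024;  // assumption: 800 MB

        protected void Application_Start(object sender, EventArgs e)
        {
            // Sample the process every 30 seconds and log when private
            // bytes exceed the threshold.
            _watchdog = new Timer(_ =>
            {
                long privateBytes =
                    Process.GetCurrentProcess().PrivateMemorySize64;
                if (privateBytes > ThresholdBytes)
                    Trace.WriteLine(string.Format(
                        "Private bytes high: {0:N0}", privateBytes));
            }, null, TimeSpan.Zero, TimeSpan.FromSeconds(30));
        }
    }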