.NET - Force HttpWebRequest To Use Cache In ASP.NET Environment?
May 13, 2010
In my ASP.NET app I use HttpWebRequest for fetching external resources which I'd like to be cached. Consider the following code:
// requires the System.Net, System.Net.Cache and System.IO namespaces
var req = WebRequest.Create("http://google.com/");
req.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.CacheIfAvailable);
var resp = req.GetResponse();
Console.WriteLine(resp.IsFromCache);   // true when the response was served from the cache
var answ = new StreamReader(resp.GetResponseStream()).ReadToEnd();
Console.WriteLine(answ.Length);
HttpWebRequest uses the IE cache, so when I run this as a normal user (in a tiny command-line test app), data is cached to %userprofile%\Local Settings\Temporary Internet Files and subsequent responses are read from the cache.
I thought that when the same code runs inside an ASP.NET app, data would be cached to ...\ASPNET\Local Settings\Temporary Internet Files, but it is not, and the cache is never used.
What am I doing wrong? How can I force HttpWebRequest to use the cache in an ASP.NET environment?
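Not a fix for the IE cache itself, but one workaround is to cache the response body in the ASP.NET cache instead. Below is a minimal sketch under that assumption; the CachedFetcher name and the ten-minute expiration are illustrative, not anything the question prescribes.

using System;
using System.IO;
using System.Net;
using System.Web;
using System.Web.Caching;

public static class CachedFetcher
{
    // Hypothetical helper: serves the body from HttpRuntime.Cache when present,
    // otherwise fetches it and caches it for ten minutes.
    public static string GetCached(string url)
    {
        string key = "resp:" + url;
        var cached = HttpRuntime.Cache[key] as string;
        if (cached != null)
            return cached;

        var req = WebRequest.Create(url);
        using (var resp = req.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
        {
            string body = reader.ReadToEnd();
            HttpRuntime.Cache.Insert(key, body, null,
                DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
            return body;
        }
    }
}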
My Silverlight 4 app is hosted in an ASP.NET MVC 2 web application. I make web requests through the HttpWebRequest class, but it gives back results that were previously cached. How can I disable this caching behavior? There are some links that talk about HttpWebRequest in .NET, but the Silverlight HttpWebRequest is different. One suggestion is to add a unique dummy query string on every web request, but I'd prefer a more elegant solution. I also tried the following, but it didn't work:
_myHttpWebRequest.BeginGetRequestStream(new AsyncCallback(BeginRequest), new Guid());
In fact, it is possible to disable caching by changing the browser's history settings; see the following link: [URL] But asking users to change their browser settings is not an option for me.
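For reference, even though the poster finds it inelegant, the dummy query string approach is roughly the sketch below; the "nocache" parameter name and the example URL are arbitrary assumptions.

// Append a unique value so the browser HTTP stack cannot serve a cached copy.
string url = "http://example.com/api/data";
string separator = url.Contains("?") ? "&" : "?";
var request = (HttpWebRequest)WebRequest.Create(
    new Uri(url + separator + "nocache=" + Guid.NewGuid().ToString("N")));

In Silverlight 3 and later, another commonly cited option is to opt the URI prefix into the client HTTP stack with WebRequest.RegisterPrefix("http://", System.Net.Browser.WebRequestCreator.ClientHttp), which is reported to bypass the browser's cache; this is worth verifying against the Silverlight documentation.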
Assume a simple ASPX data entry page in which an admin user can upload an image along with some other data. These are stored in the database, and the next time the admin visits that page to edit the record, the image data is fetched, a preview is generated and saved to disk (using GDI+), and the preview is shown in an image control.
This procedure works fine the first time. However, if the image changes (a new one is uploaded), the next time the page is visited it shows the previously uploaded image. I debugged the application and everything works correctly: the new image data is in the database and the new preview is stored in the temp location, yet the page shows the previous one. If I refresh the page it shows the new image preview. I should mention that the preview is always saved to disk under the same name (the id of the record).
I think that is because IE and other browsers use the client cache instead of loading images each time a page is visited. I wonder if there is a way to force the client browser to refresh itself so the newly uploaded image is shown without user intervention.
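One common approach, sketched below, is to make the preview URL change whenever the file changes by appending a version token, so the browser treats each new upload as a different resource. imgPreview, recordId and the preview folder are placeholder names, not taken from the question.

// Append the file's last-write time so a new upload produces a new URL and
// the browser cannot reuse its cached copy of the old preview.
string previewPath = "~/previews/" + recordId + ".jpg";
DateTime lastWrite = File.GetLastWriteTimeUtc(Server.MapPath(previewPath));
imgPreview.ImageUrl = previewPath + "?v=" + lastWrite.Ticks;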
We have a WCF layer that wraps the business classes and database access, and a client that sits on top of it. Within our group we are attempting to settle on standards. Some want the client to call the web method passing the page number it is requesting and the page size, hand those to the database, and page in SQL Server using ROW_NUMBER. Others want to cache the full list of objects in the HTTP cache on the service tier and page in memory; the concern there is memory use on the server.
Which would be best for a moderate number of users with a potentially large number of records to manage (say 30K)? Is it better to cache them all in memory and work from there, or to page at the database as the application scales?
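As a rough illustration of the database-side option, a ROW_NUMBER-based page query could look like the sketch below (using System.Data.SqlClient); the table and column names ("Records", "Id", "Name") and the connectionString, pageNumber and pageSize variables are placeholders.

// Fetch a single page of rows in SQL Server rather than caching the full list.
const string sql = @"
    SELECT Id, Name
    FROM (
        SELECT Id, Name,
               ROW_NUMBER() OVER (ORDER BY Id) AS RowNum
        FROM Records
    ) AS paged
    WHERE RowNum BETWEEN (@Page - 1) * @PageSize + 1 AND @Page * @PageSize;";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@Page", pageNumber);
    cmd.Parameters.AddWithValue("@PageSize", pageSize);
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // materialize only this page of records here
    }
}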
Since I don't want my sessions to be removed unless the session has been abandoned, either via code or a session timeout... for eviction I would think "None", and for expireable I would think False. I have tested that calling Session.Abandon does remove the object from the cache. I have also tested whether extending my session also extends the session object in the cache, and this does seem to work the "correct" way.
We have a data driven ASP.NET website which has been written using the standard pattern for data caching (adapted here from MSDN):
public DataTable GetData()
{
    string key = "DataTable";
    object item = Cache[key] as DataTable;
[code]...
The trouble with this is that the call to GetDataFromSQL() is expensive and the use of the site is fairly high. So every five minutes, when the cache drops, the site becomes very 'sticky' while a lot of requests are waiting for the new data to be retrieved.
What we really want to happen is for the old data to remain current while new data is periodically reloaded in the background. (The fact that someone might therefore see data that is six minutes old isn't a big issue - the data isn't that time sensitive). This is something that I can write myself, but it would be useful to know if any alternative caching engines (I know names like Velocity, memcache) support this kind of scenario. Or am I missing some obvious trick with the standard ASP.NET data cache?
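One way to get this "serve stale while refreshing" behaviour with the standard ASP.NET cache is sketched below: the data entry never expires, a small marker entry expires every five minutes, and its absence triggers a background reload while callers keep getting the old DataTable. GetDataFromSQL() is the method from the question; the marker key, the Interlocked flag and the usual System.Threading / System.Web.Caching usings are assumptions of the sketch.

private static int _refreshing;   // 0 = idle, 1 = refresh in progress

public DataTable GetData()
{
    var data = HttpRuntime.Cache["DataTable"] as DataTable;
    bool fresh = HttpRuntime.Cache["DataTable:fresh"] != null;

    if (data == null)
    {
        // first request after an app restart: load synchronously
        data = GetDataFromSQL();
        StoreData(data);
    }
    else if (!fresh && Interlocked.CompareExchange(ref _refreshing, 1, 0) == 0)
    {
        // data is stale: refresh in the background, keep serving the old copy
        ThreadPool.QueueUserWorkItem(_ =>
        {
            try { StoreData(GetDataFromSQL()); }
            finally { Interlocked.Exchange(ref _refreshing, 0); }
        });
    }
    return data;
}

private static void StoreData(DataTable data)
{
    HttpRuntime.Cache.Insert("DataTable", data, null,
        Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration);
    HttpRuntime.Cache.Insert("DataTable:fresh", true, null,
        DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);
}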
We have so many parameters that the cache key is several hundred characters long. Is there a limit to the length of these cache keys? Internally the cache uses a dictionary, so theoretically the lookup time should be constant. However, I wonder whether we could run into performance or memory problems.
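If key length ever becomes awkward, one option (a sketch, not something the cache requires) is to hash the long parameter string and use the hash as the key; collisions are theoretically possible but vanishingly unlikely for this use.

using System;
using System.Security.Cryptography;
using System.Text;

// Shorten a very long parameter string into a compact, stable cache key.
// SHA-1 is used only for key shortening here, not for security.
static string MakeCacheKey(string prefix, string longParameterString)
{
    using (var sha = SHA1.Create())
    {
        byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(longParameterString));
        return prefix + ":" + Convert.ToBase64String(hash);
    }
}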
I am using NHibernate in my MVC project. As far as I know, NHibernate has its own session-level and second-level (object) caches. Now I want to use HttpContext.Current.Cache (System.Web) to cache some data in the project as well. Is there a problem with combining the two, and is this approach right or wrong?
I've got a web application that runs off a state server. It looks like it may soon need to be distributed, with two web servers behind a load balancer.
This works great for session state, but my next challenge is the Cache.
My application leans heavily on the cache. I understand ASP.NET 4.0 will offer more here, but not much has been said about the how-to.
There are two challenges that I face
1). Each web server will have its own copy of the cache, whereas it would be more efficient to move this to a third server, the same way session state is moved to the state server.
2). The real challenge is keeping the cache in sync. If a simple dataset derived from the database changes, my code dumps that cache item and reloads it. That's all well on one web server, but web server number two won't know to drop that particular cache item and reload it. This could cause some unexpected problems in the application.
For scenario number 2 I could attempt to do some smart coding so server number two knows to dump the cache and reload it.
My guess is someone else has already been here before and there's probably a better implementation approach rather than writing extra code.
Does anyone know how I could keep the cache in sync between multiple web servers, or, even better, farm cache management out to another server?
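One option that keeps each server's local cache but synchronizes invalidation through the database is SqlCacheDependency. The sketch below assumes the database and table have been enabled for SQL cache notifications (aspnet_regsql) and that a matching <sqlCacheDependency> entry named "MyDb" exists in web.config; the "Products" table and the loader method are placeholders.

// Each web server caches its own copy, but the entry is invalidated on every
// server when the underlying table changes.
DataSet data = LoadProductsFromDatabase();                 // hypothetical loader
var dependency = new SqlCacheDependency("MyDb", "Products");
HttpRuntime.Cache.Insert("Products", data, dependency);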
I need to enable caching in my ASP.NET application, but I do not want to use the web server's memory for holding cache objects. If I add the page directive for output caching, will the page be stored in the ASP.NET Cache object?
I want to be able to maintain certain objects between application restarts.
To do that, I want to write specific cached items out to disk in Global.asax Application_End() function and re-load them back on Application_Start().
I currently have a cache helper class, which uses the following method to return the cached value:
return HttpContext.Current.Cache[key];
Problem: during Application_End(), HttpContext.Current is null since there is no web request (it's an automated cleanup procedure) - therefore, I cannot access .Cache[] to retrieve any of the items to save to disk.
Question: how can I access the cache items during Application_End()?
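A common workaround, sketched here, is to go through HttpRuntime.Cache, which refers to the same underlying cache but does not require a current request (DictionaryEntry comes from System.Collections).

void Application_End(object sender, EventArgs e)
{
    // HttpRuntime.Cache is available even though HttpContext.Current is null.
    foreach (DictionaryEntry entry in HttpRuntime.Cache)
    {
        // serialize entry.Key / entry.Value to disk here
    }
}

The same substitution works in the helper class: returning HttpRuntime.Cache[key] instead of HttpContext.Current.Cache[key] behaves identically during normal requests.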
I'm building an image gallery which reads files from disk, creates thumbnails on the fly and presents them to the user. This works well, except that the processing takes a bit of time.
I then decided to cache the processed images using the ASP.NET application cache. When an image is processed I add the byte[] stream to the cache. As far as I know this is kept in system memory, and it works perfectly: the page loads much faster.
My question is: if thousands of images get cached in the application cache, will that affect server performance in any way?
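To limit the impact, one mitigation is sketched below: give each thumbnail entry a sliding expiration and a low priority so ASP.NET can evict it first under memory pressure. The key format, the twenty-minute window and the fileName/thumbnailBytes variables are illustrative assumptions.

// Rarely requested thumbnails age out; under memory pressure low-priority
// items are scavenged before the rest of the cache.
HttpRuntime.Cache.Insert(
    "thumb:" + fileName,
    thumbnailBytes,                      // the processed byte[]
    null,
    Cache.NoAbsoluteExpiration,
    TimeSpan.FromMinutes(20),            // sliding expiration
    CacheItemPriority.Low,
    null);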
I know I can cache the data I get from SQL Server using data caching, and I can also output-cache web user controls. What about a web user control that contains data from a SQL database? Does it make sense to cache the data and also cache the control? What is the best solution for combining these two techniques?
I have a server control that I developed which generates navigation based on a third-party CMS API. Currently I am caching this control using the PartialCaching attribute. The CMS uses cache key dependencies to invalidate the cache when a user makes an edit; however, in the case of my server control the cache does not get invalidated, and the updated navigation will not show up until the cache expiration set by the PartialCaching attribute. Here is my two-part question:
What is the proper way to programmatically cache a server control, without using the PartialCaching attribute, and adding a cache key dependency?
Is it possible to continue to use the PartialCaching attribute and add a cache key dependency?
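On the first part, one sketch (not necessarily the canonical way) is to drop the attribute and cache the rendered markup by hand, tied to the CMS-managed cache key. The key names "NavControlHtml" and "cms-nav-version" are placeholders, and the assumed usings are System.IO, System.Web, System.Web.Caching and System.Web.UI.

// Inside the server control: cache the rendered HTML and attach a dependency
// on a cache key that the CMS invalidates when a user makes an edit.
protected override void Render(HtmlTextWriter writer)
{
    const string cacheKey = "NavControlHtml";
    string html = HttpRuntime.Cache[cacheKey] as string;
    if (html == null)
    {
        using (var sw = new StringWriter())
        using (var hw = new HtmlTextWriter(sw))
        {
            base.Render(hw);     // render the navigation normally
            html = sw.ToString();
        }
        HttpRuntime.Cache.Insert(cacheKey, html,
            new CacheDependency(null, new[] { "cms-nav-version" }));
    }
    writer.Write(html);
}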
I'm trying to do something simple. I've already looked at the examples around the web and I'm not sure what I'm doing wrong. It's a unit test exercising some functionality that will later be performed by several different devices. Basically I'm creating a web request to my site, which returns a set of cookies that we need later on. Then I want to create a new web request using the cookies returned from the first response, but when I read that info on the server, the cookies are empty.
var request = (HttpWebRequest)WebRequest.Create("http://localhost/bla");
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "GET";
request.CookieContainer = new CookieContainer();
// originalResponse.Cookies has several cookies needed by "bla"
request.CookieContainer.Add(originalResponse.Cookies);
var response = request.GetResponse();

In another place... (inside "bla"):

HttpContext.Current.Request.Cookies   // this is empty
I am attempting to create a single sign-on experience between an ASP.NET site and a WordPress site using a simple form POST. I have built a simple PHP page that uses the native WordPress functions wp_insert_user and wp_signon to create the user account in the MySQL database and sign them in. In my ASP.NET 'create new user' page code-behind, I'm using the POST method of an HttpWebRequest to send the required information to the PHP page.
It almost works! The new WordPress user is created in the MySQL database, but they are not logged in. How can I get WordPress to log them in? Here is my HttpWebRequest:
'get the values
Dim fn As String = TxtFirstName.Text
Dim ln As String = TxtLastName.Text
[code].....
I'm implementing PayPal PDT (Payment Data Transfer) for payment confirmations in ASP.NET. I'm going to receive a URL post back from PayPal with query string parameters. Then I need to send back the form at the bottom of the page. I'd like to implement that form post as a non-visual HttpWebRequest.
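A rough sketch of that postback is below. Per PayPal's PDT documentation the form posts cmd=_notify-synch together with the transaction id and your identity token, but the endpoint and field names should be verified against the current docs (and the sandbox URL used while testing); identityToken is a placeholder.

// Post the PDT confirmation back to PayPal and read the result,
// which begins with "SUCCESS" or "FAIL".
string tx = Request.QueryString["tx"];
string postData = "cmd=_notify-synch&tx=" + HttpUtility.UrlEncode(tx) +
                  "&at=" + HttpUtility.UrlEncode(identityToken);   // your PDT identity token

var request = (HttpWebRequest)WebRequest.Create("https://www.paypal.com/cgi-bin/webscr");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";

byte[] body = Encoding.ASCII.GetBytes(postData);
request.ContentLength = body.Length;
using (var stream = request.GetRequestStream())
{
    stream.Write(body, 0, body.Length);
}

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string result = reader.ReadToEnd();
}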
I'm developing a website that will connect to a credit card processing gateway web service. For security purposes this web service accepts requests only from IP addresses that were previously registered with them.
Since I'm developing locally, my IP changes almost every day. Is there a way for me to change the IP address of an HttpWebRequest so that I can test the web service calls locally?
This web service is accessed through an https address and the methods must be called via POST.
I am trying to connect to a web service that uses Kerberos authentication to authorize the user, but all I get is a 401 Unauthorized every time I make the request. Below is the code that I am using.
EDIT: I feel I should explain a bit more about what I am attempting to do. I have been tasked with providing a new interface for my company's Google Search Appliance. I am using an ASP.NET page, which does things like choosing a collection depending on where a user is located, and then sends the appropriate search string to the GSA. This was all working well until they decided to turn authentication on, and now I can't get any results (I either get a 401 Unauthorized, or a message stating that 'Data at the root level is invalid'). If I take the search string and provide it directly to the GSA, it authenticates fine and displays the results; I just can't seem to get it through the HttpWebRequest.
EDIT 2: I did a little more looking (ran the request through Fiddler) and it looks like the request is only attempting Negotiate and not Kerberos. I set the credentials to use Kerberos explicitly as below, but it didn't help...
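For reference (the poster's actual code is not shown above), explicitly requesting Kerberos usually looks like the sketch below, with serviceUrl as a placeholder. Note also that Negotiate will itself select Kerberos when the SPNs and delegation settings allow it, so seeing Negotiate in Fiddler is not necessarily the problem.

// Force the Kerberos package instead of letting Negotiate choose.
var credentials = new CredentialCache();
credentials.Add(new Uri(serviceUrl), "Kerberos", CredentialCache.DefaultNetworkCredentials);

var request = (HttpWebRequest)WebRequest.Create(serviceUrl);
request.Credentials = credentials;
request.PreAuthenticate = true;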
I wrote the following code to post data to a website. The code runs, but I am not able to tell whether my data was uploaded. The site has a facility to see what data is being uploaded, but my data does not appear there.
// Above URL is not real, as I do not want to disclose the real URL as of now
Uri targetUrl = new Uri("http://www.x86map.com/post-embed/ewspost");
HttpWebRequest request = null;
StringBuilder sb = new StringBuilder();
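For what it's worth, a hedged sketch of how such a POST is usually completed follows; the form field names and the titleText/bodyText values are placeholders, since the real fields of the target page are not shown.

// Build the form body, send it, and read the server's reply to check whether
// the post was accepted.
sb.Append("title=" + HttpUtility.UrlEncode(titleText));
sb.Append("&body=" + HttpUtility.UrlEncode(bodyText));

request = (HttpWebRequest)WebRequest.Create(targetUrl);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";

byte[] payload = Encoding.UTF8.GetBytes(sb.ToString());
request.ContentLength = payload.Length;
using (var stream = request.GetRequestStream())
{
    stream.Write(payload, 0, payload.Length);
}

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string serverReply = reader.ReadToEnd();
}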
I am trying to automate a few things in a WordPress blog using HttpWebRequest. I fetch the login page "http://mywebsite.net/wp-admin" and then try to POST the login data to "http://www.mywebsite.net/wp-login.php":
data = "log=admin&pwd=mypassword&wp-submit=Log+In&redirect_to=http%3A%2F%2Fmywebsite.net%2Fwp-admin%2F&testcookie=1"
I am not able to log in. What I have discovered is that when using a browser the cookies are sent to the server, but using HttpWebRequest the cookies are not sent on the POST. I have configured a CookieContainer for the HttpWebRequest and it works fine otherwise. Also, on the POST the request host changes to "www.mywebsite.net", while on the GET request it is "mywebsite.net".
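A sketch of one way to line this up: reuse a single CookieContainer across the GET and the POST and keep the host consistent, since "mywebsite.net" and "www.mywebsite.net" are treated as different hosts for cookie purposes. The URLs and credentials below are the poster's own placeholders.

// Share one container so cookies set by the GET are sent with the POST.
var cookies = new CookieContainer();

// 1) GET the login page so any test cookie is captured
var getRequest = (HttpWebRequest)WebRequest.Create("http://www.mywebsite.net/wp-login.php");
getRequest.CookieContainer = cookies;
using (getRequest.GetResponse()) { }

// 2) POST the credentials with the same container and the same host
var postRequest = (HttpWebRequest)WebRequest.Create("http://www.mywebsite.net/wp-login.php");
postRequest.CookieContainer = cookies;
postRequest.Method = "POST";
postRequest.ContentType = "application/x-www-form-urlencoded";

string data = "log=admin&pwd=mypassword&wp-submit=Log+In" +
              "&redirect_to=" + Uri.EscapeDataString("http://mywebsite.net/wp-admin/") +
              "&testcookie=1";
byte[] bytes = Encoding.UTF8.GetBytes(data);
postRequest.ContentLength = bytes.Length;
using (var stream = postRequest.GetRequestStream())
{
    stream.Write(bytes, 0, bytes.Length);
}

using (var response = (HttpWebResponse)postRequest.GetResponse())
{
    // after this, "cookies" should hold the WordPress auth cookies
}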