GZIP And Cache Files Loaded With LastModified Data?
Jan 29, 2011
Project: ASP.NET 2.0, Visual Studio, VB.NET
Host: Windows, shared environment
I ran Firebug (in Firefox) with Google's Page Speed test, and two major issues came back, so I sent two questions to my host.
1) MY QUESTION TO THE HOST: CSS, JS and image files served from my host go through the request process without Last-Modified data, so the browser cache does not know when to refresh the files once they expire. Why?
MY HOST'S RESPONSE: LastModified is a script that can be implemented in your Java coding (ME: I think he's confused here, I don't use Java) which you need to add yourself. It is not a server function that we can modify on our side; it is something you will have to change in your website's code.
2) MY QUESTION TO THE HOST: Firebug's results said that GZIP is not activated for my CSS, JS and ASP.NET pages. Why not?
MY HOST'S RESPONSE: Currently you are using our Windows hosting environment, which does not support gzip. Gzip is available on our Linux hosting plans, as gzip is mainly for the Linux OS.
After falling off my chair, I have to ask: what can I do in code in my ASP.NET 2.0 / VB.NET project? I thought these two issues were purely server related and had nothing to do with the project's code. What do I do, or is my host wrong?
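The Last-Modified side of this can in fact be handled from code for content that goes through the ASP.NET pipeline (gzip from code comes up in the later posts below). As a point of reference, here is a minimal sketch in C# (the handler name and content type are illustrative, and a VB.NET version would use the same calls) of a handler that emits Last-Modified and Expires headers for a stylesheet; static files served directly by IIS never reach code like this, which is where the host's configuration does still matter.

using System;
using System.IO;
using System.Web;

public class StaticContentHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Map the requested URL to the file on disk.
        string path = context.Server.MapPath(context.Request.FilePath);
        DateTime lastWrite = File.GetLastWriteTime(path);

        // Tell the browser when the file last changed and how long it may cache it.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetLastModified(lastWrite);
        context.Response.Cache.SetExpires(DateTime.Now.AddDays(7));

        context.Response.ContentType = "text/css";   // placeholder; pick per file type
        context.Response.WriteFile(path);
    }

    public bool IsReusable { get { return true; } }
}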
I have heard that if we use gzip, .aspx pages will load faster, but I am not sure how to use it in my web applications. I am hosting my site with GoDaddy (ASP.NET 2.0, IIS 7). Can anyone tell me whether I will be able to use gzip, and give me a sample file where gzip is used?
We're looking to compress our gargantuan JavaScript files with GZip to speed up our site's page loads. I know this can be done through IIS, but I can't seem to find a simple step-by-step guide on how to implement it. We're running IIS 7.5 on Windows Server 2008 R2.
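For reference, on IIS 7/7.5 compression is normally switched on per site from web.config rather than in code; a minimal sketch, assuming the Static and Dynamic Content Compression features are installed on the server (shared hosts may lock this section down):

<configuration>
  <system.webServer>
    <!-- Compress static files (js, css) and dynamic responses (aspx). -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>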
I'm looking for a way to confirm where external files / images are being loaded from. By this I mean that I'm aware external JS files / images are cached on the first load of a page. What I'd like is a tool that confirms that, on subsequent requests, these files are in fact being loaded from the user's cache rather than downloaded again.
I have a class that creates a StreamReader over an XML file on the local filesystem. It's possible that this same file is requested multiple times per second.
I was wondering whether I need to manually add this file to the System.Web.Cache and read it from there, or whether Windows itself is clever enough to cache the file so that when ASP.NET requests it the second or third time it doesn't have to do a disk seek/read and pulls it from its own cache.
This article: http://dotnetperls.com/file-read-benchmarks seems to back this up, but this article: [URL] (although not discussing it from a performance perspective, and maybe for other reasons entirely) lists how to add a physical file to the cache.
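For what it's worth, Windows' file system cache will generally keep a frequently-read file's bytes in memory, but that does not save you the XML parsing on every request. If that matters, here is a minimal sketch of the System.Web.Cache approach with a file dependency (the key prefix, class name, and path handling are illustrative):

using System.Web;
using System.Web.Caching;
using System.Xml;

public static class XmlCache
{
    // Returns the parsed document from the ASP.NET cache, reloading it
    // automatically whenever the underlying file changes.
    public static XmlDocument GetDocument(string virtualPath)
    {
        string key = "xml:" + virtualPath;
        XmlDocument doc = HttpRuntime.Cache[key] as XmlDocument;
        if (doc == null)
        {
            string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
            doc = new XmlDocument();
            doc.Load(physicalPath);

            // The CacheDependency evicts the entry when the file is modified.
            HttpRuntime.Cache.Insert(key, doc, new CacheDependency(physicalPath));
        }
        return doc;
    }
}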
We have a data-driven ASP.NET website which has been written using the standard pattern for data caching (adapted here from MSDN):
public DataTable GetData()
{
    string key = "DataTable";
    DataTable item = Cache[key] as DataTable;
[code]...
The trouble with this is that the call to GetDataFromSQL() is expensive and the use of the site is fairly high. So every five minutes, when the cache drops, the site becomes very 'sticky' while a lot of requests are waiting for the new data to be retrieved.
What we really want to happen is for the old data to remain current while new data is periodically reloaded in the background. (The fact that someone might therefore see data that is six minutes old isn't a big issue - the data isn't that time sensitive). This is something that I can write myself, but it would be useful to know if any alternative caching engines (I know names like Velocity, memcache) support this kind of scenario. Or am I missing some obvious trick with the standard ASP.NET data cache?
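One way to approximate this with the built-in ASP.NET cache alone is to keep the data in a never-expiring entry and use a short-lived sentinel entry whose removal callback reloads the data in the background; a rough sketch, reusing the GetDataFromSQL() name from above (the sentinel key and the five-minute interval are illustrative):

using System;
using System.Data;
using System.Threading;
using System.Web;
using System.Web.Caching;

public static class DataCache
{
    private const string DataKey = "DataTable";
    private const string RefreshKey = "DataTable.refresh";
    private static readonly object refreshLock = new object();

    public static DataTable GetData()
    {
        DataTable data = HttpRuntime.Cache[DataKey] as DataTable;
        if (data == null)
        {
            // First request (or after an app restart): load synchronously once.
            data = GetDataFromSQL();
            HttpRuntime.Cache.Insert(DataKey, data);   // no expiration on the real data
            ScheduleRefresh();
        }
        return data;
    }

    private static void ScheduleRefresh()
    {
        // Short-lived marker entry; when it expires, the callback reloads the
        // data in the background while the old DataTable keeps being served.
        HttpRuntime.Cache.Insert(RefreshKey, new object(), null,
            DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnRefreshDue);
    }

    private static void OnRefreshDue(string key, object value, CacheItemRemovedReason reason)
    {
        if (reason != CacheItemRemovedReason.Expired)
            return;   // e.g. app domain shutdown: don't reschedule

        ThreadPool.QueueUserWorkItem(delegate
        {
            lock (refreshLock)
            {
                HttpRuntime.Cache.Insert(DataKey, GetDataFromSQL());
                ScheduleRefresh();
            }
        });
    }

    private static DataTable GetDataFromSQL()
    {
        // The existing expensive database call goes here.
        throw new NotImplementedException();
    }
}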
I have used NHibernate in my MVC project; as far as I know, NHibernate has caching at the session and object level. Now I want to use HttpContext.Current.Cache (System.Web) to cache some data in the project. I'm not sure whether my code has a problem, and whether this approach is right or wrong.
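Mixing the two is fine in principle: NHibernate's session cache (and optional second-level cache) handles entities, while HttpRuntime.Cache can hold whatever application data you like. A small sketch, assuming a hypothetical Product entity and an open ISession; note that entities kept in the ASP.NET cache are detached from the session:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;
using NHibernate;

public class Product
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

public static class ProductCache
{
    public static IList<Product> GetAll(ISession session)
    {
        const string key = "products.all";
        IList<Product> products = HttpRuntime.Cache[key] as IList<Product>;
        if (products == null)
        {
            products = session.CreateCriteria(typeof(Product)).List<Product>();

            // Cached copies are detached entities; re-attach or re-query
            // before saving changes to them.
            HttpRuntime.Cache.Insert(key, products, null,
                DateTime.Now.AddMinutes(10), Cache.NoSlidingExpiration);
        }
        return products;
    }
}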
My requirement is to get the file size on the client side. There is no problem in Firefox, but in IE you can't do that unless you are using an ActiveX object. So we thought of putting the file in the browser cache and reading the file size from there, and when we post it to the server we will take it from the cache and send it.
I have a website that I built some time ago. Now they have requested some new features and I made changes to some JavaScript files, but when I publish, clients that use IE have problems with caching, so their browsers keep the old version of the JavaScript. How can I clear the client cache so that when they visit the website they use the latest JavaScript files I modified?
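The usual trick is not to try to clear the client's cache (you can't do that reliably) but to change the URL whenever the file changes, for example with a version query string; a small sketch of a helper that appends the file's last write time (the class name and paths are made up):

using System.IO;
using System.Web;

public static class StaticUrl
{
    // Append the file's last write time so browsers treat each published
    // version as a new URL and fetch it fresh.
    public static string Versioned(string virtualPath)
    {
        string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
        long ticks = File.GetLastWriteTimeUtc(physicalPath).Ticks;
        return VirtualPathUtility.ToAbsolute(virtualPath) + "?v=" + ticks;
    }
}

// In the page:
//   <script type="text/javascript" src='<%= StaticUrl.Versioned("~/scripts/site.js") %>'></script>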
I am not good at IIS management. I enabled gzip compression for my web site, but in the IIS Temporary Compressed Files folder I don't see any .aspx-type files; I just see .js, text, .css and some .html files. Is that normal? Why don't I see .aspx pages compressed? That's my metabase.xml, and I think my settings are correct.
I implemented this library in my application: http://www.dominicpettifer.co.uk/Blog/17/gzip-compress-your-websites-html-css-script-in-code. It works very well when I run the site in Visual Studio, but when I compile my site and publish it to IIS it only gzips ASPX files, not CSS or JS files. Does anyone know a better way to implement JavaScript and CSS gzip in C# 2005? (Changing IIS is not an option; it has to be done in code.)
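The usual catch is that on IIS 6 (or IIS 7 in classic mode) plain .css/.js requests never reach managed code; only extensions mapped to ASP.NET, such as .aspx and .ashx, do, so a code-only compressor never sees them. A rough sketch of serving and gzipping them through a handler instead (the handler and query-string names are made up, and real code must validate the requested path to avoid serving arbitrary files):

using System;
using System.IO.Compression;
using System.Web;

public class CompressedStaticHandler : IHttpHandler
{
    // Requested as e.g. compress.ashx?path=~/scripts/site.js (placeholder names).
    public void ProcessRequest(HttpContext context)
    {
        string virtualPath = context.Request.QueryString["path"];
        string physicalPath = context.Server.MapPath(virtualPath);

        context.Response.ContentType =
            physicalPath.EndsWith(".js", StringComparison.OrdinalIgnoreCase)
                ? "application/x-javascript"
                : "text/css";

        // Only compress when the browser advertises gzip support.
        string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? "";
        if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
            context.Response.AppendHeader("Content-Encoding", "gzip");
        }

        context.Response.WriteFile(physicalPath);
    }

    public bool IsReusable { get { return true; } }
}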
I am running a website on IIS 6 and I wrote a simple generic handler which returns smaller images when it receives an image URL as a query string. My problem is that the server is applying gzip to some file types such as .aspx and .ashx, and that makes the response images from the handler appear at lower quality because they are compressed.
How can I disable gzip for just this handler file? I'm hoping for a solution that doesn't involve editing IIS.
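If the gzip is being applied by IIS itself, it cannot really be turned off per request from code. If, however, it is coming from a managed compression module like the ones discussed above, the module can simply skip the handler's path; a sketch, with "GetImage.ashx" standing in for the real handler name:

using System;
using System.IO.Compression;
using System.Web;

public class SelectiveCompressionModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;

            // Opt the image handler out of compression entirely.
            if (context.Request.Path.EndsWith("/GetImage.ashx", StringComparison.OrdinalIgnoreCase))
                return;

            string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? "";
            if (acceptEncoding.IndexOf("gzip", StringComparison.OrdinalIgnoreCase) >= 0)
            {
                context.Response.Filter = new GZipStream(context.Response.Filter, CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "gzip");
            }
        };
    }

    public void Dispose() { }
}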
I'm trying to create an HTTP gzip/deflate module, but when I try to get the Accept-Encoding header it returns null. I tried IE8, FF, Chrome and Opera.
We have a WCF layer that wraps the business classes and database access, and a client that lives on the data layer. Amongst our group we are attempting to form standards. Some want to have the client call the web method, passing the page they are requesting and the page size, pass that to the database, and then page in SQL Server using ROW_NUMBER. Others want to cache the full list of objects in the HTTP cache on the service tier and page in memory; the concern there is memory use on the server.
Which would be best for a medium number of users with a potentially large number of records to manage (say 30K)? Is it better to cache them all in memory and work from there, or to page at the database as the application scales?
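For comparison, database-side paging keeps only one page of rows in memory per request; a minimal sketch using ROW_NUMBER() (SQL Server 2005+), where the connection string, table, and column names are placeholders:

using System.Data;
using System.Data.SqlClient;

public static class RecordRepository
{
    public static DataTable GetPage(string connectionString, int pageIndex, int pageSize)
    {
        // Number the rows once, then return only the requested slice.
        const string sql = @"
            SELECT *
            FROM (
                SELECT ROW_NUMBER() OVER (ORDER BY RecordId) AS RowNum, *
                FROM dbo.Records
            ) AS numbered
            WHERE RowNum BETWEEN @first AND @last";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@first", pageIndex * pageSize + 1);
            command.Parameters.AddWithValue("@last", (pageIndex + 1) * pageSize);

            DataTable page = new DataTable();
            new SqlDataAdapter(command).Fill(page);
            return page;
        }
    }
}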
Since I don't want my sessions to be removed unless the session has been abandoned, either via code or by session timeout... For eviction I would think "None", and for expirable I would think False. I have tested it, and calling Session.Abandon does remove the object from the cache. I have also tested whether, by extending my session, the session object in the cache is also extended; this does seem to work the "correct" way.
We have so many parameters that the cache key is several hundred characters long. Is there a limit to the length of these cache keys? Internally the cache uses a dictionary, so theoretically the lookup time should be constant; however, I wonder if we have the potential to run into some performance/memory problems.
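As far as I know there is no documented hard limit on cache key length, but every lookup still has to hash and compare the full string, and each entry keeps its key string alive in memory. If the keys get unwieldy, one option is to hash the parameter list down to a fixed length; a small sketch (the key prefix is arbitrary):

using System;
using System.Security.Cryptography;
using System.Text;

public static class CacheKeys
{
    // Collapses an arbitrarily long parameter string into a short, fixed-length
    // cache key. Different inputs could theoretically collide, but with SHA-256
    // that risk is negligible in practice.
    public static string ForParameters(string rawKey)
    {
        using (SHA256 sha = SHA256.Create())
        {
            byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(rawKey));
            return "q:" + Convert.ToBase64String(hash);
        }
    }
}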
I've got a web application that runs off a state server. It looks like it may soon need to be distributed, with two web servers behind a load balancer.
That works great for session state, but my next challenge is the Cache.
My application leans heavily on the cache. I understand ASP.NET 4.0 will be offering more here, but not much has been said about the how-to.
There are two challenges that I face:
1) Each web server will have its own copy of the cache, whereas it would be more efficient to put it on a third server, the same way session state is put on the state server.
2) The real challenge is keeping the cache in sync. If a simple dataset derived from the database is changed, my code dumps that cache item and reloads it. That's all well and good on one web server, but web server number two won't know to drop that particular cache item and reload it, which could cause some unexpected problems in the application.
For scenario number 2 I could attempt to do some smart coding so server number two knows to dump the cache and reload it.
My guess is someone else has already been here before and there's probably a better implementation approach rather than writing extra code.
Does anyone know how I could achieve the goal of keeping the cache in sync between multiple web servers, or, even better, farm cache management out to another server?
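Short of moving to a distributed cache (Velocity/AppFabric, memcached and the like), one built-in option in ASP.NET 2.0 is SqlCacheDependency: each server keeps its own copy, but every copy is invalidated by the database change itself, so no server-to-server notification is needed. A sketch assuming polling-based notifications have been enabled with aspnet_regsql and a matching <sqlCacheDependency> entry named "MainDb" exists in web.config; those names and GetProductsFromSQL() are placeholders:

using System.Data;
using System.Web;
using System.Web.Caching;

public static class FarmCache
{
    public static DataTable GetProducts()
    {
        const string key = "Products";
        DataTable data = HttpRuntime.Cache[key] as DataTable;
        if (data == null)
        {
            data = GetProductsFromSQL();

            // Each web server inserts its own copy, but all copies are dropped
            // when ASP.NET's polling detects a change to the Products table,
            // so no cross-server messaging is required.
            SqlCacheDependency dependency = new SqlCacheDependency("MainDb", "Products");
            HttpRuntime.Cache.Insert(key, data, dependency);
        }
        return data;
    }

    private static DataTable GetProductsFromSQL()
    {
        // Existing data access code goes here.
        throw new System.NotImplementedException();
    }
}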