HTTP Response Differences Between Browser's View Source And Netcat's Output?
Jun 10, 2010
I'm looking at a website in Internet Explorer and Firefox. In each browser I select View Source and see the website's URL in the links. These links were built using HttpContext.Current.Request.Url.Host in the code-behind. However, when I use netcat or Burp Suite v1.3.03 and look at the same links, I see the server name instead of the website's URL.
My question is - Why does view source in the browser display different links in the page source than what netcat or Burp Suite outputs? Is the browser rewriting stuff?
My thought is to correct this with a web.config setting that is used to create the links.
Next question - does anyone know of either an IIS configuration change that would make it return the URL instead of the server name, or a .NET function I should be calling instead to get the URL that the website is running as?
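Something along these lines is what I have in mind - a minimal sketch, assuming an appSettings entry (the key name "SiteBaseUrl" is just made up for illustration):

    <!-- web.config: hypothetical appSettings entry holding the public site URL -->
    <appSettings>
        <add key="SiteBaseUrl" value="https://www.example.com" />
    </appSettings>

    // Code-behind: build the links from the configured value instead of Request.Url.Host
    string baseUrl = System.Configuration.ConfigurationManager.AppSettings["SiteBaseUrl"];
    string reportLink = baseUrl + "/Reports/Default.aspx";   // example link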
I have two very similar pieces of ASP.NET code that send a file in an HTTP response to the client. They should cause the browser to prompt to save the file. The first one works, the second one doesn't. The HTTP responses as seen in Fiddler are below.
Working:
HTTP/1.1 200 OK
Cache-Control: private
Content-Length: 228108
Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 4.0.30319
content-disposition: attachment; filename=Report.xlsx
Date: Wed, 05 Jan 2011 12:17:48 GMT
<binary data>

Not working:
HTTP/1.1 200 OK
Server: ASP.NET Development Server/10.0.0.0
Date: Wed, 05 Jan 2011 12:19:21 GMT
X-AspNet-Version: 4.0.30319
Content-Length: 228080
content-disposition: attachment; filename=report 2.xlsx
Cache-Control: private
Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
Connection: Close
<binary data>
When the first response is seen in Fiddler, the browser correctly prompts to save the file. When the second one is seen in Fiddler, nothing observable happens in the browser. The behaviour is the same in both Chrome and Firefox.
EDIT: ASP.NET code that produces the second response
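(The original snippet didn't survive the copy into this post; the sketch below is only the general shape of the send-file pattern in use, with a hypothetical GetReportBytes() helper, not the exact code.)

    // Sketch only - general shape of the send-file code, not the exact original.
    byte[] fileBytes = GetReportBytes();   // hypothetical helper returning the .xlsx contents
    Response.Clear();
    Response.ContentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
    Response.AddHeader("Content-Disposition", "attachment; filename=report 2.xlsx");
    Response.AddHeader("Content-Length", fileBytes.Length.ToString());
    Response.BinaryWrite(fileBytes);
    Response.End();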
We're facing a weird and seemingly random problem where the browser renders the complete raw HTTP response (to a GET request), including all headers and the compressed content, as text instead of just rendering the content. This happens for whole page loads, postbacks, and page loads inside an iframe; definitely in Firefox 3.6.*, not sure about IE right now.
Our service is an ASP.NET 2.0 web app running on IIS 7.5. On our test machines we regularly have Fiddler running in the background (I'm wondering if this might be part of the problem).
This behaviour occurs very rarely, but we have started seeing it during our tests lately.
Has anybody encountered this problem before and knows what causes it and maybe even knows what to do about it?
I developed an ASP.NET MVC application. In one of my forms I'm getting the following exception:
A public action method 'UpdateBasket' could not be found on controller 'App.Controllers.WebShopController'
The funny thing is that the form submit works (even in debug mode), finds the action method UpdateBasket, and returns the view. The problem is that, because of the generated HTML error, W3C validators and Google crawlers can't access the site (I think so, correct me if I'm wrong).
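For context, the controller looks roughly like this (a simplified sketch from memory; the parameter names are placeholders). In this sketch the action only accepts POST, which would explain why a plain GET or HEAD from a validator or crawler can't find it:

    // Simplified sketch - parameter names are placeholders
    using System.Web.Mvc;

    namespace App.Controllers
    {
        public class WebShopController : Controller
        {
            // POST-only: a GET or HEAD request for this URL (what a validator or crawler
            // sends) is answered with "A public action method 'UpdateBasket' could not be found".
            [HttpPost]
            public ActionResult UpdateBasket(int productId, int quantity)
            {
                // ... update the basket here ...
                return View();
            }
        }
    }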
I used to be able to view the pages of my ASP.NET 3.5 website locally via the 'View in Browser' facility. However, this no longer works (for any page); all I get is an 'HTTP 404 Not Found' error message.
First off, suggest better ways if you want rather than patching up this code. I am just starting this project and it is the first time I have tried to code a web site, so anything you suggest is very much welcomed. I have spent the last two and a half days trying to find workable answers, but none I have found and tried seem to fit. If needed I can email screenshots or code. I am trying to construct a website that will have the same main theme throughout as far as the header, navbars, sitemap, and footer go. Naturally the content will vary from page to page. I want to use a single master page and CSS stylesheet for this main theme, and I will change the content format as needed per page because each page may vary somewhat.
However, about 15 of the pages will all use the same format for their content, and this format will differ from the rest of the site while the main theme stays the same. So I am trying to create a nested master page to format just the content area for all these pages while retaining the main theme for the header, etc. I believe I should use a separate CSS file with the nested master page to handle the formatting of the content areas for these 15 pages. I have code that looks like it works in web developer's design view but does not work when viewed through a browser (IE, Chrome, Firefox). So far I have the following done in web developer.
If it helps, the files are: a master page called "Parent.master" for the theme used throughout the site; a nested master page called "Lab.master" for the 15 like pages; a primary stylesheet for the main theme called "StylesheetNew.css"; a secondary stylesheet to style the 15 like-formatted pages, called "Labmaster.css"; and Labs.aspx as the first of the 15 like pages. Sorry, try as I might I could not capture screenshots and put them into this post. The problem is that web developer shows the whole area (many lines high) with a gray background and the XXX text that I want to allocate for content in the 15 pages, yet all the browsers show is one line of text on a white background followed by the footer. It looks like the lab stylesheet works in Web Developer but not in a browser.
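Roughly what the nested master page looks like so far - a trimmed sketch, and the ContentPlaceHolder IDs ("head", "MainContent") are just what I'm assuming Parent.master exposes:

    <%-- Lab.master: nested under Parent.master; pulls in the secondary stylesheet --%>
    <%@ Master Language="C#" MasterPageFile="~/Parent.master" %>

    <asp:Content ID="LabHead" ContentPlaceHolderID="head" runat="server">
        <link href="~/Labmaster.css" rel="stylesheet" type="text/css" runat="server" />
    </asp:Content>

    <asp:Content ID="LabBody" ContentPlaceHolderID="MainContent" runat="server">
        <div class="labContent">
            <asp:ContentPlaceHolder ID="LabContent" runat="server" />
        </div>
    </asp:Content>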
Look at this sample link: (weather.gov) [URL]. If you view the page source in a browser, you can see that it shows the data in XML format (using XSL?). I need to implement a simple web page like that. I think that web site uses XML and XSL. I'm going to implement a web application in ASP.NET which will use data stored in a SQL database (or an XML database or a web service) and show this information like other normal web sites, but in XML format with a nice UI (using XSL?). The weather website is only a sample to show what I want to do (I will not use any data from that site; my application is different). My requirement is that the page source is visible only as XML. I'm now clear that XSL is the solution, but I'm considering how to use this method in ASP.NET (XML/XSL in dynamic ASP.NET pages). My question: it is important for me to make the web page output XML (visible as XML in view source) yet look user friendly to users. How can I do this in ASP.NET?
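From what I can tell, the usual way to get this effect is to serve raw XML with an xml-stylesheet processing instruction pointing at the XSL file, so view source shows pure XML while the browser renders it through the stylesheet. A minimal sketch as a generic handler (the handler name, XSL file name, and data are all made up):

    // WeatherData.ashx code-behind - minimal sketch; names and data are placeholders
    using System.Web;

    public class WeatherHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            context.Response.ContentType = "text/xml";
            context.Response.Write("<?xml version=\"1.0\" encoding=\"utf-8\"?>");
            // the processing instruction is what makes the browser apply the XSL
            context.Response.Write("<?xml-stylesheet type=\"text/xsl\" href=\"Weather.xsl\"?>");
            context.Response.Write("<forecast><day name=\"Monday\" high=\"21\" low=\"12\" /></forecast>");
        }

        public bool IsReusable { get { return false; } }
    }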
How do I make my source code display on one line instead of wrapping onto multiple lines in source view? The display drives me batty when I'm trying to find something, and I would prefer it to display across the page instead of down the page on multiple lines.
Occasionally our office printer craps out on us in the middle of a print job, or someone just forgets to print because they get interrupted. In the good 'ole days, I built up my response using a StringBuilder and output the contents to the screen and to a log file in case we ever needed to go back and re-print.
Now I'm working with a system that makes use of all the .Net yumminess (Repeaters, page events, etc) rather than building up the HTML in code. Is there a way for me to log/archive the entire HTML response generated by the server for a particular page (e.g. hook into the Page_Render event and dump the output to a file)?
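Something like a base page that captures its own rendered HTML might do it - a minimal sketch, assuming a log file under App_Data (the path and class name are made up):

    // Sketch: render the page into a string, log it, then send it on to the client
    using System.IO;
    using System.Web.UI;

    public class LoggingPage : Page
    {
        protected override void Render(HtmlTextWriter writer)
        {
            using (var buffer = new StringWriter())
            {
                base.Render(new HtmlTextWriter(buffer));   // capture the rendered HTML
                string html = buffer.ToString();
                File.AppendAllText(Server.MapPath("~/App_Data/pagelog.html"), html);
                writer.Write(html);                        // still send it to the browser
            }
        }
    }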
UPD: it seems the cause of the problem is that the HTTP handler response isn't being cached on the server. The following code works well for a web form, but not for the handler:
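(The snippet referred to above is missing from the post; purely as a hypothetical reconstruction, it would be the usual cache-policy calls, something like the following.)

    // Hypothetical reconstruction - typical cache-policy calls that behave as expected
    // in a web form but don't seem to produce server-side caching inside an IHttpHandler
    // (there it would be context.Response.Cache instead of Response.Cache).
    Response.Cache.SetCacheability(HttpCacheability.Server);
    Response.Cache.SetExpires(DateTime.Now.AddMinutes(5));
    Response.Cache.SetValidUntilExpires(true);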
I was watching some videos I downloaded earlier about master page events, and I was reading how we can call the control events on a master page from a content page. Now I want the output as Response.Write(RadioButton1.SelectedValue) for whatever value I select.
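What I have in mind is roughly this (a sketch; the control is assumed to be a RadioButtonList and the class and property names are mine):

    // Master page code-behind: expose the selected value to content pages
    public partial class SiteMaster : System.Web.UI.MasterPage
    {
        public string SelectedOption
        {
            get { return RadioButtonList1.SelectedValue; }
        }
    }

    // Content page code-behind (with <%@ MasterType VirtualPath="~/Site.master" %> in the .aspx,
    // so Master is strongly typed)
    protected void Page_Load(object sender, System.EventArgs e)
    {
        Response.Write(Master.SelectedOption);
    }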
I would like to read all content that's been written to the output stream. I'm attempting to do this using an HTTP module, and it seems like the obvious timing would be when handling the PreSendRequestContent event.
However, the output stream seems to be set to write-only, as I can't read it using a StreamReader. Is there a way I can read and re-write the content without writing my own IIS module?
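The workaround I keep running into is not to read the output stream at all, but to install a pass-through Response.Filter that keeps its own copy of everything written - a sketch, with class and key names of my own invention:

    using System;
    using System.IO;
    using System.Text;
    using System.Web;

    public class CaptureModule : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            app.BeginRequest += delegate
            {
                // Wrap the existing filter so every write is also copied into a buffer.
                var filter = new CaptureFilter(app.Response.Filter);
                app.Context.Items["captureFilter"] = filter;
                app.Response.Filter = filter;
            };

            app.PreSendRequestContent += delegate
            {
                var filter = (CaptureFilter)app.Context.Items["captureFilter"];
                string html = Encoding.UTF8.GetString(filter.Captured.ToArray());
                // ... inspect (or rewrite) the captured content here ...
            };
        }

        public void Dispose() { }
    }

    // Pass-through stream: forwards writes to the real output and keeps a copy.
    public class CaptureFilter : Stream
    {
        private readonly Stream _inner;
        public MemoryStream Captured { get; private set; }

        public CaptureFilter(Stream inner) { _inner = inner; Captured = new MemoryStream(); }

        public override void Write(byte[] buffer, int offset, int count)
        {
            Captured.Write(buffer, offset, count);   // keep a copy
            _inner.Write(buffer, offset, count);     // pass through
        }

        public override void Flush() { _inner.Flush(); }
        public override bool CanRead { get { return false; } }
        public override bool CanSeek { get { return false; } }
        public override bool CanWrite { get { return true; } }
        public override long Length { get { throw new NotSupportedException(); } }
        public override long Position { get { throw new NotSupportedException(); } set { throw new NotSupportedException(); } }
        public override int Read(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
        public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
        public override void SetLength(long value) { throw new NotSupportedException(); }
    }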
I'm trying to create an ASPX page that displays an HTML message ("Please wait, your file transfer will begin momentarily") and also commences transmitting a file. I'm trying to avoid making the user open the page and then click a Download button. It seems like this may be possible with a "multipart/mixed" MIME type, but elsewhere I think I read that ASP.NET won't support this.
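The fallback I'm considering if multipart/mixed really isn't supported is a plain page that shows the message and then navigates to a separate download URL that sends Content-Disposition: attachment, so the save prompt appears while the message stays on screen - a sketch, with a made-up handler path:

    <!-- PleaseWait.aspx markup sketch: show the message, then navigate to a download URL.
         The download itself is served by a separate handler that sends
         Content-Disposition: attachment, so the browser stays on this page. -->
    <html>
    <head>
        <meta http-equiv="refresh" content="2;url=Download.ashx?file=123" />
    </head>
    <body>
        <p>Please wait, your file transfer will begin momentarily.</p>
    </body>
    </html>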
I have a question about something I've been searching on for too long! We've built an application through which an admin uploads songs into a database. Users can then buy songs and download them individually. The problem is that when a user downloads MP3 songs with the code below, it works great in Firefox and Chrome but not in IE8, simply because WMP tries to open the song instead of showing a "Save As" dialog. How can I force the "Save As" dialog to appear? Note that the MP3s are not physically on the server; they're in the database, so I can't link directly to a song...
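(The snippet mentioned above didn't make it into this post; below is only a sketch of the kind of download code in question, with a hypothetical data-access helper. The application/octet-stream content type is my guess at how to keep WMP from claiming the file, not necessarily what the original code sends.)

    // Sketch only - general shape, not the exact code; GetSongBytesFromDatabase is hypothetical.
    byte[] mp3 = GetSongBytesFromDatabase(songId);

    Response.Clear();
    Response.ContentType = "application/octet-stream";   // generic type so IE doesn't hand it to WMP
    Response.AddHeader("Content-Disposition", "attachment; filename=\"song.mp3\"");
    Response.AddHeader("Content-Length", mp3.Length.ToString());
    Response.BinaryWrite(mp3);
    Response.End();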
I'm using Visual Studio 2008, and when I select/highlight something in design view and switch to source view, it does not highlight and scroll to the selected item. This makes it really hard to change things in source view, and it's very inconvenient.
I have some code that is used to replace certain page output with other text. The way I accomplish this is by setting Response.Filter to a stream, flushing the response, and then reading that stream back into a string. From there I can manipulate the string and output the resulting code. You can see the basic code for this over at [URL]. However, I noticed that page caching no longer works after the first Response.Flush call.
I put together a simple ASP.NET WebApp as an example. I have a Default.aspx with an @OutputCache set for 30 seconds. All this does is output DateTime.Now.ToLongTimeString(). I override Render. If I do a Response.Flush (even after the base.Render) the page does not get cached. This is regardless of any programmatic cacheability that I set. So it seems that Response.Flush completely undermines any page caching in use. Why is this?
Extra credit: is there a way to accomplish what I want (render the output to a string) that will not result in the page cache being bypassed?
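For reference, the repro looks roughly like this (a reconstructed sketch; the namespace and class names are mine):

    <%-- Default.aspx (sketch) --%>
    <%@ Page Language="C#" Inherits="FlushRepro.DefaultPage" %>
    <%@ OutputCache Duration="30" VaryByParam="None" %>

    // Code-behind sketch: with the Response.Flush in place, the @OutputCache directive stops
    // having any effect; without it, the page is cached for 30 seconds as expected.
    using System;
    using System.Web.UI;

    namespace FlushRepro
    {
        public class DefaultPage : Page
        {
            protected override void Render(HtmlTextWriter writer)
            {
                writer.Write(DateTime.Now.ToLongTimeString());
                base.Render(writer);
                Response.Flush();   // the call that appears to bypass page caching
            }
        }
    }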
I need to make a web request to a web service that takes an XML argument and is expected to return a binary response. I am able to make the request, but I am unable to get the response back as binary. When I read the response using a StreamReader, I see the header and some attached gibberish, which is probably the binary data, but I'm unable to separate it out. Please help me separate out the binary response data.
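What I'm trying to get to is something like this - a sketch, with the URL, payload, and method names as placeholders: post the XML, then read the response body as raw bytes rather than through a StreamReader (which applies text decoding and mangles binary data).

    using System.IO;
    using System.Net;
    using System.Text;

    static class ServiceClient
    {
        // Post the XML and return the raw response bytes (no text decoding anywhere).
        public static byte[] Call(string url, string xml)
        {
            byte[] payload = Encoding.UTF8.GetBytes(xml);

            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "POST";
            request.ContentType = "text/xml";
            request.ContentLength = payload.Length;
            using (Stream body = request.GetRequestStream())
                body.Write(payload, 0, payload.Length);

            using (WebResponse response = request.GetResponse())
            using (Stream stream = response.GetResponseStream())
            using (var buffer = new MemoryStream())
            {
                byte[] chunk = new byte[8192];
                int read;
                while ((read = stream.Read(chunk, 0, chunk.Length)) > 0)
                    buffer.Write(chunk, 0, read);   // copy raw bytes
                return buffer.ToArray();
            }
        }
    }

    // usage: byte[] data = ServiceClient.Call("http://example.com/service", "<request>...</request>");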