I have a subdomain "s1" which points to folder "a". I want to know how I can access files stored in folder "b" from subdomain "s1". My issue is that I have some HTML content saved in DB records whose paths are based on the webroot, i.e. <img src="/b/movie1.avi" />. Could anyone tell me how I can serve these files via subdomain "s1"?
What I want to do is take traffic that is going to shop.mywebsite.com and redirect or rewrite (I'm not sure of the terminology) the domain to be www.mywebsite.com/shop. Both shop.* and www.* are separate web applications (nopCommerce and Umbraco respectively) that don't seem to cooperate when I've tried to nest them. Both applications are in a Server 2008 R2/IIS 7.5 environment.
I've searched around Stack Overflow and what I've found is a lot of answers for mapping in the other direction (i.e. a subfolder to a subdomain), but that's not what I'm looking for, as far as I understand the problem.
The end goal is to combine the SEO reputation of the shop subdomain into the www subdomain. I readily admit that I might have this all backwards and am willing to try any suggestions I'm offered.
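A minimal sketch of one way to do the redirect in code, assuming it can live in the nopCommerce application's Global.asax (the host names are from the question; Response.RedirectPermanent needs .NET 4). A 301 is what tells search engines the content has moved for good, which is the part that transfers the SEO reputation:

    void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpRequest req = HttpContext.Current.Request;
        if (req.Url.Host.Equals("shop.mywebsite.com", StringComparison.OrdinalIgnoreCase))
        {
            // Preserve the rest of the URL so deep links keep working.
            string target = "http://www.mywebsite.com/shop" + req.Url.PathAndQuery;
            HttpContext.Current.Response.RedirectPermanent(target);
        }
    }

The same rule can also be expressed without touching either application using the IIS URL Rewrite module, which may be cleaner given that the two apps don't cooperate.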
I have an app with multiple subdomains, subone.parent.com, subtwo.parent.com.
I have a logon page at parent.com/login. When a user logs in I redirect them to the proper domain based on which one they are a member of. This works fine.
FormsAuthenticationTicket ticket = new FormsAuthenticationTicket(...);
string encTicket = FormsAuthentication.Encrypt(ticket);
var cookie = new HttpCookie(FormsAuthentication.FormsCookieName, encTicket);
cookie.Domain = "subone.parent.com";
Response.Cookies.Add(cookie);
This properly authenticates the user for subone.parent.com and not subtwo.parent.com. However, I would like to do the following.
If the user goes back to parent.com, I would like to know that they are logged in and redirect them back to subone.parent.com.
Is there a best practice for accomplishing this? Or do I have to set another cookie for parent.com?
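One pattern that may fit, sketched under the assumption that parent.com and the subdomains share the same machineKey: scope the cookie to ".parent.com" so parent.com receives it too, and record the user's home subdomain in the ticket's UserData so parent.com knows where to redirect:

    FormsAuthenticationTicket ticket = new FormsAuthenticationTicket(
        1, userName, DateTime.Now, DateTime.Now.AddMinutes(30),
        false, "subone");                          // UserData = home subdomain
    string encTicket = FormsAuthentication.Encrypt(ticket);
    var cookie = new HttpCookie(FormsAuthentication.FormsCookieName, encTicket);
    cookie.Domain = ".parent.com";                 // sent to parent.com and all subdomains
    Response.Cookies.Add(cookie);

    // On parent.com:
    HttpCookie auth = Request.Cookies[FormsAuthentication.FormsCookieName];
    if (auth != null)
    {
        FormsAuthenticationTicket t = FormsAuthentication.Decrypt(auth.Value);
        Response.Redirect("http://" + t.UserData + ".parent.com/");
    }

The trade-off is that the cookie is now presented to subtwo as well, so each subdomain has to check the ticket's UserData (or the user's roles) before honoring it, rather than relying on cookie scoping alone.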
I started working on a website using 2010 Express. However, the hosting company that I am using hasn't fully implemented .NET 4.0. I'm wondering if there is a way that I can save my website for compatibility with 2008 Express. Then I can finish and publish using 2008 Express.
I'm hoping that somebody cleverer than I am can explain what's happening with my file uploading and how to improve it. My website is used by vets to submit photographs of tumours in horses for examination by another vet, so it's important that the pictures uploaded are high quality, and therefore the file sizes are large. Since they might be uploading details of several tumours at a time, with several photos of each, upload times can be very long.
I accept that upload speeds are generally slower than download speeds, and I have put notes on the page advising the user to be patient while the images upload. However, I've noticed something when using Google Chrome, which displays a percentage complete in the status bar while files are being uploaded: when the user selects a picture from the AsyncFileUpload control, it slowly counts to 100% as the image is uploaded. But when they then click the 'Save' button, which is when the code-behind uploads the file to the web server and writes an entry in the database, it again counts to 100% at the same slow speed.
It's as if the file is being uploaded twice, but I've only coded it to happen once, when the 'Save' button is pressed. The first upload must be some part of the way the AsyncFileUpload control works. So am I using it incorrectly? Should I somehow be using the built-in, automated upload to get the image to my server, rather than doing it in the code-behind of the 'Save' button?
One other piece of information which may be relevant: at the point the picture is uploaded by the AsyncFileUpload control, there is no associated record in the database for it to be linked to. When the 'Save' button is clicked, a record for that tumour is created first, then the image(s) are uploaded and an associated record is created with the image filename. So using the automated uploading of the AsyncFileUpload control may not be possible for me, as at that point there is no record to link the filename to. If that's the case, can I disable it somehow?
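If the double transfer really is the control's automatic upload plus your own upload in the Save handler, one possible rearrangement (a sketch only; control IDs, paths and the two record helpers are assumptions) is to let the automatic upload be the only transfer: park the file under a temporary name in UploadedComplete, and have Save just create the records and move the file into place:

    protected void AsyncFileUpload1_UploadedComplete(object sender,
        AjaxControlToolkit.AsyncFileUploadEventArgs e)
    {
        // No DB record exists yet, so save under a throwaway name for now.
        string tempName = Guid.NewGuid().ToString("N") + Path.GetExtension(e.FileName);
        AsyncFileUpload1.SaveAs(Server.MapPath("~/uploads/tmp/" + tempName));
        Session["PendingUpload"] = tempName;
    }

    protected void btnSave_Click(object sender, EventArgs e)
    {
        int tumourId = CreateTumourRecord();              // hypothetical helper
        string tempName = (string)Session["PendingUpload"];
        File.Move(Server.MapPath("~/uploads/tmp/" + tempName),
                  Server.MapPath("~/uploads/" + tumourId + "_" + tempName));
        CreateImageRecord(tumourId, tempName);            // hypothetical helper
    }

This way the user waits for one upload while choosing files, and Save becomes near-instant since it only shuffles records and a local file.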
I need to do multiple file uploads on my ASP.NET page. I also have to display a progress bar with the status of the file transfer: one progress bar per file, plus a separate progress bar for the overall total, shown until the file upload has finished.
I have a page currently set up to upload one image at a time: it uploads the image, then creates a thumbnail for that image in the same directory. The client wants to be able to upload a few more images at the same time, so I would like to see if I can keep the existing code that works, but set it up to upload several images and process them all to create the thumbnails.
I currently have a page that allows them to upload 5 images at a time, and that works fine, but it's just uploading the images; nothing else happens at that moment.
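A sketch of how the single-image flow might be generalised (paths and thumbnail size are placeholders): iterate Request.Files, which picks up every FileUpload control on the page, and run the same save-then-thumbnail step for each:

    // requires: using System.IO;
    protected void btnUpload_Click(object sender, EventArgs e)
    {
        for (int i = 0; i < Request.Files.Count; i++)
        {
            HttpPostedFile file = Request.Files[i];
            if (file.ContentLength == 0) continue;    // empty upload slot

            string name = Path.GetFileName(file.FileName);
            string path = Server.MapPath("~/images/" + name);
            file.SaveAs(path);

            using (var img = System.Drawing.Image.FromFile(path))
            using (var thumb = img.GetThumbnailImage(100, 100, null, IntPtr.Zero))
            {
                thumb.Save(Server.MapPath("~/images/thumb_" + name));
            }
        }
    }

If your existing thumbnail code is more involved than GetThumbnailImage, the loop stays the same and only the inner step changes.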
I recently embarked on the endeavor of creating my own asynchronous file upload components for ASP.NET. I took lessons learned from Darren Johnstone's FileUpload project and created an HttpModule for extracting the files from the submitted data.
I got everything working as it should in testing with VS 2008 using the Development Server. I even went so far during my testing to ensure that the request was being intercepted by the module before the files began uploading. After I was satisfied with things, I deployed the project to our web server (Win 2008 w/ IIS 7). I was horrified to learn that the controls were not functioning when deployed.
After some remote debugging, I found that the HttpApplication.AuthenticateRequest event (my location for hooking in to the process) was not being invoked until the files were completely uploaded.
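For comparison, here is a bare sketch of the interception pattern as I understand it from the FileUpload project (not your code; the class and names are mine): hook BeginRequest and drain the body through HttpWorkerRequest yourself, so the files are processed before ASP.NET buffers the whole entity. Whether a given event fires before or after buffering can differ between the Development Server and IIS 7, and between classic and integrated pipeline mode, which may be the variable worth testing:

    using System;
    using System.Web;

    public class UploadInterceptModule : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            app.BeginRequest += OnBeginRequest;
        }

        private void OnBeginRequest(object sender, EventArgs e)
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            if (!ctx.Request.ContentType.StartsWith("multipart/form-data",
                    StringComparison.OrdinalIgnoreCase))
                return;

            // The worker request gives raw access to the entity body.
            var wr = (HttpWorkerRequest)((IServiceProvider)ctx)
                         .GetService(typeof(HttpWorkerRequest));

            byte[] preloaded = wr.GetPreloadedEntityBody();
            // ... parse 'preloaded', then loop on wr.ReadEntityBody(buffer, size)
            // until Content-Length is consumed, streaming files to disk.
        }

        public void Dispose() { }
    }

Under IIS 7 integrated mode the module also needs registering in system.webServer/modules rather than (or as well as) system.web/httpModules.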
In continuation of my previous research at this link: Security Risks or concerns with the use of FileUpload control of asp.net - how to scan files during upload, and how to notify the user and abort the operation if a file is infected with a virus. In addition to the above, we have McAfee Antivirus installed on our servers. I heard that there are APIs for this for Symantec Antivirus, but I am not sure about McAfee Antivirus.
I have an aspx page called user-photo-upload.aspx and another aspx page called get-photo.aspx. I set Session["PhotoId"] in the Page_Load method of user-photo-upload.aspx.
If I visit user-photo-upload.aspx through the browser normally, the session can be retrieved in get-photo.aspx. But if the Flash uploader posts a photo to the user-photo-upload.aspx page, I can't get Session["PhotoId"] in get-photo.aspx.
I discovered that the session ID is different when visiting the page normally in the browser versus via Flash. I don't know why Flash uses another session.
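This is the classic Flash-uploader symptom: the Flash player maintains its own cookie jar (or sends no cookies at all), so the upload request arrives without the browser's ASP.NET_SessionId cookie and is given a brand-new session. A common workaround, sketched here (the "sid" parameter name is my invention), is to have Flash append the session id to the upload URL and restore the cookie server-side before session state is acquired:

    // In Global.asax; the uploader posts to user-photo-upload.aspx?sid=<session id>
    void Application_BeginRequest(object sender, EventArgs e)
    {
        HttpRequest req = HttpContext.Current.Request;
        string sid = req.QueryString["sid"];
        if (!string.IsNullOrEmpty(sid))
        {
            HttpCookie cookie = req.Cookies["ASP.NET_SessionId"];
            if (cookie == null)
                req.Cookies.Add(new HttpCookie("ASP.NET_SessionId", sid));
            else
                cookie.Value = sid;
        }
    }

Accepting a session id from the query string opens a session-fixation angle, so it would be prudent to restrict this to the upload page only.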
On my current project I have to accept 2 Excel spreadsheets that will be uploaded by clients, process them, and create a download package based on this information. This process includes extracting data from the sheets, updating our databases and building several PDF files for the download package. This takes between 15 seconds and 2 minutes to complete, depending on the complexity of the request. Naturally, I want to show some kind of processing indicator rather than just leaving the user hanging while the page loads.
Here's the problem: how do I show this processing indicator?
I have to do a full postback to upload the files so this eliminates several nice AJAX indicator methods (sponsoring users rejected the AJAX toolkit async file upload, saying it was confusing to them). If I process on any of the page events during the postback, the page doesn't load until the lengthy process is completed so the browser/site looks 'hung'.
Basically, I need some ideas on how to display a 'building your download' graphic while the lengthy process is working that will also work with a full postback.
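One trick that survives a full postback (the control IDs here are invented): unhide the 'building your download' overlay in the button's client-side click, then let the postback proceed; the browser keeps rendering the current page, overlay and all, until your 15-second-to-2-minute response finally arrives:

    protected void Page_Load(object sender, EventArgs e)
    {
        // pnlProgress is an <asp:Panel> holding the graphic, initially
        // style="display:none". Showing it does not block the postback.
        btnBuild.OnClientClick =
            "document.getElementById('" + pnlProgress.ClientID + "').style.display='block';";
    }

For anything fancier, such as a live progress percentage, the work would have to move to a background thread with the page polling for status, which is a bigger change.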
I'm using ASP.NET for file uploads, and I'm hitting a bit of a snag in that data sent by the client as multipart form data is read straight into RAM.
Obviously, this means maximum file upload size is limited to the available RAM on the machine, and even then is much smaller as soon as you have multiple simultaneous uploads.
Is it possible to get ASP.NET to write the data to a temporary file on the hard drive as it is received rather than reading it into RAM?
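If you are on .NET 2.0 or later this may already be configurable: the httpRuntime element has a requestLengthDiskThreshold attribute (in KB), and request bodies beyond that threshold are buffered to temporary files instead of being held in memory. A sketch with illustrative numbers:

    <system.web>
      <!-- maxRequestLength: allow up to 1 GB; buffer to disk past 8 MB -->
      <httpRuntime maxRequestLength="1048576" requestLengthDiskThreshold="8192" />
    </system.web>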
How do I save an image from another website programmatically, using asp.net / C#? For example, I would like to send a search request to Google for something, for example "beer", and save images from the search results using an asp.net web application.
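Leaving aside how the image URLs are obtained (scraping Google's result pages is against their terms of service; their search APIs return result URLs as data), the saving step itself is small. A sketch with a placeholder URL and path:

    // requires: using System.Net;
    using (var client = new WebClient())
    {
        client.DownloadFile("http://example.com/images/beer.jpg",
                            Server.MapPath("~/saved/beer.jpg"));
    }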
I'm developing a web application using asp.net 3.5 SP1. My issue is I need to upload multiple files asynchronously.
For that I used the AJAX Control Toolkit "async file upload", but with it I can only upload a single file and cannot track the filenames after uploading, so I went for the "file upload ajax":
[URL]
When I started including the cascading dropdown in that page ("mainpage.aspx"), an error occurred on the page: when I click "add files" in the "file upload ajax", the "file upload" control sometimes does not show. So I created another page ("testpage.aspx"), included the "file upload" control in it, created an iframe in "mainpage.aspx", and loaded "testpage.aspx" there. Now it is working fine, but I am not able to track the filename. Can I access the control inside that iframe from "mainpage.aspx"?
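Server-side, no: the control tree of testpage.aspx is not reachable from mainpage.aspx, because each page handles its own request. Since both pages share the same session, though, one simple hand-off (names assumed) is to stash the filename in Session from the iframe page and read it back from the main page:

    // In testpage.aspx, once the upload control reports the file:
    Session["LastUploadedFile"] = uploadedFileName;   // uploadedFileName: assumed local

    // In mainpage.aspx, e.g. in whichever handler needs it:
    string fileName = Session["LastUploadedFile"] as string;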
I created a search engine using the Google Search API in asp.net.
If I click a link in the search results, a new site opens. When I click through to the new site, I want to save the corresponding link together with its logo, like a bookmark, to SQL in asp.net.
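A sketch of the save step (the table and column names are invented, and the logo is approximated by the site's conventional /favicon.ico, since pages rarely expose a logo URL directly):

    // requires: using System.Data.SqlClient;
    Uri uri = new Uri(clickedLink);                            // clickedLink: assumed
    string logoUrl = uri.GetLeftPart(UriPartial.Authority) + "/favicon.ico";

    using (var conn = new SqlConnection(connectionString))     // connectionString: assumed
    using (var cmd = new SqlCommand(
        "INSERT INTO Bookmarks (Url, LogoUrl) VALUES (@url, @logo)", conn))
    {
        cmd.Parameters.AddWithValue("@url", clickedLink);
        cmd.Parameters.AddWithValue("@logo", logoUrl);
        conn.Open();
        cmd.ExecuteNonQuery();
    }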
I am designing a web application for a client who manages shows and would like to make a plan of the tables for the show online on his web site (with Drag and Drop).
I created an asp web page with the AJAX DragPanel extender where I put Label controls as tables, and it is possible to move them around, but I don't know how to save the layout so that when my client closes the web page and opens it again, the plan keeps its layout.
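One way this is often done (a sketch; the hidden field, the wire format and the persistence helper are all assumptions): have client script write each label's final position into a HiddenField whenever a drag ends, persist it in the code-behind on save, and re-apply the stored coordinates when the page loads:

    protected void btnSaveLayout_Click(object sender, EventArgs e)
    {
        // hidPositions.Value is filled client-side as "tableId:left:top;tableId:left:top;..."
        foreach (string entry in hidPositions.Value.Split(
                     new[] { ';' }, StringSplitOptions.RemoveEmptyEntries))
        {
            string[] p = entry.Split(':');
            SaveTablePosition(p[0], int.Parse(p[1]), int.Parse(p[2]));  // hypothetical DB helper
        }
    }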
I developed a site that works fine on my system, but when I publish it, the data is not being saved to the database. My database is in the App_Data folder. What is the problem?
Let's say I am creating an image hosting website. I will have somewhere around 1 million potential users, every user potentially has 10,000 images, and I need to serve over 1000 images per second.
So, I bought a disk array with 10T of storage and 15K SAS drives.
The problem is: what is the best way to save those files on disk? How do I organize the folder structure so that NTFS can quickly find one file among a billion files in a huge folder tree? I mean, serving 1000 images per second is a non-trivial issue. My current website serves over 100 images per second, and I already see the performance problem: NTFS can't find the files fast enough! And of course, my folder structure is not good enough either.
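The usual answer to "a billion files on NTFS" is to never let any single directory grow large: shard by an id or hash prefix so every lookup stays shallow. A sketch of one such scheme (the root path and the two-level fan-out are arbitrary choices):

    // requires: using System.IO;
    string GetImagePath(Guid imageId)
    {
        string name = imageId.ToString("N");      // e.g. "3f2a9c41..."
        string level1 = name.Substring(0, 2);     // 256 top-level buckets
        string level2 = name.Substring(2, 2);     // 256 buckets each = 65,536 dirs
        // ~1 billion files / 65,536 dirs is roughly 15,000 files per directory;
        // add a third level if that is still too many for your workload.
        return Path.Combine(@"D:\images", level1, level2, name + ".jpg");
    }

On top of the layout, disabling 8.3 short-name generation and last-access-time updates (via fsutil behavior set) is commonly recommended for NTFS volumes with huge file counts.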
I have written a website that lets the user upload their photo. It also resizes the picture to 400 x 400 and automatically crops to the center part of the picture. All of this works well, but each file is about 355K for both the JPG and PNG file formats. I just tried GIF and it is 49K, but the picture is grainy when saved that way. Is there any way I can get the picture to a smaller file size while keeping it at 400 x 400? I really thought PNG would be the way to go and was surprised it came out the same as JPG. I don't want to lower the quality of the JPG files either.
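It may be worth checking what quality level the JPG is actually saved at: Image.Save with no encoder parameters uses GDI+'s default, and an explicit setting often produces a much smaller file at visually identical quality. A sketch (85 is an arbitrary starting point to tune):

    // requires: using System.Drawing; using System.Drawing.Imaging; using System.Linq;
    void SaveJpeg(Bitmap bmp, string path, long quality)
    {
        ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
            .First(c => c.MimeType == "image/jpeg");
        using (var parms = new EncoderParameters(1))
        {
            parms.Param[0] = new EncoderParameter(Encoder.Quality, quality);
            bmp.Save(path, jpegCodec, parms);
        }
    }

As for the comparison: PNG is lossless, so on photographic content it will rarely beat JPG at this size, and GIF is limited to 256 colours, which is exactly the graininess you saw.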