Currently I am looking to move my websites' images to a storage service. I have two websites, one developed in PHP and one in ASP.NET.
Using Amazon's S3 service we can host all our images and videos for serving web pages, but there are some limitations when we want to serve images from S3:

1. If the website needs thumbnails at different sizes generated from the original image, it is tough. We would also need to subscribe to EC2, and although data transfer from S3 to EC2 is free, the transfer takes time before the image resize operation can even start.
2. Uploading a number of files as a zip and unzipping it in S3 is not possible, so there is no way to reduce the number of uploads.
3. Downloading multiple files from S3 at once is not possible, in case we want to shift to another provider.
4. Image names in S3 are case sensitive, so an image will not load if its name does not exactly match the request.

Among all these, the first one is the most important, since image resizing is a general requirement. Which provider is best suited to achieve my goal? Can I move to Google App Engine just for the purpose of image hosting, or is there another vendor who can provide the above services?
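For context, the resize step that would otherwise have to run on EC2 is straightforward in .NET itself; a minimal sketch using System.Drawing (paths, sizes, and the JPEG output choice are illustrative, not a recommendation for any particular provider):

```csharp
using System;
using System.Drawing;
using System.Drawing.Drawing2D;
using System.Drawing.Imaging;

public static class ThumbnailHelper
{
    // Resizes an original image to the given width, preserving aspect ratio.
    public static void SaveThumbnail(string sourcePath, string destPath, int targetWidth)
    {
        using (var original = Image.FromFile(sourcePath))
        {
            int targetHeight = (int)(original.Height * ((double)targetWidth / original.Width));
            using (var thumb = new Bitmap(targetWidth, targetHeight))
            using (var g = Graphics.FromImage(thumb))
            {
                g.InterpolationMode = InterpolationMode.HighQualityBicubic;
                g.DrawImage(original, 0, 0, targetWidth, targetHeight);
                thumb.Save(destPath, ImageFormat.Jpeg);
            }
        }
    }
}
```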
What is the best approach to storing images in a file system? Currently I am separating the images by day, month and then year. I am also hashing the file name so that no two files will be stored with the same name. Lastly, how would I go about using the image file's address in a forum?
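A minimal sketch of that layout, assuming the date partitions come from the upload time and the name comes from a SHA-1 hash of the file's content (one common way to guarantee unique names; the root directory is illustrative):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

public static class ImagePathHelper
{
    // Builds a path like "2010/06/15/3f5a...9c.jpg": partitioned by
    // year/month/day, with a content-hash file name so no two distinct
    // files end up stored under the same name.
    public static string BuildStoragePath(string rootDir, byte[] fileBytes, string extension)
    {
        string hashName;
        using (var sha1 = SHA1.Create())
        {
            hashName = BitConverter.ToString(sha1.ComputeHash(fileBytes))
                                   .Replace("-", "").ToLowerInvariant();
        }
        DateTime now = DateTime.UtcNow;
        string dir = Path.Combine(rootDir, now.ToString("yyyy"), now.ToString("MM"), now.ToString("dd"));
        Directory.CreateDirectory(dir); // no-op if it already exists
        return Path.Combine(dir, hashName + extension);
    }
}
```

For the forum question, the returned path can be translated to a URL relative to the web root and dropped into an img tag or BBCode as usual.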
We are using the Telerik RadEditor control in our project and are having some problems integrating it with our user document storage engine. We've implemented a custom content provider for the Telerik image manager, and it shows the correct documents in the dialog, but it doesn't switch between images on selection. You can see this in the picture. We've tested this behaviour on a blank ASP.NET page without any additional CSS or JS. The RadEditor tag can be found below. Please help! Why could this be happening?
How do you zoom an image at the side, like on a shopping site? When we put the mouse over a particular portion of the image, that portion zooms in at the side. How can I do this with simple code?
The account we use to copy/publish websites to our web server is not allowing us access to push websites. I've added this account to the permissions listed in the link below, but I have no success when trying to push remotely. When the account is in the local admins group, everything works flawlessly, but we are not allowed to keep it there. http://social.msdn.microsoft.com/Forums/en-US/vssetup/thread/31be047e-4716-4974-b8a1-be0111b50199 I've googled and searched a lot for this particular error but am not finding an answer that helps. We get this error: 'Unable to create the Web 'http://edea01/test/planning'. You are not authorized to perform the current operation', and the above link is the scenario that matches mine most closely. I don't do development work, but have been asked to figure out this connection problem.
A few years ago, when first being introduced to ASP.NET and the .NET Framework, I built a very simple online file storage system. This system used Rijndael encryption for storing the files encrypted on the server's hard drive, and an HttpHandler to decrypt and send those files to the client. Being one of my first projects with ASP.NET and databases, and not understanding much about how the whole thing works (as well as falling into the same trap described by Jeff Atwood on this subject), I decided to store freshly generated keys and IVs together with each file entry in the database.
To make things a bit clearer: the encryption was only to protect files from direct access to the server, and the keys were not generated from user-entered passwords. My question is, assuming I don't want to keep one key for all files, how should I store encryption keys for best security? What is considered best practice (e.g. on a different server, in a plain-text file, encrypted)? Also, what is the initialization vector used for in this type of encryption algorithm? Should it be constant in a system?
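On the IV point: an IV exists to make identical plaintexts encrypt to different ciphertexts, so it should be freshly random for each encryption, never constant, and it does not need to be kept secret (only the key does). A hedged sketch of per-file encryption that prepends the random IV to the output file (key management is deliberately left out, since that is the open question above):

```csharp
using System.IO;
using System.Security.Cryptography;

public static class FileEncryptor
{
    // Encrypts a file with a fresh random IV. The IV is written in the clear
    // at the start of the output file; IVs are not secret, only the key is.
    public static void Encrypt(string inputPath, string outputPath, byte[] key)
    {
        using (var aes = new RijndaelManaged { Key = key })
        {
            aes.GenerateIV(); // never reuse a constant IV
            using (var output = File.Create(outputPath))
            {
                output.Write(aes.IV, 0, aes.IV.Length); // prepend IV for decryption later
                using (var crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
                using (var input = File.OpenRead(inputPath))
                {
                    input.CopyTo(crypto); // CryptoStream flushes the final block on dispose
                }
            }
        }
    }
}
```

Decryption then reads the IV back from the first block of the file before constructing the decryptor.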
Where should I store photos for a Facebook-like app? Locally on the server in a folder, or upload them to a database? I am not intending to make this app commercial.
I'm using Visual Studio 2008 on a Windows 7 (32-bit) operating system. I've just started randomly getting this error message: 'Not enough storage is available to process this command'. I've googled the message and it seems the solution is to increase the amount of RAM Visual Studio can use. I've tried to follow the instructions here: http://stevenharman.net/blog/archive/2008/04/29/hacking-visual-studio-to-use-more-than-2gigabytes-of-memory.aspx but when I try to use bcdedit, I get this error: 'The boot configuration data store could not be opened. Access is denied.' So I tried the suggestion here: http://social.answers.microsoft.com/Forums/en-US/w7repair/thread/9e995bc8-141c-4ed0-9b17-2dbe92369202 I'm confused by this line: 'c. At the command prompt, type the following line, and then press ENTER: bcdedit /set {current} Description "name you want"'.
In our organization we have had to write code to store and retrieve documents in many apps. Now I would like to write a service/library that can be used by other devs. I'm not sure how to go about that. What should my first step be?
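One possible first step is to pin down the contract before any implementation, so every app codes against the same interface while the backing store stays swappable. A hypothetical minimal version (all names here are mine, not from the question):

```csharp
using System.IO;

// A minimal contract for a shared document-storage library. Apps code
// against the interface; the implementation (file share, database, cloud)
// can change later without touching the callers.
public interface IDocumentStore
{
    // Stores the content and returns an opaque id used for later retrieval.
    string Save(Stream content, string fileName, string contentType);

    Stream Open(string documentId);

    void Delete(string documentId);
}
```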
I maintain a web application (ASP.NET/IIS7/SQL2K8/Win2K8) that needs to access documents, actually hundreds of thousands of documents, and growing. Currently they are all on a Windows 2K8 Server file share, accessed by UNC path (SMB). The files are in a single flat directory, and I'm trying to plan how best to improve this solution. I don't want to use the SQL FILESTREAM attribute, as migrating everything into it would take significant effort and would really lock us in to SQL Server. I also need a way to replicate the data for disaster recovery, so perhaps a solution can help with that too. Options could be:

- Segment files into multiple directories; the application would add metadata for which directory a file is in (or segment by other means).
- Segment files onto separate servers (virtualized); backup becomes more complicated, and the application would add metadata for which server a file is on.
- NAS storage.
- SAN storage.
- Put a service (WCF) in front of the files and have the app talk to the service, with the bonus of being reusable across many applications (see the sketch after this list).

Assuming I'm going to store on the file system and not in a database (I've read those discussions here), which would be the more scalable solution?
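For the WCF option, the service contract can be very small; a sketch, where the operation names are assumptions and the binding would need TransferMode streaming configured separately to handle large documents:

```csharp
using System.IO;
using System.ServiceModel;

// Hypothetical WCF contract for a document service fronting the file store.
[ServiceContract]
public interface IDocumentService
{
    // Returns the document content as a stream (configure streamed transfer
    // on the binding so large files are not buffered in memory).
    [OperationContract]
    Stream GetDocument(string documentId);

    // Accepts the content as a single stream parameter and returns its new id.
    [OperationContract]
    string SaveDocument(Stream content);
}
```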
I have an ASP.NET site on a web server, and the images are stored on a NAS (some kind of LaCie external storage). I tried UNC paths and other things, but with no success. How can I access images on external storage from ASP.NET? Update: The images are reachable from the server, but when I try to access them from ASP.NET it throws an error.
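When the share is reachable from the server but not from the site, the usual culprits are the app pool identity's permissions on the share and how the path is mapped. One hedged way to isolate the path-mapping side is a bare handler that streams a file straight from the UNC path (the server and share names below are made up):

```csharp
using System.IO;
using System.Web;

// A hypothetical .ashx handler that streams an image from the NAS share.
// The app pool identity (or an impersonated account) needs read access
// to the UNC path, or this will throw the same access error.
public class NasImageHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Strip any directory components so callers cannot walk the share.
        string fileName = Path.GetFileName(context.Request.QueryString["name"]);
        string uncPath = Path.Combine(@"\\nas-server\images\", fileName); // assumed share

        context.Response.ContentType = "image/jpeg"; // assumes JPEG images
        context.Response.WriteFile(uncPath);
    }

    public bool IsReusable { get { return true; } }
}
```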
I would like to be able to give my users the ability to securely upload and store files, sometimes upwards of about a gigabyte in size. I'm not quite sure of the best way to go about doing this. I would like to have the connection between the client and the server secured, so I'm sure I will probably have to use SSL. Now, my major crux is secure storage. Since the data is somewhat sensitive, I'm wondering if I should encrypt each file before it is stored. I guess my question is: should I open the file and encrypt each byte, or encrypt the file as a whole? I'm assuming the latter is the better option. I know there are a number of examples of how to encrypt a file using a number of different methods (AES, DES, MD5 (hashing, not really encrypting), etc.). Currently, to encrypt text, I am using the Rijndael algorithm, SHA-1, 256-bit keys, a predefined passphrase, a salt value, and an IV.
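On the each-byte-versus-whole-file question: with a CryptoStream you don't have to pick either extreme. The stream is encrypted in buffered chunks, which also avoids holding a gigabyte file in memory. A sketch under those assumptions (key/IV handling is out of scope here):

```csharp
using System.IO;
using System.Security.Cryptography;

public static class UploadEncryptor
{
    // Encrypts the uploaded stream in 80 KB chunks via CryptoStream; neither
    // byte-at-a-time nor whole-file-in-memory is required.
    public static void EncryptToDisk(Stream upload, string destPath, byte[] key, byte[] iv)
    {
        using (var aes = new RijndaelManaged { Key = key, IV = iv })
        using (var output = File.Create(destPath))
        using (var crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
        {
            byte[] buffer = new byte[81920];
            int read;
            while ((read = upload.Read(buffer, 0, buffer.Length)) > 0)
                crypto.Write(buffer, 0, read);
            // disposing the CryptoStream writes the final padded block
        }
    }
}
```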
I'm currently developing a web application whose primary user function is uploading and downloading of files. The files will be stored on the hard disk (no cloud storage yet).
Taking into consideration the possibility of gigabytes of data and a large number of files, do I need to organize files into subfolders to speed up fetching a file, or is the file system's indexing already efficient enough that I can ignore this potential bottleneck?
Update:
On a side note, I plan to store file names and any additional information in a SQL database and only touch the disk when a user actually wants to download the file. This is how I plan on retrieving files:
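The original snippet wasn't included, but based on the description (names in SQL, disk touched only on download), the flow might look roughly like this. The table, column, and root-path names are hypothetical, not from the original post:

```csharp
using System.Data.SqlClient;
using System.IO;

public static class FileRetriever
{
    // Looks up the stored relative path for a file id in SQL,
    // then reads the file from disk only at download time.
    public static byte[] GetFile(string connectionString, int fileId)
    {
        string relativePath;
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT StoragePath FROM Files WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", fileId);
            conn.Open();
            relativePath = (string)cmd.ExecuteScalar();
        }
        return File.ReadAllBytes(Path.Combine(@"D:\FileStore", relativePath)); // assumed root
    }
}
```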
Maintaining user-uploaded images on a website becomes very tough as the number of images increases; in the long run the available disk space will drop to 0 bytes.
Amazon generally provides unlimited space with their S3 service. If we want to provide unlimited space on our own website, what are the possible ways?
On a website such as Amazon, they usually have a product description which is normally 2-3 paragraphs long. Would this information be stored in a database, would they use include files for those sections, or would it be stored in an XML file? I have a website which stores everything in a database, and for most of it the website works. E.g., in my database I have a table with columns such as ISBN Number, Book Name, and Publisher Name, none of which go above 50 characters! In some cases, though, the descriptions go on and on over several paragraphs. Where should this be stored? In the same table, or in another location which the database references?
I have data for Visa, MasterCard, Moneybookers, etc. How do you propose I store this data: in a separate table for each payment method, or not? One option might be a single table that stores all the data, from which I look up the information for Visa, MasterCard, Moneybookers, etc.; for example, one deposit_options table holding the options for every payment method. Or do I need a separate table for each payment method?
I have a project in which several modules need to retrieve assorted scalar values (i.e. int, string, etc.) before the ViewState is reestablished. To date I've been using the Session object for this, though things are getting somewhat unwieldy now. Also, Module A's session variables should really only be accessible by Module A, Module B's session variables only by Module B, and so on; yet with the Session object no such scoping is available.
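Since Session itself has no notion of scope, one lightweight pattern is to namespace the keys behind a per-module wrapper, so collisions between modules become impossible by construction. A sketch (class and key names are illustrative):

```csharp
using System.Web;

// A wrapper that namespaces session keys per module, so Module A cannot
// accidentally read or overwrite Module B's values.
public class ModuleSessionScope
{
    private readonly string _prefix;

    public ModuleSessionScope(string moduleName)
    {
        _prefix = moduleName + ":";
    }

    public object this[string key]
    {
        get { return HttpContext.Current.Session[_prefix + key]; }
        set { HttpContext.Current.Session[_prefix + key] = value; }
    }
}

// Usage: var moduleA = new ModuleSessionScope("ModuleA");
//        moduleA["Counter"] = 42;  // stored under key "ModuleA:Counter"
```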
I'm thinking of using a 3-layered architecture in my web site, with the classic layers (Presentation, BL, DAL) plus BO, business objects, just like in this great article: http://imar.spaanjaars.com/QuickDocId.aspx?quickdoc=416. I will have 4 projects corresponding to the levels. I'm going to store data in XML files, and my question is: where should those files be located? In the App_Data folder of the web site? Then how will the DAL know the path where to find the files to parse?
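App_Data is a reasonable home for the XML files, since IIS refuses to serve its contents directly. One common approach is to have the web layer resolve the physical path once and pass it into the DAL, so the DAL project itself never needs a reference to System.Web. A sketch of the web-layer side, with the file name assumed:

```csharp
using System.IO;
using System.Web.Hosting;

// The web site resolves the physical App_Data path and hands it to the DAL
// (e.g. via a constructor argument), keeping the DAL free of web dependencies.
public static class DataPathProvider
{
    public static string GetXmlDataPath()
    {
        // "~/App_Data" maps to the physical folder; the file name is illustrative.
        string folder = HostingEnvironment.MapPath("~/App_Data");
        return Path.Combine(folder, "Products.xml");
    }
}
```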
In my Entity Framework v4 project, I have a table with two columns that are computed by the database (via triggers, etc.). In order to get EF to properly insert records into the table, I have to manually mark the columns as "Computed" in the EF Storage Model (the StoreGeneratedPattern attribute), which is not supported by the designer, so I have to make the edits by hand to the XML in the .EDMX file. The problem is that whenever my database schema changes and I need to update the storage model in my project, the "Update Model Wizard" overwrites the whole Storage Model section of the .EDMX, eliminating my manual changes. This means that I have to keep a special list of such changes and manually re-apply them every time I do an update!
Is there any way to prevent the Storage Model overwrites? Is there a way to flag the columns in the database so that EF will automatically mark them as computed? As a last resort, is there some REALLY EASY XML tool/technique that can automatically apply the changes for me after every refresh? I've been told by an insider that one solution might be to check out the "Designer Power Pack" (link below), which allows you to customize generation of the storage model. I've only skimmed the info so far, but it looks to me like there may be a day or two of learning curve to figure it out. Does anyone have any experience with the Designer Power Pack, or any other ideas? [URL]
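Before reaching for the Designer Power Pack, a small post-refresh script may be enough for the "apply the changes after every refresh" option. A rough LINQ to XML sketch that re-applies the attribute to a known list of columns; the column names, file path, and SSDL namespace (which varies by EF/EDMX version) are assumptions:

```csharp
using System.Linq;
using System.Xml.Linq;

// Re-applies StoreGeneratedPattern="Computed" to known columns in the SSDL
// after the Update Model Wizard regenerates it. Run as a post-refresh step.
class EdmxPatcher
{
    static void Main()
    {
        // SSDL namespace for EDMX v2 (EF4); check your .edmx if this differs.
        XNamespace ssdl = "http://schemas.microsoft.com/ado/2009/02/edm/ssdl";
        var edmx = XDocument.Load("Model.edmx");

        var computedColumns = new[] { "ModifiedDate", "RowVersion" }; // assumed names
        foreach (var prop in edmx.Descendants(ssdl + "Property")
                                 .Where(p => computedColumns.Contains((string)p.Attribute("Name"))))
        {
            prop.SetAttributeValue("StoreGeneratedPattern", "Computed");
        }

        edmx.Save("Model.edmx");
    }
}
```

Note this matches on column name alone across all tables in the SSDL; a real version would also filter on the EntityType name.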
I'm using the MSChart control in a Web Project. I saw that there are 3 different storage mode settings: file/memory/session. I couldn't find any information about the pros/cons or the impact of each setting.
I want to import a CSV file (already uploaded to blob storage) in Azure. For example, I have uploaded test.csv to blob storage; now I just want to import that test.csv file in .NET (Azure), and after importing I will insert the data into an Azure database. I am using C# .NET.
Creating a CSV file with all rows, uploading it as a blob, then parsing it with a worker role and inserting it into the SQL Azure DB.
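A hedged sketch of that worker-role flow using the classic Microsoft.WindowsAzure.StorageClient API; the container name, table name, and columns are assumptions, and the CSV parsing is deliberately naive (no quoted-field handling):

```csharp
using System;
using System.Data.SqlClient;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Downloads test.csv from blob storage, parses it line by line,
// and inserts each row into SQL Azure.
class CsvImporter
{
    static void Import(string storageConnection, string sqlConnection)
    {
        var account = CloudStorageAccount.Parse(storageConnection);
        CloudBlob blob = account.CreateCloudBlobClient()
                                .GetContainerReference("uploads")  // assumed container
                                .GetBlobReference("test.csv");

        string csv = blob.DownloadText();

        using (var conn = new SqlConnection(sqlConnection))
        using (var cmd = new SqlCommand(
            "INSERT INTO ImportedRows (Col1, Col2) VALUES (@c1, @c2)", conn)) // assumed table
        {
            conn.Open();
            foreach (var line in csv.Split(new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries))
            {
                var fields = line.Split(','); // naive split, fine for simple CSVs
                cmd.Parameters.Clear();
                cmd.Parameters.AddWithValue("@c1", fields[0]);
                cmd.Parameters.AddWithValue("@c2", fields[1]);
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```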
I know this question is related to many others, but please bear with me. I am trying an experiment to store all information in database tables instead of the ASP.NET session. In ASP.NET 4 one can create a custom provider for session state. So, again: should I implement a custom session-state provider, or should I just disable session state (in Web.config)? From the comments, my question can be misunderstood; hopefully this tidbit will help clarify: I don't want to store the session in the database. I want to store information in the database that you would typically store in the session. One reason why: I don't want to carry a session around on every page, especially if that page doesn't care about 90 percent of the information in the session.
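If you go the disable-session route, the replacement can be as plain as a keyed lookup table, where each page reads only the entries it actually needs instead of dragging the whole session along. A hypothetical helper (the schema and names are mine):

```csharp
using System.Data.SqlClient;

// Values that would normally live in session go into a UserState table
// keyed by user and name; pages fetch individual values on demand.
public static class UserStateStore
{
    public static string Get(string connectionString, int userId, string key)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Value FROM UserState WHERE UserId = @u AND Name = @k", conn))
        {
            cmd.Parameters.AddWithValue("@u", userId);
            cmd.Parameters.AddWithValue("@k", key);
            conn.Open();
            return cmd.ExecuteScalar() as string; // null when the key is absent
        }
    }
}
```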