I am working on a web project which has around 25,000 XML files. Somewhere in the application we need to compare some part of an XML file with all 25,000 XML files. It works, but it takes around 25 minutes to compare. How can I reduce this time and compare the files faster?
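A minimal sketch of one way to cut the 25 minutes down (an assumption about the setup, since the post shows no code): canonicalize the fragment of interest once per file, hash it, and keep the hashes in memory, so each comparison becomes a dictionary lookup instead of 25,000 file reads. Selecting the "part" by element name here is illustrative.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Text;
using System.Xml.Linq;

static class XmlFragmentIndex
{
    // file path -> hash of the fragment of interest
    static readonly Dictionary<string, string> Index = new Dictionary<string, string>();

    public static void Build(string folder, string elementName)
    {
        foreach (string file in Directory.GetFiles(folder, "*.xml"))
        {
            XElement fragment = XDocument.Load(file).Descendants(elementName).FirstOrDefault();
            if (fragment != null)
                Index[file] = Hash(fragment);
        }
    }

    public static IEnumerable<string> FindMatches(XElement fragment)
    {
        string target = Hash(fragment);
        return Index.Where(p => p.Value == target).Select(p => p.Key);
    }

    static string Hash(XElement fragment)
    {
        // DisableFormatting ignores whitespace differences between files
        string canonical = fragment.ToString(SaveOptions.DisableFormatting);
        using (SHA1 sha = SHA1.Create())
            return Convert.ToBase64String(sha.ComputeHash(Encoding.UTF8.GetBytes(canonical)));
    }
}

Building the index is a one-time cost at startup; it only needs rebuilding when the files change.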
I have faced a peculiar issue in the production environment, where I have a main application named "Configurator". It has a page which submits an asynchronous request for a batch process. After the batch process starts, it calls a web service to generate a report. The issue is that each time the service gets called, the main application DLL gets copied into the Temporary ASP.NET Files folder, either under the service folder or under the application calling the service, named "GenerateReport". This is making the Temporary ASP.NET Files folder grow by as much as 8 GB per day, which in turn is bringing the production server down.
I have tried simulating it in the UAT environment with the same deployed code and the same IIS settings, but I was not able to replicate the issue in UAT. It seems to be a very specific and peculiar issue.
If we have a huge amount of text with some images, should the text and images be included directly in an ASP.NET 2.0 page along with the other server controls, or should they be included from some outside document like a Word document? Which is the best method of doing this? Can this be done using an XML file? If yes, then how? What are the advantages and disadvantages of both methods (if any)?
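If the XML route is chosen, a minimal sketch of what it could look like (the file name, XML structure, and the litContent control are assumptions): keep the HTML body, with its image references, in an XML file under App_Data and inject it into a Literal control at load time.

using System;
using System.Web.UI;
using System.Xml;

public partial class ContentPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // load the article body from a hypothetical ~/App_Data/content.xml
        // with a structure like <page><body>...html...</body></page>
        var doc = new XmlDocument();
        doc.Load(Server.MapPath("~/App_Data/content.xml"));
        XmlNode body = doc.SelectSingleNode("/page/body");
        if (body != null)
            litContent.Text = body.InnerXml; // litContent is an assumed <asp:Literal> on the page
    }
}

The main advantage of the external-file approach is that content can change without recompiling or redeploying the page; the main disadvantage is that the injected markup must be trusted or encoded.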
How can I create a query that selects only some records each time the user wants to display them in a GridView? I don't want to select all the records from my data source. What I exactly want is to simulate pagination by query.
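A minimal sketch, assuming SQL Server and a hypothetical Orders table: ROW_NUMBER() lets the query return only the rows for the requested page.

using System.Data;
using System.Data.SqlClient;

static DataTable GetPage(string connStr, int pageIndex, int pageSize)
{
    const string sql = @"
        SELECT * FROM (
            SELECT ROW_NUMBER() OVER (ORDER BY OrderID) AS RowNum, *
            FROM Orders
        ) AS numbered
        WHERE RowNum BETWEEN @first AND @last;";

    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@first", pageIndex * pageSize + 1);
        cmd.Parameters.AddWithValue("@last", (pageIndex + 1) * pageSize);
        var table = new DataTable();
        new SqlDataAdapter(cmd).Fill(table); // Fill opens and closes the connection itself
        return table;
    }
}

GetPage(connStr, 2, 25) would return rows 51-75 to bind to the GridView; on SQL Server 2012+ the OFFSET ... FETCH clause does the same more directly.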
I am working with a GridView. When a huge amount of data is inserted into the GridView, its columns become very thin and tall; however, I want to display the GridView at a fixed size, and if the data is huge, display only the first few words of it. In other words, I want to use the GridView the same way as a Gmail account, where data is displayed like this.
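A minimal sketch of the Gmail-style truncation, done in the page's code-behind: shorten long cell text in RowDataBound and keep the full text in the tooltip, while the column widths stay fixed via ItemStyle-Width in the markup. The 50-character cutoff is illustrative.

protected void GridView1_RowDataBound(object sender, GridViewRowEventArgs e)
{
    if (e.Row.RowType != DataControlRowType.DataRow) return;

    foreach (TableCell cell in e.Row.Cells)
    {
        if (cell.Text.Length > 50)
        {
            cell.ToolTip = cell.Text;                       // full text on hover
            cell.Text = cell.Text.Substring(0, 50) + "...";
        }
    }
}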
I want to upload video files that may be more than 500 MB, and if an upload is stopped in the middle, I want to continue the upload from where it stopped.
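A minimal server-side sketch of resumable uploads, assuming a client (e.g. a Flash/Silverlight/JavaScript uploader) that POSTs the file in chunks along with a fileName parameter (both hypothetical): appending each chunk lets an interrupted upload resume at the current file length.

using System.IO;
using System.Web;

public class ChunkUploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Path.GetFileName strips any directory parts from the client-supplied name
        string fileName = Path.GetFileName(context.Request["fileName"]); // hypothetical parameter
        string path = context.Server.MapPath("~/App_Data/uploads/" + fileName);

        using (var fs = new FileStream(path, FileMode.Append, FileAccess.Write))
        {
            byte[] buffer = new byte[64 * 1024];
            int read;
            while ((read = context.Request.InputStream.Read(buffer, 0, buffer.Length)) > 0)
                fs.Write(buffer, 0, read);
        }

        // report how many bytes are stored so the client knows where to resume
        context.Response.Write(new FileInfo(path).Length);
    }

    public bool IsReusable { get { return true; } }
}

Before resuming, the client asks the server for the current file length (the handler's response above) and restarts the upload from that byte offset.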
I have many (>600) EPS files (300 KB - 1 MB each) in a database. In my ASP.NET application (using ASP.NET 4.0) I need to retrieve them one by one and call a web service which converts the content to a JPEG file and updates the database (the JPEGContent column with the JPEG content). However, retrieving the content for 600 of them takes too long even from SQL Server Management Studio itself (5 minutes for 10 EPS contents).
So I have two issues:-
1) How to get the EPS content (unfortunately, selecting only a subset of the content is not an option :-( ):
Approach 1:-
foreach (DataRow row in dataTable.Rows)
{
    // get the Id and the byte[] content of the EPS from the row
    // call the web method to convert the EPS content to JPEG
}
or
foreach (DataRow row in dataTable.Rows)
{
    // get only the Id of the EPS from the row
    // hit the database to get the content of that EPS
    // call the web method to convert the EPS content to JPEG
}
or
Any other approach?
2) Converting EPS to JPEG using a web method for >600 contents. Of course, each call would be a long-running operation. Would the Task Parallel Library (TPL) be a better way to achieve this? (A sketch of this option follows after the edit below.)
Also, is doing the entire thing in a SQL CLR function a good idea?
EDIT :- Unfortunately, I have to do this in the ASP.NET application itself, and doing it in a separate process like a Windows service is not an option.
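A minimal TPL sketch of issue 2 (.NET 4), with hypothetical ConvertEpsToJpeg (web-service proxy) and SaveJpegContent (database update) helpers stubbed in: MaxDegreeOfParallelism bounds the concurrent calls so neither the web service nor the database is flooded.

using System.Data;
using System.Threading.Tasks;

static class EpsConverter
{
    public static void ConvertAll(DataTable epsTable)
    {
        var options = new ParallelOptions { MaxDegreeOfParallelism = 8 };
        Parallel.ForEach(epsTable.AsEnumerable(), options, row =>
        {
            int id = row.Field<int>("Id");
            byte[] eps = row.Field<byte[]>("Content");
            byte[] jpeg = ConvertEpsToJpeg(eps); // hypothetical web-service call
            SaveJpegContent(id, jpeg);           // hypothetical DB update
        });
    }

    static byte[] ConvertEpsToJpeg(byte[] eps)
    {
        // hypothetical: call the conversion web service here
        return eps;
    }

    static void SaveJpegContent(int id, byte[] jpeg)
    {
        // hypothetical: UPDATE ... SET JPEGContent = @jpeg WHERE Id = @id
    }
}

Of the two fetching approaches, the second (fetch only the Ids up front, then each content row on demand) pairs better with this, since it avoids holding 600 blobs in the DataTable at once.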
Recently I was working on displaying workflow diagram images in our web application. I managed to use the rehosted WF designer to create images on the fly on the server, but given how large the workflow diagrams can quickly become, I wanted to give a better user experience by using some AJAX control for displaying images that supports zoom & pan functionality.
I happened to come across the website of Seadragon, which seems to be an amazing piece of work that I could use. There is just one disadvantage: in order to use their library for generating deep zoom versions of images, I have to use the file structure on the server. Because of the temporary nature of the images I am using (workflow diagrams with progress indicators), it is important not only to be able to create such images but also to get rid of them after some time.
Now the question is how I can best ensure that the temporary image files and folder hierarchy can be created on the server (ASP.NET web app) and later cleaned up. I was thinking of using the cache functionality and deleting the corresponding image folder hierarchy when the cache item expires, or simply deleting the contents of the whole temporary folder in the Application_Start and Application_End of Global.asax, but I'm not really sure whether this is a good idea and whether there are security restrictions or file-system-related troubles. What do you think?
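The cache-expiration idea can work; a minimal sketch (the key prefix and folder layout are illustrative) that inserts one marker item per generated folder and deletes the folder in the removal callback:

using System;
using System.IO;
using System.Web;
using System.Web.Caching;

public static class TempImageCleaner
{
    public static void Register(string folderPath, TimeSpan lifetime)
    {
        HttpRuntime.Cache.Insert(
            "tempimg:" + folderPath, folderPath, null,
            DateTime.UtcNow.Add(lifetime), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnRemoved);
    }

    static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        try { Directory.Delete((string)value, true); } // true = recursive
        catch (IOException) { /* folder already gone or briefly in use; a later sweep catches it */ }
    }
}

Since cache callbacks don't fire when the application domain is torn down, the Application_Start sweep from the question is a good backstop to combine with this.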
I'm testing a very simple .aspx page on Visual Studio's own ASP.NET Development Server (the local server). On the web page there is a FileUpload control, which can upload JPG files of up to 2 MB without problems. On uploading bigger files, the browser immediately shows "The web page cannot be displayed". It does not show any exception, which really puzzles me. "The web page cannot be displayed" is normally caused by a network problem, but in this case it's a local server, and it handles smaller JPG files fine. What's the problem here?
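If the failing files are over 4 MB, the likely cause is ASP.NET's request size limit rather than the network: maxRequestLength defaults to 4096 KB, and a request over the limit is rejected before the page ever runs, which matches the blank error page with no exception. A web.config sketch raising it:

<configuration>
  <system.web>
    <!-- maxRequestLength is in KB: allow uploads up to ~20 MB and give them more time -->
    <httpRuntime maxRequestLength="20480" executionTimeout="300" />
  </system.web>
</configuration>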
The other day I came across an alternative way of accessing web.config configuration in some article. It allowed you to: provide a path to the web.config file; modify the web.config configuration at runtime, like config.MySetting = "new value"; load the web.config from another web application in the same IIS (I'm not sure about that); and work with the configuration using a class instance instead of the static ConfigurationManager class.
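This sounds like WebConfigurationManager from System.Web.Configuration, though since the article isn't named, treat that as a guess. A minimal sketch of that API; note that saving rewrites web.config on disk, which also restarts the application domain.

using System.Configuration;
using System.Web.Configuration;

static void UpdateSetting()
{
    // open this application's root web.config; an overload taking a virtual path
    // and a site name can open another application's config in the same IIS
    Configuration config = WebConfigurationManager.OpenWebConfiguration("~");

    config.AppSettings.Settings["MySetting"].Value = "new value"; // modify at runtime
    config.Save(ConfigurationSaveMode.Modified);                  // writes web.config back to disk
}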
I have a problem. I have a textbox where I enter the amount, and in another textbox the amount in words appears automatically. The function appended below works fine when the amount entered is without a decimal, but if the amount entered has a decimal, the function gives an error. Can anyone check this and tell me a solution? What I want is: suppose the amount entered is 2345.68, the amount in words should come out as "Rupees two thousand three hundred forty five and sixty eight paisa only". If no decimal is entered, the function gives the proper result.
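A minimal sketch of the decimal handling, assuming NumberToWords is the questioner's existing integer-to-words function (stubbed here): split the amount into rupees and paisa and convert each part separately.

static string NumberToWords(long n) { return n.ToString(); } // placeholder for the existing function

static string AmountInWords(decimal amount)
{
    long rupees = (long)decimal.Truncate(amount);
    int paisa = (int)decimal.Round((amount - rupees) * 100);

    string words = "Rupees " + NumberToWords(rupees);
    if (paisa > 0)
        words += " and " + NumberToWords(paisa) + " paisa";
    return words + " only";
}

// AmountInWords(2345.68m) with the real converter would yield
// "Rupees two thousand three hundred forty five and sixty eight paisa only".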
Does it have any major effect on performance/memory if my web.config is really huge (say, 1000+ entries in <appSettings>)? Is it a good idea to maintain a separate custom XML config file for all business-specific settings for my app?
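On the first part: configuration is parsed once per application domain and cached, so 1000+ entries mostly cost a little memory and startup time, not per-request time. On the second part, <appSettings> already supports splitting settings out via its file attribute, and edits to the external file do not recycle the application the way web.config edits do. A sketch, with illustrative file and key names:

<!-- web.config: infrastructure settings stay here -->
<appSettings file="BusinessSettings.config" />

<!-- BusinessSettings.config, deployed alongside web.config -->
<appSettings>
  <add key="Business.InvoicePrefix" value="INV" />
</appSettings>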
We have an ASP.NET application that has a report showing about 1,000 rows as of right now, which can potentially grow up to 20,000. Don't ask me why, but our client does not like paging and does not like filtering; they like to see everything on a single page. Our obvious problem is the load it's putting on the server in terms of memory (plus the factor that the client browser may crash as well).
My question is: if I provide a custom desktop application only for this report, one that can display thousands and thousands of rows (through web services or remoting), would it clog up the server? For the ASP.NET application, the IIS worker process basically eats up the server's memory, but if I have this desktop app running separately, calling the same database on the application server, would that avoid the pressure on the web server?
I have tried to fetch up to 200,000 records, at which point IIS exits. Does anybody have a solution for retrieving huge numbers of records? I need to get the data quickly as well. What tuning do I have to do in my WCF application?
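A hedged sketch of the usual WCF tuning knobs for large responses (the binding and behavior names are illustrative, and values this large are a blunt instrument); paging the data on the service side is still the more robust fix:

<bindings>
  <basicHttpBinding>
    <binding name="largeResults"
             maxReceivedMessageSize="2147483647"
             sendTimeout="00:10:00" receiveTimeout="00:10:00">
      <readerQuotas maxArrayLength="2147483647" maxStringContentLength="2147483647" />
    </binding>
  </basicHttpBinding>
</bindings>
<behaviors>
  <endpointBehaviors>
    <behavior name="largeGraph">
      <!-- the serializer otherwise caps the number of objects per message -->
      <dataContractSerializer maxItemsInObjectGraph="2147483647" />
    </behavior>
  </endpointBehaviors>
</behaviors>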
When exporting a lot of data to a string (CSV format), I get an OutOfMemoryException. What's the best way to tackle this? The string is returned to a Flex application. What I'd do is export the CSV to the server disk and give back a URL to Flex. That way, I can flush the stream while writing to the disk. Update: the string is built with a StringBuilder:
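A minimal sketch of the export-to-disk idea (folder and URL parameters are assumptions): a StreamWriter flushes to disk as it goes, so memory use stays flat, whereas the StringBuilder keeps the entire CSV in one huge string.

using System;
using System.Data;
using System.IO;

static string ExportCsv(DataTable table, string serverFolder, string baseUrl)
{
    string fileName = Guid.NewGuid() + ".csv";
    string path = Path.Combine(serverFolder, fileName);

    using (var writer = new StreamWriter(path))
        foreach (DataRow row in table.Rows)
            writer.WriteLine(string.Join(",", row.ItemArray)); // naive: no quoting/escaping

    return baseUrl + "/" + fileName; // hand this URL back to the Flex client
}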
I have a project which generates snapshots of a database, converts them to XML and then stores the XML inside a separate database. Unfortunately, these snapshots are becoming huge files, now about 10 megabytes each. Fortunately, I only have to store them for about a month before they can be discarded again, but still, a month of snapshots turns out to be really bad for performance... I think there is a way to improve performance a lot. No, not by storing the XML in a separate folder somewhere, because I don't have write access to any location on that server. The XML must stay within the database. But somehow the field [Content] might be optimized so things will speed up... I won't need any full-text search options on this field; I will never do any searching based on it. So perhaps the field could be excluded from search instructions or whatever? The table has no references to other tables, but the structure is fixed. I cannot rename things or change the field types. So I wonder if optimization is still possible. Well, is it? The structure, as generated by SQL Server:
CREATE TABLE [dbo].[Snapshots](
    [Identity] [int] IDENTITY(1,1) NOT NULL,
    [Header] [varchar](64) NOT NULL,
    ...
Performance isn't just slow when selecting data from this table but also when selecting or inserting data in one of the other tables in this database! When I delete all records from this table, the whole system is fast. When I start adding snapshots, performance starts to decrease. After about 30 snapshots, performance becomes bad and the risk of connection timeouts increases. Maybe the problem isn't in the database itself, although it's still slow when used through the management tool. (It's fast when Snapshots is empty.) I mainly use ASP.NET 3.5 and the Entity Framework to connect to this database and then read the multiple tables. Maybe some performance can be gained here, although that wouldn't explain why the database is also slow from the management tools and when used through other applications with a direct connection...
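One avenue the stated constraints still allow (a suggestion, not from the original post): since [Content] is never searched, the XML can be compressed and stored base64-encoded, leaving the existing text column type untouched while shrinking ~10 MB snapshots considerably and reducing I/O on every read and write.

using System;
using System.IO;
using System.IO.Compression;
using System.Text;

static string CompressXml(string xml)
{
    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            byte[] data = Encoding.UTF8.GetBytes(xml);
            gzip.Write(data, 0, data.Length);
        }
        return Convert.ToBase64String(output.ToArray()); // fits the existing text column
    }
}

static string DecompressXml(string stored)
{
    using (var input = new MemoryStream(Convert.FromBase64String(stored)))
    using (var gzip = new GZipStream(input, CompressionMode.Decompress))
    using (var reader = new StreamReader(gzip, Encoding.UTF8))
        return reader.ReadToEnd();
}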
I'm currently exporting a database table with huge data (100,000+ records) into an XML file using the XmlTextWriter class, and I'm writing directly to a file on the physical drive.
_XmlTextWriterObject = new XmlTextWriter(_xmlFilePath, null);
While my code runs OK, my question is: is this the best approach? Or should I write the whole XML to a memory stream first and then write the XML document to the physical file from the memory stream? And what are the effects on memory/performance in both cases?
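For comparison, a minimal sketch of the direct-to-file option: the writer streams to disk as it goes, so memory stays flat regardless of row count, whereas a MemoryStream would hold all 100,000+ records in RAM before the first byte reaches the file. The direct write generally wins for large exports.

using System.Xml;

static void ExportDirect(string path)
{
    using (XmlWriter writer = XmlWriter.Create(path)) // streams to disk as it writes
    {
        writer.WriteStartElement("rows");
        // ... one WriteStartElement/WriteElementString block per record ...
        writer.WriteEndElement();
    }
}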
EDIT
I will indeed be using XmlTextWriter, but I meant to ask whether to pass a physical file path string to the XmlTextWriter constructor (or, as John suggested, to the XmlTextWriter.Create() method) or to use the stream-based API. My current code looks like the following:
I'm using the following CodeProject article to build an ASP.NET website, and so far everything is good. My only problem is that after the barcode is generated, a huge whitespace exists to the right of the barcode. I've been playing with this and am unable to resolve it.
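Without seeing the CodeProject code, the root cause can only be guessed at; a hedged workaround is to trim the trailing white columns off the generated bitmap before writing it to the response:

using System.Drawing;

static Bitmap TrimRightWhitespace(Bitmap source)
{
    // scan inwards from the right edge for the last non-white column
    int right = source.Width - 1;
    while (right > 0 && IsColumnWhite(source, right))
        right--;

    var keep = new Rectangle(0, 0, right + 1, source.Height);
    return source.Clone(keep, source.PixelFormat);
}

static bool IsColumnWhite(Bitmap bmp, int x)
{
    for (int y = 0; y < bmp.Height; y++)
        if (bmp.GetPixel(x, y).ToArgb() != Color.White.ToArgb())
            return false;
    return true;
}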
We have an .aspx file which has about 400 lines of JavaScript code. Is it a good idea to have such a huge amount of code in the page itself? What is the performance difference between having huge JavaScript code in the .aspx as against a separate .js file?
I have a DataTable available with me which contains thousands of rows. There is a column called EmpID which contains '0' for some of the rows. I want to remove those rows from my current DataTable and create a new, correct DataTable. I cannot go row by row checking it, since it contains a huge amount of data.
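A minimal sketch using DataView.RowFilter, which avoids a hand-rolled row-by-row loop (the filtering itself still scans, but inside ADO.NET):

using System.Data;

static DataTable RemoveZeroEmpIds(DataTable source)
{
    var view = new DataView(source) { RowFilter = "EmpID <> '0'" };
    return view.ToTable(); // new DataTable containing only the valid rows
}

If EmpID is a numeric column rather than a string, drop the quotes: RowFilter = "EmpID <> 0".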
I have a table with just one column and one row, holding a single file of 1.5 GB! The C# application and SQL Server are on different machines. I want to read that file with a DataReader, 100 MB at a time, saving each 100 MB chunk to disk with FileMode.Append on the file stream, collecting them into one file.
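A minimal sketch (the table and column names are assumptions): CommandBehavior.SequentialAccess streams the BLOB instead of buffering the whole 1.5 GB row, and with a small transfer buffer the memory footprint stays in kilobytes, so the 100 MB batching from the question isn't needed for memory safety.

using System.Data;
using System.Data.SqlClient;
using System.IO;

static void DownloadBlob(string connStr, string destPath)
{
    var buffer = new byte[64 * 1024]; // 64 KB transfer buffer

    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand("SELECT FileContent FROM Files", conn)) // hypothetical names
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        using (var fs = new FileStream(destPath, FileMode.Append, FileAccess.Write))
        {
            if (!reader.Read()) return;

            long offset = 0;
            long read;
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                fs.Write(buffer, 0, (int)read);
                offset += read;
            }
        }
    }
}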
I have code in an ASP.NET form that needs to create messages in the database depending on user entry. We are speaking of potentially thousands of DB entries. How do I protect against deadlocks? I mean apart from using transactions and setting the IsolationLevel to Serializable, as well as using the WITH(NOLOCK) hint on my SELECT statements, since I don't mind a dirty read.
Event code: 3005
Event message: An unhandled exception has occurred.
Event time: 10/13/2010 10:12:14 PM
Event time (UTC): 10/14/2010 3:12:14 AM
Event ID: a565c58a7f844692859aa21303447c7c
Event sequence: 206
Event occurrence: 1
Event detail code: 0

Application information:
    Application domain: /LM/W3SVC/610100832/Root-12-129314933998593750
    Trust level: Full
    Application Virtual Path: /
    Application Path: D:\Websites\admin.beta.sharedTime.com
    Machine name: SHAREDTIME

Process information:
    Process ID: 3440
    Process name: w3wp.exe
    Account name: NT AUTHORITY\NETWORK SERVICE

Exception information:
    Exception type: SqlException
    Exception message: Transaction (Process ID 56) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

Request information:
    Request URL: http://beta.admin.sharedTime.com/admin_text_mass_send.aspx
    Request path: /admin_text_mass_send.aspx
    User host address: 69.211.10.138
    User:
    Is authenticated: False
    Authentication Type:
    Thread account name: NT AUTHORITY\NETWORK SERVICE

Thread information:
    Thread ID: 10
    Thread account name: NT AUTHORITY\NETWORK SERVICE
    Is impersonating: False
    Stack trace:
        at mtNamespace.mt.createmessage_queue(String phone_number, String text_message, DateTime send_on, String system_name, Double user_no, Double send_priority, String message_type, Boolean returnqueue) in http://server/App_Code/mt.vb:line 1509
        at ASP.admin_text_mass_send_aspx.save_user_values(Object sender, EventArgs e) in http://server/admin_text_mass_send.aspx:line 103
        at System.Web.UI.WebControls.Button.OnClick(EventArgs e)
        at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
        at System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument)
        at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)
        at System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData)
        at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

Custom event details:
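The exception in the log above is SQL Server error 1205, and its message says what to do: rerun the transaction. A minimal retry sketch around the message-creating call; keeping transactions short and touching tables in a consistent order also reduces deadlocks in the first place (note that Serializable isolation makes them more likely, not less).

using System;
using System.Data.SqlClient;
using System.Threading;

static void RunWithDeadlockRetry(Action work)
{
    const int maxAttempts = 3;
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            work(); // e.g. the createmessage_queue call inside its transaction
            return;
        }
        catch (SqlException ex)
        {
            if (ex.Number != 1205 || attempt == maxAttempts)
                throw;                   // not a deadlock, or out of retries
            Thread.Sleep(200 * attempt); // brief backoff before rerunning
        }
    }
}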
I am developing a huge web application, with almost millions of users and very complex, secure, and optimized logic, but I am thinking of using a client-side architecture with ASP.NET, like the one Gmail uses.