Does it have any major effect on performance/memory if my web.config is really huge (say, 1000+ entries in <appSettings>)? Is it a good idea to maintain a separate custom XML config file for all business-specific settings for my app?
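What I have in mind for the separate file is roughly this sketch (the BusinessSettings.xml file, its App_Data location, and the <settings>/<add> layout are all just assumptions on my part):

using System.Collections.Generic;
using System.Web;
using System.Xml;

// Loads the custom settings file once and caches it in a static dictionary,
// so the XML is parsed on first access rather than on every lookup.
public static class BusinessSettings
{
    private static readonly Dictionary<string, string> _values = Load();

    private static Dictionary<string, string> Load()
    {
        Dictionary<string, string> values = new Dictionary<string, string>();
        XmlDocument doc = new XmlDocument();

        // assumes first access happens during a request, so HttpContext.Current is available
        doc.Load(HttpContext.Current.Server.MapPath("~/App_Data/BusinessSettings.xml"));

        // assumed file layout: <settings><add key="..." value="..." /></settings>
        foreach (XmlNode node in doc.SelectNodes("/settings/add"))
        {
            values[node.Attributes["key"].Value] = node.Attributes["value"].Value;
        }
        return values;
    }

    public static string Get(string key)
    {
        return _values[key];
    }
}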
We have an .aspx file which has about 400 lines of JavaScript code in it. Is it a good idea to move such a large block of code into its own file? What is the performance difference between keeping this JavaScript inline in the .aspx versus putting it in a separate .js file?
I have one stored procedure that generates a report.
For storing intermediate data it uses many temp tables. There are many SELECT queries, and somewhat fewer INSERT and DELETE queries.
Now, when a temp table holds a huge amount of data, around 1 lakh (100,000) rows, my SELECT queries take too much time, and possibly the INSERT and DELETE queries do as well.
I added a primary key on the auto-increment field of each temp table, and also defined that primary key as a unique clustered index to improve performance.
But there is not much improvement when the temp tables are this large.
Right now the whole stored procedure takes around 1.5 days, roughly 30 hours, to complete.
So I want to improve the performance enough that it finishes in roughly 3-4 hours.
Currently in our application we have an ApplicationConfiguration class. It is basically just a static class with const values specifying certain application-global configuration options. Some of these values will rarely, if ever, change and are only put into the configuration for elegance/cleanliness.
Some of these values, though, do need to be different between production and development, so I'm considering making this class just a wrapper over Web.config. These values are checked many times throughout our codebase. If I change them from a const to a read-only getter that reads from Web.config, will this affect compiler optimizations or make our application in any way slower?
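For what it's worth, this is roughly the shape I'm considering (the setting names here are just examples). The values are read from Web.config once when the type is initialized, so the many call sites still see a plain static field; roughly the only runtime difference versus const is that the literal is no longer inlined at compile time.

using System.Configuration;

// Wrapper over <appSettings> in Web.config; static readonly fields are initialized
// once per application domain, so repeated checks stay as cheap as a field read.
public static class ApplicationConfiguration
{
    public static readonly int CacheTimeoutMinutes =
        int.Parse(ConfigurationManager.AppSettings["CacheTimeoutMinutes"]);

    public static readonly bool EnableAuditing =
        bool.Parse(ConfigurationManager.AppSettings["EnableAuditing"]);
}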
I have developed a website on ASP.NET Framework 2.0. The same code, without any change, is hosted on two different servers. My issue is the difference in performance between these two sites: one of them takes much longer to insert data into the DB (SQL Server 2005). The two websites use different DB servers.
I think the issue lies with the DB server. How can we fix the DB performance for inserts, and is there any other possible cause for this performance issue?
I am not sure if this is the right forum; I cannot find a forum for LINQ.
I am working on an application using LINQ. Application performance is not up to par, and my tests show that it is the LINQ queries that are slow. I was wondering if anybody can recommend where I can find an article about optimizing LINQ performance, perhaps through query compilation or other methods.
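For example, one of the techniques I would like to read more about is compiled queries. A minimal sketch, assuming LINQ to SQL with a hypothetical NorthwindDataContext and Customer entity:

using System;
using System.Data.Linq;
using System.Linq;

public static class CustomerQueries
{
    // The query expression is translated to SQL once and cached, instead of
    // being re-translated on every execution.
    private static readonly Func<NorthwindDataContext, string, IQueryable<Customer>> CustomersByCity =
        CompiledQuery.Compile((NorthwindDataContext db, string city) =>
            db.Customers.Where(c => c.City == city));

    public static IQueryable<Customer> GetByCity(NorthwindDataContext db, string city)
    {
        return CustomersByCity(db, city);
    }
}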
I am creating a service-oriented application where I am trying to do everything through services. However, there is something I am not sure of: I have a page that calls the database on page load. Which would be better and faster: calling the database in Page_Load, or calling a WCF service from JavaScript when the page loads on the client? By the way, I am using a Repeater on the page, but I have created a kind of engine to generate the appropriate HTML, so if I go with the WCF approach I would build the Repeater's HTML in the service and send it back to the page.
I am working on a web project which has around 25,000 XML files. Somewhere in the application we need to compare a part of one XML file against all 25,000 XML files. It works, but it takes around 25 minutes to do the comparison. How can I reduce this time and compare the files faster?
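One idea I am considering is to read the relevant part of each file once, hash it, and keep the hashes cached, so a comparison becomes a dictionary scan instead of 25,000 file reads. A rough sketch (the XPath for the compared fragment is just a placeholder for whatever part we actually compare):

using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;
using System.Text;
using System.Xml;

static class XmlFragmentIndex
{
    // Build this index once (for example at application start) and reuse it for every comparison.
    public static Dictionary<string, string> Build(string folder)
    {
        Dictionary<string, string> index = new Dictionary<string, string>();

        using (MD5 md5 = MD5.Create())
        {
            foreach (string path in Directory.GetFiles(folder, "*.xml"))
            {
                XmlDocument doc = new XmlDocument();
                doc.Load(path);

                // placeholder XPath: select whatever fragment of the XML is being compared
                XmlNode fragment = doc.SelectSingleNode("/root/partToCompare");
                string text = fragment != null ? fragment.OuterXml : string.Empty;

                byte[] hash = md5.ComputeHash(Encoding.UTF8.GetBytes(text));
                index[path] = Convert.ToBase64String(hash);
            }
        }
        return index;
    }
}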
We have an ASP.NET application with a report that currently shows about 1,000 rows and could potentially grow to 20,000. Don't ask me why, but our client does not like paging and does not like filtering; they want to see everything on a single page. Our obvious problem is the load this puts on the server in terms of memory (and also the risk that the client browser may crash).
My question is: if I provide a custom desktop application just for this report, one that can display thousands and thousands of rows (through web services or remoting), would it still clog up the server? With the ASP.NET application, the IIS worker process essentially eats up the memory on the server; would that still be the case if this desktop app separately called the same database on the application server?
I have tried to fetch up to 200,000 records, but beyond that IIS crashes (the worker process exits). Does anybody have a solution for returning a huge number of records? I also need to get the data quickly, so what tuning do I have to do in my WCF application?
When exporting a lot of data to a string (CSV format), I get an OutOfMemoryException. What's the best way to tackle this? The string is returned to a Flex application. What I would do is export the CSV to the server's disk and give back a URL to Flex; that way I can flush the stream while writing to disk. Update: the string is built with a StringBuilder:
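The disk-based version I have in mind looks roughly like this (just a sketch: the ~/exports folder, the exportTable DataTable and the semicolon separator are assumptions):

using System;
using System.Data;
using System.IO;
using System.Web;

static class CsvExport
{
    // Streams the CSV straight to a file on the server instead of building one giant
    // string in memory, then returns a relative URL that can be handed back to Flex.
    public static string ExportToFile(DataTable exportTable, HttpServerUtility server)
    {
        string fileName = Guid.NewGuid().ToString("N") + ".csv";
        string physicalPath = Path.Combine(server.MapPath("~/exports"), fileName);

        using (StreamWriter writer = new StreamWriter(physicalPath))
        {
            foreach (DataRow row in exportTable.Rows)
            {
                // one line per row; a real export would also need quoting/escaping
                string[] fields = Array.ConvertAll<object, string>(row.ItemArray, Convert.ToString);
                writer.WriteLine(string.Join(";", fields));
            }
        }

        return "~/exports/" + fileName;   // resolve to an absolute URL before sending it to Flex
    }
}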
I want to upload video files that may be larger than 500 MB, possibly several at the same time. If a 500 MB upload stops in the middle, I want to be able to resume the upload from the point where it stopped.
If we have a huge amount of text with some images, should the text and images be included directly in an ASP.NET 2.0 page alongside the other server controls, or should they be pulled in from an outside document such as a Word document? Which is the better method? Can this be done using an XML file? If yes, then how? What are the advantages and disadvantages of each method (if any)?
I have a project which generates snapshots of a database, converts them to XML and then stores the XML inside a separate database. Unfortunately, these snapshots are becoming huge files and are now about 10 megabytes each. Fortunately, I only have to store them for about a month before they can be discarded, but still, a month's worth of snapshots turns out to be really bad for performance... I think there is a way to improve performance a lot. No, not by storing the XML in a separate folder somewhere, because I don't have write access to any location on that server; the XML must stay within the database. But perhaps the [Content] field can be optimized somehow so things speed up. I won't need any full-text search options on this field and will never search on it, so perhaps it could be excluded from search/indexing or whatever? The table has no references to other tables, but the structure is fixed: I cannot rename things or change the field types. So I wonder if optimization is still possible. Well, is it? The structure, as generated by SQL Server:
CREATE TABLE [dbo].[Snapshots](
    [Identity] [int] IDENTITY(1,1) NOT NULL,
    [Header] [varchar](64) NOT NULL,
    ...
Performance isn't just slow when selecting data from this table, but also when selecting or inserting data in any of the other tables in this database! When I delete all records from this table, the whole system is fast. When I start adding snapshots, performance starts to decrease. After about 30 snapshots, performance becomes bad and the risk of connection timeouts increases. Maybe the problem isn't in the database itself, although it's still slow when used through the management tool. (It is fast when Snapshots is empty.) I mainly use ASP.NET 3.5 and the Entity Framework to connect to this database and then read the multiple tables. Maybe some performance can be gained there, although that wouldn't explain why the database is also slow from the management tools and when used through other applications with a direct connection...
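One thing I could still try inside the existing [Content] field is compressing the snapshot XML before it is stored and decompressing it when it is read back; since I cannot change the column type, the compressed bytes would have to go back in as a Base64 string. This is only a sketch of the idea, not something I have measured:

using System;
using System.IO;
using System.IO.Compression;
using System.Text;

static class SnapshotCompression
{
    // Gzips the snapshot XML and returns Base64 so it still fits in a text column.
    public static string Compress(string xml)
    {
        byte[] raw = Encoding.UTF8.GetBytes(xml);
        using (MemoryStream output = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(output, CompressionMode.Compress))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            return Convert.ToBase64String(output.ToArray());
        }
    }

    // Reverses Compress: Base64 -> gzip -> original XML string.
    public static string Decompress(string stored)
    {
        byte[] compressed = Convert.FromBase64String(stored);
        using (MemoryStream input = new MemoryStream(compressed))
        using (GZipStream gzip = new GZipStream(input, CompressionMode.Decompress))
        using (MemoryStream output = new MemoryStream())
        {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = gzip.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
            }
            return Encoding.UTF8.GetString(output.ToArray());
        }
    }
}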
I'm currently exporting a database table with a huge amount of data (100,000+ records) into an XML file using the XmlTextWriter class, writing directly to a file on the physical drive.
_XmlTextWriterObject = new XmlTextWriter(_xmlFilePath, null);
While my code runs OK, my question is whether this is the best approach, or whether I should build the whole XML in a memory stream first and then write the XML document to the physical file from that memory stream. What are the effects on memory/performance in each case?
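To illustrate what I mean by the first option, the direct-to-file version is roughly this (simplified): XmlWriter.Create(path) writes through a buffered FileStream, so the whole document never has to sit in memory at once.

using System.Xml;

XmlWriterSettings settings = new XmlWriterSettings();
settings.Indent = true;

// _xmlFilePath is the same physical path used in the snippet above
using (XmlWriter writer = XmlWriter.Create(_xmlFilePath, settings))
{
    writer.WriteStartDocument();
    writer.WriteStartElement("Records");

    // ... write one element per database row here, e.g. while looping over a data reader ...

    writer.WriteEndElement();
    writer.WriteEndDocument();
}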
EDIT
I will indeed be using XmlTextWriter, but I meant to ask whether to pass a physical file path string to the XmlTextWriter constructor (or, as John suggested, to the XmlTextWriter.Create() method) or to use a stream-based API. My current code looks like the following:
I'm using the following CodeProject article to build an ASP.NET website, and so far everything is good. My only problem is that after the barcode is generated, a huge white space exists to the right of the barcode. I've been playing with this and am unable to resolve it.
I have a DataTable which contains thousands of rows. There is a column called EmpID which contains '0' for some of the rows. I want to remove those rows from my current DataTable and create a new, correct DataTable. I don't want to check row by row in code, since the table contains a huge amount of data.
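In case it helps to see what I mean, a one-shot filter like this is what I'm after (assuming EmpID is stored as a string; drop the quotes if it is numeric, and originalTable stands for my existing DataTable):

using System.Data;

// Let the DataTable engine do the filtering instead of looping over every row in code.
DataView view = new DataView(originalTable, "EmpID <> '0'", null, DataViewRowState.CurrentRows);
DataTable cleanedTable = view.ToTable();   // a new DataTable containing only the valid rows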
I have a table with just one column and one row, which holds a single file of about 1.5 GB. The C# application and SQL Server are on different machines. I want to read that file with a DataReader roughly 100 MB at a time, then write each chunk to disk using a FileStream opened with FileMode.Append, collecting everything into one file.
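A sketch of what I am attempting (the connection string, the dbo.Files table and column names, fileId and targetPath are placeholders). With CommandBehavior.SequentialAccess the reader streams the column instead of buffering the whole 1.5 GB row, and each chunk is appended to the same file:

using System.Data;
using System.Data.SqlClient;
using System.IO;

static class BlobDownload
{
    public static void SaveToDisk(string connectionString, int fileId, string targetPath)
    {
        byte[] buffer = new byte[1024 * 1024];   // 1 MB transfer buffer; nothing close to 1.5 GB is ever in memory

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("SELECT FileContent FROM dbo.Files WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", fileId);
            conn.Open();

            using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            using (FileStream output = new FileStream(targetPath, FileMode.Append, FileAccess.Write))
            {
                if (reader.Read())
                {
                    long offset = 0;
                    long bytesRead;
                    // GetBytes pulls the next chunk off the wire and it is appended to the file.
                    while ((bytesRead = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                    {
                        output.Write(buffer, 0, (int)bytesRead);
                        offset += bytesRead;
                    }
                }
            }
        }
    }
}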
I have code in an ASP.NET form that, depending on user entry, needs to create messages in the database. We are talking about potentially thousands of DB entries. How do I protect against deadlocks, apart from using transactions, setting the IsolationLevel to Serializable, and using the WITH (NOLOCK) hint on my SELECT statements (I don't mind a dirty read)?
Event code: 3005
Event message: An unhandled exception has occurred.
Event time: 10/13/2010 10:12:14 PM
Event time (UTC): 10/14/2010 3:12:14 AM
Event ID: a565c58a7f844692859aa21303447c7c
Event sequence: 206
Event occurrence: 1
Event detail code: 0

Application information:
    Application domain: /LM/W3SVC/610100832/Root-12-129314933998593750
    Trust level: Full
    Application Virtual Path: /
    Application Path: D:\Websites\admin.beta.sharedTime.com
    Machine name: SHAREDTIME

Process information:
    Process ID: 3440
    Process name: w3wp.exe
    Account name: NT AUTHORITY\NETWORK SERVICE

Exception information:
    Exception type: SqlException
    Exception message: Transaction (Process ID 56) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

Request information:
    Request URL: http://beta.admin.sharedTime.com/admin_text_mass_send.aspx
    Request path: /admin_text_mass_send.aspx
    User host address: 69.211.10.138
    User:
    Is authenticated: False
    Authentication Type:
    Thread account name: NT AUTHORITY\NETWORK SERVICE

Thread information:
    Thread ID: 10
    Thread account name: NT AUTHORITY\NETWORK SERVICE
    Is impersonating: False
    Stack trace:
        at mtNamespace.mt.createmessage_queue(String phone_number, String text_message, DateTime send_on, String system_name, Double user_no, Double send_priority, String message_type, Boolean returnqueue) in http://server/App_Code/mt.vb:line 1509
        at ASP.admin_text_mass_send_aspx.save_user_values(Object sender, EventArgs e) in http://server/admin_text_mass_send.aspx:line 103
        at System.Web.UI.WebControls.Button.OnClick(EventArgs e)
        at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
        at System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument)
        at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)
        at System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData)
        at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

Custom event details:
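Beyond the isolation level, what I am now considering is keeping the insert transactions short and simply retrying the deadlock victim, since error number 1205 is exactly the "chosen as the deadlock victim" error shown in the log above. A rough sketch of a retry wrapper:

using System.Data.SqlClient;
using System.Threading;

static class SqlRetry
{
    public delegate void SqlWork();

    // Retries the given work only when SQL Server reports a deadlock (error number 1205).
    public static void ExecuteWithDeadlockRetry(SqlWork work)
    {
        const int maxAttempts = 3;

        for (int attempt = 1; ; attempt++)
        {
            try
            {
                work();   // e.g. open a connection and run the message inserts in one short transaction
                return;
            }
            catch (SqlException ex)
            {
                if (ex.Number != 1205 || attempt == maxAttempts)
                    throw;   // not a deadlock, or out of retries

                Thread.Sleep(200 * attempt);   // brief back-off before rerunning the transaction
            }
        }
    }
}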
I have faced a peculiar issue in the production environment. I have a main application named "Configurator"; it has a page which submits an asynchronous request for a batch process. After the batch process is started, it calls a web service to generate a report. The issue is that each time the service gets called, the main application DLL gets copied into the Temporary ASP.NET Files of either the service or the application calling the service, "GenerateReport", which is making the Temporary ASP.NET Files folder as huge as 8 GB per day and in turn is bringing the production server down.
I have tried simulating it in the UAT environment with the same deployed code and the same IIS settings, but I was not able to replicate the issue in UAT. It seems to be a very specific and peculiar issue.
I have many (>600) EPS files (300 KB - 1 MB each) in a database. In my ASP.NET application (using ASP.NET 4.0) I need to retrieve them one by one and call a web service which converts the content to JPEG and updates the database (the JPEGContent column with the JPEG content). However, just retrieving the content for 600 of them takes too long, even from SQL Server Management Studio (about 5 minutes for 10 EPS contents).
So I have two issues:
1) How to get the EPS content (unfortunately, selecting only a certain number of rows at a time is not an option :-( ):
Approach 1:
foreach (DataRow row in dataTable.Rows)
{
    // get the Id and the byte[] EPS content directly from this row
    // call the web method to convert the EPS content to JPEG
}
or
foreach (DataRow row in dataTable.Rows)
{
    // get only the Id of the EPS from this row
    // hit the database again to fetch the content of that EPS by Id
    // call the web method to convert the EPS content to JPEG
}
or
Any other approach?
2) Converting EPS to JPEG using a web method for >600 items. Of course, each call would be a long-running operation. Would the Task Parallel Library (TPL) be a better way to achieve this? (A rough sketch of what I mean is further below.)
Also, is doing the entire thing in a SQL CLR function a good idea?
EDIT: Unfortunately, I have to do this in the ASP.NET application itself; doing it in a separate process such as a Windows service is not an option.
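Coming back to the TPL question in 2), this is roughly the shape I had in mind; the Id/Content column names and the two helper methods are placeholders for my own code, and AsEnumerable/Field need a reference to System.Data.DataSetExtensions:

using System;
using System.Data;
using System.Threading.Tasks;

static class EpsConversion
{
    // Converts every EPS row with a bounded degree of parallelism, so the long-running
    // web method calls overlap without firing all 600+ of them at once.
    public static void ConvertAll(DataTable epsTable)
    {
        ParallelOptions options = new ParallelOptions { MaxDegreeOfParallelism = 8 };

        Parallel.ForEach(epsTable.AsEnumerable(), options, row =>
        {
            int id = row.Field<int>("Id");                // placeholder column names
            byte[] eps = row.Field<byte[]>("Content");

            byte[] jpeg = ConvertEpsToJpeg(eps);          // placeholder: the web method call
            UpdateJpegContent(id, jpeg);                  // placeholder: UPDATE of the JPEGContent column
        });
    }

    static byte[] ConvertEpsToJpeg(byte[] eps) { throw new NotImplementedException(); }
    static void UpdateJpegContent(int id, byte[] jpeg) { throw new NotImplementedException(); }
}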
I am developing a huge web application: almost millions of users, very complex logic, and it needs to be secure and optimized. I am thinking of using a client-side architecture with ASP.NET, similar to what Gmail uses.
How can I create a query that selects only some of the records each time the user wants to display them in a GridView? I don't want to select all the records from my data source; what I exactly want is to simulate pagination in the query itself.
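What I am experimenting with is a ROW_NUMBER()-based query (available from SQL Server 2005 onwards); the dbo.Orders table and its columns are just an example:

using System.Data;
using System.Data.SqlClient;

static class PagedData
{
    // Returns a single page of rows per call, so the GridView never loads the whole table.
    public static DataTable GetPage(string connectionString, int pageIndex, int pageSize)
    {
        const string sql = @"
            SELECT Id, Name, CreatedOn
            FROM (
                SELECT Id, Name, CreatedOn,
                       ROW_NUMBER() OVER (ORDER BY Id) AS RowNum
                FROM dbo.Orders
            ) AS Numbered
            WHERE RowNum BETWEEN (@PageIndex * @PageSize) + 1
                             AND (@PageIndex + 1) * @PageSize;";

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(sql, conn))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@PageIndex", pageIndex);
            adapter.SelectCommand.Parameters.AddWithValue("@PageSize", pageSize);

            DataTable page = new DataTable();
            adapter.Fill(page);   // Fill opens and closes the connection itself
            return page;
        }
    }
}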