Optimizing Performance Of Large .NET Applications
Feb 1, 2010
I'm building an ASP.NET web application with lots and lots of controls and huge volumes of data. My application is very slow, and it is taking a large amount of time to load the data into .NET controls like grids and tree views. I also have some AJAX-enabled pages and controls in my application. I want to reduce the page load time on each postback. What are the standards/best practices to follow while developing large ASP.NET applications?
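For example, two common first steps are caching the expensive query result and paging the grid so a postback renders only one page of rows. A minimal code-behind sketch (control and helper names are hypothetical):

using System;
using System.Data;

public partial class ProductsPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Cache the expensive query result so repeated loads skip the database.
            DataTable products = (DataTable)Cache["Products"];
            if (products == null)
            {
                products = LoadProductsFromDatabase();   // hypothetical data-access helper
                Cache.Insert("Products", products, null, DateTime.Now.AddMinutes(5),
                             System.Web.Caching.Cache.NoSlidingExpiration);
            }
            // With AllowPaging="true" PageSize="50" on the grid in markup,
            // only one page of rows is rendered per postback.
            ProductsGrid.DataSource = products;
            ProductsGrid.DataBind();
        }
    }
}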
View 3 Replies
Similar Messages:
Oct 17, 2010
I have a site running on .NET 4 and it seems sluggish compared to its .NET 3 counterpart, even though it is running on a faster box. Is there some type of guide of recommended optimizations on the server to increase performance?
View 2 Replies
Feb 20, 2011
I have about 500,000 records. I want to display the records according to a GROUP BY. It takes about 2 minutes. Is that good?
If not, what is the best way to display the records very fast?
Moreover, I have encrypted one field in the database, and the encryption is done from my code-behind, so when I display the records I have to decrypt them. That's why I page at 100 records per page.
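One option (a sketch, not a confirmed fix): page at the database instead of in code, so only the 100 rows actually displayed are fetched and decrypted. Table and column names below are hypothetical; ROW_NUMBER requires SQL Server 2005 or later.

using System.Data.SqlClient;

string sql = @"
    SELECT * FROM (
        SELECT ROW_NUMBER() OVER (ORDER BY GroupName, Id) AS RowNum, *
        FROM Records
    ) AS Paged
    WHERE RowNum BETWEEN @From AND @To;";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@From", pageIndex * 100 + 1);
    cmd.Parameters.AddWithValue("@To", (pageIndex + 1) * 100);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // Bind the reader to the grid; decrypt the encrypted column
        // for these 100 rows only, not for all 500,000.
    }
}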
View 4 Replies
Sep 21, 2010
Imagine I have a class MyTestClass, and I need an instance of this type throughout my whole web application. There are several possibilities to accomplish this.
1. Make MyTestClass static, and have it contain static methods only.
This is probably the most performant solution, but I'm not feeling comfortable about using static fields. What about thread safety? What if my static class contained a static System.Collections.Queue?
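For illustration, a sketch of what a thread-safe static queue could look like (one option among several): System.Collections.Queue ships a synchronized wrapper, and compound operations still need an explicit lock.

using System.Collections;

public static class MyTestClass
{
    // The wrapper locks internally, so single operations are thread-safe.
    private static readonly Queue queue = Queue.Synchronized(new Queue());

    public static void Enqueue(object item)
    {
        queue.Enqueue(item);
    }

    public static object DequeueOrNull()
    {
        // Check-then-act must be atomic, so lock around the pair of calls.
        lock (queue.SyncRoot)
        {
            return queue.Count > 0 ? queue.Dequeue() : null;
        }
    }
}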
View 8 Replies
Feb 19, 2011
I have a web performance test which contains a request whose response is greater than 5 MB, and the Extract Hidden Fields rule fails to find the (necessary and required!) hidden fields in the response. The response headers contain:
HTTP/1.1 200 OK
Transfer-Encoding: chunked
Vary: Accept-Encoding, User-Agent
Cache-Control: private
Content-Type: text/plain; charset=utf-8
Date: Sat, 19 Feb 2011 15:24:38 GMT
Server: Microsoft-IIS/6.0
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
Other than that and the response size, there is nothing remarkable about this scenario. In fact, this same test succeeds when a smaller data set is used. I suspect the Web Performance Test framework is having issues parsing the "chunked" encoding or the sheer volume of data. Ahem, how can I obtain these required hidden fields from my response? I.e., resolutions, workarounds, converting auto-extraction to manual, etc.
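One workaround that comes to mind (a hedged sketch, not a confirmed fix for the chunked case): replace the failing rule with a custom extraction rule that regexes the hidden field out of the body. The class name and regex are illustrative; on older Visual Studio versions the ExtractionRule base class also requires overriding RuleName.

using System.Text.RegularExpressions;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class ManualHiddenFieldRule : ExtractionRule
{
    public string FieldName { get; set; }   // e.g. "__VIEWSTATE"

    public override void Extract(object sender, ExtractionEventArgs e)
    {
        Match m = Regex.Match(e.Response.BodyString,
            "name=\"" + Regex.Escape(FieldName) + "\" value=\"(?<v>[^\"]*)\"");
        if (m.Success)
        {
            // Store the value in the test context for later requests to use.
            e.WebTest.Context[FieldName] = m.Groups["v"].Value;
            e.Success = true;
        }
    }
}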
View 1 Replies
Sep 13, 2010
It is a very large .txt file (more than 3 MB), produced every day; the content is the user's system log, like the sample below:
2007-11-01 18:20:42,983 [4520] INFO GetXXX() SERVICE START
2007-11-01 18:21:42,983 [4520] WARNING USER ACCESS DENIED
2007-11-01 18:22:42,983 [4520] ERROR INPUT PARAMETER IS NULL CAN NOT CONVERT TO INT32
2007-11-01 18:23:59,968 [4520] INFO USER LOGOUT
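A minimal sketch of how such a file could be read without loading all 3 MB at once (path and filter are hypothetical): stream it line by line with a StreamReader.

using System.IO;

using (var reader = new StreamReader(@"C:\logs\system.log"))   // hypothetical path
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // Keep only the lines of interest; the format matches the sample above:
        // date time,millis [thread] LEVEL message
        if (line.Contains(" ERROR "))
        {
            // parse timestamp, thread id and message here
        }
    }
}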
View 14 Replies
Nov 5, 2010
I tried to find out about this subject, but with no success. The point is that early on I made many user controls. My site is too slow. I have no idea yet whether the user controls are the cause.
View 7 Replies
Mar 4, 2011
I have a project which generates snapshots of a database, converts them to XML, and then stores the XML inside a separate database. Unfortunately, these snapshots are becoming huge files and are now about 10 megabytes each. Fortunately, I only have to store them for about a month before they can be discarded, but still, a month of snapshots turns out to be really bad for the database's performance.

I think there is a way to improve performance a lot. No, not by storing the XML in a separate folder somewhere, because I don't have write access to any location on that server. The XML must stay within the database. But somehow the [Content] field might be optimized so things speed up. I won't need any full-text search options on this field; I will never do any searching based on it. So perhaps I could disable this field for search instructions or whatever?

The table has no references to other tables, but the structure is fixed. I cannot rename things or change the field types. So I wonder if optimization is still possible. Well, is it? The structure, as generated by SQL Server:
CREATE TABLE [dbo].[Snapshots](
    [Identity] [int] IDENTITY(1,1) NOT NULL,
    [Header] [varchar](64) NOT NULL,
    [code]....
Performance isn't just slow when selecting data from this table, but also when selecting or inserting data in one of the other tables in this database! When I delete all records from this table, the whole system is fast. When I start adding snapshots, performance starts to decrease. After about 30 snapshots, performance becomes bad and the risk of connection timeouts increases.

Maybe the problem isn't in the database itself, although it's still slow when used through the management tool. (It's fast when Snapshots is empty.) I mainly use ASP.NET 3.5 and the Entity Framework to connect to this database and then read the multiple tables. Maybe some performance can be gained there, although that wouldn't explain why the database is also slow from the management tools and when used through other applications with a direct connection.
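One direction that might be worth testing (an assumption, not a confirmed fix): if [Content] is an xml or (n)varchar(max) column, SQL Server 2005+ can push those large values off-row, so scans and inserts touching the rest of the row stay small.

-- Hedged suggestion: store large values outside the data row.
-- Only applies if [Content] is varchar(max), nvarchar(max), varbinary(max) or xml.
EXEC sp_tableoption 'dbo.Snapshots', 'large value types out of row', 1;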
View 3 Replies
Sep 20, 2010
I have developed a website in ASP.NET (framework 2.0). This website is hosted on two different servers without any change in code. My issue is the performance of these 2 sites: one website is taking much more time to insert data into the DB (SQL Server 2005). The two websites use different DB servers.
I think the issue is with the DB server. How can we rectify the DB performance during insertion, and is there any other cause for this performance issue?
View 1 Replies
Apr 29, 2010
I am not sure if this is the right forum; I cannot find a forum for LINQ.
I am working on an application using LINQ. Application performance is not up to par, and my tests show that it is the LINQ queries that are slow. I was wondering if anybody can recommend an article about optimizing LINQ performance, maybe by compilation or other methods.
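One technique that often comes up is compiled queries, which translate the expression tree to SQL once instead of on every execution. A minimal LINQ to SQL sketch (context and entity names are hypothetical):

using System;
using System.Data.Linq;
using System.Linq;

static class Queries
{
    // Compiled once; each call reuses the translated SQL.
    public static readonly Func<MyDataContext, string, IQueryable<Customer>> CustomersByCity =
        CompiledQuery.Compile((MyDataContext db, string city) =>
            db.Customers.Where(c => c.City == city));
}

// usage: var rows = Queries.CustomersByCity(db, "London").ToList();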
View 1 Replies
Jun 7, 2010
I've worked with web services so far, and I'm interested in expanding my services to console applications as well, so I started digging into WCF. But I'm concerned that I won't be able to use the HttpContext collection that I'm used to with web services. One important thing is generating a value from HttpContext.Current.Request.ServerVariables["ALL_HTTP"], which I need in order to reckon whether it's the same (or at least a nearby) machine that is calling my service. How can I overcome this problem?
I need to know what machine is calling so I can count the number of attempts to log into my system, for example. So it must be done inside the .svc code; otherwise, if I let the client report what IP address or what computer it is using, anyone could forge that argument and pose as another machine. Maybe I'm approaching this matter wrongly and should count the number of attempts per session state instead, but how is that done?
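For what it's worth, WCF on .NET 3.5+ does expose the caller's address without HttpContext; a minimal sketch (and, as noted above, an address is still spoofable behind proxies or NAT):

using System.ServiceModel;
using System.ServiceModel.Channels;

// Inside a service operation:
var endpoint = (RemoteEndpointMessageProperty)OperationContext.Current
    .IncomingMessageProperties[RemoteEndpointMessageProperty.Name];

string callerAddress = endpoint.Address;
int callerPort = endpoint.Port;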
View 1 Replies
Mar 20, 2011
I am creating a service-oriented application, trying to have everything use services. However, there is something I am not sure of. I have a page that calls the database at page load, so which would be better and faster: calling the database in Page_Load, or calling a WCF service from JavaScript during the JavaScript load? By the way, I am using a repeater on the page, but I have created a kind of engine to create the suitable HTML, so if I use a WCF service at the start I'll be creating the repeater's HTML in the WCF service and sending it back to the page.
View 1 Replies
Apr 22, 2010
Our corporate intranet is designed so that each web application is a child application under the primary application. Everything has worked fine with Visual Studio 2008, and even in 2010 running the website locally works great: the output directory for the child apps is ..\bin, and ProjectName.dll copies to that directory. When I do a publish, however, it does not, and I have to manually copy the DLL from the project's bin folder to the parent bin folder. This isn't hard, of course, but it's a pain each time I need to publish something. I made sure the output directory is correct for both debug and release, yet on publish it just copies to the child bin and not the parent bin as needed.
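A workaround sketch (an assumption, not a confirmed fix): add a copy step to the project file so every build/publish also drops the assembly into the parent bin. The relative path is hypothetical.

<!-- In the child project's .csproj -->
<Target Name="AfterBuild">
  <Copy SourceFiles="$(TargetPath)"
        DestinationFolder="$(SolutionDir)ParentSite\bin" />
</Target>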
View 2 Replies
May 20, 2010
I am working on an ASP.NET 3.5, C#, ASP.NET AJAX, jQuery web application which currently has some performance issues. I have a screen with a FormView control; when I edit the form and save it for the first time, the POST takes 4.89 seconds, and when I perform the same operation a second time, the POST takes only 1.09 seconds. What could be the reason for this strange behavior?
Note: I am using firebug and I am disregarding the page resources (like js, images) load time.
EDIT: I am using the Web Deployment Project to precompile the application.
View 3 Replies
Mar 14, 2011
I'm now getting into EF, and from what I'm seeing so far, I wouldn't have to worry about writing stored procedures anymore. It looks like EF takes care of all of that, including INSERTs that store data in multiple tables. One of the things they taught us when learning stored procedures was that they're compiled, which has performance benefits. How does EF 4 fare against using stored procedures?
View 3 Replies
Oct 12, 2010
I'm trying to make a decision about how to display my data. What I have now is a list of products displayed in a repeater. But for code maintenance I've put my product items in a separate user control and load them in a loop over the DB results using LoadControl. The product control itself is very simple, just a few public properties like title, url, rating, but I'm not sure if this will affect my performance. I did some reading here and on forums, and some people say it's not best practice, especially if you have more than 20 or 30 of these controls. So, is it really a performance hit using this method, or does it stay OK with around 10,000 hits a day?
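For reference, a sketch of the pattern in question (control path, type, and properties are hypothetical); a cheap first optimization to test is disabling ViewState per loaded control:

// In the page or parent control, looping over the query results:
foreach (DataRowView row in productsView)
{
    var item = (ProductItem)LoadControl("~/Controls/ProductItem.ascx");
    item.Title = (string)row["Title"];
    item.Url = (string)row["Url"];
    item.EnableViewState = false;   // per-control ViewState is the usual cost driver
    ProductsPlaceholder.Controls.Add(item);
}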
View 2 Replies
Feb 16, 2011
I have developed a gridview which displays 8,000 records. However, the user is complaining that it is too slow. A major function of this gridview is to filter on two of the columns. It seems to me that if the records were stored in memory it would be much faster. This is caching, I think. Is that the case? Since filtering is being done, I don't think custom paging will help much: when they submit a filter, wouldn't custom paging read the entire table again? So I was thinking of trying caching, which is available on SQL Server. Am I on the right track? Can you cache with an Access database? Can you think of any other ways to improve the performance of a gridview?
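To illustrate the idea (names hypothetical): this would use ASP.NET's own cache rather than SQL Server caching, which also means it works the same over an Access database. Keep the 8,000 rows in memory and filter them with a DataView, so only the first request hits the database.

using System.Data;

DataTable products = (DataTable)Cache["Products"];
if (products == null)
{
    products = LoadAllProducts();   // hypothetical data-access call
    Cache.Insert("Products", products, null, DateTime.Now.AddMinutes(10),
                 System.Web.Caching.Cache.NoSlidingExpiration);
}

// Filter the in-memory rows on the two columns instead of re-querying.
var filtered = new DataView(products)
{
    RowFilter = string.Format("Category = '{0}' AND Status = '{1}'",
        category.Replace("'", "''"), status.Replace("'", "''"))
};
GridView1.DataSource = filtered;
GridView1.DataBind();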
View 1 Replies
Jul 29, 2010
I did not post this in the database forum because the data will run on an ASP.NET site. I have 3 solutions in mind and would like some opinions. What I want to do is create a parent-child relationship between 2 controls, a listbox and a textbox. I will populate the listbox with data (this is taken care of, so ignore it). When I click on an item in the listbox, the textbox should show its joined related item (one item only). This is not a problem in a Windows Forms app, but in ASP.NET, since I'm new, I don't know what would be better for faster retrieval on the page. The 3 options I have in mind are:
1) DAL. The standard approach: create a query and let one @parameter wait for the id.
2) ADO.NET, possibly with an SqlDataSource. In Windows Forms I would have chosen simple ADO.NET, but in ASP.NET? So either constantly open/close the DB and retrieve the item, or use an SqlDataSource and change the parameter.
3) Asynchronous handler page. Bind the textbox to an asynchronous page that contains the connection and expects the @id parameter.
I admit that I'm not fond of the DAL, but if it will boost speed then I will use it. I have a feeling, though, that simple ADO.NET will be faster.
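A minimal sketch of option 2, with hypothetical table and control names; opening and closing per request is cheap because ADO.NET pools the underlying connections:

using System.Data.SqlClient;

string sql = "SELECT RelatedText FROM Details WHERE ParentId = @id";
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.AddWithValue("@id", int.Parse(MyListBox.SelectedValue));
    conn.Open();
    MyTextBox.Text = (string)(cmd.ExecuteScalar() ?? "");
}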
View 16 Replies
Nov 19, 2010
I have a strange situation on a production server. Connections for ASP.NET get queued, but the CPU is only at 40%. The database also runs fine, at 30% CPU.
Some more history as requested in the comments:
In the peak hours the site gets around 20,000 visitors an hour.
The site is an ASP.NET WebForms application with a lot of AJAX/POSTs
The site uses a lot of user-generated content
We measure the performance of the site with a test page which hits the database and the web services used by the site. This page gets served within a second under normal load. We define the application as slow when a request takes more than 4 seconds.
From the measurements we can see that the connection time is fast, but the processing time is large.
We can't pinpoint the slow response to a single request; the site runs fine during normal hours but gets slow during peak hours
We had a problem where the site was CPU bound (i.e., running at 100%); we fixed that
We also had problems with exceptions making the appdomain restart; we fixed that too
During peak hours I take a look at the ASP.NET performance counters. We can see that we have 600 current connections with 500 queued connections.
At peak times the CPU is around 40% (which makes me the think that it is not CPU bound)
Physical memory is around 60% used
At peak times the DatabaseServer CPU is around 30% (which makes me think it is not Database bound)
My conclusion is that something else is stopping the server from handling the requests faster. Possible suspects:
Deadlocks (!syncblk only gives one lock)
Disk I/O (checked via Sysinternals Process Explorer: 3.5 MB/s)
Garbage collection (10~15% during peaks)
Network I/O (connect time still low)
To find out what the process is doing, I created minidumps. I managed to capture two memory dumps 20 seconds apart. This is the output of the first:
!threadpool
CPU utilization 6%
Worker Thread: Total: 95 Running: 72 Idle: 23 MaxLimit: 200 MinLimit: 100
Work Request in Queue: 1
Number of Timers: 64
and the output of the second:
!threadpool
CPU utilization 9%
Worker Thread: Total: 111 Running: 111 Idle: 0 MaxLimit: 200 MinLimit: 100
Work Request in Queue: 1589
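Given those numbers (workers climbing toward the limit, a growing queue, low CPU), one direction I'm considering is classic thread-pool starvation from blocking calls. A hedged sketch of the machine.config knob involved (values are illustrative and per-CPU, not a confirmed fix; the real cure would be making the blocking calls asynchronous):

<!-- machine.config -->
<system.web>
  <processModel autoConfig="false"
                minWorkerThreads="50"
                maxWorkerThreads="200" />
</system.web>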
View 3 Replies
Nov 12, 2010
What are the performance testing best practices for an ASP.NET application?
View 1 Replies
Jan 12, 2010
I'm trying to add some performance counters to my ASP.NET website. Now, I know how to increment/decrement a custom counter I've made, but my problem is getting my ASP.NET website to create these counters: if they do not exist (e.g., I do this check in the Global.asax application start method), I add/create them.
But, it doesn't work - access to the registry is denied/forbidden.
I'm assuming this is because the ASP.NET process is so stripped down (for security) that you can't touch that type of thing. Therefore, I'm wondering if the only other solution is to make a quick console or WinForms app which does one thing: add the perf counters. Running this as my normal logged-in user would mean I have admin rights, so it will work. Or is there something else I can do?
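A minimal sketch of that one-shot console installer idea (category and counter names are hypothetical); run it once with admin rights, and the website only ever increments existing counters:

using System.Diagnostics;

class InstallCounters
{
    static void Main()
    {
        const string category = "MyWebSite";
        if (!PerformanceCounterCategory.Exists(category))
        {
            var counters = new CounterCreationDataCollection
            {
                new CounterCreationData("Logins", "Successful logins",
                                        PerformanceCounterType.NumberOfItems32)
            };
            PerformanceCounterCategory.Create(category, "Site counters",
                PerformanceCounterCategoryType.SingleInstance, counters);
        }
    }
}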
View 2 Replies
Apr 6, 2010
Does it have any major effect on performance/memory if my web.config is really huge (say, 1,000+ entries in <appSettings>)? Is it a good idea to maintain a separate custom XML config file for all the business-specific settings of my app?
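One built-in option (a sketch; the external file name is hypothetical): appSettings supports a file attribute, so the business settings can live in their own file and merge with web.config at runtime. Config is parsed once and then cached, so size mostly costs readability rather than per-request time, and edits to the external file do not force an application restart the way web.config edits do.

<!-- web.config -->
<appSettings file="BusinessSettings.config">
  <add key="CoreSetting" value="..." />
</appSettings>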
View 3 Replies
Sep 9, 2010
I am using ASP.NET AJAX AutoComplete to load the data. I am fetching data from a DataSet which is stored in Session; the table is approximately 1,000 rows. I use a DataView and Table.Select("Data LIKE '" + inputvalue + "'") to filter the data on each keystroke. I am wondering: what's the best and fastest way to store and fetch data for autocomplete?
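For comparison, a sketch of a cheaper per-keystroke lookup (names hypothetical): copy the ~1,000 values once into a sorted List<string> in Session, then prefix-match with a binary search instead of running Table.Select on every call.

using System;
using System.Collections.Generic;
using System.Linq;

// The list must have been sorted once with the same comparer before storing it.
var values = (List<string>)Session["AutoCompleteValues"];

int start = values.BinarySearch(prefix, StringComparer.OrdinalIgnoreCase);
if (start < 0) start = ~start;   // first element >= prefix

List<string> matches = values
    .Skip(start)
    .TakeWhile(v => v.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
    .Take(10)                    // cap the suggestion list
    .ToList();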
View 5 Replies
Sep 7, 2010
Does model binding increase performance that much? I have a large table which I have split into 4 details/edit screens, and I want to load only the information that is needed on the particular screen.
View 1 Replies
Nov 22, 2010
I'm using MySQL with ASP.NET 2.0. Every page which has a database operation takes a long time to complete. How can I increase the performance?
View 1 Replies