C# - Performance When Web.config Is Really Huge?

Apr 6, 2010

Does it have any major effect on performance or memory if my web.config is really huge (say, 1000+ entries in <appSettings>)? Is it a good idea to maintain a separate custom XML config file for all business-specific settings for my app?
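
A minimal sketch of the second option, assuming the business settings live in a separate XML file under App_Data; the file name (BusinessSettings.xml), its shape, and the helper class are hypothetical, not from the question. The parsed values are cached in a static dictionary so the file is read only once per application lifetime:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Hosting;
using System.Xml.Linq;

public static class BusinessSettings
{
    // Loaded once on first access and kept for the application lifetime.
    private static readonly Dictionary<string, string> _settings = Load();

    public static string Get(string key)
    {
        string value;
        return _settings.TryGetValue(key, out value) ? value : null;
    }

    private static Dictionary<string, string> Load()
    {
        // Expected shape (hypothetical): <settings><add key="..." value="..."/></settings>
        string path = HostingEnvironment.MapPath("~/App_Data/BusinessSettings.xml");
        return XDocument.Load(path)
                        .Descendants("add")
                        .ToDictionary(e => (string)e.Attribute("key"),
                                      e => (string)e.Attribute("value"));
    }
}

Whichever file the settings live in, caching them like this keeps per-access cost to a dictionary lookup, so the sheer number of entries matters mostly at application start.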

View 3 Replies


Similar Messages:

Performance Difference In Having Huge Javascript Code Against .js File

Dec 9, 2010

We have an .aspx file which has about 400 lines of JavaScript code. Is it a good idea to move such a large amount of code into its own file? What is the performance difference between having the JavaScript inline in the .aspx page versus in a separate .js file?

View 5 Replies

SQL Server :: How To Increase The Performance Of The Select Query Fetching Data From Huge Temp Table

Jan 25, 2011

I have one stored procedure that generates a report.

For storing the data there I use many temp tables. There are many SELECT queries and somewhat fewer INSERT and DELETE queries.

Now, when there is a large amount of data in a temp table, around 1 lac (100,000) rows, my SELECT queries take too much time, and possibly the INSERT and DELETE queries as well.

I added a primary key on the auto-increment field in each temp table, and also defined a unique clustered index on that primary key to improve performance.

But there is not much improvement in the case of a huge temp table.

Right now the whole stored procedure takes around 1.5 days, or around 30 hours, to complete.

So I want to improve performance enough that it completes in roughly 3-4 hours.

View 39 Replies

C# - Performance Penalty For Storing Values In The Web.config Compared To Const Fields?

Aug 25, 2010

Currently in our application we have an ApplicationConfiguration class. It is basically just a static class with const values specifying certain application-global configuration options. Some of these values will rarely if ever change and are only put into the configuration for elegance/cleanliness.

Some of these values, though, do need to differ between production and development. So I'm considering making this class just a wrapper over Web.config. These values are checked many times throughout our codebase. If I change them from a const to a read-only getter that reads from Web.config, will this affect compiler optimizations or make our application in any way slower?
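
A minimal sketch of such a wrapper; the setting names here are hypothetical. Reading each value once into a static readonly field keeps per-access cost close to a const, the main remaining difference being that consts are baked into call sites at compile time while these are resolved at run time:

using System.Configuration;

public static class ApplicationConfiguration
{
    // Read once from <appSettings> when the class is first used, then cached.
    public static readonly int CacheDurationSeconds =
        int.Parse(ConfigurationManager.AppSettings["CacheDurationSeconds"] ?? "60");

    public static readonly string ReportServerUrl =
        ConfigurationManager.AppSettings["ReportServerUrl"] ?? "http://localhost/reports";
}

ConfigurationManager itself caches the appSettings collection after the first read, so even a plain getter costs only a dictionary lookup; the static readonly fields above remove even that.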

View 1 Replies

Configuration :: .Net Website Performance Of 2 Sites - Rectify The DB Performance While Insertion?

Sep 20, 2010

I have developed a website in ASP.NET Framework 2.0. This website is hosted on two different servers without any change in code. My issue is the performance of these two sites. One site takes much longer to insert data into the DB (SQL Server 2005). The two sites use different DB servers.

I think the issue is with the DB server. How can we rectify the DB performance during insertion, and is there any other possible cause for this performance issue?

View 1 Replies

DataSource Controls :: LINQ Performance Application Performance Is Not Up To Par?

Apr 29, 2010

I am not sure if this is the right forum; I cannot find a forum for LINQ.

I am working on an application using LINQ. Application performance is not up to par, and my tests show that it is the LINQ queries that are slow. Can anybody recommend an article about optimizing LINQ performance, for example by query compilation or other methods?
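
If the queries are LINQ to SQL, one commonly cited optimization is pre-compiling frequently run queries with CompiledQuery.Compile so the expression tree is translated to SQL only once. A minimal sketch, where MyDataContext and Customer are hypothetical LINQ to SQL types standing in for the application's own:

using System;
using System.Data.Linq;
using System.Linq;

public static class CustomerQueries
{
    // Compiled once and reused across calls, avoiding repeated SQL translation.
    public static readonly Func<MyDataContext, int, IQueryable<Customer>> ByRegion =
        CompiledQuery.Compile((MyDataContext db, int regionId) =>
            db.Customers.Where(c => c.RegionId == regionId));
}

// Usage:
// using (var db = new MyDataContext())
// {
//     var customers = CustomerQueries.ByRegion(db, 5).ToList();
// }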

View 1 Replies

WCF / ASMX :: Wcf Performance Vs Page Events Performance?

Mar 20, 2011

I am creating a service-oriented application where I am trying to have everything go through services. However, there is something I am not sure of. I have a page that calls the database at page load, so which would be better and faster: to call the database in Page_Load, or to call a WCF service from JavaScript when the page loads on the client? By the way, I am using a Repeater on the page, but I have created a kind of engine to generate the appropriate HTML, so if I go the WCF route I would build the Repeater's HTML in the service and send it back to the page.

View 1 Replies

Handle Huge Amount Of XML Files?

Nov 29, 2010

I am working on a web project which has around 25,000 XML files. Somewhere in the application we need to compare a part of one XML file against all 25,000 XML files. It is working, but it takes around 25 minutes to do the comparison. How can I reduce this time and compare the files faster?
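
One common approach, sketched below under the assumption that the compared fragment can be reduced to a canonical string, is to hash that fragment for each file once (or on a schedule) and compare hashes instead of re-reading all 25,000 files per request. The element name "Fragment" and the folder layout are hypothetical:

using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;
using System.Text;
using System.Xml.Linq;

public static class XmlFragmentIndex
{
    // Builds a lookup of file path -> hash of the fragment being compared.
    public static Dictionary<string, string> Build(string folder)
    {
        var index = new Dictionary<string, string>();
        using (SHA1 sha = SHA1.Create())
        {
            foreach (string file in Directory.GetFiles(folder, "*.xml"))
            {
                XElement fragment = XDocument.Load(file).Root.Element("Fragment");
                string canonical = fragment == null
                    ? ""
                    : fragment.ToString(SaveOptions.DisableFormatting);
                byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(canonical));
                index[file] = Convert.ToBase64String(hash);
            }
        }
        return index;
    }
}

Comparing then becomes a scan of the dictionary for a matching hash rather than 25,000 XML parses; the index can be cached in memory and refreshed only when files change.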

View 1 Replies

C# - Huge Grid Displayed On .net Page?

Aug 23, 2010

We have an ASP.NET application with a report that shows about 1,000 rows right now and can potentially grow to 20,000. Don't ask me why, but our client does not like paging and does not like filtering; they want to see everything on a single page. Our obvious problem is the load this puts on the server in terms of memory (and also the risk that the client browser may crash).

My question is: if I provide a custom desktop application only for this report, one that can display thousands and thousands of rows (through web services or remoting), would it still clog up the server? For the ASP.NET application, the IIS worker process essentially eats up the memory on the server; but if this desktop app ran separately, calling the same database on the application server, would the server still take the same kind of memory hit?

View 4 Replies

WCF / ASMX :: Getting Huge Number Of Records Above 200,000?

Jan 16, 2011

I have tried to retrieve up to 200,000 records, but beyond that point the IIS worker process exits. Does anybody have a solution for retrieving such a huge number of records? I also need to get the data quickly. What tuning do I have to do in my WCF application?
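
One way to avoid pulling everything across in a single message is to page the data at the service boundary. A minimal sketch, where the contract, Record type, and method names are hypothetical; each response then stays well under the binding's message-size limits:

using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Record
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

[ServiceContract]
public interface IRecordService
{
    // Clients call this repeatedly, advancing pageIndex, instead of asking
    // for 200,000+ records in one message.
    [OperationContract]
    List<Record> GetRecords(int pageIndex, int pageSize);
}

If paging really is not possible, the binding's maxReceivedMessageSize and the serializer's object-graph limits usually need raising, but paging (or streaming) keeps each message small instead of pushing those limits.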

View 3 Replies

.net - OutOfMemoryException When Creating Huge String?

Sep 17, 2010

When exporting a lot of data to a string (CSV format), I get an OutOfMemoryException. What's the best way to tackle this? The string is returned to a Flex application. What I would do is export the CSV to the server disk and give back a URL to Flex; that way I can flush the stream while writing to disk.

Update: the string is built with a StringBuilder:

StringBuilder stringbuilder = new StringBuilder();
string delimiter = ";";
bool showUserData = true;

[code]...
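
The poster's own suggestion, writing the CSV straight to disk instead of accumulating it in a StringBuilder, avoids ever holding the whole export in memory. A minimal sketch, assuming the rows come from a DataTable; the method name, folder, and delimiter handling (no quoting/escaping) are illustrative only:

using System;
using System.Data;
using System.IO;

public static class CsvExport
{
    public static string ExportCsv(DataTable table, string exportFolder)
    {
        string fileName = Path.Combine(exportFolder, Path.GetRandomFileName() + ".csv");
        // StreamWriter flushes to disk as it goes, so memory use stays flat
        // no matter how many rows are exported.
        using (var writer = new StreamWriter(fileName))
        {
            foreach (DataRow row in table.Rows)
            {
                string[] fields = Array.ConvertAll(row.ItemArray, v => Convert.ToString(v));
                writer.WriteLine(string.Join(";", fields));
            }
        }
        return fileName; // hand back a URL/path to the Flex client instead of the content
    }
}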

View 5 Replies

Uploading Huge Media Files?

Dec 7, 2010

I want to upload video files, possibly more than 500 MB each, simultaneously; and if a 500 MB upload stops partway through, I want to be able to resume it from where it stopped.

How can I upload such large video files?
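
Resumable uploads are usually built by sending the file in chunks and having the server append each chunk, so the client only re-sends data after the last byte the server confirms. A minimal server-side sketch of the append step; the handler name, the "file" parameter, and the upload folder are hypothetical:

using System.IO;
using System.Web;

public class ChunkUploadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // One chunk per request; Path.GetFileName strips any directory parts.
        string target = context.Server.MapPath(
            "~/App_Data/uploads/" + Path.GetFileName(context.Request["file"]));

        using (var output = new FileStream(target, FileMode.Append, FileAccess.Write))
        {
            context.Request.InputStream.CopyTo(output);
        }

        // Return the current length so the client knows where to resume from.
        context.Response.Write(new FileInfo(target).Length);
    }
}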

View 1 Replies

Web Forms :: Huge Amount Of Text In Page?

Jan 14, 2010

If we have a huge amount of text with some images, should the text and images be included directly in an ASP.NET 2.0 page alongside other server controls, or should they be pulled in from an outside document such as a Word document? Which is the better method? Can this be done using an XML file? If yes, how? What are the advantages and disadvantages of each method (if any)?

View 2 Replies

Optimizing A Table With A Huge Text-field?

Mar 4, 2011

I have a project which generates snapshots of a database, converts them to XML and then stores the XML inside a separate database. Unfortunately, these snapshots are becoming huge files, now about 10 megabytes each. Fortunately, I only have to keep them for about a month before they can be discarded, but even so, a month of snapshots turns out to be really bad for performance.

I think there is a way to improve performance a lot. No, not by storing the XML in a separate folder somewhere, because I don't have write access to any location on that server. The XML must stay within the database. But somehow the [Content] field might be optimized so things speed up. I won't need any full-text search on this field, and I will never search based on it. So perhaps it would help to disable this field for search, or something similar?

The table has no references to other tables, but the structure is fixed. I cannot rename things or change the field types. So I wonder whether optimization is still possible. Well, is it?

The structure, as generated by SQL Server:

CREATE TABLE [dbo].[Snapshots](
    [Identity] [int] IDENTITY(1,1) NOT NULL,
    [Header] [varchar](64) NOT NULL,
[code]....

Performance isn't just slow when selecting data from this table, but also when selecting or inserting data in the other tables in this database! When I delete all records from this table, the whole system is fast. When I start adding snapshots, performance starts to decrease. After about 30 snapshots, performance becomes bad and the risk of connection timeouts increases.

Maybe the problem isn't in the database itself, although it's still slow when used through the management tool (and fast when Snapshots is empty). I mainly use ASP.NET 3.5 and the Entity Framework to connect to this database and then read the multiple tables. Maybe some performance can be gained there, although that wouldn't explain why the database is also slow from the management tools and when used through other applications with a direct connection.

View 3 Replies

C# - Best Approach To Write Huge Xml Data To File?

Apr 20, 2010

I'm currently exporting a database table with a huge amount of data (100,000+ records) into an XML file using the XmlTextWriter class, and I'm writing directly to a file on the physical drive.

_XmlTextWriterObject = new XmlTextWriter(_xmlFilePath, null);

While my code runs OK, my question is: is this the best approach? Or should I write the whole XML to a memory stream first and then write the XML document to the physical file from the memory stream? And what are the effects on memory and performance in both cases?

EDIT

I will indeed be using XmlTextWriter, but what I meant to ask is whether to pass a physical file path string to the XmlTextWriter constructor (or, as John suggested, to the XmlTextWriter.Create() method) or to use a stream-based API. My current code looks like the following:

[Code]....
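
A minimal sketch of the two variants being weighed (not the poster's actual code): writing straight to the file streams rows to disk as they are written, whereas building the document in a MemoryStream first holds the entire XML in memory until the final copy.

using System.IO;
using System.Xml;

public static class XmlExportSketch
{
    // Variant 1: write directly to the file path. Rows are flushed to disk
    // as they are written, so memory use stays roughly constant.
    public static void WriteDirect(string path)
    {
        using (XmlWriter writer = XmlWriter.Create(path, new XmlWriterSettings { Indent = true }))
        {
            writer.WriteStartElement("rows");
            // ... write each record here ...
            writer.WriteEndElement();
        }
    }

    // Variant 2: build the whole document in a MemoryStream, then copy it to
    // disk. The entire XML (100,000+ records) stays in memory until the write.
    public static void WriteViaMemory(string path)
    {
        using (var buffer = new MemoryStream())
        {
            using (XmlWriter writer = XmlWriter.Create(buffer, new XmlWriterSettings { Indent = true }))
            {
                writer.WriteStartElement("rows");
                // ... write each record here ...
                writer.WriteEndElement();
            }
            File.WriteAllBytes(path, buffer.ToArray());
        }
    }
}

For 100,000+ records the first variant is generally the safer default, since its memory footprint does not grow with the size of the export.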

View 3 Replies

C# - Huge Whitespace Exists To Right After Drawing Image?

Jul 23, 2010

I'm using the following CodeProject article to build an ASP.NET website, and so far everything is good. My only problem is that after the barcode is generated, a huge amount of whitespace exists to the right of the barcode. I've been playing with this and am unable to resolve it.

Details below:

Link to Code Project Article: [URL]

Copy of the Font is here: [URL]

//Working Path
string sWorkPath = "";
sWorkPath = this.Context.Server.MapPath("");
//Fonts................
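
A common cause of trailing whitespace in generated barcode images is a bitmap that is sized wider than the rendered string. A minimal sketch, not tied to the CodeProject article's code, that measures the text first and sizes the bitmap to match (method and parameter names are hypothetical):

using System.Drawing;

public static class BarcodeRenderSketch
{
    public static Bitmap RenderBarcode(string encodedText, Font barcodeFont)
    {
        // Measure the rendered width first so the bitmap is no wider than the text.
        SizeF size;
        using (var probe = new Bitmap(1, 1))
        using (var g = Graphics.FromImage(probe))
        {
            size = g.MeasureString(encodedText, barcodeFont);
        }

        var bitmap = new Bitmap((int)System.Math.Ceiling(size.Width),
                                (int)System.Math.Ceiling(size.Height));
        using (var g = Graphics.FromImage(bitmap))
        {
            g.Clear(Color.White);
            g.DrawString(encodedText, barcodeFont, Brushes.Black, 0f, 0f);
        }
        return bitmap;
    }
}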

View 1 Replies

C# - How To Remove Rows From Huge Data Table Without Iterating It

Jan 28, 2011

I have a DataTable which contains thousands of rows. There is a column called EmpID which contains '0' for some of the rows. I want to remove those rows from my current DataTable, or create a new, corrected DataTable. I cannot check it row by row, since it contains a huge amount of data.
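
A minimal sketch of one way to do this without a hand-written loop, using DataTable.Select with a filter expression and copying the surviving rows into a new table. The column name comes from the question; note that the filtering still iterates internally, it just isn't written by hand:

using System.Data;

public static class DataTableCleanup
{
    public static DataTable RemoveZeroEmpIds(DataTable source)
    {
        // Keep only rows whose EmpID is not '0'. If EmpID is a numeric column,
        // drop the quotes and use "EmpID <> 0" instead.
        DataRow[] validRows = source.Select("EmpID <> '0'");
        return validRows.Length > 0 ? validRows.CopyToDataTable() : source.Clone();
    }
}

CopyToDataTable is the extension method from System.Data.DataSetExtensions, so that assembly needs to be referenced.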

View 6 Replies

ADO.NET :: Reading A Segment Of Huge Varbinary (max) Field By DataReader

Oct 19, 2010

I have a table with just one column and one row, which stores a single file of size 1.5 GB! The C# application and SQL Server are on different machines. I want to read that file with a DataReader 100 MB at a time, save each piece to disk using a FileStream opened with FileMode.Append, and so collect them into one file.
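
A minimal sketch of chunked reading with CommandBehavior.SequentialAccess, which lets SqlDataReader stream the column instead of buffering the whole 1.5 GB. The table and column names are hypothetical, and the chunk size here is 1 MB purely for illustration; the same loop works with any chunk size:

using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class BlobDownload
{
    public static void CopyBlobToFile(string connectionString, string targetPath)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT [Content] FROM [dbo].[Files]", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
            using (var output = new FileStream(targetPath, FileMode.Append, FileAccess.Write))
            {
                if (!reader.Read()) return;

                byte[] buffer = new byte[1024 * 1024]; // chunk size
                long offset = 0;
                long read;
                // GetBytes with SequentialAccess streams the column; the full
                // blob is never held in memory at once.
                while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                {
                    output.Write(buffer, 0, (int)read);
                    offset += read;
                }
            }
        }
    }
}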

View 1 Replies

C# - Avoiding Deadlocks And TimeOuts When Processing Huge Data?

Oct 14, 2010

I have code in an ASP.NET form that needs to create messages in the database, depending on user entry. We are speaking of potentially thousands of DB entries. How do I protect against deadlocks, apart from using transactions with IsolationLevel set to Serializable, and using the WITH (NOLOCK) hint on my SELECT statements (since I don't mind a dirty read)?

Event code: 3005
Event message: An unhandled exception has occurred.
Event time: 10/13/2010 10:12:14 PM
Event time (UTC): 10/14/2010 3:12:14 AM
Event ID: a565c58a7f844692859aa21303447c7c
Event sequence: 206
Event occurrence: 1
Event detail code: 0

Application information:
Application domain: /LM/W3SVC/610100832/Root-12-129314933998593750
Trust level: Full
Application Virtual Path: /
Application Path: D:\Websites\admin.beta.sharedTime.com
Machine name: SHAREDTIME

Process information:
Process ID: 3440
Process name: w3wp.exe
Account name: NT AUTHORITY\NETWORK SERVICE

Exception information:
Exception type: SqlException
Exception message: Transaction (Process ID 56) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.

Request information:
Request URL: http://beta.admin.sharedTime.com/admin_text_mass_send.aspx
Request path: /admin_text_mass_send.aspx
User host address: 69.211.10.138
User:
Is authenticated: False
Authentication Type:
Thread account name: NT AUTHORITY\NETWORK SERVICE

Thread information:
Thread ID: 10
Thread account name: NT AUTHORITY\NETWORK SERVICE
Is impersonating: False
Stack trace:
at mtNamespace.mt.createmessage_queue(String phone_number, String text_message, DateTime send_on, String system_name, Double user_no, Double send_priority, String message_type, Boolean returnqueue) in http://server/App_Code/mt.vb:line 1509
at ASP.admin_text_mass_send_aspx.save_user_values(Object sender, EventArgs e) in http://server/admin_text_mass_send.aspx:line 103
at System.Web.UI.WebControls.Button.OnClick(EventArgs e)
at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
at System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument)
at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)
at System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData)
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

Custom event details:
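
Beyond isolation levels and hints, a common complement is simply retrying when chosen as the deadlock victim, since the error text itself says "Rerun the transaction." A minimal sketch, assuming the insert work (including its transaction) is wrapped in a delegate; error number 1205 is SQL Server's deadlock-victim code:

using System;
using System.Data.SqlClient;
using System.Threading;

public static class DeadlockRetry
{
    public static void Execute(Action insertWork)
    {
        const int maxAttempts = 3;
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                insertWork();
                return;
            }
            catch (SqlException ex)
            {
                // 1205 = chosen as deadlock victim; anything else is rethrown.
                if (ex.Number != 1205 || attempt >= maxAttempts) throw;
                Thread.Sleep(200 * attempt); // brief backoff, then rerun the transaction
            }
        }
    }
}

Keeping each transaction short and touching tables in a consistent order also reduces how often the retry is needed in the first place.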

View 1 Replies

Configuration :: Temporary Files Getting Created With Huge Amount Of Space?

Mar 29, 2010

I have faced a peculiar issue in the production environment. I have a main application named "Configurator". It has a page which submits an asynchronous request for a batch process. After the batch process starts, it calls a web service to generate a report. The issue is that every time the service gets called, the main application DLL gets copied into the Temporary ASP.NET Files folder, either for the service or for the application calling the service (named "GenerateReport"). This is making the Temporary ASP.NET Files folder as large as 8 GB per day, which in turn is bringing the production server down.

I have tried simulating it in the UAT environment with the same deployed code and the same IIS settings, but I was not able to replicate the issue in UAT. It seems to be a very specific and peculiar issue.

View 4 Replies

Forms Data Controls :: Export Huge Records To Excel File?

May 27, 2010

I am trying to export a huge record set (more than 3,000,000 records) to an Excel file.

I have one class file which has a function ExportToExcell. It is working fine for small record sets, but for big ones I get the following error:

Unable to evaluate expression because the code is optimized or a native frame is on top of the call stack.

The function is as follows: the first string is the file name, the other three are for headings in the Excel file, and the DataTable is the data to export.

I get all the data into the DataTable, but I get the error when trying to export to Excel.

[Code]....

View 6 Replies

DataSource Controls :: Reading A Huge File From Data Base?

Oct 19, 2010

I have a 1.5 GB file stored in a table. I want to read it from the server and create a file stream from it in order to save it.

When I use DataAdapter.Fill(DataSet), an OutOfMemoryException occurs.

What should I do to read a huge file from the database?

View 8 Replies

C# - Retrieving Many Huge Sized EPS Files And Converting Them To JPEG In Application?

Mar 23, 2010

I have many (>600) EPS files (300 KB - 1 MB each) in a database. In my ASP.NET application (using ASP.NET 4.0) I need to retrieve them one by one and call a web service which converts the content to a JPEG and updates the database (the JPEGContent column with the JPEG content). However, retrieving the content for the 600 files is itself slow, even from SQL Server Management Studio (it takes 5 minutes for 10 EPS contents).

So I have two issues:-

1) How to get the EPS content (unfortunately, selecting only a certain number of the contents at a time is not an option :-( ):

Approach 1:-

// dataTable holds all EPS rows (Id and content) loaded up front
foreach (DataRow row in dataTable.Rows)
{
    // get the Id and byte[] of the EPS from the row
    // call the web method to convert the EPS content to JPEG
}

or

// dataTable holds only the Ids; content is fetched per row
foreach (DataRow row in dataTable.Rows)
{
    // get only the Id of the EPS from the row
    // hit the database to get the content of that EPS
    // call the web method to convert the EPS content to JPEG
}

or

Any other approach?

2) Converting EPS to JPEG using a web method for >600 contents. Of course, each call would be a long-running operation. Would the Task Parallel Library (TPL) be a better way to achieve this?

Also, is doing the entire thing in a SQL CLR function a good idea?

EDIT: Unfortunately, I have to do this in the ASP.NET application itself; doing it in a separate process such as a Windows service is not an option.
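
On question 2, a minimal TPL sketch, assuming the second approach (fetch the content per Id inside the loop) and hypothetical helper methods standing in for the database reads, the web-service call, and the update; MaxDegreeOfParallelism is capped so 600 concurrent web-service calls are not fired at once:

using System.Collections.Generic;
using System.Threading.Tasks;

public static class EpsConversionSketch
{
    // GetEpsContent, ConvertEpsToJpeg and SaveJpegContent are hypothetical
    // stand-ins for the database read, the web-service call and the update.
    public static void ConvertAll(IEnumerable<int> epsIds)
    {
        var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
        Parallel.ForEach(epsIds, options, id =>
        {
            byte[] eps = GetEpsContent(id);        // hit the database for one row
            byte[] jpeg = ConvertEpsToJpeg(eps);   // long-running web-service call
            SaveJpegContent(id, jpeg);             // write JPEGContent back
        });
    }

    static byte[] GetEpsContent(int id) { /* ... */ return new byte[0]; }
    static byte[] ConvertEpsToJpeg(byte[] eps) { /* ... */ return new byte[0]; }
    static void SaveJpegContent(int id, byte[] jpeg) { /* ... */ }
}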

View 1 Replies

Javascript - Build Huge Website Using Ajax Jquery Sqlserver?

Apr 4, 2011

I am developing a huge web application, with almost millions of users and very complex logic, that needs to be secure and optimized. I am thinking of using a client-side architecture with ASP.NET, like Gmail uses.

View 2 Replies

How To Display Huge Amount Of Data By A Gridview / By Not Selecting Entire Data From Datasource

Feb 21, 2010

How can I create a query that selects only some records each time the user wants to display them in a GridView? I don't want to select all of the records from my data source. What I want, exactly, is to simulate pagination in the query.
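
A minimal sketch, assuming a LINQ-capable data source (LINQ to SQL or Entity Framework; MyDataContext and Product are hypothetical names): Skip/Take translates to a paged SQL query, so only one page of rows is pulled per request.

using System.Collections.Generic;
using System.Linq;

public static class ProductPaging
{
    public static List<Product> GetPage(MyDataContext db, int pageIndex, int pageSize)
    {
        // Only pageSize rows cross the wire; the paging happens in the generated SQL.
        return db.Products
                 .OrderBy(p => p.Id)          // a stable order is required before Skip/Take
                 .Skip(pageIndex * pageSize)
                 .Take(pageSize)
                 .ToList();
    }
}

The GridView is then bound to one page at a time, with the page index coming from its paging (or custom) UI.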

View 1 Replies






