WCF / ASMX :: Getting Huge Number Of Records (Above 200,000)?
Jan 16, 2011
I have tried to retrieve up to 200,000 records; beyond that, IIS exits. Does anybody have a solution for retrieving huge volumes of data? I also need to get the data quickly. What tuning do I have to do in my WCF application?
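As a first pass, the quotas on the binding are what usually cut a large response off. A minimal sketch, assuming a BasicHttpBinding endpoint (the binding type and the limits here are assumptions, not from the original post):

    // Hedged sketch: raise the quotas that silently abort large responses.
    // Requires: using System; using System.ServiceModel;
    var binding = new BasicHttpBinding
    {
        MaxReceivedMessageSize = 100L * 1024 * 1024, // raise the 64 KB default
        TransferMode = TransferMode.Streamed,        // stream instead of buffering
        SendTimeout = TimeSpan.FromMinutes(10)
    };
    binding.ReaderQuotas.MaxArrayLength = int.MaxValue;
    binding.ReaderQuotas.MaxStringContentLength = int.MaxValue;

Even with raised quotas, shipping 200,000 rows in one message stays fragile; paging the result set on the server is usually the more robust fix.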
I have an XML document and want to pass it through a WCF RESTful service. I was able to do that in string format, i.e. passing the XML as a string parameter to the RESTful service, and it worked. But now my XML document has become huge, so if I pass it as a string I get a "URI too long" error. Is there any way I can pass the huge XML document through the WCF RESTful service?
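One hedged way around this is to stop putting the XML in the URI at all and POST it as the request body, which WCF REST exposes as a raw Stream parameter. An illustrative sketch (service and operation names are made up):

    // Hedged sketch. Requires: using System.IO; using System.ServiceModel;
    // using System.ServiceModel.Web;
    [ServiceContract]
    public interface IXmlIntake
    {
        [OperationContract]
        [WebInvoke(Method = "POST", UriTemplate = "/documents")]
        void UploadXml(Stream xmlBody);
    }

    public class XmlIntakeService : IXmlIntake
    {
        public void UploadXml(Stream xmlBody)
        {
            // Read the body without ever holding it as one giant string.
            var doc = System.Xml.Linq.XDocument.Load(xmlBody);
            // ... process doc ...
        }
    }

Pair this with webHttpBinding and, for very large documents, transferMode="Streamed" on the binding.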
I am using one text box on a New Registration page. Whenever the user clicks the New Registration button, the record number in the text box should be auto-incremented.
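A hedged sketch of the click handler (the control names and the GetMaxRecordNumber helper are hypothetical):

    // Hedged sketch; GetMaxRecordNumber() is a hypothetical data-access helper,
    // e.g. one that runs SELECT MAX(RecordNo) FROM Registrations.
    protected void btnNewRegistration_Click(object sender, EventArgs e)
    {
        int next = GetMaxRecordNumber() + 1;
        txtRecordNumber.Text = next.ToString();
    }

Note that computing MAX+1 in the page is race-prone if two users register at once; letting the database assign an IDENTITY value on insert, then displaying it, is safer.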
I have a radio button list that, on selection, queries a table and displays data in a textarea. Now I'm trying to enhance my code so that the value of the radio button appears in the URL; that way I can send users links to content displayed by the radio selection (i.e. articles).
So far I've been able to place the "Article_PK" in the URL successfully when the user makes a selection. Now I'm just stuck on how to get my databind working again. Basically, I need to read the value of "Article_PK" from the URL to display the data in the textarea. I need someone with some mad coding skills to save the day!
Code-behind:
public partial class frm_Articles : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    [code]......
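A hedged sketch of what the missing Page_Load body could look like — reading Article_PK back out of the query string and rebinding (the connection string, control, table, and column names are assumptions based on the description):

    // Hedged sketch; connectionString is assumed to exist elsewhere.
    // Requires: using System.Data.SqlClient;
    protected void Page_Load(object sender, EventArgs e)
    {
        if (IsPostBack) return;

        int id;
        if (int.TryParse(Request.QueryString["Article_PK"], out id))
        {
            using (var cn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "SELECT Article_Body FROM Articles WHERE Article_PK = @id", cn))
            {
                cmd.Parameters.AddWithValue("@id", id);
                cn.Open();
                txtArticle.Text = (string)cmd.ExecuteScalar();
            }
        }
    }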
I need to start reading a recordset from a specific record using SQL. For example: I have 10 records in my MS Access table and I'd like to start reading from record number 6 to EOF.
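Access SQL has no OFFSET clause, so a common workaround is a nested TOP subquery that excludes the first N rows. A hedged sketch (table name, column names, and the connection string are made up):

    // Hedged sketch: skip the first 5 rows via a nested TOP.
    // Requires: using System.Data.OleDb;
    string sql = "SELECT * FROM MyTable " +
                 "WHERE ID NOT IN (SELECT TOP 5 ID FROM MyTable ORDER BY ID) " +
                 "ORDER BY ID";
    using (var cn = new OleDbConnection(accessConnectionString))
    using (var cmd = new OleDbCommand(sql, cn))
    {
        cn.Open();
        using (var rdr = cmd.ExecuteReader())
            while (rdr.Read())
            {
                // rows 6 .. EOF
            }
    }

This assumes a column (ID here) that defines the record order; without one, "record number 6" has no stable meaning.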
I have a data grid with page size 15. I need to find the total number of records in the grid. How do I do it? There are two ways I can see:
1) Use LINQ: call the stored procedure again and use the .Count() property. But I don't want to make another call to the database.
2) Create our own logic, like I have done here:
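A hedged alternative that avoids the second database call: record the count at the moment the grid is bound (the ViewState usage and names below are illustrative):

    // Hedged sketch: remember the total at bind time instead of re-querying.
    private void BindGrid(DataTable data)
    {
        GridView1.DataSource = data;
        GridView1.DataBind();
        ViewState["TotalRecords"] = data.Rows.Count;  // survives postbacks
    }

    private int TotalRecords
    {
        get { return (int)(ViewState["TotalRecords"] ?? 0); }
    }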
The way I quickly test the .asmx file is "View in Browser" from Visual Studio (I'm using 2005 and have IIS 6). The address shows up like this:
"http://localhost:1399/MyWebService.asmx"
where 1399 is a randomly generated port number that can change each time I test this way.
The books I've been reading (as well as other sources on the internet) keep referring to testing web services by typing something like this in the browser:
"http://localhost/MyFolder/MyWebService.asmx"
where "MyWebService" is the name of the web service and "MyFolder" is where that webservice is.
This "http://localhost/" with no port number testing never worked for me. Some books mention random port numbers, but most do not even mention port numbers. [URL]
VS2010, .NET 4.0. I have a web method which I want to return a data record (for example: name, surname, address, age, zip, etc.). I wonder how to accomplish this. I think the solution is to create an object which contains all the fields, serialized, and then have the web method return this serialized object. I'm not 100% sure how to accomplish this; if some of you could provide me with some tips, a tutorial, or a link to an article explaining it, that would be great.
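For an ASMX-style web method, a hedged sketch is simply to declare a public class with public properties and return it; the XML serializer handles the rest (all names below are illustrative):

    // Hedged sketch. Requires: using System.Web.Services;
    public class PersonRecord
    {
        public string Name { get; set; }
        public string Surname { get; set; }
        public string Address { get; set; }
        public int Age { get; set; }
        public string Zip { get; set; }
    }

    [WebMethod]
    public PersonRecord GetPerson(int id)
    {
        // Load from your data source; hard-coded here only for illustration.
        return new PersonRecord { Name = "Ada", Surname = "Lovelace",
                                  Address = "(street)", Age = 36, Zip = "00000" };
    }

The class needs a public parameterless constructor and public properties for the serializer; in WCF the same idea uses [DataContract]/[DataMember] on the class.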
I have developed a simple application using a MySQL database, but my database has more than 4,500,000 rows, and records are added to the table every second. How can I get the details for one record in a single click?
I need to pass a random number from 1 to 100 from the client to a web service, and then the web service will subtract this number from a list of 100 numbers (say 1-100). Then, on the next call, the next random number will be subtracted from the remaining 99 numbers, and so on.
I need this to be done with AJAX. I don't know where the list should be held in the first place (would it be in the web service or in the code-behind?), or how this can be done.
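One hedged place to keep the list is server-side session state, which an ASMX method can opt into. The sketch below (names illustrative) assumes the AJAX client calls TakeNumber once per round:

    // Hedged sketch. EnableSession keeps the list alive between AJAX calls.
    // Requires: using System.Collections.Generic; using System.Linq;
    // using System.Web.Services;
    [WebMethod(EnableSession = true)]
    public int TakeNumber(int requested)
    {
        var remaining = Session["Numbers"] as List<int>;
        if (remaining == null)
        {
            remaining = Enumerable.Range(1, 100).ToList();
            Session["Numbers"] = remaining;
        }
        remaining.Remove(requested);   // subtract the client's number
        return remaining.Count;        // 99, 98, ... on successive calls
    }

From the page, a ScriptManager with a ServiceReference (or a plain XMLHttpRequest POST) can call the method.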
I am developing an ASP.NET application using SQL Server 2008. I have an IDENTITY column as Record_ID in a detail table that will hold trillions of records per year. So I want to ask: what is the largest number the identity column can hold, and in ASP.NET which data type should I use to handle the record ID? I use this ID as a reference to update the table data, and I don't want to end up trapped some day.
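For reference, the relevant upper bounds (these are facts about SQL Server and C# types, not about the poster's schema): an int IDENTITY tops out well below "trillions per year", so bigint is the natural fit, and it maps to C# long.

    // SQL int    ->  2,147,483,647             -> C# int
    // SQL bigint ->  9,223,372,036,854,775,807 -> C# long
    long recordId = 9223372036854775807L;  // long.MaxValue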
I am using ASP.NET, framework 2.0, with an AJAX-enabled site. I have a text box: when the user enters "a" (or a number), it should search for names starting with "a" (or employee numbers starting with that digit) — basically a live search.
Employee search: a textbox in which the user enters the first character of a name, or the first digit of an employee number, and matching names are searched in a list accordingly.
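A hedged sketch of a page method such a live search could call as the user types (the connection string, table, and column names are assumptions; a parameterized prefix LIKE keeps the query both safe and index-friendly):

    // Hedged sketch. Requires: using System.Collections.Generic;
    // using System.Data.SqlClient; using System.Web.Services;
    [WebMethod]
    public static string[] SearchEmployees(string prefix)
    {
        var results = new List<string>();
        using (var cn = new SqlConnection(connectionString))  // assumed to exist
        using (var cmd = new SqlCommand(
            "SELECT TOP 10 Name FROM Employees " +
            "WHERE Name LIKE @p + '%' OR EmployeeNo LIKE @p + '%'", cn))
        {
            cmd.Parameters.AddWithValue("@p", prefix);
            cn.Open();
            using (var rdr = cmd.ExecuteReader())
                while (rdr.Read())
                    results.Add(rdr.GetString(0));
        }
        return results.ToArray();
    }

The AJAX Control Toolkit's AutoCompleteExtender calls a static method of essentially this shape (it passes a prefixText string and a count).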
Does it have any major effect on performance/memory if my web.config is really huge (say, 1000+ entries in <appSettings>)? Is it a good idea to maintain a separate custom XML config file for all business-specific settings for my app?
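For context: web.config is parsed once and cached, so even 1000+ appSettings entries cost little at runtime; the bigger operational cost is that every edit to web.config recycles the application. A hedged sketch of the separate-file approach (the file path and the <add key="" value=""/> shape are assumptions):

    // Hedged sketch: load business settings once per app domain and cache them.
    // Requires: using System.Collections.Generic; using System.Linq;
    public static class BusinessSettings
    {
        private static readonly Dictionary<string, string> Cache = Load();

        private static Dictionary<string, string> Load()
        {
            var doc = System.Xml.Linq.XDocument.Load(
                System.Web.Hosting.HostingEnvironment.MapPath(
                    "~/App_Data/business.settings.xml"));
            return doc.Root.Elements("add").ToDictionary(
                e => (string)e.Attribute("key"),
                e => (string)e.Attribute("value"));
        }

        public static string Get(string key)
        {
            string value;
            return Cache.TryGetValue(key, out value) ? value : null;
        }
    }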
I am working on a web project which has around 25,000 XML files. Somewhere in the application we need to compare some part of an XML file against all 25,000 XML files. It works, but takes around 25 minutes. How can I reduce this time and compare the files faster?
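If the comparison is an equality check on the same fragment of every file, a hedged optimization is to hash that fragment once per file and compare hashes instead of re-parsing 25,000 files per request. ExtractFragment below is a hypothetical helper standing in for whatever "some part" means in your case:

    // Hedged sketch. Requires: using System.Collections.Concurrent;
    // using System.Collections.Generic; using System.IO; using System.Linq;
    // using System.Security.Cryptography; using System.Text;
    // using System.Threading.Tasks;
    private static readonly ConcurrentDictionary<string, string> HashByFile =
        new ConcurrentDictionary<string, string>();

    public static void BuildIndex(string folder)
    {
        // One-time pass, parallelized across files (.NET 4).
        Parallel.ForEach(Directory.GetFiles(folder, "*.xml"),
            path => HashByFile[path] = HashFragment(path));
    }

    private static string HashFragment(string path)
    {
        string fragment = ExtractFragment(path);  // hypothetical helper
        using (var sha = SHA256.Create())
            return Convert.ToBase64String(
                sha.ComputeHash(Encoding.UTF8.GetBytes(fragment)));
    }

    public static IEnumerable<string> FilesMatching(string candidatePath)
    {
        string h = HashFragment(candidatePath);
        return HashByFile.Where(kv => kv.Value == h).Select(kv => kv.Key);
    }

After the index is built, each comparison is a dictionary scan over 25,000 short strings rather than 25,000 file parses.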
We have an ASP.NET application with a report that shows about 1,000 rows right now and can potentially grow to 20,000. Don't ask me why, but our client does not like paging and does not like filtering; they like to see everything on a single page. Our obvious problem is the load this puts on the server in terms of memory (plus the chance that the client's browser crashes).
My question is: if I provide a custom desktop application just for this report, one that can display thousands and thousands of rows (through web services or remoting), would it clog up the server? In the ASP.NET case, the IIS worker process basically eats up the server's memory; but if I have this desktop app running separately, calling the same database on the application server,
When exporting a lot of data to a string (CSV format), I get an OutOfMemoryException. What's the best way to tackle this? The string is returned to a Flex application. What I'd do is export the CSV to the server disk and give back a URL to Flex. That way, I can flush the stream while writing to disk. Update: the string is built with a StringBuilder:
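A hedged sketch of the write-to-disk idea the poster describes: stream the CSV row by row instead of accumulating one giant string, then hand Flex a URL to the file (the folder, file naming, and row type below are assumptions):

    // Hedged sketch. Requires: using System; using System.IO;
    string path = Server.MapPath("~/exports/" + Guid.NewGuid() + ".csv");
    using (var writer = new StreamWriter(path))
    {
        writer.WriteLine("Col1,Col2,Col3");
        foreach (var row in rows)  // your data source
        {
            // The writer flushes its buffer to disk as it fills,
            // so memory stays flat no matter how many rows there are.
            writer.WriteLine("{0},{1},{2}", row.A, row.B, row.C);
        }
    }
    string url = ResolveUrl("~/exports/" + Path.GetFileName(path));  // give this to Flex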
I want to upload video files, possibly more than 500 MB, and if an upload stops in the middle I want to continue from where it stopped.
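A hedged sketch of the server side of a resumable upload: the client re-sends the remainder of the file with a byte offset, and a handler appends from that point (the header name, query parameter, and folder are assumptions):

    // Hedged sketch. Requires: using System.IO; using System.Web;
    public class UploadHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            string file = context.Server.MapPath("~/uploads/" +
                Path.GetFileName(context.Request.QueryString["name"]));
            long offset = long.Parse(
                context.Request.Headers["X-Upload-Offset"] ?? "0");

            using (var fs = new FileStream(file, FileMode.OpenOrCreate,
                                           FileAccess.Write))
            {
                fs.Seek(offset, SeekOrigin.Begin);       // resume point
                context.Request.InputStream.CopyTo(fs);  // .NET 4
            }
            // Report how many bytes are stored so the client knows
            // where to resume next time.
            context.Response.Write(new FileInfo(file).Length.ToString());
        }

        public bool IsReusable { get { return true; } }
    }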
If we have a huge amount of text with some images, should the text and images be included directly in an ASP.NET 2.0 page alongside the other server controls, or should they be pulled in from an outside document like a Word document? Which is the best method? Can this be done using an XML file? If yes, how? What are the advantages and disadvantages of each method (if any)?
I have a project which generates snapshots of a database, converts them to XML, and then stores the XML inside a separate database. Unfortunately, these snapshots are becoming huge files, now about 10 megabytes each. Fortunately, I only have to store them for about a month before they can be discarded, but still, a month of snapshots turns out to be really bad for performance...

I think there is a way to improve performance a lot. No, not by storing the XML in a separate folder somewhere, because I don't have write access to any location on that server; the XML must stay within the database. But somehow the [Content] field might be optimized so things speed up. I won't need any full-text search on this field, and I will never search based on it, so perhaps I could disable this field for search, or something like that?

The table has no references to other tables, but the structure is fixed: I cannot rename things or change the field types. So I wonder if optimization is still possible. Well, is it? The structure, as generated by SQL Server:
CREATE TABLE [dbo].[Snapshots](
    [Identity] [int] IDENTITY(1,1) NOT NULL,
    [Header] [varchar](64) NOT NULL,
[code]....
Performance isn't just slow when selecting data from this table, but also when selecting or inserting data in the other tables in this database! When I delete all records from this table, the whole system is fast. When I start adding snapshots, performance starts to decrease; after about 30 snapshots, performance becomes bad and the risk of connection timeouts increases. Maybe the problem isn't in the database itself, although it's still slow when used through the management tool (and fast when Snapshots is empty). I mainly use ASP.NET 3.5 and the Entity Framework to connect to this database and then read the multiple tables. Maybe some performance can be gained there, although that wouldn't explain why the database is also slow from the management tools and from other applications with a direct connection...
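Since the XML must stay in the database and the schema can't change, one hedged option — assuming the application is the only reader and writer of [Content] — is to gzip the XML before storing it; verbose snapshot XML typically compresses dramatically, shrinking the I/O that drags everything else down:

    // Hedged sketch: compress before INSERT, decompress after SELECT.
    // Assumes [Content] can hold base64 text (or raw bytes if it is varbinary).
    // Requires: using System; using System.IO; using System.IO.Compression;
    // using System.Text;
    public static string Compress(string xml)
    {
        byte[] raw = Encoding.UTF8.GetBytes(xml);
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
                gz.Write(raw, 0, raw.Length);
            return Convert.ToBase64String(ms.ToArray());
        }
    }

    public static string Decompress(string stored)
    {
        var input = new MemoryStream(Convert.FromBase64String(stored));
        using (var gz = new GZipStream(input, CompressionMode.Decompress))
        using (var reader = new StreamReader(gz, Encoding.UTF8))
            return reader.ReadToEnd();
    }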
I'm currently exporting a database table with huge data (100,000+ records) into an XML file using the XmlTextWriter class, and I'm writing directly to a file on the physical drive.
_XmlTextWriterObject = new XmlTextWriter(_xmlFilePath, null);
While my code runs OK, my question is: is this the best approach? Or should I write the whole XML to a memory stream first and then write the XML document to the physical file from the memory stream? And what are the effects on memory/performance in each case?
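For what it's worth, writing straight to the file is the memory-friendly choice: the writer only buffers a few kilobytes internally, so memory stays flat at any row count, whereas a MemoryStream-first approach holds the entire document in RAM and only helps if you need to inspect or discard the document before committing it to disk. A hedged sketch of the direct approach (element and reader names are illustrative):

    // Hedged sketch. Requires: using System.Xml;
    var settings = new XmlWriterSettings { Indent = true };
    using (var writer = XmlWriter.Create(_xmlFilePath, settings))
    {
        writer.WriteStartElement("rows");
        while (reader.Read())  // e.g. a SqlDataReader over the 100,000+ records
        {
            writer.WriteStartElement("row");
            writer.WriteElementString("Id", reader["Id"].ToString());
            writer.WriteEndElement();
        }
        writer.WriteEndElement();
    }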
EDIT
I will indeed be using XmlTextWriter, but I meant to ask whether to pass a physical file path string to the XmlTextWriter constructor (or, as John suggested, to the XmlTextWriter.Create() method) or to use the stream-based API. My current code looks like the following:
I'm using the following CodeProject article to build an ASP.NET website, and so far everything is good. My only problem is that after the barcode is generated, a huge whitespace exists to the right of the barcode. I've been playing with this and am unable to resolve it.
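Without seeing the generator, one hedged workaround is to crop the bitmap after generation, assuming you can compute or know the barcode's actual pixel width (actualWidth below is an assumption):

    // Hedged sketch. Requires: using System.Drawing;
    Rectangle area = new Rectangle(0, 0, actualWidth, barcodeBitmap.Height);
    using (Bitmap cropped = barcodeBitmap.Clone(area, barcodeBitmap.PixelFormat))
    {
        cropped.Save(outputPath, System.Drawing.Imaging.ImageFormat.Png);
    }

The cleaner fix, if the generator exposes it, is to size the output bitmap to the barcode in the first place.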
We have an .aspx file which has about 400 lines of JavaScript code. Is it a good idea to have such a huge block inline? What is the performance difference between having huge JavaScript code in the .aspx versus in a .js file?