C# - Avoiding Deadlocks And TimeOuts When Processing Huge Data?
Oct 14, 2010
I have code in an ASP.NET form that needs to create messages in the database depending on user entry. We are speaking of potentially thousands of DB entries. How do I protect against deadlocks, apart from using transactions and setting the IsolationLevel to Serializable, and using the WITH (NOLOCK) hint on my SELECT statements since I don't mind a dirty read?
Event code: 3005
Event message: An unhandled exception has occurred.
Event time: 10/13/2010 10:12:14 PM
Event time (UTC): 10/14/2010 3:12:14 AM
Event ID: a565c58a7f844692859aa21303447c7c
Event sequence: 206
Event occurrence: 1
Event detail code: 0
Application information:
Application domain: /LM/W3SVC/610100832/Root-12-129314933998593750
Trust level: Full
Application Virtual Path: /
Application Path: D:\Websites\admin.beta.sharedTime.com
Machine name: SHAREDTIME
Process information:
Process ID: 3440
Process name: w3wp.exe
Account name: NT AUTHORITY\NETWORK SERVICE
Exception information:
Exception type: SqlException
Exception message: Transaction (Process ID 56) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Request information:
Request URL: http://beta.admin.sharedTime.com/admin_text_mass_send.aspx
Request path: /admin_text_mass_send.aspx
User host address: 69.211.10.138
User:
Is authenticated: False
Authentication Type:
Thread account name: NT AUTHORITY\NETWORK SERVICE
Thread information:
Thread ID: 10
Thread account name: NT AUTHORITY\NETWORK SERVICE
Is impersonating: False
Stack trace:
at mtNamespace.mt.createmessage_queue(String phone_number, String text_message, DateTime send_on, String system_name, Double user_no, Double send_priority, String message_type, Boolean returnqueue) in http://server/App_Code/mt.vb:line 1509
at ASP.admin_text_mass_send_aspx.save_user_values(Object sender, EventArgs e) in http://server/admin_text_mass_send.aspx:line 103
at System.Web.UI.WebControls.Button.OnClick(EventArgs e)
at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
at System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument)
at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)
at System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData)
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
Custom event details:
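The "Rerun the transaction" advice in the message points at the usual mitigation: catch SQL Server error 1205 and retry the failing call a few times. A minimal sketch, with a stored procedure and parameter names made up for illustration:

Code:
// Retry the queue insert when this process is chosen as the deadlock victim.
// Error number 1205 is SQL Server's deadlock error; anything else is rethrown.
const int maxRetries = 3;
for (int attempt = 1; ; attempt++)
{
    try
    {
        using (SqlConnection conn = new SqlConnection(ConnectionString))
        using (SqlCommand cmd = new SqlCommand("usp_CreateMessageQueue", conn)) // hypothetical proc
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@phone_number", SqlDbType.VarChar, 20).Value = phoneNumber;
            cmd.Parameters.Add("@text_message", SqlDbType.VarChar, 160).Value = textMessage;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
        break; // success, stop retrying
    }
    catch (SqlException ex)
    {
        if (ex.Number != 1205 || attempt == maxRetries)
            throw; // not a deadlock, or retries exhausted
        System.Threading.Thread.Sleep(100 * attempt); // brief backoff before rerunning
    }
}

It is also worth noting that Serializable is the most contention-prone isolation level; short transactions at READ COMMITTED that touch tables in a consistent order tend to deadlock far less.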
I have an asp.net page that will be doing some processing that may take a very long time to complete. I cannot just set the page timeout value, since this is going to be in a hosted environment and the timeout values I set in my web.config are overridden by the server host. What I'm doing is taking a file from a FileUpload control and making a web request to a 3rd-party service for each line in the file, all of which may take a very long time for a large file. I'm talking on the order of, say, 30 minutes, and there's just no way to optimize this any further to cut down on the processing time.

Is it possible to even do such a lengthy page request in asp.net? Can someone give me a pointer in the right direction here to make this happen? Is my only hope to create an async page? An async page seems to be the way to go when there is potential for a lot of lengthy requests, but a request this massive is going to happen VERY rarely, so this is not an issue of running out of thread pool threads: most of the time this particular request will complete relatively quickly, but on occasion it may receive a very large file that takes a very long time to process. What is the best way to handle that case?

I'd also like to update the client with the processing status as the processing is going on. I'm familiar with doing client-side AJAX calls via jQuery to a page WebMethod, so if there is some clean way to update the client while this long processing runs, I'd like to hear about it.
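One pattern that fits this (a sketch, not the only option): start the work on a background thread keyed by a job id, record progress in a server-side store, and let jQuery poll a WebMethod for the status. All names here are illustrative, and note that ASP.NET gives no durability guarantee for this: an app pool recycle will kill background work.

Code:
// Shared progress store: job id -> percent complete.
private static readonly Dictionary<Guid, int> JobProgress = new Dictionary<Guid, int>();

protected void btnProcess_Click(object sender, EventArgs e)
{
    Guid jobId = Guid.NewGuid();
    string[] lines = ReadAllLines(FileUpload1.FileContent); // hypothetical helper
    System.Threading.ThreadPool.QueueUserWorkItem(delegate
    {
        for (int i = 0; i < lines.Length; i++)
        {
            CallThirdPartyService(lines[i]); // hypothetical per-line web request
            lock (JobProgress)
            {
                JobProgress[jobId] = (i + 1) * 100 / lines.Length;
            }
        }
    });
    hidJobId.Value = jobId.ToString(); // hand the id to the client for polling
}

// Polled from jQuery, e.g. $.ajax({ url: "Page.aspx/GetProgress", ... })
[System.Web.Services.WebMethod]
public static int GetProgress(Guid jobId)
{
    lock (JobProgress)
    {
        int percent;
        return JobProgress.TryGetValue(jobId, out percent) ? percent : 0;
    }
}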
I have a web parts page in which each web part has its own personalization data (a list of DB table primary keys). At startup, I want to get the combined data stored in each web part to make a query in the background. Is that possible? Currently I have to wait for IPersonalizationData.Load on each web part to find out what keys it stores. Is there a way to store personalization data for the whole page? I can't use Profile, as this data is not per user. Worst case is writing to a file, which I want to avoid if possible.
I'm currently exporting a database table with huge data (100,000+ records) into an XML file using the XmlTextWriter class, and I'm writing directly to a file on the physical drive.
_XmlTextWriterObject = new XmlTextWriter(_xmlFilePath, null);
While my code runs okay, my question is: is this the best approach? Or should I build the whole XML in a memory stream first and then write the XML document to the physical file from the memory stream? And what are the effects on memory and performance in each case?
EDIT
I will indeed be using XmlTextWriter, but I meant to ask whether to pass a physical file path string to the XmlTextWriter constructor (or, as John suggested, to the XmlTextWriter.Create() method) or to use the stream-based API. My current code looks like the following:
[Code] ....
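For comparison, the file-path (or FileStream) form already streams: each element is flushed through a small buffer to disk as it is written, so memory stays flat regardless of record count, whereas building the document in a MemoryStream first holds the entire XML in memory for no benefit in a one-shot export. A minimal sketch of the streaming form (the record source here is illustrative):

Code:
XmlWriterSettings settings = new XmlWriterSettings();
settings.Indent = true;

using (XmlWriter writer = XmlWriter.Create(_xmlFilePath, settings))
{
    writer.WriteStartDocument();
    writer.WriteStartElement("records");
    foreach (DataRow row in exportTable.Rows) // 'exportTable' stands in for your source table
    {
        writer.WriteStartElement("record");
        writer.WriteElementString("EmpID", row["EmpID"].ToString());
        writer.WriteEndElement();
    }
    writer.WriteEndElement();
    writer.WriteEndDocument();
} // Dispose flushes whatever is still buffered to the file.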
How can I create a query that selects only some records each time the user wants to display them in a GridView? I don't want to select all the records from my data source. What I exactly want is to simulate pagination in the query itself.
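On SQL Server 2005, the standard trick is ROW_NUMBER() in a derived table. A sketch, with table and column names made up for illustration:

Code:
string sql = @"
    SELECT EmpID, EmpName
    FROM (SELECT EmpID, EmpName,
                 ROW_NUMBER() OVER (ORDER BY EmpID) AS RowNum
          FROM Employees) AS Numbered
    WHERE RowNum BETWEEN @From AND @To;";

using (SqlConnection conn = new SqlConnection(ConnectionString))
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    int page = 3, pageSize = 20; // example: fetch only rows 41-60
    cmd.Parameters.Add("@From", SqlDbType.Int).Value = (page - 1) * pageSize + 1;
    cmd.Parameters.Add("@To", SqlDbType.Int).Value = page * pageSize;
    conn.Open();
    // Bind the resulting reader (or a filled DataTable) to the GridView.
}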
I am using OboutGrid to display the data in my WebForms pages. This is cool when I am binding small data, but when I am binding huge data, it takes more time to display the records than a normal DataGrid does. How can I improve the performance?
I am working with a GridView. When a huge amount of data is inserted into the GridView, its columns become very thin and tall; however, I want to display the GridView at a fixed size, and if the data is huge, it should only display the first few words of the data. In other words, I want to use the GridView the same way as the GMAIL account, where long text is cut off after a short preview.
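One way to get the Gmail-style look, as a sketch (the column index and cutoff length below are arbitrary): give the grid fixed column widths and shorten long values in RowDataBound, keeping the full text in the tooltip.

Code:
protected void GridView1_RowDataBound(object sender, GridViewRowEventArgs e)
{
    if (e.Row.RowType != DataControlRowType.DataRow)
        return;

    TableCell cell = e.Row.Cells[1]; // the column holding the long text
    if (cell.Text.Length > 50)
    {
        cell.ToolTip = cell.Text;                       // full value on hover
        cell.Text = cell.Text.Substring(0, 50) + "..."; // show only the first few words
    }
    cell.Style["white-space"] = "nowrap"; // stop the cell from growing tall
}

Combine this with a fixed Width on the GridView and ItemStyle-Width on the columns so they cannot collapse.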
I have a DataTable available with me that contains thousands of rows. There is a column called EmpID that contains '0' for some of the rows. I want to remove those rows and create a new, correct DataTable. I don't want to go row by row checking it, since it contains a huge amount of data.
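DataTable.Select does the filtering in one call (it still scans internally, but you avoid writing the loop yourself). A sketch, assuming EmpID is stored as a string; drop the quotes in the filter if it is numeric:

Code:
// Keep only the rows whose EmpID is not '0'.
DataRow[] keep = originalTable.Select("EmpID <> '0'"); // 'originalTable' is your existing DataTable

DataTable cleaned = originalTable.Clone(); // same schema, zero rows
foreach (DataRow row in keep)
{
    cleaned.ImportRow(row);
}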
We are in the process of building a huge site. We are contemplating whether to do the processing of HTML at the server side (ASP.NET) or at the client side. For example, we have HTML files which act as templates for the generation of tabs. Is it better for the server side to get hold of the content section (div) of the HTML, load the appropriate values, and send the updated HTML to the browser, or is it better to pass a chunk of data to the client and make JavaScript do the work?
We have our web-based car tracking (GPS) software developed in asp.net. We receive positional information every 30 seconds; overall we get millions of records every day. We have a historical view which displays the positions where the car was throughout the day (a 24-hour period), and this page takes a long time to load. We tried indexing the DB, de-normalizing it, and also re-writing the queries, but still no great improvement. Is there any other best practice that we should follow to load huge amounts of data into a webpage (there should be no pagination)? Can we somehow bring it to the client side using AJAX and try loading it when the button is clicked?
I have one stored procedure that generates a report.
For storing the data there, I use many temp tables. There are many SELECT queries, and rather fewer INSERT and DELETE queries.
Now, if there is huge data, around 1 lakh (100,000) rows, in a temp table, my SELECT queries take too much time, and possibly the INSERT and DELETE queries as well.
I added a primary key on every auto-increment field in the temp tables, and also defined a unique clustered index on that primary key to improve performance.
But there is not much improvement in the case of huge temp tables.
Right now the whole stored procedure takes around 1.5 days, or around 30 hours, to complete.
So I want to improve the performance enough that it completes in roughly 3-4 hours.
I have a file upload to a SQL Server 2005 DB that keeps timing out at different times. I know about the web.config setting <httpRuntime maxRequestLength="2097151" executionTimeout="180"/>. Where else are there timeout settings?
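Two more places timeouts commonly hide, as a sketch: the ADO.NET command timeout (30 seconds by default, and independent of executionTimeout) and the connection timeout inside the connection string. The procedure name below is hypothetical:

Code:
// The time allowed to open the connection lives in the connection string.
string connStr = "Server=.;Database=MyDb;Integrated Security=SSPI;Connect Timeout=60";

using (SqlConnection conn = new SqlConnection(connStr))
using (SqlCommand cmd = new SqlCommand("usp_SaveUpload", conn)) // hypothetical proc
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.CommandTimeout = 300; // seconds the command may run before timing out (default is 30)
    conn.Open();
    cmd.ExecuteNonQuery();
}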
I have a web service that sends up to 5000 emails to our customers.
It is called asynchronously from a web application and then chugs away sending the emails. Generally it works okay. Yesterday, it kept stopping, sometimes after sending a small number (33 the first time); then it might send another couple of hundred, then 50 or so, etc. This went on all day as I kept manually calling the web service.
The data (subject, body, fromaddress, replytoaddress etc) is all kept in the database - as are the recipients.
Here's the code that gets the data from the database.
// Create instance of connection and command object
SqlConnection myConnection = new SqlConnection(ConnectionString);
SqlCommand myCommand = new SqlCommand("CPEmailsToSend_List", myConnection);
// Mark the command as a SPROC
myCommand.CommandType = CommandType.StoredProcedure;
myCommand.CommandTimeout = 1000;
[Code]....
So I have a dataset containing 3 sets of data: the subject, body, etc., a list of attachments, and a list of recipients. I then use the .NET SmtpClient like this:
Code:
// About to start sending emails - update the MessageSendStatus to in progress
myCommand.CommandText = "CPEmailSendStatus_Update";
myCommand.Parameters.Clear();
myCommand.Parameters.Add("@CPEmailID", SqlDbType.Int).Value = CPEmailID;
myCommand.Parameters.Add("@SendStatus", SqlDbType.TinyInt).Value = 1;
[Code] ....
When the web service stopped (it kept stopping yesterday), no errors were raised. I have a function that checks each email address is valid before I try to send, as I know SmtpClient does not like invalid email addresses. Normally, invalid email addresses are trapped, and the loop continues and sends the next one, etc. But yesterday it just stopped sending. No error messages - nothing.
So I began to wonder if the Command object is timing out. With a lot of email addresses to send to (up to 5000) and half a dozen attachments, the web service can take a long time (hours) to send all the emails. So, is the fact that I am reusing the command object over and over again for different stored procedures the issue? (I do this a lot and have never had a problem before; I read somewhere that CommandTimeout only counts time during which network access is actually in progress.)
I've been sending test emails all morning (but I can only send small batches as I have a limited number of email addresses I can send test emails to) and they have all worked okay ... but, yesterday, trying to send about 5000 emails took about 30 attempts.
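One way to rule out both the shared-command theory and the silent failure, as a sketch: give each send its own short-lived objects and log every exception instead of letting anything escape unnoticed. 'Recipient' and LogError below are stand-ins for your own row type and logging:

Code:
foreach (Recipient r in recipients) // 'Recipient' is a stand-in for your data row
{
    try
    {
        using (MailMessage msg = new MailMessage(fromAddress, r.Email, subject, body))
        {
            SmtpClient client = new SmtpClient(); // reads <system.net><mailSettings> config
            client.Send(msg);
        }
    }
    catch (Exception ex)
    {
        // Log and carry on, so one bad address or transient SMTP failure
        // cannot stop the whole batch without leaving a trace.
        LogError(r.Email, ex); // hypothetical logger, e.g. writes to the database
    }
}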
I have a web service that, when called, calls many other web services based on the information that was sent. I have always left the timeout at the default; to put it better, I have never changed anything. How (and where) do I change the timeout for each service individually? Since I have quite a few now, I want the original transaction to never take more than 30 seconds. That said, I want to go in and change the timeouts for the services I have to ping within that 30 seconds to 5 to 10 seconds each, depending on which one.
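If these are classic ASMX proxies (generated classes deriving from SoapHttpClientProtocol), each instance exposes a Timeout property in milliseconds that can be set per call. The service names below are invented:

Code:
InventoryService inventory = new InventoryService(); // generated ASMX proxy (hypothetical)
inventory.Timeout = 5000;   // fail this dependency after 5 seconds

PricingService pricing = new PricingService();       // another hypothetical proxy
pricing.Timeout = 10000;    // allow this slower one 10 seconds

// Budgeting the 5-10 second child timeouts is what keeps the outer
// call under its 30-second ceiling.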
There are various ways to handle session timeouts, like meta refreshes, JavaScript onload functions, etc.
I would like something neat like: 5 minutes before timeout, warn the user...
I am also contemplating keeping the session open for as long as the browser is open (I still need to figure out how to do that, though; probably some iframe that refreshes periodically).
How do you handle session timeouts, and what direction do you think I should go in?
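For the keep-alive idea, a sketch that avoids the iframe: a tiny HTTP handler that touches the session, pinged periodically from the page (the handler name is made up):

Code:
// KeepAlive.ashx - touching the session resets its sliding expiration.
public class KeepAlive : IHttpHandler, System.Web.SessionState.IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        context.Session["LastPing"] = DateTime.UtcNow; // any access keeps the session alive
        context.Response.ContentType = "text/plain";
        context.Response.Write("OK");
    }

    public bool IsReusable
    {
        get { return true; }
    }
}

On the client, something like setInterval(function () { $.get('KeepAlive.ashx'); }, 5 * 60 * 1000); pings it every five minutes, and the same timer arithmetic gives you the "warn the user five minutes before timeout" prompt.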
I expected this would set 30 minutes as the timeout for inactive sessions, but they seem to be timing out much sooner.
Is there some other way to specify the session timeout?
I know I am using an unsupported AccessMembershipProvider, which I was forced to do because my hosting service does not support SQL Server databases (so I am using an Access database).
However, this Access DB provider seems to work fine in all other respects. I suspect that the early timeout is caused by some other obscure setting.
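Two settings worth ruling out, since either can end a session sooner than expected: the forms authentication ticket timeout (separate from the session timeout, and the first to expire if it is shorter) and, for InProc session state, the host's application pool idle timeout, since a worker process recycle wipes every in-memory session. The relevant web.config parts look roughly like this (loginUrl is only an example):

<sessionState mode="InProc" timeout="30" />
<authentication mode="Forms">
  <forms loginUrl="Login.aspx" timeout="30" slidingExpiration="true" />
</authentication>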
I'm using Visual Studio 2008 and SQL Server 2005, and everything works just fine under normal use. However, if a user sits on a page for a while (several minutes with no activity) and then clicks a button, on occasion the site throws the following exception ...
Procedure or function 'sp_name' expects parameter '@SomeParameterName', which was not supplied.
I'm also encountering this error in Visual Studio while debugging the application; in other words, I run the site from Visual Studio, make some change to the HTML in VS, save the changes, and refresh the page.
The error is not consistent, nor is the length of time the page has to stay idle for it to occur....
The current SQL command object timeout is 30 seconds, and the website timeout is 30 minutes.
We are building a web application which will be deployed to Windows Azure. I want users to be able to set the session timeout value, which will be stored in the database. Currently I am only aware of the web.config method of setting the session timeout (i.e. the sessionState element).
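Besides web.config, Session.Timeout can be set programmatically, which is what a database-driven value needs. A sketch for Global.asax (GetSessionTimeoutFromDb is a hypothetical helper you would write against your settings table):

Code:
void Session_Start(object sender, EventArgs e)
{
    // Hypothetical helper that reads the configured value from the database.
    int minutes = GetSessionTimeoutFromDb();

    // Overrides the web.config value for this session only (value is in minutes).
    Session.Timeout = minutes;
}

One Azure-specific caveat: with more than one web role instance, you also need an out-of-process session state provider, because the default InProc store is local to each instance.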
I am generating output on a webpage in a format like this:
({"1":"Jeff","2":"Tom","3":"Michael",})
For this, basically, this is the code I am using:
Response.Write("(" + "{"); // for (Int32 i = 0; i < k.Length; i++) { Response.Write(Convert.ToString(k.GetValue(i)) + ","); } // Response.Write("}" + ")");
Notice my output: after "Michael" there is a comma which I do not want, since this is the last value, but it appears because the comma is written inside the for loop. How do I prevent/remove this last comma?
My output should be ({"1":"Jeff","2":"Tom","3":"Michael"}) (there's no comma after the last value here).
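One simple way to get there: collect the items first and let string.Join put commas only between them. A sketch against the same 'k' array:

Code:
// Build the pieces, then join: no trailing comma to clean up.
string[] parts = new string[k.Length];
for (Int32 i = 0; i < k.Length; i++)
{
    parts[i] = Convert.ToString(k.GetValue(i));
}
Response.Write("(" + "{" + string.Join(",", parts) + "}" + ")");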
I've converted an ASP.Net website over to an Azure version and I've got it up and running. However, the vast majority of the time I experience timeouts when I'm trying to debug the application. This seems to happen on any page at any time. I start debugging from VS2010; sometimes the home page comes up, sometimes it hangs. When it does come up, if I select an item from a drop-down list, sometimes it works and sometimes it times out. I have breakpoints set and they don't even fire; I just eventually get the timeout.
If I hit refresh, that sometimes seems to fix it, but it eventually starts happening again. This is making the completion of the conversion VERY difficult, and I'm starting to creep up on a deadline. Does anyone have any idea what might be happening? If there is a better location I can post this question to, would someone be kind enough to point me in the right direction? Thanks in advance for any help; in addition, my email is mgorgone@pictureu.com, as our Windows Live account email is different.