I have a web service that sends up to 5000 emails to our customers.
It is called asynchronously from a web application and then chugs away sending the emails. Generally it works okay. Yesterday, however, it kept stopping - sometimes after sending a small number (33 the first time), then it might send another couple of hundred, then 50 or so, and so on. This went on all day as I kept manually calling the web service.
The data (subject, body, fromaddress, replytoaddress etc) is all kept in the database - as are the recipients.
Here's the code that gets the data from the database.
// Create Instance of Connection and Command Object
SqlConnection myConnection = new SqlConnection(ConnectionString);
SqlCommand myCommand = new SqlCommand("CPEmailsToSend_List", myConnection);
// Mark the Command as a SPROC
myCommand.CommandType = CommandType.StoredProcedure;
myCommand.CommandTimeout = 1000; // CommandTimeout is measured in seconds
[Code]....
So I have a dataset containing 3 sets of data - the subject, body etc., a list of attachments and a list of recipients. I then use the .NET SmtpClient like this:
Code:
//about to start sending emails - update the MessageSendStatus to inprogress
myCommand.CommandText = "CPEmailSendStatus_Update";
myCommand.Parameters.Clear();
myCommand.Parameters.Add("@CPEmailID", SqlDbType.Int).Value = CPEmailID;
myCommand.Parameters.Add("@SendStatus", SqlDbType.TinyInt).Value = 1;
[Code] ....
When the web service stopped (it kept stopping yesterday), no errors were raised. I have a function on the page that checks each email address is valid before I try to send, as I know the SmtpClient does not like invalid email addresses. Normally, invalid email addresses are trapped and the loop continues and sends the next one. But yesterday it just stopped sending - no error messages, nothing.
So I began to wonder if the Command object is timing out. With a lot of email addresses to send to (up to 5000) and half a dozen attachments, the web service can take a long time (hours) to send all the emails. So, is the fact that I am reusing the command object over and over again for different stored procedures the issue? (I do this a lot and have never had a problem before - I read somewhere that CommandTimeout only counts time spent on network access.)
I've been sending test emails all morning (only small batches, as I have a limited number of email addresses I can send test emails to) and they have all worked okay... but yesterday, sending about 5000 emails took about 30 attempts.
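One possibility worth ruling out: if any exception escapes the send loop (an SMTP failure rather than an invalid address, say), the batch ends with nothing logged anywhere. A minimal sketch, not the original loop - the SMTP host name and message shape are assumptions - that gives every single send its own try/catch and records the failure before moving on:
Code:
using System;
using System.Diagnostics;
using System.Net.Mail;

public class BatchMailer
{
    // One message per recipient; a failure is logged and skipped so the
    // batch can never stop silently partway through.
    public void SendBatch(string from, string subject, string body,
                          string[] recipients)
    {
        SmtpClient client = new SmtpClient("mail.example.com"); // assumed host
        foreach (string address in recipients)
        {
            try
            {
                using (MailMessage message =
                       new MailMessage(from, address, subject, body))
                {
                    client.Send(message);
                }
            }
            catch (Exception ex)
            {
                // Record which recipient failed and why, then carry on.
                Trace.TraceError("Send to {0} failed: {1}", address, ex);
            }
        }
    }
}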
I have a file upload to a SQL Server 2005 DB that keeps timing out at different times. I know about the web.config <httpRuntime maxRequestLength="2097151" executionTimeout="180"/>. Where else are there timeout settings?
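Besides httpRuntime, the ADO.NET side has two timeouts of its own that a long upload-and-import can trip over. A hedged sketch - the connection string and procedure name are placeholders, the values illustrative:
Code:
using System.Data;
using System.Data.SqlClient;

public static class UploadImport
{
    public static void ImportUploadedFile()
    {
        // Connect Timeout governs how long opening the connection may take.
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=Uploads;Integrated Security=SSPI;Connect Timeout=30"))
        using (SqlCommand cmd = new SqlCommand("UploadFile_Insert", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandTimeout = 300; // seconds; the default of 30 is a common culprit
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}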
I have a web service that, when called, calls many other web services based on the information that was sent. I have always left the timeout as the default - or, to describe it better, I have never changed anything. How (and where) do I change the timeouts for each service individually? Since I have quite a few now, I want the original transaction to never take more than 30 seconds. That said, I want to go in and change the timeouts for the services I have to ping within that 30 seconds to be 5 to 10 seconds, based on which one.
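For classic .asmx web references, each generated proxy inherits a Timeout property (in milliseconds) from the web-service client base class, so it can be set per service and per instance. The proxy class names below are hypothetical:
Code:
// Each downstream call gets its own budget inside the 30-second transaction.
InventoryService inventory = new InventoryService();
inventory.Timeout = 5000;   // this service must answer within 5 seconds

PricingService pricing = new PricingService();
pricing.Timeout = 10000;    // this one is slower; allow 10 seconds

// For WCF clients the equivalent knob is the binding's SendTimeout,
// e.g. binding.SendTimeout = TimeSpan.FromSeconds(10);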
There are various ways to handle session timeouts - meta refreshes, JavaScript onload functions, etc.
I would like something neat like: 5 minutes before timeout, warn the user...
I am also contemplating keeping the session open for as long as the browser is open (still need to figure out how to do that though... probably some iframe that keeps refreshing).
How do you handle session timeouts, and what direction do you think I should go in?
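One direction this could take - a minimal sketch, assuming a plain alert is an acceptable warning and that reloading the page is an acceptable way to renew the session - is to register a client script on every page load whose delay is computed from Session.Timeout:
Code:
protected void Page_Load(object sender, EventArgs e)
{
    // Fire the warning five minutes before the session would expire.
    int warnAfterMs = (Session.Timeout - 5) * 60 * 1000;
    string script = string.Format(
        "setTimeout(function() {{ alert('Your session will expire in " +
        "5 minutes.'); window.location = window.location; }}, {0});",
        warnAfterMs);
    ClientScript.RegisterStartupScript(GetType(), "sessionWarning",
        script, true);
}
The reload after the alert renews the session; a hidden iframe pinging a keep-alive page every few minutes would achieve the "open as long as the browser is open" variant.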
I expected this would set 30 minutes as the timeout for inactive sessions, but they seem to be timing out much sooner.
Is there some other way to specify the time for the session timeout?
I know I am using an unsupported AccessMembershipProvider, which I was forced to do because my hosting service does not support SQL Server databases (so I am using an Access database).
However, this Access DB provider seems to work fine in all other respects. I suspect the early timeout is down to some other obscure setting.
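Two settings besides <sessionState timeout="30"/> commonly cut sessions short, so a checklist may help here: the forms authentication ticket has its own timeout attribute (<forms timeout="..."/>), and whichever expires first wins; and with InProc session state, an application pool recycle or idle shutdown (the IIS default is 20 idle minutes) discards every session regardless of the configured timeout. The value actually in effect can also be confirmed in code:
Code:
// In Global.asax - log the effective timeout for each new session.
protected void Session_Start(object sender, EventArgs e)
{
    System.Diagnostics.Trace.TraceInformation(
        "Effective session timeout: {0} minutes", Session.Timeout);
}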
I'm using Visual Studio 2008 and SQL Server 2005 and everything is working just fine under normal use. However, if a user sits on a page for a while - several minutes with no activity - and then clicks a button, on occasion the site throws the following exception...
Procedure or function 'sp_name' expects parameter '@SomeParameterName', which was not supplied.
I'm also encountering this error in Visual Studio while debugging the application - in other words, I run the site from Visual Studio, make some change to the HTML in VS, save the changes and refresh the page.
The error is not consistent, nor is the time the page has to stay idle for it to occur...
The current SQL command object timeout is 30 seconds and the website timeout is 30 minutes.
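One known cause of this exact error fits the idle-time pattern: if a SqlParameter's Value is left as null - for example because a session variable expired while the page sat idle - ADO.NET does not send the parameter at all, and SQL Server reports it as not supplied. A small defensive fragment (the parameter name is taken from the error above; the session key is an assumption):
Code:
object userName = Session["UserName"]; // may be null after a session timeout
myCommand.Parameters.Add("@SomeParameterName", SqlDbType.VarChar, 50)
         .Value = userName ?? (object)DBNull.Value; // always send something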
I am building a web application which will be deployed to Windows Azure. I want the user to set the session timeout value, which will be stored in the database. Currently I am only aware of the web.config method of setting the session timeout, i.e.
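Session.Timeout can also be set per session in code, which is one way to honor a database-stored value without touching web.config. A minimal sketch, assuming a hypothetical lookup helper:
Code:
// In Global.asax
protected void Session_Start(object sender, EventArgs e)
{
    Session.Timeout = GetTimeoutMinutesFromDb(); // overrides web.config
}

private int GetTimeoutMinutesFromDb()
{
    // Hypothetical lookup, e.g. SELECT SessionTimeoutMinutes FROM Settings.
    return 30;
}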
If I watch the processes tab in Task Manager on the web server I see "cmd.exe" under the context of 'administrator', but it just hangs. For test purposes, c:\SendV80s.bat contains: copy c:\boot.ini c:\zzz.txt
If I log onto the web server's console and execute SendV80s.bat, it works and exits without issue. But when I execute the same batch file via the Submit button, it gets stuck in Task Manager/Processes. I believe this has something to do with the fact that cmd is not running in a full environment/desktop context. I just noticed this on the actual console of the web server (not in my RDP console but console 0 instead):
A pop-up box stating: CMD.exe - Application error: The application failed to initialize properly (0xc0000142). Click OK to terminate the application. And when I click the OK button, my ASPX page's WaitForExit is satisfied and the page continues processing normally.
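That 0xc0000142 pop-up is consistent with cmd.exe trying to initialize console/desktop resources it cannot get in the service context. A hedged sketch of launching the batch file with no window and no shell, capturing its output instead (the batch path is the one from the post):
Code:
using System.Diagnostics;

public static string RunBatch()
{
    ProcessStartInfo psi = new ProcessStartInfo("cmd.exe",
        @"/c c:\SendV80s.bat");
    psi.UseShellExecute = false;       // no shell, no desktop interaction
    psi.CreateNoWindow = true;         // never attempt a console window
    psi.RedirectStandardOutput = true; // read output instead of a console

    using (Process proc = Process.Start(psi))
    {
        string output = proc.StandardOutput.ReadToEnd();
        proc.WaitForExit();
        return output;
    }
}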
I've converted an ASP.NET website over to an Azure version and I've got it up and running. However, the vast majority of the time I experience timeouts when I'm trying to debug the application. This seems to happen on any page at any time. I start debugging from VS2010 and sometimes the home page will come up, sometimes it hangs. When it does come up, if I select an item from a drop-down list, sometimes it works, sometimes it times out. I have breakpoints set and they don't even fire - I just eventually get the timeout.
If I hit refresh, that sometimes seems to fix it, but it eventually starts happening again. This is making the completion of the conversion VERY difficult and I'm starting to creep up to a deadline. Does anyone have any idea what might be happening? If there is a better location I can post this question to, would someone be kind enough to point me in the right direction? Thanks in advance for any help; also, my email is mgorgone@pictureu.com, as our Windows Live account email is different.
Is this enough to prevent that page from timing out, given it does what it needs to? I have no control over the process and how long it runs, but we've found that 5 minutes is more than enough time, yet we're still getting intermittent errors of:
System.Web.HttpException: Request timed out.
We've tried upping the value to 600 with really no difference, and in any testing we've done we can never get the actual process to run that long. Is there somewhere else we need to set timeout values that won't affect the entire application, only the specific page that needs the longer timeout?
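Two hedged suggestions. First, executionTimeout is only honored when <compilation debug="false"/> is set - with debug="true" requests do not time out this way, which can make test results misleading. Second, the timeout can be scoped to a single page, either by wrapping <httpRuntime executionTimeout="600"/> in a <location> element in web.config, or in that page's code:
Code:
protected void Page_Load(object sender, EventArgs e)
{
    // Applies to this request only; the value is in seconds.
    Server.ScriptTimeout = 600;
}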
Can I still use Session_OnEnd to trap a session timeout? Is there a better way to trap one? I want to take the user to a page that tells them the session timed out, and give them an opportunity to re-enter the application in a new session if they wish. How can I accomplish this? Is there still a Global.asax file in the .NET 4.0 world?
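Global.asax still exists in .NET 4.0, and Session_End still fires (for InProc session state only) - but it runs without an HttpContext, so it cannot redirect anybody. A common alternative, sketched here with an assumed page name, is to detect the expired session on the user's next request: a session cookie arrives, but the server no longer has a session for it.
Code:
// In Global.asax
protected void Application_PreRequestHandlerExecute(object sender, EventArgs e)
{
    HttpContext ctx = HttpContext.Current;
    if (ctx.Session != null && ctx.Session.IsNewSession)
    {
        string cookies = ctx.Request.Headers["Cookie"];
        if (cookies != null && cookies.Contains("ASP.NET_SessionId"))
        {
            // The browser sent a session cookie, but the session is gone:
            // it timed out. Send the user to an explanation page.
            ctx.Response.Redirect("~/SessionExpired.aspx");
        }
    }
}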
I have code in an ASP.NET form that needs to create messages in the database depending on user entry. We are speaking of potentially thousands of DB entries. How do I protect against deadlocks? I mean apart from using transactions and setting IsolationLevel to Serializable, as well as using WITH(NOLOCK) on my SELECT statements, since I don't mind a dirty read.
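Worth noting that Serializable is the most deadlock-prone isolation level, so raising it works against the goal here. Since deadlocks can rarely be ruled out entirely, the usual complement is to catch SQL Server error 1205 (deadlock victim) and rerun the transaction a bounded number of times. A sketch, with the transactional work represented by a delegate:
Code:
using System;
using System.Data.SqlClient;
using System.Threading;

public static class DeadlockRetry
{
    public static void Run(Action doWork, int maxAttempts)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                doWork(); // open connection, run the transaction, commit
                return;
            }
            catch (SqlException ex)
            {
                // 1205 = chosen as deadlock victim; anything else rethrows.
                if (ex.Number != 1205 || attempt >= maxAttempts)
                    throw;
                Thread.Sleep(200 * attempt); // brief, growing back-off
            }
        }
    }
}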
Event code: 3005
Event message: An unhandled exception has occurred.
Event time: 10/13/2010 10:12:14 PM
Event time (UTC): 10/14/2010 3:12:14 AM
Event ID: a565c58a7f844692859aa21303447c7c
Event sequence: 206
Event occurrence: 1
Event detail code: 0
Application information:
Application domain: /LM/W3SVC/610100832/Root-12-129314933998593750
Trust level: Full
Application Virtual Path: /
Application Path: D:\Websites\admin.beta.sharedTime.com
Machine name: SHAREDTIME
Process information:
Process ID: 3440
Process name: w3wp.exe
Account name: NT AUTHORITY\NETWORK SERVICE
Exception information:
Exception type: SqlException
Exception message: Transaction (Process ID 56) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
Request information:
Request URL: http://beta.admin.sharedTime.com/admin_text_mass_send.aspx
Request path: /admin_text_mass_send.aspx
User host address: 69.211.10.138
User:
Is authenticated: False
Authentication Type:
Thread account name: NT AUTHORITY\NETWORK SERVICE
Thread information:
Thread ID: 10
Thread account name: NT AUTHORITY\NETWORK SERVICE
Is impersonating: False
Stack trace:
at mtNamespace.mt.createmessage_queue(String phone_number, String text_message, DateTime send_on, String system_name, Double user_no, Double send_priority, String message_type, Boolean returnqueue) in http://server/App_Code/mt.vb:line 1509
at ASP.admin_text_mass_send_aspx.save_user_values(Object sender, EventArgs e) in http://server/admin_text_mass_send.aspx:line 103
at System.Web.UI.WebControls.Button.OnClick(EventArgs e)
at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)
at System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument)
at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)
at System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData)
at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
Custom event details:
I know that similar questions have been asked all over the place, but I'm having trouble finding one that relates directly to what I'm after.
I have a website where a user uploads a data file, then that file is transformed and imported into SQL. The file could be up to 50MB in size, and sometimes this process can take 30 minutes or even longer.
I realise I need to palm off the actual work to another process, and poll that process on the web page. I'm wondering what the best approach would be though? Being a web developer by trade, I'm finding all this new Windows Service stuff a bit confusing, and I just wanted somewhere to start.
So:
Can I do / should I be doing this with a Windows Service? If so, how?
Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
clarifications
The data is imported into sql, there's no file distribution taking place.
If there is a failure, it absolutely MUST be reported to the user. The web page will poll every, let's say, 5 seconds from the time the async task begins to get the status of the import. Once it's finished, another response will tell the page to stop polling for status updates.
queries on final decision
OK, so as I thought, it seems that a Windows Service is the best idea. As to HOW to get it to work: the 'put the file there and wait for the service to pick it up' approach seems to be the generally accepted way, but is there a way I can start a process run by the service without it having to constantly check a database table/folder? As I said earlier, I don't have any experience with Windows Services - if I put a public method in the service, can I call it somehow?
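A method on the service class itself cannot be called from another process, but the service can host a WCF endpoint, which gives the web app exactly that: a method to call, with no folder or table polling. A sketch under stated assumptions - all names, the port and the progress model are hypothetical:
Code:
using System;
using System.ServiceModel;
using System.ServiceProcess;

[ServiceContract]
public interface IImportControl
{
    [OperationContract]
    void BeginImport(string uploadedFilePath); // kick off the import

    [OperationContract]
    int GetProgressPercent(); // polled by the web page every few seconds
}

public class ImportControl : IImportControl
{
    private static int progress;

    public void BeginImport(string uploadedFilePath)
    {
        // Start the transform/import on a worker thread and return
        // immediately so the caller's request is not held open.
        System.Threading.ThreadPool.QueueUserWorkItem(
            delegate { /* transform + SQL import, updating 'progress' */ });
    }

    public int GetProgressPercent() { return progress; }
}

public class ImportService : ServiceBase
{
    private ServiceHost host;

    protected override void OnStart(string[] args)
    {
        host = new ServiceHost(typeof(ImportControl),
            new Uri("net.tcp://localhost:8523/import"));
        host.AddServiceEndpoint(typeof(IImportControl),
            new NetTcpBinding(), string.Empty);
        host.Open();
    }

    protected override void OnStop()
    {
        if (host != null) host.Close();
    }
}
The web app would talk to this endpoint through a NetTcpBinding client proxy; and since the service is its own process, IIS recycling no longer threatens a long-running import.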
I keep receiving an exception with the message "Logon failure: unknown user name or bad password". I'm absolutely sure that I'm submitting my correct username/password.
I've developed a popup email .aspx page, used in our intranet-based web app, that is auto-generated with PDFs attached. I'm developing with VS 2008, ASP.NET 3.5, C# and System.Net.Mail.MailMessage. I can create and send the email with no issues. The problem is that with any attempt to open or delete the attachments afterwards I get the above error. The PDFs are copied with the following code:
FileStream fsr = new FileStream(inFilename, FileMode.Open, FileAccess.Read, FileShare.Read);
BinaryReader reader = new BinaryReader(fsr);
byte[] bytes = new byte[fsr.Length];
reader.Read(bytes, 0, bytes.Length);

FileStream fsw = new FileStream(outFileName, FileMode.Create, FileAccess.Write, FileShare.Write);
BinaryWriter writer = new BinaryWriter(fsw);
writer.Write(bytes, 0, bytes.Length);

// clean up
writer.Flush();
writer.Close();
writer = null;
fsw.Close();
fsw.Dispose();
fsw = null;
reader.Close();
reader = null;
fsr.Close();
fsr.Dispose();
fsr = null;

Later, after sending the email, I do:

mailMessage.Dispose();
mailMessage = null;
foreach (string fileName in attachments)
{
    if (File.Exists(fileName))
        File.Delete(fileName);
}
The error occurs at the File.Delete(fileName); line.
How can I delete or reopen these files after sending the email?
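An Attachment created from a file path keeps that file open until it is disposed, and disposing the owning MailMessage disposes its attachments - but a handle will linger if any other message still references the same files. A hedged sketch of the send-then-delete sequence with deterministic disposal (addresses and host are assumptions):
Code:
using System.IO;
using System.Net.Mail;

public static void SendWithAttachments(string[] attachments)
{
    using (MailMessage mailMessage = new MailMessage(
        "from@example.com", "to@example.com")) // assumed addresses
    {
        foreach (string fileName in attachments)
            mailMessage.Attachments.Add(new Attachment(fileName));

        new SmtpClient("mail.example.com").Send(mailMessage); // assumed host
    } // Dispose runs here, releasing every attachment's file handle.

    foreach (string fileName in attachments)
    {
        if (File.Exists(fileName))
            File.Delete(fileName); // no longer locked by the attachments
    }
}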
I have an ASP.NET page that will be doing some processing that may take a very long time to complete. I cannot just set the page timeout value, since this is going to be in a hosted environment and the timeout values I set in my web.config are overridden by the server host. What I'm doing is taking a file from a FileUpload control and making web requests to a 3rd-party service for each line in the file, all of which may take a very long time for a large file. I'm talking on the order of, say, 30 minutes, and there's just no way to optimize this any further to cut down on the processing time.

Is it possible to even do such a lengthy page request in ASP.NET? Can someone give me a pointer in the right direction here? Is my only hope to create an async page? An async page seems to be the way to go if I have the potential for a lot of lengthy requests, but really a request this massive will happen VERY rarely, so this is not a question of running out of the thread pool - most of the time this particular request will complete relatively quickly, but on occasion it may receive a very large file and take a very long time. So what is the best way to handle that case?

I'd also like to update the client with the processing status as the processing goes on. I'm familiar with doing client AJAX calls via jQuery to a page WebMethod, so if there is some clean way to update the client as this long processing...
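One pattern that fits both halves of the question - offered as a hedged sketch, not a drop-in: run the work on a background thread, keep its status in a static dictionary keyed by a job id, and let jQuery poll a page method for that status. (Caveat: a worker process recycle kills the background thread, so anything critical should also be persisted.) All names are illustrative:
Code:
using System;
using System.Collections.Generic;
using System.Threading;
using System.Web.Services;

public partial class LongProcess : System.Web.UI.Page
{
    private static readonly Dictionary<string, string> Jobs =
        new Dictionary<string, string>();

    protected void StartButton_Click(object sender, EventArgs e)
    {
        string jobId = Guid.NewGuid().ToString("N");
        lock (Jobs) Jobs[jobId] = "starting";
        ThreadPool.QueueUserWorkItem(delegate
        {
            // Process each line of the uploaded file here, updating
            // Jobs[jobId] as the work proceeds.
            lock (Jobs) Jobs[jobId] = "done";
        });
        // Hand jobId to the page so the client script can poll with it.
    }

    [WebMethod]
    public static string GetStatus(string jobId)
    {
        lock (Jobs)
        {
            string status;
            return Jobs.TryGetValue(jobId, out status) ? status : "unknown";
        }
    }
}
The client script would call PageMethods.GetStatus(jobId) (page methods enabled via a ScriptManager) every few seconds and stop once the status comes back as "done".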
I have .NET infrastructure code running both within the IIS worker process and within a desktop client app. How can the .NET code determine whether it is running within an IIS worker process?
I know that I could check the name of the process (w3wp.exe, for instance), but I would like a more robust approach. I wish to make a side note. This is not a production need. I need this information to enable certain scenarios useful during the development and testing phase. Specifically to ease the testing of secure vs non secure configurations.
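One approach that avoids process-name sniffing, offered as a suggestion: ASP.NET sets HostingEnvironment.IsHosted to true whenever code runs inside its hosting environment (which includes the IIS worker process), and it remains false in a plain desktop process.
Code:
using System.Web.Hosting;

public static class RuntimeContext
{
    // True under ASP.NET hosting (e.g. w3wp.exe), false in the client app.
    public static bool IsRunningUnderAspNet
    {
        get { return HostingEnvironment.IsHosted; }
    }
}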
When I hit the run button (in my Default.aspx), a process starts (this process contacts a web service to get some files, etc). How do I: 1) ensure that only a single process is running at a time (i.e. if I refresh the browser, I don't want to start the process a second time), and 2) track progress - there are 4 points in the process (at 25%, 50%, 75%, 100%) that I want to track, and when each part completes I want to update the progress bar? I have a status object for the running process, but the question is how to update the progress bar automatically. Do I need to use threads to achieve the above two?
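A hedged sketch of the single-instance half: an Interlocked-guarded flag means a browser refresh cannot start the work twice, and the four checkpoints write to a field the page can poll (via a timer or AJAX call) to move the progress bar. Names are illustrative:
Code:
using System.Threading;

public static class FileFetchJob
{
    private static int running;           // 0 = idle, 1 = in progress
    public static volatile int Percent;   // 0, 25, 50, 75 or 100

    public static bool TryStart()
    {
        // Only the first caller flips 0 -> 1; everyone else is refused.
        if (Interlocked.CompareExchange(ref running, 1, 0) != 0)
            return false;

        ThreadPool.QueueUserWorkItem(delegate
        {
            try
            {
                Percent = 25;  // contacted the web service
                Percent = 50;  // files downloaded
                Percent = 75;  // files processed
                Percent = 100; // finished
            }
            finally
            {
                Interlocked.Exchange(ref running, 0);
            }
        });
        return true;
    }
}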
My code creates a log file and writes to it when a new user browses the site. To test it, I put in a 6-second delay and then used another browser to access the page, and it threw an exception saying the file is being used by another process - which is true. So how can I set it up so that, if the file IS being used by another process, it WAITS and retries every 500 milliseconds until it becomes free/available?
Here's the code:
protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack) // if this is the first time the page loads, set k to 1
    {
        lognewuser();
    }
}
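A minimal retry sketch for lognewuser(), assuming the log lives under App_Data and that a bounded wait is acceptable: catch the sharing violation, sleep 500 ms and try again.
Code:
using System;
using System.IO;
using System.Threading;

private void lognewuser()
{
    const int maxAttempts = 20; // give up after about 10 seconds
    string path = Server.MapPath("~/App_Data/newusers.log"); // assumed path

    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            using (StreamWriter writer = new StreamWriter(path, true))
            {
                writer.WriteLine("{0:u} new user from {1}",
                    DateTime.UtcNow, Request.UserHostAddress);
            }
            return; // wrote successfully
        }
        catch (IOException)
        {
            // Locked by another request: wait and retry.
            Thread.Sleep(500);
        }
    }
}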
I have a text file which is used to track all the IP addresses available in the network, and I need to replace content like "Reply from 172.29.116.3: bytes=32 time=1ms TTL=255" with just 172.29.116.3.
For this task I have 2 functions:
1. runCMD() is used to create the file by pinging all the IP addresses between 3 and 254 (producing lines like "Reply from 172.29.116.3: bytes=32 time=1ms TTL=255").
2. textFileReplace() is used to replace the text "Reply from 172.29.116.3: bytes=32 time=1ms TTL=255" with 172.29.116.3.
This process repeats every 30 minutes.
But I am getting an error when textFileReplace() runs: The process cannot access the file 'C:\inetpub' because it is being used by another process.
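A hedged guess at the cause: whatever handle runCMD() opened to write the ping output is still open when textFileReplace() runs. Wrapping both the read and the write in using blocks guarantees the handle is released between steps; the method shape and regex are assumptions:
Code:
using System.IO;
using System.Text.RegularExpressions;

static void TextFileReplace(string path)
{
    string contents;
    using (StreamReader reader = new StreamReader(path))
    {
        contents = reader.ReadToEnd();
    } // reader closed here - the file is free again

    // Keep only the IP address from each "Reply from ..." line.
    contents = Regex.Replace(contents,
        @"Reply from (\d{1,3}(?:\.\d{1,3}){3}):.*", "$1");

    using (StreamWriter writer = new StreamWriter(path, false))
    {
        writer.Write(contents);
    } // writer closed here, before the next 30-minute cycle
}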
I have a stored procedure which fetches data after joining 8-9 tables and inserts it into a temp table. It was running fine until now, but since the amount of data fetched exceeded 20,000 rows, the SP has been breaking. I have debugged the SP and found that the main query is failing after returning around 15,000-16,000 records.
The error message says: Transaction (Process ID) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction. I know what a deadlock is, but when I checked with the sp_lock stored proc I only found my own process running against the database. So how is it getting deadlocked when there are no other processes running simultaneously on the server?
When I delete all files from the directory path, it gives me this error: The process cannot access the file because it is being used by another process. How do I solve this?
I am getting binary data from a SQL Server database field and am creating a document locally in a directory my application has permissions to. However, I am still getting the error specified in the title. I have tried numerous suggestions posted on the web, including those suggested in previous posts on Stack Overflow. I have also used Process Explorer > Find Handle to locate the lock, and it returns nothing, as if the file is not locked.
I am using the code below to save the file to the file system, and I then try to copy this file to a new location later in the application process, within another method. It is this copy method, which takes the path of the newly created file, that throws the exception.
The file itself is created with its content, and I can open it through Windows Explorer without any problems.
Am I missing something completely obvious? Am I creating the file correctly from the database?
// Get file from DB
FileStream fs = new FileStream(@"C:\myTempDirectory\myFile.doc",
    FileMode.OpenOrCreate, FileAccess.Write);
BinaryWriter br = new BinaryWriter(fs);
br.Write("BinaryDataFromDB"); // placeholder for the byte[] read from the DB field
br.Flush();  // flush the writer, not just the stream, before closing
br.Close();  // closing the writer also closes and releases the stream

// Copy file
File.Copy(sourceFileName, destinationFilename, true);