Architecture :: Running A Time-Consuming Process Behind The Scenes?
Dec 23, 2010
My web application requires users to upload files, after which each file goes through "further processing". This processing is VERY time consuming, and it can take a long while before control returns to the user. I would like to run it in the background rather than make the user wait for it to complete.
I know this question has been asked in this very forum before, but I wasn't able to understand the answers or proceed in the right direction. My understanding is that there are a few ways I can go about this:
a) create a BackgroundWorker in my Global.asax file that will spawn a worker thread and take care of the activity (a sketch of this option follows below).
b) create a web service that will do the processing for me .. (how?)
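A minimal sketch of option (a), kept in the page's code-behind rather than Global.asax. FileUpload1, UploadReceived.aspx, and ProcessFile are placeholder names, not anything from the original post. One caveat: work queued this way is lost if the application pool recycles mid-run.

using System;
using System.IO;
using System.Threading;

public partial class UploadPage : System.Web.UI.Page
{
    protected void UploadButton_Click(object sender, EventArgs e)
    {
        // Save the upload first so the request can return immediately.
        string savedPath = Server.MapPath("~/App_Data/" + Path.GetFileName(FileUpload1.FileName));
        FileUpload1.SaveAs(savedPath);

        // Hand the heavy work to a pool thread. Capture everything you need
        // from HttpContext here -- it is not available on the worker thread.
        ThreadPool.QueueUserWorkItem(state => ProcessFile((string)state), savedPath);

        Response.Redirect("UploadReceived.aspx");
    }

    // Hypothetical long-running routine.
    private static void ProcessFile(string path)
    {
        // ... expensive work here ...
    }
}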
I have worked on many localized sites in the past, mainly using standard patterns and techniques involving global/local RESX files, plus session state to maintain the currently selected language.
However, this is quite difficult to maintain from a translator's point of view. We used a small tool that converts RESX to Excel and the localized Excel back to RESX, but for some reason this technique doesn't work properly anymore...
Are there better and more reliable tools (paid or free) that can be used to localize RESX files and give the translation team a better/easier way to localize content?
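If the old converter can't be resurrected, the round trip is simple enough to rebuild in-house. A minimal sketch of the RESX-to-spreadsheet half, exporting to CSV (which Excel opens directly) with the ResXResourceReader from System.Windows.Forms.dll; file names come from the command line and are assumptions:

using System;
using System.IO;
using System.Resources;   // ResXResourceReader lives in System.Windows.Forms.dll

class ResxToCsv
{
    static void Main(string[] args)
    {
        // Dump every key/value pair from a .resx into a CSV a translator can edit.
        using (var reader = new ResXResourceReader(args[0]))
        using (var writer = new StreamWriter(args[1]))
        {
            foreach (System.Collections.DictionaryEntry entry in reader)
            {
                // Quote values so embedded commas and quotes survive the round trip.
                writer.WriteLine("\"{0}\",\"{1}\"", entry.Key,
                    Convert.ToString(entry.Value).Replace("\"", "\"\""));
            }
        }
    }
}

The reverse direction is the mirror image with ResXResourceWriter.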
I want to be able to run code in an ASP.NET application when a file is uploaded via FTP. I understand how to use the FileSystemWatcher class to monitor the directory, but I don't know where to put the code so it is always running.
Files will be uploaded once per month for each group of users of the system, and the next time a user logs on after the upload I would like the application to reflect the new data. I don't want to check for a new upload when each user logs on, because the processing could take a long time.
The uploads are automatic from multiple other computer systems and not done by any user.
The Application is hosted (by GoDaddy) so I don't have full control of the server.
Is there any way to have a process running all the time in the background, and if so, how? The application is written in VB.NET.
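On shared hosting with no access to services or the Task Scheduler, about the only in-process option is a watcher started in Application_Start. A sketch (in C#, though the same shape works in VB.NET), assuming an App_Data/ftp drop folder and a hypothetical ImportFile routine:

using System;
using System.IO;
using System.Web;
using System.Web.Hosting;

public class Global : HttpApplication
{
    private static FileSystemWatcher _watcher;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Watch the FTP drop folder; ImportFile fires for each new upload.
        _watcher = new FileSystemWatcher(HostingEnvironment.MapPath("~/App_Data/ftp"));
        _watcher.Created += (s, args) => ImportFile(args.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    // Hypothetical import: parse the file and refresh the users' data.
    private static void ImportFile(string path) { }
}

Be aware the watcher only exists while the application is alive: it stops on every app-pool recycle and doesn't restart until the first request afterward. Given the monthly cadence, a sturdier fallback is a "last processed" timestamp checked once per application start.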
I know that similar questions have been asked all over the place, but I'm having trouble finding one that relates directly to what I'm after.
I have a website where a user uploads a data file, and then that file is transformed and imported into SQL. The file can be up to 50 MB, and sometimes this process takes 30 minutes or even longer.
I realise I need to palm off the actual work to another process and poll that process from the web page. I'm wondering what the best approach would be, though. Being a web developer by trade, I'm finding all this Windows Service stuff a bit confusing, and I just want somewhere to start.
So:
Can I do / should I be doing this with a Windows service? If so, how?
Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
Clarifications:
The data is imported into SQL; there's no file distribution taking place.
If there is a failure, it absolutely MUST be reported to the user. The web page will poll every, let's say, 5 seconds from the time the async task begins to get the 'status' of the import. Once it's finished, another response will tell the page to stop polling for status updates.
Queries on final decision:
OK, so as I thought, it seems that a Windows service is the best idea. As to HOW to get it to work: the 'put the file there and wait for the service to pick it up' approach seems to be the generally accepted way. Is there a way I can start a process run by the service without it constantly having to check a database table or folder? As I said earlier, I don't have any experience with Windows services. If I put a public method in the service, can I call it somehow? (A WCF sketch follows below.)
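Yes: instead of a drop folder, the service can expose an endpoint the site calls directly. A sketch of a Windows service self-hosting a WCF contract; the contract name, port, and RunImport body are all placeholders:

using System;
using System.ServiceModel;
using System.ServiceProcess;

// Contract shared between the web app and the service.
[ServiceContract]
public interface IImportService
{
    [OperationContract]
    void BeginImport(string filePath);   // returns immediately; work runs in the service
}

public class ImportService : IImportService
{
    public void BeginImport(string filePath)
    {
        System.Threading.ThreadPool.QueueUserWorkItem(_ => RunImport(filePath));
    }

    private static void RunImport(string filePath) { /* transform + SQL import */ }
}

// The Windows service just hosts the WCF endpoint.
public class ImportWindowsService : ServiceBase
{
    private ServiceHost _host;

    protected override void OnStart(string[] args)
    {
        _host = new ServiceHost(typeof(ImportService),
            new Uri("net.tcp://localhost:8523/import"));
        _host.AddServiceEndpoint(typeof(IImportService), new NetTcpBinding(), string.Empty);
        _host.Open();
    }

    protected override void OnStop()
    {
        if (_host != null) _host.Close();
    }
}

The call returns as soon as the work is queued inside the service, and the web page can keep polling a status table that the service updates, which also covers the "failure MUST be reported" requirement.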
Here's what I want to do; however, I'm not sure of the best approach for Excel. Excel files are sent via e-mail from various sources. Each source sends them in a different version of Excel and with a different information layout, though files from any one source are consistent.
Overview.
I want to give the user a front end that lets them place the Excel files into a drop zone and specify which files have come from which source. I then want to pass the file names and source of origin to a workflow service. Based on this info, the workflow service will take 1..n workflow routes depending on the source, with the ultimate aim that the data is placed into a SQL Server 2005 DB, or, if the Excel lines cannot be processed, output for the user to see with a reason. Because this is quite a complex task, I'm taking each source in turn.
Initial Problem.
My first set of Excel files comes in 4 separate files. I need to first merge the data in the Excel sheets to make one file for processing. Some of the rows in the spreadsheets can be merged into a single line under certain conditions, which will make the business logic much easier further on. So, first off, after the user has dropped the Excel files into the drop zone, I need to open the spreadsheets and merge them. I then need to feed each line to a 'rules' engine to determine the data and business logic that needs to be processed before updating the DB.
I thought I'd use a workflow service for this, since there will be no reason not to work on multiple sources at the same time as I progress through the development of the program. The problem I have is that unless I download the CP1 for workflow, I cannot use it directly (so it appears) for working with Excel. So I was looking at possibly using WCF as the 'feed' to the workflow service. Can I use WCF to do the possibly long-running process of merging the Excel spreadsheets before passing a row at a time down to the workflow service?
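The merge itself needs neither Workflow nor WCF: plain ADO.NET can read the sheets into one table before rows are fed to the rules engine. A sketch, assuming .xls workbooks whose first sheet is named Sheet1 and whose columns line up across files (for .xlsx you would swap in the ACE OLE DB provider):

using System.Data;
using System.Data.OleDb;

static class ExcelMerger
{
    // Read the first sheet of each workbook into one DataTable.
    public static DataTable Merge(params string[] workbookPaths)
    {
        var merged = new DataTable();
        foreach (string path in workbookPaths)
        {
            string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + path +
                             ";Extended Properties=\"Excel 8.0;HDR=Yes\"";
            using (var adapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", connStr))
            {
                adapter.Fill(merged);   // appends rows; columns must match across files
            }
        }
        return merged;
    }
}

With the data in a DataTable, the conditional row-combining and the per-row hand-off to the rules engine become straightforward loops, wherever they end up hosted.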
I need a reliable way to run 10 different tasks simultaneously; for instance, the first one would be sending emails while the next one is cleaning rows from a specific table, and so on and so forth.
I've used the Thread class, and while it works well on my development machine (VS2010 internal web server), none of these threads seems to be working at all on my production server. And I don't know of an effective way to debug the problem on the production server.
I saw a technique that encourages you to register cache objects: since the application fires a callback when a cached item expires, it's possible to run arbitrary code to mimic threading behavior (sketched below). It seems a little Mickey Mouse to me.
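For reference, the cache-expiration trick looks like the following; DoWork and the 60-second interval are placeholders. It does work, but it dies with the app domain, so a Windows service or scheduled task remains the reliable answer for recurring jobs:

using System;
using System.Web;
using System.Web.Caching;

public static class BackgroundTask
{
    private static CacheItemRemovedCallback _onRemove;

    public static void Start()
    {
        _onRemove = CacheItemRemoved;
        // A dummy item that expires every 60 seconds; its removal callback
        // is the "tick" that runs our work.
        HttpRuntime.Cache.Insert("task-timer", 0, null,
            DateTime.UtcNow.AddSeconds(60), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, _onRemove);
    }

    private static void CacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        try { DoWork(); }          // hypothetical task body
        finally { Start(); }       // re-register so the cycle continues
    }

    private static void DoWork() { /* send emails, clean tables, ... */ }
}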
I have an aspx page on which I create reports and charts on the fly. Creating these charts and reports takes a lot of time, during which a blank screen is shown to the user until creation completes.
Can I unlink the actual report and chart creation code from page load, so that I can show a "processing" text and then show the generated chart or report once it is ready?
EDIT :-
I would want to do something like this:
On the first request, trigger the report or chart creation and register a callback for completion. The client can then poll the server every 2-3 seconds to check whether the report creation is complete.
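A sketch of that shape, using the ASP.NET cache as the status store; BuildChart is a stand-in for the real report code. The page's JavaScript would poll a lightweight handler that returns Status(ticket) every 2-3 seconds and swap in the result once it reads "done":

using System;
using System.Threading;
using System.Web;

public static class ReportJobs
{
    // Kick off report creation and hand back a ticket the page can poll with.
    public static string Start()
    {
        string ticket = Guid.NewGuid().ToString("N");
        HttpRuntime.Cache[ticket] = "running";
        ThreadPool.QueueUserWorkItem(_ =>
        {
            byte[] chart = BuildChart();                   // hypothetical slow work
            HttpRuntime.Cache[ticket + ":result"] = chart; // picked up by an image handler
            HttpRuntime.Cache[ticket] = "done";
        });
        return ticket;
    }

    public static string Status(string ticket)
    {
        return (string)HttpRuntime.Cache[ticket] ?? "unknown";
    }

    private static byte[] BuildChart() { /* render report/chart */ return new byte[0]; }
}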
I have a function which performs a series of time-consuming operations, including querying a large database, customising an Excel file, and sending an email with a 5 MB attachment.
I would like to execute this function in the background when a button is clicked and immediately redirect the user to another aspx page. The user should be free to browse to other pages, or even close the browser, while the background operation is still running on the server. I have tried to implement threading but could not get it to work: the email with the attachment does not get sent, even though there are no errors.
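"No errors" is the telltale sign: exceptions thrown on a background thread are swallowed silently, and HttpContext (Session, Server.MapPath) is gone by the time the thread runs. A sketch that captures everything context-dependent up front and logs failures; CurrentUserEmail, Log, and the file/page names are placeholders:

using System;
using System.Net.Mail;
using System.Threading;

public partial class ReportPage : System.Web.UI.Page
{
    protected void SendButton_Click(object sender, EventArgs e)
    {
        // Resolve everything that needs HttpContext *before* starting the thread.
        string attachmentPath = Server.MapPath("~/App_Data/report.xls");
        string recipient = CurrentUserEmail();   // hypothetical helper

        var worker = new Thread(() =>
        {
            try
            {
                using (var msg = new MailMessage("noreply@example.com", recipient))
                {
                    msg.Subject = "Your report";
                    msg.Attachments.Add(new Attachment(attachmentPath));
                    new SmtpClient().Send(msg);   // SMTP settings from web.config
                }
            }
            catch (Exception ex)
            {
                Log(ex);   // hypothetical logger; don't let failures disappear
            }
        });
        worker.IsBackground = true;
        worker.Start();

        Response.Redirect("Done.aspx", false);
    }

    private static string CurrentUserEmail() { return "user@example.com"; }
    private static void Log(Exception ex) { /* write to file or event log */ }
}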
I need to get a list of all the running ASP.NET worker processes along with their associated application pools and process IDs. Is there a way to do this programmatically (C#), or with a PowerShell script?
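One programmatic route: each ASP.NET worker process is a w3wp.exe, and IIS passes the application pool name on its command line (the -ap switch), so WMI yields PID and pool together. On IIS 7 the command "%windir%\system32\inetsrv\appcmd list wp" prints the same mapping. A C# sketch (add a reference to System.Management.dll):

using System;
using System.Management;

class ListWorkerProcesses
{
    static void Main()
    {
        // The -ap "PoolName" switch in the command line identifies the app pool.
        var query = new ManagementObjectSearcher(
            "SELECT ProcessId, CommandLine FROM Win32_Process WHERE Name = 'w3wp.exe'");
        foreach (ManagementObject proc in query.Get())
        {
            Console.WriteLine("PID {0}: {1}", proc["ProcessId"], proc["CommandLine"]);
        }
    }
}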
Is it possible to kill the currently running process in order to run a new process in ASP.NET? The problem I have is that while a report is running, the user may click on another operation, but the report process has to finish before another process can run. So, is it possible to kill the current process?
In an ASP.NET Web Forms page, I keep getting a connection reset error message. The page does some long-running processing (about 2-5 minutes).
I have no problem when the web request comes from the same machine as the web server. But when the request originates across the network, I get a connection reset error about 1:30 or 2 minutes into waiting for a response.
I have set the timeout in web.config for this application and put the application in its own application pool.
What else can I try?
Edit
The purpose of this page is to accept input from the user, calculate something, and send the result back to them. The long-running calculation isn't something I can offload until a later time.
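The ~110-second mark is suspicious: that is the default executionTimeout for ASP.NET requests, and it is only enforced when compilation debug="false", which would explain why local debug-mode requests survive. If that is the setting meant above, raising it looks like the snippet below, though handing the calculation to a background thread and polling for the result remains the sturdier fix:

<system.web>
  <!-- Default is 110 seconds; enforced only when compilation debug="false". -->
  <httpRuntime executionTimeout="600" />
</system.web>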
We need assistance with a portion of a major web application project scheduled to begin user testing within 10 days. The troublesome part of the code calls an external application when a button is clicked in a Web page. When run from a command prompt or bat file, the external application, App.exe, runs perfectly, producing the desired result. We created a test page and a small console application, Con.exe, to test the button code with a known quantity. The test application runs perfectly when the button in the Web page is clicked. However, when we change the code to run App.exe instead of Con.exe, App.exe will not execute. We even tried creating a bat file to execute App.exe. Like the exe, the bat file executes from the command prompt or when double-clicked, but fails to run when called as a process from the button click event. Can anyone assist us? Here is the C# code: (.NET 3.5 SP1)
using System;
using System.Collections;
using System.Configuration;
using System.Data;
using System.Linq;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.HtmlControls;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Xml.Linq;
using System.Diagnostics;

namespace WebButtonTest
{
    public partial class _Default : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
        }

        protected void Timer1_Tick(object sender, EventArgs e)
        {
            int checkNumber = 0;
            if (Session["checkNumber"] != null)
            {
                // ... (remainder of the posted code was truncated)
            }
        }
    }
}
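The usual culprit in this situation is the worker-process identity: the exe runs fine for an interactive user but lacks permissions, a profile, or its expected working directory under the app pool account. Capturing the exe's output and exit code usually reveals the reason; a sketch, with paths as placeholders:

using System.Diagnostics;

// Launch App.exe and surface its output/exit code so failures are visible.
var psi = new ProcessStartInfo(@"C:\Tools\App.exe")
{
    UseShellExecute = false,        // required to redirect the streams
    RedirectStandardOutput = true,
    RedirectStandardError = true,
    CreateNoWindow = true,
    WorkingDirectory = @"C:\Tools"  // many exes assume their own folder
};
using (Process p = Process.Start(psi))
{
    // Fine for a short-lived exe; a chatty one needs async stream reads.
    string stdout = p.StandardOutput.ReadToEnd();
    string stderr = p.StandardError.ReadToEnd();
    p.WaitForExit();
    // Log stdout, stderr and p.ExitCode -- this usually shows why the exe
    // "does nothing" under the worker-process identity.
}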
Note: I don't want to use the UpdateProgress etc. controls of AJAX.
On a button click, a long task (e.g. a thread) runs in my web page for about 4-5 minutes. I want to show status to the user, either via a processing image shown through JavaScript (the image must appear in one part of the page while the rest of the page remains intact), or the exact status of the process if possible. I have tried a lot, but all in vain.
Situation: I have an ASP.NET application that will search through docs using Lucene. I want to run the initial indexing (the index will be incremental after the initial run, so there won't be a need to index the whole directory again in future). Currently, I have about 5 GB of docs (45,000 files). Problem: My application times out before completing the process. I have altered the timeout like this: HttpContext.Current.Server.ScriptTimeout = 200000; but it still does not complete the process.
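Rather than raising ScriptTimeout and holding one request hostage for all 45,000 files, a one-off initial index can run on a background thread while any page reports the counter. A sketch, with AddToLuceneIndex standing in for the actual IndexWriter code:

using System.IO;
using System.Threading;

public static class Indexer
{
    public static volatile int FilesIndexed;
    public static volatile bool Running;

    public static void StartInitialIndex(string docRoot)
    {
        if (Running) return;   // only one initial pass at a time
        Running = true;
        ThreadPool.QueueUserWorkItem(_ =>
        {
            foreach (string file in Directory.GetFiles(docRoot, "*", SearchOption.AllDirectories))
            {
                AddToLuceneIndex(file);   // hypothetical wrapper around the IndexWriter
                FilesIndexed++;
            }
            Running = false;
        });
    }

    private static void AddToLuceneIndex(string file) { /* ... */ }
}

For a strictly one-time job, a console app run overnight avoids the web process entirely and is even simpler.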
I am not sure exactly which topic this post should go under...
Here is what I am doing.
I have a web form where a person will edit their blog article. At some point, once they are done editing, they can click a button "Publish Blog Now".
Once the blog is published, in the click event on the server side I run a query to get a list of subscriber email addresses.
These are people who subscribed to this blogger and wish to receive an email notification whenever this person publishes a new blog.
What I just realized today is that my hosting provider only allows me to send a maximum of 200 emails per hour, which means in my loop I need to sleep for roughly 20 seconds between each email notification sent. But I don't want the user who clicked the publish button to have to sit there and wait while that process runs.
How can I return to the user, yet continue to run code on the server side to send out the emails in the background, even if the user closes the web browser?
This is an ASP.NET web application targeting .NET 4.0; I am using C# as my back-end language and VS2010 as my development tool.
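Since this is .NET 4.0, a long-running Task that throttles itself keeps the publish click instant. The sender address and 20-second spacing below are assumptions; note that an app-pool recycle drops whatever is still queued, so persisting the pending sends to a table and draining it on a schedule is the safer long-term design:

using System;
using System.Collections.Generic;
using System.Net.Mail;
using System.Threading;
using System.Threading.Tasks;

public static class NotificationSender
{
    // Fire-and-forget: the click handler returns immediately while emails
    // trickle out 20 s apart (~180/hour) to stay under the 200/hour cap.
    public static void QueueNotifications(IEnumerable<string> recipients, string subject, string body)
    {
        var list = new List<string>(recipients);   // snapshot before the request ends
        Task.Factory.StartNew(() =>
        {
            var smtp = new SmtpClient();            // settings from web.config
            foreach (string to in list)
            {
                try { smtp.Send("blog@example.com", to, subject, body); }
                catch (SmtpException) { /* log and continue with the rest */ }
                Thread.Sleep(TimeSpan.FromSeconds(20));
            }
        }, TaskCreationOptions.LongRunning);
    }
}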
I've been having a difficult time with this. I have an asp.net page. The user hits the "Run" button, and code IN AN ASSEMBLY (not in the APP_CODE folder) is called and runs a long process that moves product info from a file into the database. While the user waits, I would like them to see status updates, such as which product the import process is on. I'm assuming I'd break off into another thread and use Ajax, but I have no idea how to do this.
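One way to keep the assembly free of web references is to let the page hand it a callback; the web side stores the latest message somewhere the Ajax poll can read (the cache-ticket shape from the report sketch earlier works unchanged). A sketch, with the import loop simplified:

using System;
using System.IO;

// Lives in the assembly -- no System.Web reference needed.
public class ProductImporter
{
    // The web layer supplies the callback; the importer reports through it.
    public void Import(string filePath, Action<string> reportProgress)
    {
        foreach (string line in File.ReadAllLines(filePath))
        {
            // ... insert/update the product in the database ...
            reportProgress("Importing: " + line);
        }
    }
}

The page starts Import on a background thread, passing a callback that writes each message into the cache under a ticket key, and the Ajax poll simply reads that key.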
Wondering if there is a performance difference between letting a long-running process hang in ASP.NET vs. running it via a Windows service. I have done this once before, and the Windows service was much quicker and didn't bog down my system, whereas the ASP.NET request seemed to wreak havoc.
I have a solution that was originally created using VS 2008. I have quite a few unit tests, all of which passed when run in VS 2008.
Yesterday I opened this solution in VS 2010 and converted it using the wizard. The solution built just fine, it runs just fine, BUT now all tests that access the db fail. I get the same error on every test (below).
"The runtime has encountered a fatal error. The address of the error was at 0x5b3b5ad7, on thread 0x19a4. The error code is 0xc0000005. This error may be a bug in the CLR or in the unsafe or non-verifiable portions of user code. Common sources of this bug include user marshaling errors for COM-interop or PInvoke, which may corrupt the stack."
In test classes that use ClassInitialize() and ClassCleanup() methods, all the tests fail. All tests fail with the same error, "The agent process was stopped while the test was running." Tests in test classes that do not utilize ClassInitialize() and ClassCleanup() (these tests do not hit the database) all pass.
When this was a VS2008 project all 153 tests passed. No changes have been made to these tests, other than converting the project to a VS2010 project.
I've been playing around with one test class. I commented out the ClassInitialize() and ClassCleanup() methods. I rewrote one test method and mocked everything. I tried running just that one test. I still got the "The agent process was stopped while the test was running" error.
Visual Studio 2010, version 10.0.30319.1 RTMRel
Test framework: integrated VS 2010 MS Test
Mocking framework: not using mocking. I have a db that is destroyed and recreated in MyClassInitialize() and populated with known values. Yes, I know I should be using mocking, but this is an older app and it has not been updated with all of the new bells and whistles yet.
DB Framework: NHibernate
The target framework is 3.5 for all projects in this solution except for the test project, which targets 4.0. I tried changing it to 3.5, but it will not let me:
"Attempted re-targeting of the project has been canceled. You cannot change the specified .NET framework or profile for a test project."
I need to update a field in the database every X minutes; i.e. a person logs in, and I need to update a field related to them every X minutes until it reaches a target value. In this example, the person starts a count event from 1 to 10; they may log off the web, but the count must keep going, one step every 7 minutes, until it reaches 10. I cannot use a SQL job. Should I use a System.Timer? Or should I record the time and value of the last update, and when the person logs in, calculate and update the value? (A sketch of the latter follows below.)
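The "record the time and calculate on demand" idea is the robust one here: no timer has to survive logouts or application restarts. A sketch of the derivation, using the 7-minute step and cap of 10 from the example:

using System;

public static class Counter
{
    // Store the start time (and starting value) when the count begins; any
    // later read derives the current value instead of relying on a timer.
    public static int CurrentValue(DateTime startedUtc, int startValue)
    {
        int steps = (int)((DateTime.UtcNow - startedUtc).TotalMinutes / 7);
        return Math.Min(10, startValue + steps);
    }
}

On login (or on any read), compute CurrentValue and write it back if it changed; once it returns 10, the row is final.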
Coming to my task: I have to schedule a process that will delete all the files in a particular table with a particular key. I have the stored procedure for the deletion. All I need is this: how do I schedule this process on the application server?
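If Windows Task Scheduler and SQL Agent are both unavailable on the application server, a System.Threading.Timer started from Application_Start can call the proc on an interval. The proc name, parameter, and 24-hour interval below are placeholders, and the usual caveat applies: the timer dies on app-pool recycles, so a scheduled task is preferable where possible.

using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

public static class CleanupScheduler
{
    private static Timer _timer;

    // Call once from Application_Start.
    public static void Start(string connectionString)
    {
        _timer = new Timer(_ =>
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.DeleteFilesByKey", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@Key", GetKeyToDelete());   // hypothetical
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }, null, TimeSpan.Zero, TimeSpan.FromHours(24));
    }

    private static int GetKeyToDelete() { return 0; /* placeholder lookup */ }
}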
I am in the process of creating an audit trail system. Simple enough: certain fields require an audit trail. What is the best design concept to allow this to work in multiple applications without having to change much? I would of course leave it to the admin of the site to choose which fields should be audited, but the logic is the problem. Would implementing the IComparable interface be a place to start? My initial thinking is to compare two arrays against each other and insert whichever fields differ: array1 holds the fields from the form, array2 holds the fields from the SQL table that require auditing.
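IComparable is really for ordering single objects; for "which audited fields changed", keyed dictionaries are a closer fit than two parallel arrays, since they cannot drift out of index alignment and the admin's audited-field list can drive the loop directly. A sketch:

using System.Collections.Generic;

public static class AuditComparer
{
    // Compare submitted values against the stored row and return the audited
    // fields that actually changed, ready to insert into the audit trail.
    public static List<string> ChangedFields(
        IDictionary<string, object> formValues,
        IDictionary<string, object> dbValues,
        IEnumerable<string> auditedFields)
    {
        var changed = new List<string>();
        foreach (string field in auditedFields)
        {
            object oldValue, newValue;
            dbValues.TryGetValue(field, out oldValue);
            formValues.TryGetValue(field, out newValue);
            if (!Equals(oldValue, newValue))
                changed.Add(field);
        }
        return changed;
    }
}

Because the comparison is driven by field names rather than positions, the same routine works across applications: only the admin-maintained list of audited fields changes.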