Databases :: Losing Connection When Running Long Running Oracle Procedure?
Jul 1, 2010
I am executing a long-running Oracle stored procedure from .NET. The procedure takes about three hours to run. Ideally, the user should be able to kick off the procedure, close the browser, and come back later to check the results.
The problem is that the connection to the Oracle procedure is lost after exactly an hour. As you would expect, the Oracle procedure runs to completion if it is executed from SQL*Plus. Strangely enough, it will also run to completion if I run it in debug mode on my local machine (I start two threads, one of which executes the procedure; I set a breakpoint on the second thread).
Here is my connection string:
data source= (DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=serverx)(PORT=1521)))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=TestSID)))
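For reference, a minimal sketch of how such a procedure is invoked from .NET with ODP.NET (the procedure name and credentials are placeholders; System.Data.OracleClient would look much the same). With CommandTimeout at 0 there is no client-side limit, which is worth ruling out when a connection dies at a suspiciously round interval like one hour:

using System;
using System.Data;
using Oracle.DataAccess.Client; // ODP.NET

class LongProcRunner
{
    static void Main()
    {
        // Connection string as above; user id, password and procedure name are placeholders.
        string connStr = "User Id=scott;Password=tiger;" +
            "Data Source=(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=serverx)(PORT=1521)))" +
            "(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=TestSID)))";

        using (OracleConnection conn = new OracleConnection(connStr))
        using (OracleCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText = "MY_LONG_PROC";
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandTimeout = 0; // 0 = no client-side timeout (the ODP.NET default)
            conn.Open();
            cmd.ExecuteNonQuery(); // blocks for the full three-hour run
        }
    }
}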
In an ASP.NET web form, I keep getting a connection reset error message. The page does some long-running processing (about 2-5 minutes).
I have no problem when the web request comes from the same machine as the web server. But when the request originates across the network, I get a connection reset error about a minute and a half to two minutes into waiting for a response.
I have set the timeout in web.config for this application and put the application in its own application pool.
What else can I try?
Edit
The purpose of this page is to accept input from the user, calculate something, and send the result back to them. The long running calculation isn't something I can offload until a later time.
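For reference, the kind of web.config timeout I mean is the httpRuntime executionTimeout (in seconds); note that ASP.NET only enforces it when debug compilation is off, so a setting like this only matters in a release build (value illustrative):

<system.web>
  <compilation debug="false" />
  <httpRuntime executionTimeout="600" /> <!-- 10 minutes; illustrative value -->
</system.web>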
I have a site that calls a long-running stored procedure that eventually times out the UI. The procedure simply runs some logic in the database and kicks off a secondary process. No data is ever returned to the UI, so I don't need the UI to wait for anything. Is there a way to call the stored procedure from the UI and move on, without the UI having to wait for the stored procedure to complete?
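A minimal sketch of the fire-and-forget behavior I'm after (SqlClient and the names are placeholders; the work is queued to a pool thread and the page returns immediately):

using System.Data;
using System.Data.SqlClient;
using System.Threading;

public static class ProcLauncher
{
    // Hypothetical helper: kicks off the stored procedure on a pool thread and returns at once.
    public static void FireAndForget(string connStr, string procName)
    {
        ThreadPool.QueueUserWorkItem(delegate
        {
            try
            {
                using (SqlConnection conn = new SqlConnection(connStr))
                using (SqlCommand cmd = new SqlCommand(procName, conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.CommandTimeout = 0; // let the procedure run as long as it needs
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
            catch { /* log somewhere; there is no UI to surface this to */ }
        });
    }
}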
I've developed a web application to accept video file uploads and then pass them to a backend service on an external server. The application runs without error on the Visual Studio debugging web server, but once on a production IIS 6 or 7 server, it yields a timeout error at a consistent point into handling a large upload. Specifically, it errors in the middle of transferring the video file to the external server, once the application has successfully received it from the client. I'm aware of several timeouts to be configured related to the problem, and have done so. The application's web.config has been tested with one or both of the following settings:
<system.web>
  <httpRuntime executionTimeout="9999999" maxRequestLength="2048000" />
</system.web>

and

<configuration>
  <location path="default.aspx"> <!-- the page at issue that's timing out -->
    <system.web>
      <httpRuntime executionTimeout="9999999" maxRequestLength="2048000" />
    </system.web>
  </location>
</configuration>
And within the initialization of the webrequest made to the external server to send the video received from the client browser:
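Roughly, the initialization sets the request's timeout-related properties; a sketch (the endpoint and factory name are placeholders):

using System.Net;
using System.Threading;

static class UploadRequestFactory
{
    // Sketch only: url stands in for the external server's endpoint.
    public static HttpWebRequest Create(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.Timeout = Timeout.Infinite;        // overall request timeout
        request.ReadWriteTimeout = int.MaxValue;   // governs writes to the upload stream
        request.AllowWriteStreamBuffering = false; // stream the file instead of buffering it in memory
        return request;
    }
}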
So with the execution time limits on both the web form as a whole and the connection made to the external server, I'm at a loss as to which timeout is left unconfigured, or how to determine that, when I continue to get the following error: Unexpected error executing Brightcove Upload: ...
I have checked the server's (Win Server 2003) application event logs for the following problem [URL], which doesn't show up. It just appears that sessions drop randomly for random users.
It's a single server setup, no web farms and no load balancing
Even though the issue I point to above doesn't occur in the logs, is it worth increasing the stateNetworkTimeout attribute anyway? The configuration at the moment is simply the stock sessionState block.
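Something along these lines, for illustration (10 seconds is the stateNetworkTimeout default; all values here are illustrative):

<sessionState mode="StateServer"
              stateConnectionString="tcpip=127.0.0.1:42424"
              stateNetworkTimeout="10"
              timeout="20" />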
IIS6 settings:
Recycle worker processes (in minutes) = 120
Recycle worker processes (number of requests) = 35000
Recycle worker processes at the following times = Unchecked
Maximum virtual memory = Unchecked
Maximum used memory = Unchecked
Shutdown worker processes after being idle = 90
Limit the kernel request queue = 1500
Everything else = Unchecked
When testing this locally (on my dev machine) I see the application works a lot slower.
Is there a better way to write this code? Should I use ThreadPool.QueueUserWorkItem, or create a new thread using Thread t = new Thread(new ThreadStart(DoWork));? Would it be better to create a totally separate application for the purpose of sending the newsletters? Would that help if I run this application on the same machine?
I've seen other posts here talking about ThreadPool vs Thread, but it seems no one is sure which is better.
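For concreteness, the two options side by side (DoWork stands in for the newsletter-sending routine):

using System.Threading;

class NewsletterLauncher
{
    static void DoWork(object state)
    {
        // placeholder: send the newsletters
    }

    static void Launch()
    {
        // Option 1: queue onto the ThreadPool. Cheap, but it shares threads with
        // ASP.NET request processing, so long jobs can starve incoming requests.
        ThreadPool.QueueUserWorkItem(DoWork);

        // Option 2: a dedicated thread. Slightly more expensive to create, but it
        // doesn't compete with the request pool for a multi-minute job.
        Thread t = new Thread(new ThreadStart(delegate { DoWork(null); }));
        t.IsBackground = true;
        t.Start();
    }
}

The usual guidance: the ThreadPool is fine for short bursts, but a multi-minute send loop is safer on a dedicated thread, since ASP.NET services requests from the same pool; a separate application or service removes the pressure from the web process entirely.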
I've got an ASP.NET 4.0 application running on a 64-bit OS. IIS is configured to run in 64-bit mode (it needs to be, to support other ASP.NET applications). Is there any way to get Oracle connectivity working in this setup? Oracle doesn't provide a 64-bit client for .NET 4.0 as of this writing.
When an ASPX page needs to make a call to a potentially long-running operation (lengthy DB query, call to a remote web service, etc.), I use RegisterAsyncTask, so the IIS worker thread is returned to the pool rather than being tied up for the duration of the long-running operation. However, ASMX web services don't have a RegisterAsyncTask function. When an ASMX web service needs to call a potentially long-running operation, how can I implement the same behavior as RegisterAsyncTask? Note: the ASMX web service is implemented as a script service, returning JSON to a direct jQuery/AJAX call. Therefore, I cannot use the "BeginXXX" approach described by MSDN, since that implements the asynchronous behavior within the generated client stub (which isn't used when calling the web service directly via AJAX).

EDIT: Adding source code: I implemented the BeginXXX/EndXXX approach listed in John's answer. The synchronous "Parrot" function works fine, but the asynchronous "SlowParrot" function gives an internal server error: "Unknown web method SlowParrot".
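For reference, the shape of what I implemented, roughly (Thread.Sleep stands in for the real long-running call):

using System;
using System.Threading;
using System.Web.Services;
using System.Web.Script.Services;

[WebService(Namespace = "http://tempuri.org/")]
[ScriptService]
public class ParrotService : WebService
{
    [WebMethod]
    public string Parrot(string input)
    {
        return input; // synchronous version: works when called via ajax
    }

    private delegate string SlowWorker(string input);

    // Begin/End pair: ASMX exposes these as one async web method named "SlowParrot".
    [WebMethod]
    public IAsyncResult BeginSlowParrot(string input, AsyncCallback callback, object state)
    {
        SlowWorker worker = delegate(string s)
        {
            Thread.Sleep(5000); // stand-in for the long-running operation
            return s;
        };
        return worker.BeginInvoke(input, callback, worker);
    }

    public string EndSlowParrot(IAsyncResult result)
    {
        SlowWorker worker = (SlowWorker)result.AsyncState;
        return worker.EndInvoke(result);
    }
}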
I would like to make a Windows service. Whenever the user of my ASP.NET application has to do a time-consuming task, IIS would hand the task to the service, which would return a token (a temporary name for the task) and do the task in the background. At any time, the user would see the status of his/her task, which would be either pending in queue, processing, or completed. The service would do a fixed number of jobs in parallel and keep a queue for next-incoming tasks. In addition, there would be a WinForms application for the system administrator that would allow adding special admin tasks such as "Clean orphaned files" or "Archive data of inactive users".
Can you point me to something that can jump-start me on this as a whole concept? I know I can Google for Windows services and am able to do it myself from scratch, but time is of the essence, so maybe you know of something that is already there that I can use as building blocks.
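The core queue-plus-token idea is small enough to sketch (everything below is illustrative; a real service would persist the queue, e.g. to a database or MSMQ, so tasks survive a restart):

using System;
using System.Collections.Generic;
using System.Threading;

// Bare-bones worker side: a queue plus a fixed number of worker threads.
public class TaskQueue
{
    private readonly Queue<KeyValuePair<Guid, Action>> _pending = new Queue<KeyValuePair<Guid, Action>>();
    private readonly Dictionary<Guid, string> _status = new Dictionary<Guid, string>();
    private readonly object _gate = new object();

    public TaskQueue(int workers)
    {
        for (int i = 0; i < workers; i++)
            new Thread(Worker) { IsBackground = true }.Start();
    }

    // Returns the token the caller later polls with.
    public Guid Enqueue(Action job)
    {
        Guid token = Guid.NewGuid();
        lock (_gate)
        {
            _pending.Enqueue(new KeyValuePair<Guid, Action>(token, job));
            _status[token] = "pending";
            Monitor.Pulse(_gate);
        }
        return token;
    }

    public string StatusOf(Guid token)
    {
        lock (_gate) { string s; return _status.TryGetValue(token, out s) ? s : "unknown"; }
    }

    private void Worker()
    {
        while (true)
        {
            KeyValuePair<Guid, Action> item;
            lock (_gate)
            {
                while (_pending.Count == 0) Monitor.Wait(_gate);
                item = _pending.Dequeue();
                _status[item.Key] = "processing";
            }
            try { item.Value(); } catch { /* record the failure against the token */ }
            lock (_gate) { _status[item.Key] = "completed"; }
        }
    }
}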
I'm looking for ways to improve a web page that initiates a long-running (>2 minutes) server-side task. The current version of the page just blocks for the full duration of the task, which can be very frustrating to the user.
I already have a few ideas about how I could improve the user's experience, but they all would involve the use of AJAX to some extent. Because of previous experiences that I've had on this project, I know that not all users have JavaScript enabled or available.
Assuming that the server-side process has already been optimized as much as possible, what else could I do to improve the experience of all users as much as possible?
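One JavaScript-free approach, sketched below: hand the work to a background thread, redirect to a holding page, and let that page re-request itself with a plain HTTP Refresh header until the job finishes (JobTracker and the token plumbing are hypothetical):

using System;
using System.Web.UI;

// Code-behind for a hypothetical Wait.aspx holding page.
public partial class Wait : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        string token = Request.QueryString["token"]; // hypothetical job token

        if (JobTracker.IsComplete(token))            // hypothetical status store
        {
            Response.Redirect("Results.aspx?token=" + token);
        }
        else
        {
            // Plain HTTP refresh header: the browser re-requests the page in
            // 5 seconds, no JavaScript required.
            Response.AppendHeader("Refresh", "5");
        }
    }
}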
Note: I don't want to use the UpdateProgress etc. controls of AJAX.
On button click, a long task (e.g. a thread) runs in my web page for about 4-5 minutes. I want to show status to the user, either via a processing image through JavaScript (the image must be shown in a certain part of the page while the other parts remain intact) or, if possible, the exact status of the process. I have tried a lot, but all in vain.
I have a long-running WCF service that I need to call, but what I would like to do is open a modal window (modal popup extender) that simply shows progress and stops the user from interacting with the page until the service returns. What I was trying to do was the following:
1. Click a button to activate the process, which calls a method in my code-behind.
2. This method opens my modal panel with some pretty animation.
3. I call my WCF service asynchronously so that the UI will refresh.
4. The service ends, which calls the delegate I set up.
5. My delegate method would then refresh the page with results and dismiss the modal popup.
I have a report that takes a couple of minutes to generate. What I am trying to do is display a progress indicator onscreen (progress bar or spinning circle) while this is running. I was thinking of using JavaScript to display the progress indicator but am not sure how to get started on this. I am using ASP.NET (Visual Studio 2008) and C#.
Situation: I have an ASP.NET application that will search through docs using Lucene. I want to run the initial indexing (the index will be incremental after the initial run, so there won't be a need to index the whole directory again in future). Currently, I have about 5GB of docs (45,000 files). Problem: My application times out before completing the process. I have altered the timeout like this: HttpContext.Current.Server.ScriptTimeout = 200000; but it still does not complete the process.
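For what it's worth, raising ScriptTimeout only addresses the ASP.NET execution timeout; IIS idle/recycle settings or proxy timeouts can still kill the request. An alternative sketch that moves the indexing off the request thread entirely (Indexer.BuildIndex, the path, and the page members are placeholders):

using System;
using System.Threading;
using System.Web.UI;

public partial class AdminIndexing : Page // hypothetical admin page
{
    protected void btnIndex_Click(object sender, EventArgs e)
    {
        // Run the initial indexing on a background thread so the request returns at once.
        // Indexer.BuildIndex is a placeholder for the real Lucene indexing routine.
        Thread worker = new Thread(delegate() { Indexer.BuildIndex(@"D:\docs"); });
        worker.IsBackground = true;
        worker.Start();

        lblStatus.Text = "Indexing started; check back later."; // hypothetical label
    }
}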
I have NHibernate sessions cached in the ASP.NET session.
I came across a situation where a user edited an object, so it's in their first-level cache in the ISession. Another user then edited the same object.
At this point, User1 still sees the original version with their edits, whereas User2 sees the correct state of the object.
What is the correct way to handle this without manually calling session.Refresh(myObj) explicitly for every single object all the time?
I also have a second-level cache enabled. For an NHibernate Long Session, should I just disable the first-level cache entirely?
Edit: Adding some more terminology to what I'm looking to achieve, from section 10.4.1, "Long session with automatic versioning". The end of that section concludes with:
"As the ISession is also the (mandatory) first-level cache and contains all loaded objects, we can probably use this strategy only for a few request/response cycles. This is indeed recommended, as the ISession will soon also have stale data."
I'm not sure what to make of documentation that says we can "probably" use this strategy and then immediately says the session will soon have stale data (which is exactly what I'm seeing).
I've run into a problem after installing the 64-bit Oracle client onto my Win 7 x64 dev box. I have installed and configured the Oracle client and added a reference to it in my library project, and it runs without problems when deployed to a Win 2008 R2 server; however, I cannot run it in the built-in VS2010 debugger.
The code throws a BadImageFormatException when the .Open() statement is called on the connection object.
I figured out that if I run it in IIS and move the application out of the default application pool, the error goes away for some reason.
However, I can't do this when I'm running the test project (MSTest), and the result is that I cannot run unit tests against this code. Yes, I can mock it, but I would really like to understand and eliminate this error. There are several cases where I would like to test against some test data in the database.
I've been having a difficult time with this. I have an ASP.NET page. The user hits the "Run" button, and code IN AN ASSEMBLY (not in the APP_CODE folder) is called and runs a long process that moves product info from a file into the database. While the user waits, I would like them to see status updates, like which product the import process is on and other status info. I'm assuming I'd break off into another thread and use AJAX, but I have no idea how to do this.
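The usual shape of this, sketched (the import routine, job ids, and message text are all placeholders): run the import on a background thread, have it report progress into a shared store keyed by job id, and poll that store from the page via AJAX:

using System;
using System.Collections.Generic;
using System.Threading;

public static class ImportStatus
{
    private static readonly Dictionary<Guid, string> _progress = new Dictionary<Guid, string>();
    private static readonly object _gate = new object();

    // Starts the import on a pool thread; the page keeps the returned id and polls ReadStatus.
    public static Guid Start(string filePath)
    {
        Guid id = Guid.NewGuid();
        ThreadPool.QueueUserWorkItem(delegate
        {
            ImportProducts(filePath, delegate(string msg)
            {
                lock (_gate) { _progress[id] = msg; } // e.g. "Importing SKU 1234 (250/5000)"
            });
        });
        return id;
    }

    public static string ReadStatus(Guid id)
    {
        lock (_gate) { string s; return _progress.TryGetValue(id, out s) ? s : "starting"; }
    }

    // Placeholder for the assembly's import routine; it calls report() as it goes.
    private static void ImportProducts(string file, Action<string> report) { }
}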
I have a long poll HTTP request using ASP.NET 4, MVC 2 and AsyncController. If a user closes their browser and kills the HTTP connection without the request completing, I'd like to know about it and completely clean up after them. If I don't, the open and incomplete requests just sit there and eventually IIS stops accepting new requests.
You can simulate my long-running HTTP request by making a normal ASP.NET application with a page that does a Thread.Sleep. Even if you close the browser, the request carries on as if you hadn't.
There is a property called Response.IsClientConnected that switches to false when the client disconnects, and I can poll it to achieve the desired effect, but that's not very clean and I'd like to avoid polling. Is there a way of getting notified when this happens, rather than having to poll this property?
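For completeness, the polling workaround I mean looks roughly like this (the one-second interval is arbitrary):

using System.Threading;
using System.Web;

static class ClientWatch
{
    // Returns false as soon as the client disconnects; true if 'seconds' elapse first.
    public static bool WaitUnlessDisconnected(HttpResponse response, int seconds)
    {
        for (int i = 0; i < seconds; i++)
        {
            if (!response.IsClientConnected)
                return false; // client went away; caller should clean up and abort
            Thread.Sleep(1000);
        }
        return true;
    }
}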
I need to invoke a long-running task from an ASP.NET page and allow the user to view the task's progress as it executes.
In my current case I want to import data from a series of data files into a database, but this involves a fair amount of processing. I would like the user to see how far through the files the task is, and any problems encountered along the way.
Due to limited processing resources I would like to queue the requests for this service.
I have recently looked at Windows Workflow and wondered if it might offer a solution.
I am thinking of a solution that might look like:
ASP.NET AJAX page -> WCF Service -> MSMQ -> Workflow Service *or* Windows Service
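The MSMQ hand-off itself is small; a sketch (queue path and payload format are placeholders):

using System.Messaging;

static class ImportQueue
{
    const string QueuePath = @".\private$\importTasks"; // placeholder queue name

    // Called by the WCF service: drop the request on the queue for the worker to pick up.
    public static void Enqueue(string dataFilePath)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (MessageQueue queue = new MessageQueue(QueuePath))
        {
            queue.Send(dataFilePath, "import request");
        }
    }
}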
Wondering if there is a performance difference between letting a long-running process hang in ASP.NET vs running the process via a Windows service. I have done this once before, and the Windows service was much quicker and didn't bog down my system, whereas the ASP.NET request seemed to wreak havoc.
How can I show a loading image to the user while executing a long-running process in an ASP.NET AJAX application? Is there a way other than using Page Methods? Any ideas?