GetDataForYearWorker gets the response from a web service synchronously. It uses very little computing power in my ASP.NET application, but it usually takes 3-5 seconds for each web service response. Because the calls to the web service are independent of each other, I want to make them all at the same time. But it looks like only 2 threads can run at the same time. Why is this, and how can I have 8 threads working at the same time?
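For reference, a rough sketch of the fan-out I'm attempting (the years collection, the Store call, and GetDataForYearWorker taking a year argument are simplified placeholders for the real code; assumes .NET 4 for Parallel.ForEach):

// Rough sketch of the fan-out; years/Store/the parameter are placeholders.
var years = Enumerable.Range(2003, 8);          // 8 independent calls

Parallel.ForEach(years, year =>
{
    // Each call blocks for 3-5 seconds waiting on the web service.
    var data = GetDataForYearWorker(year);
    Store(data);                                // whatever I do with the result
});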
Let's imagine there are 2 pages on the web site: quick and slow. Requests to the slow page take 1 minute to execute, requests to the quick page 5 seconds. My whole development career I thought that if the first request started is a slow one, it will make a (synchronous) call to the DB and wait for the answer... If during this time a request to the quick page comes in, that request will be processed while the system is waiting for the response from the DB. [URL] "One instance of the HttpApplication class is used to process many requests in its lifetime. However, it can process only one request at a time. Thus, member variables can be used to store per-request data." Does this mean that my original thoughts are wrong? Could you please clarify what they mean? I am pretty sure that things are as I expect...
It seems that by default, ASP.NET 3.5 running on IIS 6.0 does not do any parallel processing whatsoever. With a quad-core system and a test WebForms application that runs an infinite while loop on the server, CPU usage never goes higher than 30% regardless of how many clients are connected and independently running the while loop. What are my options for enabling parallel processing?
We are using 6 iFrames on our page. They fetch data from a couple of external web services and an internal WCF service and display the data. There is a separate aspx page built for each of the iFrames. From our perf monitoring we found out that at any point only two threads are executing in parallel; not all 6 get executed. What could be the cause of this? Is there a restriction that no more than 2 threads can run in parallel? Is there any configuration where I can change this?
I'm working on an ASP.NET MVC application that uses the Google Maps Geocoding API. In a single batch there may be up to 1000 queries to submit to the Geocoding API, so I'm trying to use a parallel processing approach to improve performance. The method responsible for starting a work item for each core is:
public void GeoCode(Queue<Job> qJobs, bool bolKeepTrying, bool bolSpellCheck, Action<Job, bool, bool> aWorker)
{
    // Get the number of processors, initialize the number of remaining
    // threads, and set the starting point for the iteration.
    int intCoreCount = Environment.ProcessorCount;
    int intRemainingWorkItems = intCoreCount;
[Code]...
This is based on a patterns document I found on Microsoft's parallel computing web site. The problem is that the Google API has a limit of 10 QPS (we are an enterprise customer), which I'm hitting, and then I get HTTP 403 errors. Is there a way I can still benefit from parallel processing but limit the requests I'm making? I've tried using Thread.Sleep, but it doesn't solve the problem.
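For what it's worth, the throttled variant I've been sketching runs the queue in batches of 10 and pads each batch out to one second, instead of a bare Thread.Sleep. It reuses the GeoCode parameters from the method above; everything else (the batch-of-10 approach itself included) is my assumption, not the pattern from the document:

// Inside GeoCode: cap outgoing calls at roughly 10 per second by running
// the queue in batches of 10 and padding each batch out to one second.
const int maxPerSecond = 10;
var timer = new System.Diagnostics.Stopwatch();

while (qJobs.Count > 0)
{
    timer.Restart();

    // Take up to 10 jobs for this one-second window.
    var batch = new List<Job>();
    while (batch.Count < maxPerSecond && qJobs.Count > 0)
        batch.Add(qJobs.Dequeue());

    // Run the batch across the available cores.
    Parallel.ForEach(batch, job => aWorker(job, bolKeepTrying, bolSpellCheck));

    // If the batch finished in under a second, wait out the remainder so
    // the next batch doesn't push past the 10 QPS limit.
    int remainingMs = 1000 - (int)timer.ElapsedMilliseconds;
    if (remainingMs > 0)
        System.Threading.Thread.Sleep(remainingMs);
}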
Currently we are developing an ASMX, ASP.NET 2.0, IIS 7 web service that does some calculations (and returns a dynamically generated document) and takes approx. 60 seconds to run. Since we have a big machine with multiple cores and lots of RAM, I expected IIS to do its best to route the requests arriving in its request queue to all available threads of the app pool's thread pool. But we experience quite the opposite: when we issue requests to the ASMX web service URL from multiple different clients, IIS seems to process these requests serially. I.e. request 1 arrives and is processed, then request 2 is processed, then request 3, etc.
I have a static class with a static get property, and in this property, I do this:
// property body
{
    // HttpContext.Current is NOT null
    Parallel.ForEach(files, file =>
    {
        // HttpContext.Current is null
        var promo = new Promotion();
    });
    // HttpContext.Current is NOT null
}
This static class doesn't undergo type initialization until a view first uses this property.
The problem is that Promotion's static constructor, which runs the first time a new Promotion() is created within the Parallel.ForEach(), uses HttpContext.Current. When promo is instantiated within the scope of this Parallel.ForEach(), HttpContext.Current is null, and new Promotion() therefore throws an exception.
HttpContext.Current is not null within the static get property because it's not called until the view uses it (and there is therefore a HttpContext.Current).
If Promotion used HttpContext.Current in its instances instead of its static members, I could probably just pass HttpContext.Current into the new Promotion() constructor:
var context = HttpContext.Current;
Parallel.ForEach(files, file =>
{
    var promo = new Promotion(context);
});
But since static members of Promotion need HttpContext.Current, I can't. I could probably redesign the Promotion class to change the static members that need it to be instance members, but they are static for a reason--there would be a large performance penalty if all the members that were static had to be defined instead on each instance each time a new Promotion was instantiated.
What are the possible workarounds for this? I didn't realize HttpContext.Current would be null within the scope of Parallel.ForEach().
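One workaround I've been toying with, sketched below, is re-assigning the captured context inside each iteration so Promotion's static members still see an HttpContext.Current. Whether that is actually safe for those static members is an assumption on my part:

var context = HttpContext.Current;            // captured on the request thread
Parallel.ForEach(files, file =>
{
    // Assumption: making the captured context visible on the worker thread
    // is acceptable for the static members Promotion relies on.
    HttpContext.Current = context;
    var promo = new Promotion();
});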
I am seeking your expertise in ASP.NET with regard to multi-threading support for web services in the following scenario: if a web service client makes simultaneous calls from the same process, the requests are serialized at the web service so that only one call executes at any one time; by contrast, if those calls are sent from different web service clients (instances/processes), they are processed in parallel by the web service.
Have you ever experienced the same with ASP.NET, and what configuration/settings should be applied so that a web service concurrently processes simultaneous calls from a single web client, when deploying a large number of web client instances/processes is impractical in the project context?
I am experimenting with the AsyncController feature. What I did is set up two tasks to run in parallel. As in the code below, the problem is that sometimes all tasks finish and return successfully, sometimes only one task finishes, and sometimes each task finishes half of its work and returns. It is weird; what did I do wrong?
I have realized that Trace.Write is not a good way of tracing, as it gives you the time since the last entry, which makes no sense if multiple threads are writing.
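The exact code didn't paste in above, but the pattern I'm following is roughly the one below (controller name, parameter names, and the task bodies are simplified placeholders; assumes MVC's AsyncController and .NET 4 tasks):

public class ReportController : AsyncController
{
    public void IndexAsync()
    {
        // The action must not complete until both tasks have called Decrement().
        AsyncManager.OutstandingOperations.Increment(2);

        Task.Factory.StartNew(() =>
        {
            try { AsyncManager.Parameters["first"] = DoFirstTask(); }
            finally { AsyncManager.OutstandingOperations.Decrement(); }
        });

        Task.Factory.StartNew(() =>
        {
            try { AsyncManager.Parameters["second"] = DoSecondTask(); }
            finally { AsyncManager.OutstandingOperations.Decrement(); }
        });
    }

    // MVC maps the AsyncManager.Parameters entries to these arguments by name.
    public ActionResult IndexCompleted(string first, string second)
    {
        return Content(first + " / " + second);
    }

    // Placeholders for the real work.
    static string DoFirstTask() { return "first done"; }
    static string DoSecondTask() { return "second done"; }
}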
I've got a WCF web service that runs fine at the moment, but there is talk of using it very heavily soon. As part of its normal process it writes a file out, then reads it back in again (don't ask why, I know it's stupid). I'm concerned that if we start hitting it with lots of requests, the following might happen.
1. Request 1 writes the file out.
2. Request 2 comes in and overwrites the file.
3. Request 1 reads the file back in but this is now the wrong file.
My understanding is that the requests would naturally queue up so that request 2 wouldn't start until request 1 had finished, but I'm not totally sure.
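To make the concern concrete, this is roughly what the service does today as I understand it (the fixed path and the BuildDocument helper are placeholders I've made up):

// Step 1: the request writes the file out to a fixed location...
string path = @"C:\Temp\work.xml";
File.WriteAllText(path, BuildDocument(request));

// ...if a second request runs right now, it overwrites 'path' (step 2)...

// Step 3: the first request reads the file back in, possibly the wrong one.
string contents = File.ReadAllText(path);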
I've got about 30 projects in VS 2008 and we are finally starting to upgrade to 2010. I get change requests pretty regularly, so I'm likely to have to make changes to projects that are still in 2008 after VS 2010 is installed. Is it possible to install VS 2010 without messing up the VS 2008 install? If so, is there any particular trick to getting them to live side by side, or do they automatically install in parallel to different directories?
I built a little web application that displays charts. I was thinking that it might be useful for the superuser of the app to do a complete data refresh; however, this process takes around 10 minutes to complete. I was thinking perhaps the user could click a button that would start a new thread to do the data refresh, and subsequent clicks would kill the thread and restart the data population process. The user would then be free to browse around the site and view the charts as their data is populated.
Is there a simple method of accomplishing something like this?
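Roughly what I have in mind, assuming .NET 4 is available (RefreshAllData stands in for the 10-minute data load, and the "kill" behaviour depends on it polling the token):

static readonly object _refreshLock = new object();
static CancellationTokenSource _refreshCts;

// Called from the superuser's button click: stop any refresh in progress
// and start a new one in the background.
public static void RestartRefresh()
{
    lock (_refreshLock)
    {
        if (_refreshCts != null)
            _refreshCts.Cancel();

        _refreshCts = new CancellationTokenSource();
        CancellationToken token = _refreshCts.Token;
        Task.Factory.StartNew(() => RefreshAllData(token), token);
    }
}

// Placeholder for the real 10-minute job; it should check
// token.IsCancellationRequested periodically so a later click can stop it promptly.
static void RefreshAllData(CancellationToken token) { /* populate chart data */ }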
When I serve an ASP.NET page, can I render the various controls on the page in parallel?
I have a few Telerik controls (RadGrids) on the page and when I step through the page being loaded, it seems as though the controls are databound and rendered serially.
Maybe this behavior is because I am hooked in with the debugger.
Is there any way to load the page and have selected controls built on separate threads? Is that even conceptually possible, or must it be done sequentially?
I'm trying to get a better handle on how threads work in ASP.NET, so I have a test site with a few pages, and I have a test WinForms client that creates 40 roughly concurrent requests to the test site. The requests take about 5-10 seconds to complete--they call a web service on another server. When I run the test client, I can use Fiddler to see that the requests are being made concurrently. However, when I watch the counters "ASP.NET Apps v2.0.xxx/Requests Executing", "ASP.NET/Requests Current", and "ASP.NET Requests Queued" in Performance Monitor on the web server, they never show more than 2.
This is the case regardless of whether the test page I'm requesting is set up with Async=True and using the Begin/End pattern of calling the web service, or if it's set up to make the call synchronously. Judging by what I see in Fiddler, I would think I should be seeing a total of 40 requests in one of those states, but I don't. Why is that? Do these counters not mean what I think they mean?
In the main thread I open a new thread that gets the number of new messages for the user (takes about 5 secs), and this second thread should save the number in some place.
In the main thread I should check that "some place", and if the value exists, display it on the page.
Where can I save the value from the second thread so I can read it from the main one? This value is unique per user, so I can't use a static field.
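One option I've been looking at is the application cache with a per-user key, sketched below (the key format, the userName and messageCount variables, and the label are all placeholders of mine):

// In the second (background) thread, once the count is known:
HttpRuntime.Cache.Insert(
    "NewMessageCount_" + userName,              // per-user key
    messageCount,
    null,
    DateTime.UtcNow.AddMinutes(5),              // keep it around briefly
    System.Web.Caching.Cache.NoSlidingExpiration);

// In the main (request) thread, when rendering the page:
object cached = HttpRuntime.Cache["NewMessageCount_" + userName];
if (cached != null)
    NewMessagesLabel.Text = cached.ToString();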
I am trying to reproduce a threading error condition within an HTTP Handler.
Basically, the ASP.NET worker process is creating 2 threads which invoke the HTTP handler in my application simultaneously when a certain page loads.
Inside the HTTP handler is a resource which is not thread safe. Hence, when the 2 threads try to access it simultaneously, an exception occurs.
I could potentially put a lock statement around the resource; however, I want to make sure that this is in fact the cause. So I wanted to recreate the situation in a console application first.
But I can't get 2 threads to execute a method at the same time the way the ASP.NET worker process does. So my question is: how can you create 2 threads which execute a method at the same time?
Edit:
The underlying resource is a SQL database with a user table (which has a name column only). Here is sample code I tried.
[TestClass]
public class UnitTest1
{
    [TestMethod]
    public void Linq2SqlThreadSafetyTest()
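For the simultaneous start itself, the approach I'm trying in the test is to block both threads on one signal and then release them together (CallHandler stands in for the non-thread-safe call; assumes System.Threading):

// Both threads block on the same signal, then hit the code under test together.
var startSignal = new ManualResetEvent(false);

ThreadStart work = () =>
{
    startSignal.WaitOne();     // wait until both threads are ready
    CallHandler();             // placeholder for the non-thread-safe call
};

var t1 = new Thread(work);
var t2 = new Thread(work);
t1.Start();
t2.Start();

startSignal.Set();             // release both threads at (nearly) the same time
t1.Join();
t2.Join();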
I'm using ASP.NET 4.0 on IIS 7.5 and the WCF callback technique. I have no problem with the callback itself: the WCF service can fire the callback method in the web client, but it seems to run on a different thread from the UI thread.
I'm investigating some performance improvements that can be made to our web server and ASP.NET application. This page contains a few things that we can do. We currently have two worker processes running as a web garden. Does each of these worker processes have its own ASP.NET thread pool, or do both worker processes share a single thread pool, with the maximum number of worker threads shared across the processes? This post seems to suggest that the two processes share a common ASP.NET thread pool: "All w3wp.exe threads do is take requests from the HTTP.SYS queue, process it, and hand the request to ASPNET_ISAPI.DLL, who then deposits those requests into the ASP.NET request queue, and the ASP.NET threads..." But this post suggests that each worker process contains its own ASP.NET thread pool: "Each process (w3wp.exe) has its own CLR thread pool which has the configured maxWorkerThreads value (20 default)."
When my ASP.NET page loads, it needs to get data from 2 web services. I want to make the data retrieval processes run concurrently. How can I do this and update Label1 with the text result from WS1 and Label2 with the result from WS2? I don't know what the code should look like in order to be able to update the controls on the page.
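Something like the sketch below is what I imagine, assuming .NET 4 and that GetFromWS1/GetFromWS2 wrap the two synchronous service calls (both method names are placeholders):

protected void Page_Load(object sender, EventArgs e)
{
    // Kick off both web service calls at the same time...
    Task<string> task1 = Task.Factory.StartNew(() => GetFromWS1());
    Task<string> task2 = Task.Factory.StartNew(() => GetFromWS2());

    // ...then wait for both before the page renders.
    Task.WaitAll(task1, task2);

    // Back on the page's own thread, so updating the controls is safe here.
    Label1.Text = task1.Result;
    Label2.Text = task2.Result;
}

// Placeholders for the two synchronous service calls.
static string GetFromWS1() { return "result from WS1"; }
static string GetFromWS2() { return "result from WS2"; }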
I have a web application hosted in IIS 7.5 that exposes its functionality using web services. The web application has state; the first thing I do is call a web service method to start it all up. As a consequence, several threads are created, and these threads run until the system is stopped. All recycling of application pools and the like is disabled! To access the web services, the calling client must authenticate (using basic auth and local Windows users). Thus, an incoming request has its CurrentPrincipal set. The logging subsystem is set up to tag all log messages with the current principal's identity name, and this works just fine for code that runs on the thread of an incoming request. But it doesn't work at all for my background threads. The current principal is still set in these threads, but the identity object's name has been disposed. Why is this? What can I do to fix it? I would really like to get all log messages tagged with the current principal. It also seems impossible to detect whether the identity name instance has been disposed without trying to access it and catching the exception, which is highly annoying.
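The workaround I'm considering, sketched below, is to copy the identity name into a plain string while the request principal is still alive and hand that to the background thread (StartBackgroundWork and DoWork are placeholders for my real startup code):

public void StartBackgroundWork()
{
    // Copy the name now, on the request thread, before the identity is disposed.
    string identityName = Thread.CurrentPrincipal.Identity.Name;

    var worker = new Thread(() =>
    {
        // Log with the captured string instead of touching
        // Thread.CurrentPrincipal from the background thread.
        DoWork(identityName);
    });
    worker.IsBackground = true;
    worker.Start();
}

// Placeholder for the long-running loop; tags its log messages with identityName.
static void DoWork(string identityName) { /* ... */ }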