Which storage method should I use with the ASP.NET Charting tools? I'm currently using memory because it seems like the most straightforward and easiest option to implement. Are there any reasons I should be aware of for choosing one of the other options? The charts generated will only be used for an overview kind of purpose (checking whether a data collection is following a certain pattern as it is built day by day), and this function won't be used frequently.
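For context, the storage method is chosen through the ChartImageHandler key in web.config; a typical memory-backed setting looks roughly like the snippet below (the timeout and deleteAfterServicing values are only illustrative, and storage=file or storage=session are the alternatives):

    <appSettings>
      <!-- storage=memory keeps rendered chart images in memory; swap in storage=file or storage=session to try the other modes -->
      <add key="ChartImageHandler" value="storage=memory;timeout=20;deleteAfterServicing=true;" />
    </appSettings>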
I'm trying to retrieve a DataTable that I put into a session variable, but when I do, it appears that there are no columns. I have done this before in VB.NET and it worked perfectly; I'm now using C#, and as far as I can see there is no change in the code other than the obvious syntax differences.
EDIT: The dt variable (in part 2) has the right data when I look in the data visualizer window, but I have noticed that the other properties associated with a DataTable are absent. When I hover over a normal-looking DataTable with my mouse it looks like "dt ----- {System.Data.DataTable}", but in the class I'm working on it just looks like "dt ----- {}". Also, after I pull the session variable back into dt I clone it (dtClone = dt.Clone();) and the clone is empty in the data visualizer. What on earth!
EDIT 2: I have now also tried converting the first DataTable to a DataSet, putting that in the session variable, and converting it back to a DataTable in the class. I am starting to wonder if it is a problem with dt.Load(sqlReader); the data does appear in the DataSet visualizer after that step, though, but not after being cloned. Code below.
1) The SQL command in a web handler, the results of which populate the DataTable that is put into the session variable.
    DataTable dt = new DataTable();
    SqlDataReader sqlReader = default(SqlDataReader);
    SqlDataAdapter sqlAdapter = new SqlDataAdapter();
    sqlReader = storedProc.ExecuteReader(CommandBehavior.CloseConnection);
    dt.Load(sqlReader);
    System.Web.HttpContext.Current.Session["ResultsTable"] = dt;
2) Part of the code in a class which performs calculations on the table:
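(That code isn't reproduced here; for reference, a minimal sketch of the retrieval step it describes, assuming the same "ResultsTable" key from part 1, is below. Note that DataTable.Clone() copies only the schema, not the rows, which would explain an empty clone; Copy() brings the rows across as well.)

    // Sketch only - pulls the table back out of session and clones it.
    DataTable dt = (DataTable)System.Web.HttpContext.Current.Session["ResultsTable"];
    DataTable dtClone = dt.Clone();   // Clone() = structure only, no rows
    // DataTable dtCopy = dt.Copy();  // Copy() = structure plus the data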
I have the compiled output from a site that uses ASP.NET precompilation. The .aspx files are just placeholders and there are .compiled files for each .aspx. I'm trying to reconstruct the site's .aspx files from the precompiled assemblies. I've looked into the compiled assemblies using Reflector, but reconstructing the .aspx looks like a bigger task than I was hoping for, and I have a relatively short timeframe for this. Does anyone know of an easy way? Are there any tools out there that can achieve this task?
A while ago I found a web page explaining that the default size of a session was 2048 KB and the minimum was 1 KB. How can this be adjusted? If anyone has links on the subject, I would be a happy camper :)
In my application, I store some values in an ArrayList and store this ArrayList in a session variable. I want to know the size of this session variable in memory. Is there any method to do this?
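There is no built-in "size of a session entry" property; one rough approach is to binary-serialize the stored object and measure the stream, keeping in mind this gives the serialized size, not the exact CLR memory footprint. A sketch:

    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    // Rough estimate of a session value's size via binary serialization.
    static long EstimateSizeInBytes(object sessionValue)
    {
        var formatter = new BinaryFormatter();
        using (var stream = new MemoryStream())
        {
            formatter.Serialize(stream, sessionValue);
            return stream.Length;
        }
    }

    // Usage: long approxBytes = EstimateSizeInBytes(Session["MyArrayList"]);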
In my project I have configured .NET's sessions to go into the database.
I also have a global.asax which implements Session_Start().
In Session_Start() I write three things to the session (a sketch of the handler follows the list):
The time the session started.
The user's host address.
A serializable device object wrapping the user's agent.
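A minimal sketch of that Session_Start, where DeviceInfo stands in for the serializable wrapper (it is a placeholder, not the actual class):

    // global.asax - sketch of the Session_Start described above.
    void Session_Start(object sender, EventArgs e)
    {
        Session["Started"] = DateTime.UtcNow;                   // time the session started
        Session["HostAddress"] = Request.UserHostAddress;       // user's host address
        Session["Device"] = new DeviceInfo(Request.UserAgent);  // serializable device wrapper (placeholder type)
    }

    // Defined elsewhere (e.g. App_Code); must be serializable to go to the database store.
    [Serializable]
    public class DeviceInfo
    {
        public string UserAgent { get; private set; }
        public DeviceInfo(string userAgent) { UserAgent = userAgent; }
    }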
The problem is that users who don't allow cookies won't allow session cookies either.
(Easily reproducible by adding the site URL to the restricted sites zone in IE.)
If I keep refreshing (finger on F5), a new session is created for every request (no session cookie). Before long, the web server process grows to several hundred megabytes.
It does not matter whether you use IIS 7 or the Cassini local web server.
The issue is: the memory does not get released until the sessions time out. What is the logic here if sessions are really supposed to go to the database? How long will .NET keep them in memory? Eventually you even get OutOfMemory exceptions!
Does anybody know? How do I detect and prevent such (almost malicious) "attacks"?
I am building an ASP.NET application using IIS 6 on Windows Server 2003 (VPS hosting).
I am confronted with an error I didn't receive on my development machine (Windows 7, IIS 7.5, 64-bit).
When my WCF service tries to run my query against a local SQL Server, this is the error I receive:
Memory gates checking failed because the free memory (43732992 bytes) is less than 5% of total memory. As a result, the service will not be available for incoming requests. To resolve this, either reduce the load on the machine or adjust the value of minFreeMemoryPercentageToActivateService on the serviceHostingEnvironment config element.
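The message itself names the knob; assuming the framework version in use supports it, the adjustment goes on the serviceHostingEnvironment element in web.config (the 1% value here is only an example):

    <system.serviceModel>
      <!-- lower the free-memory threshold the error message refers to -->
      <serviceHostingEnvironment minFreeMemoryPercentageToActivateService="1" />
    </system.serviceModel>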
I have both VS 2005 and 2008 installed on my machine. 2005 is fine. In 2008, literally any ASP.NET project I try to create gets this error. I try stepping into the code, but the error apparently occurs before anything I can trap is loaded. There is no information written to the event log. I have tried this with a "Hello World" web page with nothing else going on. It seems unique to my Windows Server 2003 machine.
I am getting a weird error in ASP.NET while using the LEADTOOLS imaging API. Here's the stack trace:
System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
   at SetThreadData(_THREADDATA* )
   at Leadtools.Codecs.CodecsOptions.Use()
   at Leadtools.Codecs.RasterCodecs.DoSave(SaveParams saveParams)
   at Leadtools.Codecs.RasterCodecs.Save(RasterImage image, Stream stream, RasterImageFormat format, Int32 bitsPerPixel)
   ........
I am maintaining C# .NET code written by somebody else, and I get the following exception a few times: "Attempted to read or write protected memory. This is often an indication that other memory is corrupt." The code structure where I get this exception is something like the following; it uses ref variables in this sequence:
Variable-1 and Variable-2 are local variables in App1.
App1: func1() passes these variables by reference to func2().
App1: func2() passes the same references via .NET Remoting to another application (App2).
App2: does the same and passes the received references on through two more calls.
The exception occurs while returning from App2.
(App1:func1(ref Var1, ref Var2) --> App1:func2(ref Var1, ref Var2) --[.NET Remoting from App1 to App2]--> App2:func3(ref Var1, ref Var2) --> App2:func4(ref Var1, ref Var2) --> variables get updated and the calls return to the original caller)
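For illustration only, a local sketch of that call shape with placeholder names; the actual .NET Remoting hop between func2 and func3 is omitted, and that hop is where ref parameters are effectively marshalled by copy-in/copy-out rather than as true references:

    // Placeholder sketch of the chain; not the original code.
    class App1
    {
        public static void Func1(ref int var1, ref int var2)
        {
            Func2(ref var1, ref var2);           // App1: func1 -> func2
        }

        static void Func2(ref int var1, ref int var2)
        {
            // In the real code this call crosses a .NET Remoting boundary to App2.
            App2.Func3(ref var1, ref var2);
        }
    }

    class App2
    {
        public static void Func3(ref int var1, ref int var2)
        {
            Func4(ref var1, ref var2);           // App2: func3 -> func4
        }

        static void Func4(ref int var1, ref int var2)
        {
            var1 += 1;                           // updates flow back up the chain
            var2 += 1;
        }
    }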
My doubts are: 1. Is passing ref variables through such a chain correct? Could it cause such an exception? Does .NET support ref variable calls like this directly?
So I'm working on a site that is going to need a file upload control where the user can upload an image; the page then posts back and the image appears for their review (before they submit the data to the database; the image is stored as type image). What I do now is have my own private web form where people send me their image, I format it accordingly, and I use the following simple code to upload it:
    byte[] newimage = fileUpImgFile.FileBytes;
    var myDataTable = (from item in context.TypeSet
                       where item.Number == txtBxNumber.Text.Trim()
                       select item).ToList();
    foreach (Type item in myDataTable)
    {
        item.Photo = newimage;
    }
    context.SaveChanges();
This works, but only if the record already exists in the database, so the person has to save the data first and then go back in and upload the image (inconvenient and inefficient). Is there a way to upload it, store it in memory, and then display it, without saving it to the database?
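One direction worth sketching (placeholder names throughout, and it assumes the image is a JPEG): hold the uploaded bytes in session on postback and point the preview image at a small generic handler, so nothing touches the database until the final save.

    // In the upload page's postback handler (fileUpImgFile and imgPreview are placeholder control names).
    if (fileUpImgFile.HasFile)
    {
        Session["PendingPhoto"] = fileUpImgFile.FileBytes;   // keep the bytes in session, not the database
        imgPreview.ImageUrl = "~/ShowPendingPhoto.ashx";     // hypothetical handler below
    }

    // ShowPendingPhoto.ashx.cs - hypothetical generic handler that serves the pending image.
    public class ShowPendingPhoto : IHttpHandler, System.Web.SessionState.IReadOnlySessionState
    {
        public void ProcessRequest(HttpContext context)
        {
            byte[] photo = context.Session["PendingPhoto"] as byte[];
            if (photo != null)
            {
                context.Response.ContentType = "image/jpeg";  // assumes JPEG
                context.Response.BinaryWrite(photo);
            }
        }

        public bool IsReusable { get { return false; } }
    }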
How can I make a text file in memory (RAM, saved nowhere on the server), write something to it, open it in front of the client's browser, and let the user save it themselves, all from code-behind?
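Server code can't literally open Notepad on the client; the usual equivalent is to build the text in memory and push it to the browser as a downloadable .txt, which the user can then open or save. A code-behind sketch (file name and content are placeholders):

    // Streams an in-memory string to the browser as a text file; nothing is written to disk on the server.
    string content = "Generated entirely in memory.";
    Response.Clear();
    Response.ContentType = "text/plain";
    Response.AddHeader("Content-Disposition", "attachment; filename=note.txt");
    Response.Write(content);
    Response.End();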
How do I create a text file and write the contents of a string to it? I plan to reference the text file later. It could be on the server (root folder) or anywhere I can reference it. Below is how the string contents are built:
    foreach (string s in strValuesToSearch)
    {
        if (result.Contains(s))
            result = result.Replace(s, stringToReplace);
    }
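Writing the finished string out is then only a couple of lines; a sketch assuming the file should land in the site root (the file name is a placeholder):

    // Creates (or overwrites) a text file in the application root with the string contents.
    string path = Server.MapPath("~/output.txt");
    System.IO.File.WriteAllText(path, result);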
I am dynamically generating a file on the server based on user input. I need to provide a download button which, upon clicking, downloads the file to the user's file system. The user might also click the same button twice, in which case the file should download again. The dynamic generation of the file rules out the HttpResponse.TransmitFile() option, which supports multiple downloads. Almost every other option I have come across needs Response.End() to be invoked, which prevents a second download. How do I satisfy the "multiple download" requirement? I read up on virtual path providers, which might let me use TransmitFile(), but that looks like overkill for such a simple requirement.
What are the options for handling file uploads to reduce the memory footprint? Is there a way to upload in chunks? Is there a way to stream the upload directly to disk instead of loading the entire file into server memory?
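With the standard FileUpload control the posted file exposes a stream, so it can be copied to disk in chunks instead of being pulled into one big byte array. A sketch follows (control name, buffer size, and path are arbitrary); note that ASP.NET itself still buffers the request according to the httpRuntime settings (e.g. requestLengthDiskThreshold), so large uploads need that configured as well.

    using System.IO;

    // Copies the uploaded file to disk in 64 KB chunks rather than via FileUpload.FileBytes.
    string targetPath = Server.MapPath("~/App_Data/upload.bin");
    using (Stream input = fileUpload.PostedFile.InputStream)   // fileUpload = placeholder FileUpload control
    using (FileStream output = new FileStream(targetPath, FileMode.Create))
    {
        byte[] buffer = new byte[64 * 1024];
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            output.Write(buffer, 0, read);
        }
    }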
I have a memory leak somewhere that I can't find. Every few days my server crashes, and just before that I log a ton of SQL errors stating that it is "out of memory".
I can't find it anywhere; all of my connections are being disposed like so:
[Code].... Then I call the connection from my pages like so: [Code]....
That is all pretty straightforward. The connection is disposed because it is wrapped in a using block. Could the problem be that I am opening the connection in my connection manager class and not where it is being used? Or could the problem be in the method below, which I use to populate a SqlDataReader? [Code]....
Now, at first it appears as though this could be the problem because the connection isn't part of a using block; however, doesn't Data.CommandBehavior.CloseConnection pretty much do the same thing? It makes sure the connection is closed when the reader is closed, right? Here is how I call the above reader from my login page code-behind: [Code]....
So the DataReader will get closed even without a call to .Close() because it is inside the using block, and the connection should get closed because I specified CommandBehavior.CloseConnection in the ExecuteReader parameters, right?
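For reference, a sketch of the pattern under discussion with placeholder names: when the reader is created with CommandBehavior.CloseConnection and wrapped in a using block, disposing the reader also closes the underlying connection.

    SqlConnection conn = new SqlConnection(connectionString);
    conn.Open();
    SqlCommand cmd = new SqlCommand("SELECT UserName FROM Users", conn);
    using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.CloseConnection))
    {
        while (reader.Read())
        {
            // read values here
        }
    }   // reader disposed here -> connection closed too, even without an explicit .Close()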
I have a memory issue on my websites and am trying to get to the bottom of it. I have downloaded the 14-day trial of ANTS Memory Profiler and have been playing with it to get a grip on what it's telling me. In the memory options on the timeline I can see Bytes in All Heaps, Private Bytes, etc., but I am not sure which ones I should be focusing on to see where the memory spikes and doesn't come back down. I am profiling an ASP.NET 2.0 website.
I am working on a web app that will rely heavily on configuration. The configuration will also be written by another process or a human. I am looking for feedback on best practices in .NET 3.5 for implementing this case. I used the configuration section of an early version of the Enterprise Library application blocks; I really liked working with it, but from what I hear it has been discontinued in current versions. Hence the question. I need to be able to serialize collections and pick up file write events to reload a new instance of the config into memory.
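Absent the Enterprise Library block, one plain .NET 3.5 approach is an XML-serialized settings class reloaded by a FileSystemWatcher; everything below (type names, file path, the Endpoints property) is hypothetical:

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Xml.Serialization;

    public class AppConfig                                   // hypothetical settings class
    {
        public List<string> Endpoints { get; set; }          // example of a serializable collection
    }

    public static class ConfigHolder
    {
        private const string ConfigPath = @"C:\config\app-config.xml";      // placeholder path
        private static volatile AppConfig current = Load();
        private static readonly FileSystemWatcher watcher = CreateWatcher(); // kept alive for the app lifetime

        public static AppConfig Current { get { return current; } }

        private static FileSystemWatcher CreateWatcher()
        {
            var w = new FileSystemWatcher(Path.GetDirectoryName(ConfigPath), Path.GetFileName(ConfigPath));
            // Real code would debounce/retry: Changed can fire while the other process is still writing.
            w.Changed += (s, e) => current = Load();
            w.EnableRaisingEvents = true;
            return w;
        }

        private static AppConfig Load()
        {
            var serializer = new XmlSerializer(typeof(AppConfig));
            using (var stream = File.OpenRead(ConfigPath))
            {
                return (AppConfig)serializer.Deserialize(stream);
            }
        }
    }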
I am getting a strange error in one site but not another. I have created a chart web site that works, but now I'm moving that code into an existing website. I'm getting this error (see title) and cannot figure out why. The references are the same and I have the same using directives (as far as I can tell). I'm stumped.
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Web;
    using System.Web.UI;
    using System.Web.UI.WebControls;
    using System.Web.UI.DataVisualization.Charting;
    using System.Data.SqlClient;
    using System.Drawing;
    using System.Data;
...
Charting.Title vbTitle = new Charting.Title( ddlSchool.SelectedItem.Text.ToString() + " Logins per Hour" );
I need to pull data out for a chart, and the following is what I have come up with. While it works, I know there must be a more elegant way to get the same data in the same format.