jQuery - Best Way To Organize Files In A Large MVC2 Project?
Jan 27, 2011
I've looked all over the web for the best way to organize an ASP.NET MVC2 project. I've only seen examples of people using the default template for MVC2 projects. But is this the best way to organize your project if it is going to contain a large number of files?
We're in the process of building an application that is heavily built around jQuery for UI and ajax using JSON. So, as you can imagine, we will have many custom .js support scripts.
In our solution, we have placed all our support libraries (3rd party and custom) into respective projects. The MVC2 project that is also in the solution is using the default MVC2 template.
In the MVC2 project, the "starting" structure is still pretty much unchanged. Under the Controllers directory, we have each controller, AccountController.cs and HomeController.cs (for example). Under the Views directory, we have three subdirectories named Account, Home, and Shared. In the Scripts directory, we have also divided things into three subdirectories: Account, Home, and Shared. And finally we have the Models directory, which is also divided into Account, Home, and Shared subdirectories.
As you can see, we haven't deviated from the basic template that much. But as we start adding things, we're realizing how cumbersome this might become when we get upwards of 20 or 30 views and 100 support .js files.
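For illustration, one way to split such a project is MVC2 Areas, where each module carries its own Controllers, Views, Models, and Scripts folders and registers its own routes. This is only a sketch; the names below are hypothetical, not anyone's actual code:

// Sketch only (hypothetical names): an MVC2 Area so the "Account" module owns
// its own Controllers, Views, Models and Scripts folders.
using System.Web.Mvc;

namespace MyApp.Areas.Account
{
    public class AccountAreaRegistration : AreaRegistration
    {
        public override string AreaName
        {
            get { return "Account"; }
        }

        public override void RegisterArea(AreaRegistrationContext context)
        {
            context.MapRoute(
                "Account_default",
                "Account/{controller}/{action}/{id}",
                new { action = "Index", id = UrlParameter.Optional });
        }
    }
}

// In Global.asax.cs, Application_Start() then calls:
//     AreaRegistration.RegisterAllAreas();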
When implementing MVP in an ASP.NET project, what are your preferences for how you organize the relationship between your presentation project and your web project?
I am thinking about this now because I need several developers across the globe to work on this project. We will be using VS 2010, and I have heard something about web application templates.
I use ASP.NET with C#. I am pretty new to development, so I would like some advice from experts :-). Questions: What is the best practice for organizing class files? What kind of names do you use? For a Web Application Project, how do you name namespaces? In my case I am building a simple CMS. I thought the file structure would look like this:
- AppCode
  - Common
    - UserDataInput.cs
  - ExternalLibrary
[code]...
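For what it's worth, the usual convention is to make namespaces mirror the folder layout. A minimal sketch under that assumption (the root namespace MyCms and the members are hypothetical):

// Sketch (hypothetical names): the namespace mirrors the folder path.
// File: AppCode/Common/UserDataInput.cs
namespace MyCms.AppCode.Common
{
    public class UserDataInput
    {
        public string UserName { get; set; }
        public string Email { get; set; }
    }
}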
I work in a suite of ASP.NET applications that have several different "modules". The applications all share a main menu, so they all link to one another. The modules are the high-level areas of the application. So, for example, it might be Payments, Orders, Customers, Products, etc., where Payments and Orders are in one app and Products and Customers are in another. Some of these menu links are "deep links", for example a link to a particular page within the Customers module, such as Create New Customer.
We are about to start a project that will add several more modules to this suite, probably as a new .NET application. I'm thinking about doing these new modules in Silverlight (for various reasons that are not material to the question). If I were to do that, I need to make the menu look the same as the menu in ASP.NET, as the users still need to feel like they are inside one "application".
How should I organize the Silverlight project(s) so that I can "deep link" from ASP.NET pages into particular modules in the Silverlight app? What is even the best idea for creating these different Silverlight "modules"? If I had something that would've been a page in ASP.NET (for example - Create Customer), should each one of those be a separate Silverlight app? Or should it be a separate User Control? Or something else? Should I reuse our shared ASP.NET menu, and deep link to different Silverlight "modules" even within the new application? Or should I reimplement the menu in Silverlight for navigation within the app? Are there menu controls for Silverlight that look similar to ASP.NET menus (with flyout submenus in this case)? Could I maybe even share a SiteMap XML file between them?
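One possible shape for this, sketched under the assumption that the Silverlight navigation framework is used: each "module" becomes a navigation page inside one Silverlight application, and the shared ASP.NET menu deep-links into it with fragment URLs such as Default.aspx#/CreateCustomer. All names here are hypothetical:

// Sketch only (hypothetical names), assuming the Silverlight navigation framework.
// The host page URL Default.aspx#/CreateCustomer deep-links straight into the module.
using System;
using System.Windows.Controls;

public partial class MainPage : UserControl
{
    // Assumes the XAML contains a <navigation:Frame x:Name="ContentFrame"> whose
    // UriMapper maps "/CreateCustomer" to "/Views/CreateCustomer.xaml".
    public MainPage()
    {
        InitializeComponent();
    }

    private void GoToCreateCustomer()
    {
        // In-app navigation; the same URI arrives via the #/CreateCustomer fragment
        // when the ASP.NET menu links to the page hosting this Silverlight app.
        ContentFrame.Navigate(new Uri("/CreateCustomer", UriKind.Relative));
    }
}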
I have been asked to join a very small team where one main developer has been building the web app (.NET 4.0) for about 6 months. The project should be delivered within the next 2 months.
After a first look at the code, I can say that I would never allow it to go to production (things like empty catch { } blocks, no tests at all, WebForms, etc.).
So the code quality is incredibly low.
My task is to improve that and still deliver the solution. So I plan to start with unit testing and reimplement most of the functionality in MVC2 (though reusing some of the existing code).
I estimate that I will need about 6 weeks to catch up with the current progress and be at the same functionality level as the application will be in 6 weeks.
The problem is that the main developer who has been working on the project seems to be just starting out in IT, and many basic things are unknown to him. It will take a significant amount of time and effort to teach him proper testing and development and how to apply some patterns.
I am ready to take responsibility for reimplementing the application, but at the same time I don't want the main developer to sit idle. Since he won't be able to contribute significantly to the "better-world" project at this stage, I am not sure what would be the best way to keep productivity high for both of us.
Currently I think the following solution is good enough: he proceeds doing what he does until I catch up with him, and then we start working on the new project together.
The problem is that this approach is of course not very productive, as one developer will work on the "better-world" project while the other proceeds with what he was doing, effectively duplicating similar tasks.
Another approach would be to pair up and try to do things together, but again I am not sure how productive we would be.
Can you suggest how we could better organise the work together in order to be most efficient for the overall project?
I realize that VS200X can indent .aspx files properly; however, for the sake of ease in finding attributes, is there a tool that will also organize the attribute order within a tag alphabetically? I'm always scanning around a tag visually, and if the attributes were sorted alphabetically, aside maybe from ID and runat, which should remain first, I would have a much easier time.
I am trying to use Domain-Driven Design (DDD) for my new ASP.NET MVC2 project with Entity Framework 4. After doing some research, I came up with the following layer conventions, with each layer in its own class library project:
[code]... Currently my Repositories layer holds a reference to the Domain layer. From my understanding, injecting a UserRepository into the UserService class works very well for unit testing, as we can pass in fake user repositories. So with this architecture it looks like my Web project needs references to both my Domain and Repositories layers. But is this valid? Historically the presentation layer only had a reference to the Business Logic layer.
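For clarity, here is a minimal sketch of the injection pattern being described (the type names are illustrative, not the actual project code):

// Sketch only (hypothetical names) of the layering described above.
// Domain layer: entity plus repository abstraction.
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IUserRepository
{
    User GetById(int id);
}

// Service (business logic) layer: depends only on the abstraction,
// so unit tests can pass in a fake IUserRepository.
public class UserService
{
    private readonly IUserRepository _userRepository;

    public UserService(IUserRepository userRepository)
    {
        _userRepository = userRepository;
    }

    public User GetUser(int id)
    {
        return _userRepository.GetById(id);
    }
}

// Under this arrangement the web project only touches the Repositories assembly at
// the composition root (Global.asax or an IoC container registration), where the
// concrete Entity Framework repository is wired up; controllers see only the service.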
I have 6 different service clients in one project, and, as we all know, when I debug, the WCF service host runs them locally and everything works great.
If I publish to my server, the site will run, but I keep getting this error after login.
There was no endpoint listening at http://localhost:8731/Design_Time_Addresses/CISS.Services/SecurityService/ that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.
Publishing your site is supposed to be easy, correct? What settings do I need in VS 2010, and what is required on IIS 7.0 to get it to work?
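Judging from the error text alone (so this is a guess), the published site's client endpoint still points at the Visual Studio design-time address. The web.config on the server would need to point at the real service URL instead, roughly like this sketch (the endpoint name, contract, and URL are hypothetical):

<!-- Sketch: in the published site's web.config, point the client endpoint at the
     deployed service instead of the Design_Time_Addresses URL. -->
<system.serviceModel>
  <client>
    <endpoint name="BasicHttpBinding_ISecurityService"
              address="http://your-server/CISS.Services/SecurityService.svc"
              binding="basicHttpBinding"
              contract="CISS.Services.ISecurityService" />
  </client>
</system.serviceModel>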
I am working on 2 projects. One that is in MVC2 that is an existing application, and then I have the MVC3 application that I am trying to build. After hearing that I could get intellisense for my work in VS2010, I went to install the VS tools for MVC3. Now my old project will not work. I'm not trying to move my MVC2 project to MVC3 right now either.
I didn't actually change anything about the MVC2 project, but now I get this error whenever I try to open a page:
Server Error in '/' Application.
Parser Error Description: An error occurred during the parsing of a resource required to service this request. Please review the following specific parse error details and modify your source file appropriately.
Parser Error Message: The type 'System.Web.Mvc.ViewMasterPage' is ambiguous: it could come from assembly 'C:\Windows\assembly\GAC_MSIL\System.Web.Mvc\2.0.0.0__31bf3856ad364e35\System.Web.Mvc.dll' or from assembly 'C:\Windows\Microsoft.Net\assembly\GAC_MSIL\System.Web.Mvc\v4.0_3.0.0.0__31bf3856ad364e35\System.Web.Mvc.dll'. Please specify the assembly explicitly in the type name.
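A commonly suggested workaround after installing MVC 3 alongside MVC 2 is to fully qualify the MVC types in the MVC 2 project's Views\Web.config, so the page parser no longer has to choose between the two GAC assemblies. A sketch of what that section would look like (adjust to the project's existing file):

<!-- Sketch for Views/Web.config in the MVC 2 project: fully qualify the base types
     so the parser picks the 2.0.0.0 assembly. -->
<pages
    pageParserFilterType="System.Web.Mvc.ViewTypeParserFilter, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    pageBaseType="System.Web.Mvc.ViewPage, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    userControlBaseType="System.Web.Mvc.ViewUserControl, System.Web.Mvc, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
  <!-- the existing <controls> section stays as it is -->
</pages>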
I'm very excited about WebMatrix, because it makes everything easier than ever before, so I gave it a try. However, I ran into a problem when I deployed the project's membership database to a remote host (I chose cytanium.com). I deployed the project files and another database successfully; only the membership-related database failed. The error message is below:
[Code]....
I thought the source of the problem was the collation of the database, so I changed my database collation from "Chinese_PRC_CI_AS" to "SQL_Latin1_General_CP1_CI_AS", but the problem remained. I have searched a lot of articles but didn't find a way to resolve this issue.
A web application works with the database. Once a day, the database should be scanned and alerts should be sent to users. From what I've seen out there, an additional project has to be created, which will be installed on the server and will work with the same database. The executable created by this project has to be registered in the Windows scheduler to run once a day. This seems complicated and inefficient: starting an additional executable that works on the same database.
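For what it's worth, the "extra executable plus Task Scheduler" approach can stay quite small. A rough sketch, where the connection string, query, and mail settings are all hypothetical placeholders:

// Sketch of a small console program run once a day by the Windows Task Scheduler:
// scan the database, then e-mail alerts. All names/values below are hypothetical.
using System;
using System.Data.SqlClient;
using System.Net.Mail;

class DailyAlerts
{
    static void Main()
    {
        using (var connection = new SqlConnection("your connection string here"))
        {
            connection.Open();
            var command = new SqlCommand(
                "SELECT Email FROM Users WHERE NeedsAlert = 1", connection); // hypothetical query
            using (var reader = command.ExecuteReader())
            {
                var smtp = new SmtpClient("smtp.example.com");               // hypothetical server
                while (reader.Read())
                {
                    var message = new MailMessage(
                        "alerts@example.com", reader.GetString(0),
                        "Daily alert", "Something needs your attention.");
                    smtp.Send(message);
                }
            }
        }
    }
}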
Is there a way of filtering large CSS files down to only the selectors required on a page, and creating CSS files that contain just these selectors?
Case: I have a very large CSS file that I want to filter on a per-page basis, so that the file size is cut down and can be cached by mobile devices. I was thinking along the lines of something like a server-side Dust-Me Selectors tool. The particular project I am working on is using ASP.NET MVC.
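A do-it-yourself starting point, sketched very naively (hypothetical helper, class selectors only), would be to keep just the rules whose classes actually appear in the rendered HTML; anything serious would need a real CSS parser for descendant, attribute, and pseudo selectors, plus media queries:

// Naive sketch (hypothetical helper): filter a stylesheet down to the class
// selectors that appear in the rendered HTML of one page.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Text.RegularExpressions;

static class CssFilter
{
    public static string FilterForPage(string css, string renderedHtml)
    {
        // Collect every class name used in class="..." attributes on the page.
        var usedClasses = new HashSet<string>(
            Regex.Matches(renderedHtml, "class\\s*=\\s*\"([^\"]+)\"")
                 .Cast<Match>()
                 .SelectMany(m => m.Groups[1].Value.Split(
                     new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries)));

        var output = new StringBuilder();
        // Treat the stylesheet as a flat list of "selector { body }" rules.
        foreach (Match rule in Regex.Matches(css, @"([^{}]+)\{([^}]*)\}"))
        {
            var classNames = Regex.Matches(rule.Groups[1].Value, @"\.([A-Za-z0-9_-]+)")
                                  .Cast<Match>()
                                  .Select(m => m.Groups[1].Value)
                                  .ToList();

            // Keep rules that use no class selectors, or whose classes are on the page.
            if (classNames.Count == 0 || classNames.Any(usedClasses.Contains))
                output.AppendLine(rule.Value);
        }
        return output.ToString();
    }
}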
I have an MVC 2 web application which uses models auto-generated by LINQ. When I add a Silverlight project to my solution, generating a new strongly-typed View fails with the message:
Compiling transformation: The type or namespace name 'Data' does not exist in the namespace 'System' (are you missing an assembly reference?). I understand this is most likely because Silverlight does not access the System.Data namespace (at least, I can't add the reference to my Silverlight app). However, it's not really important - I'm just trying to generate an .aspx View at this stage, not a Silverlight View.
Is there any way to get this template generation to work, or do I have to create my Silverlight project outside the solution and build it separately? I was kind of hoping to take advantage of WCF RIA and some of the other goodies one gets from including the Silverlight app within the main solution... anybody got a fix?
We use MojoPortal for a website and have some problems uploading files that are around 100 MB with the upload module. (Please note that this probably has nothing to do with MojoPortal itself, but with ASP.NET and IIS.)
MojoPortal is set to use regular file upload (not NeatUpload), and to be able to upload big files we have set the following:
The problem is that the upload cancels after a couple of minutes (Aborted).
Are there any other values that I need to set to make this possible? MojoPortal itself should not have any settings for this as far as I know, so it's regular ASP.NET 4.0.
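For reference, these are the values that usually matter for large uploads on ASP.NET 4.0 under IIS 7 (a sketch; the numbers are only examples sized for roughly 150 MB). Note that maxRequestLength is in kilobytes while maxAllowedContentLength is in bytes, and executionTimeout only takes effect with debug="false", which fits the "aborted after a couple of minutes" symptom:

<!-- Sketch of the usual web.config settings for large uploads (example values). -->
<system.web>
  <!-- maxRequestLength is in KB; executionTimeout is in seconds. -->
  <httpRuntime maxRequestLength="153600" executionTimeout="3600" />
</system.web>

<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxAllowedContentLength is in bytes; IIS 7 rejects larger requests outright. -->
      <requestLimits maxAllowedContentLength="157286400" />
    </requestFiltering>
  </security>
</system.webServer>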
Has anyone got some good pointers to an open-source component for uploading large files (an article on creating your own would be even better)? SlickUpload, for instance, works great and is surely worth the money, but as this is for a pet project, a paid solution is just not what I'm after.
I am building a website where I need a page where users can upload large video files. I have created a WCF service with streaming, but I am calling that WCF service from the Button_Click event of the web page.
I used the article mentioned below for creating the WCF service:
WCF Streaming
I used streaming because it should be efficient and should not buffer the file in the server's memory.
Now my questions:
1) I suspect that the entire file is uploaded to the web server and then transferred to the WCF service server... If this is true, then I am not getting the advantage of streaming, and IIS and the web server will go down very soon if a user uploads a large file or multiple users upload files concurrently.
2) Is there any more efficient way to do the same operation with some other technique?
EDIT :
Even if I am not calling the WCF service method from the ASP.NET code, bytes are still transferred to the web server, which I have verified with HttpFox.
I checked this with an upload control and a button on the UI whose click event is bound to a method in the code-behind.
So I am still confused about how the data is transferred:
Client machine -> Web server (ASP.NET application) -> Service server (WCF service), or
Client machine -> Service server (WCF service)?
NOTE: If I put a breakpoint on Button_Click and upload a 10 KB file, it hits the breakpoint in less than 1 second, but if I upload a 50 MB file, it takes time.
I placed the code that calls the WCF service inside that Button_Click event.
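On question 1: as long as the file arrives via a FileUpload control and the service is called from Button_Click, the bytes do travel client -> web server first, because the browser posts the whole file to the .aspx page; only then does the code-behind stream it on to the WCF host. The streamed contract mainly prevents the service side from buffering the entire file in memory. To skip the extra hop, the upload has to be posted to the service host itself (for example an endpoint with transferMode="Streamed" that the client calls directly). A rough sketch of such a contract, with hypothetical names and paths:

// Sketch only (hypothetical names): a streamed WCF upload contract. Requires a
// binding configured with transferMode="Streamed" so the body is not buffered.
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IUploadService
{
    [OperationContract]
    void UploadFile(FileUploadMessage request);
}

[MessageContract]
public class FileUploadMessage
{
    [MessageHeader]
    public string FileName { get; set; }

    [MessageBodyMember]
    public Stream FileData { get; set; }   // the single Stream body member required for streaming
}

public class UploadService : IUploadService
{
    public void UploadFile(FileUploadMessage request)
    {
        // Copy the incoming stream to disk in small chunks instead of loading it whole.
        using (var target = File.Create(Path.Combine(@"C:\Uploads", request.FileName))) // hypothetical path
        {
            var buffer = new byte[64 * 1024];
            int read;
            while ((read = request.FileData.Read(buffer, 0, buffer.Length)) > 0)
            {
                target.Write(buffer, 0, read);
            }
        }
    }
}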
I have a large folder (1.5 GB) in my ASP.NET Web Site, called Downloads, which contains media files, graphics, etc. No code that needs to be compiled or even deployed on every build.
I use Web Deployment Projects to compile and deploy. Every time I build the deployment project, the Downloads folder gets included and copied to the output location. This is taking a significant amount of time, because the folder is copied twice.
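One option worth trying (hedged; the item and property names come from the standard Web Deployment Project targets) is to exclude the folder from the build in the .wdproj file, so the Downloads content is left where it is instead of being copied on every build:

<!-- Sketch for the .wdproj file: keep the Downloads folder out of the
     Web Deployment Project output. -->
<ItemGroup>
  <ExcludeFromBuild Include="$(SourceWebPhysicalPath)\Downloads\**\*.*" />
</ItemGroup>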
I need to create an upload site for large files over 2 GB; I want to create a site like [URL]. Once these files get uploaded, I want them to have a link to the created file, but with the link encrypted. I know there is a limit to HTTP uploads. I have used a bunch of the Flash upload web apps, but they are capped at a specific size because of .NET. What options are out there?
I'm using a web form that allows users to upload media files. The code works great on small to medium-sized files, but I've found that if a file is really big (like bigger than 15 MB), the user will get a 404 error. Currently I'm using the code below to handle the file. Does .NET provide another way to handle larger files?
I used this sample to work around the issue we were having with large files. [URL] Unfortunately, when I attempt to download large files of 30 MB or more, the download times out and the user gets a partial download. It doesn't seem to be a consistent percentage of the download either. I attempted to download a 50 MB file and got to 33 MB. When trying a 30 MB file, I downloaded 24 MB. Below is my code.
if (File.Exists(strFilePath))
{
    fileName = System.IO.Path.GetFileName(strFilePath);
    Response.Clear();
    System.IO.Stream iStream = null;
    byte[] buffer = new Byte[10000];
    int length;
    long dataToRead;
    try
    {
        iStream = new System.IO.FileStream(strFilePath, System.IO.FileMode.Open,
            System.IO.FileAccess.Read, System.IO.FileShare.Read);
        dataToRead = iStream.Length;
        //FileInfo file = new FileInfo(strFilePath);
        Response.ContentType = "application/octet-stream";
        Response.AddHeader("Content-Length", iStream.Length.ToString());
        Response.AddHeader("Content-Disposition", "attachment; filename=" + fileName.Replace(" ", string.Empty));
        while (dataToRead > 0)
        {
            if (Response.IsClientConnected)
            {
                length = iStream.Read(buffer, 0, 10000);..................
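Downloads that stall at 24-33 MB are usually a flushing/timeout issue. The sample this code is based on typically continues roughly as below, and the Response.Flush() per chunk (together with a generous executionTimeout) is what keeps a large download alive, since each block is pushed to the client as it is read. This is only a sketch of the usual continuation, not the missing original code:

// Sketch of how such loops usually continue (not the poster's original code):
while (dataToRead > 0)
{
    if (Response.IsClientConnected)
    {
        length = iStream.Read(buffer, 0, 10000);
        Response.OutputStream.Write(buffer, 0, length);
        Response.Flush();                    // push this chunk to the client now
        dataToRead = dataToRead - length;
    }
    else
    {
        dataToRead = -1;                     // client disconnected, stop sending
    }
}
// ...followed by a finally block that closes iStream.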
I'm testing a very simple aspx page on Visual Studio's own ASP.NET Development Server (the local server). On the web page there is a FileUpload control which can upload jpg files up to 2 MB without problems. On uploading bigger files, the browser immediately shows "The web page cannot be displayed". It does not show any exception, which really puzzles me. "The web page cannot be displayed" is normally caused by a network problem, but in this case it's a local server and it handles smaller jpg files fine. What's the problem here?