Reasons for C# projects to rebuild in Visual Studio
I have a large solution, of some 320 projects, where even small changes to a single web form result in long build times to test/debug a small change. I suspect post-build file-copy tasks of 'touching' file datetimes and causing multiple rebuilds.

Are there any other reasons for VS 2010 to run a rebuild other than source files being newer than binary output files, in the absence of any strong naming and versioning influences?

ADDENDUM: I have no immediate influence on the size or shape of the source tree. It is the trunk of a stable core product, and any short-term "non-business" changes are considered risky and/or costly. I will raise points mentioned in answers I receive here as input to future directions of the project.

Arette answered 25/3, 2011 at 7:41 Comment(2)
I don't know how to fix your rebuild problem specifically, but 320 projects in a solution is not a good idea. If I were you I'd post a question detailing what categories of projects these are, what the general dependencies are between them, asking for the best way to make them manageable.Zoellick
My god, 320 projects? I hit 10 once and it was unbearable. I'm so so sorry. To help with that you can create multiple solutions, or a manual build process using a tool like psake, that pare down to just the essential projects for a given scenario.Preconcert
C# projects rebuild because a file has changed, or a dependency has changed, but also frequently for "no obvious reason". You can often build 2 or 3 times in a row without making a single change and Visual Studio will trundle off and rebuild vast amounts of the code.

Turn on Tools > Options > Projects and Solutions > Build and Run > Only build startup projects and dependencies on run. This will minimise the build to only the dependencies of the project you intend to execute, so unless everything is one massive dependency tree, this will significantly reduce the number of projects considered in the build.
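If you drive builds from the command line, you get the same effect by building just the project you care about rather than the whole solution; MSBuild then only considers that project's reference closure (the project name here is hypothetical):

```shell
rem Build only MyWebApp and the projects it references,
rem rather than all 320 projects in the solution.
msbuild MyWebApp\MyWebApp.csproj /t:Build /p:Configuration=Debug
```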

The better solution, however, is to avoid asking it to compile in the first place.

  • Every project adds an additional overhead to the build. On my PC this is around 3 seconds. By merging the code of two projects into one .csproj file I save 3 seconds of build time. So for 300+ projects you could knock 10 minutes off the build time just by merging projects - i.e. where possible use many modules/namespaces in one assembly rather than many assemblies. You'll find your application startup time is improved too.

  • Many projects don't need building all the time. Break these out as library projects and just reference the built binaries.

  • Also disable "copy local" and instead point all OutputPaths to a shared folder - this will significantly reduce the overheads by avoiding making 320 copies of 320 dlls all over your hard drive.

  • Add new build configurations (e.g. "Debug FastBuild") that only build the assemblies you are actually working on. With the Configuration Manager you can untick projects you don't want to build, and presto, they are ignored by the build system. After you get code from source control, build a full "Debug" build to refresh everything, and then switch back to "FastBuild".
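The "copy local" and shared-output suggestions above amount to something like this in each .csproj (paths and reference names are illustrative, not from the original question):

```xml
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <!-- Point every project at one shared bin folder instead of a per-project bin\Debug -->
  <OutputPath>..\..\SharedBin\Debug\</OutputPath>
</PropertyGroup>
<ItemGroup>
  <Reference Include="Some.Library">
    <HintPath>..\..\SharedBin\Debug\Some.Library.dll</HintPath>
    <!-- Private=False is "Copy Local = false": don't duplicate the dll
         into this project's own output folder -->
    <Private>False</Private>
  </Reference>
</ItemGroup>
```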

Mattins answered 25/3, 2011 at 7:58 Comment(2)
Thank you for the basic tips, I know that small tweaks can come in very handy when building big solutions.Lentamente
The "no obvious reasons" include a change of the build configuration (x86/x64, Debug/Release), which seems to falsely create a file "build.force" in the obj folder: developercommunity.visualstudio.com/t/…Contemn
You can try building your solution using msbuild with /v:diag (diagnostic logging; note that /v:d is the short form of "detailed", one level below diagnostic).

There may be hints in the log for why a project is not up to date.
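For example (the solution name is hypothetical), redirect the diagnostic output to a file and search it for staleness messages; MSBuild reports which input file was newer than which output, which is usually enough to spot a post-build step touching files:

```shell
msbuild BigSolution.sln /t:Build /v:diag > build.log
findstr /c:"is newer than" build.log
```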

Iluminadailwain answered 25/3, 2011 at 8:4 Comment(0)
I think at 320 projects you've long ago exceeded the limit of what Visual Studio was designed to do comfortably.

There's lots of good advice here but if you really must keep 320 projects you might want to ditch Visual Studio for building and design your own build process.

Like I said in my comment, on Windows at least, my recommendation for a build tool is psake, which is built on PowerShell (and therefore included on every modern Windows machine), quite powerful, and lightweight enough to commit the entire thing to your source control.

Preconcert answered 19/3, 2013 at 18:24 Comment(0)
320 projects means a lot of files in the dependency tree: all of them need to be checked for changes by the build engine. Using something like Process Monitor will allow you to see what file operations are going on to confirm this.

Also, 320 assemblies (on top of the .NET Framework assemblies) to load at run time will slow your application down, and really slow the debugger (a lot of symbols to load). Look at Debug | Windows | Modules to see the modules loaded.

As well as reducing the number of projects in the solution (it is quite possible to have multiple solutions, each with a subset of the projects), do you really need that many separate projects? Outside the debugger, the .NET loader only loads and JITs code from assemblies as it is needed, so larger assemblies don't mean slower load times, and they avoid the per-assembly overhead of the Windows loader.

Airburst answered 25/3, 2011 at 7:58 Comment(1)
I believe they can help that last point out with a post-build ILMerge stepPreconcert
First, check to make sure the references to other projects are PROJECT references and not assembly references.

Second, check for circular references.

Third, set the logging verbosity to DIAGNOSTIC and look at the very first line and see why it's rebuilding projects after building the solution for the first time.

There is lots more that can be checked, but take a look at those 3 things first.
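The first point looks like this in the .csproj (project names and the GUID are illustrative): a &lt;ProjectReference&gt; lets MSBuild track the dependency and rebuild it only when stale, whereas a raw &lt;Reference&gt; with a HintPath into another project's output folder hides the dependency from the build system:

```xml
<ItemGroup>
  <!-- Good: a project reference; MSBuild knows the dependency
       and can check its timestamps properly -->
  <ProjectReference Include="..\Core\Core.csproj">
    <Project>{11111111-2222-3333-4444-555555555555}</Project>
    <Name>Core</Name>
  </ProjectReference>

  <!-- Suspect: an assembly reference into another project's bin folder;
       MSBuild cannot reliably tell when it is stale -->
  <Reference Include="Core">
    <HintPath>..\Core\bin\Debug\Core.dll</HintPath>
  </Reference>
</ItemGroup>
```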

Stammer answered 19/3, 2013 at 18:10 Comment(0)