Build Process - What to use?

I'm considering writing my own delivery code using PowerShell and/or C#, maybe shelling to NAnt or MSBuild.

  1. Why should I not go this way? Is it really such a hard endeavor compared to using NAnt or MSBuild?
  2. Any good, modern book that can help?
  3. Any better ideas?

Background (P.S. This is a religious issue for some. No insult intended):

A one-person shop with multiple exploratory projects. Like most of us, currently on Windows and ASP.NET, and considering mobile and cloud.

I've started meddling with NAnt, and have tried to follow Expert .Net Delivery Using NAnt and CruiseControl.Net. The whole issue of "delivery" was put on ice and now it's time to "defrost" it. However, I'm not sure which way to go. From what I've learned:

NAnt is showing its age. It's clumsy: it's much harder to understand and maintain than a modern, OO language such as C#. Even after following the book, it seems strange to work in an arcane environment where what you want executed is XML, and where looping and inheritance are (as far as I remember from before the "ice age") hard to impossible.

MSBuild is MS-specific. I'm not even sure whether it supports non-MS environments. Team Foundation Server is expensive.

Even so, they both somehow seem to provide value, because in my SO search I haven't seen anybody using their own custom build software. Still, I don't understand why I shouldn't use C# and simply call NAnt and/or MSBuild tasks as needed.

SO - NAnt Vs. MSBuild

My advice is just the opposite - Avoid MSBuild like the plague. NANT is far far easier to set up your build to do automatic testing, deploy to multiple production environments, integrate with cruisecontrol for an entry environment, integrate with source control. We've gone through so much pain with TFS/MSBuild (Using TFSDeployer, custom powershell scripts, etc) to get it to do what we were able to do with NANT out of the box. Don't waste your time.

SO - NAnt vs. scripts:

there's much more to building a product than just compiling it. Tasks such as creating installs, updating version numbers, creating escrows, distributing the final packages, etc. can be much easier because of what these tools (and their extensions) provide. While you could do all this with regular scripts, using NAnt or MSBuild give you a solid framework for doing all this

Haversine answered 14/11, 2009 at 10:19 Comment(1)
Thank you all for the amazing answers - it'll take a little while to digest, but I really feel as if I've just had a great consultants' meeting. I wish I could "accept" all your answers! Thanks SO for such a great platform - both technically and socially. – Haversine

NAnt as a project is dead, or on life support (the last release was 0.86 beta 1, two years ago). It was basically cut short by the release of MSBuild. NAnt was nice, MSBuild is OK I guess, but I feel more and more drawn to writing code in, well, a language, instead of some XML-based procedural stuff. With XML-based build frameworks the debugging experience is awful, and you end up cheating by embedding "scripts" in C#, which defeats the purpose of being declarative rather than programmatic. Same sad story as XSLT, for that matter.

There are some good XML-free build frameworks around.

I still use MSBuild since it's the format of csproj files, but for specific stuff I shun building logic in XML (no custom MSBuild tasks).

Typesetter answered 14/11, 2009 at 11:44 Comment(3)
NAnt may not be actively maintained, but the last release works fine. I find it significantly easier to use than writing a custom MSBuild script. For building Visual Studio solutions/projects, I just shell (exec) out to MSBuild from NAnt. – Spenserian
Yep, it still works, but extending it is painful and NAntContrib has flatlined for a while. What's needed for a build system is sensible dependency management, lots of custom tasks, easy debugging and easy extensibility. NAnt only provides the first two, and custom tasks are sometimes a bit flaky. But if you have a working setup of NAnt scripts, it's fine to keep it as is. – Typesetter
NAnt is actively developed, see the activity log on SourceForge: sourceforge.net/projects/nant/develop – Bowling

Don't use C# for build scripts, because you don't want to have to recompile them every time you change them.

If you plan to use PowerShell, then take a look at psake.
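
For a flavour of what a psake build script looks like - this is only a rough sketch, the solution and test paths are made up, and it assumes msbuild and mstest are on the PATH:

# build.ps1 - a psake build script (names and paths are hypothetical)
properties {
    $solution = '.\sandbox.sln'
    $config   = 'Debug'
}

task default -depends Test

task Compile {
    # exec throws if the external command exits with a non-zero code
    exec { msbuild $solution /p:Configuration=$config /nologo }
}

task Test -depends Compile {
    exec { mstest /testcontainer:".\sandbox.Tests\bin\$config\sandbox.Tests.dll" /nologo }
}

You would then run it with Invoke-psake .\build.ps1 from a PowerShell prompt after importing the psake module.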

If you are XML-friendly, then go with MSBuild or NAnt; maybe those build scripts are valuable to you. MSBuild has one advantage: Ctrl + F5 builds the script from within Visual Studio.

I have been moving slowly to Rake because it's nicer and Ruby is a programming language (which means you can do anything). I blogged about how nice it can be, but you'd have to translate it or just look at the code. If you like that, you may want to see the full script and its dependencies.

A good book about continuous integration is Continuous Integration: Improving Software Quality and Reducing Risk by Paul Duvall, Steve Matyas and Andrew Glover.

Gilgai answered 14/11, 2009 at 11:9 Comment(0)

At the end of the day, both MSBuild and NAnt tasks can shell out to the command line, so ultimately they could support non-MS stuff.

I'd favour MSBuild personally, and would let a build server like TeamCity or CC.NET do the build and deployment, etc. It's incredibly flexible.

Minimalist answered 14/11, 2009 at 11:13 Comment(2)
+1 for TeamCity. We started with CC.NET at work and switched to TeamCity for ease of maintenance and additional features. We still use NAnt behind the scenes, but it needs to do far less than with CC.NET. jetbrains.com/teamcity I still have a hard time seeing how anyone can prefer writing an MSBuild script to a NAnt one, though - the former seems to be a clumsy clone of the latter. – Spenserian
There's a whole raft of custom MSBuild tasks out there - see MSBuild Community Tasks, Sdc Tasks and the Microsoft MSBuild Extension Pack, to name a few. Implementing your own task is completely and utterly trivial: inherit from Task and implement the Execute() method. I find it a doddle. – Minimalist

I can see NO reason not to do a simple build process in PowerShell.

I use the following for my sandbox project:

#Types
Add-Type -Path D:\Code\OSLib\SharpSvn-x64\SharpSvn.dll

#Tools
$svn = New-Object SharpSvn.SvnClient
$msbuild = 'D:\Windows\Microsoft.NET\Framework64\v4.0.21006\msbuild'
$mstest = 'D:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\mstest'

#Sandbox
$sandbox = New-Object SharpSvn.SvnUriTarget -argumentlist "http://dan:password@sevenmagoo:81/svn/sandbox/trunk/"
$workingdir = 'D:\Code\sandbox'
$builddir = 'D:\Build\sandbox'
$solution = $builddir + '\sandbox.sln'
$tests = '/testcontainer:D:\Build\sandbox\sandbox.Tests\bin\Debug\sandbox.Tests.dll'

# Open the solution in Visual Studio
function sandbox() { ii D:\Code\sandbox\sandbox.sln }

function build()
{
    echo 'Empty build directory and recreate'
    rm -r -Force $builddir | out-null;
    md $builddir | out-null;

    # SvnClient.Checkout returns $true on success, so the result is printed
    echo 'Checkout successful? '
    $svn.Checkout($sandbox, $builddir);

    echo 'Building'
    .$msbuild $solution /nologo;

    echo 'Testing'
    .$mstest $tests /nologo;
}
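
To use a script like this, you might dot-source it into your session and call the functions (assuming it's saved as, say, build.ps1 - the file name is my guess):

. .\build.ps1    # load the helper functions into the current session
build            # clean, check out, compile and run the tests
sandbox          # open the solution in Visual Studio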

Ayende Rahien has done a similar thing too, here.

Hope that helps,

Kindness,

Dan

Roubaix answered 14/11, 2009 at 10:27 Comment(2)
What is the point of wrapping MSBuild tasks with a PowerShell script? Just use MSBuild tasks (there are tons of good custom MSBuild tasks out there) and call MSBuild directly. – Minimalist
I wrote it because I always have a PowerShell window open and wanted to be able to build on command and run the project's unit tests without opening Studio. Kindness, Dan – Roubaix

The biggest problem is the maintenance of your build scripts. If you see your projects or environments staying somewhat static, then there's no reason not to pen your own build, packaging and deployment scripts.

Things typically get far more complex as your projects do. Some of the advantages MSBuild, NAnt and other (commercial) products offer are pre-baked support for integration with common services or concepts (say, zipping up files), and it costs considerably less time to bolt that into your build process.
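
For instance, zipping up the build output is a one-line task in those frameworks, while a hand-rolled script has to carry the plumbing itself. A rough PowerShell sketch (the paths are made up, and ZipFile needs .NET 4.5 or later):

# Hand-rolled packaging step - a framework's zip task replaces all of this
Add-Type -AssemblyName System.IO.Compression.FileSystem

$output  = 'D:\Build\sandbox\bin\Release'        # hypothetical build output
$package = 'D:\Build\sandbox\sandbox-1.0.zip'    # hypothetical package name

if (Test-Path $package) { Remove-Item $package }
[System.IO.Compression.ZipFile]::CreateFromDirectory($output, $package)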

There's going to be cost (real, or in terms of effort) so as long as you can reasonably forecast your automation needs, go in the direction that makes sense for your environment.

I added some considerations for whatever approach you find fits best here; they may help in determining the scope of the automation.

Priceless answered 14/11, 2009 at 10:41 Comment(0)

If you intend to continue to be a "one person shop" and your projects are generally small, then you can probably get by with rolling your own build system. I would still recommend not doing it since having experience with commonly used build environments is a good thing to have on your resume.

About five years ago I wrote the custom build system we use at work to handle some rather complex, multi-target builds. We have been using it since about 2003 and continue to use it. However, I have been trying to move it towards Ant or even Make for the following reasons:

  1. The build system that I wrote runs on Windows and we have a need for other platforms. It could be rewritten to run elsewhere fairly easily, but the process-management code would be hard to port.
  2. Integration into a CI environment is a pain if you are using a completely custom environment.
  3. Adding support for different SCM technologies is a bear. We have needed support for Visual SourceSafe, Subversion, and Perforce so far.
  4. Extending and maintaining it is a lot of work that "adds no customer value".

Now, our environment is a little more complex than most since we do embedded systems development, so it is not uncommon for a single product build to use two or three different tool-chains on Windows, shell out to a Linux machine via ssh for a Linux-only target, and psexec to a remote Windows machine to use a node-locked compiler. It's really quite funny to look back and think that the build system we have started as a single batch file, was rewritten in Perl to accommodate a mixture of declarative statements and programming-language statements, and was then written again in an Ant-like XML declarative style that shells out to batch or shell scripts. Now I'm thinking about replacing all of that with Ant+Maven+Ivy or some similar chain.

Rolling my own build system was the correct decision for me at the time since we were a pretty small shop doing builds that were primarily based on command-line tools and there weren't a wide array of tools available at the time. Today, however, I would recommend taking a good and hard look at the available tools. After all, writing your own build system means that you will be spending time and money writing and maintaining it instead of writing production code.

There are a lot of tools available for the task today that handle almost any twisted idea that you can come up with. I think that the time spent learning an existing system and extending it to meet your needs is probably more worthwhile. I found the experience of writing Ant tasks quite interesting. Overall, this was a good learning experience even though I did it on a contract job that used Ant and CruiseControl to publish documents.

Harvin answered 14/11, 2009 at 13:33 Comment(0)

One thing that I haven't seen much mention of so far: dependency/rebuild management. A good build system (even the venerable make) will handle this for you, and if you build your own build system you're doing one of two things:

  • Frequently rebuilding everything (or at least more than you need to)
  • Re-implementing the dependency tracking and update logic yourself

From that standpoint, I would think long and hard before rolling your own solution.
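
To make the second point concrete, here is a minimal PowerShell sketch (paths and file names are made up) of the timestamp-based up-to-date check that make, NAnt or MSBuild give you for free:

function Test-OutOfDate([string]$Target, [string[]]$Sources) {
    if (-not (Test-Path $Target)) { return $true }               # never built
    $targetTime = (Get-Item $Target).LastWriteTime
    foreach ($src in $Sources) {
        if ((Get-Item $src).LastWriteTime -gt $targetTime) { return $true }   # a source is newer
    }
    return $false
}

# Rebuild only when a source file is newer than the output
$sources = (Get-ChildItem .\src -Recurse -Filter *.cs).FullName
if (Test-OutOfDate .\bin\MyApp.dll $sources) {
    & msbuild .\MyApp.csproj /nologo
}
else {
    'Up to date - skipping build'
}

Multiply that by every target, every generated file and every transitive dependency, and you have in effect started rewriting make.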

It's sad to hear that NAnt is dying, though; while it has its share of warts, Ant is a reasonable and flexible build system.

Hernandez answered 14/11, 2009 at 13:54 Comment(0)
