Thursday, January 19, 2012

Automated Builds and Continuous Integration with Jenkins

The concept of an automated build is fairly basic: you take your existing build file (make, NAnt, Ant, MSBuild/.sln, Rake) and set up a service that builds it from your source control repository.

Automating builds

Setting up automated builds provides a number of advantages:

  1. Everything your project needs to build and run is checked into source control.
  2. Scripting your builds removes human error; computers are simply better at running repetitive tasks.
  3. You get a log of what was done and when it was done.
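The loop a build service runs boils down to those three points: sync from source control, run the existing build file, and record what happened and when. Here is a minimal sketch in Python; the SVN and NAnt commands are placeholders standing in for whatever checkout and build file your project actually uses.

```python
import datetime
import subprocess

def run_step(description, command):
    """Run one build step and return a timestamped log line for it."""
    started = datetime.datetime.now().isoformat(timespec="seconds")
    try:
        result = subprocess.run(command, capture_output=True, text=True)
        status = "OK" if result.returncode == 0 else "FAILED"
    except OSError:
        # The tool isn't installed or the command is wrong -- log it, don't crash.
        status = "FAILED (command not found)"
    return f"{started} {description}: {status}"

# Placeholder commands -- substitute your own checkout and build file here.
log = [
    run_step("checkout", ["svn", "update", "."]),
    run_step("build", ["nant", "-buildfile:default.build"]),
]
print("\n".join(log))
```

A real build server does the same thing with a scheduler and a web UI on top, but the core is just this: run the steps, capture the status, keep the history.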

At a former employer the marketing department needed to do content updates on a fairly regular basis. So they wouldn't have to wait for us, I set them up with SVN for the content directory and created some users for content deployment. All they had to do was update the content, commit it, and deploy it. We were happy because we didn't need to spend time on their deployments, and they were happy because they could push content any time they wanted to.

Continuous Integration 

This is a key part of doing Test Driven Development: making sure your tests run on every check-in. This verifies that all of the tests pass in the build environment, lets everyone know about a problem as soon as the build runs, and lets the responsible developer fix it as soon as possible, resulting in better overall code quality.

Using Jenkins

Jenkins is an open source build server forked from the Hudson project; when Oracle bought Sun, some of the Hudson team decided it was time to go their own way. Jenkins benefits from a long history of community support from its Hudson days, making it a very mature and extensible system. Having said that, it's not as easy to use as some other build servers (see Continuous Integration with TeamCity), but what it lacks in ease of use, it makes up for in versatility and flexibility.

User Management

Like most things in Jenkins, user management is very flexible: by default it supports LDAP, its own internal user database, and delegating to the servlet container, and there is a huge list of user management plug-ins for everything from Active Directory to GitHub OAuth. Once users are added, authorization is also fairly flexible, from allowing anyone to do anything to completely locking things down.

My biggest complaint here is that by default everything is wide open, and the responsibility of locking it all down falls on the administrator.

Build Jobs

By default Jenkins supports Maven 2/3 projects, completely rolling your own multi-step build, and monitoring an external job, with a large selection of plug-ins supporting everything from Rake to PowerShell to FitNesse to an Android emulator.

Rolling your own multi-step build is fairly straightforward, if a little involved, with a lot of the basic options you would expect: whether to keep old builds, how long to keep them, how many to keep, quiet period, retry count, etc.

With its plug-ins, your source control options include just about anything you can think of: SVN, CVS, Git, ClearCase, etc. The same goes for build triggers: by default it supports building after other projects, building periodically (nightly builds, for example), polling SCM, and remote triggering (calling a URL with an auth token to start a build).
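That remote trigger is just an HTTP GET against the job's build URL with the job's auth token, which makes it easy to kick off builds from a post-commit hook or deploy script. A small sketch of composing that URL (the host, job name, and token below are made-up placeholders):

```python
from urllib.parse import quote, urlencode

def build_trigger_url(base_url, job_name, token):
    """Compose a Jenkins remote-trigger URL of the form
    <base>/job/<name>/build?token=<token>."""
    return f"{base_url}/job/{quote(job_name)}/build?{urlencode({'token': token})}"

url = build_trigger_url("http://ci.example.com:8080", "nightly-build", "s3cret")
print(url)  # -> http://ci.example.com:8080/job/nightly-build/build?token=s3cret
```

A hook script would then simply issue the request, e.g. with urllib.request.urlopen(url), and Jenkins queues the build.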

When you're done with your build, there is built-in support to publish your build results over FTP or SSH, with plug-ins to do everything from uploading to Amazon S3 to publishing mobile apps to the Appaloosa Store.

The downside to all of this flexibility is that, unlike with TeamCity, you are forced to do a lot of the configuration yourself: if you want your test results published, you have to configure it yourself; if you want code coverage results, you have to configure something like PartCover yourself.

Reporting

Just as important as automating your builds is knowing what is going on with them: when are they running, did a build fail, what failed, etc. Jenkins has fairly nice tools built in to show you what is going on, with a build history trend report, RSS feeds, email notification, and plug-ins for everything from dashboard views to a wall-friendly display of all build jobs.

Conclusion

While not as nice to set up and manage as some other build servers, Jenkins is incredibly powerful, with a lot of flexibility, and to be honest it's really hard to complain about an enterprise-level build server for free. I would compare it to running Linux vs. Windows servers: Windows may be more polished, but it's not as flexible.

Sunday, January 8, 2012

Finally decided to do a dev tools list

One of the most important parts of being a craftsman is your selection of tools; it doesn't matter if you are a carpenter, a mason, a cabinet maker, or even a software developer, your tools allow you to get the work done. What tools you use really depends on your style and technique and may vary greatly from person to person. This is a simple list of IDEs, editors, libraries, etc. that I like and that work for me; some are paid for, most are not. This isn't a complete list of the tools I use, but it's a good solid overview of what I use on a day-to-day basis. Take a look and see what you like, or don't.

IDEs

  • Visual Studio 2010 – The default editor for doing .NET development; it has its share of problems, but for the most part it is a very solid IDE.
  • MonoDevelop – This is where .NET started for me 8 years ago. Not a lot of frills, but for a fully functional cross-platform .NET IDE the price is right at free, unless you want to do Android or iOS development, in which case you need to buy MonoDroid or MonoTouch.
  • NetBeans – My go-to Java and PHP IDE, with good intellisense and built-in refactoring tools.

Visual Studio Plugins

  • Resharper – The most valuable IDE plugin I have ever used, providing NUnit support, refactoring, add reference, and the list keeps going. I honestly have a hard time using Visual Studio without it.
  • DotCover – An important part of test driven development is code coverage, and DotCover is my coverage tool of choice.  It integrates with Resharper’s test runner to provide on request code coverage in the IDE.

Source Control

  • AnkhSVN – Provides SVN support for Visual Studio; so far this is the best free SVN plugin I have found for it.
  • VisualSVN Server – A simple and easy SVN server that just works. It may not have a lot of bells and whistles, but it's easy to install and easy to manage users.

Text Editors

  • Notepad++ – One of the first things I do on any new Windows system is install Notepad++. It integrates with Windows Explorer, making it simple to edit any text file without having to deal with whatever the default app for the extension is, and with a huge list of plugins for everything from XML to PowerShell, it is the Swiss Army Knife of text editors.

SQL Tools

  • Toad for SQL Server – This is a new item on my tools list; I just started using it this week, but I really like it. The community version has some really nice intellisense, and the way the UI works is just clean. I'm really interested in seeing a head-to-head comparison between Redgate's SQL Toolbelt and the Toad Development Suite for SQL Server; the features look about the same, but at around $1,300 vs. $2,000, Toad is a lot cheaper.
  • FluentMigrator – A .NET version of the Ruby migrations tool. With a fairly simple syntax and helpers, it makes scripting DB changes easy; to read more, check out a blog post I did a little while ago, Using Fluentmigrator with nant.

Unit Testing

  • NUnit – The workhorse of the .NET unit testing world. It's a solid testing framework with some of the best tool support out there, with the exception of Visual Studio, but Resharper fixes that.
  • Moq – The simplest mocking framework for .NET, with a clean fluent syntax that provides both ease of use and versatility.

Build And CI Server

  • TeamCity – By far the easiest to get set up and running, with built-in support for most of the tools I use and DotCover built in. It provides a solid solution out of the box with a very polished UX.
  • Jenkins – A fork of the Hudson project, Jenkins is a solid if less-than-polished build and CI server, but what it lacks in polish it makes up for in versatility and flexibility, with a huge selection of plugins for integrating with just about any development environment and deploying to just about any other system out there.

Thursday, January 5, 2012

Test Driven Development from the management point of view


When I have given presentations on TDD, one of the biggest questions I get is "How do I sell this to my boss?", and it's a very valid question: you are basically asking your boss to let you write 2-3x as much code to make sure the code you wrote is correct, and that's a tough sell all on its own. Software can never be done soon enough, most projects are already behind from the start, and now you want to take more time to write more code?

Time is Money
There is a cost to Test Driven Development (TDD): you are doing more work. Writing tests takes time, maintaining tests takes time, and running tests takes time. With time equaling money, the question is "what are you buying?" The simple answer is security and long-term cost reduction.

Ever wonder what it would be like to have a car without a check engine light? For the most part it's something most of us never look at; when it turns on, we take the car to the shop and the mechanic fixes whatever is wrong. Without the check engine light to tell you something is wrong, what could have been a small repair becomes an engine rebuild. Tests provide the same thing: they let you know there is a problem before it becomes a larger problem, whether that's inconsistent application behavior, data corruption, or the application crashing.

Real world example
At a former employer we were migrating our flagship application to a new production database; the database itself was a very complex system with lots of tables, triggers, stored procedures, etc. The DBA migrated the schema to the new server, got a snapshot of the data, and then ran our integration tests, which had failures everywhere: invalid permissions, missing columns, etc. Even using the failed tests as a road map, it took the DBAs a day and a half to fix all of the problems. I can't even guess how long it would have taken if we had gone straight to production; on top of that, the company's main product would have been down, affecting not only us but all of the customers that depended on our product to run their businesses.

We could have just blamed the DBA for not getting everything right the first time, but how much time was he given to do it? Would it have been cost effective to spend more time on this one migration, versus spending the time writing tests that can verify it over and over again?

Spending a little to save a lot
Software bugs have costs: they cost to find, they cost to track, they cost to fix, they cost to verify when fixed, and they cost to push out the fix.
  • Cheap – A developer finds a bug as it's written and fixes it on the spot; very little cost.
  • More expensive – QA finds the bug: the tester has to confirm it's an actual bug, retest to work out how to reproduce it, record it in a bug report, and send the report to the developer; the developer fixes the bug and sends the fix back to QA, where it's retested and the bug report is closed.
  • Very expensive – A user finds the bug and contacts customer service; customer service sends it to QA, QA verifies the bug and creates a bug report, the developer fixes it and sends the fix to QA, QA verifies the fix, and the fix is redeployed to production.

How much a bug costs depends on how soon it's discovered: the sooner it's found, the fewer people touch it and the less it costs. By writing tests, developers are far more likely to find bugs sooner. Also, nothing is more frustrating than having bugs reappear; manually retesting existing functionality takes time, increases QA cost, and reduces the time QA has to find new bugs and edge-case bugs. TDD inherently provides regression testing and can greatly reduce your QA time.

Real world example
At one time I worked as a QA engineer at a major printer company; a little over half my time was spent testing not only new functionality but existing functionality as well, and the rest was spent sending bug reports to the developers, answering their requests for clarification, verifying bugs were fixed, etc. As much as it cost for me to test a printer, the cost of having a driver bug, or even worse a firmware bug, get released to customers was astronomical. By reducing the time I had to spend retesting and verifying bugs, I could spend more time looking for edge-case bugs.

Something else to think about: unlike web development, where you are only pushing code to your own servers, having hardware with a bug in the hands of consumers adds the cost of service representatives, service techs, etc.

Reducing costs by building on a well-built foundation
TDD by its nature encourages good development practices: by breaking code up into more testable sections, you are turning it into more manageable pieces. This reduces the time and effort required to update and maintain your application, greatly reducing the need for the dreaded rewrite.
  
Ask any construction contractor what makes the biggest difference on a building remodel, and they are probably going to tell you "How well the original building was built." If the foundation isn't level or the walls aren't straight, the job is that much harder because of having to compensate for the existing structure. The same applies to software development: the less old code that needs to change to add new features, the less it costs to add them, with the added bonus of the tests telling you where your changes affect the rest of the application.

Real world example
I worked on a project that needed to replace how users authenticated; the user still entered a user name and password, but how they were verified needed to be updated for security reasons. Because the application was broken up, all that needed to change was adding the new authentication code and changing one line in the existing code to wire in the new functionality.
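That one-line change is only possible when verification sits behind a small interface. A sketch in Python (not the original codebase; both schemes here are invented stand-ins) of how swapping the verification scheme touches only one wiring line:

```python
import hashlib

class LegacyAuthenticator:
    """Old scheme: compare the raw password (what had to be replaced)."""
    def __init__(self, stored_password):
        self.stored = stored_password

    def verify(self, username, password):
        return password == self.stored

class HashedAuthenticator:
    """New scheme: compare a SHA-256 digest instead of the raw password."""
    def __init__(self, stored_hash):
        self.stored = stored_hash

    def verify(self, username, password):
        return hashlib.sha256(password.encode()).hexdigest() == self.stored

class LoginService:
    """The rest of the app only ever calls .verify() -- it never changes."""
    def __init__(self, authenticator):
        self.authenticator = authenticator  # the one line that changed

    def login(self, username, password):
        return self.authenticator.verify(username, password)
```

Because LoginService only depends on the verify() method, the tests for it keep passing no matter which authenticator is wired in, which is exactly what made the real change so cheap.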

Using TDD isn't required to follow these best practices, but trying to do TDD without them ranges from painful to next to impossible.