Breaking code dependencies with in-place refactoring

or, How to TDD Inside a Legacy Application

There's a common story behind monolithic applications, and I see it fairly frequently. First, a developer writes a simple little program to solve a simple little problem or to plug a little gap in a solution.

Because it's just a simple app, not much attention is paid to separation of concerns, cohesion or any of the other Good Things we should be doing all the time.

If they and the application are lucky, the application will be useful and requests for new features and modifications will come in from its users. What started as a small utility is now blossoming into a bigger project and a bigger problem.

Cruft

By the time the project has grown to multiple files, multiple classes, configuration files, databases and multiple developers stretched out over time, it is starting to suffer from an accumulation of cruft. Anyone who works on the application is happy to add code, perhaps by copying, pasting and modifying, but not many are confident about refactoring when the code begins to scream out for it.

A monolithic application, for my purposes here, is defined as a single executable containing lots of functionality, perhaps in Windows Forms code-behind style, that cannot be unit tested.


public class ConfusedConcernsGodClass
{
    public int MixedConcerns()
    {
        // horrible code here
        return 75;
    }
}


static class Program
{
    static void Main(string[] args)
    {
        // simulated legacy app.
        Console.WriteLine(new ConfusedConcernsGodClass().MixedConcerns());
    }
}

Refactoring

Refactoring this kind of application can be challenging because you want to identify repeated patterns in the code and extract them into separate methods or classes, break out dependencies, factor out interfaces and separate concerns. Classically, we would extract code into a separate assembly and point the application and our new unit tests at that new assembly.

Of course, to refactor code we need to be as confident as we can be that we are not going to break the existing functionality.

But what happens if the code has been written in "utility" style with large "god" classes, implicit dependencies and mixed responsibilities? We may not be able to do a straight extract without breaking lots of dependent pieces or dragging along code which does not belong in the new "clean" assembly. Often, if UI and business logic are intertwingled in the same code, we don't want to bring the UI code with us into the new, refactored code.

Add to this that the majority of the unit test frameworks I have used don't support testing code that lives inside an executable.

In-Place Refactoring

One way around this is to use an in-place refactoring technique. I've used this a number of times, so I thought it was time I captured it in a set of code examples.

Basically, I wrote a tiny, super-simple unit test framework that can be added to a project as a single .cs file. As part of the framework, I "spoof" a very lightweight facsimile of the attributes and Assert calls that test code uses in MsTest or NUnit.

In this way, you can write the tests in your application so that, at the source code level, they look like they target your favourite framework (but you're really using the in-place, lightweight one), and other developers will be immediately familiar with the behaviour of the tests. When you come to move the code into a separate assembly and are ready to use the real framework, the test code shouldn't need to change; you just need to add the correct references.

Framework

Currently, I only support a subset of the features available in MsTest and NUnit. Other frameworks are available, and the principles remain the same.


using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Reflection;
using System.Runtime.Serialization;

/*
 * Test Attributes - Add attributes to your test code
 * 
 */

/// <summary>
/// MsTest test class
/// </summary>
[AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
internal class TestClassAttribute : Attribute
{
}

/// <summary>
/// MsTest test method.
/// </summary>
[AttributeUsage(AttributeTargets.Method, AllowMultiple = false)]
internal class TestMethodAttribute : Attribute
{
}

/// <summary>
/// NUnit test class
/// </summary>
[AttributeUsage(AttributeTargets.Class, AllowMultiple = false)]
internal class TestFixtureAttribute : Attribute
{
}

/// <summary>
/// NUnit test method.
/// </summary>
[AttributeUsage(AttributeTargets.Method, AllowMultiple = false)]
internal class TestAttribute : Attribute
{
}


/*
 * Asserts - mixture of MsTest and NUnit
 * 
 */

internal static class Assert
{
    /* MsTest specific */
    public static void IsTrue(bool condition)
    {
        if (!condition)
            NotifyFailure("Assert.IsTrue");
    }

    public static void IsFalse(bool condition)
    {
        if (condition)
            NotifyFailure("Assert.IsFalse");
    }

    public static void IsNull(object value)
    {
        if (value != null)
            NotifyFailure("Assert.IsNull");
    }

    public static void IsNotNull(object value)
    {
        if (value == null)
            NotifyFailure("Assert.IsNotNull");
    }

    public static void AreEqual(object expected, object actual)
    {
        if (!object.Equals(expected, actual))
            NotifyFailure("Assert.AreEqual");
    }

    // NUnit
    public static void AreNotEqual(object expected, object actual)
    {
        if (object.Equals(expected, actual))
            NotifyFailure("Assert.AreNotEqual");
    }

    public static void Fail()
    {
        NotifyFailure("Assert.Fail called");
    }

    /* End of asserts */

    internal static void NotifyFailure(string message)
    {
        throw new AssertionFailedException(message);
    }
}

/*
 * Framework 
 * 
 */

[Serializable]
internal class AssertionFailedException : Exception
{
    public AssertionFailedException()
    {
    }

    public AssertionFailedException(string message)
        : base(message)
    {
    }

    public AssertionFailedException(string message, Exception ex)
        : base(message, ex)
    {
    }

    protected AssertionFailedException(SerializationInfo info, StreamingContext context)
        : base(info, context)
    {
    }
}

/// <summary>
/// Instance of one test.
/// </summary>
internal class TestCase
{
    public TestCase(Type t, MethodInfo m)
    {
        this.TestType = t;
        this.TestMethod = m;
    }

    public void Run()
    {
        object instance = Activator.CreateInstance(this.TestType);

        this.TestMethod.Invoke(instance, null);
    }

    public override string ToString()
    {
        return this.TestType.Name + " " + this.TestMethod.Name;
    }

    private Type TestType { get; set; }
    private MethodInfo TestMethod { get; set; }
}

/// <summary>
/// Finds test cases in an assembly.
/// </summary>
internal interface ITestFinder
{
    IEnumerable<TestCase> FindTests(Assembly a);
}

/// <summary>
/// Executes test cases and logs results.
/// </summary>
internal interface ITestExecutor
{
    int Passes { get; }

    int Failures { get; }

    void Run(IEnumerable<TestCase> testCases, ITestLogger logger);
}

/// <summary>
/// Writes log results.
/// </summary>
internal interface ITestLogger
{
    void Log(string text);
}

/// <summary>
/// Write results to debug window
/// </summary>
internal class DebugLogger : ITestLogger
{
    public void Log(string text)
    {
        Debug.WriteLine(text);
    }
}

/// <summary>
/// Finds MsTest-style tests
/// </summary>
internal class MsTestFinder : AttributedSourceTestFinder<TestClassAttribute, TestMethodAttribute>
{
}

/// <summary>
/// Finds NUnit-style tests
/// </summary>
internal class NUnitTestFinder : AttributedSourceTestFinder<TestFixtureAttribute, TestAttribute>
{
}

/// <summary>
/// Generic test finder, given class and method level attributes.
/// </summary>
/// <typeparam name="TestClass"></typeparam>
/// <typeparam name="TestMethod"></typeparam>
internal class AttributedSourceTestFinder<TestClass, TestMethod>
    : ITestFinder
    where TestClass : Attribute
    where TestMethod : Attribute
{
    public IEnumerable<TestCase> FindTests(Assembly a)
    {
        var list = new List<TestCase>();

        // look for all test classes,
        foreach (Type eachType in a.GetTypes())
        {
            object[] testClassAttributes = eachType.GetCustomAttributes(typeof(TestClass), false);

            if (testClassAttributes.Length == 0)
                continue;

            // build list of test methods...
            foreach (MethodInfo method in eachType.GetMethods())
            {
                object[] testMethodAttributes = method.GetCustomAttributes(typeof(TestMethod), false);

                if (testMethodAttributes.Length == 0)
                    continue;

                list.Add(new TestCase(eachType, method));
            }
        }

        return list;
    }
}

internal class SimpleTestExecutor : ITestExecutor
{
    public int Passes { get; private set; }

    public int Failures { get; private set; }

    public void Run(IEnumerable<TestCase> testCases, ITestLogger logger)
    {
        this.Passes = 0;
        this.Failures = 0;

        foreach (TestCase testCase in testCases)
        {
            Exception ex = null;

            try
            {
                testCase.Run();
                ++this.Passes;
            }
            catch (TargetInvocationException tie)
            {
                ex = tie.InnerException;
            }
            catch (AssertionFailedException afe)
            {
                ex = afe;
            }

            if (ex != null)
            {
                ++this.Failures;
                logger.Log(testCase.ToString() + " : " + ex.Message);
            }
        }
    }
}

internal class Inplace
{
    private Inplace()
    {
    }

    public static Inplace Tests()
    {
        return new Inplace
        {
            Finder = new MsTestFinder(),
            Executor = new SimpleTestExecutor(),
            Logger = new DebugLogger()
        };
    }

    private ITestFinder Finder { get; set; }

    private ITestExecutor Executor { get; set; }

    private ITestLogger Logger { get; set; }

    public Inplace UsesMsTest()
    {
        this.Finder = new MsTestFinder();

        return this;
    }

    public Inplace UsesNUnit()
    {
        this.Finder = new NUnitTestFinder();

        return this;
    }

    public void Run()
    {
        Assembly thisAssembly = Assembly.GetExecutingAssembly();

        var allTestCases = new List<TestCase>();

        allTestCases.AddRange(this.Finder.FindTests(thisAssembly));

        this.Executor.Run(allTestCases, this.Logger);

        if (this.Executor.Failures > 0)
        {
            this.Logger.Log(string.Format("{0} tests failed", this.Executor.Failures));
        }
    }
}
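
The Assert class only needs to grow as your in-place tests demand. As a rough sketch of how it might be extended, the members below could be pasted into the Assert class above; they approximate MsTest's AreEqual overload that takes a failure message and an NUnit-style Throws helper, and they are illustrations rather than faithful copies of either framework's API.


// Possible additions to the in-place Assert class above.
// These only approximate the real frameworks' AreEqual-with-message and
// Throws helpers; treat them as a sketch, not the genuine APIs.

public static void AreEqual(object expected, object actual, string message)
{
    if (!object.Equals(expected, actual))
        NotifyFailure("Assert.AreEqual : " + message);
}

public static void Throws<TException>(Action action) where TException : Exception
{
    try
    {
        action();
    }
    catch (TException)
    {
        return; // the expected exception type was thrown
    }

    // no exception (a wrong exception type propagates and is logged by the executor)
    NotifyFailure("Assert.Throws<" + typeof(TException).Name + ">");
}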

The Process

Step 0 in this approach is to include (copy and paste is your friend in this situation) the framework code above somewhere in your application, either as a separate file in the project or just in the same file as the problem code.

Then you can add characterisation or specific scenario tests to the class (here using a simulation of MsTest).


// MsTest example
[TestClass]
public class ConfusedConcernsGodClass
{
    public int MixedConcerns()
    {
        // horrible code here
        return 75;
    }

    // First test method 
    [TestMethod]
    public void MixedConcerns_Always_Returns_75()
    {
        Assert.AreEqual(75, this.MixedConcerns());
    }
}

Then, before any other code runs, you can make a call to the in-place test runner from your "main" method like this:


static class Program
{
    static void Main(string[] args)
    {
        Inplace.Tests().Run();

        // existing legacy application code here.
    }
}
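
The runner defaults to the MsTest-style finder. If you would rather write the in-place tests with NUnit-style attributes ([TestFixture] and [Test]), the fluent UsesNUnit method shown in the framework swaps the finder before the run:


// Use the NUnit-style attribute finder instead of the MsTest-style default.
Inplace.Tests().UsesNUnit().Run();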

The next step is to identify some candidate functionality that could be refactored. Wherever this code happens to be, create a test for it in a separate method, named and attributed as if targeting your favourite unit test framework. Continue identifying and writing tests in place until you have built up a description of the current functionality in small tests inside the application. With all the tests passing, we can begin to refactor: breaking dependencies, inserting interfaces, extracting methods, renaming and so on, all while making sure the tests still pass.
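
As an illustration of a single refactoring step taken under the cover of these tests, suppose the MixedConcerns calculation is worth separating from everything else in the god class. The sketch below (ICalculation and MixedConcernsCalculation are hypothetical names of my own, not from any real application) extracts the logic behind an interface while the original method delegates to it, so the existing in-place test keeps passing:


// Hypothetical extraction step: ICalculation and MixedConcernsCalculation
// are illustrative names, not part of the original application.
public interface ICalculation
{
    int Calculate();
}

public class MixedConcernsCalculation : ICalculation
{
    public int Calculate()
    {
        // the "horrible code" moved here, away from any UI concerns
        return 75;
    }
}

[TestClass]
public class ConfusedConcernsGodClass
{
    // the original method now delegates, so the existing test still passes
    public int MixedConcerns()
    {
        return new MixedConcernsCalculation().Calculate();
    }

    [TestMethod]
    public void MixedConcerns_Always_Returns_75()
    {
        Assert.AreEqual(75, this.MixedConcerns());
    }
}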

Each time you want to run the tests, run the application and the tests run too. Leave all of the code in the application; don't be tempted to move anything out into another assembly yet.

Crunch Time

Once you are happy with the new structure, you can copy and paste the tests into a test assembly, copy the refactored code into a separate assembly, and point both the tests and the until-recently-monolithic application at it. Delete the in-place test framework (or comment it out if you want to do more refactoring). Remember to include references to your targeted unit test framework in the moved tests.
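
For example, once the hypothetical MixedConcernsCalculation class from the earlier sketch has been moved into its own assembly, the relocated test only needs the real framework's namespace (Microsoft.VisualStudio.TestTools.UnitTesting for MsTest, or NUnit.Framework with the NUnit attribute names) plus a project reference; the test body itself is unchanged. The test class name below is again my own invention:


// In the new test project: the only additions are the real MsTest namespace
// and a reference to the extracted assembly. The test body is untouched.
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class MixedConcernsCalculationTests
{
    [TestMethod]
    public void Calculate_Always_Returns_75()
    {
        Assert.AreEqual(75, new MixedConcernsCalculation().Calculate());
    }
}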

This technique has helped me more than once so I hope someone else finds it useful.