
Getting started with Test Driven Development

There are multiple ways to adopt TDD (Test Driven Development) for different scenarios. Two of the most common scenarios are creating a brand-new project and using tests to drive development on an existing one.



For the first scenario, we have a brand-new project that we want to develop in incremental stages, releasing efficiently and shipping new features after every sprint. Let’s imagine that we are creating a brand-new system. The following steps will guide you to write lean, clean, modular, and highly scalable code which can easily adapt to unseen changes in requirements (sound familiar?).

 

Problem statement:

We want to add logic that, if a number is provided, does XYZ and returns a value. For illustration, let’s assume this is functionality we are adding to a service layer.

  • Create a new MSTest project (the test project template that ships with Visual Studio) and install the NUnit (a test automation framework usable within Visual Studio) and NUnit Test Adapter (allows you to run NUnit tests within Visual Studio) NuGet packages. This choice is subjective; you may prefer to write your tests in MSTest, for example, and nothing stops you.

  • Create an Interfaces folder and an Implementation folder (vague names just to illustrate the intention; to correlate with my example, I used IProcessingService as the interface name).

  • Add an empty interface to the Interfaces folder (it is good practice to name your interface with a leading capital I).

  • Add empty method stub(s) for whatever you can see upfront, knowing they will change at some point (add known parameters too; you can always refactor newer parameters in later).

  • Add an empty class to the Implementation folder (I used ProcessingService as the class name).

  • Make your class implement the interface (you can very simply press CTRL+. then Enter to generate the method stub(s) it is meant to implement).

  • You will now have empty methods which "throw new NotImplementedException()" - time to write some tests!

  • How you write or structure your tests is up to you. I prefer writing them in one big test and then refactoring common variables out to a 'BaseTest' class, which I inherit from whenever subsequent tests covering multiple scenarios need those objects. I will illustrate this entire problem-solving approach. In the meantime, open your Test Explorer.

  • I am using a fully licensed Visual Studio 2022 to illustrate, but you may use the Community edition. Go to the Test menu and select Test Explorer. This menu item is available in most Visual Studio versions that I have worked with.

 

 

Visual Studio will open a window like this:

[Screenshot: the Test Explorer window]
 

If you are following along, your dummy project should look something like this:
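In place of the original screenshot, here is a sketch of the skeleton at this point. The names come from the steps above; the int? parameter and int? return type are my assumptions based on the problem statement (a number that might be provided, and "a value" returned):

    using System;

    // Interfaces/IProcessingService.cs
    public interface IProcessingService
    {
        // XYZ stands in for whatever the real operation is.
        int? XYZ(int? number);
    }

    // Implementation/ProcessingService.cs
    public class ProcessingService : IProcessingService
    {
        public int? XYZ(int? number)
        {
            // Generated via CTRL+. and deliberately left unimplemented until a test demands otherwise.
            throw new NotImplementedException();
        }
    }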
  • Your setup is now complete; now we write tests.

  • The problem in words is this:

We want to add logic such that a number might be provided, and our logic needs to do XYZ and return a value.

What does this tell us exactly? We need to make a method that returns a value based on a NULLABLE input. We should name our tests appropriately; I prefer using the Given -> we should semantics (taken from the BDD approach: Given, When, Then).

So, from the example, “Given that we receive a number, we should return a value” is one test case or scenario which we need to provide a solution for. This tells us that we have our first test:
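A minimal sketch of what that first NUnit test might look like, assuming the int? return type from earlier so that "a value" can be asserted as not null:

    using NUnit.Framework;

    [TestFixture]
    public class ProcessingServiceTests
    {
        [Test]
        public void Given_a_number_we_should_return_a_value()
        {
            var service = new ProcessingService();

            var result = service.XYZ(5);

            // "We should return a value"; nothing more is asserted yet.
            Assert.That(result, Is.Not.Null);
        }
    }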

Now that we have our very first test, after running it, we see that it fails… perfect!

Now to make it pass. Again, what does the scenario say? We should return a value. So, look at our implementation code and make exactly and only that happen.
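A sketch of the smallest implementation that could satisfy the test, again assuming the int? return type; returning any constant is exactly and only enough:

    public int? XYZ(int? number)
    {
        // Just enough to satisfy "we should return a value" and nothing more.
        return 0;
    }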

Then we run our test in the Test Explorer and see that it works. Now for the next scenario that we can see from the problem statement: “based on a value.” This tells us that the argument matters too, so we write our next test. It is important that we add a new test rather than changing our old one; this way we maintain rules that were previously enforced by code. If done correctly, no piece of code exists anywhere for no reason.
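A sketch of that second test. The doubling rule here is purely hypothetical, standing in for whatever "does XYZ" really means:

    [Test]
    public void Given_a_number_the_value_should_be_based_on_it()
    {
        var service = new ProcessingService();

        // Hypothetical rule for illustration: the result is double the input.
        var result = service.XYZ(5);

        Assert.That(result, Is.EqualTo(10));
    }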

Within our test class, we now have two tests which both use new instances of the ProcessingService. So, let us refactor that out to a common method that is executed before every test run. In NUnit, this is annotated with the [SetUp] attribute.
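A sketch of the refactored fixture, with the shared instance created once per test in [SetUp]:

    [TestFixture]
    public class ProcessingServiceTests
    {
        private ProcessingService _service;

        [SetUp]
        public void SetUp()
        {
            // Runs before every test in this fixture: one shared place to build the object under test.
            _service = new ProcessingService();
        }

        [Test]
        public void Given_a_number_we_should_return_a_value()
        {
            Assert.That(_service.XYZ(5), Is.Not.Null);
        }

        [Test]
        public void Given_a_number_the_value_should_be_based_on_it()
        {
            Assert.That(_service.XYZ(5), Is.EqualTo(10));
        }
    }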

Add code into your implementation, again, only enough that will allow the tests to pass.
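Under the hypothetical doubling rule from the second test, the smallest change that passes both tests might be:

    public int? XYZ(int? number)
    {
        // Still only enough to pass both tests; the null path stays
        // untouched until a test demands it. (With int?, null * 2 is null.)
        return number * 2;
    }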

It is crucial to only do exactly what is required to make your tests pass. Your tests retain business requirements, which makes it easier for someone else down the line to pick up a piece of work that might need to be adjusted. It also helps to mitigate bloat in code and enables developers to conform with our revered SOLID principles.


Faking/Mocking within unit tests

Now for the next part of the first example: mocking. Let us assume an additional requirement creeps in, stating that we want to do something with our result from XYZ (for simplicity’s sake).

Since we will now be writing more tests, let’s employ a vital technique called mocking, which allows us to isolate the code under test so we can unit test individual pieces in isolation. So, in our interface, let’s add a stub: void DoSomething(). Repeat the generation process by pressing CTRL+. then Enter to automatically create the stub in your implementation. Mocking should only be used when writing unit tests, not integration tests.

Now we write our next test. In my case, I named it we_should_do_something.

As soon as you see that pieces of code are becoming interdependent, ensure that they depend on abstractions rather than details; this is why we inject interfaces instead of carrying around concrete objects in memory.

It is crucial to remember the importance of the dependency inversion principle. Once you see you are violating it on this type of project, refactor the co-reliant pieces of code into injectable interfaces without breaking business rules or requirements. For more information, visit: https://deviq.com/principles/dependency-inversion-principle


Since we are refactoring anyway, we should move the XYZ method to its own injectable interface. After some refactoring, we have isolated pieces of the puzzle which we can test very easily.
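A sketch of where this refactoring might land, using Moq as the mocking library (which matches the It.IsAny<int?>() syntax used a little later). IXYZService and the int? parameter on DoSomething are assumed names; the article's original stub was a bare DoSomething():

    using Moq;
    using NUnit.Framework;

    // XYZ moved behind its own injectable abstraction.
    public interface IXYZService
    {
        int? XYZ(int? number);
    }

    public interface IProcessingService
    {
        void DoSomething(int? number);
    }

    public class ProcessingService : IProcessingService
    {
        private readonly IXYZService _xyzService;

        // Depend on the abstraction, never the concrete class.
        public ProcessingService(IXYZService xyzService)
        {
            _xyzService = xyzService;
        }

        public void DoSomething(int? number)
        {
            var result = _xyzService.XYZ(number);
            // ...do something with result...
        }
    }

    [TestFixture]
    public class ProcessingServiceTests
    {
        [Test]
        public void we_should_do_something()
        {
            // The fake isolates ProcessingService from the real XYZ logic.
            var xyzService = new Mock<IXYZService>();
            xyzService.Setup(x => x.XYZ(5)).Returns(10);

            var service = new ProcessingService(xyzService.Object);

            Assert.DoesNotThrow(() => service.DoSomething(5));
            xyzService.Verify(x => x.XYZ(5), Times.Once);
        }
    }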

The best way to unit test void methods is to test their side effects, such as state changes on objects. In this case, that it does not throw exceptions.

Fakes also have a massive positive impact on the execution speed of unit tests, since nothing real is ever called. So, let us go back to our unit tests, change only the param to It.IsAny<int?>(), and run the tests again.
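In the mock test sketched above, the two changed lines might look like this:

    // Loosen the setup and verification so any nullable int matches; the
    // fake no longer depends on a specific argument value.
    xyzService.Setup(x => x.XYZ(It.IsAny<int?>())).Returns(10);

    xyzService.Verify(x => x.XYZ(It.IsAny<int?>()), Times.Once);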

The last piece of test code we will write in this illustration is the TearDown method. This is simply the opposite of the SetUp method within NUnit: the TearDown method runs after each test within a TestFixture. There are also TestFixtureSetUps and TestFixtureTearDowns (OneTimeSetUp and OneTimeTearDown in NUnit 3) for objects that are shared across tests. Those run once before each test fixture, and once after each fixture.

Here is a link with a list of annotations with their functionality explained: https://docs.nunit.org/articles/nunit/writing-tests/attributes.html   
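A sketch of what these hooks look like inside the fixture from earlier:

    [TearDown]
    public void TearDown()
    {
        // Runs after every test: release anything SetUp created.
        _service = null;
    }

    [OneTimeSetUp]
    public void FixtureSetUp()
    {
        // Runs once before all tests in this fixture.
    }

    [OneTimeTearDown]
    public void FixtureTearDown()
    {
        // Runs once after all tests in this fixture have finished.
    }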

Now that we have a handle on unit tests, let us go to the next scenario, using integration tests to show us how we can prevent bad code from entering our systems.

It is worth mentioning that with Azure DevOps, every build pipeline can include a Run Tests step; we can configure it to run tests like the ones above and then publish a release based on the outcome of that execution. This protects us from making ‘bad’ releases.

RGR (Red-Green-Refactor)

 

If you look at the process we followed, we:

  • wrote a failing unit test first (Red)

  • then made it pass (Green)

  • then refactored the test and the code (Refactor)

 

The above flow is classic TDD in a nutshell. We write failing tests, make them pass, then refactor the implementation and tests as required, without changing requirements.


Integration Tests

This section covers another method of TDD. Let us assume that we have inherited a system that already has a certain architecture, and rewriting it is now just out of budget. We need a way for tests to restrict and maintain functionality.

 

These tests call or invoke the methods that we are using or changing and ensure that behaviour is maintained when a new piece of code is added, at a more holistic level, from the start of the process to the end. Let us use the same scenario.

 

Scenario:

We want to add logic that, when a number is provided, does XYZ and returns a value, and it should then do something. So the entire test should be that it should do something, since XYZ is invoked inside DoSomething, but this time we will provide actual, real argument(s), not fakes. This is essential in integration tests: we always need to call the real database or API if one is somewhere along the process.

 

Let us start with our test class. I named it “Given_a_number_we_should_return_a_value_and_do_something”, a bit elaborate, yes, but it is important to retain requirements in your tests and cover the entire scenario.

 

According to our layers, the XYZ method is invoked inside the DoSomething method, so we just need to invoke DoSomething to test the process according to the scenario provided. Since the result of XYZ surfaces within DoSomething, however, we need to assert against it.

 

Have a look at the test below: we are providing an int within the range MinValue to MaxValue to the XYZService, obtaining the actual result of that execution, then passing that result to the DoSomething method, and asserting against a side effect.
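A sketch of that test, following the description above; XYZService is an assumed name for the real, concrete implementation of IXYZService:

    using System;
    using NUnit.Framework;

    [TestFixture]
    public class Given_a_number_we_should_return_a_value_and_do_something
    {
        [Test]
        public void it_should_do_something()
        {
            // Real collaborators, no fakes.
            var xyzService = new XYZService();
            var processingService = new ProcessingService(xyzService);

            // An int anywhere in the range MinValue..MaxValue.
            var number = new Random().Next(int.MinValue, int.MaxValue);

            // Obtain the real result of XYZ, hand it to DoSomething,
            // and assert on a side effect (here: no exception is thrown).
            var result = xyzService.XYZ(number);
            Assert.DoesNotThrow(() => processingService.DoSomething(result));
        }
    }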


But hold on, this is not the only route or scenario we should cover; remember that the argument could be null, so...
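...a second integration test covering the null route, again as a sketch:

    [Test]
    public void it_should_still_do_something_when_the_number_is_null()
    {
        var xyzService = new XYZService();
        var processingService = new ProcessingService(xyzService);

        // The input is nullable, so the null route deserves its own test.
        var result = xyzService.XYZ(null);

        Assert.DoesNotThrow(() => processingService.DoSomething(result));
    }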


Test databases can be set up for the purpose of managing data needed for scenarios, and a whole array of tools can be employed to maintain them. Usually, the integration test pack is kept outside the system.

 

For developers, tests may also serve as documentation. It is never too late to add an integration test pack to retain domain knowledge, help maintain your codebase, monitor your data and integrations, and, overall, ensure system health. It is all requirements- and scope-dependent. If we are working in a repo with a build pipeline, a “Run Tests” step can be configured to execute against the code each time a piece of it is changed.

 

Go ahead and refactor these integration tests as per what we did in the unit tests section of this article as a practice exercise.

I really hope you enjoyed this article, thanks for reading.

 

Vishav Premlall

Innovation & IoT Consultant


Hi, I'm Vishav, a dynamic individual whose passion for gaming, commitment to fitness, and prowess in the world of software development make for a well-rounded and captivating personality. By day, I immerse myself in the ever-evolving realm of coding and problem-solving, crafting innovative solutions as a dedicated software developer. When the workday concludes, you'll find me swapping lines of code for a controller, diving into the immersive worlds of gaming with a fervor that matches my technical expertise. But it's not all about screens and keyboards for me; I am a dedicated fitness enthusiast, as I understand the importance of maintaining a healthy body and mind. Whether it's pumping iron at the gym or engaging in high-energy workouts, I have found the perfect balance between the virtual and physical worlds. This unique combination of tech savvy, gaming enthusiasm, and a commitment to fitness creates a vibrant and multifaceted individual with a passion for both the digital and physical realms.
