Load test data from a json file for xUnit tests


xUnit 101

xUnit is a unit testing tool for .NET. If you’re new to testing with xUnit, I suggest reading the getting started documentation.

xUnit supports both parameterless and parameterized tests. There are 3 different ways to supply data to parameterized tests:

  • Inline Data works well when the method parameters are constant, but it gets unwieldy pretty quickly when you have a lot of test cases. It also can’t be used when the data is not constant.
  • Class Data removes clutter from test files by moving the data to a separate class. It also allows you to pass non-constant data to the test. The downside is that you have to create a new class.
  • Member Data is similar to Class Data but uses a static property or method on an existing type instead of a dedicated class.
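To make the difference concrete, here is a minimal sketch of the same theory fed first by Inline Data and then by Member Data. The test class, the `Sum` helper, and the `SumCases` member are all hypothetical, invented for illustration:

```csharp
using System.Collections.Generic;
using System.Linq;
using Xunit;

public class SumTests
{
    // Hypothetical method under test
    private static int Sum(IEnumerable<int> numbers) => numbers.Sum();

    [Theory]
    [InlineData(new[] { 1, 2 }, 3)]     // attribute arguments must be constants
    [InlineData(new[] { 1, 2, 3 }, 6)]
    public void Sum_WithInlineData(int[] data, int expected)
        => Assert.Equal(expected, Sum(data));

    // Non-constant data has to come from a member (or a separate class)
    public static IEnumerable<object[]> SumCases =>
        new List<object[]>
        {
            new object[] { new[] { 1, 2 }, 3 },
            new object[] { Enumerable.Range(1, 100).ToArray(), 5050 },
        };

    [Theory]
    [MemberData(nameof(SumCases))]
    public void Sum_WithMemberData(int[] data, int expected)
        => Assert.Equal(expected, Sum(data));
}
```

Note how the non-constant case (`Enumerable.Range(...)`) is only possible with Member Data or Class Data.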

Problems

All 3 of the above approaches share a shortcoming: every time you want to add new test data, you need a recompile. The classes/methods can also become quite large if you have a lot of data. For example, consider the sample input for the puzzle in the Advent of Code 2018.


First Pass

Both these problems would just go away if we could load our test data from a file. Andrew Lock has a great article which shows how to create a custom attribute to load the data from a JSON file. Since this article leans heavily on his approach, I recommend reading that first.

Go on, I will wait.

Oh good, you are back. So you might be wondering: if Andrew has already written that article, why am I writing this, and more importantly, why should you spend your precious time reading it?

Well, I found his solution works very well for test cases with a small number of parameters. However, it becomes quite cumbersome to use with a large set of parameters. For the Advent of Code test input, we would need a lot of parameters. We could reduce the number of parameters to just one, since it’s a single (albeit large) list of the same type. However, I was not able to figure out how to structure my JSON so that it could be parsed easily.


Improvements

Let us start by creating a new generic class which takes 2 type parameters – one for the Data and one for the Result. This class will be used to deserialize the JSON data.

class TestObject<T1, T2>
{
	public List<T1> Data { get; set; }

	public T2 Result { get; set; }
}
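With Json.NET, a single TestObject<string, int> would deserialize from JSON shaped like this (the values here are made up for illustration):

```json
{
    "Data": [ "line one", "line two" ],
    "Result": 42
}
```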

Now, let’s modify our attribute class. For brevity, I am just showing the relevant code here. For the whole file, please see here.

public class JsonFileDataAttribute : DataAttribute
{
	public override IEnumerable<object[]> GetData(MethodInfo testMethod)
	{
		// fileData is the raw file data
		// _propertyName is the name of the top-level property to read (e.g. "Part1")
		// _dataType and _resultType are set in the constructor and are the types
		// for the input data and the result

		var specific = typeof(TestObject<,>).MakeGenericType(_dataType, _resultType);
		var generic = typeof(List<>).MakeGenericType(specific);

		var jsonData = JObject.Parse(fileData);
		dynamic datalist = jsonData[_propertyName].ToObject(generic);

		var objectList = new List<object[]>();
		foreach (var data in datalist)
		{
			objectList.Add(new object[] { data.Data, data.Result });
		}
		return objectList;
	}
}

So what exactly are we doing here?

  1. Use MakeGenericType to get the concrete Type of TestObject by substituting the generic type parameters with the actual types specified in the test.
  2. Use MakeGenericType again to get a new type which is a List of the newly constructed TestObject type.
  3. Parse the file data as JSON.
  4. Deserialize the relevant JSON data into the constructed list type and store it in a dynamic variable. We need to use dynamic here as we don’t know the types passed into TestObject at compile time, and they can change for each test.
  5. Add all the data into a list of object arrays and return it.
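Steps 1 and 2 can be sketched in isolation. This standalone snippet (the `Program` class is illustrative, not part of the attribute) shows that the reflected types are exactly the ones we would otherwise write by hand:

```csharp
using System;
using System.Collections.Generic;

class TestObject<T1, T2>
{
    public List<T1> Data { get; set; }
    public T2 Result { get; set; }
}

class Program
{
    static void Main()
    {
        // Step 1: close the open generic TestObject<,> over the runtime types
        Type specific = typeof(TestObject<,>).MakeGenericType(typeof(string), typeof(int));

        // Step 2: build List<TestObject<string, int>> from the constructed type
        Type generic = typeof(List<>).MakeGenericType(specific);

        Console.WriteLine(specific == typeof(TestObject<string, int>));       // True
        Console.WriteLine(generic == typeof(List<TestObject<string, int>>)); // True
    }
}
```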

This allows us to write our tests in the following manner

[Theory]
[JsonFileData("testData.json", "Part1", typeof(string), typeof(int))]
public void Test(List<string> data, int expectedResult)
{
	var result = TestThisMethod(data);
	Assert.Equal(expectedResult, result);
}
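For this to work, testData.json needs a top-level property matching the second attribute argument ("Part1"), holding an array of Data/Result pairs. A plausible shape, with the actual values invented for illustration:

```json
{
    "Part1": [
        { "Data": [ "+1", "-2", "+3" ], "Result": 2 },
        { "Data": [ "+1", "+1", "+1" ], "Result": 3 }
    ]
}
```

New test cases are now just new entries in this file – no recompile needed.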

Conclusion

In this post, we built upon Andrew’s basic implementation of a custom JSON data source to make it easier to work with larger as well as more complex sets of test data.
