MessageHandler

Setting Up Persistence


Introduction

So far, the lifetime of the decisions taken by your system has been limited to the scope of a test run, as none of the events were stored for future retrieval.

In this lesson, you'll learn how to make your system remember events for the long term.

By the end of this lesson, you will have:

  • Configured an event source to persist events to, and load them from, an Azure storage table.
  • Verified that your event stream integrates correctly with the storage infrastructure, using an integration test.

The code created up until now can be found in the following commit of the reference repository; it provides the starting point for this lesson.

Writing an integration test

You start off by creating an integration test.

Integration tests verify that the different services used in a system work together correctly.

In this case, you want to verify the integration of your event model with the Azure Table Storage service.

Add a project to hold the integration tests

Right click the Solution MessageHandler.LearningPath in the Solution Explorer and choose Add > New Project.

Select the project template called xUnit Test Project, tagged with C#. Click Next.

Enter project name OrderBooking.Events.IntegrationTests. Click Next.

Select framework .NET 6.0 (Long-term support). Click Create.
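If you prefer the command line, the same project can be created with the dotnet CLI. This is a sketch; it assumes the dotnet SDK is on your PATH, you run it from the solution folder, and the solution file is named MessageHandler.LearningPath.sln:

```shell
# Create the xUnit test project targeting .NET 6.0
dotnet new xunit --name OrderBooking.Events.IntegrationTests --framework net6.0

# Add it to the solution (solution file name assumed)
dotnet sln MessageHandler.LearningPath.sln add OrderBooking.Events.IntegrationTests
```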

Add a project reference to OrderBooking.Events

Right click on Dependencies of the OrderBooking.Events.IntegrationTests project in the Solution Explorer and choose Add Project Reference.

In the Reference Manager window, select OrderBooking.Events. Click OK.
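Alternatively, the reference can be added from the command line; the project file paths below assume the default folder layout, where each project sits in a folder of the same name:

```shell
dotnet add OrderBooking.Events.IntegrationTests/OrderBooking.Events.IntegrationTests.csproj \
  reference OrderBooking.Events/OrderBooking.Events.csproj
```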

Add the MessageHandler.EventSourcing.AzureTableStorage package

Right click on Dependencies of the OrderBooking.Events.IntegrationTests project in the Solution Explorer and choose Manage NuGet Packages.

In the NuGet Package Manager window, browse for MessageHandler.EventSourcing.AzureTableStorage (include prerelease needs to be checked).

Select the latest version and click Install.

Click I Accept in the License Acceptance window.
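The equivalent dotnet CLI command, run from the solution folder, would look like this; the --prerelease flag matches the "include prerelease" checkbox:

```shell
dotnet add OrderBooking.Events.IntegrationTests package MessageHandler.EventSourcing.AzureTableStorage --prerelease
```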

Add the Microsoft.Extensions.Configuration.UserSecrets package

Right click on Dependencies of the OrderBooking.Events.IntegrationTests project in the Solution Explorer and choose Manage NuGet Packages.

In the NuGet Package Manager window, browse for Microsoft.Extensions.Configuration.UserSecrets (include prerelease needs to be unchecked).

Select the latest stable version and click Install.

Click I Accept in the License Acceptance window.
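Or from the command line; without the --prerelease flag, the latest stable version is picked automatically:

```shell
dotnet add OrderBooking.Events.IntegrationTests package Microsoft.Extensions.Configuration.UserSecrets
```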

Creating the test class

Integration tests are created in the OrderBooking.Events.IntegrationTests project.

First you rename the UnitTest1.cs file in the OrderBooking.Events.IntegrationTests project to WhileIntegratingTableStorage.cs.

Visual Studio will likely offer to rename the class UnitTest1 to WhileIntegratingTableStorage automatically; you can allow it to do so. Otherwise, change the class name manually.

Rename the method Test1 to reflect the scenario under test: GivenAnEventStream_WhenPersistingEvents_ThenCanLoadEvents.

Your code now looks like this:

using Xunit;

namespace OrderBooking.Events.IntegrationTests
{
    public class WhileIntegratingTableStorage
    {
        [Fact]
        public void GivenAnEventStream_WhenPersistingEvents_ThenCanLoadEvents()
        {

        }
    }
}

Challenges

Implementing integration tests poses unique challenges. Typically these challenges are:

  • slowness: because dependencies are in a remote location
  • unreliability: because the network in between may or may not be working as expected
  • unrepeatability: multiple executions of the same test may not result in the same outcome
  • security risks: embedding the credentials needed to connect to the remote service into a test may leak those credentials via the source control system

You deal with the slowness and unreliability by not including too many integration tests in your test suite; use unit tests instead wherever possible.

The unrepeatability and security risks, however, need to be addressed within the integration tests themselves, and the approach differs for each technology you depend on.

Making tests repeatable

When depending on Azure storage tables, you make tests repeatable by provisioning unique resources for each test run and tearing them down at the end.

There is no practical limit to the number of tables that can be provisioned in an Azure storage account, so there are few constraints to take into account when doing so.

The only constraint that needs to be taken into account is the table name.

The name has to be unique to the test run, as tests can run in parallel.

At the same time, it may only contain alphanumeric characters and cannot start with a number.

One way to generate such a unique name is to use a globally unique identifier, or GUID, and strip out the hyphens.

GUIDs can, however, start with a numeric character, which is not allowed, so you prefix the name with an alphabetic character.

// the "t" prefix guarantees an alphabetic first character; the "N" format strips the hyphens
private string tableName = "t" + Guid.NewGuid().ToString("N");

To create the table at the start of the test run, and tear it down again after the run has completed, you implement IAsyncLifetime on the test class.

This interface provides a method called InitializeAsync, which gets called before the test runs, and a method called DisposeAsync, which is invoked when the test has ended and needs cleanup.

public class WhileIntegratingTableStorage : IAsyncLifetime
{
    public Task InitializeAsync() => Task.CompletedTask;

    public Task DisposeAsync() => Task.CompletedTask;
}

In InitializeAsync you create a new table service client, passing in the connection string of the storage account you've set up as part of your development environment.

Then you get a table client for the random table name.

And create the table, should it not exist yet.

You consciously do not catch any exception that could occur during initialization, so that the test run stops if the table fails to create.

using Azure.Data.Tables;

public async Task InitializeAsync()
{
    var client = new TableServiceClient("YourConnectionStringGoesHere");

    var table = client.GetTableClient(tableName);

    await table.CreateIfNotExistsAsync();
}

In DisposeAsync you do almost the same, except you call DeleteAsync instead of the create method.

This time you do catch any exception that might occur, as you don't want a test that succeeded to fail afterwards due to a cleanup issue.

public async Task DisposeAsync()
{
    try
    {
        var client = new TableServiceClient("YourConnectionStringGoesHere");

        var table = client.GetTableClient(tableName);

        await table.DeleteAsync();
    }
    catch { }
}

Use user secrets

Both the initialization and cleanup steps use a hardcoded connection string, and that connection string contains secrets.

This is a problem that needs to be tackled right away.

Secrets should not be part of the software; they should live outside of it, even for testing purposes.

The .NET framework offers a capability for exactly this, called User Secrets.

You set them by right-clicking on the OrderBooking.Events.IntegrationTests project in the Solution Explorer and choosing Manage User Secrets.

Visual Studio will now open your personal user secrets file associated to the project. This file resides physically outside of the solution folder, so that you don't accidentally check it into source control.

Modify the JSON content of the secrets file to include a TableStorageConnectionString property that has the connection string as its value.

{
  "TableStorageConnectionString": "YourConnectionStringGoesHere"
}
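The same can be achieved from the command line with the user-secrets tool. Run from the test project folder; init adds a UserSecretsId to the project file, which Visual Studio otherwise does for you:

```shell
cd OrderBooking.Events.IntegrationTests
dotnet user-secrets init
dotnet user-secrets set "TableStorageConnectionString" "YourConnectionStringGoesHere"
```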

To use this value, .NET provides a ConfigurationBuilder class that handles loading the configuration settings from the user secrets for you.

As a fallback, you also configure the ConfigurationBuilder to read from environment variables, which is useful when running the test in an unattended environment such as an automated build system.

You use this builder instance to build up a configuration model and read the TableStorageConnectionString value into a field called connectionString.

Finally, you replace the hardcoded connection string with the connectionString field in the TableServiceClient constructor.

using Microsoft.Extensions.Configuration;

public class WhileIntegratingTableStorage : IAsyncLifetime
{
    private string connectionString;

    public WhileIntegratingTableStorage()
    {
        var configuration = new ConfigurationBuilder()
            .AddUserSecrets<WhileIntegratingTableStorage>(optional: true)
            .AddEnvironmentVariables()                
            .Build();

        connectionString = configuration["TableStorageConnectionString"];                        
    }

    public async Task InitializeAsync()
    {
        var client = new TableServiceClient(connectionString);

        var table = client.GetTableClient(tableName);

        await table.CreateIfNotExistsAsync();
    }

    public async Task DisposeAsync()
    {
        try
        {
            var client = new TableServiceClient(connectionString);

            var table = client.GetTableClient(tableName);

            await table.DeleteAsync();
        }
        catch { }
    }
}
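On a build agent, where no user secrets file exists, the environment variable fallback kicks in. A sketch of what that could look like in a CI script; the exact mechanism for supplying secrets depends on your build system:

```shell
# The variable name must match the configuration key used in the test
export TableStorageConnectionString="YourConnectionStringGoesHere"
dotnet test
```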

Implementing the test logic

Now that the challenges inherent to the integration test have been taken care of, you can move on to the test logic itself.

The first thing you do is define a stream of events. Each event needs:

  • A unique EventId
  • A shared SourceId, which indicates the stream that the event belongs to.
  • A Version number, which will be used for chronological ordering. It is not strictly required to be sequential or even unique, as events can happen in parallel, but for the purpose of learning it's easiest to use a sequential number. Events can be stored in any order; the source will order them based on this value.
  • A TargetBranchParentId, set on all events except the root. Its value should point to the id of the event that precedes it in the stream. In most scenarios the stream will represent a straight chain of events, but it is possible to create separate branches in the stream by pointing multiple events to the same parent, for advanced conflict scenarios.

[Fact]
public async Task GivenAnEventStream_WhenPersistingEvents_ThenCanLoadEvents()
{
    var streamId = "fe8430bf-00a4-42b7-b077-87d8fff4ba68";
    var streamType = "OrderBooking";

    var started = new BookingStarted
    {
        EventId = "89795ced-ea64-46c2-879e-10d285a09429", // unique
        SourceId = streamId,
        Version = 1,
        PurchaseOrder = new PurchaseOrder()
    };
    var confirmed = new SalesOrderConfirmed
    {
        EventId = "9a5937c2-5e14-461f-b452-fa504f300d15", // unique
        SourceId = streamId,
        Version = 2,
        TargetBranchParentId = started.EventId
    };

    var eventStream = new SourcedEvent[] { confirmed, started }; // will be reordered

Once the stream has been defined, you create a new instance of AzureTableStorageEventSource, passing the secret connection string and the random table name into the constructor.

    var eventSource = new AzureTableStorageEventSource(connectionString, tableName);

Then you use this instance to persist the stream to table storage, using its Persist operation.

    await eventSource.Persist(streamType, streamId, eventStream);

And load the events back using the Load operation.

    var history = await eventSource.Load(streamType, streamId, version: 0);

Finally, you compare the two sets to check that all events have been loaded in the expected order.

    Assert.Equal(2, history.Count());

    Assert.Equal(started.EventId, history.First().EventId);
    Assert.Equal(confirmed.EventId, history.Last().EventId);

    Assert.IsType<BookingStarted>(history.First());
    Assert.IsType<SalesOrderConfirmed>(history.Last());
}

Run the test

Right click the OrderBooking.Events.IntegrationTests project in the Solution Explorer and choose Run Tests.

Your integration test should succeed.
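The tests can also be run from the command line, which is handy on a build server; the project path below assumes the default folder layout:

```shell
dotnet test OrderBooking.Events.IntegrationTests
```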

Summary

In this lesson you learned how to configure a storage table as an event source, to remember the decisions taken by your software system.

The code created during this lesson can be found in the following commit of the reference repository.

What’s coming next?

In Part 6 of the tutorial, you'll configure an ASP.NET Web API to host the aggregate root and projection created in lessons 3 and 4, while serving their events from the Azure Table Storage event source.

