ASP.NET Core 3 and SpecFlow – Part 3

Working with Visual Studio Database Projects and Deploying a Dacpac

Microsoft doesn’t want you to use the Visual Studio Database Project anymore. They want you to use Entity Framework and its Migrations feature, and if I could, I would. However, I keep coming up against scenarios where I can’t. Most notably:

  • Azure Synapse (SQL Data Warehouse). When designing a data warehouse schema I often lean towards a narrow rather than wide design in my star schema, so the data for an entity may be spread across several rows. When reading this from the database I don’t want to load all of those rows in code and then map them into my domain object, as it would be slow and use a lot of unnecessary memory.
    There are also limitations with Azure Synapse (no identity columns, etc.), so you will need to write custom code and perform extra round trips to the database with EF.
  • Advanced SQL Server features. Temporal Tables, for example, are not fully supported. There are plenty of articles on workarounds, but you won’t get the full feature set of Temporal Tables in EF Core.

So I often start out with EF Core but end up creating a Visual Studio Database Project (and I am not alone in this).

This throws a spanner in the works for making our SpecFlow tests re-runnable. We need to ensure that a clean instance of our database is available before we run our tests.

Recreating the Database

The first thing to do is automate the recreation of your database. I have two ways of doing this.
The first is “drop and create”, which is simple enough and my preferred way. However, in a lot of large organizations you will not have this ability: when you run the tests from your build pipeline you will be connecting to a managed database and you will not have permission to drop and recreate it. You will, however, have permission to drop objects in the database, so I have created a “drop schema” method as well.
Which option I use can be config driven.

Create a new Hook

We can separate the database setup code from our test server setup. To ensure that the database setup runs before the TestServerHook, the BeforeTestRun attribute has an Order property that controls when each hook runs.

[Binding]
public class DatabaseHook
{
    [BeforeTestRun(Order = 1)]
    public static void BeforeTestRun()
    {
        IDatabaseFactory factory =
            new DatabaseFactory("Data Source=(local);database=Test;User Id=sa;Password=yourStrong(!)Password");

        factory.DeleteDatabase();
        factory.CreateDatabase();
        factory.DropSchema("trusted");
    }
}
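
For this ordering to take effect, the TestServerHook from the earlier parts just needs a higher Order value than the DatabaseHook. A rough sketch of what that looks like (the Order number here is my choice; anything greater than 1 works):

[Binding]
public class TestServerHook
{
    // Any Order greater than the DatabaseHook's (1) ensures the database exists before the server starts
    [BeforeTestRun(Order = 2)]
    public static void BeforeTestRun()
    {
        // existing test server start-up code from the earlier posts goes here
    }
}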

I haven’t made it configurable yet; I’ll come back to that. What I have done is introduce a DatabaseFactory that has the methods we need. Just like the TestServerHook, this will run once before any tests are run.
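
For reference, the factory’s surface is just the handful of operations the hook calls. A rough sketch of the interface (the exact shape isn’t important, and Publish only appears later in this post):

public interface IDatabaseFactory
{
    // Drop the whole database if it exists
    void DeleteDatabase();

    // Create an empty database using the name from the connection string
    void CreateDatabase();

    // Drop every object under the given schema, then the schema itself
    void DropSchema(string schema);

    // Deploy the dacpac from the Visual Studio Database Project (added later in this post)
    void Publish();
}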

The different implementations all use SqlCommand.

The DeleteDatabase method is defensive, as something may have gone wrong in a previous run and the database may not be there. If it can’t connect, it returns instead of throwing an exception.

public void DeleteDatabase()
{
    using var connection = new SqlConnection(_connectionString);

    var databaseName = connection.Database;

    try
    {
        connection.Open();
    }
    catch (SqlException ex)
    {
        // The database doesn't exist, so there is nothing to delete
        if (ex.Message.StartsWith($"Cannot open database \"{databaseName}\"")) return;
        throw;
    }

    connection.ChangeDatabase("master");

    using var command =
        new SqlCommand(
            $"ALTER DATABASE [{databaseName}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; DROP DATABASE [{databaseName}]",
            connection);
    command.ExecuteNonQuery();
}
public void CreateDatabase()
{
    using var oldConnection = new SqlConnection(_connectionString);

    var databaseName = oldConnection.Database;

    // Connect to master to create the database we have just dropped
    using var connection = new SqlConnection(_connectionString.Replace(databaseName, "master"));
    connection.Open();

    using var command = new SqlCommand($"CREATE DATABASE [{databaseName}];", connection);

    command.ExecuteNonQuery();
}

Dropping the schema is a bit more involved: we need to drop all of the objects under the schema in a particular order, and once they are all gone, drop the schema itself.

public void DropSchema(string schema)
{
    using var connection = new SqlConnection(_connectionString);
    connection.Open();

    var commands = GetDropCommands(connection, schema);

    RunCommands(commands, connection, schema);
}

private List<string> GetDropCommands(SqlConnection connection, string schema)
{
    var list = new List<string>();

    // Order the objects so that procedures, views and constraints are dropped before the tables they depend on
    using var command = new SqlCommand($@"select so.name, so.type, p.name
                                          from sys.objects so
                                          inner join sys.schemas sc on so.schema_id = sc.schema_id
                                          left join sys.objects p on p.object_id = so.parent_object_id
                                          where sc.name = '{schema}'
                                          order by (case so.type
                                                        when 'P' then 0
                                                        when 'V' then 1
                                                        when 'UQ' then 2
                                                        when 'F' then 3
                                                        when 'D' then 5
                                                        when 'C' then 6
                                                        when 'PK' then 7
                                                        else 12
                                                    end)", connection);
    using var reader = command.ExecuteReader();

    while (reader.Read())
    {
        var name = reader.GetString(0).Trim();
        var type = reader.GetString(1).Trim();
        var parent = (reader.GetValue(2) != DBNull.Value) ? reader.GetString(2).Trim() : string.Empty;

        var sql = type switch
        {
            "P" => $"DROP PROCEDURE [{schema}].[{name}]",
            "V" => $"DROP VIEW [{schema}].[{name}]",
            "U" => $"DROP TABLE [{schema}].[{name}]",
            _ => $"ALTER TABLE [{schema}].[{parent}] DROP CONSTRAINT [{name}]"
        };

        list.Add(sql);
    }

    return list;
}

private void RunCommands(List<string> commands, SqlConnection connection, string schema)
{
    commands.ForEach(sql =>
    {
        using var command = new SqlCommand(sql, connection);
        System.Diagnostics.Trace.TraceInformation(sql);
        command.ExecuteNonQuery();
    });

    // Only drop the schema if it had objects in it (i.e. it exists and has just been cleared)
    if (commands.Count > 0)
    {
        using var command = new SqlCommand($"DROP SCHEMA [{schema}]", connection);
        command.ExecuteNonQuery();
    }
}

Deploying a new dacpac

So far it’s been straightforward, but when it comes to deploying the Visual Studio Database Project we need to find the dacpac file programmatically.

First, let’s expand our DatabaseFactory with a Publish method. This method needs to find the dacpac file, so I am going to have to make some assumptions:

  • That there is only one dacpac file per Solution Configuration
  • That there are only the Debug and Release Solution Configurations
  • The project structure, i.e. where the test project sits in relation to the solution root

public void Publish()
{
    // Root directory of the solution
    var directory = new DirectoryInfo(Directory.GetCurrentDirectory()).Parent.Parent.Parent.Parent.Parent;

    var dacpacFile = FindDacpacPath(directory);
}

private string FindDacpacPath(DirectoryInfo directory)
{
    var files = directory.GetFiles("*.dacpac");

    // Only pick up the dacpac built for the Solution Configuration we are currently running
    if (files.Length > 0 && directory.Name == (IsDebug() ? "Debug" : "Release")) return files[0].FullName;

    foreach (var subDirectory in directory.GetDirectories())
    {
        var file = FindDacpacPath(subDirectory);

        if (!string.IsNullOrEmpty(file)) return file;
    }

    return string.Empty;
}

public bool IsDebug()
{
    var isDebug = false;
#if DEBUG
    isDebug = true;
#endif
    return isDebug;
}

Now that we have the file, we need to publish it. We can do this by installing the Microsoft.SqlServer.DacFx NuGet package and adding the following code.

public void Publish()
{
    // Root directory of the solution
    var directory = new DirectoryInfo(Directory.GetCurrentDirectory()).Parent.Parent.Parent.Parent.Parent;

    var dacpacFile = FindDacpacPath(directory);

    // Only used to pull the target database name out of the connection string
    using var connection = new SqlConnection(_connectionString);
    var instance = new DacServices(_connectionString);
    using var dacpac = DacPackage.Load(dacpacFile);

    instance.Deploy(dacpac, connection.Database, upgradeExisting: true, options: new DacDeployOptions { AllowIncompatiblePlatform = true });
}

We now have code that clears down and recreates our database before the tests run.

Making the Process Config Driven

Previously I hardcoded everything into the hook. Now I want to make sure that when the tests run they don’t affect the database I am developing against. I also want to be able to change the settings on the build server.

The first thing to do is add a new EnvironmentHook.

[Binding]
public class EnvironmentHook
{
    // Order = 0 so this runs before the DatabaseHook and the configuration is loaded for the Test environment
    [BeforeTestRun(Order = 0)]
    public static void BeforeTestRun()
    {
        Environment.SetEnvironmentVariable("ASPNETCORE_ENVIRONMENT", "Test");
    }
}

Now I can add an appsettings.Test.json file.

In here I am going to change the database name, but also add some custom settings for the database hook:

  "Setup": {
    "Database": {
      "ShouldDropAndRecreateDatabase": false,
      "ShouldDropSchema": true,
      "ShouldPublish": true
    } 
  },Code language: JSON / JSON with Comments (json)
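
The test connection string sits in the same file, under the “Default” key that the hook reads below. Something like this, reusing the local example values from earlier (yours will obviously differ):

  "ConnectionStrings": {
    "Default": "Data Source=(local);database=Test;User Id=sa;Password=yourStrong(!)Password"
  },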

Now I can plug this into my DatabaseHook.

[Binding]
public class DatabaseHook
{
    [BeforeTestRun(Order = 1)]
    public static void BeforeTestRun()
    {
        var config = Program.Configuration;

        IDatabaseFactory factory =
            new DatabaseFactory(config.GetConnectionString("Default"));

        var setup = new Setup();
        config.Bind("Setup", setup);

        if (setup.Database.ShouldDropAndRecreateDatabase)
        {
            factory.DeleteDatabase();
            factory.CreateDatabase();
        }

        if (setup.Database.ShouldDropSchema)
        {
            factory.DropSchema("trusted");
        }

        if (setup.Database.ShouldPublish)
        {
            factory.Publish();
        }
    }
}

I fetch the configuration from the Program class in my main service. I already have this code in place because I load the configuration programmatically for Serilog.
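
If you don’t already have something similar, a static Configuration property on Program can be as simple as the sketch below. This is an approximation, not the exact code from my service; the only thing that matters is that the builder honours ASPNETCORE_ENVIRONMENT so that appsettings.Test.json is picked up:

using System;
using System.IO;
using Microsoft.Extensions.Configuration;

public class Program
{
    // Reused by the SpecFlow hooks via Program.Configuration
    public static IConfiguration Configuration { get; } = new ConfigurationBuilder()
        .SetBasePath(Directory.GetCurrentDirectory())
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        // Loads appsettings.Test.json once the EnvironmentHook has set ASPNETCORE_ENVIRONMENT to "Test"
        .AddJsonFile($"appsettings.{Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT")}.json", optional: true)
        .AddEnvironmentVariables()
        .Build();

    // Main and CreateHostBuilder as generated by the ASP.NET Core template
}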

I can then bind the JSON into some objects I have created and drive the whole process from appsettings.Test.json.
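
The objects themselves are just a couple of POCOs matching the JSON above. Only the Setup class and its Database property are implied by the hook code; the nested DatabaseSetup class name below is my own choice of a sensible shape:

public class Setup
{
    public DatabaseSetup Database { get; set; } = new DatabaseSetup();
}

public class DatabaseSetup
{
    public bool ShouldDropAndRecreateDatabase { get; set; }
    public bool ShouldDropSchema { get; set; }
    public bool ShouldPublish { get; set; }
}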

On Azure DevOps I can manipulate this file using variable substitution, so that on the build server I don’t drop the database but clear down the schema instead.
