Transition to Azure DevOps (Part II – Pipelines)

In Part I we accomplished the most important aspects of the transition from local code development to a cloud-based solution: getting credentials and other sensitive values out of our code base and moving that clean code base into Azure DevOps source control. Now that this is complete, we want to take further advantage of being in Azure DevOps. In addition to all the benefits of cloud storage and source control, we want to look at Continuous Integration, Continuous Deployment, and Source Control Features (Branching Policies).

The targets I want to hit are:

  1. to confirm the build (both compilation and unit tests passing) on any pull request (PR) to main
  2. to have an automated deployment of Azure Functions on any successful PR to main

Source Control Features (aka Branching Policies)

Now that I have source control enabled, I want to take advantage of it and put some automated controls in place so that, whether I’m developing solo or with a team, source control enhances my ability to produce high-quality code deliverables. In Azure DevOps this is surprisingly easy: select the branch you want to put controls on (for me it is the main branch) and hover over the branch row; three dots will show on the far right. Click those and select “Branch Policies”. From here you can set all sorts of policies about what is and is not permitted in your branch. I went with:

  • Check for linked work items: I don’t want code coming in for no reason; code should be linked to some reason for being created.
  • Check for comment resolution: I always like this one and it is on by default; it means that if someone makes a comment on the PR, that comment must be reviewed and marked as resolved before the PR can be completed.
  • Automatically included reviewers: this makes a nice stopgap so that if there is a lead or key person in the development cycle, they are always included as a reviewer.
  • Build Validation: this was the main reason for figuring out Branching Policies; I want all unit tests to pass before a PR can be completed. (I’ll come back to this one; it wasn’t as simple as I wanted it to be.)

Continuous Integration (Build Pipeline)

With my Branching Policies in place I now know that the code is solid and has been approved for delivery; this is where Azure DevOps really starts to make automation of tasks nice. In this case that means the creation of a simple pipeline that does the basic “when I commit it goes”: take commits to main and build that code. The defaults are absolutely great for getting that first process up, but I had a lot of failed builds (due to the YAML or the process being incorrect) while I worked out the pieces beyond the initial setup. Here is what it all looked like.

Initial, Baseline Pipeline Creation

This was a matter of logging into Azure DevOps, navigating to the Pipelines section, selecting “Create your first pipeline”, and then selecting the template I was after. In this case I was using Azure Repos Git; I selected my repository (created in Part I) and then the Azure Functions template.

I was then provided with a big chunk of YAML that was going to build my application and push it to Azure Functions; and it did just that. A quick update to the code, a commit, a PR, and it sprang to life and pushed that change out for me … brilliant. I thought I was done (but your scroll bar shows there is a bit more to the story). Somewhat surprisingly, there wasn’t much additional direction or input beyond the template: it didn’t auto-detect a unit test project and didn’t put any unit test validation in; same with database migrations and similar features. It did just what it said: took the code, built the code, pushed the code to the deployment location. So on to next steps.

[ initial code snippet of YAML auto-created from the template intentionally left out; see the segments below and the full YAML at the end ]

Unit Test Validation

The first thing I wanted to ensure was that I didn’t push broken code out to my deployment. By default the pipeline wasn’t going to push out code that didn’t compile, but it didn’t automatically run the unit tests to validate the code before deployment, and there wasn’t a template for that. So I had to do some searching (and multiple testing iterations) to get it implemented; I ended up with the following job in the “Build stage” of the pipeline.

  - job: UnitTest
    displayName: Unit Test
    dependsOn: Build
    pool:
      vmImage: $(vmImageName)
      
    steps:
    - task: DotNetCoreCLI@2
      displayName: Run Unit Tests
      inputs:
        command: test
        projects: '**/MyUnitTests.csproj'
        arguments: '--configuration $(buildConfiguration)'

EF Core Migration Build and Deployment

This was looking good and I was on my way; however, to my surprise, the next time I made a change that included a migration, the migration was not carried through the pipeline and didn’t deploy with the Azure Function. So it was back to the internet to search for how to do this, and this time I needed YAML for both the “Build stage” and the “Deploy stage” of the pipeline.

This was a good exercise because it introduced me to the concept of variables in the pipeline, for the same reason we used them in Part I (i.e. the YAML is in source control and we don’t want our credentials and sensitive data in source control). You can create variables in the process via the “Variables” button in the editor and then use them like $(db-username) in the YAML.
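To make the split concrete, here is a sketch of how the two kinds of values are declared and consumed; the variable names match the ones used in the deploy step below, and the task inputs are trimmed to just the relevant lines:

```yaml
# Non-sensitive values can live in the YAML itself (and therefore in
# source control)
variables:
  functionAppName: 'AzureFunctionName'

# Sensitive values (db-username, db-password, …) are created through the
# "Variables" button in the pipeline editor, marked as secret, and never
# appear in this file; they are referenced with the same $(name) macro
# syntax (inputs trimmed to the relevant lines)
steps:
- task: SqlAzureDacpacDeployment@1
  inputs:
    SqlUsername: $(db-username)
    SqlPassword: $(db-password)
```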

Code: EF Core Creation of Migration Scripts Artifact (Build)

  - job: EFMigrationScripts
    displayName: Build EF Migration Scripts
    dependsOn: UnitTest
    pool:
      vmImage: $(vmImageName)
      
    steps:
    - task: DotNetCoreCLI@2
      displayName: 'Initialize EntityFrameworkCore'
      inputs:
        command: custom
        custom: tool
        arguments: 'install --global dotnet-ef'

    - task: DotNetCoreCLI@2
      displayName: 'Create SQL Migration Scripts'
      inputs:
        command: custom
        custom: ef
        arguments: 'migrations script --output $(Build.SourcesDirectory)/SQLScripts/dbmigration.sql --idempotent --project $(Build.SourcesDirectory)/MyProject.csproj'

    - publish: $(Build.SourcesDirectory)/SQLScripts/dbmigration.sql
      artifact: migration

Code: EF Core Deployment of Migration Scripts (Deploy)

          - task: SqlAzureDacpacDeployment@1
            inputs:
              azureSubscription: '[subscriber detail here]'
              AuthenticationType: 'server'
              ServerName: $(db-servername)
              DatabaseName: $(db-name)
              SqlUsername: $(db-username)
              SqlPassword: $(db-password)
              deployType: 'SqlTask'
              SqlFile: '$(Pipeline.Workspace)/migration/dbmigration.sql'
              IpDetectionMethod: 'AutoDetect'

At this point I am somewhat concerned that I’ve taken a template and might be off the best-practices path; it works, and it works the way I expect it to, but it doesn’t have that good feel. However, I wasn’t able to find really good documentation on best practices and/or recommended patterns, and it feels like everyone is still determining what those are: should you have one pipeline that does everything? Should pipelines always create artifacts and then have separate release pipelines process those? (If so, why doesn’t the template direct you that way?) It feels like it is all new and progressing well, but that I’ll have to revisit these scripts as future features, enhancements, and directives become more clear.

Continuous Deployment (Release Pipeline)

I ended up not creating a release pipeline; the project I’m using has just local development and testing and then a production environment. Interestingly enough, that seems to be the assumption of the Azure Functions YAML template as well (it created the build and deploy stages by default); after adding in the unit testing and database migrations it was basically the way it should be without extra effort. So I’ve kept things as close to out of the box as they came. I’m guessing I’ll revisit the release pipeline (particularly if/when I start to have staging and other environments), but for now this was intentionally not undertaken after I saw what the template provided.

Build Validation (Additional Pipeline)

I said I’d come back to this part of the Branch Policies and here we are. In the branch policies I somewhat glossed over Build Validation; the key to the build validation policy is that it requires the selection of a pipeline to execute that validation. The frustrating part was that I couldn’t use the template pipeline, even though I had added unit testing to it, because it did a deployment as well (and also built the migration script and other artifacts that weren’t needed); I just wanted to run unit tests before a PR was accepted.

To accomplish this I had to create a new pipeline: one that was just the unit test job. Again the feeling that best practices are still being determined, and that there is more to come, occurs here, as I had to cut and paste the job details (see above) into a new pipeline. I’m thinking there must be a way to reuse pipelines; I’d prefer to have a single Unit Test pipeline that I can call both for Build Validation in a Branch Policy and in the Continuous Integration (Build) pipeline, but at this point I wasn’t able to discover that as an option. So: create new, cut and paste, and remember to update both locations as changes occur / are improved.
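For reference, the validation-only pipeline ended up being essentially the unit test job from earlier wrapped in its own minimal pipeline. A sketch, using the same placeholder project and image names as the main pipeline:

```yaml
# Standalone pipeline used only for Build Validation on PRs to main
trigger: none   # queued by the branch policy, not by commits

pool:
  vmImage: 'vs2017-win2016'

steps:
- task: DotNetCoreCLI@2
  displayName: Run Unit Tests
  inputs:
    command: test
    projects: '**/MyUnitTests.csproj'
    arguments: '--configuration Release'
```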

What’s Next?

Initially I was planning just the two blog posts about my transition to Azure DevOps; I figured I’d get my code right and into DevOps, get a nice simple pipeline, and get back to coding. But there seems to be a lot of additional process automation that would be great to continue with in the Pipelines. I did some initial searching on each of these topics and none seemed to have standard or quick solutions, so I’ll be looking at how I can implement something like the following, hopefully with some best-practice improvements along the way as well:

  • How do I set up code coverage? There is a nice link in the output of the pipeline that says “setup code coverage”, but the link didn’t take me to a straightforward “cut and paste this YAML and you’re good to go”, so some additional research is needed there.
  • How do I take a database backup before deploying migration scripts? There seem to be a lot of different database capabilities, but nothing clear for what I thought was standard functionality: “take a database backup and put it in Azure Storage before you change the database”.
  • Should I transition the Deploy stage that was auto-created in the Build Pipeline to a Release Pipeline? What would be the decision points for leaving it where the template created it versus moving it to a release pipeline; what should that decision be based on?
  • Is there a way to make these YAML files more DRY and create building blocks of pipelines? How can I get better reuse out of the YAML files created; can I call a YAML file from a YAML file, or maybe a pipeline from another pipeline? I want one Unit Test pipeline that I can reuse rather than cut and paste.
  • Is there a fancy / nice way of doing “auto-versioning” and updating the version number in the code as part of this process? I want something that basically says “you didn’t update the version numbers in your code, so I’ll auto-update the code to the next version number” (on successful build and unit test pass); is that a possibility for these pipelines?
  • Is there a way to do blackouts and time-based deployment? With a timer-triggered Azure Function it would be even more nifty to have some blackout periods where the pipeline runs the build process but waits on the deployment until either the function is not running and/or the blackout period is over (this might be part of “Should I transition the Deploy stage that was auto-created in the Build Pipeline to a Release Pipeline?”).

Azure Functions Pipeline with Unit Tests and Migrations

This is the YAML I ended up with for a “standard”, “simple” pipeline that builds the code base, runs unit tests, creates migrations, and then deploys the migrations and Azure Functions to Azure.

# .NET Core Function App to Windows on Azure
# Build a .NET Core function app and deploy it to Azure as a Windows function App.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://docs.microsoft.com/en-us/azure/devops/pipelines/languages/dotnet-core

trigger:
- main

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureSubscription: '[subscription]'

  # Function app name
  functionAppName: 'AzureFunctionName'

  # Agent VM image name
  vmImageName: 'vs2017-win2016'

  # Build configuration used by the build and test steps
  buildConfiguration: 'Release'

  # Working Directory
  workingDirectory: '$(System.DefaultWorkingDirectory)/MyProject'

stages:
- stage: Build
  displayName: Build Stage

  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)

    steps:
    - task: DotNetCoreCLI@2
      displayName: Build
      inputs:
        command: 'build'
        projects: |
          $(workingDirectory)/*.csproj
        arguments: --output $(System.DefaultWorkingDirectory)/publish_output --configuration Release
    
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(System.DefaultWorkingDirectory)/publish_output'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop

  - job: UnitTest
    displayName: Unit Test
    dependsOn: Build
    pool:
      vmImage: $(vmImageName)
      
    steps:
    - task: DotNetCoreCLI@2
      displayName: Run Unit Tests
      inputs:
        command: test
        projects: '**/MyUnitTests.csproj'
        arguments: '--configuration $(buildConfiguration)'

  - job: EFMigrationScripts
    displayName: Build EF Migration Scripts
    dependsOn: UnitTest
    pool:
      vmImage: $(vmImageName)
      
    steps:
    - task: DotNetCoreCLI@2
      displayName: 'Initialize EntityFrameworkCore'
      inputs:
        command: custom
        custom: tool
        arguments: 'install --global dotnet-ef'

    - task: DotNetCoreCLI@2
      displayName: 'Create SQL Migration Scripts'
      inputs:
        command: custom
        custom: ef
        arguments: 'migrations script --output $(Build.SourcesDirectory)/SQLScripts/dbmigration.sql --idempotent --project $(Build.SourcesDirectory)/MyProject.csproj'

    - publish: $(Build.SourcesDirectory)/SQLScripts/dbmigration.sql
      artifact: migration

- stage: Deploy
  displayName: Deploy Stage
  dependsOn: Build
  condition: succeeded()

  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    pool:
      vmImage: $(vmImageName)

    strategy:
      runOnce:
        deploy:

          steps:
          
          - task: SqlAzureDacpacDeployment@1
            inputs:
              azureSubscription: '[subscription]'
              AuthenticationType: 'server'
              ServerName: $(db-servername)
              DatabaseName: $(db-name)
              SqlUsername: $(db-username)
              SqlPassword: $(db-password)
              deployType: 'SqlTask'
              SqlFile: '$(Pipeline.Workspace)/migration/dbmigration.sql'
              IpDetectionMethod: 'AutoDetect'

          - task: AzureFunctionApp@1
            displayName: 'Azure functions app deploy'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionApp
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'

Links and Research Reference

General Pipeline Information

Unit Test Validation

Entity Framework Migrations
