Optimizely DXP Deployment API PowerShell Scripts for CI/CD in Azure DevOps – Part 2

In my last post, we went over the deployment process in the EpiServer DXP environment. I mentioned that the next post would be related to using this within Azure DevOps, but I need to make a slight adjustment based on today’s update.

As of today (4/27/2020), EpiServer released their weekly update #313, which included a lot of enhancements for this Deployment API.

In this post, we’re going to go through the following:

  • The new features for the DXP Deployment API from this release
  • How to use the scripts in my previous post to take advantage of the sync-down
  • How to build a script to export your environment database

Update 313 – What is New?

In this update, there were six changes to DXP and the Deployment API. Most of them do not relate to the Deployment API, so we won’t go into those in this post, but I will explain how to use the new API features.

These new features shipped in version 0.9.9 of the EpiCloud PowerShell module.
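
If you already installed the module while following the earlier posts, you can check which version you have and pull down the newer one. A minimal sketch (note that Update-Module only works for modules that were originally installed via Install-Module):

# Check the currently installed EpiCloud version
Get-InstalledModule -Name EpiCloud | Select-Object Name, Version

# Update to the latest published version (0.9.9 at the time of writing)
Update-Module -Name EpiCloud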

Support for Content sync-down

This feature allows deployments to run in the opposite direction: you will be able to go from Production down to Preproduction or Integration.

In the past, I have used this to keep the lower environments as similar to production as possible, including the database and blobs. That makes it much easier to work with real, existing content and to troubleshoot when something isn’t working the way it should.

Database Export as BACPAC file

This feature adds two more API methods: one to trigger a new export and one to get the details of that export. They are named Start-EpiDatabaseExport and Get-EpiDatabaseExport, respectively.
This is a welcome addition, as we never really had much access to the EpiServer database for any environment higher than Integration.

This will be great for troubleshooting, and potentially for pushing the exported database up to other environments as a deployment package. (This has yet to be tested…just thinking out loud.)
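
To make that “thinking out loud” concrete, the rough shape might look like the sketch below. This is untested speculation, not something from the post: the package file name, local path, and whether DXP accepts a database-only package pushed this way are all assumptions.

# Untested sketch: upload an exported BACPAC as a deployment package and deploy it.
# Assumes the exported file has been renamed to the DXP package convention (e.g. mysite.cms.sqldb.bacpac).
Connect-EpiCloud -ClientKey $ClientKey -ClientSecret $ClientSecret -ProjectId $ProjectID

$sasUrl = Get-EpiDeploymentPackageLocation
Add-EpiDeploymentPackage -SasUrl $sasUrl -Path "C:\exports\mysite.cms.sqldb.bacpac"

Start-EpiDeployment -DeploymentPackage "mysite.cms.sqldb.bacpac" -TargetEnvironment "Integration" -Wait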

PowerShell Scripts

As seen in the EpiServer documentation, there are a bunch of PowerShell commands that can be used to control these deployments.

Below, I will explain how to do the following things:

  • Content Sync-Down using the existing “Perficient_PromoteToEnvironment” script
  • Creating a script to export a database from an environment
If you want to see how the “Perficient_PromoteToEnvironment” script is created, please review Part 1 of this blog series.

Content Sync-Down

In a normal deployment stream, the DXP environments are ordered as follows:

Integration → PreProduction → Production

As we saw in Part 1, we could use the script to promote upward, but now we need to promote downward.

The script is already set up to do this; we just have to specify the arguments so that the ‘SourceEnvironment’ is higher in the stream than the ‘TargetEnvironment.’

 
.\Perficient_PromoteToEnvironment.ps1 -ClientKey "****" `
                                      -ClientSecret "****" `
                                      -ProjectID "****" `
                                      -SourceEnvironment "Production" `
                                      -TargetEnvironment "Preproduction" `
                                      -UseMaintenancePage 1 `
                                      -IncludeBlobs $false `
                                      -IncludeDb 0

Export Environment Database

This is going to be a brand new script, as it is new functionality.

This script will use the following API calls:

  • Start-EpiDatabaseExport
  • Get-EpiDatabaseExport
The flow is going to look something like the following:
    1. Invoke script with 5 mandatory Parameters
      1. Client Key for intended environment
      2. Client Secret for intended environment
      3. Project ID (global for subscription)
      4. Target Environment (Integration, Preproduction, Production)
      5. DatabaseName (epicms, epicommerce)
param
(
    [Parameter(Position=0, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]$ClientKey,
    [Parameter(Position=1, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]$ClientSecret,
    [Parameter(Position=2, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]$ProjectID,
    [Parameter(Position=3, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [ValidateSet("Integration", "Preproduction", "Production")]
    [string]$TargetEnvironment,
    [Parameter(Position=4, Mandatory)]
    [ValidateSet("epicms", "epicommerce")]
    [string]$DatabaseName
)
    2. Validate the parameters
      1. Not empty/null
      2. Not whitespace
      3. Target Environment has a valid environment name
      4. Database Name has a valid selection
if([string]::IsNullOrWhiteSpace($ClientKey)){
    throw "A Client Key is needed. Please supply one."
}
if([string]::IsNullOrWhiteSpace($ClientSecret)){
    throw "A Client Secret Key is needed. Please supply one."
}
if([string]::IsNullOrWhiteSpace($ProjectID)){
    throw "A Project ID GUID is needed. Please supply one."
}

    3. Check that the EpiCloud PowerShell module is installed
      1. If not, install it (make sure to include -Force, since these cmdlets shipped in a newer EpiCloud version and an older install may need to be replaced)
# Install the EpiCloud module if it is missing, or if only an older version (without the new export cmdlets) is present
if (-not (Get-Module -Name EpiCloud -ListAvailable | Where-Object { $_.Version -ge [version]'0.9.9' })) {
    Write-Host "Installing EpiServer Cloud PowerShell Module"
    Install-Module EpiCloud -Scope CurrentUser -MinimumVersion 0.9.9 -Force
}
    4. Set up and start the export
      1. When setting up the splat object, the Wait parameter can be set to true or false. In my scenario, I want to keep messages flowing to the front-end, so I set it to false. If it is set to true, the command does not return any value until the export is done.
      2. We assign the result to a variable because we want the ID from the returned object, which lets us efficiently get the export status in the next bit of code.
$startEpiExportSplat = @{
    ProjectId = "$ProjectID"
    Wait = $false
    Environment = "$TargetEnvironment"
    DatabaseName = "$DatabaseName"
    ClientSecret = "$ClientSecret"
    ClientKey = "$ClientKey"
}

Write-Host "Starting the Export. Environment: $TargetEnvironment | DB Name: $DatabaseName"

$export = Start-EpiDatabaseExport @startEpiExportSplat
    5. Set up the object and get the current export job
$exportId = $export | Select -ExpandProperty "id"

$getEpiExportSplat = @{
    ProjectId = "$ProjectID"
    ClientSecret = "$ClientSecret"
    ClientKey = "$ClientKey"
    Id = "$exportId"
    Environment = "$TargetEnvironment"
    DatabaseName = "$DatabaseName"
}

$timesRun = 0
$currExport = Get-EpiDatabaseExport @getEpiExportSplat | Select-Object -First 1
$status = $currExport | Select -ExpandProperty "status"
$exit = 0
    6. Set up a loop to query the export status and print the current elapsed time to the screen
      1. This method does not report progress, so to let the user know it is still running, I have it write the elapsed time to the screen each minute
while($exit -ne 1){

    $currExport = Get-EpiDatabaseExport @getEpiExportSplat | Select-Object -First 1
    $status = $currExport | Select-Object -ExpandProperty "status"

    if($status -ne 'InProgress'){
        # The export is no longer running, so stop polling without waiting another minute
        $exit = 1
    }
    else{
        Write-Host "Exporting In Progress. Elapsed time - ${timesRun}:00"
        $timesRun = $timesRun + 1
        Start-Sleep -Seconds 60
    }
}
    7. If the export fails, throw an error
      1. This will also fail the Azure DevOps Release Task if this process fails
    8. If the export succeeds, output the download link and set an Azure DevOps Output Variable
      1. This download link has a max life of 24 hours
if($status -eq "Failed"){
    throw "Export Failed."
}

$downloadLink = $currExport | Select -ExpandProperty "downloadLink"

Write-Host "Export Finished. Download URL is: $downloadLinknn"

#Set the Output variable for the Export URL, if needed
Write-Host "##vso[task.setvariable variable=ExportDownload;]'$downloadLink'"
Write-Host "Output Variable Created. Name: ExportDownload | Value: $downloadLink"

Invoking the script looks something like the following:

 
.\Perficient_ExportDatabase.ps1 -ClientKey "****" `
                                -ClientSecret "****" `
                                -ProjectID "****" `
                                -TargetEnvironment "Integration" `
                                -DatabaseName "epicms"
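
Inside an Azure DevOps release, a later PowerShell task could then consume the ExportDownload output variable and download the BACPAC before the 24-hour link expires. A minimal sketch (the local file name and use of System.DefaultWorkingDirectory are illustrative assumptions, not part of the original script):

# Illustrative follow-up task: download the exported BACPAC using the output variable set above.
# "$(ExportDownload)" is expanded by Azure DevOps before this inline script runs.
$exportUrl = "$(ExportDownload)"
Invoke-WebRequest -Uri $exportUrl -OutFile "$(System.DefaultWorkingDirectory)\epicms.bacpac"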

Conclusion

At this point, from the contents of this post, you should be able to:

  • Use the existing promotion script to promote downward within the DXP environments
  • Create a script that will allow you to export a database from any one of your DXP environments
As I mentioned in my last post, the next post (hopefully, if Epi doesn’t release more API features) will show how we use all of these deployment API features to start creating our CI/CD pipeline inside Azure DevOps.
Eric Markson

Technical Architect @ Verndale | Optimizely MVP - Platinum | Optimizely Content Cloud SME - Platinum


8 Comments

  1. Thanks for your efforts putting this together. It would be good if you can put these ps scripts in Github repo 🙂

  2. Great post, thank you for putting it together, I'm sure this will be really helpful to many people!

    A side note regarding Start-EpiDatabaseExport, you can specify "-Wait" together with "-ShowProgress" to get messages posted to the PowerShell Progress-stream, and/or add "-Verbose" to have the status posted to the Verbose stream (similar functionality in other EpiCloud cmdlets as well).

    If needed in Azure DevOps, the verbose stream can be redirected to the output using "4>&1" at the end using redirection (https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_redirection).

    That way, you get messages out without having to add your own loop/polling logic, if you'd prefer that.

    Thanks again for putting this information together!

    • Hey Anders,

Thank you for that. The only reason this may not work in an Azure DevOps environment is because it doesn't allow for Write-Progress, unfortunately.

I do see in the EpiCloud code that you do have some other Verbose messages, but the main polling message won't show.

    • We should post an "Export status: " verbose message each time we poll the API, are you sure that doesn't work in Azure DevOps?

      It's a bit of a hack, but you can try to add:
      4>&1 | %{ Write-Host $_ }

      After the call of the Start-EpiDatabaseExport cmdlet to force the verbose messages to be posted to the Information stream. Make sure you also specify the "-Verbose" switch for Start-EpiDatabaseExport though 🙂

    • Hey Anders,

      Just wanted to follow up with this 🙂

      I did run some of these scripts yesterday in DevOps, and it does look like the Verbose option works quite well, which is nice!

      I did also discover that powershell has some nice hooks into DevOps via some variables. I'm wondering if there is a way to detect that this is running on DevOps (maybe check a variable?) and then provide the ability to set the progress that way?

      https://thinkrethink.net/2017/12/13/vsts-powershell-commands/

    • Glad to hear that it works! Also, thanks for following up on that! 🙂

      That's definitely interesting! Thanks for bringing that up, I'll try to make a note of that for future updates. We do have some plans (no dates decided though) of making more dedicated tools/addons for common deployment tools like Azure DevOps and Octopus Deploy as well to make it even easier to get started with the deployment API. But a PowerShell module felt like a good first step since it enables integration with pretty much anything with a CPU these days 🙂

      These details could be really helpful in improving this further though. Thanks again!

  3. Great posts, looking forward to seeing how we start creating our CI/CD pipeline inside Azure DevOps.
