In my last post, we went over the deployment process in the EpiServer DXP environment. I mentioned that the next post would be related to using this within Azure DevOps, but I need to make a slight adjustment based on today’s update.
As of today (4/27/2020), EpiServer has released weekly update #313, which includes a number of enhancements to the Deployment API.
In this post, we’re going to go through the following:
- The new features for the DXP Deployment API from this release
- How to use the scripts in my previous post to take advantage of the sync-down
- How to build a script to export your environment database
Update 313 – What is New?
These new features arrived in version 0.9.9 of the EpiCloud PowerShell module.
Support for Content sync-down
This feature allows deployments to run in the opposite direction: you can now go from Production down to Preproduction or Integration.
Database Export as BACPAC file
This will be great for troubleshooting, and potentially for pushing a database package up to other environments using the Deployment Package script. (This has yet to be tested…just thinking out loud.)
PowerShell Scripts
Below, I will explain how to do the following things:
- Content Sync-Down using the existing “Perficient_PromoteToEnvironment” script
- Creating a script to export a database from an environment
Content Sync-Down
Production → Preproduction → Integration
As we saw in Part 1, we could use the script to promote upward; now we need to promote downward.
The script is already set up to do this. We just have to specify the arguments so that the ‘SourceEnvironment’ is higher in the stream than the ‘TargetEnvironment.’
.\Perficient_PromoteToEnvironment.ps1 -ClientKey "****" -ClientSecret "****" -ProjectID "****" -SourceEnvironment "Production" -TargetEnvironment "Preproduction" -UseMaintenancePage 1 -IncludeBlobs $false -IncludeDb 0
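If you would rather call EpiCloud directly instead of going through the wrapper script, the sync-down boils down to a Start-EpiDeployment call where the source environment sits above the target. Below is a minimal sketch only; the parameter names SourceApp, IncludeBlob, and IncludeDb are my reading of the 0.9.9 module, so double-check them with Get-Help Start-EpiDeployment.

# Sketch only: a content sync-down using the EpiCloud cmdlets directly.
# Parameter names below (SourceApp, IncludeBlob, IncludeDb) are assumptions
# based on the 0.9.9 module; verify with Get-Help Start-EpiDeployment.
Connect-EpiCloud -ClientKey "****" -ClientSecret "****" -ProjectId "****"

$syncDownSplat = @{
    ProjectId         = "****"
    SourceEnvironment = "Production"     # higher environment
    TargetEnvironment = "Preproduction"  # lower environment
    SourceApp         = "cms"            # which app's content to copy down
    IncludeBlob       = $true
    IncludeDb         = $true
    Wait              = $true            # block until the deployment finishes
}
Start-EpiDeployment @syncDownSplat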
Export Environment Database
This script makes use of two new EpiCloud cmdlets:
- Start-EpiDatabaseExport
- Get-EpiDatabaseExport
- Invoke the script with 5 mandatory parameters:
  - Client Key for the intended environment
  - Client Secret for the intended environment
  - Project ID (global for the subscription)
  - Target Environment (Integration, Preproduction, Production)
  - DatabaseName (epicms or epicommerce)
param (
    [Parameter(Position=0, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]$ClientKey,

    [Parameter(Position=1, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]$ClientSecret,

    [Parameter(Position=2, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string]$ProjectID,

    [Parameter(Position=3, Mandatory)]
    [ValidateNotNullOrEmpty()]
    [ValidateSet("Integration", "Preproduction", "Production")]
    [string]$TargetEnvironment,

    [Parameter(Position=4, Mandatory)]
    [ValidateSet('epicms', 'epicommerce')]
    [string]$DatabaseName
)
- Validate the parameters:
  - Not empty/null
  - Not whitespace
  - Target Environment has a valid environment name
  - Database Name has a valid selection
if ([string]::IsNullOrWhiteSpace($ClientKey)) {
    throw "A Client Key is needed. Please supply one."
}
if ([string]::IsNullOrWhiteSpace($ClientSecret)) {
    throw "A Client Secret Key is needed. Please supply one."
}
if ([string]::IsNullOrWhiteSpace($ProjectID)) {
    throw "A Project ID GUID is needed. Please supply one."
}
- Check to see that the EpiCloud PowerShell module is installed
  - If it is not, install it (make sure to include -Force, since this feature came with a new EpiCloud version)
if (-not (Get-Module -Name EpiCloud -ListAvailable)) {
    Write-Host "Installing EpiServer Cloud Powershell Module"
    Install-Module EpiCloud -Scope CurrentUser -Force
}
- Set up and start the export
  - When building the splat, the Wait parameter can be set to true or false. In my scenario, I want to keep messages flowing to the front end, so I set it to false. If it is set to true, the command will not return a value until it is done.
  - We assign the result to a variable because we need the ID from the returned object, which lets us efficiently query the export status in the next bit of code.
$startEpiExportmentSplat = @{
    ProjectId = "$ProjectID"
    Wait = $false
    Environment = "$TargetEnvironment"
    DatabaseName = "$DatabaseName"
    ClientSecret = "$ClientSecret"
    ClientKey = "$ClientKey"
}

Write-Host "Starting the Export. Environment: $TargetEnvironment | DB Name: $DatabaseName"
$export = Start-EpiDatabaseExport @startEpiExportmentSplat
- Set up the query object and get the current export job
$exportId = $export | Select -ExpandProperty "id"

$getEpiExportSplat = @{
    ProjectId = "$ProjectID"
    ClientSecret = "$ClientSecret"
    ClientKey = "$ClientKey"
    Id = "$exportId"
    Environment = "$TargetEnvironment"
    DatabaseName = "$DatabaseName"
}

$timesRun = 0
$currExport = Get-EpiDatabaseExport @getEpiExportSplat | Select-Object -First 1
$status = $currExport | Select -ExpandProperty "status"
$exit = 0
- Set up a loop to query the export status and print the current elapsed time to the screen
  - The API does not report progress, so to let the user know the export is still running, the script writes the elapsed time to the screen each minute
while ($exit -ne 1) {
    $currExport = Get-EpiDatabaseExport @getEpiExportSplat | Select-Object -First 1
    $status = $currExport | Select -ExpandProperty "status"

    Write-Host "Exporting In Progress. Elapsed time - "$timesRun":00"

    if ($status -ne 'InProgress') {
        $exit = 1
    }
    else {
        # Poll once a minute while the export is still running
        $timesRun = $timesRun + 1
        Start-Sleep -Milliseconds 60000
    }
}
- If the export fails, throw an error
  - This will also fail the Azure DevOps release task
- If the export succeeds, output the download link and set an Azure DevOps output variable
  - The download link is only valid for a maximum of 24 hours
if ($status -eq "Failed") {
    throw "Export Failed."
}

$downloadLink = $currExport | Select -ExpandProperty "downloadLink"
Write-Host "Export Finished. Download URL is: $downloadLink`n"

#Set the Output variable for the Export URL, if needed
Write-Host "##vso[task.setvariable variable=ExportDownload;]$downloadLink"
Write-Host "Output Variable Created. Name: ExportDownload | Value: $downloadLink"
Invoking the script looks something like the following:
.\Perficient_ExportDatabase.ps1 -ClientKey "****" -ClientSecret "****" -ProjectID "****" -TargetEnvironment "Integration" -DatabaseName "epicms"
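Once the ExportDownload output variable is set, a later step in the same pipeline can pull the BACPAC down before the link expires. Here is a minimal sketch of such a follow-up PowerShell task; ExportDownload is the variable name set by the script above, and the destination path is just an example.

# Hypothetical follow-up pipeline step: download the exported BACPAC.
# Remember the download link is only valid for up to 24 hours.
$downloadUrl = "$(ExportDownload)"                                  # expanded by Azure DevOps
$destination = "$(Build.ArtifactStagingDirectory)\epicms.bacpac"    # example path

Invoke-WebRequest -Uri $downloadUrl -OutFile $destination
Write-Host "BACPAC saved to $destination"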
Conclusion
In this post, we covered how to:
- Use the existing promotion script to promote downward within the DXP environments
- Create a script that exports a database from any of your DXP environments
Comments
Thanks for your efforts putting this together. It would be great if you could put these PS scripts in a GitHub repo 🙂
Great post, thank you for putting it together! I'm sure this will be really helpful to many people!
A side note regarding Start-EpiDatabaseExport, you can specify "-Wait" together with "-ShowProgress" to get messages posted to the PowerShell Progress-stream, and/or add "-Verbose" to have the status posted to the Verbose stream (similar functionality in other EpiCloud cmdlets as well).
If needed in Azure DevOps, the verbose stream can be redirected to the output by appending "4>&1" (see https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_redirection).
That way, you get messages out without having to add your own loop/polling logic, if you'd prefer that.
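For example, something along these lines (just a sketch, reusing the splat from the post with Wait flipped to $true):

# Sketch: let Start-EpiDatabaseExport handle the polling and surface its
# verbose status messages in the build log by redirecting the verbose
# stream (4) into the output stream (1).
$startEpiExportmentSplat.Wait = $true

Start-EpiDatabaseExport @startEpiExportmentSplat -Verbose 4>&1 |
    ForEach-Object { Write-Host $_ }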
Thanks again for putting this information together!
Hey Anders,
Thank you for that. The only reason this may not work in an Azure DevOps environment is that it doesn't allow for Write-Progress, unfortunately.
I do see in the EpiCloud code that you do have some other verbose messages, but the main polling message won't show.
We should be posting an "Export status: " verbose message each time we poll the API; are you sure that doesn't work in Azure DevOps?
It's a bit of a hack, but you can try to add:
4>&1 | %{ Write-Host $_ }
After the call of the Start-EpiDatabaseExport cmdlet to force the verbose messages to be posted to the Information stream. Make sure you also specify the "-Verbose" switch for Start-EpiDatabaseExport though 🙂
Hey Anders,
Just wanted to follow up with this 🙂
I did run some of these scripts yesterday in DevOps, and it does look like the Verbose option works quite well, which is nice!
I did also discover that PowerShell has some nice hooks into DevOps via some variables. I'm wondering if there is a way to detect that this is running on DevOps (maybe check a variable?) and then provide the ability to set the progress that way?
https://thinkrethink.net/2017/12/13/vsts-powershell-commands/
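Something like this might work (an untested sketch; it assumes Azure Pipelines agents set the TF_BUILD environment variable and that the task.setprogress logging command is available):

# Untested sketch: detect Azure DevOps and report progress via a logging
# command instead of Write-Progress.
$runningInAzureDevOps = $env:TF_BUILD -eq "True"

if ($runningInAzureDevOps) {
    # Drives the progress indicator on the running pipeline task
    Write-Host "##vso[task.setprogress value=50;]Exporting database"
}
else {
    Write-Progress -Activity "Exporting database" -PercentComplete 50
}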
Glad to hear that it works! Also, thanks for following up on that! 🙂
That's definitely interesting! Thanks for bringing that up; I'll try to make a note of it for future updates. We do have plans (no dates decided, though) to make more dedicated tools/add-ons for common deployment tools like Azure DevOps and Octopus Deploy as well, to make it even easier to get started with the Deployment API. But a PowerShell module felt like a good first step since it enables integration with pretty much anything with a CPU these days 🙂
These details could be really helpful in improving this further though. Thanks again!
Great posts! Looking forward to seeing how we start creating our CI/CD pipeline inside Azure DevOps.
Thanks Mike! Should be getting the first post within the next week or so!