Continuous Integration and Continuous Delivery (CI/CD) for Power Platform¶
Adoption of Continuous Integration/Continuous Delivery (CI/CD) can increase development complexity for Citizen Developers. However, a scalable and automated development process that can quickly incorporate new features and bug fixes is critical for a reliable, sustainable fusion development process.
This guidance supplements basics of Power Platform ALM and CI/CD Engineering Fundamentals. It facilitates modern software engineering practices in the context of development by a fusion team.
To facilitate this process, we recommend the following considerations:
- Text-based source stored in a Git repository in Azure DevOps or GitHub.
- Azure DevOps Pipelines or GitHub Workflows that build, test, validate, and deploy changes based on Pull Requests.
- Merge triggers build and deploy to dev/test/prod (depending on branch).
- Automating source code deployment between the development environment and the Git repository.
- Packaging all Power Platform development work using Power Platform Solutions.
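As an illustration only, the recommendations above map to a workflow trigger layout along these lines (the workflow name and branch names are placeholders, not part of the guidance):

```yaml
# Hypothetical GitHub workflow skeleton for the flow described above:
# Pull Requests run build/test/validate; merges trigger deployment.
name: power-platform-ci
on:
  pull_request:
    branches: [main]   # build, test, and validate changes from Pull Requests
  push:
    branches: [main]   # merge triggers build and deploy (dev/test/prod per branch strategy)
```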
Environments¶
Environment setup and configuration are critical to sustainable and reliable CI/CD. Power Platform environments can be set up to isolate different stages of solution development, testing, and deployment. The CI/CD process moves changes through those stages. For more information about Power Platform environments, see Power Platform Environments Overview.
Development Environment Approaches¶
Application development in Power Platform consists of two main areas: low-code customizations and more traditional code-first approaches. Depending on the size, scale, and requirements, solutions may use either low-code, code-first, or both.
A tool and procedure-based approach to environments allows multiple developers to simultaneously work on a single artifact within a Power Platform solution.
The conceptual outline of two such approaches - single and multiple development environments - is presented below.
For more information about the tools that automate the co-development process, see Engineering Tools.
Approach 1 - Single Development Environment¶
For small projects where work items do not overlap or the development team is small (1-3 developers), we recommend using a single `Main Dev` environment for all the developers.
Pros:
- Easy to build ALM process.
- Low effort to set up.
Cons:
- Does not scale for large development teams - as the team and project grow, the requirements for clear communication about who is working on what also grow.
- As all features are developed in the same environment, the entire current state is deployed during release, so individual feature releases are not practical. Either the whole set of updates is released together, or a painstaking manual roll-back of customizations is required.
```mermaid
graph LR
    A(MAIN DEV) --> B(VALIDATION) & C(TEST)
    C --> D(PROD)
```

Environment strategy Approach 1. Here, `Main Dev` is the place where all the in-progress features are developed.
Approach 2 - Multiple Development Environments¶
In more complex projects or larger teams, we recommend each feature be developed in a separate `Feature Dev` environment and then merged back to `Main Dev`.
Developer environments can be used for this approach.
Pros:
- Ability to quickly scale ALM process.
- `Main Dev` contains only the ready-to-deploy state.
- Developers will not overwrite each other's work.
Cons:
- More complex to set up and maintain than a single development environment approach.
- `Feature Dev` environment creation has to be automated, as manual setup may be complex and ultimately take more time and effort than the work itself.
- Need to keep the same correct, up-to-date state (configuration, customizations, integrations, processes) on a `Feature Dev` environment as on `Main Dev` to ensure consistency in development.
```mermaid
graph LR
    A1[DEV Feature-1] --> A
    A2[DEV Feature-2] --> A
    A3[DEV Feature-3] --> A
    A(MAIN DEV) --> B(VALIDATION)
    A(MAIN DEV) --> C(TEST)
    C --> D(PROD)
```

Environment strategy Approach 2. Each feature is developed in its own `Feature Dev` environment before merging into `Main Dev`.
Continuous Integration¶
To scale well during the development process, one needs to quickly and easily package the solution in a deployable form, together with the history of previous changes. The best way to facilitate easy scaling is to automatically integrate low-code and professional developers' changes into a single codebase.
The steps in this process vary depending on the Power Apps components and configuration used in your project, as well as the testing tools and commands used in the CI/CD process.
Microsoft Power Platform CLI (often referred to as `PAC CLI`) can be used to execute most of the needed actions.
To wrap the CLI commands into pipeline tasks, use Power Platform Build Tools with Azure DevOps, or Power Platform Actions for GitHub.
Use the dedicated extension for the DevOps platform of your choice, and if a needed piece is not available, fall back to the `PAC CLI` command tools.
Export Power Platform Solution(s)¶
```yaml
- name: Export Solution
  uses: microsoft/powerplatform-actions/export-solution@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    solution-name: Solution
    solution-output-file: 'Solution.zip'
    working-directory: 'out-solution'
```
Always add all your solution-aware components to the solution(s) that will be used in the development process.
See the list of all solution-aware components in the Power Platform documentation.
Unpack Solution¶
Because Dataverse solutions are exported as zip files, storing them directly in a source repository does not allow tracking individual changes or using diff-based review. It is preferable to "unpack" a solution into a text-based representation of its component parts.
Examples:
```yaml
# Unpacking is a local operation on the exported zip file; no environment
# credentials are required for this action.
- name: Unpack Solution
  uses: microsoft/powerplatform-actions/unpack-solution@v0.9.1
  with:
    solution-file: './Solution.zip'
    solution-folder: 'out/solution'
    solution-type: 'Unmanaged'
    overwrite-files: true
```
Export non-solution aware data¶
Ideally, everything you retrieve from the development environment is a solution-aware component, which can be downloaded as an archive file and stored in your repository.
However, you will often need to export elements that live outside the solution, such as Dataverse records, and treat them as part of the package.
Exporting Dataverse records can be achieved with the `PAC CLI` data commands.
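For example, a minimal sketch using `pac data export` (the file names are placeholders; the schema file describing which tables and columns to export is created with the Configuration Migration tool):

=== "PAC CLI"

    ``` pwsh
    # schema.xml defines the tables and columns to export; both paths are placeholders
    pac data export --schemaFile .\schema.xml --dataFile .\SampleData.zip
    ```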
Some environment variables in the solution need to be defined by the user when the solution is imported for the first time. These variables can be exported as a deployment settings file for reuse, for example when you import the solution during the CD process.
=== "PAC CLI"
``` pwsh
pac solution create-settings --solution-zip .\SampleSolution.zip --settings-file .\SampleDeploymentSettingsDev.json
```
Working with Power Pages¶
CI/CD tooling with Power Platform varies depending on the services used. For example, when using Power Pages, you can import and export your site using the CLI.
```yaml
- name: Download Power Pages Portal
  uses: microsoft/powerplatform-actions/download-paportal@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    download-path: './portal'
    website-id: 00000000-0000-0000-0000-000000000000
    overwrite: true
```
Set Version¶
When you commit a new solution to source control, ensure that you increment the version number of the solution. This can be done in the solution settings or via `PAC CLI` (see below). You can also use Git tags to quickly correlate the version in source control with a deployed version.
Solutions can be versioned from the CLI before exporting them from Dataverse.
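For example, a hedged sketch with `pac solution online-version` (the solution name and version number are placeholders):

=== "PAC CLI"

    ``` pwsh
    # Sets the version of the named solution in the currently authenticated environment
    pac solution online-version --solution-name SampleSolution --solution-version 1.1.0.0
    ```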
You can also update the version number using a find-and-replace tool on the command line, in a workflow, or in a pipeline. Updates can be done at the time of unpack (import) or pack (export).
Validate Solution Correctness¶
The tools listed below perform static analysis of your solution to ensure that your codebase follows best practices:
- Solution Checker - a dedicated tool for Power Platform Model-Driven Apps. This task should be included in a Validation Pipeline.
- Portal Checker - a dedicated tool for Power Platform Portals.
The Solution Checker and Portal Checker tools can both be executed on demand. The Solution Checker can also be run from the `PAC CLI` or as a task in your pipeline.
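For example, a sketch of an on-demand run with `pac solution check` (the solution path, output directory, and geography are placeholders):

=== "PAC CLI"

    ``` pwsh
    # Analyzes the packed solution and writes the results to the output directory
    pac solution check --path .\SampleSolution.zip --outputDirectory .\checker-output --geo UnitedStates
    ```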
Quality Checks for Custom Code¶
When including custom code in Power Pages and templates, you can add linters and code-quality checks to a Validation Pipeline to ensure clean custom code is committed to source control. Regardless of the coding language and syntax patterns you're using, there are multiple options for specifying which code you would like to lint.
For example, ESLint is a tool for maintaining JavaScript code quality. This linter can be used to validate inline scripts contained in HTML files through the eslint-plugin-html. Using a linter will provide consistent custom JavaScript code quality throughout your solution.
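As a sketch, a minimal ESLint configuration enabling the HTML plugin might look like this (the file name `.eslintrc.json` and the rule choices are illustrative; `eslint` and `eslint-plugin-html` are assumed to be installed as dev dependencies):

```json
{
  "plugins": ["html"],
  "extends": "eslint:recommended",
  "env": {
    "browser": true
  }
}
```

With this in place, running `eslint` over your portal files also extracts and lints the `<script>` contents of `.html` files.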
SonarQube is also a powerful tool for checking code quality. It includes thousands of predefined static code analysis rules for Java, CSS, and HTML, among other languages. The tool identifies bugs, vulnerabilities, and code smells, and provides remediation guidance and code coverage reports. SonarQube can be integrated directly into Azure DevOps pipelines. If you want to use this tool, consider hosting an independent instance of a SonarQube server for stronger security and governance.
Validate solution deployability¶
Downloading the solution and other artifacts does not confirm that the project can be successfully deployed to production. The longer the time between development and production release, the more likely that deployment will fail due to missing dependencies.
You can upload your solution(s) to a copy of your production system and run your tests against it. Successful tests confirm that your solution is ready for production deployment. The validation environment can be treated as ephemeral and can be created and destroyed on demand.
- Create your validation environment:

    ```yaml
    - task: PowerPlatformCreateEnvironment@2
      inputs:
        authenticationType: 'PowerPlatformSPN'
        PowerPlatformSPN: ${{ parameters.connectionName }}
        DisplayName: 'validation'
        EnvironmentSku: 'Sandbox'
        LocationName: 'unitedstates'
        LanguageName: 'English (United States)'
        CurrencyName: 'USD'
        DomainName: 'validation'
    ```
- Copy your production environment.
- Deploy and test your solution(s), as described in the Continuous Delivery section.
Automated Unit Testing¶
Once your solution is in the repository, you can add automated unit tests that help ensure your solution works as expected. The tests can be executed as part of your CI/CD pipeline, either against the environment used for deployability validation or against a dedicated test environment.
Continuous Delivery¶
Once the solution is stored and versioned in the code repository, you will want to automate the release process of your codebase.
Import Solution(s)¶
As in the CI process, deployment of your extracted solution stored in the repository can be automated using an extension for your DevOps platform of choice. If such an extension is not available, the Microsoft Power Platform CLI can be used.
Firstly, you will need to pack your artifacts back into solution zip file(s). As a best practice, a managed solution should be imported into all non-dev environments.
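As a sketch, the pack step mirrors the earlier unpack step (this assumes the solution source was exported and unpacked in its managed form, so that a managed package can be produced; paths are placeholders):

```yaml
- name: Pack Solution
  uses: microsoft/powerplatform-actions/pack-solution@v0.9.1
  with:
    solution-file: './Solution.zip'
    solution-folder: 'out/solution'
    solution-type: 'Managed'
```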
After the solution is packed, you can import it:
```yaml
- name: Import Solution
  uses: microsoft/powerplatform-actions/import-solution@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    solution-file: './Solution.zip'
    activate-plugins: true
    use-deployment-settings-file: true
    deployment-settings-file: './SampleDeploymentSettingsTestEnv.json'
```
With the `--settings-file` parameter (`deployment-settings-file` in the GitHub action above), you can import environment-specific values such as connections and environment variables. Every time you have a setting specific to a given environment, you can store it in an environment variable and import it using this parameter.
Import Non-Solution Aware Data¶
After the solution is imported, you can import other elements, such as data, that you need in the target environment. This set may differ between environments, so you can keep more than one dataset in your repository.
Importing non-solution-aware data using tools like `PAC CLI` will not remove existing records; it only creates or updates data in your target environment. Records are matched by their unique GUID: if a record with that GUID exists, it is updated; otherwise, a new record is created. This simplifies the release process, because records deployed through the automated release process keep the same GUID across environments.
If you need a CD step that deletes records prior to the import task, use the Dataverse SDK (.NET Framework and .NET Core only) or the Dataverse Web API with any programming language. Libraries in various languages exist to support this task, for example Microsoft.Xrm.Data.PowerShell for PowerShell.
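As a heavily hedged sketch using `Microsoft.Xrm.Data.PowerShell` (the entity name `sample_config`, its ID column, and the credential variable are placeholders; verify cmdlet parameters against the module documentation):

``` pwsh
# Connect to the target environment, then delete all records of a placeholder entity
$conn = Connect-CrmOnline -ServerUrl "https://myenv.crm.dynamics.com" -Credential $cred
$records = Get-CrmRecords -conn $conn -EntityLogicalName sample_config -Fields sample_configid
foreach ($record in $records.CrmRecords) {
    Remove-CrmRecord -conn $conn -EntityLogicalName sample_config -Id $record.sample_configid
}
```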
Smoke Tests¶
Once a solution is imported, run smoke tests to validate that the solution is working as expected. As it is not possible to test all scenarios, a post-deployment smoke test helps validate that the most critical scenarios work as expected.
CI/CD Work Diagram¶
```mermaid
sequenceDiagram
    participant DevOps
    participant Dev Environment
    participant Validation Environment
    participant Test Environment
    participant Production Environment
    rect rgb(50, 255, 255, 0.2)
        note over DevOps, Production Environment: Continuous Integration
        DevOps ->> Dev Environment: Set Version
        Dev Environment ->> DevOps: Export Solution
        DevOps ->> DevOps: Extract Solution
        Dev Environment ->> DevOps: Export Data
        DevOps ->> DevOps: Validate Solution Correctness
        DevOps ->> Validation Environment: Create Environment
        DevOps ->>+ Production Environment: Copy Environment
        Production Environment -->>- Validation Environment: Copy Environment
        DevOps ->> Validation Environment: Execute Automated UI Tests
    end
    rect rgb(50, 255, 255, 0.2)
        note over DevOps, Production Environment: Continuous Delivery
        loop Test Deployment
            DevOps ->> Test Environment: Import Solution
            DevOps ->> Test Environment: Import Data
            DevOps ->> Test Environment: Execute Smoke Tests
        end
        loop Production Deployment
            DevOps ->> Production Environment: Import Solution
            DevOps ->> Production Environment: Import Data
            DevOps ->> Production Environment: Execute Smoke Tests
        end
    end
```
References¶
Application lifecycle management with Microsoft Power Platform