
Continuous Integration and Continuous Delivery (CI/CD) for Power Platform

Adoption of Continuous Integration/Continuous Delivery (CI/CD) can increase development complexity for Citizen Developers. However, a scalable, automated development process that can quickly incorporate new features and bug fixes is critical for a reliable, sustainable fusion development process.

This guidance supplements the basics of Power Platform ALM and the CI/CD Engineering Fundamentals. It facilitates modern software engineering practices in the context of development by a fusion team.

To facilitate this process, we recommend the following considerations:

  • Text-based source stored in a Git repository in Azure DevOps or GitHub.
  • Azure DevOps Pipelines or GitHub Workflows that build, test, validate, and deploy changes based on Pull Requests.
  • Merge triggers that build and deploy to dev/test/prod (depending on the branch).
  • Automating source code deployment between the development environment and the Git repository.
  • Packaging all Power Platform development work using Power Platform Solutions.

Environments

Environment setup and configuration are critical to sustainable and reliable CI/CD. Power Platform environments can be set up to isolate different stages of solution development, testing, and deployment. The CI/CD process moves changes through those stages. For more information about Power Platform environments, see Power Platform Environments Overview.

Development Environment Approaches

Application development in Power Platform consists of two main areas: low-code customizations and more traditional code-first approaches. Depending on size, scale, and requirements, solutions may use low-code, code-first, or both.

A tool and procedure-based approach to environments allows multiple developers to simultaneously work on a single artifact within a Power Platform solution.

The conceptual outline of two such approaches - single and multiple development environments - is presented below.

For more information about the tools that automate the co-development process, see Engineering Tools.

Approach 1 - Single Development Environment

For small projects where work items do not overlap or the development team is small (1-3 developers), we recommend using a single Main Dev environment for all the developers.

Pros:

  • Easy to build ALM process.
  • Low effort to set up.

Cons:

  • Does not scale for large development teams - as the team and project grow, the requirements for clear communication about who is working on what also grow.
  • As all features are developed in the same environment, the entire current state is deployed during release, so individual feature releases are not practical. Either the whole set of updates is released together, or a painstaking manual roll-back of customizations is required.
graph LR
    A(MAIN DEV) --> B(VALIDATION) & C(TEST)
    C --> D(PROD)

Environment strategy 1. Main Dev is where all in-progress features are developed

Approach 2 - Multiple Development Environments

In more complex projects or larger teams, we recommend each feature be developed on a separate Feature Dev environment and then merged back to the Main Dev.

Developer environments can be used for this approach.

Pros:

  • Ability to quickly scale ALM process.
  • Main Dev contains only the ready-to-deploy state.
  • Developers will not overwrite each other's work.

Cons:

  • More complex to set up and maintain than a single development environment approach.
  • Feature Dev environment creation has to be automated, as manual setup may be complex and ultimately take more time and effort than the work itself.
  • Each Feature Dev environment must be kept in the same correct, up-to-date state (configuration, customizations, integrations, processes) as Main Dev to ensure consistency in development.
graph LR
    A1[DEV Feature-1] --> A
    A2[DEV Feature-2] --> A
    A3[DEV Feature-3] --> A
    A(MAIN DEV) --> B(VALIDATION)
    A(MAIN DEV) --> C(TEST)
    C --> D(PROD)

Environment strategy 2. Each feature is developed in its own Feature Dev environment before merging into Main Dev

Continuous Integration

To scale well during development, you need to quickly and easily package the solution into a deployable form with a history of previous changes. The best way to facilitate this is to automatically integrate low-code and professional developers' changes into a single codebase.

The steps in this process vary depending on the Power Apps and configuration used in your project, as well as the testing tools and commands used in the CI/CD process.

Microsoft Power Platform CLI (often referred to as PAC CLI) can be used to execute most of the needed actions.

To wrap the CLI commands into tasks or actions, use Power Platform Build Tools for Azure DevOps, or Power Platform Actions for GitHub.

Use the dedicated extension for the DevOps platform of your choice; if a needed piece is not available, fall back to the PAC CLI commands.

Export Power Platform Solution(s)

pac solution export --path .\Solution.zip --name SampleComponentSolution --managed false --include general
- name: Export Solution
  uses: microsoft/powerplatform-actions/export-solution@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    solution-name: Solution
    solution-output-file: 'Solution.zip'
    working-directory: 'out-solution'
- task: PowerPlatformExportSolution@2
  displayName: 'Power Platform Export Solution'
  inputs:
    authenticationType: PowerPlatformSPN
    PowerPlatformSPN: ${{ parameters.connectionName }}
    SolutionName: 'Solution'
    SolutionOutputFile: '.\Solution'
    AsyncOperation: true
    MaxAsyncWaitTime: '60'

Always add all your solution-aware components to the solution(s) that will be used in the development process.

See the list of all solution-aware components in the Power Platform documentation.

Unpack Solution

Because Dataverse solutions are exported as zip files, storing them directly in a source repository does not allow tracking individual changes or using diffs. It is preferable to "unpack" a solution into a text-based representation of its component parts.

Examples:

pac solution unpack --zipfile .\SampleSolution.zip --folder .\SampleSolutionUnpacked
- name: Unpack Solution
  uses: microsoft/powerplatform-actions/unpack-solution@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    solution-file: './Solution.zip'
    solution-folder: 'out/solution'
    solution-type: 'Unmanaged'
    overwrite-files: true
- task: PowerPlatformUnpackSolution@2
  displayName: 'Power Platform Unpack Solution'
  inputs:
    SolutionInputFile: './Solution.zip'
    SolutionTargetFolder: 'out/solution'
    SolutionType: 'Unmanaged'

Export non-solution aware data

Ideally, what you retrieve from the development environment are solution-aware components, which can be downloaded as an archive file back to your repository.

However, you will often need to export elements that live outside the solution, such as Dataverse records, and treat them as part of the package.

Exporting Dataverse records can be achieved using the following CLI command:

pac data export --schemaFile .\schema.xml --dataFile data.zip --overwrite
- name: Data Export
  uses: microsoft/powerplatform-actions/export-data@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    schema-file: './schema.xml'
    data-file: './data.zip'
    overwrite: true
- task: PowerPlatformExportData@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    SchemaFile: 'schema.xml'
    DataFile: 'data.zip'
    Overwrite: true

Some environment variables in the solution need to be defined by the user when it is imported for the first time. These variables can be exported to a deployment settings file for reuse, such as when you import the solution in the CD process.

=== "PAC CLI"

``` pwsh
pac solution create-settings --solution-zip .\SampleSolution.zip --settings-file .\SampleDeploymentSettingsDev.json
```
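The generated file can then be committed alongside the solution and adjusted per target environment. A minimal sketch of what such a deployment settings file typically looks like is shown below; the schema name, connection ID, and values are placeholders:

``` json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "sample_ApiBaseUrl",
      "Value": "https://api.dev.example.com"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "sample_SharedSharePoint",
      "ConnectionId": "00000000000000000000000000000000",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
    }
  ]
}
```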

Working with Power Pages

CI/CD tooling with Power Platform varies depending on the services used. For example, when using Power Pages, you can download and upload your site content using the CLI.

pac paportal download --path ".\portal" --webSiteId 00000000-0000-0000-0000-000000000000 --overwrite
- name: Download Power Pages Portal
  uses: microsoft/powerplatform-actions/download-paportal@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    download-path: './portal'
    website-id: 00000000-0000-0000-0000-000000000000
    overwrite: true
- task: PowerPlatformDownloadPaportal@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    DownloadPath: './portal'
    WebsiteId: '00000000-0000-0000-0000-000000000000'
    Overwrite: true

Set Version

When you commit a new solution to source control, ensure that you increment the version number of the solution. This can be done in the solution settings or via the PAC CLI (see below). You can also use Git tags to quickly correlate the version in source control with a deployed version.

Solutions can be versioned using the following CLI command (before downloading from Dataverse):

pac solution online-version --solution-name SampleSolution --solution-version 1.0.0.2
# Coming soon
- task: PowerPlatformSetSolutionVersion@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    SolutionName: 'SampleSolution'
    SolutionVersionNumber: '1.0.0.2'

You can also update the version number using a find-and-replace step on the command line, in a workflow, or in a pipeline. Updates can be done at the time of unpack (after export) or pack (before import).
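For example, a minimal find-and-replace sketch in PowerShell, assuming the unpacked solution keeps its manifest at Other/Solution.xml (the folder name and version below are placeholders):

``` pwsh
# Hypothetical version bump on an unpacked solution; adjust the path to your repo layout.
$manifestPath = '.\SampleSolutionUnpacked\Other\Solution.xml'
[xml]$manifest = Get-Content -Path $manifestPath

# The solution manifest stores the version as <Version> under <SolutionManifest>.
$manifest.ImportExportXml.SolutionManifest.Version = '1.0.0.2'
$manifest.Save((Resolve-Path $manifestPath))
```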

Validate Solution Correctness

The tools listed below perform static analysis of your solution to ensure that your codebase follows best practices:

  1. Solution Checker - a dedicated tool for Power Platform model-driven apps. This task should be included in a Validation Pipeline.
  2. Portal Checker - a dedicated tool for Power Platform portals (Power Pages).

The Solution Checker and Portal Checker tools can both be executed on demand. The Solution Checker tool can also be run via:

pac solution check --path .\Solution.zip --outputDirectory .\samplepackage --geo UnitedStates
- name: Check Solution
  uses: microsoft/powerplatform-actions/check-solution@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    path: './solution.zip'
    checker-logs-artifact-name: './samplepackage'
- task: PowerPlatformChecker@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    FilesToAnalyze: './solution.zip'
    RuleSet: '0ad12346-e108-40b8-a956-9a8f95ea18c9'
    ErrorLevel: 'MediumIssueCount'
    ArtifactDestinationName: './samplepackage'

Quality Checks for Custom Code

When including custom code in Power Pages and Templates, you can add linters and code quality checks to a Validation Pipeline to ensure clean custom code is being committed to source control. Regardless of the coding language and syntax patterns you're using, there are multiple options for specifying which code you would like to lint:

For example, ESLint is a tool for maintaining JavaScript code quality. This linter can be used to validate inline scripts contained in HTML files through the eslint-plugin-html. Using a linter will provide consistent custom JavaScript code quality throughout your solution.
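A minimal sketch of a classic ESLint configuration (.eslintrc.json) that enables the HTML plugin is shown below; the rule set and environment settings are assumptions and should be adapted to your project:

``` json
{
  "plugins": ["html"],
  "extends": "eslint:recommended",
  "env": { "browser": true }
}
```

With a configuration like this in place, a validation step such as `npx eslint --ext .js,.html .` can lint both standalone scripts and scripts embedded in HTML files.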

SonarQube is also a powerful tool for checking code quality. SonarQube includes thousands of automated, predefined static code analysis rules for Java, CSS, and HTML, among other languages. The tool identifies bugs, vulnerabilities, and code smells, and provides remediation guidance and code coverage reports. SonarQube can be directly integrated into Azure DevOps pipelines. If you want to use this tool, consider hosting an independent instance of a SonarQube server for amplified security and governance.
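As an illustration, a sketch of Azure DevOps steps using the SonarQube extension is shown below; the service connection name, project key, source path, and task versions are assumptions and may differ in your setup:

``` yaml
# Hypothetical SonarQube analysis steps for a Validation Pipeline.
- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'my-sonarqube-service-connection'   # service connection to your SonarQube server
    scannerMode: 'CLI'
    configMode: 'manual'
    cliProjectKey: 'power-platform-solution'
    cliSources: 'out/solution'
- task: SonarQubeAnalyze@5
- task: SonarQubePublish@5
  inputs:
    pollingTimeoutSec: '300'
```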

Validate solution deployability

Downloading the solution and other artifacts does not confirm that the project can be successfully deployed to production. The longer the time between development and production release, the more likely that deployment will fail due to missing dependencies.

You can deploy your solution(s) to a copy of your production system and run your tests against it. Successful tests confirm that your solution is ready for production deployment. The validation environment can be treated as ephemeral and can be created and destroyed on demand.

  1. Create your validation environment
pac admin create --type Sandbox --name validation --domain validationEnv
- name: Create Validation Environment
  uses: microsoft/powerplatform-actions/create-environment@v0.9.1
  with:
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    name: 'validation'
    type: 'Sandbox'
    domain: 'validationEnv'
- task: PowerPlatformCreateEnvironment@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    DisplayName: 'validation'
    EnvironmentSku: 'Sandbox'
    LocationName: 'unitedstates'
    LanguageName: 'English (United States)'
    CurrencyName: 'USD'
    DomainName: 'validation'
  1. Copy your production environment.
  pac admin copy --source-env-name production --target-env-name validation --type MinimalCopy
- name: Copy Environment
  uses: microsoft/powerplatform-actions/copy-environment@v0.9.1
  with:
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    source-url: 'https://production.crm.dynamics.com'
    target-env: 'https://validation.crm.dynamics.com'
    type: 'MinimalCopy'
- task: PowerPlatformCopyEnvironment@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    Environment:  'https://production.crm.dynamics.com'
    TargetEnvironmentUrl: 'https://validation.crm.dynamics.com'
  1. Deploy and test your solution(s), as described in the Continuous Delivery section.

Automated Unit Testing

Once your solution is in the repository, you can add automated unit tests that help ensure your solution works as expected. The tests can be executed as part of your CI/CD pipeline, against the environment used for solution deployability testing or against a dedicated test environment. A minimal pipeline step for code-first components is sketched below.
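As a sketch, if your solution includes code-first components (for example Dataverse plug-ins) with .NET test projects, a validation step could look like the following; the project glob is an assumption about your repository layout:

``` yaml
# Hypothetical test step for code-first components in a Validation Pipeline.
- task: DotNetCoreCLI@2
  displayName: 'Run unit tests for code-first components'
  inputs:
    command: 'test'
    projects: 'src/**/*Tests.csproj'
    arguments: '--configuration Release'
```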

Continuous Delivery

Once the solution is stored and versioned in the code repository, you will want to automate the release process of your codebase.

Import Solution(s)

As in the CI process, deployment of your extracted solution stored in the repository can be automated using the extension for your DevOps platform of choice. If such an extension is not available, the Microsoft Power Platform CLI can be used:

First, you will need to pack your artifacts back into solution zip file(s). As a best practice, managed solutions should be imported into all non-development environments.

pac solution pack --zipfile .\SampleSolution.zip --folder .\SampleSolutionUnpacked --packagetype Managed
- name: Pack Solution
  uses: microsoft/powerplatform-actions/pack-solution@v0.9.1
  with:
    solution-file: './SampleSolution.zip'
    solution-folder: './SampleSolutionUnpacked'
    solution-type: 'Managed'
- task: PowerPlatformPackSolution@2
  inputs:
    SolutionSourceFolder: './SampleSolutionUnpacked'
    SolutionOutputFile: './SampleSolution.zip'
    SolutionType: 'Managed'

After the solution is packed, you can import it:

pac solution import --path .\Solution.zip --activate-flows --activate-plugins --settings-file .\SampleDeploymentSettingsTestEnv.json
- name: Import Solution
  uses: microsoft/powerplatform-actions/import-solution@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    solution-file: './Solution.zip'
    activate-plugins: true
    use-deployment-settings-file: true
    deployment-settings-file: './SampleDeploymentSettingsTestEnv.json'
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    SolutionInputFile: './Solution.zip'
    UseDeploymentSettingsFile: true
    DeploymentSettingsFile: './SampleDeploymentSettingsTestEnv.json'

With the --settings-file parameter, you can import environment-specific values such as connections and environment variables. Whenever a setting is specific to a given environment, you can store it in an environment variable and import it using this parameter.

Import Non-Solution Aware Data

After the solution is imported, you can import other elements, such as data, that you need in the target environment. This set may differ between environments, so you can keep more than one dataset in your repository.

pac data import --dataFile .\dataFolderOrZip
- name: Import Data
  uses: microsoft/powerplatform-actions/import-data@v0.9.1
  with:
    environment-url: 'https://myenv.crm.dynamics.com'
    user-name: 'me@myenv.onmicrosoft.com'
    password-secret: ${{ secrets.MYPASSWORD }}
    data-file: './dataFolderOrZip'
- task: PowerPlatformImportData@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: ${{ parameters.connectionName }}
    DataFile: './dataFolderOrZip'

Importing non-solution-aware data using tools like the PAC CLI will not remove existing records; it only creates or updates data in your target environment. Records are matched by their unique GUID: if a record with the same GUID exists, it is updated; otherwise, a new record is created. This simplifies the release process because records deployed through the automated release process keep the same GUID across environments.

If you need a CD step that deletes records prior to the import task, use the Dataverse SDK (.NET Framework and .NET Core only) or the Dataverse Web API with any programming language. Libraries in various languages exist to support this task, for example Microsoft.Xrm.Data.PowerShell for PowerShell. A sketch of such a cleanup step is shown below.
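A minimal sketch using Microsoft.Xrm.Data.PowerShell; the table name, ID field, and connection approach are placeholders and should be adapted to your environment and authentication method:

``` pwsh
# Hypothetical cleanup step that removes existing configuration records before data import.
Install-Module Microsoft.Xrm.Data.PowerShell -Scope CurrentUser -Force
$conn = Connect-CrmOnline -ServerUrl 'https://myenv.crm.dynamics.com' -Credential (Get-Credential)

# Retrieve all records of the (hypothetical) configuration table and delete them one by one.
$records = Get-CrmRecords -conn $conn -EntityLogicalName 'sample_setting' -Fields 'sample_settingid' -AllRows
foreach ($record in $records.CrmRecords) {
    Remove-CrmRecord -conn $conn -CrmRecord $record
}
```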

Smoke Tests

Once a solution is imported, run smoke tests to validate that it works as expected. Since it is not possible to test every scenario, a post-deployment smoke test helps validate that the most critical scenarios are working as expected.
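For example, a minimal post-deployment smoke test sketch in PowerShell could simply verify that a deployed Power Pages site responds; the URL below is a placeholder:

``` pwsh
# Hypothetical smoke test: fail the pipeline step if the deployed site is not reachable.
$response = Invoke-WebRequest -Uri 'https://myenv.powerappsportals.com/' -UseBasicParsing
if ($response.StatusCode -ne 200) {
    throw "Smoke test failed: site returned HTTP $($response.StatusCode)"
}
Write-Host 'Smoke test passed: site is reachable.'
```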

CI/CD Work Diagram

sequenceDiagram
    participant DevOps
    participant Dev Environment
    participant Validation Environment
    participant Test Environment
    participant Production Environment
    rect rgb(50, 255, 255,0.2)
        note over DevOps, Production Environment: Continuous Integration
        DevOps ->> Dev Environment: Set Version
        Dev Environment ->> DevOps: Export Solution
        DevOps ->> DevOps: Extract Solution
        Dev Environment ->> DevOps: Export Data
        DevOps ->> DevOps: Validate Solution Correctness
        DevOps ->> Validation Environment: Create Environment
        DevOps ->>+ Production Environment: Copy Environment
        Production Environment -->>- Validation Environment: Copy Environment
        DevOps ->> Validation Environment: Execute Automated UI Tests
    end
    rect rgb(50, 255, 255,0.2)
        note over DevOps, Production Environment: Continuous Delivery
        loop Test Deployment
            DevOps ->> Test Environment : Import Solution
            DevOps ->> Test Environment : Import Data
            DevOps ->> Test Environment : Execute Smoke Tests
        end
        loop Production Deployment
            DevOps ->> Production Environment : Import Solution
            DevOps ->> Production Environment : Import Data
            DevOps ->> Production Environment : Execute Smoke Tests
        end
    end

References

Application lifecycle management with Microsoft Power Platform