Set up ALM Accelerator for Advanced Maker components - Pipeline for PowerApps

The ALM Accelerator components enable makers to apply source control strategies using Azure DevOps and to use automated builds and deployments of solutions to their environments, without manual intervention by the maker, administrator, developer, or tester. In addition, the ALM Accelerator lets makers work without intimate knowledge of the downstream technologies, so they can switch quickly from developing solutions to source controlling them and ultimately pushing their apps to other environments with as few interruptions to their work as possible.

This solution uses Azure DevOps for source control and deployments. You can sign up for Azure DevOps for free for up to 5 users on the Azure DevOps site.

The ALM Accelerator components solution doesn't have a dependency on other components of the CoE Starter Kit. It can be used independently.

[!NOTE] The pipelines in the ALM Accelerator components rely on some third-party extensions to fill gaps where native utilities aren't available or the effort to recreate the functionality is prohibitive. We recognize that this isn't ideal and are working toward eliminating these utilities as the solution grows and native capabilities become available. However, in the interest of providing a sample implementation of full end-to-end Power Platform pipelines, they're necessary at the moment. Where possible, the documentation calls out these third-party tools, their purpose, and their documentation / source code.

Document structure

The GETTINGSTARTED.md is structured into 7 main sections:

  • Prerequisites - Considerations and requirements for completing the setup.
  • Foundational Setup - Walks through the base setup of the ALM Accelerator for Advanced Makers: the steps and configurations required.
  • Development Project Setup - The steps required to set up a new development project, covering project-specific setup of Azure DevOps, generic build and deployment pipelines, service connections, Power Platform environments, and application users.
  • Solution Setup - Steps specific to each solution you want to support with the ALM Accelerator: setting up the solution-specific pipelines, branch policies, and deployment variables to support connection references, environment variables, and AAD group sharing.
  • Importing the Solution and Configuring the App - The steps required to import the ALM Accelerator for Advanced Makers canvas app and configure the included custom connector.
  • Using the ALM Accelerator App - A short introduction to using the AA4AM canvas app.
  • Troubleshooting - A few pointers on known issues and how to remediate them.

Table of Contents

Prerequisites

Environments

The application manages deploying solutions from Development to Validation, Testing, and Production. While you can initially set up your pipelines to use two environments (e.g. one for development and for deploying the ALM Accelerator solution, and one serving as Validation, Test, and Production), you'll ultimately want separate environments for at least Development, Validation, Test, and Production.

  • The environment into which you are deploying the ALM Accelerator app will need to be created with a Dataverse database. Additionally, any target environment requires a Dataverse database in order to deploy your solutions.

Users and Permissions

In order to complete the steps below you will need the following users and permissions in Power Platform, Azure DevOps and Azure.

  • A licensed Azure user with permissions to create and view AAD groups, create app registrations, and grant admin consent to app registrations in Azure Active Directory.
  • A licensed Azure DevOps user with permissions to create and manage pipelines, service connections, repos, and extensions.
  • A licensed Power Platform user with permissions to create application users and grant administrative permissions to the application user.

Connectors and DLPs

For the ALM Accelerator for Advanced Makers canvas app to work as expected, the following connectors must be available for use together in the environment into which the ALM Accelerator solution is imported.

  • Dataverse
  • Power Apps for Makers
  • ALM Accelerator Custom Azure DevOps

Foundational Setup

The following steps will guide you through setting up the foundations of the ALM Accelerator for Advanced Makers. These steps are general to the functionality of the ALM Accelerator and not project or solution specific.

Create an App Registration in your AAD Environment

Creating an App Registration for the ALM Accelerator is a one-time setup step that grants the app and the associated pipelines the permissions required to perform operations in Azure DevOps and Power Apps / Dataverse. To begin, sign in to portal.azure.com.

  1. Go to Azure Active Directory > App registrations.

  2. Select New registration, give the registration a name (e.g. ALMAcceleratorServicePrincipal), leave all other options as default, and select Register.

  3. Select API permissions > + Add a permission.

  4. Select Dynamics CRM and configure permissions as follows:

  5. Select Delegated permissions, and then select user_impersonation.

  6. Select Add permissions.

  7. Repeat the steps above to add permissions for:

    • PowerApps-Advisor (Analysis All). Required for running static analysis via App Checker (https://docs.microsoft.com/en-us/power-platform/alm/checker-api/overview). This permission can be found under APIs my organization uses.

    • Azure DevOps. Required for connecting to Azure DevOps via the custom connector in the ALM Accelerator App. This permission can be found under APIs my organization uses.

    • When adding the Azure DevOps permission, go to APIs my organization uses, search for Azure DevOps, and copy the Application (client) ID.

      [!IMPORTANT] Disambiguation: We'll use this value later and specifically call it out as the Azure DevOps Application (client) ID, which is different from the Application (client) ID copied in Step 12 below.

  8. After adding permissions in your App Registration, select Grant admin consent for (your tenant).

  9. Select Certificates & Secrets and select New client secret.

  10. Set the Expiration and select Add.

  11. After adding the secret, copy the value and store it somewhere safe for later use.

  12. Return to the Overview section of your App Registration and copy the Application (client) ID and Directory (tenant) ID.

    [!IMPORTANT] Disambiguation: We'll use this value later and call it out as the Application (client) ID, which is different from the Azure DevOps Application (client) ID copied in Step 7 above.

  13. Select Add a Redirect URI > Add a Platform > Web

  14. Set the Redirect URI to https://global.consent.azure-apim.net/redirect

    [!NOTE] You may need to update this later, after installing the app, if this URL differs from the Redirect URL populated in the custom connector.

  15. Select Configure

Give Power App Management Permission to your App

For the pipelines to perform certain actions against the environments in your Power Platform tenant, you need to grant Power App management permissions to your app registration. To do so, run the following PowerShell cmdlet once, as an interactive user with Power Apps administrative privileges, after your app registration has been created. The command gives the service principal permission to execute environment-related functions, including querying for environments and connections, via Microsoft.PowerApps.Administration.PowerShell. For more information on the New-PowerAppManagementApp cmdlet, see https://docs.microsoft.com/en-us/powershell/module/microsoft.powerapps.administration.powershell/new-powerappmanagementapp?view=pa-ps-latest

```powershell
# Install the Power Apps administration and maker PowerShell modules
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell
Install-Module -Name Microsoft.PowerApps.PowerShell -AllowClobber

# Register the app registration as a Power App management app
New-PowerAppManagementApp -ApplicationId [the Application (client) ID you copied when creating your app registration]
```

Install Azure DevOps Extensions

The ALM Accelerator uses several Azure DevOps extensions, including some third-party extensions available in the Azure DevOps Marketplace. Install the following extensions under Organization Settings in Azure DevOps. For more information about Microsoft and third-party Azure DevOps extensions, see https://docs.microsoft.com/en-us/azure/devops/marketplace/trust?view=azure-devops. In addition, each third-party extension's web page and a link to its source code are provided below.

  1. Go to https://dev.azure.com and select Organization settings
  2. Select General > Extensions
  3. Install the following Extensions

Clone the YAML Pipelines from GitHub to your Azure DevOps instance

  1. Go to https://dev.azure.com/ and sign in to Azure DevOps (AzDO).

  2. Create a new project or select an existing project.

  3. Go to Repos and select Import repository from the repository dropdown.

  4. Enter https://github.com/microsoft/coe-alm-accelerator-templates as the Clone URL and select Import.

    [!NOTE] The AzDO repo you created above is where the solution pipeline templates and the export / import pipelines will run. Later, when you create the pipelines for your solutions, you may need to reference this specific project/repo if you choose to source control your solutions in another AzDO repo.

Create Pipelines for Import, Delete and Export of Solutions

Follow the steps below to create the following pipelines based on the YAML in the DevOps repo. These pipelines will run when you commit to Git, import a solution, or delete a solution from the app, respectively.

| YAML File | Pipeline Name |
|---|---|
| export-solution-to-git.yml | export-solution-to-git |
| import-unmanaged-to-dev-environment.yml | import-unmanaged-to-dev-environment |
| delete-unmanaged-solution-and-components.yml | delete-unmanaged-solution-and-components |

  1. In Azure DevOps go to Pipelines and create a new pipeline.
  2. Select Azure Repos Git for your code repository and point to the Azure DevOps repo you created and seeded with the pipeline templates in the steps above.
  3. On the Configure your pipeline page, select Existing Azure Pipelines YAML file, point to /Pipelines/export-solution-to-git.yml, /Pipelines/import-unmanaged-to-dev-environment.yml, or /Pipelines/delete-unmanaged-solution-and-components.yml, and select Continue.
  4. On the next screen, select Save, then select the 3 dots next to Run Pipeline and select Rename/Move.
  5. Update the pipeline name to export-solution-to-git, import-unmanaged-to-dev-environment, or delete-unmanaged-solution-and-components, and select Save.

Get the Pipeline ID for the Export Solution Pipeline to use for global variables

For the next step you will need to get the Pipeline ID that the build pipelines use to find resources required for the build process.

  1. Open the export-solution-to-git pipeline and copy the pipeline ID from the address bar (e.g. if the URL for the pipeline is https://dev.azure.com/org/project/_build?definitionId=39, the Pipeline ID for this pipeline is 39).
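As a quick illustration, the ID is simply the query-string value after definitionId=. A minimal POSIX shell sketch (the URL below is a made-up example, not a real project):

```shell
# Hypothetical pipeline URL copied from the browser address bar
url="https://dev.azure.com/org/project/_build?definitionId=39"

# Everything after "definitionId=" is the pipeline ID
pipeline_id="${url##*definitionId=}"
echo "$pipeline_id"
```

You can of course read the value by eye; the snippet only shows where the ID sits in the URL.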

Create Pipeline global variables

  1. In Azure DevOps, select Pipelines > Library and create a new variable group.

  2. Name the Variable Group global-variable-group.

    [!NOTE] The pipelines reference this specific variable group, so it has to be named exactly as shown. Also, be sure to create all of the variables below; the pipelines depend on them being present and configured in order to execute (for example, if you don't yet have a production environment while configuring the variable group, use the Test service connection for the time being).

  3. Add the following Variables to the variable group

    | Name | Value |
    |---|---|
    | CdsBaseConnectionString | AuthType=ClientSecret;ClientId=$(ClientId);ClientSecret=$(ClientSecret);Url= |
    | ClientId | [The Application (client) ID you copied when creating the App Registration] |
    | ClientSecret | [The Application (client) Secret you copied when creating the App Registration] NOTE: It's recommended that you secure this value by selecting the lock next to the value so others can't see your secret. |
    | TenantID | [The Directory (tenant) ID you copied when creating the App Registration] |
    | PipelineIdToLoadJsonValuesFrom | [The pipeline ID for export-solution-to-git copied in the previous step] |
    | ValidationServiceConnection | [The URL of the validation instance of Dataverse, e.g. https://deploy.crm.dynamics.com/] NOTE: This must be identical to the Azure DevOps validation environment service connection name you specified previously, including any trailing forward slash. This environment is used to run solution checker during the build process. |
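Once the group exists, a pipeline consumes it through a variables section. A minimal sketch (the group name is the literal one required above; everything else about the pipeline is omitted):

```yaml
variables:
  - group: global-variable-group
```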

Update Permissions for the Project Build Service

[!IMPORTANT] There are a number of "Build Service" accounts in Azure DevOps that may confuse the steps below. Pay close attention to the names / format specified in Step 3 and 5 below. You may need to search for the specific account if it doesn't show up in the initial list.

  1. In Azure DevOps Select Project Settings in the left hand navigation.

  2. Select Repositories > Security.

  3. Find and select Project Collection Build Service ([Your Organization Name]) under Users.

    [!NOTE] In some cases you may not see your organization name after the Project Collection Build Service user; it may just show a unique identifier, and you may need to use the search function to find this user. Select this user.

  4. Set the following permissions for the Build Service user.

    | Permission | Value |
    |---|---|
    | Contribute | Allow |
    | Contribute to pull requests | Allow |
    | Create branch | Allow |
  5. Find and select the user name [Your Project Name] Build Service ([Your Organization Name]) under Users and set the same values as above.

Development Project Setup

The following section guides you through the setup steps required for each of the development projects you'll support. In this context, a development project signifies the required infrastructure and configuration needed to support healthy ALM, including configuration of the Dataverse environment that will support the ALM process.

Create an App User in your Dataverse Environments

Each environment (i.e. Development, Validation, Test, and Production) needs an application user. For each of your environments, follow the steps below to set up the application user.

  1. Go to https://make.powerapps.com and select your environment

  2. Select the gear icon in the upper right-hand corner and select Advanced Settings.

  3. Select Settings > Security > Users.

  4. Under System Views, select the Application User view.

  5. Select New, then select the Application User form when the user form loads.

  6. In the Application ID field, paste the Application (client) ID you copied when creating your App Registration, then select Save.

  7. After the user is created assign the user a Security Role.

    [!NOTE] It's recommended you give this user System Administrator rights to be able to perform the required functions in each of the environments.

  8. Repeat these steps as needed for each of your environments (i.e. Development, Validation, Test and Production).

Create Service Connections for DevOps to access Power Platform

Each Dataverse environment (i.e. Development, Validation, Test, and Production) needs a Power Platform service connection in DevOps. For each of your environments, follow the steps below to set up the service connection.

[!NOTE] Users of the ALM Accelerator for Advanced Makers app will only see environments for which they have either User or Administrator role on the Service Connection in Azure DevOps. If using personal development environments all developers should have User or Administrator role for the Service Connection for their own development environment.

  1. Go to https://dev.azure.com and select your Project

  2. Under Project Settings in your Azure DevOps project select the Service connections menu item.

  3. Select Create/New service connection, search for Power Platform, select the Power Platform service connection type, and select Next.

  4. In the Server URL, enter your environment URL (e.g. https://myorg.crm.dynamics.com/). You must include the trailing forward slash.

  5. Enter the same value for the Service Connection Name. Again, you must include the trailing forward slash.

    [!IMPORTANT] Currently, the ALM Accelerator uses the service connection name to identify the service connection to use per environment, so this needs to be the same URL you entered above, including the trailing forward slash.

  6. Enter the Tenant ID, Application (client) ID and Client Secret you copied from AAD when you created your App Registration and select Save.

  7. For users to be able to use the service connection from the ALM Accelerator app, the service connections must grant the User permission to all users who will use them. Update permissions as follows:

    • Select the service connection to be shared with users from the Service Connections list.

    • Select the 3 dots in the top right corner and select Security.

    • Select the group or user you want to grant the User permission to in the dropdown.

    • Select the User role and select Add.

  8. Repeat these steps as needed for each of your environments (i.e. Development, Validation, Test and Production).

Solution Setup

When you create a solution in Dataverse you'll need to create pipelines specifically for that solution. Follow these steps for creating pipelines for your solution in Azure DevOps. There are sample pipelines included in the Pipeline directory in the CoE ALM Templates repo.

The sample pipelines provide flexibility for organizations to store their pipeline templates in a separate project or repo from the solution-specific pipeline YAML. Follow the steps below to configure your solution pipeline, and repeat them for each of the solutions you'll source control with the ALM Accelerator.

[!IMPORTANT] The pipeline YAML for your solution pipeline is always stored in the same repo in which you source control your solution. However, the pipeline templates (i.e. the folder Pipeline\Templates) can exist either in the same repo as your solution pipeline YAML or in a separate repo and/or project.

Validate Your Setup Using the ALM Accelerator Sample Solution (Optional)

The steps below provide generic step-by-step instructions for creating pipelines to handle the application lifecycle of your solution. Because these steps are generic, they can be difficult to follow without context, so we've created a similar step-by-step setup guide for getting started with a sample solution we've built. That walkthrough provides specific context for when you're ready to create and configure your own pipelines, and validates the setup steps performed above. To validate your setup and complete the sample solution walkthrough, follow the steps in the Sample Solution Setup Guide.

Create the Solution Build and Deployment Pipeline(s)

Solution pipelines are used to build and deploy your source-controlled solutions to environments in your tenant. You can create as many solution pipelines as needed based on your organization's environment strategy. The sample pipelines provided assume only three environments (Validation, Test, Production), but more or fewer can be created as needed, with specific triggers in the pipelines or without triggers so they can be run manually. The sample deployment pipelines trigger off changes to a branch (i.e. Test and Production) or run as part of a branch policy in Azure DevOps (i.e. Validation). See Setting Branch Policies for Pull Request Validation below for more information on branch policies.

The following steps show how to create a pipeline from the sample pipeline YAML (build-deploy-validation-SampleSolution.yml). Follow these steps to create all of your deployment pipelines.

[!NOTE] The following steps create pipelines that build and deploy for each environment (Validation, Test, and Production). However, you may want to build and deploy only for Validation and Test, and then deploy the artifacts from the Test build to Production. Instructions for the latter are included in the section following this one. If that's your preferred method, follow the steps below for only the Validation and Test environments, then skip to the next section to see how to configure your release pipeline.

  1. In Azure DevOps, go to the repo that contains the Pipelines folder you committed and select the Pipelines folder.

  2. Open the sample deployment pipeline (i.e. build-deploy-validation-SampleSolution.yml, build-deploy-test-SampleSolution.yml, or build-deploy-prod-SampleSolution.yml) and copy the YAML to use in your new pipeline. Note the name of this repo for use in your pipeline.

  3. Navigate to the Repo where you want to source control your solution.

  4. Create a new branch based on your default branch in the repo, with the name of your solution (e.g. MyNewSolution).

    [!NOTE] This branch will be the 'v-next' branch for your solution in the repo. All development work should be branched from this branch to a developer's personal working branch and then merged into the v-next branch in order to push to Validation and Testing. Later, when a release is ready, the v-next branch can be merged into the main or default branch.

  5. Select New from the top menu and then Folder.

  6. Give the new folder the same name as your solution (e.g. MyNewSolution) and give the new pipeline YAML file a name (e.g. build-deploy-validation-SampleSolution.yml, build-deploy-test-SampleSolution.yml, or build-deploy-prod-SampleSolution.yml). Select Create.

  7. Paste the YAML from build-deploy-validation-SampleSolution.yml, build-deploy-test-SampleSolution.yml, or build-deploy-prod-SampleSolution.yml into your new pipeline YAML file.

  8. Update the following values in your new Pipeline YAML.

    • Change the resources -> repositories -> name to the repo name that contains your pipeline templates. If your template repository is in another AzDO project, you can use the format projectname/reponame here. In this case the repo is called coe-alm-accelerator-templates and exists in the same project as our MyNewSolution repo. Additionally, you can specify the branch where your templates live using the ref parameter, if required.

    • Change any value that references SampleSolutionName to the unique name of your solution (e.g. MyNewSolution).

    • Select Commit to save your changes.

  9. In Azure DevOps, go to Pipelines and create a new pipeline.

  10. Select Azure Repos Git for your code repository.

  11. Select the Azure DevOps repo that contains the deployment pipeline YAML.

  12. On the Configure your pipeline page, select Existing Azure Pipelines YAML file, point to the YAML file in your repo that you created in step 6, and select Continue.

  13. On the next screen, select Save, then select the 3 dots next to Run Pipeline and select Rename/Move.

  14. Update the pipeline name to deploy-validation-MyNewSolution, deploy-test-MyNewSolution, or deploy-prod-MyNewSolution (where 'MyNewSolution' is the name of your solution) and select Save.

  15. Update the Default branch for manual and scheduled builds.

    [!NOTE] If your new pipeline was not created in the default branch of the repo, you may need to update the Default branch for manual and scheduled builds. For more information, see Configure pipeline triggers - Azure Pipelines | Microsoft Docs.

    • Select Edit on your new pipeline.

    • Select the 3 dots on the top right and select Triggers.

    • Select the YAML tab and select Get Sources.

    • Update the Default branch for manual and scheduled builds to point to your solution branch.

  16. Repeat the steps above to create a deployment pipeline for each of your environments referencing the sample deployment pipeline yaml from the coe-alm-accelerator-templates repo (i.e. deploy-validation-SampleSolution.yml, deploy-test-SampleSolution.yml and deploy-prod-SampleSolution.yml).

  17. Select Save and queue, then select Save.
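The repository resource you edit in step 8 can be sketched in YAML as follows. The repository alias, project name, and branch shown here are illustrative assumptions, not required values:

```yaml
resources:
  repositories:
    # Repo that holds the shared pipeline templates; use projectname/reponame
    # if the template repo lives in a different AzDO project.
    - repository: PipelineRepo    # local alias referenced by template steps
      type: git
      name: MyProject/coe-alm-accelerator-templates
      ref: main                   # optional: branch where your templates live
```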

Create the Solution Deployment Pipeline(s) (Optional)

As mentioned in the note above, the previous section creates pipelines that build and deploy for each environment (Validation, Test, and Production). However, if you want to build and deploy only for Validation and Test, and then deploy the artifacts from the Test build to Production, follow these instructions to create your production deployment pipeline after you've created your build and deploy pipelines for Validation and Test above.

  1. In Azure DevOps, go to the repo that contains the Pipelines folder you committed and select the Pipelines folder.

  2. Open the sample deployment pipeline (i.e. deploy-prod-pipelineartifact-SampleSolution.yml) and copy the YAML to use in your new pipeline. Note the name of this repo for use in your pipeline.

  3. Navigate to the Repo where you want to source control your solution.

  4. Select New from the top menu and then File.

  5. Give the new pipeline YAML file a name (e.g. deploy-prod-MyNewSolution.yml) and select Create.

  6. Paste the YAML from deploy-prod-pipelineartifact-SampleSolution.yml into your new pipeline YAML file.

  7. Update the following values in your new Pipeline YAML.

    • Update the trigger -> branches -> include to the branch(es) for which changes should trigger a deployment to production.

    • Change the resources -> repositories -> name to the repo name that contains your pipeline templates. If your template repository is in another AzDO project, you can use the format projectname/reponame here. In this case the repo is called coe-alm-accelerator-templates and exists in the same project as our MyNewSolution repo. Additionally, you can specify the branch where your templates live using the ref parameter, if required.

    • Update resources -> pipelines -> source to specify the build pipeline that contains the artifacts to be deployed by this pipeline. In this case we're going to deploy the artifacts from our Test pipeline, created above, which built and deployed our solution to the Test environment.

    • Change any value that references SampleSolutionName to the unique name of your solution (e.g. MySolutionName).

  8. Repeat the same steps performed for deploy-validation-ALMAcceleratorSampleSolution and deploy-test-ALMAcceleratorSampleSolution to create a pipeline from the new production pipeline YAML, called deploy-prod-ALMAcceleratorSampleSolution.
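The trigger and pipeline-artifact resource edited above might look like the following in YAML. The pipeline alias, branch, and source pipeline name are illustrative assumptions for a solution called MyNewSolution:

```yaml
# Deploy to Production using artifacts produced by the Test build
trigger:
  branches:
    include:
      - main    # branch whose changes should trigger a production deployment

resources:
  pipelines:
    # Reference the build pipeline whose artifacts this pipeline deploys
    - pipeline: TestBuild                   # local alias for the artifact source
      source: deploy-test-MyNewSolution     # name of the Test build pipeline
      trigger: true                         # optionally run when that pipeline completes
```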

Importing Data from your Pipeline

In many cases there is configuration or seed data that you'll want to import into your Dataverse environment after deploying your solution to the target environment. The pipelines are configured to import data using the Configuration Migration tool, available via NuGet (https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf). To add configuration data for your pipeline, use the following steps. For more information on the Configuration Migration tool, see https://docs.microsoft.com/en-us/power-platform/admin/manage-configuration-data

  1. Clone the AzDO Repo where your solution is to be source controlled and where you created your solution pipeline YAML to your local machine.

  2. Install the Configuration Migration tool per the instructions here https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/download-tools-nuget

  3. Open the Configuration Migration tool, select Create schema, and select Continue.

  4. Log in to the tenant from which you want to export your configuration data.

  5. Select your environment.

  6. Select the specific tables and columns you want to export for your configuration data.

  7. Select Save and Export, and save the data to a folder called ConfigurationMigrationData in your local Azure DevOps repo, under the solution folder for which this configuration data is to be imported.

    [!NOTE] The pipeline looks for this specific folder to run the import after your solution is imported. Ensure that the name of the folder and its location match exactly.

  8. When prompted to export the data, select Yes.

  9. Choose the same location for your exported data, select Save, then select Export Data.

  10. When the export is complete, unzip the files from data.zip to the ConfigurationMigrationData directory and delete the data.zip file.

  11. Finally, commit the changes with your data to Azure DevOps.

Setting Branch Policies for Pull Request Validation

To execute the build pipeline for your solution when a pull request is created, you'll need to create a branch policy that runs the pipeline you created in the previous step. Use the following steps to set your branch policy. For more information on branch policies, see https://docs.microsoft.com/en-us/azure/devops/repos/git/branch-policies?view=azure-devops

  1. In Azure DevOps, go to Repos and select the Branches folder.

  2. Locate the target branch on which you want to run the pull request policy, select the ellipsis to the right of the target branch, and select Branch Policies.

  3. On the Branch Policies screen, go to Build Validation.

  4. Select the + button to add a new branch policy.

  5. Select the Pipeline you just created from the Build pipeline dropdown

  6. Specify a Path filter (if applicable). The path filter will ensure that only changes to the path specified will trigger the pipeline for your Pull Request.

  7. Set the Trigger to Automatic

  8. Set the Policy requirement to Required

  9. Set the Build expiration to Immediately

  10. Set a Display name for your Branch Policy (e.g. PR Build Validation)

  11. Click Save

    image-20210301104042544
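The same Branch Policy can also be created through the Azure DevOps Policy Configurations REST API. The sketch below only builds the request body for `POST https://dev.azure.com/{org}/{project}/_apis/policy/configurations?api-version=6.0`; the helper name and ID values are placeholders, and the setting names mirror the manual steps above (Automatic trigger, Required, expire Immediately) as an assumption based on the public API shape, not the accelerator's own code:

```python
# Well-known policy type ID for Build validation policies in Azure DevOps.
BUILD_POLICY_TYPE_ID = "0609b952-1397-4640-95ec-e00a01b2c241"

def build_validation_policy(repo_id: str, branch: str, build_definition_id: int,
                            display_name: str, path_filter: str = "") -> dict:
    """Build the JSON body for a required, automatically-triggered
    build validation policy on one branch."""
    settings = {
        "buildDefinitionId": build_definition_id,
        "displayName": display_name,
        "queueOnSourceUpdateOnly": True,   # Trigger: Automatic
        "manualQueueOnly": False,
        "validDuration": 0,                # Build expiration: Immediately
        "scope": [{"repositoryId": repo_id,
                   "refName": f"refs/heads/{branch}",
                   "matchKind": "Exact"}],
    }
    if path_filter:
        settings["filenamePatterns"] = [path_filter]  # optional Path filter
    return {"isEnabled": True,
            "isBlocking": True,            # Policy requirement: Required
            "type": {"id": BUILD_POLICY_TYPE_ID},
            "settings": settings}
```

Post the returned dictionary with your PAT to create the policy programmatically.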

Setting Deployment Pipeline Variables

The ALM Accelerator uses JSON formatted Pipeline variables for updating connection references and environment variables, setting permissions for AAD Groups and Dataverse teams, sharing Canvas Apps, and updating ownership of solution components such as Power Automate flows. The EnvironmentName and ServiceConnection variables are required for each pipeline. All other pipeline variables are optional and depend on what type of components your solution pipelines deploy. For instance, if your solutions only contain Dataverse Tables, Columns and Model Driven Apps, some of these steps may not be necessary and can be skipped. The following variables allow you to fully automate the deployment of your solutions and specify how to configure items that are specific to the environment to which the solution is being deployed.

[!IMPORTANT] These pipeline variables will be set for each deployment pipeline you've configured above based on the environment to which your pipeline deploys.

Create Environment and Service Connection (Required)

These variables are required by every deployment pipeline. The Environment variable is EnvironmentName and the Service Connection variable is ServiceConnection.

The EnvironmentName variable is used to specify the Azure DevOps environment being deployed to, in order to enable tracking deployment history and to set permissions and approvals for deployment to specific environments. Depending on the environment to which you're deploying, set this value to Validate, Test or Production. For more information on Environments in Azure DevOps, see https://docs.microsoft.com/en-us/azure/devops/pipelines/process/environments.

image-20210414170154479

The ServiceConnection variable is used to specify how the deployment pipeline connects to the Power Platform. The values used for the ServiceConnection variable are the names of the Service Connections created above in Create a Service Connection for DevOps to access Power Platform.

image-20210414170210916

Create Connection Reference Pipeline Variable (Optional)

The connection reference variable is ConnectionReferences. This pipeline variable is used for setting connection references in your solution to specific connections configured in a target environment after the solution is imported into an environment. Additionally, the ConnectionReferences variable is used to enable flows after the solution is imported based on owner of the connection specified in the variable.

  1. You will need to create the connections manually in your target environments and copy the IDs for the connection to use in the JSON value below

  2. The format of the JSON for these variables takes the form of an array of name/value pairs.

    [
       [ 
         "connection reference1 schema name",
         "my environment connection ID1"
       ],
       [
         "connection reference2 schema name",
         "my environment connection ID2"
       ]
    ]
    • The schema name for the connection reference can be obtained from the connection reference component in your solution. image.png

    • The connection id can be obtained via the url of the connection after you create it. For example the id of the connection below is 9f66d1d455f3474ebf24e4fa2c04cea2 where the url is https://.../connections/shared_commondataservice/9f66d1d455f3474ebf24e4fa2c04cea2/details# image.png

  3. Once you've gathered the connection reference schema names and connection ids, go to the pipeline for your solution that you created above and select Edit -> Variables.

  4. On the Pipeline Variables screen create the ConnectionReferences pipeline variable.

  5. Set the value to the JSON formatted array of connection reference schema names and connection ids.

    • For the example above the values look like the following image.png
  6. Where applicable repeat the steps above for each solution / pipeline you create.
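As a sanity check before saving the ConnectionReferences variable, you can extract the connection ID from the connection URL and verify the value parses as an array of pairs. This is an illustrative sketch; the helper names are made up and not part of the accelerator:

```python
import json
import re

def connection_id_from_url(url: str) -> str:
    """Pull the 32-character connection ID out of a maker-portal connection URL."""
    m = re.search(r"/connections/[^/]+/([0-9a-f]{32})/", url)
    if not m:
        raise ValueError("no connection id found in url")
    return m.group(1)

def validate_connection_references(value: str) -> list:
    """Check the pipeline variable is a JSON array of [schema name, connection id] pairs."""
    pairs = json.loads(value)
    for p in pairs:
        if not (isinstance(p, list) and len(p) == 2):
            raise ValueError(f"expected [schema name, connection id] pair, got: {p!r}")
    return pairs
```

For the URL in the example above, `connection_id_from_url` returns `9f66d1d455f3474ebf24e4fa2c04cea2`.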

Create Environment Variable Pipeline Variable (Optional)

The environment variable pipeline variable is EnvironmentVariables. This pipeline variable is used for setting Dataverse Environment variables in your solution after the solution is imported into an environment.

  1. The format of the JSON for these variables takes the form of an array of name/value pairs.

    [
       [
          "environment variable1 schema name",
          "environment variable1 value"
       ],
       [
          "environment variable2 schema name",
          "environment variable2 value"
       ]
    ]
    • The schema name for the environment variable can be obtained from the environment variable component in your solution. image.png
  2. Once you've gathered the environment variable schema names and values, go to the pipeline for your solution that you created above

  3. Click Edit -> Variables

  4. On the Pipeline Variables screen create the EnvironmentVariables pipeline variable.

  5. Set the value to the JSON formatted array of environment variable schema names and values.

  6. For the example above the values look like the following image.png

  7. Where applicable repeat the steps above for each solution / pipeline you create.
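Inside a pipeline step the EnvironmentVariables value is plain JSON, so a consuming script can turn it into a schema-name lookup in one line. This is a sketch of how such a script might parse it, not the accelerator's actual implementation (the function name and sample schema names are hypothetical):

```python
import json

def parse_environment_variables(value: str) -> dict:
    """Turn the EnvironmentVariables pipeline variable (an array of
    [schema name, value] pairs) into a schema-name -> value map."""
    return {schema_name: var_value for schema_name, var_value in json.loads(value)}
```

For example, `parse_environment_variables('[["cat_TextVar","hello"]]')` yields `{"cat_TextVar": "hello"}`.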

Create AAD Group Canvas Configuration Pipeline Variable (Optional)

The aad group canvas configuration pipeline variable is AadGroupCanvasConfiguration. This pipeline variable is used for sharing canvas apps in your solution with specific Azure Active Directory Groups after the solution is imported into an environment.

  1. The format of the JSON for these variables takes the form of an array of objects. The roleName can be one of CanView, CanViewWithShare or CanEdit.

    [
     {
         "aadGroupId": "azure active directory group id",
         "canvasNameInSolution": "canvas app schema name1",
         "roleName": "CanView"
     },
     {
         "aadGroupId": "azure active directory group id",
         "canvasNameInSolution": "canvas app schema name2",
         "roleName": "CanViewWithShare"
     },
     {
         "aadGroupId": "azure active directory group id",
         "canvasNameInSolution": "canvas app schema name1",
         "roleName": "CanEdit"
     }
    ]
    • The schema name for the Canvas App can be obtained from the Canvas App component in your solution. image.png

    • The azure active directory group id can be obtained from the Group blade in Azure Active Directory from the Azure Portal. image.png

  2. Once you've gathered the Canvas App schema names and aad group ids go to the pipeline for your solution that you created above

  3. Click Edit -> Variables

  4. On the Pipeline Variables screen create the AadGroupCanvasConfiguration pipeline variable.

  5. Set the value to the JSON formatted array of objects per the sample above.

  6. For the example above the values look like the following image.png

  7. Where applicable repeat the steps above for each solution / pipeline you create.
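Because an invalid roleName only surfaces when the pipeline runs, a quick validation of the AadGroupCanvasConfiguration value before saving it can save a failed deployment. A minimal sketch (the function name is hypothetical; the allowed role names come from the sample above):

```python
import json

ALLOWED_ROLES = {"CanView", "CanViewWithShare", "CanEdit"}

def validate_canvas_sharing(value: str) -> list:
    """Check each entry has the required keys and a valid roleName."""
    entries = json.loads(value)
    for e in entries:
        missing = {"aadGroupId", "canvasNameInSolution", "roleName"} - set(e)
        if missing:
            raise ValueError(f"missing keys: {missing}")
        if e["roleName"] not in ALLOWED_ROLES:
            raise ValueError(f"invalid roleName: {e['roleName']}")
    return entries
```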

Create AAD Group / Team Configuration Pipeline Variable (Optional)

The pipeline variable is AadGroupTeamConfiguration. This pipeline variable is used for mapping Dataverse Teams and Roles to specific Azure Active Directory Groups after the solution is imported into an environment. The security roles will need to be added to your solution if they are not manually created in the target environment.

  1. The format of the JSON for these variables takes the form of an array of objects. One or many roles can be applied to any given team, and these roles provide permissions to solution components required by the users in the group.

    [
     {
         "aadGroupTeamName": "dataverse team1 name to map",
         "aadSecurityGroupId": "azure active directory group id1",
         "dataverseSecurityRoleNames": [
             "dataverse role1 to apply to the team"
         ]
     },
     {
         "aadGroupTeamName": "dataverse team2 name to map",
         "aadSecurityGroupId": "azure active directory group id2",
         "dataverseSecurityRoleNames": [
             "dataverse role2 to apply to the team"
         ]
     }
    ]
    • The Dataverse team name can be any existing team or a new team to be created in Dataverse and mapped to an AAD Group after the solution is imported via the pipeline.

    • The azure active directory group id can be obtained from the Group blade in Azure Active Directory from the Azure Portal.

    image.png

    • The Dataverse role can be any Security Role in Dataverse that would be applied to the existing or newly created Team after the solution is imported via the pipeline. The role should have permissions to the resources required by the solution (e.g. Tables and Processes)
  2. Once you've gathered the team names, aad group ids and roles go to the pipeline for your solution that you created above. Click Edit -> Variables

  3. On the Pipeline Variables screen create the AadGroupTeamConfiguration pipeline variable.

  4. Set the value to the JSON formatted array of objects per the sample above.

  5. For the example above the values look like the following image.png

  6. Where applicable repeat the steps above for each solution / pipeline you create.
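Since each entry in the sample above can carry several roles, a consuming script typically flattens the configuration into one (team, group, role) assignment per row. This sketch shows that shape; the function name is hypothetical and not part of the accelerator:

```python
import json

def team_role_assignments(value: str) -> list:
    """Flatten AadGroupTeamConfiguration into (team name, AAD group id, role) tuples."""
    out = []
    for entry in json.loads(value):
        for role in entry["dataverseSecurityRoleNames"]:
            out.append((entry["aadGroupTeamName"],
                        entry["aadSecurityGroupId"],
                        role))
    return out
```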

Create Solution Component Ownership Pipeline Variable (Optional)

The pipeline variable is SolutionComponentOwnershipConfiguration. This variable is used for assigning ownership of solution components to Dataverse Users after the solution is imported into an environment. This is particularly useful for components such as Flows, which by default are owned by the Service Principal user when the solution is imported by the pipeline, and which organizations may want to reassign after import. Additionally, the SolutionComponentOwnershipConfiguration variable is used to enable flows that don't have any connection references. When no connection references are found, the flow will be enabled by the user specified in this variable.

[!NOTE] The current pipeline only implements the ability to set ownership of Flows. The ability to assign other components to users could be added in the future.

  1. The format of the JSON for these variables takes the form of an array of objects.

    [
     {
         "solutionComponentType": solution component1 type code,
         "solutionComponentUniqueName": "unique id of the solution component1",
         "ownerEmail": "new owner1 email address"
     },
     {
         "solutionComponentType": solution component2 type code,
         "solutionComponentUniqueName": "unique id of the solution component2",
         "ownerEmail": "new owner2 email address"
     }
    ]
    • The solution component type code is based on the component types specified in https://docs.microsoft.com/en-us/dynamics365/customer-engagement/web-api/solutioncomponent?view=dynamics-ce-odata-9 (e.g. a Power Automate Flow is component type 29). The component type should be specified as an integer value (i.e. with no quotes).
    • The unique name of the solution component, in the case of a Power Automate Flow, has to be taken from the unpacked solution. This is a limitation of flows, which currently don't require unique names when they are created. As such, the only true unique identifier for a Flow is the internal ID the system uses to identify it in a solution. image.png image.png
    • The owner email can be gathered from the user's record in Dataverse or Office 365.
  2. Once you've gathered the component type codes, unique name of the components and owner emails go to the pipeline for your solution that you created above

  3. Click Edit -> Variables

  4. On the Pipeline Variables screen create the SolutionComponentOwnershipConfiguration pipeline variable.

  5. Set the value to the JSON formatted array of objects per the sample above.

  6. For the example above the values look like the following image.png

  7. Where applicable repeat the steps above for each deployment solution / pipeline you create.
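The most common mistake with this variable is quoting the component type code, which the note above warns against. A small pre-save check catches it (the function name is hypothetical; the type code 29 for Power Automate Flows comes from the solutioncomponent reference linked above):

```python
import json

FLOW_COMPONENT_TYPE = 29  # Power Automate Flow, per the solutioncomponent type codes

def validate_ownership_config(value: str) -> list:
    """Check each entry's solutionComponentType is an unquoted integer
    and the other expected keys are present."""
    entries = json.loads(value)
    for e in entries:
        if not isinstance(e["solutionComponentType"], int):
            raise ValueError("solutionComponentType must be an integer (no quotes)")
        for key in ("solutionComponentUniqueName", "ownerEmail"):
            if key not in e:
                raise ValueError(f"missing key: {key}")
    return entries
```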

Importing the Solution and Configuring the App

Install ALM Accelerator Solutions in Dataverse

[!NOTE] As of January 2021, before installing the solutions you will need to enable Power Apps Component Framework for Canvas apps from the Power Platform Admin Center by going to https://admin.powerplatform.microsoft.com/ selecting your environment and selecting Settings - Product - Features. From here select the toggle for Power Apps component framework for canvas apps to turn it on. image.png

  1. Download the latest managed solution from GitHub (https://github.com/microsoft/coe-starter-kit/releases).

    [!NOTE] The screenshot below is for reference as to where the managed solution exists under a release. The actual version should be the most recent release.

    image-20210430150752479

  2. Go to https://make.powerapps.com and select the environment you want to use to host the ALM Accelerator App

  3. Select Solutions from the left navigation.

  4. Click Import and Browse to the location of the managed solution you downloaded.

  5. Click Next and Next again.

  6. On the Connections page select or create a new connection to use to connect to Dataverse for the CDS DevOps connection.

  7. Click Import and wait for the solution to complete the import process.

Configure the Azure DevOps Custom Connector.

  1. In the Power App maker portal select your Environment and Select Data > Custom Connectors > CustomAzureDevOps

  2. Select Edit and go to the Security section and select Edit and set the following fields. image.png

    • Client ID: [The Application (client) ID you copied when creating the App Registration]
    • Client secret: [The client secret you copied when creating the App Registration]
    • Tenant ID: leave as the default common
    • Resource URL: [The Azure DevOps Application (client) ID you copied when adding permissions to your App Registration]
  3. Select Update Connector

  4. Verify that the Redirect URL on the Security page is populated with the URL https://global.consent.azure-apim.net/redirect. If the Redirect URL is anything other than https://global.consent.azure-apim.net/redirect, copy the URL, return to the app registration you created, and update the Redirect URI you set earlier to the updated URL.

  5. Verify the connector from the Test menu once you've completed the steps above.

    • Navigate to the Test menu.

    • Select New Connection and follow the prompts to create a new connection.

    • In the Power App maker portal select your Environment and Select Data > Custom Connectors > CustomAzureDevOps.

    • Select Edit and go to the Test section and find the GetOrganizations operation.

    • Select Test operation and verify the Response Status returned is 200.

    image-20210222135128137

Using the ALM Accelerator App

  1. Once the app is installed and configured launch it from your Environment under Apps.

    [!NOTE] When you first launch the app you may need to consent to the app using your connections.

  2. Select the Cog in the top right to select your Azure DevOps Organization, Project and Repo to which you'll push your changes and submit your pull requests, and select Save. image-20210303085854533

    [!NOTE] If you don't see your DevOps Organization / Project in the dropdown, double check that the Custom connector is working correctly after updating its Security settings.

  3. From the Environment Drop Down Select the Dataverse Environment in which you will be doing your development work. image-20210303085806618

    [!NOTE] In order for your Environment to show up in this drop down, a service connection in the Azure DevOps project you just selected is required (see Create a Service Connection for DevOps to access Power Platform). Additionally, verify that you've followed the steps to reconnect the flow above if you do not see any environments in the list.

  4. By default the unmanaged solutions in your Environment should be displayed in the main window with buttons to Push Changes and Create Pull Requests.

  5. To import an unmanaged solution from an existing Azure DevOps project to begin making changes, select the + Import Solutions button and select a solution and version.

    [!NOTE] The solutions available are based on previous builds of your pipelines. The idea is that others have previously built the solutions and you are pulling the latest copy of the solution into your new development environment. If the solution has never been built previously, begin with the next step.

    image-20210303085946610

  6. Once your solution is imported into Dataverse, or you've created a new unmanaged solution and made your customizations, you can push your changes to Git using the Push Changes to Git button for your solution.

    [!NOTE] Be sure to publish your changes before initiating the push. If a newly created solution doesn't show in your list immediately, click the Refresh button to reload all solutions.

    • Select an existing branch or create a new branch based on an existing branch and enter a comment. Use the hashtag notation e.g. #123 to link the changes to a specific work item in Azure DevOps and Select Commit. image-20210303085710535

    [!NOTE] There is an option to specify whether the latest changes contain Delete Components. This allows the user to specify whether to perform an update or an upgrade of the solution when it is deployed. An update will increase the performance of the pipelines and reduce the overall time to deploy.

    • When the push begins a waiting indicator will appear. If the push is successful a check mark will appear; otherwise a red X will appear. To see the progress of your push, select the progress indicator, which will take you to the running pipeline in Azure DevOps.
    • Repeat the pushes as you iterate on your solution.
  7. When you are ready to create a pull request for the changes to your branch select the Create Pull Request button.

    [!NOTE] Be sure to publish your changes before initiating the push.

    • Specify the Source and Target branch, enter a Title and Comment for your Pull Request, and select Create. image-20210303085543943
  8. Once a Pull Request is created for your changes the remaining steps to Merge and Release to Test occur in Azure DevOps. Depending on the Branch Policies and Triggers configured for your Target Branch, an Azure DevOps user can approve or reject your Pull Request based on their findings in the submitted changes and that status will appear in the App. Approving the PR will initiate the deployment of your solution to the Test environment. If the Pull Request is approved you will see the progress move to Test and a status based on the pipeline's success or failure in that stage.

    image-20210303085132733

  9. For Production a Pull Request will need to be created in Azure DevOps that merges the changes into your Production release branch. The same approval process will be required depending on your branch policies and once the PR is completed your solution will be pushed to Production. Once the pipeline for deploying to Production is finished you will see the status of the deployment in the App.

 

posted on 2021-05-18 08:43 by lingdanglfw