Announcing General Availability (GA) of Power Apps portals as mobile apps

Microsoft announced the general availability of Power Apps portals as progressive web apps, a capability first introduced as a preview feature, enabling makers to turn portals into apps with a native-app-like look and feel right from the portal studio in just a few steps! This is the announcement I had been waiting for…

Microsoft is leveraging Progressive Web App (PWA) technology to turn a portal into a multi-platform app that works on all platforms (Android/iOS/Windows) as well as all form factors (mobile/desktop/tablet).

With PWA, end users can pin a Power Apps portal as an app to the home screen of a mobile device or install it from an app store.

This great new feature comes with the following capabilities:

  • Create a low-code/no-code branded app in just a few steps from the portal studio
  • Manage a basic offline experience for your app
  • Distribute your app via the browser or app stores
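Under the hood, a PWA is described by a web app manifest that the browser uses to install the site as an app. As a rough illustration only (this is not the exact manifest portal studio generates, and all names and values here are hypothetical), such a manifest might look like this:

```json
{
  "name": "Contoso Customer Portal",
  "short_name": "Contoso",
  "start_url": "/",
  "display": "standalone",
  "theme_color": "#0078d4",
  "background_color": "#ffffff",
  "icons": [
    { "src": "/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

The `display: "standalone"` entry is what gives the installed portal its native-app-like chrome-free look.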

Please refer to the link below for more information –

Microsoft announced Tenant-level analytics for Power Automate Cloud flows

To better accommodate the rapid expansion of Power Platform adoption across the enterprise, service administrators depend on advanced monitoring solutions. Tenant-wide analytics are built on service telemetry data derived from Power Platform apps, bots, and services. These solutions allow environment admins to effectively monitor activities related to cloud flows across the organization. This update includes an expansion of the Data Export pipeline, making usage and inventory metrics for cloud flows available to Power Platform customers.

Please refer to the link below for more information –

Power Automate for desktop – February 2022 update

Microsoft released a new version of Power Automate for desktop, which includes the following features:

  1. Monitoring notification window for console attended desktop flow runs
  2. Performance improvements in flow designer
  3. New ‘Display custom form’ action (Preview)
  4. New actions to convert a file to base64 and vice versa
  5. New ‘Extract tables from PDF’ action
  6. New property for datatable variables
  7. Renaming variables to existing names is now permitted
  8. New ability to copy a subflow and paste in other flows
  9. Auto clean up of local actions log file
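Feature 4 above (converting a file to Base64 and back) mirrors standard Base64 encoding. As a quick illustration of the concept outside Power Automate (this is not the desktop flow action itself, just a minimal Python sketch of the same round trip):

```python
import base64

def file_to_base64(path: str) -> str:
    """Read a file's bytes and return them as a Base64 text string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

def base64_to_file(data: str, path: str) -> None:
    """Decode a Base64 text string and write the original bytes to a file."""
    with open(path, "wb") as f:
        f.write(base64.b64decode(data))
```

Encoding a file this way is handy when a flow needs to pass binary content (an image, a PDF) through a step that only accepts text.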

Please refer to the blog below for more information –

Announcing new Dataverse auditing features

Microsoft introduced a set of new audit features helping Dataverse administrators to stay compliant with internal and external auditing requirements and to manage log capacity at the same time.

Audit data is now also stored in a separate physical log storage so an organization’s audit log can grow to many terabytes in size without limiting available physical database storage. This change in physical storage is independent from the continued need for having sufficient storage entitlements for logs and database records.

What is the auditing feature?

The auditing feature logs changes that are made to customer records and user access so you can review the activity later. The auditing feature is designed to meet the auditing, compliance, security, and governance policies of many regulated enterprises.

Audit retention policy

Administrators can select an audit retention period from a dropdown menu or specify a custom retention period. Audit records are automatically deleted after the retention period is over, relieving administrators from the burden of deleting audit data manually. It is also possible to keep the audit log indefinitely by not specifying a retention period.

Audit settings
Audit settings with the option to specify a retention period.

Try it out now! Go to Audit data and user activity for security and compliance – Power Platform | Microsoft Docs to learn more.

Flexible audit data management

With new flexible audit deletion options, we are removing a major pain point our customers were facing frequently: how do I delete a subset of the audit logs while keeping the rest?

We are introducing a set of new audit deletion options so administrators can delete the logs of one or more audited tables, delete the user access logs, or delete logs up to a specific date in order to free up storage space. The new delete options are replacing the audit log management view and quarterly log partitions.

New audit deletion options
New options to delete audit data.

Try it out now! Go to Free up storage space – Power Platform | Microsoft Docs to learn more.

Use the BulkDelete API to tailor deletions to your specific needs. Try it out now! Go to Retrieve and delete the history of audited data changes (Microsoft Dataverse) – Power Apps | Microsoft Docs to learn more.

How to use Environment Variables with AKV Secrets in the CI/CD Pipeline when deploying applications

Why Azure Key vault?

Imagine you have an application that requires authenticating against a data source with a particular client ID or secret key. You would not want to expose those secret values to other authors or users of your application. In such a case you would want to store these sensitive values, which are important to the functioning of your application, in Azure Key Vault. In the example used in this blog, we store the client ID and authentication key as secrets in Azure Key Vault and reference those values when deploying from one environment to the next. This blog assumes that the data sources used in the development environment are different from the corresponding data sources used for QA, which requires updated values for the client ID and authentication key of those data sources. For the basic concepts of Azure Key Vault, please refer to the documentation.

Figure 1: Example of a secure application using Azure Key Vault for storing application secrets

Grant the SPN for the CI/CD Pipeline access to the target environment Key Vault

In this example, we have a ServiceNow connector used in an application within Power Platform, similar to the setup mentioned in the blog. Two different ServiceNow instances are used: one for development and another for testing purposes.

Figure 2: Flow of the ALM pipeline sequence

The service principal account that you are using needs to have read access to the target environment's Azure Key Vault. In this example I have a production Key Vault that I use to store secrets for my QA and production environments (it is recommended that you have separate Key Vaults for Dev, Test, and Production, unlike my setup, which is a demo environment).

Figure 3: Provide SPN account being used to run the Pipeline reader access

SPN account needs access to the target environment

To make sure that the SPN can deploy to the target environment, verify that the SPN has access to the target environment and is assigned the right security role there, which is typically the System Administrator role.

Figure 4: SPN access to the target environment to deploy the solution

Setting up the Azure DevOps service connection

To set up the service connections in Azure DevOps, go to the project settings and select Service Connections.


Then proceed to specify the connection details for the environment.

Figure 5: Specifying details for the service connection

You can do this for both your development and QA environments, or any other Power Platform environment you might be using in your pipeline. Ideally, you should use different SPN accounts for deploying to QA and to Prod rather than the same account, if possible.

Setting up your solution

You can set up the environment variables with references to Azure Key Vault Secrets using the following blog from Sameer Chabungbam

Easier deployments of Custom Connectors | Power Automate Blog (microsoft.com)

Using Azure DevOps to store the solution to source

We will create a pipeline that extracts the solution from the development environment and commits the artifacts to source control. In this case, my pipeline exports the solution from the development environment and checks in both the unpacked solution and the solution zip into the repository. For more details on how to set up Azure DevOps with Power Platform Build Tools, please refer to this link.
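The export stage described above can be sketched in Azure DevOps YAML using the Power Platform Build Tools tasks. This is an illustrative sketch, not the exact pipeline from this walkthrough: the service connection name is hypothetical, and you should check the Build Tools documentation for the exact task versions and input schema.

```yaml
# Illustrative export pipeline (names and paths are assumptions).
trigger: none

pool:
  vmImage: 'windows-latest'

steps:
  - task: PowerPlatformToolInstaller@2
    displayName: 'Install Power Platform Build Tools'

  - task: PowerPlatformExportSolution@2
    displayName: 'Export solution from Dev'
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'Dev service connection'   # hypothetical connection name
      SolutionName: 'SvcNow'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/SvcNow.zip'

  - task: PowerPlatformUnpackSolution@2
    displayName: 'Unpack solution for source control'
    inputs:
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/SvcNow.zip'
      SolutionTargetFolder: '$(Build.SourcesDirectory)/SvcNow'

  # A script step would then commit both the zip and the unpacked
  # files back to the repository.
```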


Figure 6: Getting solution from the development environment and committing to source

Inside the Git repository in your Azure DevOps environment, you will see the following:

Both SvcNow.zip (the solution zip file) and the unpacked solution files are now in the repository. For secrets, we recommend that you do not create default values containing the location of your secrets and that you store them only as the current value.

Create the deployment settings file and provide target values

To create the deployment settings file, open the repository in VS Code with the Power Platform extension installed and run the create-settings subcommand of the Power Platform CLI. This generates a settings JSON file that can be populated with the values for the target environment.
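With the Power Platform CLI installed, the command looks roughly like this (the file names follow this example's SvcNow solution; adjust the paths to your own repository layout):

```shell
# Generate a deployment settings file from the exported solution zip.
pac solution create-settings --solution-zip SvcNow.zip --settings-file deploymentSettings.json
```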


Figure 7: Creating the deployment settings file

Open the new settings JSON file and supply the values for the target environment. Pay attention to the format of the string value that provides the path to the Azure Key Vault secret.
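For reference, a deployment settings file with an environment variable pointing at a Key Vault secret looks roughly like the following. The schema name, the `<...>` placeholders, and the resource names are all hypothetical; substitute your own subscription, resource group, vault, and secret names:

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "new_ClientId",
      "Value": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.KeyVault/vaults/<vault-name>/secrets/<secret-name>"
    }
  ],
  "ConnectionReferences": []
}
```

At import time, the pipeline resolves this path against Key Vault (using the SPN's read access granted earlier) instead of storing the secret value itself in source control.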


Figure 8: update setting file with the target values

Check the changes into the repository, and let us proceed to deploy the solution into the target environment.

Running the deployment

Once the changes are committed into the code repository, as shown in Figure 9, we can proceed to start the deployment pipeline.

Figure 9: checked-in deployment settings file

Now, in the pipeline menu, let us create the Power Platform deployment pipeline to deploy to QA. The key tasks here are the pack, solution checker, and import tasks (but don't forget the tool installer as your first step).
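Those key tasks might be sketched in YAML like this. Again, this is illustrative rather than the exact pipeline from this walkthrough: the service connection name and file paths are assumptions, and the solution checker task is omitted here for brevity.

```yaml
# Illustrative deployment steps (names and paths are assumptions).
steps:
  - task: PowerPlatformToolInstaller@2
    displayName: 'Install Power Platform Build Tools'

  - task: PowerPlatformPackSolution@2
    displayName: 'Pack solution from source'
    inputs:
      SolutionSourceFolder: '$(Build.SourcesDirectory)/SvcNow'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/SvcNow.zip'

  - task: PowerPlatformImportSolution@2
    displayName: 'Import solution to QA'
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'QA service connection'    # hypothetical connection name
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/SvcNow.zip'
      UseDeploymentSettingsFile: true
      DeploymentSettingsFile: '$(Build.SourcesDirectory)/deploymentSettings.json'
```

The `UseDeploymentSettingsFile` and `DeploymentSettingsFile` inputs on the import task are what wire the Key Vault references into the target environment.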

Figure 10: Solution import settings in the deployment pipeline

Make sure that you have selected the "Use deployment settings file" option; this is the key setting that makes this pipeline work with Azure Key Vault for the target environment.


Figure 11: Example of a deployment run into QA

Once the deployment is done, you can check the solution in the target environment and confirm it has the appropriate values.

Figure 12: Changed Key Vault values for the Target QA environment