Synchronising Azure Active Directory Security Groups between D365 Records and SharePoint Sites

In my previous post I discussed how to synchronise permissions between D365 Access Teams and a related SharePoint site. This works really well if you have a system locked down and only want to grant access to records to specific named users, but what happens if you want to allow all Users to Read all records, unless they need to be restricted?

In my organisation, our default position is to enable every User to Read and Update all Accounts unless we have a specific need to restrict access (e.g. in a situation where there is a conflict of interest, or particular client requirements). We also need to ensure these permissions settings are replicated in SharePoint.

We do this by utilising an Azure Active Directory Security Group which we add all relevant Users to. We then Share/Unshare records in Dynamics 365 as necessary, and add/remove the AAD Group to the SharePoint Site Members Permissions Group simultaneously.

The SETUP

In order to enable this functionality the first thing we do is ensure Users only have User-level Read and Write privileges on Accounts.

Next, we’re going to add a custom field that we’ll use for our trigger. In my case I’ve added an Option Set field called SharePoint Permissions. You’ll note from the screenshot below that there are only two options, but I’ve added it as an Option Set field. The reason I’ve used an Option Set instead of a Two-Option field is that we may need to change the options in future, and an Option Set gives us the flexibility to include additional options if needed

As we want to synchronise an Azure Active Directory Security Group, we need to make one in the Azure Portal (or you need to ask your friendly neighbourhood IT Team to do it for you, and to provide you with the Object Id).

The final step in the setup is to create a Team in Dynamics 365, then set the "Team Type" to AAD Security Group, and set the "Azure Object Id for a group" to the Object Id from Azure.

THE SOLUTION

1. When SharePoint Permissions field is updated – we're using the Common Data Service (Current Environment) connector, and the "When a Record is Created, Updated or Deleted" trigger. We will set the Trigger Condition to Update, and then we'll set the Filtering Attribute to our "SharePoint Permissions" field we created.

2. Initialize integrationGroupName – Next we’ll create a String Variable to hold the name of the Azure Active Directory Group we created earlier.

3. Initialize membersGroupID – we need to initialize another string Variable to hold the members Group ID from SharePoint. In this step we’ll leave the value blank, as we’ll set it later in the Flow.

4. Get all Groups – we'll use the "Send HTTP Request to SharePoint" action to retrieve a list of the Permissions Groups from the SharePoint site related to the D365 record. For the Site Address I'm using the custom field I created in this post. Of course you could also retrieve the Absolute URL from the related SharePoint Site record. Once you have the site URL it's a simple call to the API to retrieve all the existing Groups.
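
A rough sketch of how that request might be configured (the Accept header asks for the verbose OData format, which matches the sample output below):

Method: GET
Uri: _api/web/sitegroups
Headers:
{
  "Accept": "application/json;odata=verbose"
}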

The output from this step will be a JSON, and I’ve included a sample extract below:

{
  "d": {
    "results": [
      {
        "__metadata": {
          "id": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)",
          "uri": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)",
          "type": "SP.Group"
        },
        "Owner": {
          "__deferred": {
            "uri": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)/Owner"
          }
        },
        "Users": {
          "__deferred": {
            "uri": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)/Users"
          }
        },
        "Id": 5,
        "IsHiddenInUI": false,
        "LoginName": "A. Datum Corporation (sample) Members",
        "Title": "A. Datum Corporation (sample) Members",
        "PrincipalType": 8,
        "AllowMembersEditMembership": true,
        "AllowRequestToJoinLeave": false,
        "AutoAcceptRequestToJoinLeave": false,
        "Description": null,
        "OnlyAllowMembersViewMembership": false,
        "OwnerTitle": "A. Datum Corporation (sample) Owners",
        "RequestToJoinLeaveEmailSetting": ""
      }
    ]
  }
}

5. Parse Groups JSON – in order to be able to use the JSON returned above we need to parse it with a Parse JSON action, which will allow us to use the output in the next step.
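
If you don't want to generate the schema from a sample payload, a trimmed-down schema covering just the properties the later steps rely on (Id and Title) might look something like this:

{
    "type": "object",
    "properties": {
        "d": {
            "type": "object",
            "properties": {
                "results": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "Id": {
                                "type": "integer"
                            },
                            "Title": {
                                "type": "string"
                            }
                        }
                    }
                }
            }
        }
    }
}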

6. Loop through results to find Members Group – For this part we’ll use an Apply to Each control to loop through each of the Groups we returned in Step 4 to find the ID of the Members group

6A. Check if the Group Title is the Members Group – we’ll use a Condition control to check if the Title of the current Group ends with “Members” because the Members group is the one we want to be able to add/remove the AD group to/from

6B. If Yes, Set membersGroupID Variable – if the result of the condition at 6A is True then we’ll set the membersGroupID variable to the Group ID of the current Group
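
As a rough sketch, assuming the Apply to Each in Step 6 keeps the name shown above, the condition in 6A and the value set in 6B could use expressions along these lines:

endsWith(items('Loop_through_results_to_find_Members_Group')?['Title'], 'Members')

items('Loop_through_results_to_find_Members_Group')?['Id']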

7. Check if Permissions is set to All Users – next we’ll use a Condition control to check if the SharePoint Permissions field is set to “All Users”. For this step we need to put in the Option Set Value that corresponds to the option we want to check against

If the answer to the question above is Yes, then we’ll follow steps 7A and 7B

7A. Add Integration Group to Members Group – for this step we'll be using the "Send HTTP Request to SharePoint" action again, and this time we'll be using a POST method to add the Active Directory Group to the SharePoint Members Group.

As in Step 4, we'll be using the SharePoint Site URL from the custom field on our Account, and we'll be using the following Uri:

_api/web/SiteGroups(<GroupID>)/users 

As we retrieved the Group ID in Step 6, we can use the membersGroupID variable for the Group ID parameter in the Uri.

In the Body of the request we’ll use the following format:

{  
   "__metadata": {  
   "type":"SP.User"  
   },  
"LoginName":"c:0t.c|tenant|4c31af32-7f0c-493b-bc05-c3abc0224280"
}  

One of the key things to note here is that the LoginName parameter has to have the prefix "c:0t.c|tenant|". The GUID that completes the LoginName is the Group Object Id from Azure Active Directory.

7B. Share Account with All Users Team – as we’re sharing the SharePoint site with all Users in our organisation, we also want to share the Account record in Dynamics 365 with all Users. We can do that by using the “Perform an Unbound Action” action from the Common Data Service (Current Environment) connector. There is an unbound action called GrantAccess that we can use. This action has two parameters:

1. Target – for this we use a reference to the Account just as we would if we were setting a lookup field, and we will retrieve the Account ID from our trigger.

/accounts(@{triggerOutputs()?['body/accountid']})

2. PrincipalAccess – the PrincipalAccess parameter requires JSON in the following format:

{
  "Principal": {
    "teamid": "E65D6729-E162-EA11-A812-000D3A86BA0B",
    "@{string('@')}odata.type": "Microsoft.Dynamics.CRM.team"
  },
  "AccessMask": "ReadAccess, WriteAccess"
}

There are a couple of key things to note here:

  1. For the TeamID, I’ve hardcoded the GUID in this instance, but you could retrieve the GUID dynamically if you wish.
  2. For the @odata.type parameter we've escaped the "@" symbol by putting it into a string expression. You could also escape it by formatting it as @@, but in my experience reopening the action for edit caused it to revert back to just @, and then the action failed.
  3. The AccessMask specifies the rights you’re granting to the User/Team you’re sharing with. The full list of privileges you can grant is “ReadAccess, WriteAccess, AppendAccess, AppendToAccess, CreateAccess, DeleteAccess, ShareAccess, AssignAccess”

If the answer to the condition in Step 7 is False then we’ll follow steps 7C-7F:

7C. Get all Members Group Users – we'll use the "Send HTTP Request to SharePoint" action to get a list of all the Users from the Members Group, again using the membersGroupID variable from above. The Uri is formatted as:

_api/Web/SiteGroups/GetById(@{variables('membersGroupID')})/users?$select=Id,Title

Note that we have a $select parameter to specify that we want to retrieve the User ID and Title.

7D. Parse returned Users – next we’ll use a Parse JSON action to parse the JSON that’s returned by the previous action so we can use it in the next step. The schema for this step is:

{
    "type": "object",
    "properties": {
        "d": {
            "type": "object",
            "properties": {
                "results": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "__metadata": {
                                "type": "object",
                                "properties": {
                                    "id": {
                                        "type": "string"
                                    },
                                    "uri": {
                                        "type": "string"
                                    },
                                    "type": {
                                        "type": "string"
                                    }
                                }
                            },
                            "Id": {
                                "type": "integer"
                            },
                            "Title": {
                                "type": "string"
                            }
                        },
                        "required": [
                            "__metadata",
                            "Id",
                            "Title"
                        ]
                    }
                }
            }
        }
    }
}

7E. Check for the Integration Group – we use an Apply to Each control to loop through each of the returned Users to find the AD Group so we can remove it from the SharePoint Group.

7E1. If Current User equals Integration Group – first we use a Condition control to check if the current User's Title is equal to the integrationGroupName variable we set in Step 2.
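
A sketch of that condition, assuming the Apply to Each in 7E keeps the name shown above, could use the following expression:

equals(items('Check_for_the_Integration_Group')?['Title'], variables('integrationGroupName'))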

7E2. Remove Integration Group from Site Members – we use a final “Send HTTP Request to SharePoint” action to remove the AD Group from the SharePoint Group. The Uri for this action is:

_api/Web/SiteGroups/GetById(@{variables('membersGroupID')})/users/removeByid(@{items('Check_for_the_Integration_Group')?['Id']})

As you can see, we’re using the membersGroupID variable to specify the Group we’ll be removing the AD Group from, then we’ll use the current User’s ID from the loop to specify that this is the User we want to remove

7F. Unshare Account with All Users Team – the final step in the Flow is to use another Unbound Action to RevokeAccess to the AllUsers team. The RevokeAccess action has two parameters:

1. Target – as in Step 7B we set this as we would a lookup field, using the following format:

/accounts(@{triggerOutputs()?['body/accountid']})

2. Revokee – this is another parameter that we can set as we would a lookup field. In this case we’re revoking access for a team, so we use the following format:

/teams(E65D6729-E162-EA11-A812-000D3A86BA0B)

Conclusion

Hopefully you'll find this Flow useful. One of the things I like about it is that it's using the same Azure Active Directory Group in both Dynamics 365 and SharePoint, so you don't have to worry about trying to maintain and synchronise team memberships in two systems; it can all be centrally controlled and managed in Azure.

Let me know your thoughts on this by reaching out to me on Twitter or LinkedIn, or drop a comment below.

Synchronising Permissions Between Dynamics 365 and SharePoint Using Power Automate

This is the third post in my series discussing a better integration between Dynamics 365 and SharePoint. If you'd like to read the previous two posts, check them out by clicking the links below:

  1. Creating a custom site structure
  2. Synchronising Document Libraries

In this post I’m going to demonstrate how to synchronise a Dynamics 365 Access Team with a related SharePoint site using Power Automate. One of the key frustrations I’ve experienced with the default integration between D365 and SP is that there is no reciprocity in permissions management between the two systems; working in the financial services industry we need to ensure that only those who should have access to a record and the related documents stored on SharePoint will have access, and we wanted to avoid having to make this synchronisation manual where possible.

The Setup

For this scenario you will need to have enabled the SharePoint integration and the custom site structure that I talked about in Part 1 of this series. Whilst you can secure items in SharePoint at item/folder level this is not recommended; SharePoint best practice typically recommends dealing with security at Site level.

For the entities that you are going to be synchronising with SharePoint you will need to enable them for Access Teams and set up an appropriate Access Team Template.

We will also need to add two new fields to the entities that we’re going to be doing the permissions synchronisation for:

  1. Sync with SharePoint – a two-option field we’ll use to trigger our Flow
  2. Last Team Sync – a datetime field we’ll use to let Users know when the D365 team was last synced with SP

Now we've completed the setup, let's get on with making the Flow!

The Solution

1. When Account Sync with SharePoint equals Yes – for the trigger we're using the Common Data Service (Current Environment) connector, and the "When a Record is Created, Updated or Deleted" trigger. We will set the Trigger Condition to Update, and then we'll set the Filtering Attribute to our "Sync with SharePoint" two-option field we've created. As we only want the Flow to trigger when that field is set to Yes, we will set the filter expression to ensure the two-option field equals true. This avoids triggering redundant Flow runs without needing extra guard conditions inside the Flow.
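
As an illustration, with a hypothetical schema name of new_syncwithsharepoint for the two-option field, the trigger's Filter Expression might look like:

new_syncwithsharepoint eq true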

2. Get Related Access Team – next we'll use a List Records action to retrieve the Access Team related to our entity. This is straightforward to do because system-generated Access Teams automatically have the RegardingObjectID set to the related record, so we can use this in our Fetch XML query to find the team we want. In my demo I only have one Access Team Template for my entity; if you had two then you could add additional parameters to your query to ensure you return the right team.

In this case I’m going to be using a link-entity clause so I can find a team where the regardingobjectid equals the Account from my trigger.

The Fetch XML query I used is:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="team">
    <attribute name="name" />
    <attribute name="teamid" />
    <attribute name="administratorid" alias="ownerid"/>
    <order attribute="name" descending="false" />
    <link-entity name="account" from="accountid" to="regardingobjectid" link-type="inner" alias="ab">
      <filter type="and">
        <condition attribute="accountid" operator="eq" value="@{triggerOutputs()?['body/accountid']}" />
      </filter>
    </link-entity>
  </entity>
</fetch>

3. Compose TeamID – as we’re only returning one result in our list records step above, we don’t want to add a redundant “Apply to Each” loop to find the attribute we want; instead we’ll use a First expression to query the returned JSON for the attribute. The expression I’ve used is:

first(outputs('Get_Related_Access_Team')?['body/value'])?['teamid']

4. List CRM Team Members – now that we’ve got the Team ID set, we can interrogate the TeamMembership entity. If you’re not aware of it, the TeamMembership entity is an intersect entity between the SystemUser and Team entities. It will not appear in the list of available entities, so you will have to manually input teammemberships into the Entity Name field.

We can use another Fetch XML query to find the related Users and, importantly, their email addresses, which we will use to add them to SharePoint later in the Flow. In the Fetch XML we'll use the Team ID from the step above, and we'll use a link-entity to retrieve each User's email address.

The Fetch XML query I used is:

<fetch>
  <entity name="teammembership" >
    <attribute name="teammembershipid" />
    <attribute name="teamid" />
    <attribute name="systemuserid" />
    <filter>
      <condition attribute="teamid" operator="eq" value="@{outputs('Compose_TeamID')}" />
    </filter>
    <link-entity name="systemuser" from="systemuserid" to="systemuserid" >
      <attribute name="internalemailaddress" />
    </link-entity>
  </entity>
</fetch>

The output of this step will produce JSON that will look something like the extract below. As you can see, we've selected the internalemailaddress attribute from the SystemUser entity by using a link-entity in the Fetch XML query, and it is returned in the JSON as an attribute called systemuser1.internalemailaddress; this is important for the next step.
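
For illustration, with placeholder GUIDs and a made-up email address:

{
  "value": [
    {
      "teammembershipid": "00000000-0000-0000-0000-000000000001",
      "teamid": "00000000-0000-0000-0000-000000000002",
      "systemuserid": "00000000-0000-0000-0000-000000000003",
      "systemuser1.internalemailaddress": "first.user@yourdomain.com"
    }
  ]
}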

5. Select CRM Users Email Addresses – in this step I'm using a Select action to create an array of the returned Users' email addresses from the previous step. I do this because it gets rid of any additional "noise" from the returned JSON, which will make it easier to debug later. In the Select action I set the Key to "Email Address" and then in the Value I use the following expression:

@item()?['systemuser1.internalemailaddress']

6. Get SharePoint Team Members – for this step I'm going to be using the "Send HTTP Request to SharePoint" action. The Send HTTP Request to SharePoint action lets you use the REST API, so we're going to retrieve a list of the Users, and we'll use a $select parameter to specify that we want the Users' email addresses. For the Site Address I'm using the value from the custom SP Site URL field I created in Part 1 of this series.

The Uri we’re using is:

_api/Web/SiteGroups/GetById(5)/users?$select=Email

You’ll see above that I’ve specified the Group ID as 5, which is typically the members group (though Al Eardley has informed me this may not always be the case). If you have a specific group you would like to sync your Users to then you could retrieve all the Groups from the SP site, then loop through them to find the correct one and get the ID for that one to use in the step above.

7. Compose SPTeamMembers – next we’ll use another Compose step to convert the output from the previous step into a String object, as we need it to be a string for the purposes of the following steps. I’m also using a toLower expression to ensure consistency of results.

The expression I used is:

toLower(string(outputs('Get_SharePoint_Team_Members')?['body']))

8. For Each CRM User – in this step we’ll be using an Apply to Each to loop through the outputs from the Select action in Step 5

8A. Compose Email Address – the first step is to use a Compose action with a String expression to convert the array object to a string so we can use it in the next step. As in Step 7, I’m also using a toLower expression to ensure consistency of results. The expression is:

toLower(string(items('For_Each_CRM_User')?['Email Address']))

8B. Check if they are in the SharePoint Team – we'll use a Condition control to check if the current User's email address appears in the list of Users' email addresses we've extracted from SharePoint, using a contains expression:

contains(outputs('Get_SharePoint_Team_Members')?['body'],outputs('Compose_EmailAddress'))

If the answer to this question is Yes we'll do nothing (as they're already in the SharePoint group); if the answer is No then we'll add them to the SharePoint group in the next step.

8C. Add User to SharePoint Team – we'll use another "Send HTTP Request to SharePoint" action to add a User to the SharePoint Group. In this instance we're going to be using the POST method rather than the GET method. The Uri is:

_api/web/SiteGroups(5)/users

The body of our request is:

{   "__metadata": {  
   "type":"SP.User"  
   },  
"LoginName":"i:0#.f|membership|@{outputs('Compose_EmailAddress')}"  
}  

Please note that the prefix "i:0#.f|membership|" is required for the login name, as this is part of how SharePoint handles claims-based authentication.

9. Parse SPTeamMembers – next we’ll use a Parse JSON action on the outputs from Step 6 above, and we’ll use the output of the Parse JSON step in step 10 below to loop through each returned user in the SharePoint Group. The schema for the Parse JSON action is:

{
    "type": "object",
    "properties": {
        "d": {
            "type": "object",
            "properties": {
                "results": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "__metadata": {
                                "type": "object",
                                "properties": {
                                    "id": {
                                        "type": "string"
                                    },
                                    "uri": {
                                        "type": "string"
                                    },
                                    "type": {
                                        "type": "string"
                                    }
                                }
                            },
                            "Email": {
                                "type": "string"
                            }
                        }
                    }
                }
            }
        }
    }
}

10. For each SP User – We’ll use an Apply to Each control to loop through the output results from the previous step

10A. Check if they are in the CRM Team – Similarly to step 8B, we’re going to use a Contains expression to check if the current SharePoint User’s email address is in the list of returned Users email addresses from Step 5

contains(string(body('Select_CRM_Users_Email_Addresses')),string(items('For_Each_SP_User')?['Email']))

If the answer to this question is Yes (the User is in both teams), then we’ll do nothing. If the answer to this question is No (the User is in the SharePoint group but is not in the D365 team) then we’ll run through the steps to remove them from the SharePoint group

10B. Get User ID – we’ll use another “Send HTTP Request to SharePoint” action to retrieve the ID of the User we want to remove from the SharePoint Group. We are using the GET method, and the following Uri:

_api/web/SiteUsers/getByEmail('@{items('For_Each_SP_User')?['Email']}')

We don’t need any headers or body parameters for this request

The output from this action will return JSON in which the User's ID is found under the path ['d']['id'], so we'll use this in the next step.

10C. Remove User from SP Group – we’re going to use one more “Send HTTP Request to SharePoint” action to remove the User from the group. We’ll be using a POST method to perform the action, and the Uri for the action is:

_api/Web/SiteGroups(5)/users/removebyid(@{body('Get_User_Id')['d']['id']})

As you can see, we’ve specified the Site Group number we want to remove the User from, and then taken the ID from the step above to specify the User we want to remove.

11. Update Account – the final step in the Flow is to use an Update Record step to set the "Last Team Sync" field to the current date using a utcNow expression.

We will also set the "Sync with SharePoint" field back to No so it is ready for the next time the Team needs to be synced.

As we set the filter expression in the trigger to only fire the Flow when this field equals Yes, changing the value back to No will not trigger a new Flow run.
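
As a quick sketch, the values set in the Update a Record action are simply:

Last Team Sync: utcNow()
Sync with SharePoint: No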

Conclusion

This has hopefully given you an overview of how relatively straightforward it could be to synchronise security between Dynamics 365 records and an associated SharePoint site. There are a couple of things you may wish to consider alongside this:

  1. If you're going to have multiple entities enabled for Document Management, you may wish to put the logic for synchronising permissions in a Child Flow that could then be called by any entity
  2. As discussed above, good practice would dictate that you should always dynamically retrieve the specific SharePoint Group ID you want to synchronise permissions with, rather than hardcoding a value
  3. You may wish to set this to run on a schedule so permissions are synced automatically every X hours/days/weeks, etc.
  4. As of the date of publication, Flows cannot be triggered on N:N Associate/Dissociate actions in Dynamics 365, but you could use a trigger when a new record is created/deleted in the teammembership entity, or alternatively you could use a solution like North52 to create a custom trigger. I personally prefer not to use this approach, for two reasons:
    1. Power Automate flows are currently asynchronous only, so this means you could potentially run into issues with concurrency if you are adding/removing multiple people at the same time from teams, which would lead to Dynamics 365 and SharePoint being out of sync
    2. If you have lots of teams, and the membership changes frequently, then you could trigger a lot of Flow runs, compared to this method which catches all changes in a single run

If you do find this useful I’d love to hear from you, please reach out to me on social media or drop a comment below to give me your thoughts!

Better Integration Between Dynamics 365 and SharePoint – Part 2

In my previous post I discussed how to improve the integration between Dynamics 365 and SharePoint by creating a site collection for each record, instead of using the OOTB integration.

Now that you have a site, you might give some thought as to how you will structure your data; typically SharePoint best practice would tend to suggest you should use metadata instead of folders, but if you do need to segregate information then the recommendation would be to use Document Libraries.

In this post I will demonstrate how you can add Document Libraries to the SharePoint site for a record and have it visualised on the Dynamics 365 record. There are two approaches for this:

  1. Triggering the creation of a Document Library from the Dynamics 365 record
  2. Creating the Document Library directly in SharePoint and syncing with D365

For each method, I have created a Power Automate Flow and I’ll run through the key features below.

Creating A Document Library from Dynamics 365

1. When a Record is Selected – the first step is to use the "Common Data Service – When a Record is Selected" trigger. This trigger is only available in the "old" CDS connector (see this great post from Sara Lagerquist for a comparison between the CDS connectors). In order to be able to use the When a Record is Selected trigger you have to ensure the Flow is not in a solution, as you will not be able to select it from the record otherwise. One of the best things about using this trigger is that you can also request input values from Users, so in this case we're going to ask the Users to input their desired Document Library name.

2. Get Related SharePoint Site – next we'll use a List Records action to get the related SharePoint site. In this step we'll use an OData filter to look for sites where the Absolute URL matches the SharePoint site URL for our Account.
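
A sketch of the Filter Query for this List Records action, where the value in quotes is the SP Site URL taken from the trigger's dynamic content:

absoluteurl eq '<SP Site URL from the trigger>'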

3. Compose SharePoint Site ID – we will use a Compose step to extract the SharePoint site ID from the SharePoint site record we’ve returned in the previous step. As we’ll only have one SharePoint site per record, we don’t need to use an Apply to Each control, rather we can use a First expression to extract the specific field value. The expression I’ve used is:

first(outputs('Get_Related_SP_Site')?['body/value'])?['sharepointsiteid']

4. Create Document Library – we will use the “Send HTTP Request to SharePoint” action to create a Document Library on our Account site. The Send HTTP Request action allows you to leverage the full SharePoint REST API to do actions that aren’t available with the OOTB connector. If you’re not aware, Microsoft have some great resources available to help you understand the REST API, so I’m just using the instructions available here to create a Document Library. (This does say working with Lists, but Document Libraries are really just a fancy kind of list)

There are a few aspects to this action we need to fill in:

  1. Site Address – we will use the Site Address from the custom field on our Account record
  2. Method – as we’re creating a Document Library we’re going to be using the POST method
  3. Headers – we will be specifying that we are sending a JSON request (content-type) and expect a JSON back (accept)
  4. Body – Microsoft have helpfully outlined the key elements of the body JSON at the link. The key values we need to specify are:
    1. BaseTemplate: 101 – this signifies that this is a Document Library
    2. Title – this will be the value that is input in our trigger

The Header JSON is:

{
  "Accept": "application/json;odata=verbose",
  "Content-Type": "application/json;odata=verbose"
}

The Body JSON is:

{
 "__metadata": {
 "type": "SP.List"},
 "BaseTemplate": 101, 
 "ContentTypesEnabled": false,
 "Description": "Created by Flow",
 "Title": "@{triggerBody()['text']}" 
}

5. Create a Document Location – the final step is to use a Create a Record action to create a Document Location record in Dynamics 365. We need to input the following information:

  1. Name – we will use the input value from the trigger
  2. Service Type Value – set this to SharePoint
  3. Relative URL – we will use the input value from the trigger
  4. Parent Site or Location ID – we will use the output from the Compose step at Step 3 above
  5. Parent Site or Location Type – set this to SharePointSites
  6. Regarding Object ID – we will set this to the Account ID from the trigger
  7. Regarding Object Type – set this to Accounts

When this is all done, we can use the Flow “on-demand” to create Document Libraries and an associated Dynamics 365 Document Location on any Accounts in our organisation that have a SharePoint site.

Note: I have not added any error handling to this Flow, but you may wish to include some guard conditions to ensure you don’t try and create document libraries against records with no associated SharePoint site

Creating a Document Library from SharePoint and Syncing to Dynamics 365

Steps 1-3 are the same as Steps 1-3 in the previous Flow: use the When a Record is Selected trigger (though we don't need an input value for this Flow), then List Records to get the associated SharePoint Site record, then a Compose step to extract the SharePoint Site ID.

4. List all related Document Locations for Account – next we will use a List Records step to find all Document Locations in Dynamics 365 related to the SharePoint site record for the Account.
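
One way to build the Filter Query for this step (a sketch, using the standard Web API convention for filtering on a lookup column) is to filter on the parent site, using the SharePoint Site ID from Step 3:

_parentsiteorlocation_value eq '<SharePoint Site ID from Step 3>'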

5. Select key values from returned JSON – in order to remove the noise from the returned JSON in the List Records step above, we will use a Select action to extract the following values for each Document Location (a sketch of the mapping follows the list):

  1. Name
  2. Relative URL
  3. ID
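
A sketch of the Select mapping, assuming the standard logical names on the Document Location entity:

{
  "Name": "@item()?['name']",
  "Relative URL": "@item()?['relativeurl']",
  "ID": "@item()?['sharepointdocumentlocationid']"
}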

6. Get all Lists and Libraries – we’ll use the standard Get all Lists and Libraries action to return a list of the Document Libraries on the site. For the Site Address we’ll input the SharePoint Site URL from the custom field on our Account entity

7. Apply to each returned SP Document Library – for each Document Library that is returned in the step above, we’ll loop through them to check if they are already in Dynamics 365 and, if not, we’ll create a Document Location for them

7A – Is there a D365 Document Location for the SharePoint Doc Library? – we're using a Condition control here to check if the SharePoint Document Library appears in the list of D365 Document Locations, using a Contains expression. The expression I've used is:

contains(toLower(string(body('Select_key_values_from_returned_JSON'))),toLower(items('Apply_to_each_Returned_SP_Document_Library')?['DisplayName']))

7B – If No then Create a New Document Location in D365 – if there is not a D365 Document Location existing for the SharePoint Document Library then we’ll use a Create a new Record action to create one. We need to input the following information:

  1. Name – we will use the DisplayName from the returned SharePoint Document Library
  2. Service Type Value – set this to SharePoint
  3. Relative URL – we will use the DisplayName from the returned SharePoint Document Library
  4. Parent Site or Location ID – we will use the output from the Compose step at Step 3 above
  5. Parent Site or Location Type – set this to SharePointSites
  6. Regarding Object ID – we will set this to the Account ID from the trigger
  7. Regarding Object Type – set this to Accounts

Note: if a User inputs any invalid characters in the title of the Document Library on create, or if they change the name of the Document Library after it is created, then this Flow may not execute perfectly. In those situations we’d have to include workarounds with the “Send HTTP Request to SharePoint” action and utilise the REST API

Conclusion

With the two Flows I have outlined above we can enable Users to add Document Libraries to their SharePoint sites from either the Dynamics 365 record or the SharePoint site, and be able to interact with them directly from the Dynamics 365 record. From my own experience, I know that this kind of flexibility is appreciated by Users.

Ten Regex Expressions To Use With Forms Pro

If you’ve been using Forms Pro to create surveys then you should be aware that you can do custom validation on fields with Regular Expressions (regex). My friend Megan Walker blogged about this recently, showing how easy it is to add the custom formatting checks to a field on your survey. Megan and I were discussing this functionality and, as I’m a massive geek, she asked if we could work together to come up with ten different custom expressions that you could use in a Forms Pro survey; we came up with the following list:

  1. Phone Numbers
  2. Website address
  3. Twitter handle
  4. Zip code (5 digits)
  5. UK Postcode
  6. National Insurance Number (UK)
  7. Social Security Number (USA)
  8. Vehicle Registration (UK)
  9. Date of birth (both dd/mm/yyyy & mm/dd/yyyy)
  10. Time (24 hour & 12 hour AM/PM formats)

I’ll go through each of these in turn below to show you how they work. I’ll include a copy of the expression, an explanation of the key rules that make up the query, a link to Regex101 for testing the query, and a visual overview of how the query works.

UK Phone Number

(\s*\(?(0|\+44)(\s*|-)\d{4}\)?(\s*|-)\d{3}(\s*|-)\d{3}\s*)|(\s*\(?(0|\+44)(\s*|-)\d{3}\)?(\s*|-)\d{3}(\s*|-)\d{4}\s*)|(\s*\(?(0|\+44)(\s*|-)\d{2}\)?(\s*|-)\d{4}(\s*|-)\d{4}\s*)|(\s*(7|8)(\d{7}|\d{3}(\-|\s{1})\d{4})\s*)|(\s*\(?(0|\+44)(\s*|-)\d{3}\s\d{2}\)?(\s*|-)\d{4,5}\s*)

In order to construct any regex query you have to understand the rules of the information you’re trying to validate. Luckily for me, Wikipedia has a great page about UK telephone numbers. Some of the key rules we need to be aware of are:

  1. The numbers can start with +44 (the international dialling code) or 0
  2. The telephone numbers can be prefixed with a 3, 4 or 5 digit area code
  3. 3 digit area code numbers have a format of 3-4-4 (i.e. 000 0000 0000)
  4. 4 digit area codes have a 4-3-4 format (i.e. 0000 000 0000)
  5. 5 digit area codes have either a 5-6 or 5-3-3 format (i.e. 00000 000000 or 00000 000 000)
  6. Area codes may or may not be enclosed in brackets
  7. There are some special case numbers such as Sedbergh and Brampton which have unique area codes (4-2 format area code, followed by 5 or 4 numbers respectively)

As we know the rules, constructing the regex query becomes much easier; we can take each rule in turn and build a query for it. In order to demonstrate the query in action I have created a sample on Regex101 that you can use for testing, see https://regex101.com/r/ACqAiu/2

Website Address

^(http:\/\/www\.|https:\/\/www\.|http:\/\/|https:\/\/|www.)[a-z0-9]+([\-\.]{1}[a-z0-9]+)*\.[a-z]{2,5}(:[0-9]{1,5})?(\/.*)?$

The key rules we need to consider for a website address are:

  1. The address may begin with http://, https:// or www.
  2. After the protocol, the address may optionally begin with www.
  3. The web domain may use letters, numbers or hyphens
  4. Hyphens cannot be used at the beginning or end of the domain
  5. The web address typically ends with a top-level domain that is between 2-5 letters (i.e. .com, .net, .gov, etc.)

You can test this expression on Regex101 by clicking here

Twitter Handle

[\@][A-Za-z0-9_]{1,15}$

Twitter usernames have some pretty simple rules:

  1. the username begins with an @ symbol
  2. the username cannot be longer than 15 characters
  3. the username can only be comprised of alphanumeric digits or an underscore

You can test this expression on Regex101 by clicking here

US Zipcode

^[0-9]{5}(?:-[0-9]{4})?$

US Zip Codes are really straightforward; the key rules are:

  1. It can be comprised of 5 numbers
  2. It may optionally also have a 4 digit suffix following a hyphen for the ZIP+4 format

You can test this expression on Regex101 by clicking here

UK Postcode

^(([A-Z]{1,2}[0-9][A-Z0-9]?|ASCN|STHL|TDCU|BBND|[BFS]IQQ|PCRN|TKCA) ?[0-9][A-Z]{2}|BFPO ?[0-9]{1,4}|(KY[0-9]|MSR|VG|AI)[ -]?[0-9]{4}|[A-Z]{2} ?[0-9]{2}|GE ?CX|GIR ?0A{2}|SAN ?TA1)$

The Wikipedia article on UK Postcodes has a great section on validation that covers all of the key rules, and provides the regex that I have included above. I'd recommend reading it to get an overview of how the validation works.

You can test this expression on Regex101 by clicking here

UK National Insurance Number

^[A-CEGHJ-PR-TW-Z]{2}[0-9]{6}[A-DFM]{1}$

UK National Insurance numbers have some key rules:

  1. They are formatted as two prefix letters, six digits and one suffix letter
  2. Neither of the first two letters can be D, F, I, Q, U or V
  3. The suffix letter can be A, B, C, D, F or M

You can test this expression on Regex101 by clicking here

US Social Security Number

^\d{3}(\s*|-)\d{2}(\s*|-)\d{4}$

The rules for US Social Security numbers are:

  1. 9 digits in total
  2. May be in a 3-2-4 format (i.e. 000-00-0000)
  3. The delimiter between the sections of the number may be a hyphen or a white space

You can test this expression on Regex101 by clicking here

UK Vehicle Registration

^([A-Z]{3}\s?(\d{3}|\d{2}|d{1})\s?[A-Z])|([A-Z]\s?(\d{3}|\d{2}|\d{1})\s?[A-Z]{3})|(([A-HK-PRSVWY][A-HJ-PR-Y])\s?([0][2-9]|[1-9][0-9])\s?[A-HJ-PR-Z]{3})|([A-Z]{3}\s?(\d{4})\s?)$

There are a number of rules regarding UK Vehicle Registration numbers, and there is a great gist on GitHub by Daniel Bradley that covers them, so I'd recommend heading there to read about it.

You can test this expression on Regex101 by clicking here

Date of Birth

^(((19|20)\d\d[- \/.](0[1-9]|1[012])[- \/.](0[1-9]|[12][0-9]|3[01]))|((0[1-9]|1[012])[- \/.](0[1-9]|[12][0-9]|3[01])[- \/.](19|20)\d\d)|((0[1-9]|[12][0-9]|3[01])[- \/.](0[1-9]|1[012])[- \/.](19|20)\d\d))$

For a date of birth we want to validate the following rules:

  1. The date may be input in the DD/MM/YYYY, MM/DD/YYYY or YYYY/MM/DD format
  2. The elements of the date may be separated by a forward slash (/), period (.), hyphen (-) or space

You can test this expression on Regex101 by clicking here

Time – 24 Hour Format

^([0-1]?[0-9]|2[0-4]):([0-5][0-9])(:[0-5][0-9])?$

Hopefully the rules around time are obvious, but you can have any time from 00:00 to 24:00, with the hours and minutes separated by a colon. You can optionally include the seconds which will also be separated from the minutes by a colon

You can test this expression on Regex101 by clicking here

Time – 12 Hour Format

^([1-9]|1[012]):(0[0-9]|[1-5][0-9])\s?(am|AM|pm|PM)$

Time in a 12-hour format can be any time from 12:00am to 11:59pm, suffixed with am or pm, which may be in upper or lower case.

You can test this expression on Regex101 by clicking here

Conclusion

Hopefully the examples above will give you some ideas for how you can use regular expressions in your own surveys. The examples are intended as a guide, so please test them before implementing them in any production environment. If I have made any glaring errors then please let me know in the comments or by reaching out to me on Twitter or LinkedIn.

If you’d like to know more about Forms Pro then please subscribe to Megan’s blog, follow her on Twitter, watch her YouTube channel or enrol in her excellent online training course.

Better Integration between Dynamics 365 and SharePoint using Power Automate — Part 1

If you’ve used the standard OOTB connection between Dynamics 365 and SharePoint then I’m sure, like me, you’ll have found your share of frustrations.

The technical functionality is great, allowing you to see your documents stored in SharePoint in a subgrid on the related Dynamics 365 record; however, the standard implementation will probably make your SharePoint admins cry, because it uses a single SharePoint Site, with a Document Library for each Entity, and a Folder for each record.

Whilst this approach sort of makes sense, at scale it becomes pretty unmanageable and it runs counter to any recognised SharePoint best practices. In my organisation we wanted to have a SharePoint site per record which would allow us to control security much more robustly. Peter Baddeley has blogged about similar issues before and I'd highly recommend reading his posts about this.

The Scenario

In my organisation we have Client SharePoint sites, linked to the Account record in Dynamics 365. Each of these SharePoint sites can have one or many Document Libraries depending upon the requirements of the Client. We then have a custom entity for Projects, and each Project has a SharePoint Site with one or many Document Libraries. Finally, each Project may have one or many Work Streams, and each of the Work Streams may have one or many Document Libraries

The relationships between the entities and the two systems can be visualised as below:

The Setup

As indicated above, in my scenario we have some custom entities:

  • Project
    • Lookup to Account entity
  • Work Stream
    • Lookup to Account entity
    • Lookup to Project entity

Note that while there is a parent:child relationship between Accounts and Projects, and between Projects and Work Streams, this doesn't have to be reflected in SharePoint; this means we can avoid getting into nested site structures in SharePoint and take advantage of modern functionality like Hub Sites.

As part of my demonstration I’ll also add a few custom fields to my Account entity:

  • Create SP Site – two option field I’ll use to trigger my Flow to run
  • Reference ID – An autonumber field to generate a unique ID for each Account, that we’ll use for the new site URL
  • SP Site URL – a field to hold the URL for the site we’re creating, which we’ll use in Part 2 and Part 3 of this series

A key part of the setup is also to ensure that you’ve set up SharePoint integration and enabled Document Management on the entities we’ll be using – see the Microsoft documentation for more info on how to do this. As we’re creating a custom document management structure, you might want to avoid the default document location logic from firing – Alex Shlega covered this in his blog about custom folder structures for SharePoint integration.

The Solution

Now that we have a clear understanding of our scenario, we need to create a custom document management structure using a Power Automate flow. Before we jump into this you should be aware that the standard SharePoint Flow connector doesn’t allow you to create SharePoint Site collections, it will only allow you to create subsites. To work around this restriction we’re going to use an Azure Automation runbook to execute some PowerShell.

Azure Automation

I had written a lengthy explanation of how to create an Azure Automation runbook that we could call from a Flow, but then I found this post from Planet Technologies that explains so straightforwardly how to do this that I’d rather just link you to their blog. If you’ve never created an Azure Automation runbook, or you feel intimidated by the idea of PowerShell then I can assure you that you have no reason to worry. The steps in this blog are super easy to follow!

In my Azure Automation Runbook I've added a step using the Register-PnPHubSite cmdlet to register my Client (Account) site as a Hub site. By doing this I can then associate my Project and/or Work Stream sites to this Hub using the Add-PnPHubSiteAssociation cmdlet, without having to implement a nested site structure.
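
The exact runbook will vary, but as a rough sketch, assuming credential-based authentication with a stored Automation credential (named SPOAdminCredential here) and placeholder tenant URLs, the core of it might look something like this:

param(
    [Parameter(Mandatory = $true)][string]$SiteUrl,
    [Parameter(Mandatory = $true)][string]$SiteTitle
)

# Retrieve the credential stored in the Automation Account and connect to the SharePoint admin centre
$creds = Get-AutomationPSCredential -Name 'SPOAdminCredential'
Connect-PnPOnline -Url "https://<YOURTENANT>-admin.sharepoint.com" -Credentials $creds

# Create the new site collection for the record
New-PnPTenantSite -Url $SiteUrl -Title $SiteTitle -Owner "admin@<YOURTENANT>.onmicrosoft.com" -TimeZone 2 -Wait

# Register the new Client (Account) site as a Hub site so Project/Work Stream sites can be associated with it
Register-PnPHubSite -Site $SiteUrl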

The Flow

1. When a Record is Updated – for the trigger I’m using the “When a Record is Created, Updated or Deleted” trigger from the Common Data Service (Current Environment) Connector. In this instance I’m using the Update trigger condition, to check when the new “Create SP Site” field I created is set to “Yes”. Note below that I’ve set the Filtering attributes to this field, and set a Filter expression to only trigger the flow when the value of this field is True

2. List the Default SP sites to get the Base URL – next I am using a List Records step with a FetchXML query to retrieve the Default SharePoint site URL. When you set up your integration with SharePoint you would have specified a site, and this is listed in Dynamics 365 as a default site, so we can retrieve this easily. The FetchXML query I used is:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="sharepointsite">
    <attribute name="name" />
    <attribute name="parentsite" />
    <attribute name="relativeurl" />
    <attribute name="absoluteurl" />
    <attribute name="validationstatus" />
    <attribute name="isdefault" />
    <order attribute="name" descending="false" />
    <filter type="and">
      <condition attribute="isdefault" operator="eq" value="1" />
    </filter>
  </entity>
</fetch>

3. Compose AbsoluteURL – As I’m only returning a single record in the List Records step above, I will use a Compose action to retrieve the Absolute URL field value, rather than having to use an Apply to Each step. To do this, we use a First expression to retrieve the first returned value, then we’ll interrogate the JSON to get the value we want. The expression I used is:

first(outputs('List_the_Default_SP_sites_to_get_the_Base_URL')?['body/value'])?['absoluteurl']

4. Initialise SiteURL – next we’ll create a string variable for the Site URL of the new SharePoint site we’re going to create. To do this, we’ll use the output of the compose step above, append it with the value “/sites/” (as the new site will be within the sites managed path in SharePoint) and then complete it with the autonumber value from the new Reference ID field we created.
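
A sketch of the value for the variable, with new_referenceid used as a hypothetical schema name for the Reference ID field:

concat(outputs('Compose_AbsoluteURL'), '/sites/', triggerOutputs()?['body/new_referenceid'])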

Note: if you want to use the name of your record in D365 for the URL you might need to strip out any “dangerous” characters for the URL. In this case you could follow the great advice from AlanPS1 on his blog for stripping unwanted characters

5. Create Azure Automation Job – now that we have our Site URL for the new site we want to create we’re going to use the Create Job action from the Azure Automation connector. In this action you’ll specify the Subscription, Resource Group, Automation Account and Runbook Name we created above. This will then bring up the Input parameters you specified in your runbook. In my case I just need the Site URL and the Account Name for my site title. Note that I’ve set “Wait for Job” to Yes, this is important to ensure the Job is completed before the action is marked as complete in your flow.

6. Create a new SP Site record – after the Azure Automation has done its work and created the SharePoint site, we're going to create a SharePoint Site record in Dynamics 365 for it. With this SharePoint Site record we will set the Absolute URL to the SiteURL we created earlier.

7. Create a new Document Location for Account – the final step in the Flow is to create a Document Location record in D365 for the Account so we can actually see the SharePoint site and the documents stored there when we look at the Account record in D365. For the purposes of this demo I'm just creating a SharePoint Document Location for the default Documents document library that is created when you create a SharePoint site, so I know that the Name is "Documents" and the Relative URL is "shared documents". If you were using a Template that created multiple document libraries you could use the SharePoint "Get all Lists and Libraries" action to retrieve them all then create a new Document Location for each Library that is returned.

One thing to note here is that when I’m setting the Lookup fields I have to put them in the format “/PluralofEntityName(GUID)”. This is due to a known bug with the Common Data Service (Current Environment) connector – see this blog by Sara Lagerquist for more information on this issue.

Conclusion

In this blog post I wanted to demonstrate how easy it can be to create an advanced custom integration between Dynamics 365 and SharePoint, and how accessible it is even to non-developers like me now that we have such advanced functionality available to us in Power Automate flows and Azure Automation runbooks.

Implementing a custom integration allows you to implement proper security management, and makes the solution much more scalable for proper document management functionality. In the next blog I’ll show you how to create new SharePoint Document Libraries from within Dynamics 365.

If you think this is useful I’d love to get your thoughts on it, please drop a comment below or reach out to me on Twitter or LinkedIn!

Exchange Rate Conversion with Power Automate

Many of us will work in organisations that have a global presence and therefore may have a need to know the value of something in a local currency. There isn’t any out of the box way to convert a base currency to another currency in Power Automate, so I wanted to see how easy it would be to configure.

The Solution

For the purposes of this blog I'm going to configure a simple manually triggered flow that will take a number input for a value to convert, and the base currency from which we are converting, and it will return this value at its current exchange rate in 10 different currencies. There are two parts to this solution:

  1. Configure a Custom Connector to retrieve current exchange rates
  2. Configure the Flow to output the converted value

Configuring a Custom Connector

As there is no out-of-the-box connector to do exchange rate conversion, we need to create our own. If you’ve not created a custom connector before, it is really straightforward; you just need an API that you can access with the appropriate authorisation and then it is a really simple process. Joe Unwin, AKA Flow Joe, covered the steps to creating a custom connector on his blog here.

As I needed an API to access, a bit of google-fu led me to this free Exchange Rates API, which really handily allows you to request the rates against a specific currency by setting the base parameter in the request:
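
As an illustration, a request for the latest rates against a base currency looks something like the following (the host is a placeholder; substitute the URL of whichever exchange rate API you register with):

GET https://<API_HOST>/latest?base=GBP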

So, how do we set up this Custom Connector? There are basically 2 main steps for this one, as there is no authentication required.

Step 1 – input the host URL in the General Information section of the Custom Connector:

Step 2 – as we saw above, the Open Exchange Rates API allows us to request the latest rates by specifying a base currency parameter. To add this to our custom connector we simply add a new Action in the Definition section, then under the request heading we click Import from Sample and paste the URL with the parameter into the flyout window and specify it as a GET Request. To show just how easy it is, watch the gif below:

Now that we have the custom connector created, we’re ready to create the Flow

Converting Values with a Flow

This Flow is pretty straightforward so I’ll run through each step in turn:

1. Manually trigger a Flow – for the purposes of this demo I’m using a manual trigger, and I’m specifying two inputs: the value to convert and the base currency we’re converting from. Of course, if you were implementing this in your environment any trigger could be used, as long as you’re providing these input values

2. Initialize RateConversion Array variable – In this array I want to specify the Currency name, Currency Code (ISO 4217 code) and the Locale (BCP 47 Code). The Currency Code is used to identify the exchange rate we’ll be converting to in the returned results, while the Locale will be used as part of a formatNumber expression later in the Flow. The Array I’m using is:

[
  {
    "Currency Name": "US Dollar",
    "Currency Code": "USD",
    "Locale": "en-US"
  },
  {
    "Currency Name": "British Pound Stirling",
    "Currency Code": "GBP",
    "Locale": "en-GB"
  },
  {
    "Currency Name": "Japanese Yen",
    "Currency Code": "JPY",
    "Locale": "ja-JP"
  },
  {
    "Currency Name": "Euro",
    "Currency Code": "EUR",
    "Locale": "fr-FR"
  },
  {
    "Currency Name": "Swedish Krona",
    "Currency Code": "SEK",
    "Locale": "sv-SE"
  },
  {
    "Currency Name": "Australian Dollar",
    "Currency Code": "AUD",
    "Locale": "en-AU"
  },
  {
    "Currency Name": "Canadian Dollar",
    "Currency Code": "CAD",
    "Locale": "en-CA"
  },
  {
    "Currency Name": "New Zealand Dollar",
    "Currency Code": "NZD",
    "Locale": "en-NZ"
  },
  {
    "Currency Name": "South African Rand",
    "Currency Code": "ZAR",
    "Locale": "en-ZA"
  },
  {
    "Currency Name": "Brazilian Real",
    "Currency Code": "BRL",
    "Locale": "pt-BR"
  }
]

3. Parse RateConversion Array – in this step we’re using a Parse JSON action to enable us to use the elements from the array we create above later in the Flow. The Schema for this step is:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Currency Name": {
                "type": "string"
            },
            "Currency Code": {
                "type": "string"
            },
            "Locale": {
                "type": "string"
            }
        },
        "required": [
            "Currency Name",
            "Currency Code",
            "Locale"
        ]
    }
}

4. Retrieve Latest Rates – in this step we’re going to use our Custom Connector that we created earlier to retrieve the latest rates. As you can see, because we specified a “base” parameter, it is asking us for that input in the action. We’re using the “Base Currency” input from our trigger in this field

The output from this step looks like:

{ 
   "rates":{ 
      "CAD":1.7211925099,
      "HKD":10.0868447976,
      "ISK":164.0533917057,
      "PHP":65.6558566704,
      "DKK":8.8888624521,
      "HUF":403.8758952152,
      "CZK":29.5962311737,
      "GBP":1.0,
      "RON":5.670251493,
      "SEK":12.4872112113,
      "IDR":17741.190606486,
      "INR":92.5961835875,
      "BRL":5.6254015085,
      "RUB":81.7923338647,
      "HRK":8.871850389,
      "JPY":142.7942611054,
      "THB":40.3863998668,
      "CHF":1.2663874943,
      "EUR":1.1896547622,
      "MYR":5.3701015965,
      "BGN":2.3267267839,
      "TRY":7.8425610888,
      "CNY":9.0500606724,
      "NOK":11.9494872588,
      "NZD":2.0024268957,
      "ZAR":19.1850864879,
      "USD":1.2983892075,
      "MXN":24.1825882129,
      "SGD":1.7989959314,
      "AUD":1.9242665778,
      "ILS":4.4403863999,
      "KRW":1532.2753336982,
      "PLN":5.062456875
   },
   "base":"GBP",
   "date":"2020-02-12"
}

5. Initialize ConvertedRates string variable – In this step we’re just creating an empty string variable that we’ll add to when we loop through the Array we created in step 2 to convert our values

6. For each requested Rate – We’re going to loop through each of the Rates in the Array we created in Step 2 to convert them

6A. Check that there is a value returned for the requested Rate – this condition is to allow for some error handling in the event that the rate we've requested in our Array isn't returned in the results from our Custom Connector. The expression we're using is:

body('Retrieve_Latest_Rates')?['rates']?[items('For_Each_Requested_Rate')['Currency Code']]

6B. Append No Conversion Available to String – If the condition above returns a negative result then we’ll use an Append to String Variable action to append a note that there is No Conversion Available for the requested rate. This ensures consistency of results

6C. Append Converted Value to String – if the condition at step 6A returns a positive result then we want to multiply the requested value from our trigger by the exchange rate returned, then format the results in the local currency. The expression we use to do this is:

formatnumber(mul(triggerBody()['number'], float(body('Retrieve_Latest_Rates')?['rates']?[items('For_Each_Requested_Rate')['Currency Code']])), 'C', items('For_Each_Requested_Rate')['Locale'])

Let's break this down from the inside out:

float(
body('Retrieve_Latest_Rates')?['rates']?[items('For_Each_Requested_Rate')['Currency Code']])

For this part of the expression, we’re interrogating the JSON that is returned from our custom connector in Step 4 above to find an exchange rate. We’re using the Currency Code for the current item in our Array to find the related Exchange Rate. We then need to convert this string to a Float value using the Float expression so we can use it in our multiplication

mul(triggerBody()['number'],[FLOAT_VALUE_FROM_ABOVE])

Now that we’ve returned a Float value, we’re going to use a mul expression to multiply the number we input in our trigger by this value.

formatnumber([MULTIPLICATION_RESULT], 'C', items('For_Each_Requested_Rate')['Locale'])

The final step is to take the value that we’ve returned from our multiplication and format it in a manner that’s representative of the currency we’re displaying. The formatNumber expression allows us to specify that the number we’re working with is a Currency, and the Locale that we retrieve from the array specifies exactly what currency the value is.

7. Respond to a PowerApp or Flow – for this demo I’m assuming we’d want to return the string of converted values to a PowerApp or Flow, but of course you could use this output for any appropriate purpose

The output of the Flow for an input of 10000 GBP looks like:

US Dollar: $12,983.89
British Pound Sterling: £10,000.00
Japanese Yen: ¥1,427,943
Euro: 11 896,55 €
Swedish Krona: 124 872,11 kr
Australian Dollar: $19,242.67
Canadian Dollar: $17,211.93
New Zealand Dollar: $20,024.27
South African Rand: R191 850,86
Brazilian Real: R$ 56.254,02

Conclusion

This is a relatively simple Flow that shows how easy it is to configure a Custom Connector and use an open API in the context of a Flow. Converting values to other currencies could be useful in a number of scenarios, in particular if you are using multiple currencies in your Dynamics 365 environment where keeping the exchange rates up to date is vitally important.

See the Flow in action below: