## Calculating ISO 8601 Week Number for Dates in Power Automate

An issue that many users have faced with Power Automate is the inability to find the ISO 8601 week number for a given date. This was raised as an idea in the Power Apps Ideas forum back in 2017, and you can find numerous questions about it on the Power Automate forums, but many of the answers I’ve seen are incomplete or overly complex.

If I were using Excel, I could find the week number for a date using the ISOWEEKNUM function, so it’s a bit frustrating that I can’t do it as easily in Power Automate. I’m not the kind of person to back down from a challenge though, so I decided to see if I could figure out a simple solution to this problem.

## The Solution

There are a couple of things to be aware of when trying to work out the ISO Week Number:

1. An ISO week-numbering year will have either 52 or 53 full weeks
2. The ISO weeks always start on a Monday
3. The first ISO week of a year is the week that contains the first Thursday of that Gregorian calendar year

The Wikipedia article on ISO Week Dates has some great information if you’re not familiar with it.

Now that we know the key elements of ISO Week Dates, we can get our thinking caps on. I went through a lot of trial and error to work this out, but what I’ve ended up with below is a solution that I’ve tested successfully against over 1,000 randomly generated dates and it has been 100% correct.

1. Manually trigger a flow – to demonstrate the Flow I’m just using a manual trigger with a Date input

2. Compose Start of Week (Monday) – as we know that ISO weeks always start on a Monday, we want to work out the Monday of the week our trigger date is in. Power Automate numbers days of the week beginning from Sunday, where Sunday is 0, Monday is 1, etc. with the dayofweek expression.

To find the Monday, we subtract (the dayofweek value of our trigger date − 1) days from the trigger date. As we know a Sunday will return 0, we also add in an IF condition to set the number of days to subtract to 6 if the day is a Sunday.

To give an example of this:

Date = 2020-04-29

Dayofweek for Date = 3

3 – 1 = 2

Monday = 2020-04-29 minus 2 days = 2020-04-27

The expression is constructed as follows:

```
subtractFromTime(triggerBody()['date'], if(equals(dayofweek(triggerBody()['date']),0),6,sub(dayofweek(triggerBody()['date']),1)), 'Day')
```

3. Compose Thursday – now that we have the Monday value, we want to find out the value for the Thursday in the week of our triggering date. We do this as the first ISO week of the year will always have a Thursday in it, and therefore it’s best to set our calculations from this day. Thursday is always 3 days after Monday, so we just add 3 days to the Monday value with the expression below:

```
addDays(outputs('Start_of_Week_(Monday)'),3)
```

4. Compose ISO Week Number – to calculate the week number, we need to first find out which day of the year the Thursday falls on, and we can do that with the dayofyear expression. You would think that you could just divide the value by 7 and get the week number, but it’s not that simple.

In 2020, the first Thursday was 2nd January, which is Day 2 of the year. The integer value of 2/7 is 0 (in integer maths everything after the decimal point is discarded). ISO week numbers start at 1, so 0 is an invalid value. You might think we could just add 1 to this value, but that doesn’t work either.

The first Thursday of 2016 was January 7th (the 7th day of the year). The integer value of 7/7 is 1, and if we added 1 to this it would make the first week of the year Week 2, which obviously isn’t correct.

The simplest way to avoid this is to add 6 days to the DayofYear value for our Thursday, then divide the result by 7 to get our integer value.

To see this in action, and continuing our example from above:

Monday = 2020-04-27

Thursday = Monday+3 = 2020-04-30

DayofYear Thursday = 121

121 + 6 = 127

127/7 = 18.142

Integer value of 18.142 = 18

We can therefore say that the date 2020-04-29 is in Week 18 of 2020

The expression is written as:

```
div(add(dayofyear(outputs('Thursday')),6),7)
```
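
To see the whole algorithm end to end, here is a minimal Python translation of the three compose steps above (the function name is mine), which can be checked against Python’s built-in ISO calendar:

```python
from datetime import date, timedelta

def iso_week_number(d: date) -> int:
    # Power Automate's dayofweek(): Sunday = 0 ... Saturday = 6
    dow = (d.weekday() + 1) % 7
    # Step 2: Monday of the current week (subtract 6 on Sundays, dow - 1 otherwise)
    monday = d - timedelta(days=6 if dow == 0 else dow - 1)
    # Step 3: Thursday is always three days after Monday
    thursday = monday + timedelta(days=3)
    # Step 4: integer-divide (day-of-year + 6) by 7
    return (thursday.timetuple().tm_yday + 6) // 7
```

One subtlety worth noting: the ISO year a week belongs to is the year of that Thursday, which can differ from the trigger date’s calendar year around New Year.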

5. Respond to a PowerApp or Flow – the final step in this Flow is to set a response action with 2 outputs:

1. Input date – the value that we input to the Flow
2. Week Number – we will set this as follows:
```
Week @{outputs('ISO_Week_Number')} of @{formatdatetime(triggerBody()['date'],'yyyy')}
```

## Conclusion

With this Flow I wanted to create a simple solution to calculate the ISO Week Number for any given date, without lots of nested conditions and weird workarounds. I presented it above as a set of 3 compose actions which is easier for debugging and writing the formula, but you could also have the whole expression in a single compose action as below:

```
div(add(dayofyear(addDays(subtractFromTime(triggerBody()['date'], if(equals(dayofweek(triggerBody()['date']),0),6,sub(dayofweek(triggerBody()['date']),1)), 'Day'),3)),6),7)
```

## Dynamics 365 Opportunity Revenue Forecasting – Creating Monthly Forecasts

In my last post I explored how to create a structure and a triggering flow to create Opportunity Revenue Forecasts. In this post I’ll delve into the child flow to create a set of Monthly Revenue Forecasts.

## The Solution

1. Manually trigger a flow – as I explained in the last post, when we trigger this child flow we need the following elements from the triggering flow:

1. Loops – we use the output from the DateDiff calculated field on the opportunity entity, and we’ll use this to work out how many Revenue Forecast records we will be creating
2. Estimated Close Date
3. Project End Date
4. Revenue Per Day
5. OpportunityID – The GUID for our triggering record, we’ll use this to set the Opportunity lookup field on the Revenue Forecast records we will create
6. Estimated Revenue

Steps 2-9 – Initialize variables – the next 8 steps in the Flow are to initialize empty variables that we’ll then be setting in each Loop of our Flow. I’ve provided an outline of the Variables below:

10. Do Until – The real bones of the logic of this Flow occurs within the Do Until control step. We set the Do Until to run until the value in the loopIndex variable we created above is equal to the value from the Loops input value for the Flow.

Note: the standard limits for a Do Until control are a count of 60 or a timeout of 1 hour. If you expect to create more loops then you may wish to increase these limits (though note there is an upper limit of a count of 5000 loops or a timeout of 30 days – https://docs.microsoft.com/en-us/power-automate/limits-and-config#run-duration-and-retention)

10A. Increment loopIndex – the first step in the Do Until loop is to use an Increment Variable action to increment the loopIndex variable by 1.

10B. Check if First Loop – the next step in the Do Until loop is a set of nested conditions that first checks if we’re in the First Loop (i.e. loopIndex = 1), if not it then checks if we’re in the Final Loop (i.e. loopIndex = Loops value from trigger), otherwise it carries out the intermediate Loop steps.
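
As a rough sketch, the control flow of the Do Until and its nested conditions looks like this in Python (the function and labels are mine):

```python
def classify_loops(loops: int) -> list:
    # Mirror the Do Until: increment loopIndex, then branch First/Final/Intermediate
    loop_index = 0
    branches = []
    while loop_index < loops:      # Do Until loopIndex equals the Loops input
        loop_index += 1            # 10A: Increment loopIndex
        if loop_index == 1:        # 10B: is this the First Loop?
            branches.append("first")
        elif loop_index == loops:  # is this the Final Loop?
            branches.append("final")
        else:
            branches.append("intermediate")
    return branches
```

Note that when there is only a single loop, the First Loop branch wins, which is why the single-loop case needs special handling (see the note further down).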

In the loops we set the variables we initialised above as follows:

I’ll run through the steps in the loops below, and call out where there are differences in the First and Final loops as identified in the table above

10A. set startOfMonth – in the first loop we set this to Estimated Close Date from the trigger inputs. In the intermediate and final loops we use an adddays expression to add one day to the endOfMonth variable to set this to the first day of the new month. The expression we use is:

```
adddays(variables('endOfMonth'), 1)
```

10B. Compose startOfMonthTicks – in order to calculate the number of days in the Forecast period we need to convert the start date and end date to their ticks representation. For this step we use the following expression:

```
ticks(variables('startofMonth'))
```

10C. set endOfMonth – in the Final Loop we set this to the Project End Date from the trigger inputs, and in the First and Intermediate loops we set it to the last day of the Month using the following expression:

```
addDays(startOfMonth(addToTime(variables('startOfMonth'), 1, 'Month')), -1)
```

This expression works as follows:

1. Add one Month to the date from the startOfMonth variable
2. Use the startOfMonth expression to get the first day of this month
3. Add -1 days (i.e. subtract one day) to the resulting date to get the end date of the previous month

An example of this in action:

startOfMonth = 2020-04-21

addToTime(2020-04-21, 1, 'Month') = 2020-05-21

startOfMonth(2020-05-21) = 2020-05-01

addDays(2020-05-01, -1) = 2020-04-30
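
The same three-step logic can be sketched in Python (the function name is mine):

```python
from datetime import date, timedelta

def end_of_month(start: date) -> date:
    # addToTime(..., 1, 'Month') then startOfMonth(...) is equivalent to
    # taking the first day of the following month...
    first_of_next = date(start.year + start.month // 12,
                         start.month % 12 + 1, 1)
    # ...then addDays(..., -1) steps back to the last day of the current month
    return first_of_next - timedelta(days=1)
```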

10D. Compose endOfMonthTicks – as at Step 10B, we need the ticks representation of the endOfMonth date, so we use the following expression:

```
ticks(variables('endOfMonth'))
```

10E – Set daysInMonth – to calculate the number of full days in the month we subtract the ticks value of the startOfMonth from the ticks value of the endOfMonth, then divide that value by 864000000000 (the number of ticks in a day), then add 1 to the value to give us the inclusive number of full days, using the following expression:

```
add(div(sub(outputs('Compose_endOfMonthTicks_-_First_Loop'), outputs('Compose_startofMonthTicks_-_First_Loop')), 864000000000),1)
```
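
To see the ticks arithmetic concretely, here is a Python sketch (ticks in Power Automate, as in .NET, are 100-nanosecond intervals counted from 0001-01-01; the function names are mine):

```python
from datetime import datetime

TICKS_PER_DAY = 864_000_000_000  # 24 * 60 * 60 * 10,000,000

def ticks(dt: datetime) -> int:
    # The ticks() expression: 100-ns intervals since 0001-01-01T00:00:00
    delta = dt - datetime(1, 1, 1)
    return (delta.days * 86_400_000_000 + delta.seconds * 1_000_000
            + delta.microseconds) * 10

def days_in_period(start: datetime, end: datetime) -> int:
    # add(div(sub(endTicks, startTicks), 864000000000), 1)
    return (ticks(end) - ticks(start)) // TICKS_PER_DAY + 1
```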

10F – Set workingDaysInMonth – As we calculated the total number of full days in the forecast period in the previous step, we’ll set the workingDaysInMonth variable to this value, and we’ll subtract from it in the next step

10G. WorkingDays Calculation – I covered how we calculate the number of working days from the number of full days in my previous post, so you can read about that by clicking here. I’m using the exact same logic in each loop in this Flow

10H. Set monthRevenue – to set the revenue amount for the Forecast period we multiply the number of Working Days we calculated above by the Revenue Per Day from the input trigger with the following expression:

```
mul(triggerBody()['number_1'], variables('workingDaysInMonth'))
```

10I. Set cumulativeAmount – in the First Loop we set the cumulativeAmount to the monthRevenue value from above, whilst in the Intermediate and Final loops we increment the cumulativeAmount by the monthRevenue value

10J. Compose monthTitle – in order to ensure consistency of naming for the Revenue Forecast records we use a formatDateTime expression to create the title for the record:

```
formatDateTime(variables('startofMonth'), 'MMMM yyyy')
```

10K. Create Forecast – we use a Create Record action from the Common Data Service (Current Environment) connector to create the Forecast, and input the variables as follows:

10L. Set previousForecast – the last step in the loop is to set the previousForecast variable to the GUID of the record we created in the previous step. This is used to the set the Previous Forecast field on the Forecast we create in the next loop, which maintains the hierarchical link between the records. This step is obviously not required in the Final loop.

Note: if you wanted your flow to be more efficient you could remove the compose steps at 10B, 10D and 10J and just use the expressions directly where the values from those steps are used in 10E and 10K, respectively. I left the compose steps in for debugging and ease of understanding

The overall Loop section of the Flow looks like this:

Note: if you have a situation where you only have a single loop (i.e. the Estimated Close Date and the Project End Date are in the same month), then you could change the expression in the endOfMonth calculation to `if(greater(addDays(startOfMonth(addToTime(variables('startOfMonth'), 1, 'Month')), -1),triggerBody()['date_1']),triggerBody()['date_1'],addDays(startOfMonth(addToTime(variables('startOfMonth'), 1, 'Month')), -1))`

11. Respond to a PowerApp or Flow – the final step in any child flow is to respond to the calling Flow. I haven’t set any outputs in this response action, though of course you may choose to if you wish.

## Conclusion

Whilst this Flow may look complicated at first, I think it is actually relatively simple in reality. There are 7 main variables in the Flow (plus the loopIndex), and the nested conditions loop through these variables to set them with the expressions to create the Forecasts.

As this relies on a Do Until loop, it is not going to be instantaneous, so this is something to take into consideration if you were designing this for deployment in a production environment. In my testing, creating up to 100 Revenue Forecast records with this Flow typically took somewhere between 5-10 minutes. I’m sure this speed could be improved if you chose to code this in a plugin, but I wanted to demonstrate what could be done within the context of a Flow.

## Dynamics 365 Opportunity Revenue Forecasting with Power Automate – Part 1

A common complaint I hear about Opportunity pipeline management in Dynamics 365 is that it’s difficult to use for revenue/income forecasting, as the system assumes that you will receive the whole Estimated Revenue on the Estimated Close Date, whereas in reality it’s common for the revenue on a Won Opportunity to be split into periodic drawdowns (i.e. the client will be billed weekly, monthly, quarterly or annually).

I have created a solution previously with North52 to break the estimated revenue down over the life of the project, and I wanted to see if it would be possible to do the same with a Power Automate Flow.

Over the next few posts in this series I’ll show how I got this functionality working:

1. The triggering Flow
2. Creating Monthly Forecasts
3. Creating Yearly Forecasts
4. Creating Weekly Forecasts
5. Creating Quarterly Forecasts

## The Scenario

In my organisation we wanted to be able to get more detail into our revenue forecasting pipeline to increase the accuracy of it. The requirement was that we should be able to put in an estimated project end date as well as an opportunity Estimated Close Date; the time period between the Estimated Close Date and the Project End Date would be the project lifecycle.

From this time period we wanted to be able to break the Estimated Revenue down into Weekly, Monthly, Quarterly or Yearly revenue forecasts, and these forecasts should be based on the number of Working Days in the time period.

## The Setup

In order to set up your environment to create the revenue forecasts you need to add the following:

1. Opportunity Entity

1. Add “Breakdown Type” option set field with the following options:
1. Weekly
2. Monthly
3. Quarterly
4. Yearly
2. Add “Project End Date” date only field
3. Add “DateDiff” whole number calculated field (see calculation below)

2. New “Revenue Forecast” Entity

Create a new entity in your environment called “Revenue Forecast” with the following fields:

1. Name – a single line of text field for the name of the Forecast
2. Amount – a currency field to hold the amount for the Forecast
3. Cumulative Amount – a currency field
4. Start Date – a date-only datetime field
5. End Date – a date-only datetime field
6. Opportunity – a lookup field to the Opportunity entity
7. Previous Forecast – a hierarchical lookup field to the Revenue Forecast Entity

## The Triggering Flow Solution

1. When Opportunity is Updated – The trigger uses the “When a Record is Created, Updated or Deleted” trigger from the Common Data Service (Current Environment) Connector. In this instance we’re using the Update trigger condition, and the filtering attributes that I used are the Estimated Revenue, Estimated Close Date, Breakdown Type and Project End Date fields.

2. List related Forecasts – the first step in the Flow is using a List Records action to retrieve any existing Revenue Forecasts for the Opportunity. As we’ll be generating new ones any time we change any of the filtering attributes, we need to remove the existing ones. We set the filter query using the following OData query:

```
_rm365_opportunity_value eq @{triggerOutputs()?['body/opportunityid']}
```

3. Apply to each Returned Forecast – we use an Apply to Each control to loop through the records returned from the List Records step above, and then we use a Delete Record action to delete the existing Revenue Forecast record

4. Compose fullDays – now that we’re starting from a clean slate we need to first calculate how many total days there are between the Estimated Close Date and the Project End Date. To do this we use a compose step with the following expression:

```
add(div(sub(ticks(triggerOutputs()?['body/rm365_projectenddate']),ticks(triggerOutputs()?['body/estimatedclosedate'])),864000000000),1)
```

In my previous post regarding calculating working days I discussed the fact that we can’t do a simple “DateDiff” calculation in a Flow; instead we have to convert the dates to their ticks representation, subtract the start date’s ticks from the end date’s, and then divide the result by 864000000000 to convert it to a number of days. Finally we add 1 to this result to give us the total number of whole days between the start date and end date, inclusive of the end date.

5. Initialize workDays – next we initialise an Integer variable called workDays and set it to the output from the previous step. We’ll subtract from this number to calculate the total number of working days in the time period.

### 6. WorkingDays Calculation

We use a Scope control to keep all of the decrement steps for calculating the number of working days together, and to keep the Flow tidy.

I discussed calculating working days in this previous post, however I’ve optimised the solution to make it more accurate.

6A. Decrement workDays by 2 for each Full Week – we know that in any given 7 day period there will be two weekend days, so we need to remove these from the total number of days returned above. The way we do this is to divide the total number of days value by 7, then multiply it by 2, and then subtract this result from the total number of days. Note that in the full days calculation above we had added one to the result, so we’re subtracting that from this value before we do the division.

```
mul(div(sub(outputs('Compose_fullDays'),1), 7), 2)
```

6B. Decrement workDays by 2 if EndDate DayofWeek is less than StartDate DayofWeek – Next we’re checking if the Project End Date is on an earlier Day of Week than the Estimated Close Date. (i.e. it starts on a Friday but ends on a Wednesday). In this case, we’ll have to account for an extra weekend so we remove it here.

```
if(less(sub(dayofweek(triggerOutputs()?['body/rm365_projectenddate']),dayofweek(triggerOutputs()?['body/estimatedclosedate'])),0),2,0)
```

6C. Decrement workDays by 1 if StartDate and EndDate are on the same weekend DayofWeek – If both the Estimated Close Date and Project End Date are on a Saturday or Sunday we need to subtract 1 from our working days calculation.

```
if(and(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),dayofweek(triggerOutputs()?['body/rm365_projectenddate'])),or(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),6),equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),0))),1,0)
```

6D. Decrement workDays by 1 if StartDate is Sun and EndDate is any other day – If the Estimated Close Date is a Sunday and the Project End Date is any other day then we need to subtract 1 from our working days calculation

```
if(and(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),0),greater(sub(dayofweek(triggerOutputs()?['body/rm365_projectenddate']),dayofweek(triggerOutputs()?['body/estimatedclosedate'])),0)),1,0)
```

6E. Decrement workDays by 1 if EndDate is Saturday and StartDate is any other day – If the Project End Date is on a Saturday and the Estimated Close Date is any other day then we need to subtract one from our working days calculation.

```
if(and(not(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),6)),equals(dayofweek(triggerOutputs()?['body/rm365_projectenddate']),6)),1,0)
```
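
Putting steps 6A–6E together, the whole working-days calculation can be sketched in Python and verified against a brute-force count of weekdays (the function name is mine):

```python
from datetime import date, timedelta

def working_days(start: date, end: date) -> int:
    # Power Automate's dayofweek(): Sunday = 0 ... Saturday = 6
    dow_s = (start.weekday() + 1) % 7
    dow_e = (end.weekday() + 1) % 7
    days = (end - start).days + 1           # step 4: inclusive full days
    work = days
    work -= ((days - 1) // 7) * 2           # 6A: two weekend days per full week
    if dow_e - dow_s < 0:                   # 6B: wrapped past an extra weekend
        work -= 2
    if dow_s == dow_e and dow_s in (0, 6):  # 6C: both ends on the same weekend day
        work -= 1
    if dow_s == 0 and dow_e - dow_s > 0:    # 6D: starts on Sunday
        work -= 1
    if dow_e == 6 and dow_s != 6:           # 6E: ends on Saturday
        work -= 1
    return work
```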

I used an Excel workbook to verify these expressions against the NETWORKDAYS function. If you’d like a copy of the spreadsheet to do your own testing then download it by clicking here.

7. Compose revenuePerDay – now that we’ve calculated how many working days there are in our time period, the last step is to divide the Estimated Revenue by the amount of working days to calculate the amount of Revenue we will get for each working day in our project.

```
div(triggerOutputs()?['body/estimatedvalue'],variables('workDays'))
```

8. Switch on Breakdown Type – we use a Switch control to trigger one of the child flows we’ll create in the upcoming blog posts for creating Weekly/Monthly/Quarterly/Yearly breakdowns.

The inputs for the Child Flows ask for:

1. Loops – we use the output from our DateDiff calculated field, and we’ll use this to work out how many Revenue Forecast records we will be creating
2. Estimated Close Date
3. Project End Date
4. Revenue Per Day
5. OpportunityID – The GUID for our triggering record, we’ll use this to set the Opportunity lookup field on the Revenue Forecast records we will create
6. Estimated Revenue

## Conclusion

This post provides the groundwork for the revenue forecasting solution, and is useful for accurately calculating the number of working days in any given time period. I think it showcases some relatively complex expressions, so at the very least I hope it demonstrates how you might be able to use the expressions to carry out complicated functions.

In the next post in this series I’ll show how to create Monthly Revenue Forecasts for an Opportunity. If you’ve got any questions in the meantime please reach out.

## Selecting a specific SharePoint Document Library to Upload Dynamics 365 Email Attachments using Teams Adaptive Cards

In my recent series of posts I covered how to implement a better integration between Dynamics 365 and SharePoint. One of the common scenarios you will encounter if you’ve integrated the systems is the requirement to move Email Attachments from Dynamics 365 to SharePoint. In my implementation a lot of my sites have more than one Document Library, and Users want to be able to select a specific library for the attachments on each email, rather than having them all go to one standard location and then having to be moved after they’ve been uploaded to SharePoint.

## The Solution

The solution I developed to enable this was to use Teams Adaptive Cards. I have created a Flow that is triggered when a User tracks an email against an Account; the Flow sends the User an Adaptive Card in Teams which has a list of the available Document Libraries on the SharePoint site related to the Account, and asks the User to select which one they want to upload the files to. After a Document Library is selected the Flow then moves the attachments to SharePoint. I’ll run through the details below:

1. When a Tracked Email is Created – the first step is to define the trigger, and for this I’m using the Common Data Service (Current Environment) connector, and the “When a Record is Created, Updated or Deleted” trigger. In this instance I used the Filter Expression below to ensure the Flow is only triggered on emails that have attachments, and that are Set Regarding an item

```
_regardingobjectid_value ne null and attachmentcount gt 0
```

2. Get User who Created Email – next I use the Get Record action to retrieve the User record for the User who tracked the email into the system. One thing to note here is that I’m using the value in the “Owner” field rather than the “Created By” field. This is because when an email is tracked on creation it is created by the SYSTEM account and then ownership is assigned to the tracking User. We get the User details so that we can retrieve their email address for use later in the Flow.

3. Get Regarding Account – we use another Get Record action to retrieve the details of the Account against whom the email is tracked. Note that in your environment if you have multiple entities enabled for email integration then you may wish to implement a Switch action here so you can retrieve the details of any record that you might be tracking emails against; for the purposes of this demo I’ve kept it simple. We need to retrieve the Account record so we can use some of the values in our adaptive card later in the Flow.

4. List Document Locations related to Account – next we want to retrieve all of the related Document Locations for the Account as each Document Location represents a SharePoint Document Library, so we’ll use a List Records action with the following filter query:

```
_regardingobjectid_value eq @{triggerOutputs()?['body/_regardingobjectid_value']}
```

We’ll loop through these Document Locations to create the ChoiceSet for our adaptive card in the next steps.

5. Initialize Choices Array variable – we need to use an “Initialize Variable” action to create an empty array that we can use when we loop through the Document Locations we returned above.

6. Apply to Each Document Location – we’ll use an Apply to Each control to loop through each Document Location we returned in Step 4

6A. Append to Choices Array – for each Document Location we’re going to use an Append to Array Variable to create a new element in the Choices Array we created above. This array will be used in the Adaptive Card we create later in the Flow. The array value we use is:

```
{
  "value": "@{items('Apply_to_each_Document_Location')?['sharepointdocumentlocationid']}/@{items('Apply_to_each_Document_Location')?['name']}",
  "title": "@{items('Apply_to_each_Document_Location')?['name']}"
}
```

One of the things you may notice is that for the “Value” in the array element I’ve constructed it as [GUID_OF_DOCUMENT_LOCATION]/[DOCUMENT_LOCATION_NAME]. The reason for this is that the choice a User makes will return that value to the Flow and I need both the GUID and the Name later in the Flow, so I concatenate them just now and then I’ll split them after they get returned.

A sample of the output from this step is below:

```
{
  "value": "6127202a-b958-ea11-a811-000d3a86ba0b/Communication Info",
  "title": "Communication Info"
}
```
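
The loop in steps 5–6A amounts to building an array of value/title pairs; a quick Python equivalent (using the sample Document Location shown above as hypothetical List Records output):

```python
# Hypothetical output from the List Records step: one dict per Document Location
locations = [
    {"sharepointdocumentlocationid": "6127202a-b958-ea11-a811-000d3a86ba0b",
     "name": "Communication Info"},
]

choices = []  # the Choices array variable
for loc in locations:  # Apply to Each Document Location
    # Append to Array Variable: concatenate GUID and name so both survive the round trip
    choices.append({
        "value": f"{loc['sharepointdocumentlocationid']}/{loc['name']}",
        "title": loc["name"],
    })
```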

7. Ask User which Document Library for Uploads – this is the key action in the Flow. We’re using the “Post an Adaptive Card to a Teams user and wait for a response” action. The adaptive cards are created with JSON, and fortunately there are some great samples available at https://adaptivecards.io/samples/. I also found the “Ultimate Guide” written by Tomasz Poszytek to be incredibly helpful.

The card I designed looks like this:

The JSON I used to create my card is:

```
{
  "version": "1.0",
  "body": [
    {
      "type": "TextBlock",
      "size": "Medium",
      "weight": "Bolder"
    },
    {
      "type": "ColumnSet",
      "columns": [
        {
          "type": "Column",
          "items": [
            {
              "type": "TextBlock",
              "weight": "Bolder",
              "text": "@{triggerOutputs()?['body/subject']}",
              "wrap": true
            },
            {
              "type": "TextBlock",
              "spacing": "None",
              "text": "Created @{formatDateTime(utcnow(), 'dd MMMM yyyy HH:mm')}",
              "isSubtle": true,
              "wrap": true
            }
          ],
          "width": "stretch"
        }
      ]
    },
    {
      "type": "TextBlock",
      "text": "You tracked an email against @{outputs('Get_regarding_Account')?['body/name']}.  Please select a Document Library from the List below to upload the attached documents to",
      "wrap": true
    },
    {
      "type": "FactSet",
      "facts": [
        {
          "title": "Account:",
          "value": "@{outputs('Get_regarding_Account')?['body/name']}"
        },
        {
          "title": "SharePoint Site:",
          "value": "@{outputs('Get_regarding_Account')?['body/rm365_spsiteurl']}"
        },
        {
          "title": "Subject:",
          "value": "@{triggerOutputs()?['body/subject']}"
        },
        {
          "title": "Sent To:",
          "value": "@{triggerOutputs()?['body/torecipients']}"
        },
        {
          "title": "Sent From:",
          "value": "@{triggerOutputs()?['body/sender']}"
        }
      ]
    },
    {
      "type": "Input.ChoiceSet",
      "id": "CompactSelectVal",
      "value": "1",
      "choices": @{variables('choices')}
    }
  ],
  "actions": [
    {
      "type": "Action.Submit",
      "title": "Submit"
    }
  ]
}
```

As you can see, in the `choices` property of the Input.ChoiceSet element I’m using the Choices variable to set the ChoiceSet for the User.

I also use the User’s Primary Email address that I got in Step 2 to specify who I will be sending the adaptive card to:

The output from the Adaptive card is JSON that looks like:

```
{
  "responseTime": "2020-04-02T15:38:10.0668583Z",
  "responder": {
    "objectId": "[GUID]",
    "tenantId": "[GUID]",
    "displayName": "Ryan Maclean"
  },
  "submitActionId": "Submit",
  "data": {
    "CompactSelectVal": "9a819bbc-4d56-ea11-a811-000d3a86ba0b/Contracts"
  }
}
```
```

We’ll be using the information returned in the Data section in the next two steps.

8. Compose DocLib Name – for this step we’ll use a Compose action with a substring expression to extract the Document Library name that we concatenated with the GUID above. The GUID is a 36-character string, so we can set the startindex of our substring to 36; note that the output keeps the “/” separator at the start, which serves as the root-relative folder path we’ll use later. The expression we use is:

```
substring(outputs('Ask_User_which_Document_Library_for_Uploads')?['body']?['data']?['CompactSelectVal'],36)
```

9. Compose DocLib GUID – in this step we’re doing the exact same as the step above, but in this case we’ll set the startindex of our substring to 0 and the length to 36:

```
substring(outputs('Ask_User_which_Document_Library_for_Uploads')?['body']?['data']?['CompactSelectVal'],0,36)
```
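
Using the sample response value from above, the two substring expressions behave like Python slices:

```python
value = "9a819bbc-4d56-ea11-a811-000d3a86ba0b/Contracts"

# substring(value, 0, 36): the first 36 characters are the Document Location GUID
doclib_guid = value[0:36]

# substring(value, 36): everything from index 36, including the "/" separator
doclib_path = value[36:]
```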

10. Get AbsoluteURL for SharePoint Document Library – in this step we’re going to be using the HTTP with Azure AD connector and the Invoke an HTTP request action to query the Web API with the RetrieveAbsoluteAndSiteCollectionUrl Function so we can return the Site Collection URL for our SharePoint site and get the absolute URL for the specific Document Location the User selected. All credit for this goes to Thomas Peschat. His blog covers the full detail of this step, so I’d recommend reading it for more information.

In the Url of the request I’m using the GUID that I retrieved as an output from Step 9 above to specify the document library I want the URLs from.

11. Compose AbsoluteURL – in this step we’re using a Compose action to extract the AbsoluteURL of the Document library from the output of the step above

12. Compose SiteCollectionUrl – as above, we’re using a Compose action to extract the SiteCollectionUrl from the output of Step 10

13. List Email Attachments – in order to get the Attachment details from D365 we need to do a List Records action on the activitymimeattachment entity. This won’t appear in the dropdown list so you’ll need to enter it as a custom value. Our filter query will be:

```
_objectid_value eq @{triggerOutputs()?['body/activityid']}
```

14. Apply to each Attachment – we’ll use an Apply to Each control to loop through each Attachment we returned in the previous step

14A. Create file in selected SharePoint Document Library – for each Attachment from our D365 email we’ll create a file in the SharePoint Document Library using a Create File action. The key elements are:

1. Site Address – we use the SiteCollectionUrl output from the compose step 12 above
2. Folder Path – we use the DocLibName output from the compose step 8 above
3. File Name – we use the File Name from the attachment
4. File Content – the file content needs to be in binary format, but in D365 it is stored in base64 format, so we need to use a base64ToBinary expression

You may also wish to add a final step to your flow to delete the attachments from Dynamics 365 after they’ve been uploaded to SharePoint.
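
The File Content conversion in item 4 above is a straightforward base64 decode; in Python terms (the sample bytes are made up for illustration):

```python
import base64

# D365 stores the attachment body as a base64-encoded string
body_from_d365 = base64.b64encode(b"sample attachment bytes").decode()

# base64ToBinary(...) yields the raw bytes the SharePoint Create File action needs
file_content = base64.b64decode(body_from_d365)
```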

## Conclusion

This is my first attempt at using Teams Adaptive Cards and I’m really impressed with the flexibility they offer. At the moment they wait for up to 30 days for an input from a User; I’d like to be able to change that, and also to offer a default resolution if no option is selected.

I think this approach offers a good degree of flexibility to Users to enable the migration of email attachments from D365 to SharePoint, and I believe the use of Adaptive Cards will help with adoption. I’m also experimenting with Outlook actionable messages to offer similar functionality.

I’d love to hear your feedback on whether you think this approach could work for you, so please feel free to reach out to me on social media or drop a comment below!

## Synchronising Azure Active Directory Security Groups between D365 Records and SharePoint Sites

In my previous post I discussed how to synchronise permissions between D365 Access Teams and a related SharePoint site. This works really well if you have a system locked down and only want to grant access to records to specific named users, but what happens if you want to allow all Users to Read all records, unless they need to be restricted?

In my organisation, our default position is to enable every User to Read and Update all Accounts unless we have a specific need to restrict access (e.g. in a situation where there is a conflict of interest, or particular client requirements). We also need to ensure these permissions settings are replicated in SharePoint.

We do this by utilising an Azure Active Directory Security Group which we add all relevant Users to. We then Share/Unshare records in Dynamics 365 as necessary, and add/remove the AAD Group to the SharePoint Site Members Permissions Group simultaneously.

## The Setup

In order to enable this functionality the first thing we do is ensure Users only have User-level Read and Write privileges on Accounts.

Next, we’re going to add a custom field that we’ll use for our trigger. In my case I’ve added an Option Set field called SharePoint Permissions. You’ll note from the screenshot below that there are only two options. The reason I’ve used an Option Set instead of a Two-Option field is that we may need to change the options in future, and an Option Set gives us the flexibility to include additional options if needed.

As we want to synchronise an Azure Active Directory Security Group, we need to make one in the Azure Portal (or you can ask your friendly neighbourhood IT Team to do it for you, and to provide you with the Object Id).

The final step in the setup is to create a Team in Dynamics 365, then set the “Team Type” to AAD Security Group, and set the “Azure Object Id for a group” to the Object Id from Azure.

## The Solution

1. When SharePoint Permissions field is updated – we’re using the Common Data Service (Current Environment) connector, and the “When a Record is Created, Updated or Deleted” trigger. We will set the Trigger Condition to Update, and then we’ll set the Filtering Attribute to our “SharePoint Permissions” field we created.

2. Initialize integrationGroupName – Next we’ll create a String Variable to hold the name of the Azure Active Directory Group we created earlier.

3. Initialize membersGroupID – we need to initialize another string Variable to hold the members Group ID from SharePoint. In this step we’ll leave the value blank, as we’ll set it later in the Flow.

4. Get all Groups – we’ll use the “Send HTTP Request to SharePoint” action to retrieve a list of the Permissions Groups from the SharePoint site related to the D365 record. For the Site Address I’m using the custom field I created in this post. Of course you could also retrieve the Absolute URL from the related SharePoint Site record. Once you have the site URL it’s a simple call to the API to retrieve all the existing groups.

The output from this step will be a JSON, and I’ve included a sample extract below:

```{
  "d": {
    "results": [
      {
        "__metadata": {
          "id": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)",
          "uri": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)",
          "type": "SP.Group"
        },
        "Owner": {
          "__deferred": {
            "uri": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)/Owner"
          }
        },
        "Users": {
          "__deferred": {
            "uri": "https://<YOURSITE>.sharepoint.com/sites/ACC-01001-2020/_api/Web/SiteGroups/GetById(5)/Users"
          }
        },
        "Id": 5,
        "IsHiddenInUI": false,
        "LoginName": "A. Datum Corporation (sample) Members",
        "Title": "A. Datum Corporation (sample) Members",
        "PrincipalType": 8,
        "AllowMembersEditMembership": true,
        "AllowRequestToJoinLeave": false,
        "AutoAcceptRequestToJoinLeave": false,
        "Description": null,
        "OnlyAllowMembersViewMembership": false,
        "OwnerTitle": "A. Datum Corporation (sample) Owners",
        "RequestToJoinLeaveEmailSetting": ""
      }
    ]
  }
}
```

5. Parse Groups JSON – in order to be able to use the JSON that we’ve returned above we need to parse it using a Parse JSON action, as this will allow us to use the output in the next step

6. Loop through results to find Members Group – For this part we’ll use an Apply to Each control to loop through each of the Groups we returned in Step 4 to find the ID of the Members group

6A. Check if the Group Title is the Members Group – we’ll use a Condition control to check if the Title of the current Group ends with “Members” because the Members group is the one we want to be able to add/remove the AD group to/from

6B. If Yes, Set membersGroupID Variable – if the result of the condition at 6A is True then we’ll set the membersGroupID variable to the Group ID of the current Group
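
The loop in steps 6–6B can be sketched in Python; the group list below is a made-up sample shaped like the parsed response from Step 4:

```python
# Made-up groups shaped like the parsed "Get all Groups" output
groups = [
    {"Id": 3, "Title": "A. Datum Corporation (sample) Owners"},
    {"Id": 4, "Title": "A. Datum Corporation (sample) Visitors"},
    {"Id": 5, "Title": "A. Datum Corporation (sample) Members"},
]

members_group_id = None
for group in groups:                        # step 6: Apply to Each
    if group["Title"].endswith("Members"):  # step 6A: Condition
        members_group_id = group["Id"]      # step 6B: Set variable

print(members_group_id)  # 5
```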

7. Check if Permissions is set to All Users – next we’ll use a Condition control to check if the SharePoint Permissions field is set to “All Users”. For this step we need to put in the Option Set Value that corresponds to the option we want to check against

If the answer to the question above is Yes, then we’ll follow steps 7A and 7B

7A. Add Integration Group to Members Group – for this step we’ll be using the “Send HTTP Request to SharePoint” action again, and this time we’ll be using a POST method to add the Active Directory Group to the SharePoint Members Group.

As in Step 4, we’ll be using the SharePoint Site URL from the custom field on our account, and we’ll be using the following Uri:

```_api/web/SiteGroups(<GroupID>)/users
```

As we retrieved the Group ID in Step 6, we can use the membersGroupID variable for the Group ID parameter in the Uri.

In the Body of the request we’ll use the following format:

```{
  "__metadata": {
    "type": "SP.User"
  },
  "LoginName": "c:0t.c|tenant|<GroupObjectId>"
}
```

One of the key things to note here is that the LoginName parameter has to have the prefix “c:0t.c|tenant|”. The GUID that completes the LoginName is the Group Object Id from Azure Active Directory:
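
Building the claims-encoded LoginName amounts to concatenating the fixed prefix and the GUID; a Python sketch (the Object Id here is a placeholder, not a real group):

```python
# Placeholder AAD Group Object Id -- substitute your own group's GUID
group_object_id = "e65d6729-e162-ea11-a812-000d3a86ba0b"

# SharePoint claims-encoded login name for an AAD security group:
# the fixed prefix "c:0t.c|tenant|" followed by the Object Id
login_name = "c:0t.c|tenant|" + group_object_id
print(login_name)  # c:0t.c|tenant|e65d6729-e162-ea11-a812-000d3a86ba0b
```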

7B. Share Account with All Users Team – as we’re sharing the SharePoint site with all Users in our organisation, we also want to share the Account record in Dynamics 365 with all Users. We can do that by using the “Perform an Unbound Action” action from the Common Data Service (Current Environment) connector. There is an unbound action called GrantAccess that we can use. This action has two parameters:

1. Target – for this we use a reference to the Account just as we would if we were setting a lookup field, and we will retrieve the Account ID from our trigger.

```/accounts(@{triggerOutputs()?['body/accountid']})
```

2. PrincipalAccess – the PrincipalAccess parameter requires JSON in the following format:

```{
  "Principal": {
    "teamid": "E65D6729-E162-EA11-A812-000D3A86BA0B",
    "@{string('@')}odata.type": "Microsoft.Dynamics.CRM.team"
  },
  "AccessMask": "ReadAccess,WriteAccess"
}
```

There are a couple of key things to note here:

1. For the TeamID, I’ve hardcoded the GUID in this instance, but you could retrieve the GUID dynamically if you wish.
2. For the @odata.type: parameter we’ve escaped the “@” symbol by putting it into a string expression. You could also escape it by formatting it as @@, but in my experience reopening the action for edit caused it to revert back to just @, and then the action failed.
3. The AccessMask specifies the rights you’re granting to the User/Team you’re sharing with. The full list of privileges you can grant is “ReadAccess, WriteAccess, AppendAccess, AppendToAccess, CreateAccess, DeleteAccess, ShareAccess, AssignAccess”
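
As a sketch, the PrincipalAccess payload from 7B could be assembled like this in Python (the Team GUID is the hardcoded example from above, and the AccessMask values shown are an illustrative assumption; outside Power Automate no escaping of the @ symbol is needed):

```python
import json

team_id = "E65D6729-E162-EA11-A812-000D3A86BA0B"  # hardcoded Team GUID from above

principal_access = {
    "Principal": {
        "teamid": team_id,
        "@odata.type": "Microsoft.Dynamics.CRM.team",  # plain @ outside Power Automate
    },
    # Comma-separate multiple privileges to grant more than one right
    "AccessMask": "ReadAccess,WriteAccess",
}

payload = json.dumps(principal_access)
print(payload)
```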

If the answer to the condition in Step 7 is False then we’ll follow steps 7C-7F:

7C. Get all Members Group Users – we’ll use the “Send HTTP Request to SharePoint” action to get a list of all the Users from the Members Group, again using the membersGroupID variable from above. The Uri is formatted as:

```_api/Web/SiteGroups/GetById(@{variables('membersGroupID')})/users?$select=Id,Title
```

Note that we have a $select parameter to specify that we want to retrieve the User ID and Title.

7D. Parse returned Users – next we’ll use a Parse JSON action to parse the JSON that’s returned by the previous action so we can use it in the next step. The schema for this step is:

```{
  "type": "object",
  "properties": {
    "d": {
      "type": "object",
      "properties": {
        "results": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "__metadata": {
                "type": "object",
                "properties": {
                  "id": { "type": "string" },
                  "uri": { "type": "string" },
                  "type": { "type": "string" }
                }
              },
              "Id": {
                "type": "integer"
              },
              "Title": {
                "type": "string"
              }
            },
            "required": [
              "Id",
              "Title"
            ]
          }
        }
      }
    }
  }
}
```

7E. Check for the Integration Group – we use an Apply to Each control to loop through each of the returned Users to find the AD Group so we can remove it from the SharePoint Group

7E1. If Current User equals Integration Group – first we use a Condition control to check if the current User’s Title is equal to the integrationGroupName variable we set in Step 2

7E2. Remove Integration Group from Site Members – we use a final “Send HTTP Request to SharePoint” action to remove the AD Group from the SharePoint Group. The Uri for this action is:

```_api/Web/SiteGroups/GetById(@{variables('membersGroupID')})/users/removeById(@{items('Check_for_the_Integration_Group')?['Id']})
```

As you can see, we’re using the membersGroupID variable to specify the Group we’ll be removing the AD Group from, then we’ll use the current User’s ID from the loop to specify that this is the User we want to remove
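
Steps 7C–7E2 boil down to finding the AD Group among the group’s users and removing it by ID; a Python sketch with made-up users and group names:

```python
# Made-up users returned from the Members group (steps 7C/7D)
sp_users = [
    {"Id": 12, "Title": "Jane Doe"},
    {"Id": 18, "Title": "D365 Integration Group"},  # the AD Group to remove
]
integration_group_name = "D365 Integration Group"   # set in Step 2
members_group_id = 5                                # found in Step 6

removal_uri = None
for user in sp_users:                               # step 7E: Apply to Each
    if user["Title"] == integration_group_name:     # step 7E1: Condition
        # step 7E2: build the removal call for this user's Id
        removal_uri = (
            f"_api/Web/SiteGroups/GetById({members_group_id})"
            f"/users/removeById({user['Id']})"
        )

print(removal_uri)  # _api/Web/SiteGroups/GetById(5)/users/removeById(18)
```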

7F. Unshare Account with All Users Team – the final step in the Flow is to use another Unbound Action to RevokeAccess to the AllUsers team. The RevokeAccess action has two parameters:

1. Target – as in Step 7B we set this as we would a lookup field, using the following format:

```/accounts(@{triggerOutputs()?['body/accountid']})
```

2. Revokee – this is another parameter that we can set as we would a lookup field. In this case we’re revoking access for a team, so we use the following format:

```/teams(E65D6729-E162-EA11-A812-000D3A86BA0B)
```

## Conclusion

Hopefully you’ll find this Flow useful. One of the things I like about it is that it’s using the same Azure Active Directory Group in both Dynamics 365 and SharePoint, so you don’t have to worry about trying to maintain and synchronise team memberships in two systems, it can all be centrally controlled and managed in Azure.

Let me know your thoughts on this by reaching out to me on Twitter or LinkedIn, or drop a comment below.

## Synchronising Permissions Between Dynamics 365 and SharePoint Using Power Automate

This is the third post in my series discussing a better integration between Dynamics 365 and SharePoint. If you’d like to read the previous two posts, check them out by clicking the links below:

In this post I’m going to demonstrate how to synchronise a Dynamics 365 Access Team with a related SharePoint site using Power Automate. One of the key frustrations I’ve experienced with the default integration between D365 and SP is that there is no reciprocity in permissions management between the two systems; working in the financial services industry we need to ensure that only those who should have access to a record and the related documents stored on SharePoint will have access, and we wanted to avoid having to make this synchronisation manual where possible.

## The Setup

For this scenario you will need to have enabled the SharePoint integration and the custom site structure that I talked about in Part 1 of this series. Whilst you can secure items in SharePoint at item/folder level this is not recommended; SharePoint best practice typically recommends dealing with security at Site level.

For the entities that you are going to be synchronising with SharePoint you will need to enable them for Access Teams and set up an appropriate Access Team Template.

We will also need to add two new fields to the entities that we’re going to be doing the permissions synchronisation for:

1. Sync with SharePoint – a two-option field we’ll use to trigger our Flow
2. Last Team Sync – a datetime field we’ll use to let Users know when the D365 team was last synced with SP

Now we’ve completed the setup, let’s get on with making the Flow!

## The Solution

1. When Account Sync with SharePoint equals Yes – for the trigger we’re using the Common Data Service (Current Environment) connector, and the “When a Record is Created, Updated or Deleted” trigger. We will set the Trigger Condition to Update, and then we’ll set the Filtering Attribute to our “Sync with SharePoint” two-option field we’ve created. As we only want the Flow to trigger when that field is set to Yes, we will set the filter expression to ensure the two-option field equals true. This avoids triggering redundant Flow runs and saves us from implementing guard conditions inside the Flow.

2. Get Related Access Team – next we’ll use a List Records action to retrieve the Access Team related to our entity. This is straightforward to do because Access Teams that are system generated automatically set the RegardingObjectID to the related record, so we can use this in our Fetch XML query to find the team we want. In my demo I only have one access team template for my entity, if you had two then you could add additional parameters to your query to ensure you return the right team.

In this case I’m going to be using a link-entity clause so I can find a team where the regardingobjectid equals the Account from my trigger.

The Fetch XML query I used is:

```<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="team">
    <attribute name="name" />
    <attribute name="teamid" />
    <order attribute="name" descending="false" />
    <link-entity name="account" from="accountid" to="regardingobjectid" link-type="inner">
      <filter type="and">
        <condition attribute="accountid" operator="eq" value="@{triggerOutputs()?['body/accountid']}" />
      </filter>
    </link-entity>
  </entity>
</fetch>
```

3. Compose TeamID – as we’re only returning one result in our list records step above, we don’t want to add a redundant “Apply to Each” loop to find the attribute we want; instead we’ll use a First expression to query the returned JSON for the attribute. The expression I’ve used is:

```first(outputs('Get_Related_Access_Team')?['body/value'])?['teamid']
```
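
The first() expression is doing the same job as indexing into the returned value array; a Python equivalent with a made-up response body:

```python
# Made-up List Records response: a "value" array holding one team
response_body = {"value": [{"name": "Account Access Team", "teamid": "t-123"}]}

# Equivalent of first(...)?['teamid']: take the first row without a loop,
# returning None if the array is empty (mirroring the ? operator)
rows = response_body.get("value") or []
team_id = rows[0].get("teamid") if rows else None
print(team_id)  # t-123
```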

4. List CRM Team Members – now that we’ve got the Team ID set, we can interrogate the TeamMembership entity. If you’re not aware of it, the TeamMembership entity is an intersect entity between the SystemUser and Team entities. It will not appear in the list of available entities, so you will have to manually input teammemberships into the Entity Name field.

We can use another Fetch XML query to find the related Users and, importantly, their email addresses, which we will use to add them to SharePoint later in the Flow. In the Fetch XML we’ll use the Team ID from the step above, and we’ll use a link-entity to retrieve the Users’ email addresses.

The Fetch XML query I used is:

```<fetch>
  <entity name="teammembership">
    <attribute name="teammembershipid" />
    <attribute name="teamid" />
    <attribute name="systemuserid" />
    <filter>
      <condition attribute="teamid" operator="eq" value="@{outputs('Compose_TeamID')}" />
    </filter>
    <link-entity name="systemuser" from="systemuserid" to="systemuserid" link-type="inner">
      <attribute name="internalemailaddress" />
    </link-entity>
  </entity>
</fetch>
```

The output of this step will produce JSON that will look something like below. As you can see, we’ve selected the internalemailaddress attribute from the SystemUser entity by using a link-entity in the Fetch XML query, and it is returned in the JSON as an attribute called systemuser1.internalemailaddress; this is important for the next step

5. Select CRM Users Email Addresses – in this step I’m using a Select action to create an array of the returned Users’ email addresses from the previous step. I do this because it gets rid of any additional “noise” from the returned JSON, which will make it easier to debug later. In the Select action I set the Key to “Email Address” and then in the Value I use the following expression:

```@item()?['systemuser1.internalemailaddress']
```
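
The Select action is effectively a per-row projection; sketched in Python with made-up rows (note the aliased link-entity column name):

```python
# Made-up rows from the List CRM Team Members step, including the
# aliased link-entity attribute systemuser1.internalemailaddress
records = [
    {"teamid": "t-1", "systemuser1.internalemailaddress": "alice@contoso.com"},
    {"teamid": "t-1", "systemuser1.internalemailaddress": "bob@contoso.com"},
]

# Equivalent of the Select action: map each row to {"Email Address": ...}
emails = [{"Email Address": r["systemuser1.internalemailaddress"]} for r in records]
print(emails)
```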

6. Get SharePoint Team Members – for this step I’m going to be using the “Send HTTP Request to SharePoint” action. The Send HTTP request to SharePoint action lets you use the REST API, so we’re going to retrieve a list of the Users, and we’ll use a $select parameter to specify that we want the Users’ email addresses. For the Site Address I’m using the value from the custom SP Site URL Field I created in Part 1 of this series.

The Uri we’re using is:

```_api/Web/SiteGroups/GetById(5)/users?$select=Email
```

You’ll see above that I’ve specified the Group ID as 5, which is typically the members group (though Al Eardley has informed me this may not always be the case). If you have a specific group you would like to sync your Users to then you could retrieve all the Groups from the SP site, then loop through them to find the correct one and get the ID for that one to use in the step above.

7. Compose SPTeamMembers – next we’ll use another Compose step to convert the output from the previous step into a String object, as we need it to be a string for the purposes of the following steps. I’m also using a toLower expression to ensure consistency of results.

The expression I used is:

```toLower(string(outputs('Get_SharePoint_Team_Members')?['body']))
```

8. For Each CRM User – in this step we’ll be using an Apply to Each to loop through the outputs from the Select action in Step 5

8A. Compose Email Address – the first step is to use a Compose action with a String expression to convert the array object to a string so we can use it in the next step. As in Step 7, I’m also using a toLower expression to ensure consistency of results. The expression is:

```toLower(string(items('For_Each_CRM_User')?['Email Address']))
```

8B. Check if they are in the SharePoint Team – we’ll use a Condition control to check if the current User’s email address appears in the list of Users’ email addresses we’ve extracted from SharePoint using a contains expression:

```contains(outputs('Compose_SPTeamMembers'),outputs('Compose_EmailAddress'))
```

If the answer to this question is Yes we’ll do nothing (as they’re already in the SharePoint group); if the answer is No then we’ll add them to the SharePoint group in the next step
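
The toLower-then-contains pattern in steps 7, 8A and 8B amounts to a case-insensitive substring test; a Python sketch with made-up data:

```python
# Made-up SharePoint response flattened to a lower-cased string (step 7)
sp_team_members = str({"results": [{"Email": "Alice@Contoso.com"}]}).lower()

# Step 8A: lower-case the current CRM user's address
crm_email = "ALICE@contoso.com".lower()

# Step 8B: the contains() check -- a plain substring test on the string
already_in_group = crm_email in sp_team_members
print(already_in_group)  # True
```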

8C. Add User to SharePoint Team – we’ll use another “Send HTTP Request to SharePoint” action to add a User to the SharePoint Group. In this instance we’re going to be using the POST method rather than the GET method. The Uri is:

```_api/web/SiteGroups(5)/users
```

The body of our request is:

```{
  "__metadata": {
    "type": "SP.User"
  },
  "LoginName": "i:0#.f|membership|<UserEmailAddress>"
}
```

Please note that the prefix “i:0#.f|membership|” is required for the login name, as this is part of how SharePoint handles claims based authentication

9. Parse SPTeamMembers – next we’ll use a Parse JSON action on the outputs from Step 6 above, and we’ll use the output of the Parse JSON step in step 10 below to loop through each returned user in the SharePoint Group. The schema for the Parse JSON action is:

```{
  "type": "object",
  "properties": {
    "d": {
      "type": "object",
      "properties": {
        "results": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "__metadata": {
                "type": "object",
                "properties": {
                  "id": { "type": "string" },
                  "uri": { "type": "string" },
                  "type": { "type": "string" }
                }
              },
              "Email": {
                "type": "string"
              }
            }
          }
        }
      }
    }
  }
}
```

10. For each SP User – We’ll use an Apply to Each control to loop through the output results from the previous step

10A. Check if they are in the CRM Team – Similarly to step 8B, we’re going to use a Contains expression to check if the current SharePoint User’s email address is in the list of returned Users email addresses from Step 5

```contains(string(body('Select_CRM_Users_Email_Addresses')),string(items('For_Each_SP_User')?['Email']))
```

If the answer to this question is Yes (the User is in both teams), then we’ll do nothing. If the answer to this question is No (the User is in the SharePoint group but is not in the D365 team) then we’ll run through the steps to remove them from the SharePoint group

10B. Get User ID – we’ll use another “Send HTTP Request to SharePoint” action to retrieve the ID of the User we want to remove from the SharePoint Group. We are using the GET method, and the following Uri:

```_api/web/SiteUsers/getByEmail('@{items('For_Each_SP_User')?['Email']}')
```

We don’t need any headers or body parameters for this request

The output from this action will return JSON containing the User’s details; the User’s ID is under the path ['d']['id'], so we’ll use this in the next step

10C. Remove User from SP Group – we’re going to use one more “Send HTTP Request to SharePoint” action to remove the User from the group. We’ll be using a POST method to perform the action, and the Uri for the action is:

```_api/Web/SiteGroups(5)/users/removebyid(@{body('Get_User_Id')['d']['id']})
```

As you can see, we’ve specified the Site Group number we want to remove the User from, and then taken the ID from the step above to specify the User we want to remove.
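
The reverse check in steps 10–10C can be summarised as: any SharePoint user whose email isn’t in the CRM list gets removed. A Python sketch with made-up data:

```python
# Made-up email list from Step 5 and SP users from Step 9
crm_emails = ["alice@contoso.com", "bob@contoso.com"]
sp_users = [
    {"Id": 7, "Email": "alice@contoso.com"},
    {"Id": 9, "Email": "carol@contoso.com"},  # in SP but not in the D365 team
]

# Steps 10A-10C: users missing from the CRM list are removed by Id
ids_to_remove = [u["Id"] for u in sp_users if u["Email"] not in crm_emails]
print(ids_to_remove)  # [9]
```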

11. Update Account – the final step in the Flow is to use an Update Record step to set the “Last Sync Date” field to the current date using a utcNow expression

We will also set the “Sync with SharePoint” field back to No so it is ready for the next time the Team needs to be synced

As we set the filter expression in the trigger to only fire the Flow when this field equals Yes, changing the value back to No will not trigger a new Flow

## Conclusion

This has hopefully given you an overview of how relatively straightforward it could be to synchronise security between Dynamics 365 records and an associated SharePoint site. There are a couple of things you may wish to consider alongside this:

1. If you’re going to have multiple entities enabled for Document Management, you may wish to put the logic for synchronising permissions in a Child Flow that could then be called by any entity
2. As discussed above, good practice would dictate that you should always dynamically retrieve the specific SharePoint Group ID you want to synchronise permissions with, rather than hardcoding a value
3. You may wish to set this to run on a schedule so permissions are synced automatically every X hours/days/weeks, etc.
4. As of the date of publication, Flows cannot be triggered on N:N Associate/Dissociate actions in Dynamics 365, but you could use a trigger when a new record is created/deleted in the teammembership entity, or alternatively you could use a solution like North52 to create a custom trigger. I am personally more in favour of not using this approach for two reasons:
1. Power Automate flows are currently asynchronous only, so this means you could potentially run into issues with concurrency if you are adding/removing multiple people at the same time from teams, which would lead to Dynamics 365 and SharePoint being out of sync
2. If you have lots of teams, and the membership changes frequently, then you could trigger a lot of Flow runs, compared to this method which catches all changes in a single run

If you do find this useful I’d love to hear from you, please reach out to me on social media or drop a comment below to give me your thoughts!