Calculate Quarter for a Date in Power Automate

A common request in my company is for dates to be expressed in relation to the Quarter of the year in which they fall. Using Power Automate there is no native way of calculating this, so in this post I’m demonstrating two different methods to calculate the Quarter from a given input date:

  1. Nested IF Conditions
  2. Filtered Array

Nested IF Conditions

1. Trigger Flow – for the purposes of this demo I’m using a manual trigger with a date input, but of course it will work with any date so the logic can be implemented within a bigger flow

2. Calculate Month Number – the next step is to use a FormatDateTime function to convert the input date to the Month Number. We use the following expression to achieve this

formatdatetime(triggerBody()['date'], 'MM')

3. Calculate Quarter Number – the final step is to use the calculated Month number to work out which Quarter it is in. This is achieved with a set of nested IF conditions. The expression to use for this is:

if(
	or(
		equals(outputs('MonthNum'), '01'), 
		equals(outputs('MonthNum'), '02'), 
		equals(outputs('MonthNum'), '03')
		), 
	'Q1', 
	if(
		or(
			equals(outputs('MonthNum'), '04'), 
			equals(outputs('MonthNum'), '05'), 
			equals(outputs('MonthNum'), '06')
		), 
		'Q2', 
		if(
			or(
				equals(outputs('MonthNum'), '07'), 
				equals(outputs('MonthNum'), '08'), 
				equals(outputs('MonthNum'), '09')
			), 
			'Q3', 
			if(
				or(
					equals(outputs('MonthNum'), '10'), 
					equals(outputs('MonthNum'), '11'), 
					equals(outputs('MonthNum'), '12')
				), 
				'Q4', 
				'N/A'
			)
		)
	)
)
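As a cross-check on the logic, the same month-to-quarter mapping can be sketched in Python (the function name here is illustrative, not part of the Flow):

```python
def quarter_from_month(month_num: str) -> str:
    """Map a zero-padded month number ('01'-'12') to its quarter label."""
    month = int(month_num)
    if 1 <= month <= 12:
        # Months 1-3 -> Q1, 4-6 -> Q2, 7-9 -> Q3, 10-12 -> Q4
        return f"Q{(month - 1) // 3 + 1}"
    return "N/A"

print(quarter_from_month("04"))  # Q2
```

The integer-division trick replaces the nested IFs entirely, which is why the Power Automate version feels so verbose by comparison.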

Filtered Array

1. Trigger Flow – as in the previous method, I’m using a manual trigger for the Flow with a Date input

2. Calculate Month Number – as in the previous method, the first action is to use a FormatDateTime expression to calculate Month Number from the input date

3. Compose Quarter Array – next we compose an Array that contains the Month Numbers and their related Quarter Number. The Array I used is:

[
  {
    "Month": "01",
    "Quarter": "Q1"
  },
  {
    "Month": "02",
    "Quarter": "Q1"
  },
  {
    "Month": "03",
    "Quarter": "Q1"
  },
  {
    "Month": "04",
    "Quarter": "Q2"
  },
  {
    "Month": "05",
    "Quarter": "Q2"
  },
  {
    "Month": "06",
    "Quarter": "Q2"
  },
  {
    "Month": "07",
    "Quarter": "Q3"
  },
  {
    "Month": "08",
    "Quarter": "Q3"
  },
  {
    "Month": "09",
    "Quarter": "Q3"
  },
  {
    "Month": "10",
    "Quarter": "Q4"
  },
  {
    "Month": "11",
    "Quarter": "Q4"
  },
  {
    "Month": "12",
    "Quarter": "Q4"
  }
]

4. Filter Quarter Array – next we use a Filter Array action to find the Array element where the Month Number is equal to the Month Number we calculated in Step 2. If we edit the Filter Array action in Advanced Mode, the expression we use is:

@equals(item()?['Month'], outputs('MonthNum'))

5. Compose Quarter – the final step in the Flow is to use a compose action to extract the quarter number from the Array element we returned from the previous step. We do that by using the following expression:

body('Filter_Quarter_array')[0]?['Quarter']
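For comparison, here is the same lookup-and-extract pattern sketched in Python (the array is generated programmatically rather than written out longhand as in the Compose step; note that Python dictionary keys are case-sensitive, so 'Month' must match the array exactly):

```python
# Build the same Month/Quarter lookup array as the Compose step
quarter_array = [
    {"Month": f"{m:02d}", "Quarter": f"Q{(m - 1) // 3 + 1}"}
    for m in range(1, 13)
]

month_num = "07"  # stand-in for the MonthNum output
# Filter Array: keep elements whose Month equals the calculated month number
matches = [item for item in quarter_array if item["Month"] == month_num]
# Compose Quarter: take the Quarter value from the first (only) match
quarter = matches[0]["Quarter"]
print(quarter)  # Q3
```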

Conclusion

This post should demonstrate that we can use different methods to achieve the same outcomes in Power Automate.

Change a User’s Business Unit and retain their Security Roles using Power Automate

I saw a tweet recently from Linn Zaw Win asking if it was possible to change a User's Business Unit without losing their Security Roles.

If you’ve been following my blog you might be aware that I posted a solution to do this using North52 last year, but I wondered whether it would be possible to do this with Power Automate now.

The Setup

As we can’t run pre-validation Flows to retrieve the prior value when we change a field in D365, we need to add another lookup to the Business Unit entity from the User entity. I’ve creatively called it “New Business Unit” and we’ll be using this as the trigger for our Flow. In my scenario, if you want to change the Business Unit of a User you select the one you wish to migrate them to by updating this field.

The Solution

1. When “New Business Unit” is updated – the Flow trigger is when the “New Business Unit” field is updated, and we set the filter expression to ensure it isn’t triggered when the field equals null (i.e. if the field is cleared)

2. List Existing Security Roles – next we use a List Records action to retrieve the current list of Security Roles applied to the User. We have to use a FetchXML query to retrieve the list of Roles; when you apply a Security Role to a User it creates a record in the systemuserroles entity, so we can retrieve the list of applied Roles for a given User by querying across this entity with some link-entity parameters. The FetchXML query we use is:

<fetch>
  <entity name="role" >
    <attribute name="name" />
    <attribute name="businessunitid" />
    <attribute name="roleid" />
    <link-entity name="systemuserroles" from="roleid" to="roleid" >
      <link-entity name="systemuser" from="systemuserid" to="systemuserid" >
        <filter>
          <condition attribute="systemuserid" operator="eq" value="@{triggerOutputs()?['body/systemuserid']}" />
        </filter>
      </link-entity>
    </link-entity>
  </entity>
</fetch>

3. Initialize roleFilterCondition – we’re going to use the roles we returned above to find the same roles under the new business unit with another FetchXML query. Before we do that we need to initialize an empty string variable to hold the FetchXML conditions we’ll create

4. Apply to each Existing Role – for each role we returned in step 2 we will append a condition to the roleFilterCondition variable we created in Step 3 in the following format:

<condition attribute="name" operator="eq" value="@{items('Apply_to_each_Existing_Role')?['name']}" />
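Steps 3 and 4 are essentially string concatenation; a rough Python sketch (the role names here are made up for illustration):

```python
# Mimic the Apply to each loop: append one <condition> element per existing role
existing_roles = [{"name": "Salesperson"}, {"name": "Sales Manager"}]  # illustrative data

role_filter_condition = ""
for role in existing_roles:
    role_filter_condition += (
        f'<condition attribute="name" operator="eq" value="{role["name"]}" />\n'
    )

print(role_filter_condition)
```

The accumulated string is then dropped inside the `<filter type="or">` element of the second FetchXML query, exactly as the variable reference does in Step 5.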

5. List Roles for New Business Unit – now that we’ve created our FetchXML conditions we’ll use a List Records step to retrieve the Security Roles for the new business unit that we picked for our User. The FetchXML we use for this query is:

<fetch>
  <entity name="role" >
    <attribute name="name" />
    <attribute name="businessunitid" />
    <attribute name="roleid" />
    <filter>
      <condition attribute="businessunitid" operator="eq" value="@{triggerOutputs()?['body/_rm365_newbusinessunit_value']}" />
      <filter type="or" >
        @{variables('roleFilterCondition')}
      </filter>
    </filter>
  </entity>
</fetch>

6. Update Business Unit – now that we’ve retrieved the new Security Roles we are going to be applying to the User we can change their Business Unit; to do this we use an Update Record action and set the Business Unit to the “New Business Unit” value. Changing the User’s Business Unit will remove their existing security roles.

7. Apply to each New Role – we use another Apply to Each control to iterate through the values we returned in the List Records action in Step 5 above. For each Role we use the Relate Records action. The parameters for the Relate Records action are:

  • Entity Name: Users
  • Item ID: The User value from the trigger
  • Relationship: systemuserroles_association
  • URL: the full resource address for the Security Role

8. Unrelate “New Business Unit” to clear field – the final step in the Flow is to use the Unrelate Records action to clear the “New Business Unit” field so it can be used again in future. The schema is the same as in the Relate Records action, but we’re triggering it on the New Business Unit record

Conclusion

This method will enable you to update Business Units for Users in your organisation and have their Security Roles persist. You could extend this functionality to Teams as well. It is worth noting that this won’t work for Security Roles that are created for specific Business Units, and it may encounter issues if you have duplicate role names, but I think the basic functionality is quite useful. Let me know in the comments below if you think it would be handy for you!

Calculating ISO 8601 Week Number for Dates in Power Automate

An issue that a lot of users have faced with Power Automate is the inability to find out the ISO 8601 Week Number for a given date; this was raised as an idea in the Power Apps Ideas forum back in 2017, and you can find numerous questions about it on the Power Automate forums, but a lot of the answers I’ve seen are incomplete or are overly complex.

If I was using Excel, I could find out the week number for a date using the ISOWEEKNUM function, so it’s a bit frustrating that I can’t do it this easily in Power Automate. I’m not the kind of person to back down from a challenge though, so I decided to see if I could figure out an easy solution to this problem.

The Solution

There are a couple of things to be aware of when trying to work out the ISO Week Number:

  1. An ISO week-numbering year will have either 52 or 53 full weeks
  2. The ISO weeks always start on a Monday
  3. The first ISO week of a year is the week that contains the first Thursday of the Gregorian calendar year

The Wikipedia article on ISO Week Dates has some great information if you’re not familiar with it.

Now that we know the key elements of ISO Week Dates, we can get our thinking caps on. I went through a lot of trial and error to work this out, but what I’ve ended up with below is a solution that I’ve tested successfully against over 1,000 randomly generated dates and it has been 100% correct.


1. Manually trigger a flow – to demonstrate the Flow I’m just using a manual trigger with a Date input

2. Compose Start of Week (Monday) – as we know that ISO weeks always start on a Monday, we want to work out the Monday of the week our trigger date is in. Power Automate numbers days of the week beginning from Sunday, where Sunday is 0, Monday is 1, etc. with the dayofweek expression.

To calculate Monday we subtract 1 from the dayofweek value of our trigger date to get the number of days to subtract. As we know a Sunday will return 0, we also add an IF condition that sets the value to subtract to 6 if the day is a Sunday.

To give an example of this:

Date = 2020-04-29

Dayofweek for Date = 3

3 – 1 = 2

Monday = 2020-04-29 minus 2 days = 2020-04-27

The expression is constructed as follows:

subtractFromTime(triggerBody()['date'], if(equals(dayofweek(triggerBody()['date']),0),6,sub(dayofweek(triggerBody()['date']),1)), 'Day')
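The same logic, sketched in Python (Power Automate's dayofweek() numbers Sunday as 0, while Python's weekday() numbers Monday as 0, so the sketch recreates the Sunday-based value first):

```python
from datetime import date, timedelta

def start_of_week_monday(d: date) -> date:
    """Find the Monday of the week containing d, mirroring the Flow logic."""
    sunday_based = (d.weekday() + 1) % 7   # Sunday -> 0, Monday -> 1, ...
    days_back = 6 if sunday_based == 0 else sunday_based - 1
    return d - timedelta(days=days_back)

print(start_of_week_monday(date(2020, 4, 29)))  # 2020-04-27
```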

3. Compose Thursday – now that we have the Monday value, we want to find out the value for the Thursday in the week of our triggering date. We do this as the first ISO week of the year will always have a Thursday in it, and therefore it’s best to set our calculations from this day. Thursday is always 3 days after Monday, so we just add 3 days to the Monday value with the expression below:

addDays(outputs('Start_of_Week_(Monday)'),3)

4. Compose ISO Week Number – To calculate the week number, we need to first find out what day of the year the Thursday is on, and we can do that with the dayofyear expression. You would think that you could just divide the value by 7 and get the week number, but it's not that simple.

In 2020, the first Thursday was 2nd January, which is Day 2 of the year. The integer value of 2/7 is 0 (in integer maths everything after the decimal point is ignored). ISO Week Numbers start at 1, so 0 is an invalid value. You might think we could just add 1 to this value, but this doesn’t work either.

The first Thursday of 2016 was January 7th (the 7th day of the year). The integer value of 7/7 is 1, and if we added 1 to this it would give 2 as the first week of the year, which obviously isn’t correct.

The simplest way to avoid this is to add 6 days to the DayofYear value for our Thursday, then divide the result by 7 to get our integer value.

To see this in action, and continuing our example from above:

Monday = 2020-04-27

Thursday = Monday+3 = 2020-04-30

DayofYear Thursday = 121

121 + 6 = 127

127/7 = 18.142

Integer value of 18.142 = 18

We can therefore say that the date 2020-04-29 is in Week 18 of 2020

The expression is written as:

div(add(dayofyear(outputs('Thursday')),6),7)

5. Respond to a PowerApp or Flow – the final step in this Flow is to set a response action with 2 outputs:

  1. Input date – the value that we input to the Flow
  2. Week Number – we will set this as follows:
Week @{outputs('ISO_Week_Number')} of @{formatdatetime(triggerBody()['date'],'yyyy')}

Conclusion

With this Flow I wanted to create a simple solution to calculate the ISO Week Number for any given date, without lots of nested conditions and weird workarounds. I presented it above as a set of 3 compose actions which is easier for debugging and writing the formula, but you could also have the whole expression in a single compose action as below:

div(add(dayofyear(addDays(subtractFromTime(triggerBody()['date'], if(equals(dayofweek(triggerBody()['date']),0),6,sub(dayofweek(triggerBody()['date']),1)), 'Day'),3)),6),7)
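The whole method can be mirrored in Python and cross-checked against the standard library's ISO calendar (the function name is illustrative):

```python
from datetime import date, timedelta

def iso_week_number(d: date) -> int:
    """ISO 8601 week number via the Monday/Thursday method used in the Flow."""
    sunday_based = (d.weekday() + 1) % 7           # Power Automate dayofweek()
    days_back = 6 if sunday_based == 0 else sunday_based - 1
    monday = d - timedelta(days=days_back)         # start of the ISO week
    thursday = monday + timedelta(days=3)          # always falls in the ISO year
    return (thursday.timetuple().tm_yday + 6) // 7 # add 6 then integer-divide by 7

# Cross-check against Python's built-in ISO calendar
assert iso_week_number(date(2020, 4, 29)) == date(2020, 4, 29).isocalendar()[1]
```

The add-6-then-divide step works because the first Thursday's day-of-year is always between 1 and 7, so it maps to 1, the next Thursday maps to 2, and so on.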

If you find this useful, then I’d love to hear from you, so please leave a comment below or contact me on Twitter or LinkedIn.

Dynamics 365 Opportunity Revenue Forecasting – Creating Monthly Forecasts

In my last post I explored how to create a structure and a triggering flow to create Opportunity Revenue Forecasts. In this post I’ll delve into the child flow to create a set of Monthly Revenue Forecasts.

The Solution

1. Manually trigger a flow – as I explained in the last post, when we trigger this child flow we need the following elements from the triggering flow:

  1. Loops – we use the output from the DateDiff calculated field on the opportunity entity, and we’ll use this to work out how many Revenue Forecast records we will be creating
  2. Estimated Close Date
  3. Project End Date
  4. Revenue Per Day
  5. OpportunityID – The GUID for our triggering record, we’ll use this to set the Opportunity lookup field on the Revenue Forecast records we will create
  6. Estimated Revenue

Steps 2-9 – Initialize variables – the next 8 steps in the Flow are to initialize empty variables that we’ll then be setting in each Loop of our Flow. I’ve provided an outline of the Variables below:

| Step No. | Name | Type | Notes |
| --- | --- | --- | --- |
| 2 | startOfMonth | String | Used for calculating the first day of the Month |
| 3 | endOfMonth | String | Used for calculating the last day of the Month |
| 4 | daysInMonth | Integer | Used for calculating the total number of full days in a month |
| 5 | workingDaysInMonth | Integer | Used to hold a count of the working days in the month |
| 6 | monthRevenue | Float | Used for calculating the amount of revenue that is forecast for the month |
| 7 | cumulativeAmount | Float | Used for calculating the cumulative amount of revenue to date for the forecast |
| 8 | previousForecast | String | Used to hold the GUID for the forecast created in the previous loop |
| 9 | loopIndex | Integer | Used to hold a count of which loop we are in, for identifying the First and Final loops |

10. Do Until – The real bones of the logic of this Flow occurs within the Do Until control step. We set the Do Until to run until the value in the loopIndex variable we created above is equal to the value from the Loops input value for the Flow.

Note: the standard limits for a Do Until control are a count of 60 or a timeout of 1 hour. If you expect to create more loops then you may wish to increase these limits (though note there is an upper limit of a count of 5000 loops or a timeout of 30 days – https://docs.microsoft.com/en-us/power-automate/limits-and-config#run-duration-and-retention)

10A. Increment loopIndex – the first step in the Do Until loop is to use an Increment Variable action to increment the loopIndex variable by 1.

10B. Check if First Loop – the next step in the Do Until loop is a set of nested conditions that first checks if we’re in the First Loop (i.e. loopIndex = 1), if not it then checks if we’re in the Final Loop (i.e. loopIndex = Loops value from trigger), otherwise it carries out the intermediate Loop steps.

In the loops we set the variables we initialised above as follows:

| Variable | First Loop | Intermediate Loop | Final Loop |
| --- | --- | --- | --- |
| startOfMonth | Estimated Close Date | First Day of Month | First Day of Month |
| endOfMonth | Last Day of Month | Last Day of Month | Project End Date |
| daysInMonth | endOfMonth - startOfMonth + 1 | endOfMonth - startOfMonth + 1 | endOfMonth - startOfMonth + 1 |
| workingDaysInMonth | workingDays Calculation | workingDays Calculation | workingDays Calculation |
| monthRevenue | workingDaysInMonth x Revenue Per Day | workingDaysInMonth x Revenue Per Day | workingDaysInMonth x Revenue Per Day |
| cumulativeAmount | monthRevenue | Increment by monthRevenue | Increment by monthRevenue |
| previousForecast | GUID of created Revenue Forecast | GUID of created Revenue Forecast | N/A |

I’ll run through the steps in the loops below, and call out where there are differences in the First and Final loops as identified in the table above

10A. set startOfMonth – in the first loop we set this to Estimated Close Date from the trigger inputs. In the intermediate and final loops we use an adddays expression to add one day to the endOfMonth variable to set this to the first day of the new month. The expression we use is:

adddays(variables('endOfMonth'), 1)

10B. Compose startOfMonthTicks – in order to calculate the number of days in the Forecast period we need to convert the start date and end date to their ticks representation. For this step we use the following expression:

ticks(variables('startOfMonth'))

10C. set endOfMonth – in the Final Loop we set this to the Project End Date from the trigger inputs, and in the First and Intermediate loops we set it to the last day of the Month using the following expression:

addDays(startOfMonth(addToTime(variables('startOfMonth'), 1, 'Month')), -1)

This expression works as follows:

  1. Add one Month to the date from the startOfMonth variable
  2. Use the startOfMonth expression to get the first day of this month
  3. Add -1 days (i.e. subtract one day) to the resulting date to get the end date of the previous month

An example of this in action would be:

startOfMonth = 2020-04-21

addToTime(variables('startOfMonth'), 1, 'Month') = 2020-05-21

startOfMonth(2020-05-21) = 2020-05-01

addDays(2020-05-01, -1) = 2020-04-30
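The same calculation can be sketched in Python; since datetime has no built-in add-one-month, the sketch jumps safely into the next month and snaps back to its first day, which mirrors the addToTime/startOfMonth/addDays chain:

```python
from datetime import date, timedelta

def end_of_month(start: date) -> date:
    """Last day of start's month: land in the next month, take its first day, step back one."""
    first_of_next = (start.replace(day=1) + timedelta(days=32)).replace(day=1)
    return first_of_next - timedelta(days=1)

print(end_of_month(date(2020, 4, 21)))  # 2020-04-30
```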

10D. Compose endOfMonthTicks – as at Step 10B, we need the ticks representation of the endOfMonth date, so we use the following expression:

ticks(variables('endOfMonth'))

10E – Set daysInMonth – to calculate the number of full days in the month we subtract the ticks value of the startOfMonth from the ticks value of the endOfMonth, then divide that value by 864000000000 (the number of ticks in a day), then add 1 to the value to give us the inclusive number of full days, using the following expression:

add(div(sub(outputs('Compose_endOfMonthTicks_-_First_Loop'), outputs('Compose_startofMonthTicks_-_First_Loop')), 864000000000),1)
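The ticks arithmetic amounts to the following Python sketch, using the date ordinal as a stand-in for ticks() (a tick is 100 nanoseconds, so one day is 864,000,000,000 ticks; differences between two dates cancel out the epoch):

```python
from datetime import date

TICKS_PER_DAY = 864_000_000_000  # 24h * 3600s * 10,000,000 ticks per second

def days_in_period(start: date, end: date) -> int:
    """Inclusive number of full days between two dates, as in step 10E."""
    start_ticks = start.toordinal() * TICKS_PER_DAY  # stand-in for ticks()
    end_ticks = end.toordinal() * TICKS_PER_DAY
    return (end_ticks - start_ticks) // TICKS_PER_DAY + 1

print(days_in_period(date(2020, 4, 1), date(2020, 4, 30)))  # 30
```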

10F – Set workingDaysInMonth – As we calculated the total number of full days in the forecast period in the previous step, we’ll set the workingDaysInMonth variable to this value, and we’ll subtract from it in the next step

10G. WorkingDays Calculation – I covered how we calculate the number of working days from the number of full days in my previous post, and I’m using the exact same logic in each loop in this Flow

10H. Set monthRevenue – to set the revenue amount for the Forecast period we multiply the number of Working Days we calculated above by the Revenue Per Day from the input trigger with the following expression:

mul(triggerBody()['number_1'], variables('workingDaysInMonth'))

10I. Set cumulativeAmount – in the First Loop we set the cumulativeAmount to the monthRevenue value from above, whilst in the Intermediate and Final loops we increment the cumulativeAmount by the monthRevenue value

10J. Compose monthTitle – in order to ensure consistency of naming for the Revenue Forecast records we use a formatDateTime expression to create the title for the record:

formatDateTime(variables('startOfMonth'), 'MMMM yyyy')

10K. Create Forecast – we use a Create Record action from the Common Data Service (Current Environment) connector to create the Forecast, populating the fields with the variables we set in the preceding steps.

10L. Set previousForecast – the last step in the loop is to set the previousForecast variable to the GUID of the record we created in the previous step. This is used to the set the Previous Forecast field on the Forecast we create in the next loop, which maintains the hierarchical link between the records. This step is obviously not required in the Final loop.

Note: if you wanted your flow to be more efficient you could remove the compose steps at 10B, 10D and 10J and just use the expressions directly where the values from those steps are used in 10E and 10K, respectively. I left the compose steps in for debugging and ease of understanding
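Tying the loop together, here is a simplified Python sketch of just the period-splitting logic (the working-days adjustment, revenue maths and record creation are omitted, and the function name is illustrative):

```python
from datetime import date, timedelta

def monthly_periods(close_date: date, project_end: date) -> list:
    """Split [close_date, project_end] into calendar-month periods, as the Do Until loop does."""
    periods = []
    start = close_date  # First Loop: startOfMonth = Estimated Close Date
    while start <= project_end:
        first_of_next = (start.replace(day=1) + timedelta(days=32)).replace(day=1)
        # endOfMonth = last day of month, capped at Project End Date in the Final Loop
        end = min(first_of_next - timedelta(days=1), project_end)
        periods.append((start, end))
        start = end + timedelta(days=1)  # next loop: first day of the new month
    return periods

for start, end in monthly_periods(date(2020, 4, 21), date(2020, 6, 10)):
    print(start, "-", end)
```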


Note: if you have a situation where you only have a single loop (i.e. the Estimated Close Date and the Project End Date are in the same month), then you could change the expression in the endOfMonth calculation to if(greater(addDays(startOfMonth(addToTime(variables('startOfMonth'), 1, 'Month')), -1),triggerBody()['date_1']),triggerBody()['date_1'],addDays(startOfMonth(addToTime(variables('startOfMonth'), 1, 'Month')), -1))

11. Respond to a PowerApp or Flow – the final step in any child flow is to respond to the calling Flow. I haven’t set any outputs in this response action, though of course you may choose to if you wish.

Conclusion

Whilst this Flow may look complicated at first, I think it is actually relatively simple in reality. There are 7 main variables in the Flow (plus the loopIndex), and the nested conditions loop through these variables to set them with the expressions to create the Forecasts.

As this relies on a Do Until loop, it is not going to be instantaneous, so this is something to take into consideration if you were designing this for deployment in a production environment. In my testing, creating up to 100 Revenue Forecast records with this Flow typically took somewhere between 5-10 minutes. I’m sure this speed could be improved if you chose to code this in a plugin, but I wanted to demonstrate what could be done within the context of a Flow.

Dynamics 365 Opportunity Revenue Forecasting with Power Automate – Part 1

A common complaint I hear about Opportunity pipeline management in Dynamics 365 is that it’s difficult to use for revenue/income forecasting, as the system assumes you will receive the whole Estimated Revenue on the Estimated Close Date, whereas in reality the revenue on a Won Opportunity is commonly split into periodic drawdowns (i.e. the client will be billed weekly, monthly, quarterly or annually).

I have created a solution previously with North52 to break the estimated revenue down over the life of the project, and I wanted to see if it would be possible to do the same with a Power Automate Flow.

Over the next few posts in this series I’ll show how I got this functionality working:

  1. The triggering Flow
  2. Creating Monthly Forecasts
  3. Creating Yearly Forecasts
  4. Creating Weekly Forecasts
  5. Creating Quarterly Forecasts

The Scenario

In my organisation we wanted to be able to get more detail into our revenue forecasting pipeline to increase the accuracy of it. The requirement was that we should be able to put in an estimated project end date as well as an opportunity Estimated Close Date; the time period between the Estimated Close Date and the Project End Date would be the project lifecycle.

From this time period we wanted to be able to break the Estimated Revenue down into Weekly, Monthly, Quarterly or Yearly revenue forecasts, and these forecasts should be based on the number of Working Days in the time period.

The Setup

In order to set up your environment to create the revenue forecasts you need to add the following:

1. Opportunity Entity

  1. Add “Breakdown Type” option set field with the following options:
    1. Weekly
    2. Monthly
    3. Quarterly
    4. Yearly
  2. Add “Project End Date” date only field
  3. Add “DateDiff” whole number calculated field (see calculation below)

2. New “Revenue Forecast” Entity

Create a new entity in your environment called “Revenue Forecast” with the following fields:

  1. Name – a single line of text field for the name of the Forecast
  2. Amount – a currency field to hold the amount for the Forecast
  3. Cumulative Amount – a currency field
  4. Start Date – a date-only datetime field
  5. End Date – a date-only datetime field
  6. Opportunity – a lookup field to the Opportunity entity
  7. Previous Forecast – a hierarchical lookup field to the Revenue Forecast Entity

The Triggering Flow Solution

1. When Opportunity is Updated – The trigger uses the “When a Record is Created, Updated or Deleted” trigger from the Common Data Service (Current Environment) Connector. In this instance we’re using the Update trigger condition, and the filtering attributes that I used are the Estimated Revenue, Estimated Close Date, Breakdown Type and Project End Date fields.

2. List related Forecasts – the first step in the Flow is using a List Records action to retrieve any existing Revenue Forecasts for the Opportunity. As we’ll be generating new ones any time we change any of the filtering attributes, we need to remove the existing ones. We set the filter query using the following OData query:

_rm365_opportunity_value eq @{triggerOutputs()?['body/opportunityid']}

3. Apply to each Returned Forecast – we use an Apply to Each control to loop through the records returned from the List Records step above, and then we use a Delete Record action to delete the existing Revenue Forecast record

4. Compose fullDays – now that we’re starting from a clean slate we first need to calculate how many total days there are between the Estimated Close Date and the Project End Date. To do this we use a compose step with the following expression:

add(div(sub(ticks(triggerOutputs()?['body/rm365_projectenddate']),ticks(triggerOutputs()?['body/estimatedclosedate'])),864000000000),1)

In my previous post regarding calculating working days I discussed the fact that we can’t do a simple “DateDiff” calculation in a Flow; instead we have to convert the dates to their ticks representation, subtract the start date from the end date, then divide the result by 864000000000 to convert it to a number of days. Finally we add 1 to this result to give us the total number of whole days between the start date and end date, inclusive of the end date.

5. Initialize workDays – next we initialise an Integer variable called workDays and set it to the output from the previous step. We’ll subtract from this number to calculate the total number of working days in the time period.

6. WorkingDays calculation –

we use a Scope control to keep all of the decrement steps for calculating the amount of working days together and to keep the flow tidy

I discussed calculating working days in this previous post; however, I’ve optimised the solution here to make it more accurate.

6A. Decrement workDays by 2 for each Full Week – we know that in any given 7 day period there will be two weekend days, so we need to remove these from the total number of days returned above. The way we do this is to divide the total number of days value by 7, then multiply it by 2, and then subtract this result from the total number of days. Note that in the full days calculation above we had added one to the result, so we’re subtracting that from this value before we do the division.

mul(div(sub(outputs('Compose_fullDays'),1), 7), 2)

6B. Decrement workDays by 2 if EndDate DayofWeek is less than StartDate DayofWeek – next we check if the Project End Date is on an earlier Day of Week than the Estimated Close Date (i.e. the period starts on a Friday but ends on a Wednesday). In this case we have to account for an extra weekend, so we remove it here.

if(less(sub(dayofweek(triggerOutputs()?['body/rm365_projectenddate']),dayofweek(triggerOutputs()?['body/estimatedclosedate'])),0),2,0)

6C. Decrement workDays by 1 if StartDate and EndDate are on the same weekend DayofWeek – If both the Estimated Close Date and Project End Date are on a Saturday or Sunday we need to subtract 1 from our working days calculation.

if(and(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),dayofweek(triggerOutputs()?['body/rm365_projectenddate'])),or(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),6),equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),0))),1,0)

6D. Decrement workDays by 1 if StartDate is Sun and EndDate is any other day – If the Estimated Close Date is a Sunday and the Project End Date is any other day then we need to subtract 1 from our working days calculation

if(and(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),0),greater(sub(dayofweek(triggerOutputs()?['body/rm365_projectenddate']),dayofweek(triggerOutputs()?['body/estimatedclosedate'])),0)),1,0)

6E. Decrement workDays by 1 if EndDate is Saturday and StartDate is any other day – If the Project End Date is on a Saturday and the Estimated Close Date is any other day then we need to subtract one from our Working Days calculation.

if(and(not(equals(dayofweek(triggerOutputs()?['body/estimatedclosedate']),6)),equals(dayofweek(triggerOutputs()?['body/rm365_projectenddate']),6)),1,0)
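Steps 6A-6E can be mirrored in Python and sanity-checked against a brute-force weekday count (equivalent to NETWORKDAYS with no holiday list):

```python
from datetime import date, timedelta

def working_days(start: date, end: date) -> int:
    """Working days between start and end inclusive, mirroring the Flow's decrements."""
    def dow(d: date) -> int:
        return (d.weekday() + 1) % 7   # Power Automate dayofweek(): Sunday = 0

    total = (end - start).days + 1     # inclusive full days (the fullDays step)
    work = total
    work -= ((total - 1) // 7) * 2     # 6A: two weekend days per full week
    if dow(end) < dow(start):
        work -= 2                      # 6B: period wraps past an extra weekend
    if dow(start) == dow(end) and dow(start) in (0, 6):
        work -= 1                      # 6C: both dates on the same weekend day
    if dow(start) == 0 and dow(end) > 0:
        work -= 1                      # 6D: starts on a Sunday
    if dow(end) == 6 and dow(start) != 6:
        work -= 1                      # 6E: ends on a Saturday
    return work

def brute_force_working_days(start: date, end: date) -> int:
    """Naive day-by-day count of weekdays, for verification."""
    return sum((start + timedelta(days=i)).weekday() < 5
               for i in range((end - start).days + 1))
```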

I used an Excel workbook to verify these expressions against the NETWORKDAYS formula.

7. Compose revenuePerDay – now that we’ve calculated how many working days there are in our time period, the last step is to divide the Estimated Revenue by the amount of working days to calculate the amount of Revenue we will get for each working day in our project.

div(triggerOutputs()?['body/estimatedvalue'],variables('workDays'))

8. Switch on Breakdown Type – we use a Switch control to trigger one of the child flows we’ll create in the upcoming blog posts for creating Weekly/Monthly/Quarterly/Yearly breakdowns.

The inputs for the Child Flows ask for:

  1. Loops – we use the output from our DateDiff calculated field, and we’ll use this to work out how many Revenue Forecast records we will be creating
  2. Estimated Close Date
  3. Project End Date
  4. Revenue Per Day
  5. OpportunityID – The GUID for our triggering record, we’ll use this to set the Opportunity lookup field on the Revenue Forecast records we will create
  6. Estimated Revenue

Conclusion

This post provides the groundwork for the revenue forecasting solution, and is useful for accurately calculating the number of working days in any given time period. I think it showcases some relatively complex expressions, so at the very least I hope it demonstrates how you might be able to use the expressions to carry out complicated functions.

In the next post in this series I’ll show how to create Monthly Revenue Forecasts for an Opportunity. If you’ve got any questions in the meantime please reach out.

Selecting a specific SharePoint Document Library to Upload Dynamics 365 Email Attachments using Teams Adaptive Cards

In my recent series of posts I covered how to implement a better integration between Dynamics 365 and SharePoint. One of the common scenarios you will encounter if you’ve integrated the systems is the requirement to move Email Attachments from Dynamics 365 to SharePoint. In my implementation a lot of my sites have more than one Document Library, and Users want to be able to select a specific library for the attachments on each email, rather than having them all go to one standard location and then having to be moved after they’ve been uploaded to SharePoint.

The Solution

The solution I developed to enable this was to use Teams Adaptive Cards. I have created a Flow that is triggered when a User tracks an email against an Account; the Flow sends the User an Adaptive Card in Teams which has a list of the available Document Libraries on the SharePoint site related to the Account, and asks the User to select which one they want to upload the files to. After a Document Library is selected the Flow then moves the attachments to SharePoint. I’ll run through the details below:

1. When a Tracked Email is Created – the first step is to define the trigger, and for this I’m using the Common Data Service (Current Environment) connector, and the “When a Record is Created, Updated or Deleted” trigger. In this instance I used the Filter Expression below to ensure the Flow is only triggered on emails that have attachments, and that are Set Regarding an item

_regardingobjectid_value ne null and attachmentcount gt 0
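The semantics of this Filter Expression can be sketched in Python (a hypothetical helper, with a dict standing in for the trigger's email payload):

```python
# Sketch of the trigger's Filter Expression: fire only for emails that
# are Set Regarding an item AND have at least one attachment.
def should_trigger(email: dict) -> bool:
    return (email.get("_regardingobjectid_value") is not None
            and email.get("attachmentcount", 0) > 0)

tracked_with_files = {"_regardingobjectid_value": "some-guid", "attachmentcount": 2}
untracked = {"_regardingobjectid_value": None, "attachmentcount": 2}
```

Filtering in the trigger itself, rather than with a Condition action inside the Flow, means non-matching emails never consume a Flow run at all.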

2. Get User who Created Email – next I use the Get Record action to retrieve the User record for the User who tracked the email into the system. One thing to note here is that I'm using the value in the "Owner" field rather than the "Created by" field. This is because when an email is tracked on creation it is created by the SYSTEM account, and ownership is then assigned to the tracking User. We get the User details so that we can retrieve their email address for use later in the Flow

3. Get Regarding Account – we use another Get Record action to retrieve the details of the Account against which the email is tracked. Note that if you have multiple entities enabled for email integration in your environment, you may wish to implement a Switch action here so you can retrieve the details of any record you might be tracking emails against; for the purposes of this demo I've kept it simple. We need to retrieve the Account record so we can use some of its values in our adaptive card later in the Flow.

4. List Document Locations related to Account – next we want to retrieve all of the Document Locations related to the Account, as each Document Location represents a SharePoint Document Library. We'll use a List Records action with the following filter query:

_regardingobjectid_value eq @{triggerOutputs()?['body/_regardingobjectid_value']}

We’ll loop through these Document Locations to create the ChoiceSet for our adaptive card in the next steps.

5. Initialize Choices Array variable – we need to use an “Initialize Variable” action to create an empty array that we can use when we loop through the Document Locations we returned above.


6. Apply to Each Document Location – we’ll use an Apply to Each control to loop through each Document Location we returned in Step 4

6A. Append to Choices Array – for each Document Location we're going to use an Append to Array Variable action to create a new element in the Choices Array we created above. This array will be used in the Adaptive Card we create later in the Flow. The array value we use is:

{
  "value": "@{items('Apply_to_each_Document_Location')?['sharepointdocumentlocationid']}/@{items('Apply_to_each_Document_Location')?['name']}",
  "title": "@{items('Apply_to_each_Document_Location')?['name']}"
}

One of the things you may notice is that for the “Value” in the array element I’ve constructed it as [GUID_OF_DOCUMENT_LOCATION]/[DOCUMENT_LOCATION_NAME]. The reason for this is that the choice a User makes will return that value to the Flow and I need both the GUID and the Name later in the Flow, so I concatenate them just now and then I’ll split them after they get returned.

A sample of the output from this step is below:

{
  "value": "6127202a-b958-ea11-a811-000d3a86ba0b/Communication Info",
  "title": "Communication Info"
}
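The loop in Steps 5–6A, and the GUID/name concatenation it performs, can be sketched in Python (the dicts stand in for the output of the List Records action):

```python
# Sketch of Steps 5-6A: build the Adaptive Card ChoiceSet from the
# Document Locations returned by the List Records action.
document_locations = [
    {"sharepointdocumentlocationid": "6127202a-b958-ea11-a811-000d3a86ba0b",
     "name": "Communication Info"},
    {"sharepointdocumentlocationid": "9a819bbc-4d56-ea11-a811-000d3a86ba0b",
     "name": "Contracts"},
]

choices = []  # the "Initialize Variable" empty array
for loc in document_locations:  # the Apply to Each loop
    choices.append({
        # GUID and name concatenated so both survive the round trip
        "value": f"{loc['sharepointdocumentlocationid']}/{loc['name']}",
        "title": loc["name"],
    })
```

Concatenating with "/" is safe here because a GUID never contains that character, so the value can always be split unambiguously when it comes back from the card.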

7. Ask User which Document Library for Uploads – this is the key action in the Flow. We're using the "Post an Adaptive Card to a Teams user and wait for a response" action. Adaptive cards are created with JSON, and fortunately there are some great samples available at https://adaptivecards.io/samples/. I also found the "Ultimate Guide" written by Tomasz Poszytek to be incredibly helpful.

The card I designed looks like this:

The JSON I used to create my card is:

{
    "type": "AdaptiveCard",
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.0",
    "body": [
        {
            "type": "TextBlock",
            "size": "Medium",
            "weight": "Bolder",
            "text": "Upload Attachments to SharePoint"
        },
        {
            "type": "ColumnSet",
            "columns": [
                {
                    "type": "Column",
                    "items": [
                        {
                            "type": "TextBlock",
                            "weight": "Bolder",
                            "text": "@{triggerOutputs()?['body/subject']}",
                            "wrap": true
                        },
                        {
                            "type": "TextBlock",
                            "spacing": "None",
                            "text": "Created @{formatDateTime(utcnow(), 'dd MMMM yyyy HH:mm')}",
                            "isSubtle": true,
                            "wrap": true
                        }
                    ],
                    "width": "stretch"
                }
            ]
        },
        {
            "type": "TextBlock",
            "text": "You tracked an email against @{outputs('Get_regarding_Account')?['body/name']}.  Please select a Document Library from the List below to upload the attached documents to",
            "wrap": true
        },
        {
            "type": "FactSet",
            "facts": [
                {
                    "title": "Account:",
                    "value": "@{outputs('Get_regarding_Account')?['body/name']}"
                },
                {
                    "title": "SharePoint Site:",
                    "value": "@{outputs('Get_regarding_Account')?['body/rm365_spsiteurl']}"
                },
                {
                    "title": "Subject:",
                    "value": "@{triggerOutputs()?['body/subject']}"
                },
                {
                    "title": "Sent To:",
                    "value": "@{triggerOutputs()?['body/torecipients']}"
                },
                {
                    "title": "Sent From:",
                    "value": "@{triggerOutputs()?['body/sender']}"
                }
            ]
        },
        {
            "type": "Input.ChoiceSet",
            "id": "CompactSelectVal",
            "value": "1",
            "choices": @{variables('choices')}
        }
    ],
    "actions": [
        {
            "type": "Action.Submit",
            "title": "Submit"
        }
    ]
}

As you can see, in the Input.ChoiceSet element at the end of the card body I'm using the Choices Variable to set the ChoiceSet for the User.

I also use the User’s Primary Email address that I got in Step 2 to specify who I will be sending the adaptive card to:

The output from the Adaptive card is JSON that looks like:

{
  "responseTime": "2020-04-02T15:38:10.0668583Z",
  "responder": {
    "objectId": "[GUID]",
    "tenantId": "[GUID]",
    "email": "user@emailaddress.com",
    "userPrincipalName": "user@emailaddress.com",
    "displayName": "Ryan Maclean"
  },
  "submitActionId": "Submit",
  "data": {
    "CompactSelectVal": "9a819bbc-4d56-ea11-a811-000d3a86ba0b/Contracts"
  }
}

We’ll be using the information returned in the Data section in the next two steps.

8. Compose DocLib Name – for this step we'll be using a Compose action with a substring expression to extract the Document Library name that we concatenated with the GUID above. The GUID is a 36-character string, so we can set the startIndex of our substring to 36 (note that this keeps the leading "/" separator; use 37 if you want the bare library name). The expression we use is:

substring(outputs('Ask_User_which_Document_Library_for_Uploads')?['body']?['data']?['CompactSelectVal'],36)

9. Compose DocLib GUID – in this step we're doing exactly the same as in the step above, but this time we'll set the startIndex of our substring to 0 and the length to 36:

substring(outputs('Ask_User_which_Document_Library_for_Uploads')?['body']?['data']?['CompactSelectVal'],0,36)
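Steps 8 and 9 are equivalent to slicing the returned value at the fixed GUID length; a quick Python sketch of the same split:

```python
# Sketch of Steps 8-9: split the ChoiceSet value back into its parts.
# A Dataverse GUID is always 36 characters, so we can slice at a fixed index.
selected = "9a819bbc-4d56-ea11-a811-000d3a86ba0b/Contracts"

doc_lib_guid = selected[0:36]  # substring(value, 0, 36)
doc_lib_name = selected[36:]   # substring(value, 36) - keeps the leading "/"
```

Because the GUID length is fixed, no searching is needed; slicing at index 36 is enough, however long the library name is.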

10. Get AbsoluteURL for SharePoint Document Library – in this step we're going to use the HTTP with Azure AD connector and the Invoke an HTTP request action to query the Web API with the RetrieveAbsoluteAndSiteCollectionUrl function, which returns the Site Collection URL for our SharePoint site and the absolute URL for the specific Document Location the User selected. All credit for this goes to Thomas Peschat. His blog covers the full detail of this step, so I'd recommend reading it for more information.

In the Url of the request I’m using the GUID that I retrieved as an output from Step 9 above to specify the document library I want the URLs from.

11. Compose AbsoluteURL – in this step we’re using a Compose action to extract the AbsoluteURL of the Document library from the output of the step above

12. Compose SiteCollectionUrl – as above, we’re using a Compose action to extract the SiteCollectionUrl from the output of Step 10

13. List Email Attachments – in order to get the Attachment details from D365 we need to do a List Records action on the activitymimeattachment entity. This won’t appear in the dropdown list so you’ll need to enter it as a custom value. Our filter query will be:

_objectid_value eq @{triggerOutputs()?['body/activityid']}

14. Apply to each Attachment – we’ll use an Apply to Each control to loop through each Attachment we returned in the previous step

14A. Create file in selected SharePoint Document Library – for each Attachment from our D365 email we’ll create a file in the SharePoint Document Library using a Create File action. The key elements are:

  1. Site Address – we use the SiteCollectionUrl output from the compose step 12 above
  2. Folder Path – we use the DocLibName output from the compose step 8 above
  3. File Name – we use the File Name from the attachment
  4. File Content – the file content needs to be in binary format, but in D365 it is stored in base64 format, so we need to use a base64ToBinary expression
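The base64-to-binary conversion in item 4 works like this (a Python sketch using the standard library; the sample attachment content is made up):

```python
import base64

# Sketch of item 4: D365 stores attachment content as a base64 string,
# but the SharePoint Create File action needs the raw binary bytes.
attachment_body_b64 = base64.b64encode(b"Hello, SharePoint!").decode()  # sample data

file_content = base64.b64decode(attachment_body_b64)  # base64ToBinary(...)
```

If you skip this conversion the file will still be created in SharePoint, but its content will be the base64 text itself, so the document won't open correctly.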

You may also wish to add a final step to your flow to delete the attachments from Dynamics 365 after they’ve been uploaded to SharePoint.

Conclusion

This is my first attempt at using Teams Adaptive Cards and I'm really impressed with the flexibility they offer. At the moment they wait for up to 30 days for an input from a User; I'd like to be able to change that, and also to offer a default resolution if no option is selected.

I think this approach offers a good degree of flexibility to Users to enable the migration of email attachments from D365 to SharePoint, and I believe the use of Adaptive Cards will help with adoption. I’m also experimenting with Outlook actionable messages to offer similar functionality.

I’d love to hear your feedback on whether you think this approach could work for you, so please feel free to reach out to me on social media or drop a comment below!