Calculate Working Days in a time period using Power Automate

I’ve been working on a Flow recently that requires either the total number of days between two dates, or the number of working days (i.e. total days minus weekends and holidays), and the lack of a proper DateDiff expression in Power Automate has been a bit of a headache to work around.

If you’ve ever worked with Excel then you’ll know that the networkdays formula would be perfect for this scenario, so I wanted to see if I could replicate it in a Flow.

The Setup

The networkdays formula takes three parameters:

  1. Start Date
  2. End Date
  3. Holidays (an array containing a list of holiday dates)

We need to include the same parameters in our Flow, so you’ll need to ensure that your trigger contains this information, or that you can gather it within your Flow.

The Solution

The Flow that I created is below, and I’ll go through each step in turn to explain what I’ve done.

1. Trigger – For this Flow I’m using a manual trigger that takes the three parameters I outlined above. Of course you could also get this to work from a CDS record or any other trigger. The Holidays are required to be input in a “yyyy-MM-dd” format, separated by commas. I’ve also experimented with retrieving the holidays from the Gov.UK Bank Holidays API; the important thing is to be able to construct an array of holiday dates.

2. Initialize HolidaysArray – The first action is to take the comma-separated list of Holidays from the trigger and convert it to an array so we can use it later in the Flow. We do this using the Split expression; this will convert a comma-separated list such as 2019-12-25,2019-12-26,2020-01-01,2020-01-02,2020-04-10 into an array that looks like:

["2019-12-25","2019-12-26","2020-01-01","2020-01-02","2020-04-10"]


3. Compose StartDateTicks – in Excel, dates are stored as sequential serial numbers, starting with 1 for 1 January 1900. Unfortunately, Power Automate doesn’t do the same thing, so we need to be slightly more creative in order to calculate the number of whole days between two dates; the ticks function gives the number of 100-nanosecond intervals from 1 January 0001 to a specified datetime. For example, the date 10 December 2019 has a tick value of 637115328000000000. We use a Compose action to convert the input start date to its representative tick value.
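To see this arithmetic outside the Flow, here is a small Python model of the ticks function (purely an illustration of how ticks relate to whole days; Power Automate provides ticks() natively):

```python
from datetime import date

TICKS_PER_DAY = 864_000_000_000  # 100-nanosecond intervals in one day


def ticks(d: date) -> int:
    # .NET-style ticks count 100ns intervals from 1 January 0001 (midnight)
    # date.toordinal() of 0001-01-01 is 1, hence the -1
    return (d.toordinal() - 1) * TICKS_PER_DAY


print(ticks(date(2019, 12, 10)))  # 637115328000000000, matching the value above
```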

4. Compose EndDateTicks – as above, we use another Compose action to convert the input end date to its representative tick value.

5. Initialize FullDays integer Variable – Now that we have the tick values for the start date and the end date, we can subtract the start date ticks from the end date ticks, then divide the result by 864,000,000,000 (the number of ticks in a day) to convert it to a number of whole days. We also add 1 to this result to give us the total number of whole days between the start date and end date, inclusive of the end date. The expression we use is (assuming the Compose steps are named StartDateTicks and EndDateTicks):

add(div(sub(outputs('EndDateTicks'), outputs('StartDateTicks')), 864000000000), 1)


Note: The next 7 steps in the Flow (Step 6 – Step 12) will help us to calculate the number of working days in the total number of days we returned above.

6. Initialize WorkDays Integer Variable – now that we have the Full Days calculated, we’ll initialize another Integer Variable to calculate the Working Days. When we initialize it we’ll set the default value to the FullDays value, and we’ll decrement it in subsequent steps.

7. Decrement WorkDays by 2 for each Full Week – we know that in any given 7-day period there will be two weekend days, so we need to remove these from the total number of days returned above. The way we do this is to divide the total number of days by 7, multiply the result by 2, and then subtract this from the total number of days.

For example, if you had 14 days total, then you could divide this by 7 (giving 2), then multiply it by 2 (giving 4), and subtract this from 14 (giving 10). This tells you that in a 14-day period there are 10 working days and 4 weekend days. The expression for the number of days to remove is (div performs integer division on integers):

mul(div(variables('FullDays'), 7), 2)
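As a quick sanity check, the same arithmetic can be modelled in Python (// mirrors the Flow’s integer div):

```python
def weekend_days_in_full_weeks(total_days: int) -> int:
    # every complete 7-day block contains exactly one Saturday and one Sunday
    return (total_days // 7) * 2


# 14 total days -> 4 weekend days -> 10 working days, as in the example above
print(14 - weekend_days_in_full_weeks(14))  # 10
```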


8. Decrement WorkDays by 1 if Start and End are the same day on a weekend – we need to account for situations in which the Start Day and End Day are the same day of the week (i.e. it starts and ends on Sunday). If it is midweek then we don’t need to do anything, but if it’s a weekend then we need to ensure we’re decrementing the WorkDays value by 1. The reason for this is that if we started and finished on a Sunday, this would be 8 days in total. The calculation at step 7 would remove 2 days for each full week, but we’d also need to ensure we’re removing an additional 1 day to account for the weekend.


9. Decrement WorkDays by 2 if Start Sat & End Midweek – in this step we’re checking if the time period we’ve selected for the Flow starts on a Saturday and ends on a midweek day. If so, we want to decrement the WorkDays variable by 2.


10. Decrement WorkDays by 1 if Start Sun & End Midweek – this step is almost exactly the same as above, but this time we’re checking if the time period starts on a Sunday and ends on a midweek day. If so, we want to decrement the WorkDays variable by 1.


11. Decrement WorkDays by 1 if Start Midweek & End Sat – this action is the inverse of Step 9; we’re checking to see if the time period selected starts on a midweek day and ends on a Saturday. If so, we want to decrement the WorkDays variable by 1.


12. Decrement WorkDays by 2 if Start Midweek & End Sun – as above, this is the inverse of Step 10; we’re checking to see if the time period selected starts on a midweek day and ends on a Sunday. If so, we want to decrement the WorkDays variable by 2.


NOTE: it would probably make more sense to have the decrement actions above contained within a Switch action to make your Flow more efficient
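If you want to sanity-check the Flow’s output, the whole calculation can be expressed as a brute-force loop in Python, like Excel’s NETWORKDAYS. This is a reference implementation for testing only, not the tick arithmetic the Flow itself uses (it also handles the holidays we’ll subtract in steps 13–15):

```python
from datetime import date, timedelta


def networkdays(start: date, end: date, holidays=()) -> int:
    """Count the days from start to end inclusive that are neither
    weekend days nor holidays (a brute-force NETWORKDAYS)."""
    holidays = set(holidays)
    working = 0
    current = start
    while current <= end:
        # weekday() returns 0-4 for Mon-Fri and 5-6 for Sat-Sun
        if current.weekday() < 5 and current not in holidays:
            working += 1
        current += timedelta(days=1)
    return working


# Mon 2 Dec 2019 to Sun 15 Dec 2019: 14 days, 10 of them working days
print(networkdays(date(2019, 12, 2), date(2019, 12, 15)))  # 10

# Mon 23 Dec to Fri 27 Dec with two holidays leaves 3 working days
print(networkdays(date(2019, 12, 23), date(2019, 12, 27),
                  [date(2019, 12, 25), date(2019, 12, 26)]))  # 3
```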

13. Initialize HolidaystoRemove – for this step we’re initializing another integer variable that we’ll use in the Condition in Step 14 to count the number of Holidays that occur in the selected time period.

14. Check if the Date Range contains any of the holidays – for this step we’re going to iterate through the Holidays array we created in step 2 and check if that date is in between the Start Date and End Date of the time period we selected, using an “is greater than or equal to” and “is less than or equal to” condition respectively. If the holiday is in the time period, we’ll increment the HolidaystoRemove variable by 1.

15. Decrement WorkDays by HolidaystoRemove – Once we’ve iterated through all the holidays, we then decrement the WorkDays variable by the HolidaystoRemove variable, and this gives us our final number of Working Days in our selected Time Period.

16. Response – the final step in my Flow is a Response action. I’m going to be calling this Flow as a child flow from another one, so I need the response to return the information. In my response I have 3 outputs:

  1. Total Days
  2. Working Days
  3. Holidays Removed


This Flow was a bit more frustrating to pull together than I’d expected, and it involves some hefty expressions, but I’ve tested it pretty thoroughly and it’s worked in all the scenarios I’ve thrown at it. I’d still love for other people to do more testing.

I would love to get your feedback on whether you think this is useful, or if you think I’ve missed anything or made any mistakes!

If you’d like to download a copy of this Flow please click here

Creating a Lead for every ClickDimensions Posted Form using Power Automate

Recently, my good friend Megan Walker did a guest post for ClickDimensions showing how to create a Lead for every ClickDimensions Posted Form and if you haven’t read it then you really should!

When I was reading this post one of the things I noticed was the requirement to use a Filter Array step and Compose step for each question in your Form in order to be able to use them when you created the Lead. This approach works perfectly, but it could be quite time consuming to create if your Form has lots of questions, so I wondered if there could be a way to simplify it a little bit.

Please note, for the purposes of this post I’m only going to be focusing on an alternative way to get the Posted Field data and use it to create your Lead, Megan has covered everything else in her post so please read it!

If you’d prefer to watch a video overview of this blog, click here.

The Solution

The first step is still to List your Posted Fields using the List Records action. If you’re using the Common Data Service (Current Environment) connector then you’ll be able to select the specific attributes you want returned. In this case we’re only interested in the Label (i.e. the Question) and the Value (i.e. the Answer). You should also set the Order By to the Label; this will ensure consistency in the returned results which is important for later steps in the Flow.

If you look at the Output for the List Records action, you’ll see that it gives a JSON array that looks something like:

We want to make matching pairs for the Questions (Labels) and Answers (Values) from each of the returned Posted Fields and combine them in an array. Fortunately for us, in Power Automate there is an Action called Select that allows you to “Select the specified properties from all elements of the ‘From’ array into a new array”. This is really just a fancy way of saying you can take the elements of the output above that you like and keep them, while disregarding the rest, and you can reshape them into pairs as required.

For the Select action, we take the data From the output of the List Records step, and then we create a map of the Label and the Value as a name/value pair.
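What the Select action is doing can be modelled in a couple of lines of Python (the labels and values below are made up for illustration):

```python
# Hypothetical posted fields, shaped like the List Records output
posted_fields = [
    {"Label": "Company Name", "Value": "Contoso"},
    {"Label": "Work Email Address", "Value": "jane@contoso.com"},
]

# The Select action reshapes each element into a single {question: answer} pair
pairs = [{field["Label"]: field["Value"]} for field in posted_fields]

print(pairs)
```

With this shape, pairs[0]["Company Name"] returns "Contoso", which mirrors the outputs('Select')['body']?[0]?['Company Name'] expression used later in the post.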

When this is done, the output from the Select action will look something like this:

This is much better!

As you can see, the Questions and Answers have been combined into array elements. You can also see that, as we set the Order By to the Label in the List Records Step above, the array elements are listed alphabetically. One important thing to note at this stage is that each element in an Array has an index number beginning from 0, so you can identify the array elements I’ve returned above as follows

The final step for creating the Lead is to add a Create a New Record action, and then we’re going to use an expression to pull the values from the output above into the fields we need to populate:

As you can see, for each field we are populating we have an expression with the following format:

@{outputs('Select')['body']?[0]?['Company Name']}

The key things to note in the construction of this expression are:

  1. for the Outputs expression, the name ‘Select’ must match the name that you’ve given to the Select step
  2. the Integer number [0] must match the element number for the array element as indicated above. For example, if we want to set the Telephone Number we will use [4]
  3. The final part of the expression ['Company Name'] must match the text of the Question from the array.

If we wanted to get the Email Address instead of the Company Name then the expression would be:

@{outputs('Select')['body']?[5]?['Work Email Address']}

After we’ve completed all of the above, our Flow now has only three actions to retrieve the Posted Fields and to create a Lead with the answers submitted:


The first thing to say is that none of this would have been possible without the valuable insights of Megan Walker and Rob Dawson. I keep saying this, but one of my favourite things about this community is the collaborative efforts we all make to help each other.

The second thing is that while this works, if the Questions on your Form were to change, either by adding/removing questions or by changing the Label of a question, this would probably affect the success of the Flow, so it might be worth putting some additional validation logic in place to catch any potential issues like that.

The final thing is that Power Automate is so powerful and it’s amazing what you can achieve with some lateral thinking. This has been a fun little challenge for me, and hopefully you find it useful. Please reach out to me if you have any questions or comments.

Using Flic Buttons with CDS – Managing Fire Evacuations

I was asked to create a tech demo in work recently that could show off some of the capabilities of Power Apps and Power Automate, and I figured this would be a perfect opportunity to use the Flic button I got from Matt Beard of Data8 when I attended the User Group Summit in Amsterdam earlier this year.

The Scenario

Anyone who’s worked in an office will undoubtedly have experienced the fun of a fire drill (usually on a day when it is cold and miserable). I thought it would be a great demonstration of the Power Platform if we could create a system to record when a fire evacuation has occurred in one of our offices, so we could notify the other offices and send notifications when the evacuation was over.

The Setup

The first thing to do was create a new Evacuations entity in CDS where I wanted to record the following:

  1. Evacuation Start Date/Time
  2. Evacuation End Date/Time
  3. Office being evacuated
  4. Any comments/notes about the evacuation

It would be really simple to manually record the evacuations in here, but where’s the fun in that? I had the Flic buttons at my disposal, and I was going to use them!

Using a Flic Button

For those of you who are not aware, Flic buttons have three trigger events:

  1. Click
  2. Double Click
  3. Hold

Each of these triggers can be mapped to a specific action, but the exciting part for us #PowerAddicts is that you can use them to trigger a Flow.

The Flows

For the purposes of this tech demo, I wanted to have two flows:

  1. Create an Evacuation Record
  2. Close the Evacuation Record

Create an Evacuation Record

I created an Instant Flow that is triggered “When a Flic is Pressed”:

As you can see from the screenshot above, there are two options on this trigger. First you select the Flic Button you’ll be associating with the Flow, then you select one of the available Events; I’m going to be using a “Click” event for this Flow.

When you use a Flic trigger, it returns the Click Time in the following format:


This isn’t particularly presentable, so I’ve added a Compose step to format the Date using the following expression:

formatDateTime(triggerBody()?['clicked_at'], 'dd MMMM yyyy')

This Compose step formats the Click Time above as “12 November 2019”, and I’m going to use this when I create the record in CDS.
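For reference, the same formatting can be reproduced in Python; the Power Automate format string 'dd MMMM yyyy' corresponds to '%d %B %Y' (the click time below is made up):

```python
from datetime import datetime

# A hypothetical click time, as the Flic trigger would return
clicked_at = datetime(2019, 11, 12, 9, 30, 0)

# 'dd MMMM yyyy' in Power Automate ~ '%d %B %Y' in Python strftime
print(clicked_at.strftime('%d %B %Y'))  # 12 November 2019
```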

The last step in the Flow is to use the Send an Email (V2) action to send an email to everyone in my organisation to let them know our office has been evacuated.

And that’s it: we have a simple Flow, triggered by the click of a Flic button, that creates a record in CDS when an evacuation is happening and notifies all employees in our organisation. The next step is to create a Flow to close the Evacuation record when everyone returns to the office.

Close the Evacuation Record

For this Flow, I’m using the same “When a Flic is Pressed” trigger as above, but this time I’m going to use the Double Click event. For this Flow I want to retrieve the Active Evacuation records for this office and set them to Inactive, recording the time that the Evacuation ended.

If you want to find out an easy way to generate OData queries to use in the List Records step, then I’d recommend reading Sara Lagerquist’s recent blog on this topic.

Once the Evacuation records are closed, I use the Send an Email (V2) action again to send an update email to all employees to tell them that the Evacuation of the office is now over.


This is a relatively simple use case for Flic buttons, but it shows the power of combining IoT devices with the Power Platform, and it reinforces my belief that “no UI is the best UI”. Through two actions we have a complete record in our database, and it has required no input from the User other than a Click and a Double Click.

In part 2 of this blog, I’ll show how I created a canvas app for reviewing and monitoring Evacuations, and how to use the location data so the button knows which office has been evacuated.

Filtered Lookup Field based on Linked Entity using North52

If you’ve ever had a requirement to filter lookup fields then you’ll no doubt be aware that this is possible in Dynamics 365, but that there are some limitations to the functionality.

Microsoft have done a great job of enabling out-of-the-box filtering for simple scenarios using the “Related Records Filtering” options or by limiting the records returned using specific view(s).

To read more about the options available out of the box I’d recommend referring to Carl de Souza’s blog post.

For the more developmentally minded amongst us there is also the option to use the addCustomFilter JavaScript function; more information can be found on the Microsoft Docs site.

For those who are comfortable with JavaScript I’d recommend reading Aileen Gusni’s posts on this for some tips and tricks.

The Scenario

In my scenario we have a Peer Review entity to record the outputs of peer reviews carried out for activities related to an Account. The Peer Review entity has several Reviewer Roles which are lookups to the User entity. The lookups need to be filtered to only show Users who are in the Account Team. I’ve mapped the relationships between the entities below:

I tried to get this to work with the OOTB options, but found that I couldn’t quite get them to work for this scenario. I also looked at the JavaScript options but again ran into issues, primarily because I need the filtering criteria to be dynamic on each Peer Review record depending on the selected Account, whereas the JavaScript was a bit prescriptive for me. (Note: I’m not a coder, so someone cleverer than me could probably get it to do what they needed).

However, in exploring the JavaScript I stumbled upon a potential solution. You can use an “in” operator in a condition in your FetchXML to specify the list of values to be returned, like so:

<filter type='and'> 
        <condition attribute='YOUR_FIELD_HERE' operator='in'>
                <value>VALUE_1</value>
                <value>VALUE_2</value>
        </condition>
</filter>

If I could figure out a way to make this list of values dynamic then that would solve my problem!

The Solution

To solve this issue I turned to my trusty old friend North52. I’ve written previously about using looping functions and I’ll be doing something similar here.

The first step is to write the FetchXML query that retrieves the Users from the Team, which I generated using Advanced Find:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="true">
    <entity name="systemuser">
        <attribute name="systemuserid" />
        <link-entity name="teammembership" from="systemuserid" to="systemuserid" visible="false" intersect="true">
            <link-entity name="team" from="teamid" to="teamid" alias="ab">
                <filter type="and">
                    <condition attribute="teamid" operator="eq" value="{0}" />
                </filter>
            </link-entity>
        </link-entity>
    </entity>
</fetch>

As you can see above the value in the teamid condition is set to {0}, and we’ll set this dynamically in the ClientSide – Perform Action formula, which is below:



SmartFlow(
    ForEachRecord(
        FindRecordsFD('TeamMembers', true, SetParams([ryan_peerreview.ryan_accountid.ryan_accountteam.teamid.?])),
        Case(RecordIndex(),
            When(0), Then(SetVar('teammembers', StringFormat('<filter type="and"><condition attribute="systemuserid" operator="in"> <value>{0}</value>', CurrentRecord('systemuserid')))),
            When(RecordTotal()-1), Then(SetVarConcat('teammembers', StringFormat('<value>{0}</value></condition></filter>', CurrentRecord('systemuserid')))),
            Default(SetVarConcat('teammembers', StringFormat('<value>{0}</value>', CurrentRecord('systemuserid'))))
        )
    )
)





I’ll explain the key elements of this formula below:

SmartFlow: SmartFlow allows you to run multiple actions in one Formula

ForEachRecord: ForEachRecord is a looping function, and it iterates through the output of the FetchXML query we created above using the FindRecordsFD function and carries out the actions specified. As mentioned above, I set the value of the TeamID to be {0}, and now I use the SetParams function to define the value that will be put in here.

As we’re using ForEachRecord to loop through the records returned by the FetchXML, I use the Case function to build a variable containing the filter that I will be putting on the lookup field, using the SetVar/SetVarConcat functions.

The Case function works by splitting the Filter FetchXML into 3 parts:

  1. The Opening section, which includes the open tags for the Filter and Condition, and the first value returned from the FindRecordsFD function
  2. The Looping section, for all the values between the first and last values returned from the FindRecordsFD function
  3. The Closing section, which includes the closing tags for the Filter and Condition, and the last value returned from the FindRecordsFD function

To make this work with the Case function, we use the RecordIndex function, which returns an integer with the current index number of the loop, so the Case function can be described in plain English as:

WHEN we are on the first loop, THEN create a variable with the opening section of the Filter FetchXML;
WHEN we are on the last loop, THEN concatenate the variable with the closing section of the Filter FetchXML;
OTHERWISE, if we are not on the first or last loop, concatenate the variable with the next value

When we have created the Filter FetchXML we use the AddPreFilterLookup function to add the filter to the selected field.

Once we’ve done all of this, the field will show only the people who are in the Team related to the Account on the Peer Review record:


I think this is a good method of dynamically altering the available options in a lookup field, and I can envision a number of useful scenarios for this functionality. Please leave a comment below or reach out to me on social media with your thoughts.

Postcode Region Mapping via Workflow

I recently delivered a session at Dynamics 365 Saturday Scotland covering some advanced functionality you can implement in your Dynamics 365 environment using free custom workflow activities.

To read my thoughts on #D365SatSco and how amazing it was, see the article I posted on LinkedIn

One of the scenarios I covered in my session looked at how we can carry out regional analysis of our Accounts using workflows, and I’ve outlined my solution below.

The Scenario

For this scenario I wanted to be able to check if the postcode that had been entered for the address on an Account was valid, and if so I wanted to be able to extract the outward code and use this to map the Account to its postcode area, locale, sub-region and region.

The Setup

For this scenario I added the following to my environment:

  • A new Entity called Region Mapping, containing
    • An Option Set with 4 options:
      1. Postcode Area
      2. Locale
      3. Sub-Region
      4. Region
    • A hierarchical Parent lookup field
  • Added fields to the Account entity
    • A single line of text field called “Extracted Postcode”
    • 4 lookup fields to the Region Mapping entity (one for each Option in the Option Set)

Once this is all created, I imported my dataset, which I derived from data sources from the Office for National Statistics. You can download a copy of my dataset below:

The Workflow

To create my workflow I used tools from two different custom workflow assemblies:

  1. Jason Lattimer’s String Workflow Utilities
    1. Regex Match
    2. Regex Replace with Space
  2. Alex Shlega’s TCS Tools
    1. Attribute Setter

Step 1 – Postcode Verification

In the UK, all postcodes follow standard formats, which makes it relatively easy to determine whether a postcode is valid or not. For my workflow I’m using the Regex Match step, so I need a Regex pattern to use. I wanted to be able to separate out the outward and inward sections of the postcode, so the expression I ended up with is:

((?:(?:gir)|(?:[a-pr-uwyz])(?:(?:[0-9]?)|(?:[a-hk-y][0-9]?)))) ?([0-9][abd-hjlnp-uw-z]{2})

I am not an expert at Regex, but I am very good at Googling! I added this pattern to Regex101, which does a great job of explaining the component parts if you’d like to understand it further.

The output from a Regex Match step will be True or False. If it returns False, you could use a Cancel step in your real-time workflow to display an error message informing the user that their postcode is not valid.
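If you want to experiment with the pattern outside the workflow, here’s a Python sketch of the same validation (inputs are lower-cased first, since the character classes in the pattern are lower-case):

```python
import re

# The two-group postcode pattern from the step above
POSTCODE = re.compile(
    r'((?:(?:gir)|(?:[a-pr-uwyz])(?:(?:[0-9]?)|(?:[a-hk-y][0-9]?))))'
    r' ?([0-9][abd-hjlnp-uw-z]{2})'
)


def is_valid_postcode(postcode: str) -> bool:
    # fullmatch mirrors the Regex Match step's True/False output
    return POSTCODE.fullmatch(postcode.strip().lower()) is not None


print(is_valid_postcode('EH1 2NG'))         # True
print(is_valid_postcode('not a postcode'))  # False
```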

Step 2 – Extract Postcode Area

As I mentioned above, all UK postcodes follow standard formats, and this is particularly true for the second part of the postcode, which is always one number followed by two letters. To carry out my region mapping I needed to be able to extract the first part of the postcode, so I used the Regex Replace with Space step to replace the second part of the postcode with 0 spaces, in effect just deleting it.

From my Regex pattern in the previous step, I used the second capturing group to match with the second part of the postcode:
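In Python terms, deleting the inward code looks like this (note: the end-of-string anchor here is my own addition for illustration; the workflow step itself uses the second capturing group of the pattern above):

```python
import re

# The second capturing group of the postcode pattern, anchored to the end
# of the string; removing what it matches leaves just the outward code
INWARD_CODE = re.compile(r'\s?[0-9][abd-hjlnp-uw-z]{2}$')


def outward_code(postcode: str) -> str:
    return INWARD_CODE.sub('', postcode.strip().lower())


print(outward_code('EH1 2NG'))  # eh1
```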


The output from this step leaves us with the first part of the postcode, so we update the Extracted Postcode field on the Account entity with this, and we’ll use that in the next step.

Step 3 – Run the Attribute Setter

I’ve previously discussed Alex Shlega’s Attribute Setter, and it’s one of my favourite custom workflow activities. It’s super easy to work with and allows you to dynamically set lookup fields from within your workflow.

The first thing to do is to create a Lookup Configuration with a FetchXML query to find the record you will be setting in the lookup field. For mine, I’ll be looking for the Region Mapping record that matches the extracted postcode. As I’ve discussed before, the magic in the Lookup Configuration is the ability to dynamically pass values to the FetchXML query by putting the schema name of the field that contains the value inside a pair of # marks.

The key part of the FetchXML query above is the second condition:

<condition attribute="ryan_name" operator="eq" value="#ryan_extractedpostcode#" />

By putting the schema name of my Extracted Postcode field in the query, whatever value is in that field will be added to the query when it is run by the workflow. The Attribute Setter will output the GUID of the matching Region Mapping record (i.e. the Fetch Result Attribute) and set it in the Postcode Area lookup field on the Account (i.e. the Entity Attribute).

Step 4 – Update Account

The final step, now that the Postcode Area has been updated, is to run a child workflow to update the Locale, Sub-Region and Region fields. For each of these fields, we’ll run an Update Record step and select the Parent of the predecessor (i.e. for the Locale we will find the Parent of the Postcode Area field value).


This is a relatively simple approach to allow you to carry out regional segmentation of your Accounts, which can be used for marketing purposes or for reporting.

If you’ve found it useful, or if you have any other ideas then please reach out to me on Twitter or LinkedIn