Ten Regex Expressions To Use With Forms Pro

If you’ve been using Forms Pro to create surveys then you should be aware that you can do custom validation on fields with Regular Expressions (regex). My friend Megan Walker blogged about this recently, showing how easy it is to add the custom formatting checks to a field on your survey. Megan and I were discussing this functionality and, as I’m a massive geek, she asked if we could work together to come up with ten different custom expressions that you could use in a Forms Pro survey; we came up with the following list:

  1. Phone Numbers
  2. Website address
  3. Twitter handle
  4. Zip code (5 digits)
  5. UK Postcode
  6. National Insurance Number (UK)
  7. Social Security Number (USA)
  8. Vehicle Registration (UK)
  9. Date of birth (both dd/mm/yyyy & mm/dd/yyyy)
  10. Time (24 hour & 12 hour AM/PM formats)

I’ll go through each of these in turn below to show you how they work. I’ll include a copy of the expression, an explanation of the key rules that make up the query, a link to Regex101 for testing the query, and a visual overview of how the query works.

UK Phone Number


In order to construct any regex query you have to understand the rules of the information you’re trying to validate. Luckily for me, Wikipedia has a great page about UK telephone numbers. Some of the key rules we need to be aware of are:

  1. The numbers can start with +44 (the international dialling code) or 0
  2. The telephone numbers can be prefixed with a 3, 4 or 5 digit area code
  3. 3 digit area code numbers have a format of 3-4-4 (i.e. 000 0000 0000)
  4. 4 digit area codes have a 4-3-4 format (i.e. 0000 000 0000)
  5. 5 digit area codes have either a 5-6 or 5-3-3 format (i.e. 00000 000000 or 00000 000 000)
  6. Area codes may or may not be enclosed in brackets
  7. There are some special case numbers such as Sedbergh and Brampton which have unique area codes (4-2 format area code, followed by 5 or 4 numbers respectively)

As we know the rules, constructing the regex query becomes much easier: we can take each rule in turn and build the query up from them. To demonstrate the query in action I have created a sample on Regex101 that you can use for testing: https://regex101.com/r/ACqAiu/2
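To make the rules concrete, here is a simplified sketch in Python's `re` module covering rules 1-6 (it is not necessarily the exact expression from the Regex101 sample, and the special-case numbers in rule 7 are left out):

```python
import re

# Simplified sketch of a UK phone number pattern covering rules 1-6;
# bracket handling is approximate (unbalanced brackets would still match).
uk_phone = re.compile(
    r"^(\+44\s?|\(?0)"                         # +44 international code or leading 0
    r"(\d{2}\)?\s?\d{4}\s?\d{4}"               # 3-digit area code, 3-4-4 format
    r"|\d{3}\)?\s?\d{3}\s?\d{4}"               # 4-digit area code, 4-3-4 format
    r"|\d{4}\)?\s?(\d{6}|\d{3}\s?\d{3}))$"     # 5-digit area code, 5-6 or 5-3-3 format
)

assert uk_phone.match("020 7946 0958")
assert uk_phone.match("(020) 7946 0958")
assert uk_phone.match("+44 1632 960961")
assert not uk_phone.match("12345")
```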

Website Address


The key rules we need to consider for a website address are:

  1. The address may begin with http://, https:// or www.
  2. After the protocol, the address may optionally begin with www.
  3. The web domain may use letters, numbers or hyphens
  4. Hyphens cannot be used at the beginning or end of the domain
  5. The web address typically ends with a top-level domain that is between 2-5 letters (i.e. .com, .net, .gov, etc.)

You can test this expression on Regex101 by clicking here
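A rough Python equivalent of these rules (real-world URL validation has many more edge cases, so treat this as a sketch):

```python
import re

# Sketch of rules 1-5: optional protocol and www., hyphen rules on labels,
# and a 2-5 letter top-level domain.
website = re.compile(
    r"^(https?://)?(www\.)?"                        # optional protocol and www.
    r"([a-zA-Z0-9]([a-zA-Z0-9-]*[a-zA-Z0-9])?\.)+"  # labels can't start/end with a hyphen
    r"[a-zA-Z]{2,5}$"                               # 2-5 letter top-level domain
)

assert website.match("https://www.example.com")
assert website.match("my-site.co.uk")
assert not website.match("http://-bad.com")
```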

Twitter Handle


Twitter usernames have some pretty simple rules:

  1. the username begins with an @ symbol
  2. the username cannot be longer than 15 characters
  3. the username can only be comprised of alphanumeric characters or underscores

You can test this expression on Regex101 by clicking here
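Those rules translate into a very small pattern; as a sketch:

```python
import re

# @ followed by 1-15 letters, digits or underscores.
twitter_handle = re.compile(r"^@[A-Za-z0-9_]{1,15}$")

assert twitter_handle.match("@MeganVWalker")
assert not twitter_handle.match("MeganVWalker")              # missing the @
assert not twitter_handle.match("@a_name_that_is_too_long")  # over 15 characters
```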

US Zip Code


US Zip Codes are really straightforward; the key rules are:

  1. It can be comprised of 5 numbers
  2. It may optionally also have a 4 digit suffix following a hyphen for the ZIP+4 format

You can test this expression on Regex101 by clicking here
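As a sketch of these two rules:

```python
import re

# Five digits, with an optional four-digit ZIP+4 suffix after a hyphen.
us_zip = re.compile(r"^\d{5}(-\d{4})?$")

assert us_zip.match("90210")
assert us_zip.match("90210-1234")
assert not us_zip.match("1234")
```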

UK Postcode

^(([A-Z]{1,2}[0-9][A-Z0-9]?|ASCN|STHL|TDCU|BBND|[BFS]IQQ|PCRN|TKCA) ?[0-9][A-Z]{2}|BFPO ?[0-9]{1,4}|(KY[0-9]|MSR|VG|AI)[ -]?[0-9]{4}|[A-Z]{2} ?[0-9]{2}|GE ?CX|GIR ?0A{2}|SAN ?TA1)$

The Wikipedia article on UK Postcodes has a great section on validation that covers all of the key rules, and provides the regex that I have included above. I’d recommend reading it to get an overview of how the validation works.

You can test this expression on Regex101 by clicking here

UK National Insurance Number


UK National Insurance numbers have some key rules:

  1. They are formatted as two prefix letters, six digits and one suffix letter
  2. Neither of the first two letters can be D, F, I, Q, U or V
  3. The suffix letter can be A, B, C, D, F or M

You can test this expression on Regex101 by clicking here
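One hedged way to express these rules (the optional spaces between groups are an assumption of mine, and some further HMRC restrictions on prefix pairs are not modelled):

```python
import re

# Two prefix letters excluding D, F, I, Q, U and V; six digits; one
# suffix letter from A, B, C, D, F or M.
ni_number = re.compile(r"^[A-CEGHJ-PR-TW-Z]{2}\s?\d{2}\s?\d{2}\s?\d{2}\s?[A-DFM]$")

assert ni_number.match("AB123456C")
assert ni_number.match("AB 12 34 56 C")
assert not ni_number.match("DA123456C")  # D is not a valid prefix letter
```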

US Social Security Number


The rules for US Social Security numbers are:

  1. 9 digits in total
  2. May be in a 3-2-4 format (i.e. 000-00-0000)
  3. The delimiter between the sections of the number may be a hyphen or a white space

You can test this expression on Regex101 by clicking here
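A sketch of these rules, treating the delimiters as optional:

```python
import re

# Nine digits in a 3-2-4 grouping; each delimiter may be a hyphen,
# a space, or omitted.
ssn = re.compile(r"^\d{3}[-\s]?\d{2}[-\s]?\d{4}$")

assert ssn.match("123-45-6789")
assert ssn.match("123 45 6789")
assert ssn.match("123456789")
assert not ssn.match("12-345-6789")
```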

UK Vehicle Registration


There are a number of rules regarding UK Vehicle Registration numbers, and there is a great gist on GitHub by Daniel Bradley that covers them, so I’d recommend heading there to read about it.

You can test this expression on Regex101 by clicking here
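As a hedged illustration, a pattern for just the current-style registrations would look like this (the older formats covered in Daniel Bradley's gist are omitted):

```python
import re

# Current-style (post-2001) registrations only: two letters, two digits,
# then three letters, with an optional space.
uk_vehicle_reg = re.compile(r"^[A-Z]{2}\d{2}\s?[A-Z]{3}$")

assert uk_vehicle_reg.match("AB12 CDE")
assert uk_vehicle_reg.match("AB12CDE")
assert not uk_vehicle_reg.match("ABC 123")
```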

Date of Birth

^(((19|20)\d\d[- \/.](0[1-9]|1[012])[- \/.](0[1-9]|[12][0-9]|3[01]))|((0[1-9]|1[012])[- \/.](0[1-9]|[12][0-9]|3[01])[- \/.](19|20)\d\d)|((0[1-9]|[12][0-9]|3[01])[- \/.](0[1-9]|1[012])[- \/.](19|20)\d\d))$

For a date of birth we want to validate the following rules:

  1. The date may be input in the DD/MM/YYYY, MM/DD/YYYY or YYYY/MM/DD format
  2. The elements of the date may be separated by a forward slash (/), period (.), hyphen (-) or space

You can test this expression on Regex101 by clicking here

Time – 24 Hour Format


Hopefully the rules around time are obvious: you can have any time from 00:00 to 24:00, with the hours and minutes separated by a colon. You can optionally include seconds, which will also be separated from the minutes by a colon.

You can test this expression on Regex101 by clicking here
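As a sketch of those rules (this version requires a two-digit hour and doesn't accept the edge case of 24:00):

```python
import re

# 00:00 to 23:59, with optional :ss seconds.
time_24h = re.compile(r"^([01]\d|2[0-3]):[0-5]\d(:[0-5]\d)?$")

assert time_24h.match("23:59")
assert time_24h.match("09:30:15")
assert not time_24h.match("24:01")
```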



Time – 12 Hour Format


Time in a 12 hour format can be any time from 12:00am to 11:59pm, and may be suffixed with am or pm, which may be in upper or lower case.

You can test this expression on Regex101 by clicking here
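A sketch of the 12-hour rules, treating the am/pm suffix as optional:

```python
import re

# 1:00 to 12:59 with an optional am/pm suffix in either case,
# optionally preceded by a space.
time_12h = re.compile(r"^(1[0-2]|0?[1-9]):[0-5]\d(\s?[AaPp][Mm])?$")

assert time_12h.match("12:00am")
assert time_12h.match("11:59 PM")
assert not time_12h.match("13:00")
```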


Hopefully the examples above will give you some ideas for how you can use regular expressions in your own surveys. The examples are intended as a guide, so please test them before implementing them in any production environment. If I have made any glaring errors then please let me know in the comments or by reaching out to me on Twitter or LinkedIn.

If you’d like to know more about Forms Pro then please subscribe to Megan’s blog, follow her on Twitter, watch her YouTube channel or enrol in her excellent online training course.

Better Integration between Dynamics 365 and SharePoint using Power Automate — Part 1

If you’ve used the standard OOTB connection between Dynamics 365 and SharePoint then I’m sure, like me, you’ll have found your share of frustrations.

The technical functionality is great, allowing you to see your documents stored in SharePoint in a subgrid on the related Dynamics 365 record; however, the standard implementation will probably make your SharePoint admins cry, because it uses a single SharePoint Site, with a Document Library for each Entity and a Folder for each record.

Whilst this approach sort of makes sense, at scale it becomes pretty unmanageable and it runs counter to recognised SharePoint best practices. In my organisation we wanted to have a SharePoint site per record, which would allow us to control security much more robustly. Peter Baddeley has blogged about similar issues before and I’d highly recommend reading his posts about this.

The Scenario

In my organisation we have Client SharePoint sites, linked to the Account record in Dynamics 365. Each of these SharePoint sites can have one or many Document Libraries depending upon the requirements of the Client. We then have a custom entity for Projects, and each Project has a SharePoint Site with one or many Document Libraries. Finally, each Project may have one or many Work Streams, and each of the Work Streams may have one or many Document Libraries

The relationships between the entities and the two systems can be visualised as below:

The Setup

As indicated above, in my scenario we have some custom entities:

  • Project
    • Lookup to Account entity
  • Work Stream
    • Lookup to Account entity
    • Lookup to Project entity

Note that while there is a parent:child relationship between Accounts and Projects, and Projects and Work Streams, this doesn’t have to be reflected in SharePoint, thus we can avoid getting into nested site structures in SharePoint and take advantage of modern functionality like Hub Sites.

As part of my demonstration I’ll also add a few custom fields to my Account entity:

  • Create SP Site – a two-option field I’ll use to trigger my Flow to run
  • Reference ID – An autonumber field to generate a unique ID for each Account, that we’ll use for the new site URL
  • SP Site URL – a field to hold the URL for the site we’re creating, which we’ll use in Part 2 and Part 3 of this series

A key part of the setup is also to ensure that you’ve set up SharePoint integration and enabled Document Management on the entities we’ll be using – see the Microsoft documentation for more info on how to do this. As we’re creating a custom document management structure, you might want to avoid the default document location logic from firing – Alex Shlega covered this in his blog about custom folder structures for SharePoint integration.

The Solution

Now that we have a clear understanding of our scenario, we need to create a custom document management structure using a Power Automate flow. Before we jump into this you should be aware that the standard SharePoint Flow connector doesn’t allow you to create SharePoint Site collections, it will only allow you to create subsites. To work around this restriction we’re going to use an Azure Automation runbook to execute some PowerShell.

Azure Automation

I had written a lengthy explanation of how to create an Azure Automation runbook that we could call from a Flow, but then I found this post from Planet Technologies that explains so straightforwardly how to do this that I’d rather just link you to their blog. If you’ve never created an Azure Automation runbook, or you feel intimidated by the idea of PowerShell then I can assure you that you have no reason to worry. The steps in this blog are super easy to follow!

In my Azure Automation Runbook I’ve added a step using the Register-PnPHubSite cmdlet to register my Client (Account) site as a Hub site. By doing this I can then associate my Project and/or Work sites to this Hub using the Add-PnPHubSiteAssociation cmdlet, without having to implement a nested site structure.

The Flow

1. When a Record is Updated – for the trigger I’m using the “When a Record is Created, Updated or Deleted” trigger from the Common Data Service (Current Environment) Connector. In this instance I’m using the Update trigger condition, to check when the new “Create SP Site” field I created is set to “Yes”. Note below that I’ve set the Filtering attributes to this field, and set a Filter expression to only trigger the flow when the value of this field is True

2. List the Default SP sites to get the Base URL – next I am using a List Records step with a FetchXML query to retrieve the Default SharePoint site URL. When you set up your integration with SharePoint you would have specified a site, and this is listed in Dynamics 365 as a default site, so we can retrieve this easily. The FetchXML query I used is:

<fetch version="1.0" output-format="xml-platform" mapping="logical" distinct="false">
  <entity name="sharepointsite">
    <attribute name="name" />
    <attribute name="parentsite" />
    <attribute name="relativeurl" />
    <attribute name="absoluteurl" />
    <attribute name="validationstatus" />
    <attribute name="isdefault" />
    <order attribute="name" descending="false" />
    <filter type="and">
      <condition attribute="isdefault" operator="eq" value="1" />
    </filter>
  </entity>
</fetch>

3. Compose AbsoluteURL – As I’m only returning a single record in the List Records step above, I will use a Compose action to retrieve the Absolute URL field value, rather than having to use an Apply to Each step. To do this, we use a First expression to retrieve the first returned value, then we’ll interrogate the JSON to get the value we want. The expression I used is:
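As a sketch, assuming the List Records step above is named “List the Default SP Sites” (your step name may differ), the expression takes this shape:

```
first(outputs('List_the_Default_SP_Sites')?['body/value'])?['absoluteurl']
```

The First expression pulls the first record out of the returned value array, and we then read its absoluteurl attribute.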


4. Initialise SiteURL – next we’ll create a string variable for the Site URL of the new SharePoint site we’re going to create. To do this, we’ll use the output of the compose step above, append it with the value “/sites/” (as the new site will be within the sites managed path in SharePoint) and then complete it with the autonumber value from the new Reference ID field we created.
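As a sketch, a concat expression along these lines would build the value (the Compose step name and the Reference ID field's schema name, shown here as a hypothetical new_referenceid, will differ in your environment):

```
concat(outputs('Compose_AbsoluteURL'), '/sites/', triggerOutputs()?['body/new_referenceid'])
```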

Note: if you want to use the name of your record in D365 for the URL, you might need to strip out any “dangerous” characters first. In this case you could follow the great advice from AlanPS1 on his blog about stripping unwanted characters.

5. Create Azure Automation Job – now that we have our Site URL for the new site we want to create, we’re going to use the Create Job action from the Azure Automation connector. In this action you’ll specify the Subscription, Resource Group, Automation Account and Runbook Name we created above. This will then bring up the Input parameters you specified in your runbook. In my case I just need the Site URL and the Account Name for my site title. Note that I’ve set “Wait for Job” to Yes; this is important to ensure the Job is completed before the action is marked as complete in your flow.

6. Create a new SP Site record – after the Azure Automation has done its work and created the SharePoint site, we’re going to create a SharePoint Site record in Dynamics 365 for it. With this SharePoint Site record we will set the Absolute URL to the SiteURL we created earlier.

7. Create a new Document Location for Account – the final step in the Flow is to create a Document Location record in D365 for the Account, so we can actually see the SharePoint site and the documents stored there when we look at the Account record in D365. For the purposes of this demo I’m just creating a SharePoint Document Location for the default Documents document library that is created when you create a SharePoint site, so I know that the Name is “Documents” and the Relative URL is “shared documents”. If you were using a Template that created multiple document libraries you could use the SharePoint “Get all Lists and Libraries” action to retrieve them all, then create a new Document Location for each Library that is returned.

One thing to note here is that when I’m setting the Lookup fields I have to put them in the format “/PluralofEntityName(GUID)”. This is due to a known bug with the Common Data Service (Current Environment) connector – see this blog by Sara Lagerquist for more information on this issue.


In this blog post I wanted to demonstrate how easy it can be to create an advanced custom integration between Dynamics 365 and SharePoint, and how accessible it is even to non-developers like me now that we have such advanced functionality available to us in Power Automate flows and Azure Automation runbooks.

Implementing a custom integration allows you to implement proper security management, and makes the solution much more scalable for proper document management functionality. In the next blog I’ll show you how to create new SharePoint Document Libraries from within Dynamics 365.

If you think this is useful I’d love to get your thoughts on it, please drop a comment below or reach out to me on Twitter or LinkedIn!

Exchange Rate Conversion with Power Automate

Many of us will work in organisations that have a global presence and therefore may have a need to know the value of something in a local currency. There isn’t any out of the box way to convert a base currency to another currency in Power Automate, so I wanted to see how easy it would be to configure.

The Solution

For the purposes of this blog I’m going to configure a simple manually triggered flow that will take a number input for a value to convert, and the base currency from which we are converting, and it will return this value at its current exchange rate in 10 different currencies. There are two parts to this solution:

  1. Configure a Custom Connector to retrieve current exchange rates
  2. Configure the Flow to output the converted value

Configuring a Custom Connector

As there is no out-of-the-box connector to do exchange rate conversion, we need to create our own. If you’ve not created a custom connector before, it is really straightforward; you just need an API that you can access with the appropriate authorisation. Joe Unwin, AKA Flow Joe, covered the steps to creating a custom connector on his blog here.

As I needed an API to access, a bit of google-fu led me to this free Exchange Rates API, which really handily allows you to request the rates against a specific currency by setting the base parameter in the request:
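As an illustration, a request with the base parameter set to GBP, and an abbreviated response, might look like this (the endpoint shape and the rates shown are purely illustrative):

```
GET https://api.exchangeratesapi.io/latest?base=GBP

{
    "base": "GBP",
    "date": "2020-01-30",
    "rates": {
        "USD": 1.2984,
        "EUR": 1.1897,
        "JPY": 142.79
    }
}
```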

So, how do we set up this Custom Connector? There are basically two main steps for this one, as there is no authentication required.

Step 1 – input the host URL in the General Information section of the Custom Connector:

Step 2 – as we saw above, the Open Exchange Rates API allows us to request the latest rates by specifying a base currency parameter. To add this to our custom connector we simply add a new Action in the Definition section, then under the request heading we click Import from Sample and paste the URL with the parameter into the flyout window and specify it as a GET Request. To show just how easy it is, watch the gif below:

Now that we have the custom connector created, we’re ready to create the Flow

Converting Values with a Flow

This Flow is pretty straightforward so I’ll run through each step in turn:

1. Manually trigger a Flow – for the purposes of this demo I’m using a manual trigger, and I’m specifying two inputs: the value to convert and the base currency we’re converting from. Of course, if you were implementing this in your environment any trigger could be used, as long as you’re providing these input values

2. Initialize RateConversion Array variable – In this array I want to specify the Currency name, Currency Code (ISO 4217 code) and the Locale (BCP 47 Code). The Currency Code is used to identify the exchange rate we’ll be converting to in the returned results, while the Locale will be used as part of a formatNumber expression later in the Flow. The Array I’m using is:

    "Currency Name": "US Dollar",
    "Currency Code": "USD",
    "Locale": "en-US"
    "Currency Name": "British Pound Stirling",
    "Currency Code": "GBP",
    "Locale": "en-GB"
    "Currency Name": "Japanese Yen",
    "Currency Code": "JPY",
    "Locale": "ja-JP"
    "Currency Name": "Euro",
    "Currency Code": "EUR",
    "Locale": "fr-FR"
    "Currency Name": "Swedish Krona",
    "Currency Code": "SEK",
    "Locale": "sv-SE"
    "Currency Name": "Australian Dollar",
    "Currency Code": "AUD",
    "Locale": "en-AU"
    "Currency Name": "Canadian Dollar",
    "Currency Code": "CAD",
    "Locale": "en-CA"
    "Currency Name": "New Zealand Dollar",
    "Currency Code": "NZD",
    "Locale": "en-NZ"
    "Currency Name": "South African Rand",
    "Currency Code": "ZAR",
    "Locale": "en-ZA"
    "Currency Name": "Brazilian Real",
    "Currency Code": "BRL",
    "Locale": "pt-BR"

3. Parse RateConversion Array – in this step we’re using a Parse JSON action to enable us to use the elements from the array we create above later in the Flow. The Schema for this step is:

    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Currency Name": {
                "type": "string"
            "Currency Code": {
                "type": "string"
            "Locale": {
                "type": "string"
        "required": [
            "Currency Name",
            "Currency Code",

4. Retrieve Latest Rates – in this step we’re going to use our Custom Connector that we created earlier to retrieve the latest rates. As you can see, because we specified a “base” parameter, it is asking us for that input in the action. We’re using the “Base Currency” input from our trigger in this field

The output from this step looks like:


5. Initialize ConvertedRates string variable – In this step we’re just creating an empty string variable that we’ll add to when we loop through the Array we created in step 2 to convert our values

6. For each requested Rate – We’re going to loop through each of the Rates in the Array we created in Step 2 to convert them

6A. Check that there is a value returned for the requested Rate – this condition is to allow for some error handling in the event that the rate we’ve requested in our Array isn’t returned in the results from our Custom Connector. The expression we’re using is

body('Retrieve_Latest_Rates')?['rates']?[items('For_Each_Requested_Rate')['Currency Code']]

6B. Append No Conversion Available to String – If the condition above returns a negative result then we’ll use an Append to String Variable action to append a note that there is No Conversion Available for the requested rate. This ensures consistency of results.

6C. Append Converted Value to String – if the condition at step 6A returns a positive result then we want to multiply the requested value from our trigger by the exchange rate returned, then format the result in the local currency. The expression we use to do this is:

formatnumber(mul(triggerBody()['number'], float(body('Retrieve_Latest_Rates')?['rates']?[items('For_Each_Requested_Rate')['Currency Code']])), 'C', items('For_Each_Requested_Rate')['Locale'])

Let’s break this down from the inside out:

body('Retrieve_Latest_Rates')?['rates']?[items('For_Each_Requested_Rate')['Currency Code']]

For this part of the expression, we’re interrogating the JSON that is returned from our custom connector in Step 4 above to find an exchange rate. We’re using the Currency Code for the current item in our Array to find the related Exchange Rate. We then need to convert this string to a Float value using the Float expression so we can use it in our multiplication


Now that we’ve returned a Float value, we’re going to use a mul expression to multiply the number we input in our trigger by this value.

formatnumber([MULTIPLICATION_RESULT], 'C', items('For_Each_Requested_Rate')['Locale'])

The final step is to take the value that we’ve returned from our multiplication and format it in a manner that’s representative of the currency we’re displaying. The formatNumber expression allows us to specify that the number we’re working with is a Currency, and the Locale that we retrieve from the array specifies exactly what currency the value is.

7. Respond to a PowerApp or Flow – for this demo I’m assuming we’d want to return the string of converted values to a PowerApp or Flow, but of course you could use this output for any appropriate purpose

The output of the Flow for an input of 10000 GBP looks like:

US Dollar: $12,983.89
British Pound Sterling: £10,000.00
Japanese Yen: ¥1,427,943
Euro: 11 896,55 €
Swedish Krona: 124 872,11 kr
Australian Dollar: $19,242.67
Canadian Dollar: $17,211.93
New Zealand Dollar: $20,024.27
South African Rand: R191 850,86
Brazilian Real: R$ 56.254,02


This is a relatively simple Flow that shows how easy it is to configure a Custom Connector and use an open API in the context of a Flow. Converting values to other currencies could be useful in a number of scenarios, in particular if you are using multiple currencies in your Dynamics 365 environment where keeping the exchange rates up to date is vitally important.

See the Flow in action below:

Quick Tip: Resolving Error Retrieving Teams via FetchXML with List Records Action in CDS (Current Environment) Connector

I ran into a rather odd error recently when I was trying to use a FetchXML query to retrieve a list of Teams using the List Records action with the Common Data Service (Current Environment) Connector in a Power Automate flow, so to save you any future headaches I’ve outlined the issue below and the solution to the problem.

The Problem

In this instance I needed to retrieve a list of Teams related to Accounts that matched a specific N:N relationship. As this involved an N:N relationship, there are multiple levels of link-entities and therefore we cannot use the normal OData filter query parameters.

The FetchXML query I created, using Jonas Rapp’s FetchXML Builder, was:

  <entity name="team" >
    <attribute name="teamid" />
    <link-entity name="account" from="accountid" to="regardingobjectid" >
      <link-entity name="cn_cn_practice_account" from="accountid" to="accountid" >
        <link-entity name="cn_practice" from="cn_practiceid" to="cn_practiceid" >
          <filter>
            <condition attribute="cn_name" operator="eq" value="Investment" />
          </filter>
        </link-entity>
      </link-entity>
    </link-entity>
  </entity>

I tested this query and it returned a valid list of results:

I then took the FetchXML query and added it to my List Records action and ran my Flow, so imagine my surprise when it failed!

The important part of the error details that were returned is below:

[Key property 'ownerid' of type 'Microsoft.Dynamics.CRM.team' is null. Key properties cannot have null values.]

The only problem with this error is that there is no “ownerid” property on a Team…


I was banging my head off my table trying to solve this issue and I was stumped, so I reached out to the community to see if anyone else had any ideas. Fortunately for me, Rob Dawson stepped up to the table with a genius idea: he suggested faking the ownerid as an alias for the administratorid attribute. This would literally never have occurred to me, so I’m extremely grateful for his help!

I added the new FetchXML query, with the ownerid alias for the administratorid attribute and, hey presto, it worked!


This was actually a relatively simple workaround for what appears to be a bug in the List Records step, but hopefully you’ll find it useful too! I’m always grateful to be part of a community that can help me when I’m stuck, and scenarios like this show how important it is to have a great network of peers to help you see solutions that wouldn’t otherwise occur to you.

Converting Time zones with Power Automate

If you’ve ever had to convert a timezone with Power Automate then you’re hopefully aware that there is a “Convert Time Zone” action that is designed to make this process easier for you. If you’ve had to convert one time to multiple different time zones all at once then it very quickly becomes a pain to create a Flow as it requires a new action for each conversion.

Did you know that Microsoft supports 250 different time zones? If you wanted to see your local time represented in all of these different time zones all at once then it would be really inefficient to create a Flow for this, so how could we do it?

Note: if you’d like to see how to achieve the same functionality using a Select action instead of an array then read Pieter Veenstra’s excellent post on the topic.

The Solution

If you’ve read any of my recent posts, you’ll be aware that I’m a big fan of arrays, and this is a perfect opportunity to use one. In the Flow below I’m going to create an array of time zones using the data from the table on the Windows Default Time Zones page I linked above, and then I’ll convert the current UTC time to each of those to generate an output.

1. Trigger – for the purposes of this demo I’m using a manual trigger, but you could of course use any trigger that gives you a time and date you want to work with.

2. Initialize ConvertedTimes string variable – we’ll create an empty string variable called ConvertedTimes that we will use to capture the outputs when we loop through the time zones later in the Flow

3. Initialize TimezonesArray array variable – next we’ll create an array variable that we will populate with our array of time zones for conversion. I’ve included an extract of the array below:

    "Timezone": "Afghanistan Standard Time",
    "UTC Offset": "(UTC+04:30)",
    "Description": "Kabul",
    "Country": "Afghanistan"
    "Timezone": "FLE Standard Time",
    "UTC Offset": "(UTC+02:00)",
    "Description": "Helsinki, Kyiv, Riga, Sofia, Tallinn, Vilnius",
    "Country": "Åland Islands"
    "Timezone": "Central Europe Standard Time",
    "UTC Offset": "(UTC+01:00)",
    "Description": "Belgrade, Bratislava, Budapest, Ljubljana, Prague",
    "Country": "Albania"
    "Timezone": "W. Central Africa Standard Time",
    "UTC Offset": "(UTC+01:00)",
    "Description": "West Central Africa",
    "Country": "Algeria"

The array contains information on each Time Zone, including the name, its UTC offset, a description of some of the cities where it has effect, and the country it applies to.

The full array is included in the Flow that you can download by clicking the link at the bottom of this post.

4. Parse JSON – next we run a Parse JSON step on the TimezonesArray variable so we can access the array item values for use in our Apply to Each loop. The schema that we use for the Parse JSON step is:

    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Timezone": {
                "type": "string"
            "UTC Offset": {
                "type": "string"
            "Description": {
                "type": "string"
            "Country": {
                "type": "string"
        "required": [
            "UTC Offset",

5. Apply to Each – we use the Apply to Each action to loop through each of the time zones from our Time Zones Array. Inside the loop we only have one Append to String Variable action, and we’re using this to convert the time to the Time Zone for the array item and capture the output.

To convert the time zone we use the convertTimeZone expression. As you can see from the reference, the convertTimeZone expression uses the following format:

convertTimeZone('<timestamp>', '<sourceTimeZone>', '<destinationTimeZone>', '<format>'?)

I’ve constructed the expression as follows:

convertTimeZone(utcnow(),'UTC',items('Apply_to_each')['Timezone'], 'dd MMMM yyyy hh:mm tt')
  1. timestamp – I’m using the utcNow() expression to get the current date and time, but of course you could pass in any value from your trigger to here
  2. sourceTimeZone – I’ve hardcoded this to be ‘UTC’ in my Flow, but you could get the current User’s timezone in your own scenario to set it dynamically, or set it to any other timezone
  3. destinationTimeZone – for the destination Time Zone I’m using the Time Zone from the current array item, so it will be set dynamically for each time zone
  4. format – I’ve created a custom date format to show me the day, month as a full word, full year, and then the time as a 12 hour display with an AM/PM designation

For the rest of the String I’m also including the Country, UTC Offset and Description from the array for each item, and then including a line break, so the output will look something like:

30 January 2020 06:34 AM Barbados (UTC-04:00) Georgetown, La Paz, Manaus, San Juan

6. Compose – The final step is just a compose step to let me see the output of the Apply to Each loop. Once you have all the times converted you could use them in whatever way you please

Based on the inputs above, the output from the Flow will look something like:


In this flow I wanted to demonstrate how we could take an array of timezone inputs and convert a specified time to all of them in a quick and efficient manner; in my demo I’ve used an array of 250 time zones but the good thing with an approach like this is that it doesn’t matter how many values are in your array.

I realise it’s unlikely that you’d want to convert a time to 250 different time zone values (and I’m also aware that there is crossover between time zones, so there are multiple time zones sharing the same UTC offset), but I’ve worked in organisations that have worldwide operations so it’s always useful to be able to convert deadlines to local times in each office. Similarly, I know that there are lots of virtual events that have attendees from all over the world so it is handy to be able to see the local time for each attendee.

I think this flow could be expanded even further to make it more dynamic, but I hope it has provided some useful food for thought. If you’d like to download it and have a play with it yourself then please click here

Power Automate HTML Table Styling

If you’ve been using Power Automate flows to send notification emails to people in your organisation then there’s a fair chance you’ve run into a requirement to send a table of results in the email. This is a regular occurrence in my organisation, and I’ve run into some limitations with the standard “Create HTML Table” action, so in this blog I’ll explore them and demonstrate how I’ve managed to work around them.

The Situation

For the purposes of demonstrating this Flow I’m going to use the scenario where I’ve created a Quote for my customer and I want to email it to them using a Power Automate flow.

The Solution

There is an action in Power Automate called “Create HTML Table” and it will probably suffice for most usage scenarios. In the extract from my Flow below I’ve covered the key steps to create a table from a list of CDS records:

1. List CDS records – we use the List Records action to retrieve a list of records from CDS. Similarly, you could retrieve records from SharePoint (or any other data source).

2. Select – we use the Select action to specify matching key:value pairs from the records returned by the List CDS records step above. The easiest way to think about this is that the first column is the “Headers” for the table, whilst the dynamic values in the second column are the “rows” for the table.

3. Create HTML Table – the final step to create the HTML table is to use the Create HTML Table action. As we’ve specified the Columns in the Select action above, we set the Columns to automatic.
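The three actions above boil down to: pick the fields you want, pair each with a header, and render the rows. A minimal Python sketch of that logic (the quote_lines records and field names here are made up for illustration, not real CDS schema):

```python
from html import escape

def create_html_table(records, columns):
    """Rough equivalent of the Select + Create HTML Table actions:
    `columns` maps table headers to record field names."""
    head = "".join(f"<th>{escape(h)}</th>" for h in columns)
    rows = "".join(
        "<tr>"
        + "".join(f"<td>{escape(str(r.get(f, '')))}</td>" for f in columns.values())
        + "</tr>"
        for r in records
    )
    return f"<table><thead><tr>{head}</tr></thead><tbody>{rows}</tbody></table>"

# Hypothetical quote lines standing in for the List CDS records output
quote_lines = [
    {"productname": "Widget", "quantity": 2, "amount": "£10.00"},
    {"productname": "Gadget", "quantity": 1, "amount": "£25.00"},
]
html = create_html_table(
    quote_lines, {"Product": "productname", "Qty": "quantity", "Amount": "amount"}
)
```

The header/field mapping plays the same role as the two columns of the Select action.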

Now that we have an HTML table generated, we can insert it into an email or a Teams message, create a PDF from it, or take any other appropriate action you can think of.

For the purposes of this demo I’m going to send it in an email, and I’ll include some of the details from the parent Quote record. The Send an Email action looks like this:

The output from this action looks like this:

As you can see, it works but the styling leaves a bit to be desired so let’s see if we can pretty it up a bit.

Injecting CSS Styling

One way we can do some styling is to add a CSS stylesheet to our HTML. We can do this by prefixing the output from the Create HTML Table step with some additional HTML using a Compose step. You can use internal CSS in HTML by using the <style> element, and we can do that in the compose step like this:

The HTML style element with the CSS that I used is below if you want to copy and paste it:

table {
  border: 1px solid #1C6EA4;
  background-color: #EEEEEE;
  width: 100%;
  text-align: left;
  border-collapse: collapse;
}
table td, table th {
  border: 1px solid #AAAAAA;
  padding: 3px 2px;
}
table tbody td {
  font-size: 13px;
}
table thead {
  background: #1C6EA4;
  border-bottom: 2px solid #444444;
}
table thead th {
  font-size: 15px;
  font-weight: bold;
  color: #FFFFFF;
  border-left: 2px solid #D0E4F5;
}
table thead th:first-child {
  border-left: none;
}

I used the table designer at http://divtable.com/table-styler/ to give me inspiration for the CSS, and the output with the styling added looks like this:

I think this looks much better: it’s more readable and a bit friendlier, but I still think it can be improved upon. One of the things I experimented with was alternating colours for rows, but unfortunately Outlook seems to ignore this CSS styling. In order to get alternating rows, and to include the information at the bottom of the email in the table, we’re going to have to get creative!

Advanced Styling

The good news is that we can achieve some quite complicated styling using a mix of HTML and CSS. The bad news is that it’s slightly more complex and therefore we can’t use the simple “Create HTML Table” action. The output from my advanced styling looks like this:

I think this is a huge improvement, and looks really professional. The Flow to create this table is slightly longer, but I think it is pretty straightforward to follow. I’ve outlined it below and I’ll go through each step in turn.

1. Trigger – in this Flow I’m using “when a record is updated”, but of course you can use any trigger

2. Initialize RowNumber Integer Variable – the first action is to create a new Integer variable called RowNumber. We’ll use this as part of our Apply to Each loop later in the Flow, but for the time being we’ll default it to 0

3. Initialize TableHTML String Variable – the next action is to create a new String variable called TableHTML. In this we’re going to set the Table Headers. We can also use some inline CSS style elements to set the table style and to set the table header style. We can also set the width of each column.

4. List Records – in this scenario I want to send a quote in an email to a client, so I’m going to get the related Quote Lines. In your own scenario you may wish to retrieve different records, or items from a SharePoint list

5. Apply to Each – next we’re going to loop through the items that are returned from the List step above. There are two actions inside the loop

5A. Increment RowNumber by 1 – in this action we’ll increase the RowNumber variable by 1 so we can identify which loop we’re on, and we’ll use this in the next step to identify if we’re in an even or odd row

5B. Append Product Row to TableHTML – in this step we’re using an Append to String Variable action to add more HTML table rows to the TableHTML. The HTML code we’re using is below:

        <tr style="background-color:@{if(equals(mod(variables('RowNumber'),2),0),'white','#e1e1e1')};">

There are some key elements to consider here:

  1. First, we’re setting the background colour of the row by using some inline CSS within the <tr> element. We have an IF condition in here as follows
    This works by using the MOD expression to check whether the RowNumber variable is divisible by 2. The MOD expression returns the remainder from dividing two numbers: if this equals zero we know we’re on an even row, while any other value means we’re on an odd row. We can then use the true and false branches of the IF expression to set alternating row colours.
  2. We use the FormatNumber expression to set the currency values to the appropriate currency, e.g.
  3. We won’t have a discount applied to each line item, so we need to check if the returned value is null and, if so, set it to 0, i.e.
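The alternating-colour and null-check logic above can be sketched outside of Power Automate. A minimal Python illustration of the mod-based row colouring and the null-to-zero discount default (the sample line items are invented for the example):

```python
def row_colour(row_number):
    """mod(RowNumber, 2) == 0 means an even row (white); otherwise grey."""
    return "white" if row_number % 2 == 0 else "#e1e1e1"

def discount_or_zero(value):
    """Mirror the null check: if a line has no discount, treat it as 0."""
    return value if value is not None else 0

# Simulate the Apply to Each loop over two hypothetical quote lines,
# incrementing the row number on each pass
rows = []
for n, line in enumerate([{"discount": None}, {"discount": 5.0}], start=1):
    rows.append((row_colour(n), discount_or_zero(line["discount"])))
```

The first (odd) row gets the grey background and a zero discount; the second (even) row is white with its real discount.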

6. Append Footer to TableHTML – in the first two examples I demonstrated above the values from the parent record weren’t included in the table, which looked a little bit messy. In order to include them we can append some more HTML to the TableHTML variable. We also know that we want the rows containing subtotals and totals to be bold and to have a different colour background, so we can include some more inline CSS styling for these.

The table we have created above has 5 columns, but the final elements in the footer only use two, so we can also use the HTML colspan attribute to make the row labels span 4 columns, thereby ensuring the row values sit in column 5.

If you want to copy the HTML above, then it is below:

<tr style="text-align:right; background:#c5c5c5; padding:3px">
    <td colspan="4"><strong>Detail Amount:</strong></td>
    <td><strong>@{formatnumber(triggerOutputs()?['body/totallineitemamount'],'C','en-GB')}</strong></td>
</tr>
<tr style="text-align:right; background:#ffffff; padding:3px">
    <td colspan="4">Less Discount %:</td>
    <td>@{formatnumber(div(triggerOutputs()?['body/discountpercentage'],100),'P','en-GB')}</td>
</tr>
<tr style="text-align:right; background:#ffffff; padding:3px">
    <td colspan="4">Less Discount:</td>
    <td>@{if(equals(null,triggerOutputs()?['body/discountamount']),formatnumber(0,'C','en-GB'),formatnumber(triggerOutputs()?['body/discountamount'],'C','en-GB'))}</td>
</tr>
<tr style="text-align:right; background:#c5c5c5; padding:3px">
    <td colspan="4"><strong>Pre-Freight Amount:</strong></td>
    <td><strong>@{formatnumber(triggerOutputs()?['body/totalamountlessfreight'],'C','en-GB')}</strong></td>
</tr>
<tr style="text-align:right; background:#ffffff; padding:3px">
    <td colspan="4">Freight Amount:</td>
    <td>@{if(equals(null,triggerOutputs()?['body/freightamount']),formatnumber(0,'C','en-GB'),formatnumber(triggerOutputs()?['body/freightamount'],'C','en-GB'))}</td>
</tr>
<tr style="text-align:right; background:#ffffff; padding:3px">
    <td colspan="4">Tax Amount:</td>
    <td>@{formatnumber(triggerOutputs()?['body/totaltax'],'C','en-GB')}</td>
</tr>
<tr style="text-align:right; background:#c5c5c5; padding:3px">
    <td colspan="4"><strong>Total Amount:</strong></td>
    <td><strong>@{formatnumber(triggerOutputs()?['body/totalamount'],'C','en-GB')}</strong></td>
</tr>
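The footer rows lean on two expressions: formatNumber for the currency and percentage display, and div to turn the stored percentage into a fraction. A rough Python equivalent of that formatting (with a hard-coded £ symbol standing in for the en-GB culture lookup):

```python
def format_currency(value):
    """Rough stand-in for formatNumber(value, 'C', 'en-GB'):
    pound sign, thousands separators, two decimal places."""
    return f"£{value:,.2f}"

def format_percent(value):
    """Equivalent of formatNumber(div(value, 100), 'P', 'en-GB'):
    divide the stored whole-number percentage by 100, then
    render it as a percentage with two decimal places."""
    return f"{value / 100:.2%}"

print(format_currency(1234.5))  # pound-formatted total
print(format_percent(10))       # discount stored as 10, shown as a %
```

Dividing by 100 first matters because the 'P' format multiplies by 100 again when rendering.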

7. Send an Email – we can use the Send an Email action to send the HTML table we created in an email, or we can use the Teams – Post a Message [STEP 8 above] to post the HTML table in a Teams channel.


This post demonstrates 3 different ways to create and display HTML tables using Power Automate flows:

  1. Simple – using a combination of the Select and Create HTML Table actions
  2. Intermediate – using a combination of the Select and Create HTML Table actions, and adding internal CSS using a Compose action
  3. Advanced – using manually created HTML tables with inline CSS styling, colspan elements and Flow expressions to add formatting

I found this an interesting challenge, and I think it demonstrates how we can get creative with Power Automate to achieve advanced functionality. If you found it useful, or if you have any questions or comments then please let me know!

If you’d like to learn more about HTML tables then I’d highly recommend reading the W3 Schools tutorials.