Dataflow Automation using the Power Query Dataflows Connector in Power Automate
The Power Query Dataflows connector in Power Automate
Dataflows are a self-service, cloud-based data preparation technology. They enable customers to ingest, transform, and load data into Microsoft Dataverse environments, Power BI workspaces, or their organization’s Azure Data Lake Storage account. Dataflows are authored by using Power Query, a unified data connectivity and preparation experience already featured in many Microsoft products, including Excel and Power BI. Customers can trigger dataflows to run either on demand or automatically on a schedule, so data is always kept up to date.
In this blog post, we announce the preview availability of the Power Query Dataflows connector in Power Automate, along with templates and sample scenarios you can get started with right away. The new connector provides a way to react to dataflow events and take actions on dataflows. Combined with the Power Platform and the 400+ other connectors available in Power Automate, the scenarios for the Power Automate Dataflows connector are nearly endless.
What is Power Automate? Power Automate is all about automation. With it, you can automate business processes, send automatic reminders for past-due tasks, move business data between systems, and even automate tasks on your local computer. Just think about the time saved once you automate repetitive manual tasks.
The Power Query Dataflows connector in Power Automate has two parts: a trigger, “When a dataflow refresh completes”, and an action, “Refresh a dataflow”.
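For readers who prefer scripting, dataflow refreshes in Power BI workspaces can also be driven through the Power BI REST API, which the connector's two operations roughly correspond to. The helper functions below are an illustrative sketch, not part of any SDK; real calls also require an Azure AD bearer token, and the IDs shown are placeholders.

```python
# Sketch: rough Power BI REST API counterparts of the connector's two
# operations, for a dataflow that lives in a Power BI workspace.
BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_dataflow_url(workspace_id: str, dataflow_id: str) -> str:
    # "Refresh a dataflow" action: a POST to this URL starts a refresh
    # (Power BI REST API: Dataflows - Refresh Dataflow).
    return f"{BASE}/groups/{workspace_id}/dataflows/{dataflow_id}/refreshes"

def dataflow_transactions_url(workspace_id: str, dataflow_id: str) -> str:
    # One way to observe completion, similar to what the
    # "When a dataflow refresh completes" trigger surfaces: GET this URL
    # and inspect the latest transaction's status
    # (Power BI REST API: Dataflows - Get Dataflow Transactions).
    return f"{BASE}/groups/{workspace_id}/dataflows/{dataflow_id}/transactions"

print(refresh_dataflow_url("<workspace-id>", "<dataflow-id>"))
```

In practice you would POST to the first URL with an `Authorization: Bearer <token>` header, then poll the second until the refresh reaches a terminal state.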
Tutorial – Automatically update my Power BI report from a SharePoint file upload
In this tutorial, I will walk you through how you can use the Power Query Dataflows connector to automate a workflow. Suppose I have a Power BI dashboard that gives me important business insights, based on data stored in SharePoint.
Following dataflows best practices, I use dataflows to transform and prepare my data, so my ETL pipeline is separated from my report and I can easily reuse this data in other reports. I then build a Power BI dataset on top of this data to add the measures and visuals I want in my dashboard.
It is very important to me that my dashboard always shows the latest data present in SharePoint. Because data is uploaded to SharePoint at irregular intervals, a scheduled refresh is not sufficient; I want my data to be refreshed every time a file is uploaded or updated. This is where the new Power Query Dataflows connector in Power Automate comes in.
To summarize, my scenario is: every time a file gets uploaded or updated in SharePoint, trigger a refresh of my dataflow. When the dataflow refresh succeeds, refresh my dataset. You can extend this scenario to any use case where you want up-to-date data in Power BI or Dataverse based on a change in the source data, or on any other trigger that fits your requirements.
To build this scenario, follow these steps:
- Navigate to Power Automate.
- Create a new Automated cloud flow.
- Search for the SharePoint trigger “When a file gets created or modified” and add the location of your SharePoint folder.
- Search for the Power Query Dataflows connector action “Refresh a dataflow” and add the dataflow you want to refresh. Note: the workspace field refers to the Power BI workspace, and the environment field refers to the Power Apps environment.
- Search for the Power Query Dataflows connector trigger “When a dataflow refresh completes” and add the same dataflow that you did in the previous step.
- You can now use the dynamic content of the dataflow refresh to add a condition. Based on the refresh status (succeeded, failed, or terminated), set up Power Automate to take different actions.
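The wait-then-branch logic in the last two steps can be sketched in plain Python. This is an illustrative stand-in, not the connector's implementation; the status strings and the `get_status` callback are placeholders (the connector surfaces the status as dynamic content).

```python
import time
from typing import Callable

# Illustrative status names; the connector reports succeeded/failed/terminated.
TERMINAL_STATUSES = {"Success", "Failed", "Terminated"}

def wait_for_refresh(get_status: Callable[[], str],
                     poll_seconds: float = 30.0,
                     max_polls: int = 20) -> str:
    # Stand-in for the "When a dataflow refresh completes" trigger:
    # poll the refresh status until it reaches a terminal state.
    for _ in range(max_polls):
        status = get_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(poll_seconds)
    return "Timeout"

def on_refresh_complete(status: str) -> str:
    # Mirrors the condition step: on success, refresh the dataset;
    # otherwise, send a notification email.
    if status == "Success":
        return "refresh Power BI dataset"
    return "send failure notification email"
```

For example, `on_refresh_complete(wait_for_refresh(lambda: "Success"))` would take the success branch and refresh the dataset.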
With this flow set up, the following happens:
- When a file is created or modified in my specified SharePoint folder, my dataflow is refreshed.
- When the dataflow refresh completes:
  - If it succeeds, my Power BI dataset is refreshed.
  - If it fails, I am sent an email notification.
Learn more and access more tutorials on Dataflows automation
This tutorial demonstrates how to use the new Power Query Dataflows connector in Power Automate to build on top of dataflows and the Power Platform. In addition to this tutorial, we have many more in our documentation, also listed below:
- Send a notification when a dataflow refresh completes
- Open a ticket when a dataflow refresh fails
- Trigger dataflows and Power BI datasets sequentially
- Use Dataverse to build a dataflows monitoring report
- Use Excel to build a dataflows monitoring report
- Use a Power BI dataset to build a dataflows monitoring report
The Power Query team is interested in your feedback! Feel free to leave a comment below with your thoughts.