Logic App for Azure SQL DB to Azure File Storage Workflow

When building integrations, developers often need to transfer bulk data from one system to another. When the data lives in a remote data store, the solution should keep the number of database calls to a minimum.

In this post, we will integrate Azure SQL Database with Azure File Storage. We will extract bulk data from the database and save it to file storage in comma-separated values (CSV) format so that it can be used for further processing.

This sample interface uses an Azure SQL Database that we created as part of a Bring Your Own Database (BYOD) setup, which stores data from Microsoft Dynamics 365 Finance (D365F), such as purchase orders. Once the data is in Azure SQL, we will use Azure Logic Apps to export the bulk data daily to Azure File Storage in CSV format.

Also read: Azure Data Factory for Azure SQL DB to Azure File Storage Pipeline

First, let’s build a simplified version of the interface in this post, and then we can move to a more optimized version that uses SQL pagination.

Pagination is the process of dividing the transaction result set into discrete subsets.
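In T-SQL, pagination is typically expressed with an ORDER BY clause plus OFFSET/FETCH. A minimal sketch, using a hypothetical staging table and a page size of 1000 rows:

```sql
-- Page 2 of the result set (rows 1001-2000); table and column names are illustrative
SELECT PurchaseOrderNumber, VendorAccount, OrderDate
FROM dbo.PurchaseOrderHeaderStaging
ORDER BY PurchaseOrderNumber
OFFSET 1000 ROWS FETCH NEXT 1000 ROWS ONLY;
```

A stable ORDER BY is required here; without it, pages can overlap or skip rows between calls.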

Logic App

The workflow looks like this.

  1. A Recurrence trigger is configured to run once a day.
  2. The file name is configured to be unique for that day.
  3. For a complex SQL query, you can create a view and fetch the records using the Get rows action.
  4. A condition is set up to check whether Get rows returned any records.
  5. If records are returned, we create the CSV file and upload it to Azure File Storage.
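For step 3, a view can pre-shape the data that Get rows returns. A sketch, assuming a hypothetical BYOD staging table for purchase order headers:

```sql
-- Table and column names are illustrative; adjust them to your BYOD schema
CREATE VIEW dbo.vw_DailyPurchaseOrders
AS
SELECT PurchaseOrderNumber, VendorAccount, OrderDate, TotalAmount
FROM dbo.PurchaseOrderHeaderStaging
WHERE OrderDate >= CAST(GETUTCDATE() AS date);
```

The Get rows action can then target the view like any table, keeping the query logic in the database rather than in the Logic App.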

The Initialize variable action uses the concat() function to build the unique file name for the day and stores it in a variable called FileName.
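One way to write such an expression (the file-name prefix is illustrative) is to combine concat() with formatDateTime() and utcNow():

```
concat('PurchaseOrders-', formatDateTime(utcNow(), 'yyyy-MM-dd'), '.csv')
```

Running on 15 March 2024, this produces PurchaseOrders-2024-03-15.csv, so each daily run writes to its own file.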

Get rows (V2) is a SQL Server action that uses the SQL Server name, database name, username, and password to connect to the server.

RowCount is a Compose action that captures the number of records returned by Get rows (V2).
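Assuming the action keeps its default name, the Compose input can apply the length() function to the rows returned by Get rows (V2):

```
length(outputs('Get_rows_(V2)')?['body/value'])
```

The condition in step 4 can then simply compare the RowCount output against 0.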

Finally, Create CSV table takes the value returned by Get rows (V2) to build the CSV table, and Create file (Azure File Storage) takes the output of Create CSV table to create and upload the file.

Once the workflow execution is complete, you will see the file created and uploaded to Azure File Storage.

The Get rows action is limited to returning 2048 records in one call. You can work around this limit by turning on pagination in the action settings.
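With pagination enabled, you can also set a minimum item count in the action's code view; the connector then keeps fetching pages until that many records (or the end of the result set) have been retrieved. A sketch of the relevant fragment:

```json
"runtimeConfiguration": {
    "paginationPolicy": {
        "minimumItemCount": 100000
    }
}
```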


We will see a more robust example of pagination using the SQL Stored Procedure in our next post.


If you liked this post and want to see more content like it, please follow us on Twitter @CursorRun or our Facebook page @CursorRun.

Rahul Sharma
Rahul is a technology enthusiast, solutions architect, trainer, and blogger with more than 18 years of industry experience, working on various Microsoft and open-source solutions. He takes a special interest in designing enterprise applications, cloud integrations, IoT, and other architecture-rich business solutions. Rahul is a guest author with us and shares his knowledge through technology blog posts.


