
Setup OCR service integration using MS Flow for Dynamics 365 for Finance

Sil de Boer

Updated: Sep 23, 2020


In August 2017 Microsoft introduced the Vendor invoice automation functionality. This functionality allows for native OCR integration with Dynamics F&O. Many people don’t know that you don’t need an ISV solution to perform this operation. In this post I will demonstrate how you can use MS Flow and Dynamics F&O to build an integration with your OCR service.


The below diagram represents the flow of data between the OCR service and Dynamics F&O.

The purple lines represent the integration between the OCR service and Dynamics F&O. When the OCR service has processed the invoice, the metadata and attachment files must be stored on a file location where MS Flow can pick up the created files. Flow can also be triggered when a file is created, for example when using Email, SharePoint or OneDrive as the file repository. MS Flow can read the file location and start an import job via data management.


The OCR service will need to be fed the latest supplier data and configuration data such as dimensions, purchase orders and/or categories. MS Flow can also handle this process.


In this post I will demonstrate how to configure data management and Flow to import and export data. The OCR service process steps depend on the OCR service provider and will not be discussed. Ensure the service provider can store the invoice files where MS Flow can pick up the files (I use OneDrive) and can export the metadata in an acceptable format and layout.

Configure import for invoices

The import of supplier invoice data and attachments has to be completed via a data package. In order to successfully import data packages, the following steps need to be completed:

  1. Configure data project for import

  2. Register Azure application

  3. Configure recurring data imports

  4. Configure MS Flow for import


Configure data project for import

Start by creating a data management export project and add the following data entities:

  • Vendor invoice header

  • Vendor invoice line

  • Vendor invoice document attachment

This process is easier if you create some pending supplier invoices with attachments and lines before you export the data project, and let the system create a sample file for you. If you wish to remove any columns, do so in the export job; this way the columns are also removed from the manifest. If the manifest contains different columns than the related data files, the import will fail. Run the export for the data project and download the package when it is done.

Unzip the downloaded data package and update the Excel files. Ensure the header reference is the same across all data files. The value can be reused for other import jobs. If the import file contains more than one invoice, you need to ensure a unique header reference is used per invoice.


The vendor invoice document attachment entity fields require the following setup:

Create an import data project and add the entities by loading the zip file as a package.


Start the import process to ensure the import project is working.


Register Azure application

The following blog explains how you can register an Azure Active Directory application in Dynamics F&O. This setup is required to configure recurring data jobs in data management. I also used the registration credentials as the authentication method in MS Flow for both import and export tasks. Be sure to save the client ID and secret key, as you will need them later.


Configure recurring data imports

Open the import project and click create recurring data job.

Provide a job name, update the application ID, set the processing recurrence, set the monitoring recurrence and update the supported data source type to ‘Data package’. The processing recurrence determines how often loaded packages will be imported. More on this later.


Configure MS Flow for import

I have configured the following MS Flow to push OCR metadata and images to Dynamics F&O. The created zip file must be acceptable to Dynamics F&O as-is; no transformations are performed in MS Flow or in Dynamics F&O, as both applications have limited capabilities when it comes to data transformation.

I am using the following folder structure in OneDrive:


The flow starts when a new file is created in the Inbound folder.

The file is moved to the Processing folder.


The next step defines the configuration variables. Using a variable, you can set its value once, for example by parsing a string or compiling a message, and then reuse that variable whenever needed. Anything you may use more than once, set in a variable. I used a variable of type Object, which allows me to set more than one value at once. These variables need to be parsed in the next step.


Copy the value (body of previous task) into the sample payload to generate the Parse JSON schema.
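
As an illustration, the object variable could look something like the sketch below. The property names and values are just examples (a hypothetical environment URL, company and the activity ID of the recurring data job); use whatever your own flow needs:

{
  "baseUrl": "https://myenvironment.cloudax.dynamics.com",
  "company": "USMF",
  "activityId": "<activity ID of the recurring data job>",
  "processingFolder": "/OCR/Processing"
}

After the Parse JSON step, each property is available as dynamic content for the HTTP tasks that follow.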

Now you can use the variables in the next tasks. Configure the HTTP action to call the Dynamics API using the variables defined earlier. Each HTTP task needs the authentication details.
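
For reference, the authentication on the HTTP action can be set to Active Directory OAuth, using the details of the Azure application registered earlier; the values below are placeholders:

Authentication:   Active Directory OAuth
Authority:        https://login.microsoftonline.com
Tenant:           <Azure AD tenant ID>
Audience:         https://myenvironment.cloudax.dynamics.com
Client ID:        <application ID of the registered app>
Credential Type:  Secret
Secret:           <client secret saved earlier>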

This task will upload the import file and create an inbound task for the data project with status ‘Queued’. When the processing recurrence of the recurring data job has started, the message will be updated to ‘Processing’. The update can result in ‘Processed’, ‘ProcessedWithErrors’, ‘PreProcessingError’ or ‘PostProcessingError’.
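
The upload itself is a single HTTP POST to the recurring integrations enqueue endpoint. A minimal sketch, assuming the data package zip is sent as the request body and the activity ID is the ID shown on the recurring data job created earlier:

POST https://<baseurl>/api/connector/enqueue/<activity ID>
Authorization: Bearer <token obtained with the Azure AD application>
Content-Type: application/zip

<binary content of the data package zip>

The response contains the ID of the created message, which the later status check uses.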

When the job tasks have been executed, the status will be updated. The next Flow task will get the processing status and will proceed once the status is no longer in progress.

Generate the schema by copying the following sample. The sample is part of the returned message from Dynamics:


{

"DataJobStatus": {

"DataJobState": "Processed"

}

}


Configure the 'Do-until' condition as follows:

@or(

equals(body('Parse_JSON_2')?['DataJobStatus']?['DataJobState'], 'Processed'),

equals(body('Parse_JSON_2')?['DataJobStatus']?['DataJobState'], 'ProcessedWithErrors'),

equals(body('Parse_JSON_2')?['DataJobStatus']?['DataJobState'], 'PostProcessingError'),

equals(body('Parse_JSON_2')?['DataJobStatus']?['DataJobState'], 'PreProcessingError')

)


This condition checks if the data project has progressed past the 'Processing' state. Microsoft has listed all possible states on MS docs.


The next task is a condition that checks if the job has been processed successfully. If so, Flow moves the file to the Success folder; if not, Flow moves the file to the Error folder.
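
The success check can reuse the parsed status from the loop, for example:

@equals(body('Parse_JSON_2')?['DataJobStatus']?['DataJobState'], 'Processed')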

Now start the flow by uploading a file to the 'Inbound' folder. If everything is configured correctly, the import will be successful.


Now the import direction has been completed. Let’s have a look at the export jobs.


Configure export for configuration data

The export job uses a slightly different integration method; however, in this example it still relies on MS Flow to call the API and push the files to the file repository (in this example OneDrive). Ensure the OCR service can access the file repository so it can update its configuration data.


I have split the export configuration in the following steps:

  1. Configure data export project

  2. Configure MS flow


Configure data export project

Navigate to the data management workspace, create a new export project and toggle the Generate data package parameter. Include all the data entities you need. In this example I have added the Vendors V2, Financial dimension values and Procurement categories entities. Depending on your requirements you may want to include purchase order data as well. Remember to select a source format type that the OCR service can use.


Because the Azure application has already been registered, all steps in Dynamics F&O are completed. This integration method doesn’t rely on the setup of recurring data jobs.


Configure MS flow

This Flow performs four different calls to Dynamics F&O, as demonstrated in the diagram below. First, the export of the data project is initiated; when Flow performs this task, Dynamics responds with the execution group ID. This execution group ID is the identification code of the data project run and will be used to check the status of the export job in the next steps. Flow will continue to retrieve the status of the export project until it has succeeded. Once the export has been completed, Flow can retrieve the download location (URL). Once received, Flow can download the data package from this location.


The Flow looks as follows:


The flow is triggered daily and starts off by initializing and parsing the variables, just like the import integration.

The export job is started via an HTTP POST call. I have configured the HTTP steps with the same authentication as the import MS Flow, via Azure Active Directory OAuth.
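
The POST targets the ExportToPackage action of the data management package API. A sketch of the request, with placeholders for the project name, package name and company:

POST https://<baseurl>/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage
{
  "definitionGroupId": "<name of the export project>",
  "packageName": "<name for the generated package>",
  "executionId": "",
  "reExecute": false,
  "legalEntityId": "<company, for example USMF>"
}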


Dynamics passes a message back to Flow. This message is parsed in the next step. I created the schema by copying an example of the returned message. Example messages can be found on MS docs.


HTTP/1.1 200 OK

{
  "@odata.context": "https://<baseurl>/data/$metadata#Edm.String",
  "value": "<executionId>"
}


The message contains the execution group ID of the executed data project, which can be used to request status updates.

Now that the execution group ID has been received, Flow can check the status of the data project. This will be repeated until the data project has successfully exported the data.
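
The status check is another POST, this time to the GetExecutionSummaryStatus action, passing the execution ID that was returned above:

POST https://<baseurl>/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus
{
  "executionId": "<executionId>"
}

The response is a simple string value, for example 'Succeeded', which the Do-until loop can test.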


Flow can retrieve the download location (URL) now that the export was successful.
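
Retrieving the download location is done with the GetExportedPackageUrl action, again passing the execution ID:

POST https://<baseurl>/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl
{
  "executionId": "<executionId>"
}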

The package is downloaded via an HTTP GET request. This request doesn't require any additional authentication, because the URL that the GetExportedPackageUrl API returns includes a shared access signature (SAS) token that grants access to download the file.

Flow can then create the file and extract the package to a folder of your choosing.


Once the Flow has run, the following files should be created.

That’s it for now..

