In this post, we’ll see how to upload data from a CSV file to a Dynamics 365 (D365) instance using Azure Data Factory. We’ll need the following Azure resources for this demo:
- Azure Data Factory
- Blob Storage
Let’s go through the steps below to see it in action:
- Log in to the Azure Portal
- Click on Create a resource –> Select Storage –> Select Storage Account
- Select Resource Group –> Give a valid name for the storage account –> Click Next:Advanced
- Click on Next: Tags
- Click on Next: Review + Create
- Once the “Validation passed” message is displayed, click Create.
- Browse through the storage account created –> Click on Blobs
- Click on +Container to create a container.
- Give a valid name to the blob container –> click on OK
- Browse through the blob container created –> Click on Upload to upload the CSV file for Contact
- Browse the CSV file to upload and click on Upload.
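The upload step above can also be done programmatically rather than through the portal. Below is a minimal sketch using the azure-storage-blob package; the connection-string environment variable, the `blob_url` helper, and all names are my own illustrative assumptions, not part of the original walkthrough.

```python
# Hypothetical sketch of uploading the CSV to the blob container from code.
# Assumes azure-storage-blob is installed and AZURE_STORAGE_CONNECTION_STRING
# is set; account/container names are illustrative.
import os


def blob_url(account: str, container: str, blob: str) -> str:
    """Build the URL of a blob in a standard (public-cloud) storage account."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob}"


def upload_csv(container: str, local_path: str) -> str:
    """Upload local_path into the given container; returns the blob name used."""
    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    blob_name = os.path.basename(local_path)
    blob = service.get_blob_client(container=container, blob=blob_name)
    with open(local_path, "rb") as fh:
        blob.upload_blob(fh, overwrite=True)  # overwrite so the demo can be re-run
    return blob_name
```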
- Below is the content of the CSV file that we just uploaded to the blob container.
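The actual file content was shown as a screenshot; as a stand-in, here is a hypothetical sample with the kind of columns this demo maps later (the column names and values are my assumptions). It also shows why the “first row contains column names” setting matters: the header row is metadata, not data.

```python
# Illustrative stand-in for the uploaded contact CSV; columns and values are
# assumptions based on the fields mapped later in the demo.
import csv
import io

sample_csv = """\
ContactId,First Name,Middle Name,Last Name,Company Name
9f8c4f0e-1b2a-4c3d-8e5f-6a7b8c9d0e1f,John,Edward,Doe,Contoso
1a2b3c4d-5e6f-4a8b-9c0d-1e2f3a4b5c6d,Jane,Marie,Smith,Fabrikam
"""

# DictReader treats the first row as the header, like the checkbox in ADF.
rows = list(csv.DictReader(io.StringIO(sample_csv)))
print(len(rows))              # 2 data rows; the first line is the header
print(rows[0]["First Name"])  # John
```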
- Now, let’s create Azure Data Factory. Click on Create a resource –> Select Analytics –> Select Data Factory
- Give a valid name to the Data Factory –> Fill the mandatory fields –> Click Create
- Browse through the Azure Data Factory created –> Click on Author & monitor
- Click on Author Icon on the left –> Click on + –> Select Pipeline
- In the pipeline, search for the Copy activity
- Drag the Copy Data activity to the canvas
- Go to Source Tab –> Click +New to create a new source data set
- Select Azure Blob Storage –> Click Finish
- Give a proper name to the Azure Blob Storage data set
- Go to Connection Tab –> Click +New to create a linked service for the source data set
- Select the storage account we have already created above –> Click on Finish
- Click on Browse to fill in the file path of the CSV file
- Browse to the CSV file uploaded already –> Click Finish
- Tick the checkbox to indicate that the first row contains column names
- Go to Schema tab –> Click on Import schema
- Set the type of the ContactId column to GUID
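Since the copy will fail on rows whose ContactId isn’t a valid GUID, a quick local sanity check of the source file can save a failed pipeline run. This is a sketch of my own, not part of ADF:

```python
# Local sanity check that the ContactId column really holds GUIDs, mirroring
# the GUID type chosen in the source schema. Not part of ADF itself.
import uuid


def is_guid(value: str) -> bool:
    """Return True if value parses as a GUID/UUID."""
    try:
        uuid.UUID(value)
        return True
    except ValueError:
        return False


print(is_guid("9f8c4f0e-1b2a-4c3d-8e5f-6a7b8c9d0e1f"))  # True
print(is_guid("not-a-guid"))                            # False
```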
- Now navigate back to the pipeline. Go to the Sink tab –> Click on +New to create the destination data set
- Select Dynamics 365 –> Click Finish
- Give a proper name to the data set
- Go to Connection tab –> Click +New to create a linked service for the data set
- Fill the D365 instance URL, username, password –> Click Test Connection –> Click Finish
- Select Contact as the destination entity to which we’ll copy the data from the source
- Go to Schema Tab –> Click on Import Schema
- Keep the columns that we are going to map and delete the rest of the columns from the schema
- Now, we have configured Source and Sink tabs of pipeline. Let’s navigate to Mapping tab of the pipeline –> Click Import schema
- Verify the generated mapping. If source columns are not mapped to the proper destination columns, map them manually to the respective destination columns. Here, we have mapped “Middle Name” to “Job Title” and “Company Name” to “Website URL”, which is odd but only for demo purposes 🙂
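Conceptually, the Mapping tab applies a source-column to destination-attribute dictionary like the one sketched below. The logical names (`firstname`, `jobtitle`, `websiteurl`, …) are the usual Dynamics 365 Contact attributes, and the odd pairs mirror the demo’s deliberate choices; the helper function is my own illustration, not an ADF API.

```python
# Sketch of what the Mapping tab does: rename source CSV columns to
# destination entity attributes. The "demo quirk" pairs mirror the
# intentionally odd mapping shown in this post.
column_map = {
    "First Name": "firstname",
    "Last Name": "lastname",
    "Middle Name": "jobtitle",     # demo quirk: Middle Name -> Job Title
    "Company Name": "websiteurl",  # demo quirk: Company Name -> Website URL
    "ContactId": "contactid",
}


def map_row(row: dict) -> dict:
    """Rename source CSV columns to destination entity attribute names."""
    return {column_map[col]: value for col, value in row.items() if col in column_map}


source_row = {"First Name": "John", "Middle Name": "Edward", "Company Name": "Contoso"}
print(map_row(source_row))  # {'firstname': 'John', 'jobtitle': 'Edward', 'websiteurl': 'Contoso'}
```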
- Once mapping is done, click on Validate –> If no errors are found in the pipeline validation, click on Publish All.
- Once published, click on Trigger –> Trigger Now to trigger the pipeline to copy the data from CSV file/blob storage to D365 instance.
- Click Finish to run the pipeline.
- Click on the Monitor icon on the left to monitor Azure Data Factory. Click on the Pipeline runs tab, and you can see that the pipeline run has succeeded.
- Once the pipeline run has succeeded, the contacts should have been created in the D365 instance. Let’s navigate to the D365 instance to verify that. Here we can see that the contacts have been created.
Hope it helps!
Thanks for the post. Just a question around creating relationships between records: if I want to upload a contact record with a parent account GUID, is it possible through the connector? From what I’ve seen so far, the connector cannot update lookup fields.
Hi J,
I have also tried the same thing. It seems this feature is not available with ADF currently.
Thanks,
Ajit
Hi, do you know if this ADF Copy Data above works with D365 Finance & Operations specifically as the sink (destination)?
Hi JT, unfortunately, I have not explored FO using ADF.
Thanks,
Ajit
Oh, that is a pity. Never mind then. Thanks anyway.