Sunday, March 23, 2025

Export a Package from D365 Using the Rest API (ExportToPackage) and Upload It to Blob Storage Using Logic Apps



Follow the steps below:

1. Create an export project in DMF.

2. Use the ExportToPackage API URL listed below and pass the required parameters.
    URL:

{Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage

   Input parameters:

{
    "definitionGroupId": "<Data project name>",
    "packageName": "<Name to use for the downloaded file>",
    "executionId": "<Execution Id if it is a rerun>",
    "reExecute": <bool>,
    "legalEntityId": "<Legal entity Id>"
}
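
For reference, a minimal sketch of the full HTTP action with illustrative sample values (the Authorization header assumes a bearer token generated earlier in the flow):

POST {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage
Authorization: Bearer <token>
Content-Type: application/json

{
    "definitionGroupId": "CustomerExport",
    "packageName": "CustomerExport.zip",
    "executionId": "",
    "reExecute": false,
    "legalEntityId": "USMF"
}

The response's value property returns the execution ID, which is used in the steps below.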

3. After triggering the ExportToPackage API, a record will be created in the job history, and the system will prepare a ZIP file.

4. The system will take some time to generate the package, depending on the data volume.

5. To handle this delay, use an Until loop and add a Delay action inside it. On each iteration, check the status of the export job (a sketch of the status call follows below).
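
A minimal sketch of the status check, assuming the GetExecutionSummaryStatus action (the same action used in the import flow later on this blog); the execution ID is the one returned by ExportToPackage:

{Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus

{"executionId":"<Execution Id>"}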

6. Assign the retrieved status to the Compose action.

   body('Get_the_execution_status')?['value']

7. The Until loop will continue running until the status equals "Succeeded."
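
A minimal sketch of the Until loop's exit condition (the Compose action name is illustrative):

equals(outputs('Compose_the_status'), 'Succeeded')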

8. After that, add a condition to check the status, and based on the result, proceed with downloading the file.

9. To download the file, trigger the API URL using the POST method and pass the executionId in the request body.

{Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl


{"executionId":"<Execution Id>"}

10. Finally, upload the downloaded file to Blob Storage.
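
A sketch of this last step, assuming the Azure Blob Storage connector's Create blob action: provide the container name and blob name, and set the content to the body of the download step (the action name is illustrative), for example:

body('Download_the_package')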


Output:


File uploaded to Blob:

 

  • We pass the required parameters, and ExportToPackage returns the execution ID.

  • We pass the execution ID, and the status check returns the execution status.

  • We pass the execution ID, and GetExportedPackageUrl returns the temporary blob URL.

  • We assign the downloaded content to the blob action, and the file is saved in Blob Storage.






Keep Daxing!!

Sunday, March 16, 2025

Creating a ZIP File and Pushing It to D365 Using the REST API in Logic Apps


           I will receive a CSV file, which I need to convert into a ZIP file. This ZIP file will then be pushed to D365.

    

        To create the ZIP file, I have developed an Azure Function. The input file for the function will be the CSV file, and within the code, I will fetch the manifest files from the blob storage.


Creating a ZIP File Using Azure Function

1. I have a CSV file that needs to be converted into a ZIP file, which will include two manifest XML files.

2. The CSV file is picked from the input.

3. The manifest XML files are stored in blob storage. Within the code, I connect to the blob storage and retrieve the XML files.

4. These three files (the CSV and two XML files) are added to the ZIP file, which is then returned by the function.


Code:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Microsoft.WindowsAzure.Storage.Blob;
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage;

namespace Test.test.CreateZip
{
    public static class CreateZipFile
    {
        [FunctionName("CreateZipFile")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            try
            {
                log.LogInformation("CreateZipFile function triggered");

                if (req.Body != null && req.ContentLength > 0)
                {
                    string fileName     = req.Query["FileName"];
                    string processName  = req.Query["ProcessName"];

                    CloudBlobClient         cloudBlobClient;
                    CloudBlobContainer      cloudBlobContainer;
                    CloudStorageAccount     cloudStorageAccount;
                    string                  blobFolder = "";

                    // Connecting to azure blob
                    cloudStorageAccount = CloudStorageAccount.Parse(Environment.GetEnvironmentVariable("BlobConnectionString"));
                    cloudBlobClient     = cloudStorageAccount.CreateCloudBlobClient();
                    cloudBlobContainer  = cloudBlobClient.GetContainerReference(Environment.GetEnvironmentVariable("BlobContainerName"));

                    switch (processName)
                    {
                        case "Process1":  blobFolder = Environment.GetEnvironmentVariable("Prcoess1Folder"); break;
                        case "Prcoess2":  blobFolder = Environment.GetEnvironmentVariable("Prcoess2Folder"); break;
                        case "Prcoess3":  blobFolder = Environment.GetEnvironmentVariable("Prcoess3Folder"); break;
                    }                   
                  
                    CloudBlobDirectory cloudBlobDirectory = cloudBlobContainer.GetDirectoryReference(blobFolder);

                    if (cloudBlobDirectory == null)
                    {
                        log.LogError("An error occurred : Blob is not connected. Check the Environment Variables");

                        return new BadRequestObjectResult("Blob is not connected. Check the Environment Variables");
                    }

                    CloudBlockBlob Manifestblob         = cloudBlobDirectory.GetBlockBlobReference("Manifest.xml");
                    CloudBlockBlob PackageHeaderblob    = cloudBlobDirectory.GetBlockBlobReference("PackageHeader.xml");

                    using (var zipStream = new MemoryStream())
                    {
                        // Add the CSV from the request body and the two manifest files to the zip
                        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create, true))
                        {
                            await AddToZipFile(archive, fileName, null, req.Body);
                            await AddToZipFile(archive, "Manifest.xml", Manifestblob);
                            await AddToZipFile(archive, "PackageHeader.xml", PackageHeaderblob);
                        }

                        zipStream.Position = 0;

                        // Return the zip as a downloadable file response
                        return new FileContentResult(zipStream.ToArray(), "application/zip")
                        {
                            FileDownloadName = processName + ".zip"
                        };
                    }
                }
                else
                {
                    log.LogError("An error occurred : File is mandatory");

                    return new BadRequestObjectResult("File is mandatory");
                }
            }
            catch (Exception ex)
            {
                log.LogError(ex, "An error occurred");

                return new BadRequestObjectResult(ex.ToString());
            }
        }
		
        public static async Task<MemoryStream> downloadAsync(CloudBlockBlob blob)
        {
            // Download the file from blob storage; the caller disposes the stream,
            // so it must not be wrapped in a using block here
            var stream = new MemoryStream();
            await blob.DownloadToStreamAsync(stream);
            stream.Position = 0;

            return stream;
        }
		
        public static async Task AddToZipFile(ZipArchive archive, string fileName, CloudBlockBlob blob, Stream inputStream = null)
        {
            var zipFile = archive.CreateEntry(fileName, CompressionLevel.Optimal);

	    // Add the file to zip stream
            using (var entryStream = zipFile.Open())
            {
                if (blob != null)
                {
                    // Copy the downloaded blob content into the zip entry
                    using (var result = await downloadAsync(blob))
                    {
                        await result.CopyToAsync(entryStream);
                    }
                }
                else
                { 
                    await inputStream.CopyToAsync(entryStream);
                }
            }
        }
    }
}
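
For reference, a sketch of how the function might be called from the Logic App (the Function App URL, key, and query values are illustrative; the CSV content goes in the request body):

POST https://<function-app>.azurewebsites.net/api/CreateZipFile?code=<function-key>&FileName=Customers.csv&ProcessName=Process1
Content-Type: text/csv

<CSV file content>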


Process Workflow

1. Fetch the CSV file from blob storage and pass it to the Azure Function.

2. After the ZIP file is created, delete the original CSV file.

3. Generate a D365 authentication token.

4. Call the GetAzureWriteUrl API to retrieve a temporary blob URL.

URL: ('D365URL')/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl

Headers: 

{
  "Authorization": "Bearer @{body('Generate_D365_Token')?['access_token']}"
}
Body:
{
  "uniqueFileName": "@{outputs('File_Name')?[triggerBody()?['ProcessName']]?['ImportProject']}_DMFPackage"
}

             Get the temporary blob URL from the previous step:

        json(body('Get_Azure_Write_URL')?['value'])
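
The parsed value exposes the writable BlobUrl that the later steps reference (there via an action named Fetch_blob_url); a sketch of the expression to read it directly:

json(body('Get_Azure_Write_URL')?['value'])?['BlobUrl']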

5. Upload the ZIP file to the temporary blob using the URL.   

    Headers: 

{
  "x-ms-blob-type": "BlockBlob"
}
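
A sketch of the upload request itself, following the Azure Blob Storage Put Blob convention (the blob URL comes from the previous step; the request body is the ZIP returned by the Azure Function):

PUT @{outputs('Fetch_blob_url')?['BlobUrl']}
x-ms-blob-type: BlockBlob

<ZIP file content>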

6. Call the ImportFromPackage API to push the ZIP file to D365.

URL: 

('D365URL')/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage

Headers: 

{
  "Authorization": "Bearer @{body('Generate_D365_Token')?['access_token']}"
}

body: 

{
  "packageUrl": "@{outputs('Fetch_blob_url')?['BlobUrl']}",
  "definitionGroupId": "@{outputs('File_Name')?[triggerBody()?['ProcessName']]?['ImportProject']}",
  "executionId": "",
  "execute": true,
  "overwrite": true,
  "legalEntityId": "@{triggerBody()?['Company']}"
}

7. Add an Until loop, with a 1-minute Delay action inside it.

    For the Until loop's exit condition, I used the expression below.

or(equals(variables('D365 Message status'), 'Succeeded'), 
    equals(variables('D365 Message status'), 'PartiallySucceeded'),
    equals(variables('D365 Message status'), 'Failed'),
    equals(variables('D365 Message status'), 'Canceled'))

8. Use the ExecutionSummary URL to check the import status.

URL: 

('D365URL')/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus

body: 

{
    "executionId": "@{body('Push_file_to_D365')?['value']}"
}

9. Assign the status retrieved from the execution summary to the D365 message status variable.
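
A sketch of the value assigned to the variable (the status-check action name is illustrative):

body('Get_the_execution_status')?['value']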

10. If the status is not "Succeeded," send an email notification.

 

Response: 

      • The CSV file is fetched from Blob.

      • The Function App returns the ZIP file.

      • The token is generated.

      • The file is pushed to D365.

      • The first time, the execution status is Executing.

      • The second time, the execution status is Succeeded.





Keep Daxing!!

Wednesday, February 19, 2025

Exporting a File from D365 Using Dequeue and Uploading It to Blob via Logic Apps



Follow the steps below:

1. Create an export project in Data Management (DMF).

2. Under Manage Recurring Data Jobs, create a new record and add the application ID.

3. Set the processing recurrence and enable the corresponding boolean.

4. Copy the value from the ID field.

5. Create an HTTP request, paste the URL provided below, and use the GET method.

     {{resource}}/api/connector/dequeue/{activity ID}?company=USMF

6. Instead of generating a token separately, configure the credentials in the HTTP action's Authentication tab.

7. Retrieve the DownloadLocation from the previous step and assign it to a new HTTP action to download the file.

       body('Get_the_package_Download_URL')?['DownloadLocation']
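
For reference, the dequeue step returns a JSON document; a sketch of its shape with illustrative values (only DownloadLocation is consumed here):

{
    "CorrelationId": "<GUID>",
    "PopReceipt": "<receipt token>",
    "DownloadLocation": "https://<resource>/api/connector/download/<GUID>?correlation-id=<GUID>"
}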

8. Use the Upload to Blob action and provide the container name, blob name, and content.



Output:


1. The Get Package Download URL step returns a JSON response, from which we capture the DownloadLocation.

2. The Download Package step will return an octet-stream.

3. Since the response is a ZIP file, it is not possible to read the data directly.

4. The ZIP file will be stored in Blob Storage.



5. Below is the CSV file exported from D365.



D365:

  • Once the file is fetched from D365, the message status changes under Manage messages.


Keep Daxing!!

Monday, February 3, 2025

Pushing Multiple Records in a Single HTTP Request Using the Batch API in Logic Apps


        We can perform multiple operations in a single request by using the Batch API.

Please follow the steps below:

  • I added an HTTP request action.
  • I configured the following URL and headers.
    URL: {Resource}/data/$batch
    Content-Type: multipart/mixed; boundary=batch_CR


  • I used the batch number as batch_CR and the change set number as changeset_EE.
  • I am performing three operations, shown in the batch body below:
    • Creating a record in the VendorGroups entity.
    • Updating a record in the CustomerGroups entity.
    • Deleting a record in the VendorGroups entity.
      --batch_CR
      Content-Type: multipart/mixed; boundary=changeset_EE
      
      --changeset_EE
      Content-Type: application/http
      Content-Transfer-Encoding: binary
      Content-ID: 1
      
      POST VendorGroups?cross-company=true HTTP/1.1
      Content-Type: application/json; type=entry
      
      {
          "dataAreaId": "USMF",
          "VendorGroupId": "Test04",
          "Description": "Test 4"
      }
      
      --changeset_EE
      Content-Type: application/http
      Content-Transfer-Encoding: binary
      Content-ID: 2
      
      PATCH CustomerGroups(dataAreaId='USMF',CustomerGroupId='TestC01')?cross-company=true HTTP/1.1
      Content-Type: application/json; type=entry
      
      {
          "Description": "Test C1111"
      }
      
      --changeset_EE
      Content-Type: application/http
      Content-Transfer-Encoding: binary
      Content-ID: 3
      
      DELETE VendorGroups(dataAreaId='USMF',VendorGroupId='Test03')?cross-company=true HTTP/1.1
      Content-Type: application/json; type=entry
      
      --changeset_EE--
      --batch_CR--
  • I used OAuth authentication, providing the Client ID, Secret, and Tenant ID.

Output:

 

If the Batch API request Succeeds:

  1. It returns a 201 status code for the create operation, along with the entire payload of the first request.


  2. It returns a 204 status code for the update and delete operations, as no content is returned after an update or delete.

 

If the Batch API request Fails:

  • The main status code is 200, but within the response body, it returns a 500 error.
  • Below is the output of the Batch API request, with the error message highlighted.

Fetching the Error Message:

  • I added conditions to handle both success and failure cases.

  • If the Batch API request fails, I retrieve the error message using the following expression.
    • Fetch the error
      body('Batch_API')['$multipart'][0]['body']['$multipart'][0]['body']['$applicationHttp']['body']['error']
    • Fetch the error status code
      body('Batch_API')['$multipart'][0]['body']['$multipart'][0]['body']['$applicationHttp']['statusCode']

 

  • If the Batch API request succeeds, I extract the body and status code using the expressions below.
    • Fetch the body
      body('Batch_API')['$multipart'][0]['body']['$multipart'][0]['body']['$applicationHttp']['body']
    • Fetch the status code
      body('Batch_API')['$multipart'][0]['body']['$multipart'][0]['body']['$applicationHttp']['statusCode']



Keep Daxing!!