Monday, December 23, 2024

Batch API for D365FO

Using the Batch API for D365FO

  • You can send requests for multiple operations in a single HTTP request.
  • Batch requests can include up to 1000 individual requests but cannot contain other batch requests.

Without Change Sets:

  • If you send 10 requests in a batch and 2 fail, the remaining requests are not stopped; they continue to be processed.

Using Change Sets:

  • In addition to individual requests, a batch request can include change sets.
  • When multiple operations are included in a change set and any one operation fails, all completed operations are rolled back.


1. Batch URL for request.

{{resource}}/data/$batch

2. Include the following keys and values in the header:

    Content-Type : multipart/mixed; boundary="batch_22975cad-7f57-410d-be15-6363209367ea"


3. Batch boundary (the GUID below): must be unique within the entire request.

        Reusing the same batch boundary value across separate requests causes no issue. However, if you change it, make sure the boundary in the header matches the one used in the body.

batch_22975cad-7f57-410d-be15-6363209367ea

4. Content-ID: Must be unique for each step.

5. When starting a batch or change set, use -- as a prefix to the boundary.

6. When ending one, append -- as a suffix to the boundary.

 

7. A 200 OK response only indicates the status of the entire batch request. To find the result of each individual request, you need to inspect the sub-responses within the batch response separately. 

Success Response:                                                                        Failed Response:


8. The part below, between two boundary markers, is treated as one request.
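To act on the individual results described in point 7, you can scan the raw batch response for its sub-response status lines. A minimal Python sketch (the sample response text in the usage note is illustrative, not an actual D365FO response):

```python
import re

def batch_status_lines(body_text: str) -> list[str]:
    """Collect every sub-response status line from a raw $batch response.
    Sub-responses inside change sets are nested one level deeper, but their
    status lines all start with 'HTTP/1.1', so a simple line scan finds them."""
    return re.findall(r"^HTTP/1\.1 \d{3}.*$", body_text, flags=re.MULTILINE)
```

For example, `batch_status_lines("HTTP/1.1 201 Created\n...\nHTTP/1.1 400 Bad Request\n")` returns both status lines, so you can verify each operation instead of trusting the outer 200 OK.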


Batch API for Insert

    This example inserts a vendor group and a customer group using the Batch API.
Request:

--batch_22975cad-7f57-410d-be15-6363209367ea
Content-Type: multipart/mixed; boundary="changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a"

--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 1

POST VendorGroups?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

{
    "dataAreaId": "USMF",
    "VendorGroupId": "Test01",
    "Description": "Test 1"
}
--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 2

POST VendorGroups?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

{
    "dataAreaId": "USMF",
    "VendorGroupId": "Test02",
    "Description": "Test 2"
}
--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 3

POST CustomerGroups?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

{
    "dataAreaId": "USMF",
    "CustomerGroupId": "TestC01",
    "Description": "Test 1"
}
--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a--
--batch_22975cad-7f57-410d-be15-6363209367ea--
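For reference, a multipart body like the one above can also be generated in code rather than pasted by hand. This Python sketch builds the same structure; the boundary GUIDs are the ones used in this post, and the sending step (commented out) assumes the `requests` library and an OAuth token obtained elsewhere:

```python
import json

BATCH_BOUNDARY = "batch_22975cad-7f57-410d-be15-6363209367ea"
CHANGESET_BOUNDARY = "changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a"

def build_batch_body(operations):
    """operations: list of (method, relative_url, json_body_or_None).
    All operations go into one change set, so they succeed or fail together."""
    lines = [
        f"--{BATCH_BOUNDARY}",
        f'Content-Type: multipart/mixed; boundary="{CHANGESET_BOUNDARY}"',
        "",
    ]
    for content_id, (method, url, body) in enumerate(operations, start=1):
        lines += [
            f"--{CHANGESET_BOUNDARY}",
            "Content-Type: application/http",
            "Content-Transfer-Encoding: binary",
            f"Content-ID: {content_id}",  # unique per step, as in point 4
            "",
            f"{method} {url} HTTP/1.1",
            "Content-Type: application/json; type=entry",
            "",
        ]
        if body is not None:
            lines.append(json.dumps(body))
        lines.append("")
    lines += [f"--{CHANGESET_BOUNDARY}--", f"--{BATCH_BOUNDARY}--"]
    return "\r\n".join(lines)

# Hedged send (resource and token come from your OAuth setup):
# import requests
# resp = requests.post(
#     f"{resource}/data/$batch",
#     headers={
#         "Authorization": f"Bearer {token}",
#         "Content-Type": f'multipart/mixed; boundary="{BATCH_BOUNDARY}"',
#     },
#     data=build_batch_body([
#         ("POST", "VendorGroups?cross-company=true",
#          {"dataAreaId": "USMF", "VendorGroupId": "Test01", "Description": "Test 1"}),
#     ]),
# )
```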


Request:
       1. Vendor group and customer group records are added.




Response:







Batch API for Update

--batch_22975cad-7f57-410d-be15-6363209367ea
Content-Type: multipart/mixed; boundary="changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a"

--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 1

PATCH VendorGroups(dataAreaId='USMF',VendorGroupId='Test01')?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

{
    "Description": "Test 1111"
}
--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 2

PATCH CustomerGroups(dataAreaId='USMF',CustomerGroupId='TestC01')?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

{
    "Description": "Test C1111"
}
--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a--
--batch_22975cad-7f57-410d-be15-6363209367ea--

Response:

Batch API for Delete

--batch_22975cad-7f57-410d-be15-6363209367ea
Content-Type: multipart/mixed; boundary="changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a"

--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 1

DELETE VendorGroups(dataAreaId='USMF',VendorGroupId='Test01')?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 2

DELETE CustomerGroups(dataAreaId='USMF',CustomerGroupId='TestC01')?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a--
--batch_22975cad-7f57-410d-be15-6363209367ea--
Response:

Batch API for Insert, Update, and Delete

--batch_22975cad-7f57-410d-be15-6363209367ea
Content-Type: multipart/mixed; boundary="changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a"

--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 1

POST VendorGroups?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

{
    "dataAreaId": "USMF",
    "VendorGroupId": "Test03",
    "Description": "Test 3"
}
--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 2

PATCH CustomerGroups(dataAreaId='USMF',CustomerGroupId='TestC01')?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

{
    "Description": "Test C1111"
}
--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a
Content-Type: application/http
Content-Transfer-Encoding: binary
Content-ID: 3

DELETE VendorGroups(dataAreaId='USMF',VendorGroupId='Test01')?cross-company=true HTTP/1.1
Content-Type: application/json; type=entry

--changeset_246e6bfe-89a4-4c77-b293-7a433f082e8a--
--batch_22975cad-7f57-410d-be15-6363209367ea--
Response:

     1. Create record response.
 
         
       2. Update and Delete responses.


MS Link.

Keep Daxing!!

Tuesday, December 17, 2024

Dequeue Process in D365FO

API for the export (dequeue) process in D365FO. Using this, we can export a file to another system.

1. Create the export project and select the data entity and file type.

2. Click on the three dots, select 'Manage,' and then 'Manage recurring data jobs.'

3. Click on 'New,' give the project a name, provide the application ID, and set 'Enabled' to true.

4. Click on 'Set processing recurrence' and select the batch time. Set 'Is recurring job enabled' to 'Yes.'

5. Copy the generated ID from the ID field, which will be used in the dequeue URL.

6. Replace the placeholder activity ID in the dequeue URL with the ID copied from the data project.

{{resource}}/api/connector/dequeue/{activity ID}?company=USMF

7. If you click 'Send,' you will receive a 200 OK message and a download location in the response JSON.

8. If a record is not available, it will return a 204 No Content status.

9. Once the batch job is processed, the message status will be in the 'Processed' state. After triggering the URL, the message status changes to 'Dequeued.'
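The 200/204 handling from steps 7 and 8 can be sketched as below. Note the JSON field name for the download location (`DownloadLocation`) is an assumption; verify it against the actual response in your environment:

```python
import json

def handle_dequeue_response(status_code: int, body_text: str):
    """Step 7: a 200 OK carries a download location in the response JSON
    (field name assumed to be 'DownloadLocation' -- verify in your tenant).
    Step 8: a 204 No Content means no message is currently queued."""
    if status_code == 204:
        return None
    if status_code == 200:
        return json.loads(body_text).get("DownloadLocation")
    raise RuntimeError(f"Unexpected dequeue status: {status_code}")
```

The caller can then download the file from the returned URL, or skip this cycle when `None` comes back.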

 

Keep Daxing!!

Wednesday, December 11, 2024

Enqueue Process in D365FO.

API for Import (Enqueue) Process in D365FO.

1. Create the import project and add the data entity and its file type.
2. Click on the three dots, then select 'Manage recurring data jobs' under the 'Manage' tab.

 

3. Click on 'New,' provide a name, enter the application ID, and set 'Enabled' to true. Select the data source type as either 'File' or 'Data Package.'

4. Click on 'Set processing recurrence' and select the batch time. Set 'Is recurring job enabled' to 'Yes.'

 

5. Copy the generated ID from the ID field, which will be used in the enqueue URL.

6. Replace the placeholder activity ID in the enqueue URL with the ID copied from the data project. Also, specify the data entity name and company.

URL:

{{resource}}/api/connector/enqueue/<activity ID>?entity=<entity name>&company=USMF

7. Upload the Excel file under the binary node; the response will return the message ID.

Using Postman:

 Using Logic app:

8. Click on 'Manage messages.' You will find the record in the 'Queued' state after triggering the URL.

9. The batch job will then run, process the records, and change the status accordingly.

10. If you want to check the status, you need to trigger the corresponding URL.

URL:

POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus

Body:

{
    "messageId":"<string>"
}

Using Postman:

 Using Logic app:
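Steps 6 through 10 can be outlined in a small Python sketch. The URL and body composition mirror this post; the commented network calls assume the `requests` library, an OAuth token obtained elsewhere, and an illustrative entity name (`CustomersV3`) and file name:

```python
def enqueue_url(resource: str, activity_id: str, entity: str,
                company: str = "USMF") -> str:
    """Compose the enqueue URL from step 6."""
    return (f"{resource}/api/connector/enqueue/{activity_id}"
            f"?entity={entity}&company={company}")

def message_status_body(message_id: str) -> dict:
    """Body for the GetMessageStatus call in step 10."""
    return {"messageId": message_id}

# Hedged usage (token acquisition not shown):
# import requests
# resp = requests.post(
#     enqueue_url(resource, activity_id, "CustomersV3"),  # entity name illustrative
#     headers={"Authorization": f"Bearer {token}"},
#     data=open("customers.xlsx", "rb"),  # step 7: file under the binary node
# )
# message_id = resp.text  # step 7: the enqueue call returns the message ID
# status = requests.post(
#     f"{resource}/data/DataManagementDefinitionGroups"
#     f"/Microsoft.Dynamics.DataEntities.GetMessageStatus",
#     headers={"Authorization": f"Bearer {token}"},
#     json=message_status_body(message_id),
# )
```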


Keep Daxing!!


Wednesday, December 4, 2024

D365 FO Export package using Rest API

D365 FO Export package.

1. Create the export project in Data Management and add the required data entities to it.

Start the process:

2. Trigger the below URL with the required parameters to export the data as a package.

  • You can assign an execution ID; otherwise, the system will generate one.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage

Input parameters:

{
    "definitionGroupId":"<Data project name>",
    "packageName":"<Name to use for downloaded file.>",
    "executionId":"<Execution Id if it is a rerun>",
    "reExecute":<bool>,
    "legalEntityId":"<Legal entity Id>"
}

3. In response, the system will send a 200 OK status and return the execution ID in the response payload.

4. In the job history, you will find the transaction.

Get or Download file:

5. Trigger GetExportedPackageUrl to retrieve the package, passing the execution ID as an input parameter.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl

Input parameters:

{"executionId":"<Execution Id>"}

6. The system will return a temporary blob URL in the value field. Using this URL, you can download the file.

Output:

Get the status:

7. In some scenarios, the data volume is large and the system takes time to process it. In such cases, the previous step will not return the package immediately; first, check the status of the transaction.

8. To get the status of the transaction, trigger the below URL and pass the execution ID as an input parameter.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus

Input parameters:

{"executionId":"<executionId>"}
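The export flow (start, poll status, download) can be strung together as below. The set of terminal status values is an assumption to verify against your environment, and the network calls are a commented outline assuming the `requests` library and a token from your OAuth setup:

```python
# Assumed terminal states for GetExecutionSummaryStatus; verify in your tenant.
TERMINAL_STATES = {"Succeeded", "PartiallySucceeded", "Failed", "Canceled"}

def is_finished(summary_status: str) -> bool:
    """True once the execution has reached a terminal state (steps 7-8)."""
    return summary_status in TERMINAL_STATES

# Hedged outline of the full flow (token acquisition not shown):
# import time, requests
# base = f"{resource}/data/DataManagementDefinitionGroups"
# hdrs = {"Authorization": f"Bearer {token}"}
# exec_id = requests.post(
#     f"{base}/Microsoft.Dynamics.DataEntities.ExportToPackage",
#     headers=hdrs,
#     json={"definitionGroupId": "ExportProject", "packageName": "pkg.zip",
#           "executionId": "", "reExecute": False, "legalEntityId": "USMF"},
# ).json()["value"]
# while True:
#     status = requests.post(
#         f"{base}/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus",
#         headers=hdrs, json={"executionId": exec_id}).json()["value"]
#     if is_finished(status):
#         break
#     time.sleep(10)  # poll until the batch finishes processing
# url = requests.post(
#     f"{base}/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl",
#     headers=hdrs, json={"executionId": exec_id}).json()["value"]
```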




Keep Daxing!!


Tuesday, December 3, 2024

Import package in D365 using Rest API.

Import package to D365 using Rest API.


1. Before importing the package, we have to upload it to a blob by triggering the below URL to get the Azure write URL. You need to pass uniqueFileName.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl

Input parameters : 

{
    "uniqueFileName":"<string>"
}


2. It will return a 200 OK response with the Azure write URL.

3. Copy the URL from the previous step's response and paste it into a new request.

4. I used Postman for this process, so in the headers, add the required key and value; otherwise, it will throw an error.

Key   : x-ms-blob-type
Value : BlockBlob

5. While triggering this request, use the PUT method. Under 'binary', upload the file and click Send.

6. If the file is successfully added to the temporary blob URL, the system will return a 201 Created status.

7. Create an import project and add the required data entities.

8. After this, trigger the ImportFromPackage URL, passing the required parameters.

URL : 

POST: {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage

Input parameters : 

{
    "packageUrl":"URL from 1st step",
    "definitionGroupId":"Import Data project name",
    "executionId":"<string>",
    "execute":<bool>,
    "overwrite":<bool>,
    "legalEntityId":"<string>"
}
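Steps 1 through 8 can be outlined in Python. The payload builder uses the parameter names from this post; the commented network calls assume the `requests` library, a token from your OAuth setup, and illustrative project and file names:

```python
def import_from_package_body(package_url: str, definition_group_id: str,
                             legal_entity_id: str, execution_id: str = "",
                             execute: bool = True, overwrite: bool = True) -> dict:
    """Payload for ImportFromPackage (step 8), parameter names as in the post."""
    return {
        "packageUrl": package_url,
        "definitionGroupId": definition_group_id,
        "executionId": execution_id,
        "execute": execute,
        "overwrite": overwrite,
        "legalEntityId": legal_entity_id,
    }

# Hedged outline (token acquisition not shown; names illustrative):
# import requests
# base = f"{resource}/data/DataManagementDefinitionGroups"
# hdrs = {"Authorization": f"Bearer {token}"}
# value = requests.post(
#     f"{base}/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl",
#     headers=hdrs, json={"uniqueFileName": "pkg1"}).json()["value"]
# # Inspect 'value' to extract the writable blob URL before the next step.
# requests.put(blob_url,                       # step 5: PUT the package file
#              headers={"x-ms-blob-type": "BlockBlob"},
#              data=open("package.zip", "rb")) # step 6: expect 201 Created
# resp = requests.post(
#     f"{base}/Microsoft.Dynamics.DataEntities.ImportFromPackage",
#     headers=hdrs,
#     json=import_from_package_body(blob_url, "ImportProject", "USMF"))
```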

9. The system will return a 200 OK response and an execution ID.

10. Go to job history and you will find a new transaction.

11. Once it is processed, you will find the data in the staging and main tables.


Keep Daxing!!