Tuesday, December 17, 2024

Dequeue Process in D365FO

API for the export (Dequeue) process in D365FO. Using this API, we can export a file to another system.

1. Create the export project and select the data entity and file type.

2. Click on the three dots, select 'Manage,' and then 'Manage recurring data jobs.'

3. Click on 'New,' give the project a name, provide the application ID, and set 'Enabled' to true.

4. Click on 'Set processing recurrence' and select the batch time. Set 'Is recurring job enabled' to 'Yes.'

5. Copy the generated ID from the ID field, which will be used in the dequeue URL.

6. Replace the placeholder activity ID in the dequeue URL with the ID copied from the data project.

{{resource}}/api/connector/dequeue/{activity ID}?company=USMF

7. If you click 'Send,' you will receive a 200 OK response and a download location in the response JSON.

8. If a record is not available, it will return a 204 No Content status.

9. Once the batch job has run, the message status will be 'Processed.' After triggering the URL, the message status changes to 'Dequeued.'
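For reference, below is a minimal sketch of how an external system could call the dequeue endpoint, assuming Python with the requests library and a bearer token already obtained from Azure AD. The environment URL, output file name, HTTP verb, and the 'DownloadLocation' field in the response are assumptions; verify them against your own environment.

    import requests

    # Assumptions: hypothetical environment URL and a valid Azure AD bearer token.
    resource = "https://yourenvironment.operations.dynamics.com"
    activity_id = "<activity ID copied from the data project>"
    headers = {"Authorization": "Bearer <token>"}

    # Trigger the dequeue URL (GET is assumed here; adjust if your setup differs).
    url = f"{resource}/api/connector/dequeue/{activity_id}?company=USMF"
    response = requests.get(url, headers=headers)

    if response.status_code == 200:
        # 200 OK: the response JSON contains a temporary download location for the file.
        download_location = response.json().get("DownloadLocation")   # field name assumed
        with open("export_file.zip", "wb") as f:
            f.write(requests.get(download_location).content)
    elif response.status_code == 204:
        # 204 No Content: no processed message is waiting to be dequeued.
        print("Nothing to dequeue")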

 

Keep Daxing!!

Wednesday, December 11, 2024

Enqueue Process in D365FO.

API for Import (Enqueue) Process in D365FO.

1. Create the import project and add the data entity and its file type.
2. Click on the three dots, then select 'Manage recurring data jobs' under the 'Manage' tab.

 

3. Click on 'New,' provide a name, enter the application ID, and set 'Enabled' to true. Select the data source type as either 'File' or 'Data Package.'

4. Click on 'Set processing recurrence' and select the batch time. Set 'Is recurring job enabled' to 'Yes.'

 

5. Copy the generated ID from the ID field, which will be used in the enqueue URL.

6. Replace the placeholder activity ID in the enqueue URL with the ID copied from the data project. Also, specify the data entity name and company.

URL:

{{resource}}/api/connector/enqueue/<activity ID>?entity=<entity name>&company=USMF

7. Upload the Excel file under the 'binary' option in the request body. The response will return the message ID.

Using Postman:

Using Logic App:

8. Click on 'Manage messages.' You will find the record in the 'Queued' state after triggering the URL.

9. The batch job will then run, process the records, and change the status accordingly.

10. To check the status of the message, trigger the URL below, passing the message ID returned in step 7.

URL:

POST /data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus

Body:

{
    "messageId":"<string>"
}

Using Postman:

Using Logic App:
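Putting the enqueue call and the status check together, here is a minimal sketch assuming Python with the requests library and a bearer token already obtained from Azure AD. The environment URL, file name, and the treatment of the enqueue response body as the message ID are assumptions; verify them in your environment.

    import requests

    # Assumptions: hypothetical environment URL, data project activity ID, and entity name.
    resource = "https://yourenvironment.operations.dynamics.com"
    activity_id = "<activity ID copied from the data project>"
    entity = "<entity name>"
    headers = {"Authorization": "Bearer <token>"}

    # Step 7: send the file as the raw request body (the 'binary' option in Postman).
    enqueue_url = f"{resource}/api/connector/enqueue/{activity_id}?entity={entity}&company=USMF"
    with open("Customers.xlsx", "rb") as f:
        enqueue_response = requests.post(enqueue_url, headers=headers, data=f.read())
    message_id = enqueue_response.text.strip()   # response body assumed to be the message ID

    # Step 10: check the message status using GetMessageStatus.
    status_url = (f"{resource}/data/DataManagementDefinitionGroups/"
                  "Microsoft.Dynamics.DataEntities.GetMessageStatus")
    status_response = requests.post(status_url, headers=headers, json={"messageId": message_id})
    print(status_response.json())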


Keep Daxing!!


Wednesday, December 4, 2024

D365 FO Export package using Rest API

D365 FO Export package.

1. Create the export project in Data Management and add the required data entities.

Start the process:

2. Trigger the below URL with the required parameters to export the data as a package.

  • You can assign an execution ID; otherwise, the system will generate one.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage

Input parameters:

{
    "definitionGroupId":"<Data project name>",
    "packageName":"<Name to use for downloaded file.>",
    "executionId":"<Execution Id if it is a rerun>",
    "reExecute":<bool>,
    "legalEntityId":"<Legal entity Id>"
}

3. In response, the system will return 200 OK and include the execution ID in the response payload.

4. In the job history, you will find the transaction.

Get or Download file:

5. Trigger GetExportedPackageUrl to retrieve the package, passing the execution ID as an input parameter.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl

Input parameters:

{"executionId":"<Execution Id>"}

6. The system will return a temporary blob URL in the 'value' field. Using this URL, you can download the file.
Output:

Get the status:

7. In some scenarios, the data volume is large and the system takes time to process it. In such cases, the above step will not return the package immediately; we first need to check the status of the transaction.

8. To get the status of the transaction, trigger the below URL and pass the execution ID as an input parameter.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus

Input parameters:

{"executionId":"<executionId>"}




Keep Daxing!!


Tuesday, December 3, 2024

Import package in D365 using Rest API.

Import package to D365 using Rest API.


1. Before importing the package, we have to upload it to blob storage. Trigger the below URL to get the Azure write URL; you need to pass a unique file name.

URL : 

POST : {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl

Input parameters : 

{
    "uniqueFileName":"<string>"
}


2. It will return a 200 OK response with the Azure write URL.

3. Copy the URL from the previous step's response and paste it into a new request.

4. I have used Postman for this process. In the headers, add the required key and value; otherwise, it will throw an error.

Key   : x-ms-blob-type
Value : BlockBlob

5. Use the PUT method for this request. Under 'binary', upload the file and click 'Send'.

6. If the file is successfully added to the temporary blob URL, the system will return a 201 Created status.

7. Create an import project and add the required data entities.

8. After this, trigger the ImportFromPackage URL, passing the required parameters.

URL : 

POST: {Resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage

Input parameters : 

{
    "packageUrl":"URL from 1st step",
    "definitionGroupId":"Import Data project name",
    "executionId":"<string>",
    "execute":<bool>,
    "overwrite":<bool>,
    "legalEntityId":"<string>"
}

9. The system will return a 200 OK response and an execution ID.

10. Go to job history and you will find a new transaction.

11. Once it is processed, you will find the data in the staging and main tables.
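For reference, here is a minimal sketch of the full flow above, assuming Python with the requests library and a bearer token already obtained from Azure AD. The environment URL, project name, and the assumption that GetAzureWriteUrl returns a JSON string containing 'BlobUrl' in its 'value' field should be verified in your environment.

    import json
    import requests

    # Assumptions: hypothetical environment URL and import data project name.
    resource = "https://yourenvironment.operations.dynamics.com"
    base = f"{resource}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities"
    headers = {"Authorization": "Bearer <token>"}

    # Step 1: request a writable blob URL, passing a unique file name.
    write_url_response = requests.post(f"{base}.GetAzureWriteUrl", headers=headers,
                                       json={"uniqueFileName": "CustomersImport"}).json()
    # The 'value' field is assumed to hold a JSON string with 'BlobId' and 'BlobUrl'.
    blob_url = json.loads(write_url_response["value"])["BlobUrl"]

    # Steps 3-6: PUT the package to the blob URL with the required header.
    with open("Customers.zip", "rb") as f:
        put_response = requests.put(blob_url, data=f.read(),
                                    headers={"x-ms-blob-type": "BlockBlob"})
    print(put_response.status_code)   # 201 Created is expected on success

    # Step 8: trigger the import against the data project.
    import_body = {
        "packageUrl": blob_url,
        "definitionGroupId": "ImportCustomers",   # example import data project name
        "executionId": "",
        "execute": True,
        "overwrite": True,
        "legalEntityId": "USMF",
    }
    execution_id = requests.post(f"{base}.ImportFromPackage", headers=headers, json=import_body).json()["value"]
    print("Execution ID:", execution_id)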


Keep Daxing!!

Tuesday, February 20, 2024

How to retrieve a value from an array or list in a Logic App

 

As per my requirement, I need to trigger a third-party API. Upon a successful response, the API returns an array containing the Account ID. I have to retrieve this value without using a for-each loop.

 I have successfully met this requirement using the following approach. 


Below is the JSON format:

"OrderLines": [
    {
	"line": "1",
	"orderedAccount": {
	    "Reference": {
		"AccountID": "ABC1234"
		}
	    }
    }
]

I have used the below expression:

"body('HTTP')?['OrderLines']?[0]?['orderedAccount']?['Reference']?['AccountID']"


"OrderLines": [
    {
	"Reference": {
	    "AccountID": "ABC1234"
	   }
    },
    {
	"Reference": {
	    "AccountID": "ABC1235"
	    }
    }
],

I have used the below expression:

"body('HTTP')?['OrderLines']?[0]?['Reference']?['AccountID']"

Output: ABC1234.


Keep Daxing!!

Friday, February 16, 2024

Get worker current company using X++

Get worker current company using X++.




    HcmWorker       hcmWorker;
    HcmEmployment   hcmEmployment;
    CompanyInfo     companyInfo;
    utcdatetime     now = DateTimeUtil::utcNow();

    // Select the employment record that is valid as of now and join the worker's legal entity.
    select validTimeState(now) hcmEmployment
        join hcmWorker
            where hcmWorker.RecId == hcmEmployment.Worker
               && hcmWorker.PersonnelNumber == ''   // specify the worker's personnel number here
        join companyInfo
            where companyInfo.RecId == hcmEmployment.LegalEntity;

    // companyInfo now holds the worker's current company.


Keep Daxing!!