How to use Azure Logic Apps with DMF APIs in Dynamics 365 F&O

Video Statistics and Information

Captions
Hello all, once again. We are now in session 28 of the series we have been running for the past one and a half years, and today we are focused on Azure Logic Apps with Microsoft Dynamics 365 Finance and Operations. Today's session is led by Anita, our lead speaker. A quick group introduction: we started this group around one and a half years ago and, with your love and support, it has grown to almost 1,500 active users replying in our groups and engaging with us. I see a lot of people participating and sharing ideas and knowledge, which helps us work out what to bring in for your knowledge sharing and to keep you updated. This volunteer group is led by Rajit, who recently became an MVP, along with Umesh Pandit and Satya Kejriwal. We are all now based in Australia: Rajit leads it from Melbourne, Satya and Umesh from Sydney. We started back in India, and we still look forward to serving you as a community. These are our LinkedIn, Twitter and YouTube handles; you can scan the code to get connected. As soon as the recording is available we will upload it to YouTube, and if you are following or subscribed you will get the notification instantly. We focus on Microsoft Dynamics as well as the surrounding ecosystem, such as Power BI, Power Apps, Power Automate and Power Virtual Agents. In case you want to be a speaker, do touch base with me and register. We started on the 10th of July 2021, and that is the whole spectrum of what we have delivered so far; all the numbered sessions are available on our YouTube channel. I will not delay today's session any further; it is led by Anita. Over to you, Anita: please give us a brief introduction and keep your screen sharing going. Thank you very much, and I welcome you all again.
Thank you, Umesh. Hi all, good morning, good afternoon, good evening, everyone; I hope you are all doing well. I thank Umesh and Rajit for giving me the opportunity to share my knowledge with a wider audience, and I also thank everybody who took time, in spite of when this was scheduled, to attend this session. With no further delay, let me share my screen and go through today's agenda. Let me turn off the camera, Rajit; that might be interrupting. Do you see my screen? Yes, we do. The agenda for today: about the presenter, that's me, Anita Ishwar; then, since we are going to speak about Logic Apps and DMF, the integration side, I will touch on the parameters, the patterns and the batch data APIs, especially in F&O, followed by a demo; and if there are any questions and answers we will have time for those as well. I have been in the Microsoft Dynamics AX world for 17 years; back then it was AX, where we had the blue screen with common credentials to enter our details, so it is exciting to be part of this evolution journey to what is now called F&O. I am purely an F&O technical consultant, and I also have hands-on experience integrating with Azure Logic Apps, the Power Platform, ADF, Azure Data Lake and more, because every day something new keeps coming and we need to keep ourselves updated.
Now I am a Senior Digital Architect with Sonata Software Europe Limited, which is headquartered in Bangalore, India. You might have heard the news that Sonata has a partnership around Microsoft Fabric, which is a hot topic now; in case you have not read it, please take some time to look at how Microsoft Fabric and Sonata Software are collaborating. Originally I am from India and I am currently residing in the UK. I also share posts on my blog, Anita Santosh, as mentioned here; it is not only F&O technical content, whatever I learn, be it Power Platform, Logic Apps, SQL or anything else, I share in that blog, and it is also syndicated to the Microsoft community, so if you are an active member of the Microsoft community you can find it there too. My hobbies are reading books, writing blogs and short travel, and if you want to get in touch, you can find me on LinkedIn and reach out anytime.
Without further delay, let us move on to our topic: the parameters for integration. When you start an integration it is not like a normal development; you need to spend some time on a few key factors that help you decide the approach to follow for your requirement, because various types of integration requirements will come your way. I have listed a few key parameters here that help in designing the approach. The first is data volume. With integration, data obviously flows out of and into F&O, so you first need to find out what the data volume for your integration is, whether it is a minimal flow, a medium one or a voluminous one, because based on this you will decide the integration pattern; some patterns simply do not support voluminous data. So this is one of the key factors where you need to get a handle on data volume, and also make sure you get an idea of the peak and non-peak hours volume. (Sorry, Anita, we lost your screen, are you sharing? Yes, I am sharing, do you see it? I have lost it as well. Let me make you presenter, by mistake I made you an attendee; can you try now? Yes, we can see it.) Coming back to peak and non-peak data volumes: in some cases, such as month-end or year-end closing, the data flow might be higher, so you need to check the volume during peak hours as well as during non-peak hours. That is number one. The next is frequency. Frequency can be ad hoc or based on a recurrence, whether every 30 minutes, hourly, after business hours or bi-weekly, so you need to take care of the frequency as well. It is not necessarily the case that when the frequency is higher the data volume will also be higher, they do not go hand in hand, but it is one of the key factors for the decision. And the next one is synchronous versus asynchronous.
Synchronous means a kind of real time: as soon as you do something in your source system, it is immediately reflected in the destination. Asynchronous is a kind of batch, where there is some delay in sending the data from the source to the destination. These are the key parameters you need to identify when you are working on an integration.
Next, with those parameters in hand, we will see how to match the integration patterns to them. I have listed a few of the integration patterns. The first is OData and custom services. These mostly match real-time scenarios, and they do not support cases where the volume is high: at maximum they support roughly 6,000 records in a five-minute window. So if your integration is sending, say, 10,000 records, this is not the right approach to choose, and even at 6,000 records there are chances of throttling. You can configure the throttling priority feature in F&O to give high priority to your particular service, but there are still chances of AOS performance issues, so keep OData or custom services for your minimum or medium data loads. On frequency, the system can handle a maximum of 52 concurrent requests; it will be used by different users from different ends, and if you cross that limit there are chances of failure. As I said, this is a real-time pattern.
The next is Excel integration, which is already an add-in feature in F&O. The volume is a maximum of 10,000 records exported at a time, whereas imports happen in batches; in F&O system administration there is a setting to define how many records are handled per batch, and if you do not specify it, the default is 100 records per batch. The frequency is ad hoc, whenever users need to import or export data, and it is near real time: there is a very minimal delay before the data is reflected in the end system.
Next is event-driven integration, which, as the word indicates, means business events. This is outbound only from F&O: whenever a defined event happens, an insert, an update, a confirmation or whatever it is, the event is triggered from F&O. Here too there is a limitation on volume, 64 kilobytes per message; if a message crosses that, there is a fair chance of failure and it will not reach your end system, so care has to be taken not to cross this limit. The frequency is ad hoc, because everything is purely event-based, and it is near real time, as I said.
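Before moving on to the batch data APIs, here is a rough sketch of what the OData pattern described above looks like from an external system. This is a hedged example, not taken from the session: the environment URL, tenant, client ID and secret are placeholders, the entity name CustomerGroups and its fields are assumed from the standard customer group entity, and the client-credentials token flow is the usual Azure AD app registration route mentioned later in the demo.

```python
# Hedged sketch: the real-time OData pattern against D365 F&O with an Azure AD
# app registration. All URLs, IDs and secrets below are placeholders.
import requests

FNO_URL = "https://your-env.operations.dynamics.com"
TENANT_ID = "<tenant-guid>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"

def get_token() -> str:
    """Client-credentials token for the F&O resource (Azure AD v1 token endpoint)."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": FNO_URL,
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

# A small, ad-hoc read -- only suitable for the low/medium-volume, real-time pattern.
resp = requests.get(
    f"{FNO_URL}/data/CustomerGroups",
    params={"cross-company": "true", "$top": "100"},
    headers={"Authorization": f"Bearer {get_token()}"},
)
resp.raise_for_status()
for row in resp.json()["value"]:
    print(row.get("CustomerGroupId"), row.get("Description"))
```

Because of the limits discussed above (roughly 6,000 records per five-minute window and 52 concurrent requests), this style should stay with low or medium volumes.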
And here comes the one on our radar: the batch data API. I will explain more about this on the next slide, but it is the best option for voluminous exports or imports, and it is recommended even if you only have a minimal volume. Here you can define a recurrence, or it can even be combined with event triggering. This is asynchronous: you set up a batch, and based on batch availability the records are processed.
Now on to the next slide. Our topic is based on the batch data APIs, and we have two different types in F&O: the recurring integrations API and the DMF package API. There is no major difference in the end result; in both cases the data is split and processed as batches when F&O exports or imports it, but there are some differences between recurring integrations and the package API. With recurring integrations, the schedule is defined in F&O: you create an export or import project, define a batch for it, and everything is driven by the batch and activity reference you set there. With the package API, the scheduling happens outside F&O, via Logic Apps, the Power Platform or whatever external system is going to call the API. The supported formats for recurring integrations are files and data packages (a data package consists of the data file, the manifest and the package header), whereas the package API supports only data packages. If your requirement involves any XSLT transformation, you need to go for recurring integrations, because XSLT is not supported in the package API; but if the transformation can be handled in the external system, the package API can be used. Recurring integrations is mainly for the cloud, whereas the package API is applicable to both on-premises and cloud. The commands used in recurring integrations are Enqueue, Dequeue, Acknowledge and GetMessageStatus; this might sound like a lot now, but once you start implementing, these become familiar keywords. In the data management package API the commands are ImportFromPackage, ExportToPackage and GetAzureWriteUrl, and these are the keywords you will see in the demo. In our case I will give you a hands-on demo of the data management framework package API.
Next, the demo. We are going to build a simple scenario where we import a customer group record as a package into F&O, and the task is to handle this via Logic Apps. My trigger will be a recurrence; for example, let us set it to every three hours. It could also be event-driven or an HTTP trigger, purely based on your requirement, but to keep it simple I have kept it as a three-hour recurrence. I have stored the file in a container: in Azure you have storage accounts, I have created one, and in its container I have kept the file, which contains a simple customer group for our system. Now that this is ready, let us go to the blob steps. You have an Azure Storage Blob connector, and these are all the actions related to Azure Blob Storage.
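For anyone who wants to reproduce the first two blob steps of the demo outside Logic Apps, a minimal Python sketch using the azure-storage-blob SDK might look like this. The connection string and container name are placeholders, assuming the same access-key authentication used in the demo.

```python
# Hedged sketch: a rough Python equivalent of the "List blobs" / "Get blob content"
# Logic App steps, using the azure-storage-blob SDK with the storage access key.
from azure.storage.blob import ContainerClient

# Placeholder connection string from the storage account's "Access keys" blade.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"

container = ContainerClient.from_connection_string(CONN_STR, container_name="customergroup")

packages = {}
for blob in container.list_blobs():                         # "List blobs" step
    content = container.download_blob(blob.name).readall()  # "Get blob content" step
    packages[blob.name] = content                           # package zip: data + manifest + header
    print(blob.name, len(content), "bytes")
```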
The first action is List blobs, which lists the files to be read from this blob container. By default my connection is already available; if you want to make a new connection it is not a big thing, choose change connection and add a new one. For the demo I give it a name and set the authentication type to access key. To get the key, go to the storage account and open Access keys; these are created automatically as soon as you create a storage account. Once the connection is ready you can see the folder I created, the customer group container, so I choose that one.
So we have listed the blobs. The next step is to get the file content from the blob, and there is an action for that on the same connector: Get blob content. When I pass it the blob from the previous step, a loop is added automatically, because the container can hold multiple files; the framework takes care of bringing in the for-each loop for you.
Now that we have the blob content, here comes the part where we connect to F&O. Search for Finance, choose the Fin & Ops Apps connector, and add an Execute action. I already have a connection, but let me show you how to create a new one: I am going to connect with a service principal. A service principal is nothing but an app registration, where you have the client ID, secret and tenant details, and with those I create the connection to Finance and Operations. Once connected, you get the tenant and the instances available for the app registration you have done.
The next step is to create a writable blob URL, which generates a SAS token; let me tell you why we do this. Give it a moment to load, it takes a few seconds the first time. If you scroll down, because everything revolves around the data management framework, go to the data management section and you will find GetAzureWriteUrl. You use this because you cannot pass the file content, this particular zip package, directly to the ImportFromPackage command; you need to upload it to a package URL and pass that URL instead, and that is why we use GetAzureWriteUrl. The unique file name can be anything, so I give one for now.
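The next two steps in the demo, GetAzureWriteUrl and the Parse JSON on its output, map to a pair of calls like the following. This is a hedged sketch: the action path under DataManagementDefinitionGroups and the uniqueFileName parameter are written from memory of the DMF package API, so verify them against your environment, and the token is assumed to come from the same client-credentials flow shown earlier.

```python
# Hedged sketch: the GetAzureWriteUrl execute action and the Parse JSON step.
# Action path and parameter name are written from memory of the DMF package API;
# verify against your environment. TOKEN is a placeholder bearer token.
import json
import requests

FNO_URL = "https://your-env.operations.dynamics.com"
TOKEN = "<bearer-token-from-client-credentials-flow>"
headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

resp = requests.post(
    f"{FNO_URL}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl",
    headers=headers,
    json={"uniqueFileName": "custgroup-import.zip"},  # any unique name, as in the demo
)
resp.raise_for_status()

# The action returns {"value": "<json string>"}; the inner string holds BlobId and
# BlobUrl, which is why the Logic App needs a Parse JSON step on this output.
inner = json.loads(resp.json()["value"])
blob_url = inner["BlobUrl"]
print("writable blob URL:", blob_url)
```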
One more thing here: this step generates a blob URL, and its output is a blob ID and a blob URL in JSON format. Our goal is to extract the BlobUrl from the output of GetAzureWriteUrl, and for that we use Parse JSON; you already have a Data operations action called Parse JSON. The content is the value output of the previous step, and for the schema, the first time, I simply copy the sample output and convert it to a JSON schema; I usually do that with a plugin I have installed that formats JSON for me. Use the sample payload to generate the schema and save it.
The output of this Parse JSON step gives you the blob URL, and into that blob URL we are going to upload the file, so that the URL will hold your package. For that, add an HTTP action with the PUT method: the URI is the BlobUrl from the output of the previous step, and the body is the file content from the blob, the package zip containing the manifest, the package header and the data. You can rename the step properly; no extra parameters are needed. Now this URL is ready and holds our package file.
The last step is again an Execute action. I choose the same instance, and here you will see ImportFromPackage if you scroll and search for it. As soon as you select it, you are given various parameters. The package URL is the URL generated in the previous steps, because it now holds your file, so I pass the BlobUrl here. The definition group ID is the import project name; if you do not have the project created in F&O, the framework takes care of creating it for you automatically, but in our case I have created the import project, let me show it to you; this is the project, and although we are importing a package, it uses a simple CSV data format. That project name goes in the definition group ID. Execution ID can be left blank, because it is generated on the fly. Execute is yes, because the import should run the target step as well: the data goes via staging and then to the target, and this parameter says the target step has to be executed. Overwrite means that any existing records or files will be overwritten. For the legal entity, at the moment I am going to use USMF. I rename the steps, and now our Logic App is ready; it is time to execute it.
There are two ways to run it; I usually go to Run trigger. It is executing, and there is an error, let me check. Maybe the file content is not proper here; the Execute action is fine, but the failure happens when uploading the file content, so something is wrong there. Just to save time, I have already created a working version of the same flow with the same steps, so, given our time limitation, let me execute that one instead. It succeeded, and if you go to the job history in F&O it has ended; if you look at the customer groups, Test 1 and Test 2 have been created.
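For reference, the two remaining steps of the package-API flow, the HTTP PUT of the package to the writable blob URL and the ImportFromPackage call, look roughly like this outside Logic Apps. Parameter names mirror what is shown in the demo; the x-ms-blob-type header, the file names and the import project name are assumptions.

```python
# Hedged sketch: upload the package to the writable blob URL (HTTP PUT) and then
# call ImportFromPackage. Parameter names mirror the demo; the x-ms-blob-type
# header, file names and the import project name are assumptions.
import requests

FNO_URL = "https://your-env.operations.dynamics.com"
TOKEN = "<bearer-token>"
blob_url = "<writable BlobUrl returned by GetAzureWriteUrl>"

package_bytes = open("custgroup-package.zip", "rb").read()  # data file + manifest + package header

# Step 1: HTTP PUT of the package content to the SAS URL.
put = requests.put(blob_url, data=package_bytes, headers={"x-ms-blob-type": "BlockBlob"})
put.raise_for_status()

# Step 2: ImportFromPackage, the last execute action in the demo.
resp = requests.post(
    f"{FNO_URL}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage",
    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
    json={
        "packageUrl": blob_url,                  # URL that now holds the package
        "definitionGroupId": "CustGroupImport",  # import project name in F&O (assumed)
        "executionId": "",                       # blank: generated on the fly
        "execute": True,                         # run the target step (staging -> target)
        "overwrite": True,                       # overwrite existing records
        "legalEntityId": "USMF",
    },
)
resp.raise_for_status()
print("execution id:", resp.json().get("value"))
```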
There is also another way, where you can use enqueue to import the data into F&O. Maybe I can quickly show that as well; Rajit, if we have some time, can I add one more step explaining how to use enqueue? Is that fine, do we have more time? Yes, yes, that is fine.
What we did so far was via the package API, importing the package through ImportFromPackage. I will also show you an example of how to do it via the enqueue format. This is recurring integration, where you need to set up a batch in F&O; the command is already available. I already have a project here, so let me show how to set up the recurrence: go to Manage recurring data jobs. In the authorization policy you should give the application ID, so let me get the application ID. Then set the data source format: if you are importing a data package it should be Data package, and if you are importing a plain file it should be File; in our case we are importing a data package, so let us set it to Data package. The recurrence will be one minute, and you need to enable the recurrence. The ID shown here is the key one we will use in our Logic App, so let me copy this URL; this is for enqueue. The earlier approach did not have any batch recurrence set up, which is why I told you in my earlier slide that it is scheduled outside F&O; in this case the scheduling is inside F&O.
Now, back in the Logic App, I will reuse the same flow, it does not matter, and add an HTTP action with the POST method. I paste the URL: it is the F&O URL plus the activity ID we just set, copied with the same convention of the open and close braces, and the entity name parameter is Customer groups. The body content is nothing but the file content we already read from the blob. Here you need to add authentication explicitly, because you are not using the Execute action of the F&O connector but a plain HTTP action: click Add parameter, choose Authentication, and select Active Directory OAuth; the tenant is the same as before, the app ID is the client ID, the audience is the F&O URL, and then the secret. So for enqueue you explicitly define a recurrence in F&O, and that is what gets scheduled.
Now let us run it. I will just remove the earlier steps so they do not fail, or, alternatively, I can configure this step to run only if the previous one has failed, so the rest of the steps are not executed. Let us execute: it is running, and it has executed. The earlier step still fails because of that file content issue, not to worry, but this one has executed, and you will find a record under Manage recurring data jobs, then Manage messages, where it is in progress. If you look at the output of this run you will see a message ID; the output here ends in 3b1, which matches the one in Manage messages. It moves from in progress to processed, and if you look, Test and Test 2 are there; if the record already exists it is overwritten, because the framework takes care of it.
So in this demo I have given you a walkthrough of how to use the data management package API, where the scheduling sits outside F&O, and the enqueue approach, where the recurrence is scheduled inside F&O. If you have any doubts, we can start the questions.
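As a companion to the enqueue walkthrough above, a hedged sketch of the same HTTP POST made outside Logic Apps could look like this. The URL shape with the activity ID and the entity query parameter follows what was typed in the demo; the activity ID, entity name and file names are placeholders, and authentication is the same Azure AD app registration (Active Directory OAuth).

```python
# Hedged sketch: the enqueue call of the recurring integrations API. The URL shape
# (/api/connector/enqueue/<activity id>?entity=...) follows what was typed in the
# demo; activity ID, entity name and file name are placeholders.
import requests

FNO_URL = "https://your-env.operations.dynamics.com"
TOKEN = "<bearer-token>"
ACTIVITY_ID = "<ID from Manage recurring data jobs>"

package_bytes = open("custgroup-package.zip", "rb").read()  # or a plain file if source format = File

resp = requests.post(
    f"{FNO_URL}/api/connector/enqueue/{ACTIVITY_ID}",
    params={"entity": "Customer groups"},         # entity name, as entered in the demo
    headers={"Authorization": f"Bearer {TOKEN}"},
    data=package_bytes,                           # body = the file content read from the blob
)
resp.raise_for_status()
print("message id:", resp.text)                   # track it under Manage messages in F&O
```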
Anita, you have not used the Key Vault parameters, right? Like, for checking the data, do we need to add those interface and Key Vault parameters? No, I am not using Key Vault here; Key Vault is another option. In this case I am using the app registration, where the client ID, tenant and secret come into the picture. There is also the option of using Key Vault; you have various options for this integration, and here I have chosen the app registration route, which is OAuth (Azure AD) authentication.
First of all, Anita and team, appreciation, because you have been doing a fantastic service providing knowledge sharing; I thoroughly follow this community and it has been really useful, so I must extend my thanks to you people for doing a wonderful job. One scenario I could think of: you mentioned you use a package as the input for the Logic App, which is picked up from a source location and processed. In a few scenarios, especially when we work with third parties where we get a file from SFTP online, we would only receive CSV files, and for an import project where we want to use a package for the integration, is there a way of preparing a package on the fly with a Logic App? Every package has a manifest, a header and the data files, but in some cases we only have a CSV in the source; how do we take that CSV, prepare a package in the Logic App and pass it on for further execution?
You have two options here. One is to use Azure Functions: the Logic App gets the file, and an Azure Function creates the package for you; that works with the data management package API. The other option is, since you have an SFTP connection, you can connect to it from the Logic App using the SFTP connector with whatever credentials they give you, and I also gave a walkthrough of the enqueue method, so you can use that to import the file directly; in my demo I imported a package, but with enqueue you can import a plain file directly. So that is another approach for you.
And just to add, thanks, mate, first of all for the appreciation; we need this love and support to keep driving this group, so keep joining us in all the sessions. Another option: if you are dealing with scenarios where you do not have the expertise or in-house developers to write Azure Functions, there are paid connectors available to create zip files. What you can do is define your manifest files as a template somewhere and then use these paid connectors to create the zip for you. That way you are not investing in writing an Azure Function, but instead buying a subscription that gives you flexibility around creating packages, because there can be scenarios where you have multiple entities in one zip file, or attachments for example, and then it can be quite cumbersome to write your own Azure Function. So that is the range of options: Azure Functions, ADF pipelines, or paid connectors.
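To make the "build the package yourself" option more concrete, here is a minimal sketch of what an Azure Function (or any small script) could do: zip the incoming CSV together with Manifest.xml and PackageHeader.xml templates taken from a dummy export of the same data project. The file names inside the zip are assumptions based on that exported template, so check them against a package exported from your own environment.

```python
# Hedged sketch: build a DMF package from a plain CSV (e.g. inside an Azure Function).
# Manifest.xml and PackageHeader.xml are taken from a one-off dummy export of the same
# data project; the file names inside the zip are assumptions based on that template.
import io
import zipfile

def build_package(csv_bytes: bytes, manifest_xml: bytes, header_xml: bytes,
                  data_file_name: str = "Customer groups.csv") -> bytes:
    """Return package bytes: data file + Manifest.xml + PackageHeader.xml."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr(data_file_name, csv_bytes)  # must match the file name in the manifest
        z.writestr("Manifest.xml", manifest_xml)
        z.writestr("PackageHeader.xml", header_xml)
    return buf.getvalue()

# Usage: feed the result to GetAzureWriteUrl + ImportFromPackage as in the demo.
package = build_package(
    open("customers_from_sftp.csv", "rb").read(),
    open("Manifest.xml", "rb").read(),
    open("PackageHeader.xml", "rb").read(),
)
open("custgroup-package.zip", "wb").write(package)
```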
Thank you, Rajit and Anita. This is one of the scenarios I had a chance to work on in my project: we were doing bank statement imports, and for security reasons with the banks we were only getting CSVs, so everything had to be prepared manually, and at that time we had challenges preparing the package; as Rajit mentioned, we ran into the same issue of not having anybody who could create Azure Functions. That is where I wanted to understand this, and I think I have got it now, so thank you very much for sharing. One thing to remember: before doing it in the customer environment, test it internally in your development environment; that would be great. Yes, that has been taken care of.
All right, anyone else want to unmute and ask a question? I see a long question from Lahiru. Lahiru, do you want to unmute and explain your scenario, mate? Hi Anita, I have to transfer data from D365 to another database, and this has to run every ten minutes. Is that a custom entity or an out-of-the-box entity? If change tracking is enabled, we need to check whether it is working properly: if change tracking is enabled and you have set up a data export project for it, that should take care of exporting only the incremental load. You can enable it for the primary table, or for the entire entity, that is, all the data sources, or a custom query. Yes, the entity has its data sources and I activated change tracking for the entire entity, but it is still not exporting only the incremental load. I first added ten rows in the primary table and ran the Logic App, and I got the ten rows; then I added one more row, and the next time I ran it I got eleven rows, but I need to get only the last updated one. You do not need to go all the way to the Logic App for this: create an export project in F&O and export it manually to see; you do not need to disturb the Logic App every time. Create a data management project, add that entity, and make sure change tracking is properly enabled, whether for the primary table, the custom query or the entire entity. Is there anything we can do without enabling change tracking? No, it is not possible; you need change tracking to bring only the changed data. Beyond enabling change tracking, maybe you need to see what is wrong with your joins; check whether the joins in the custom entity are proper, because that might also matter. And if you modify a record, is the change being exported, or only when you add? Only when I add. Then there is something wrong in the custom entity, so please check the joins and see how it works. One more thing you can do: my approach is that whenever I create a custom entity, enable change tracking and it does not work, I go back to a standard entity and see what I have missed in my custom entity; I just compare the two and usually get some clues there.
You can also check the CT query: on the entity there is a method for the change-tracking query, so that is also worth checking. Another option is to use a modified date and time field and put a filter on the DMF export project to include only recent changes. That is a pattern I have seen customers use, filtering on the modified date/time, and if it is not a very high-volume scenario you can also use an OData call with a filter expression on the modified date/time to fetch your delta. Another option is the Export to Data Lake microservice, which can feed delta data out: you install the microservice, bring your data into the data lake, and from there you can query it and move it forward. So there are all these options; depending on your situation you can troubleshoot it. Thanks.
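The "filter on modified date/time" option mentioned above can be illustrated with a small OData read that only fetches rows changed since a stored watermark. This is a hedged sketch for low or medium volumes: the entity name CustomersV3 and the ModifiedDateTime and other field names are assumptions for illustration, not something shown in the session.

```python
# Hedged sketch: a delta read using an OData $filter on a modified date/time field,
# driven by a stored watermark. Entity and field names are assumptions for illustration.
import requests

FNO_URL = "https://your-env.operations.dynamics.com"
TOKEN = "<bearer-token>"
last_run_utc = "2023-06-01T00:00:00Z"   # watermark persisted from the previous run

resp = requests.get(
    f"{FNO_URL}/data/CustomersV3",
    params={
        "cross-company": "true",
        "$filter": f"ModifiedDateTime gt {last_run_utc}",
        "$select": "dataAreaId,CustomerAccount,ModifiedDateTime",
    },
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
delta_rows = resp.json()["value"]
print(len(delta_rows), "rows changed since", last_run_utc)
```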
Cool, so let us take the next one. Kunal, you are number one in the list, so maybe you can go next; do you have a question? Yes, sorry, I was on mute. Hello everyone, and thank you so much; the sessions are pretty engaging and they really give us a surface to at least start on something and then, based on the knowledge we receive from you, build something great, so thank you for these sessions. Anita, I have a question, and maybe I am unable to formulate it well. I saw in the session that when you were building the Logic App you chose the company, the data area ID, manually; you used USMF as the data area ID where you want to push your files. Say I want to use multiple data area IDs, with two containers: one container holds all the files for legal entity one and the other holds all the files for legal entity two. In that case, do I have to build two separate flows, since I saw you choosing the data area ID manually? Or is it possible to have a CSV file or a data package where every row has a data area ID, and the Logic App itself takes care that a row with data area ID one goes to company one, and a row with data area ID two goes to company two? How do I proceed in such a scenario; do I need separate flows?
It comes down to whether your file carries that information: does it have the company ID, or do we need to mention it in the command, in which case you are saying which company the data has to go to? At least that information should be in your file; based on that you can loop, extract the data area ID from the file, and import accordingly. So you mean that in some field I specify the data area ID where the action needs to be performed, and then the Logic App itself identifies it? The Logic App will not identify it by itself, but it can at least segregate the data: these rows for this company. We actually had a similar requirement, and what we did was read the file, segregate it, and create one file per legal entity, and then we sent each file via enqueue. I am not sure whether there is a better approach, but that is how we did it: one file for each legal entity, after first reading the original file to segregate the data. So, using the same flow, the first iteration uses data area ID one with file one, and the second iteration switches the data area ID and uploads file two? Yes, exactly. Okay, thank you, that solves the purpose.
I would also like to add my views here. Are you calling standard data entities, or do you have a custom entity? I am assuming the same, like the customer group Anita picked. As Anita mentioned, if it is a standard entity you cannot have cross-company unless it is enabled out of the box for that entity. Otherwise, a third option is to create your own custom data entity where you keep the data area ID as one of the columns, and then write logic in the data entity to change company while it is processing. That has performance implications, though, because for every record you switch company, create the record there, come back and switch company again; so if it is a high-volume scenario, the recommendation is to do it outside Dynamics, for example with an ADF pipeline or an Azure Function, whatever your team is skilled in, to split the incoming data into multiple files per legal entity and push them into Dynamics. Otherwise the option is to customize Dynamics with your own custom data entity; it really depends on the fields, but that is another option. Okay, thank you so much.
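The "split the file per legal entity outside Dynamics" approach discussed above can be sketched roughly as follows, assuming the incoming CSV carries the company in a DATAAREAID column (a hypothetical column name); each resulting file can then be enqueued or packaged against its own legal entity.

```python
# Hedged sketch: split one multi-company CSV into one file per legal entity before
# sending each to F&O. Assumes a DATAAREAID column (hypothetical) carries the company.
import csv
import io
from collections import defaultdict

def split_by_company(path: str, company_column: str = "DATAAREAID") -> dict:
    """Return {company id: csv text}, repeating the header in every split file."""
    groups = defaultdict(list)
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames or []
        for row in reader:
            groups[row[company_column]].append(row)

    result = {}
    for company, rows in groups.items():
        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
        result[company] = out.getvalue()
    return result

# Each per-company file can then be enqueued (or packaged) against that legal entity.
for company, text in split_by_company("customer_groups_all_companies.csv").items():
    with open(f"custgroups_{company}.csv", "w", newline="") as f:
        f.write(text)
```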
Cool, the next one is Ali; do you want to go next, mate? Yes, thank you, and thank you to the organizers for this wonderful session. My question is not a technical one: when we use a Logic App, what drives the cost, is it based on the requests or on the volume of data? One factor is which type of Logic App you use. At the moment I have used Consumption, which is billed based on the number of actions executed in your Logic App; then there is Standard, which is billed based on the plan or subscription you have selected. So there are two types, and it depends on which one you choose. Did that answer your question? To recap, there are two workflow designs: the Consumption workflow and the Standard workflow; a Standard workflow is billed on the plan you have, and in any case the Azure pricing calculator gives you a heads-up on the approximate cost when you execute your Azure components. Is it per request, or how does it work? For Standard it is the plan-based model; there is also a premium tier with its own entitlements under the subscription, I am not sure of the exact naming, but it is billed based on the plan you have, and that applies to the Standard workflow, not the Consumption one. Okay, thank you, I will check.
Umesh, do you want to add something regarding pricing? I know you have good experience there, some tips to save cost. Yes, basically there is cost involved. For development environments you can use the subscription that comes with Visual Studio, the Azure Dev/Test subscription; in fact you can use Dev/Test right up to UAT, and when it comes to production you can go with pay-as-you-go. In Logic Apps there are generally two plans available, standard and premium, so based on what is needed, your frequency and your data volume, you choose accordingly; and, as Anita suggested, it is also consumption-based, so if your consumption is high go for the premium plan, and if it is minimal go with the standard one. And you are charged per action, right? If my Logic App has ten actions, each action incurs some cost; is that a correct understanding? Yes, and it is also based on ingress and egress: as your data leaves your data centre and goes into Dynamics you need to account for that as well, so a whole lot of parameters go into working out the volume and the price you are charged. Thank you.
So who is next? I think SK, would you like to go first now? My question is, how can we do code promotion? I am new to Logic Apps and currently learning, so I have many questions, but I will put some here and connect offline with Anita or someone for the others. How can we do code promotion? You said we can use a release pipeline; so how do you define the variables, for example the Logic App information, and do we have any configuration settings we can define in Logic Apps? Yes, there are multiple options: you can use PowerShell scripts or the Azure CLI for the automation part. There are many videos available on YouTube, but basically you define a configuration or parameters template where you define all these variables, and you have another file which holds the Logic App definition; when both of these are combined you can automate your deployment. It is also called infrastructure as code: without going through the UI you can write everything in Visual Studio and, with the help of Git and pipelines, automate your end-to-end deployment process. Logic Apps DevOps is a big topic in itself, so it is hard to summarize in a few lines, but there are many options to automate, and parameterizing and templatizing is not a problem. Okay, I will connect offline to get more information.
My second question is about performance: if we are dealing with a large volume of data, can Logic Apps handle it? If we go with the Consumption plan it consumes more resources, right, so how can we develop these Logic Apps efficiently? I have a few requests where they have to handle lakhs of records. I will add my view first, and then maybe Anita and Umesh can add.
My view is that Logic Apps is not for reading or parsing high volumes of data. If you have such a scenario, do that in Data Factory, in a pipeline; that will be cheaper than doing it in a Logic App and more performant. It is not about reading, sorry: for example, an external application uploads its data into blob storage, and with a trigger the Logic App reads the file; does that impact performance if the volume is too high? What are you doing by reading those files? Say there is a file with a million records uploaded to blob storage: why read it through the Logic App? You can also read it through ADF if you have to do some transformations on it; I am just trying to understand your purpose, what you are doing with that data. We have a data management recurring integration API set up on the Dynamics side, and we want to push planned order details into Dynamics from external applications; they have millions of planned order records. They cannot communicate directly with Dynamics, so they upload to Azure services like blob storage, and from blob storage, using a trigger, an Azure Function reads the file and it is pushed via Logic Apps. Does that impact performance? I am new to Logic Apps, so I am trying to understand and learn. Anita, do you want to add anything on performance? You can explore the possibility with a large volume of data, but with such sizes I do not think you will be able to do it via Logic Apps alone, so maybe you need to explore other options, or try splitting the file and then importing it. A Logic App is there to create a workflow; it is not a big data analytics or data-reading platform. I understand you receive a large data set and then have to create a package and send it, but try to see what the best tool in the Azure ecosystem is for that job; do not put the complete load on Logic Apps. Okay, thank you.
I guess the next one is Amulu; if not, we can go with Naga. Hello? Okay, go ahead; I will just give him a minute. Thank you everyone, and thank you especially to our speaker. I want to ask whether this scenario is applicable to an on-premises installation. The package API is applicable to both on-premises and cloud; if you go for enqueue, where I set the recurrence, that one is only applicable to the cloud, but the package API you can use for both on-premises and cloud. Are there any requirements, any necessary connection, between the Azure cloud and the on-premises installation? The commands and their usage remain the same, but here I have connected to the F&O system using an app registration; in your case you need to find the way of connecting to your on-premises system, for example whether you use Key Vault or whatever approach you have, and then you can connect. The basis is the same, you can use this API to send data to on-premises as well, but how you connect and what credentials you have, that you need to check.
So basically, with an on-premises installation, you need to have an app registration as well. Okay, I get that, thank you very much. You are welcome. Naga, please go ahead. Yes, first of all, thank you for the session. My question: in our scenario we have ten different CSV files; can we convert all of these into one package to import into the DMF? When converting to a package, a package can handle only one data file per entity: the package has three parts, the data file, the package header and the manifest. If you are able to consolidate your CSV files into that structure it would work, but you cannot simply drop all ten files into one zip; that is not possible in this case. So do we need to create ten different packages then? Sorry, I just want to understand, Naga: do these ten files have data for the same entity or for different entities? They are for different entities. And is there any dependency on the sequence, the order in which they need to be pushed? Yes, the sequence matters. Sorry, Anita, I will add my views towards the end. Since they are different entities, you will define a sequence in F&O as well, in the one data project where you add the entities with the sequence defined. So in one project we can define ten different entities and push these ten files to it? Yes, you can; it is possible. One data project with multiple data entities, in which you define the sequence; and then what happens is, from Dynamics you can create a dummy export which gives you the manifest, the two XML files that hold the definition and define the sequence in which it goes; then you can zip everything into one file and push it in. That is a possibility. Okay, great, thanks for that.
Okay, who is the next person? I think we have Sandeep. Yes, thanks for the session. I have a query on Power Automate: we are using Power Automate for one of our solutions, sending data from F&O to a CRM application, and we are using a custom entity; the entity works fine, that is not a problem, and initially we went with a recurrence-based flow in Power Automate. Now the customer is asking for it to be trigger-based. I am aware that a business event can be used as a trigger, but is there any option other than business events to make it trigger-based? That is my question. What is the trigger, can you elaborate? It is like this: when a vendor payment journal is posted and it hits the vendor transactions, we export that data; that is the requirement. We built the entity and it works, meaning if we manually export the data from data management it works fine, we have also enabled change tracking, so the delta is coming through, and the recurrence-based Power Automate flow also works; but if we want to switch to a trigger basis, what is the best possible option? What is the expected transactional volume?
It is not a high volume; on a per-day basis, roughly a hundred transactions. Any reason why you are not preferring business events, since that is the out-of-the-box framework for this type of scenario? We can go for business events, but we have some customization on the vendor transaction as well, so when defining the business event, can we bring in those custom fields, so that the event is triggered only if a particular custom field has changed? Is it possible to use those custom fields from the vendor transactions? So you want the business event to trigger conditionally, or do you want to include those fields in the payload that goes out? Condition-based: the trigger should be conditional. Is that possible in a business event? My understanding is that you can create your own custom business event; I am not sure whether you can extend a standard business event to add this condition, you would have to check whether the extension point is available, but you can definitely create your own custom business event that takes care of these conditions. Otherwise, you can always write X++ code to invoke your Power Automate flow or Logic App; that is another option. Anita, do you want to add something? No, I was about to say custom business event as well, because we had a similar requirement and extension was not possible, so we created a custom business event; it is all event-driven anyway, so whenever there is an update on that particular field, the business event is triggered. So a business event is the best possible way to make it trigger-based? Yes, because you get an out-of-the-box framework to manage your eventing: business events have parameters for retries and can be configured to an endpoint, so you do not have to build that framework yourself. That is one of the reasons you should go with business events. Okay, thanks a lot.
We have Ramesh next, then Nikhil. Ramesh: great session, guys. I believe you know me; I am based out of Melbourne as well. Welcome, and we really appreciate you being awake at this time. Yes, this is the first session I am attending with the Dynamics 365 UG India, but it is a great initiative, and I definitely appreciate the work you are doing. I have a simple comparison question between Logic Apps and Power Automate: most of the things we saw now that Logic Apps can do are similarly achievable using Power Automate, so in what scenarios would you use Logic Apps over Power Automate? I understand that Logic Apps is the enterprise application and all that, but what I want to understand is the metrics: do you have any comparison where you say, if the number of records is greater than this, go with Logic Apps rather than Power Automate, and so on? I think that is a very valid question, and different people can have different views, but let us see; Anita, Umesh, do you want to go first?
Mainly, Power Automate is tied to the Microsoft Office and Dynamics licences, so it is closely coupled with that stack; whatever features you have in Logic Apps you can mostly achieve with Power Automate, sometimes with more steps, sometimes more simply. If you are using components that sit in the Microsoft Office and Dynamics world, for example CRM, that is where the close coupling with Power Automate helps, whereas Logic Apps is Azure-hosted and closer to Azure, so for F&O we would mostly prefer Logic Apps for large-scale enterprise integrations. In my scenario it would be a customer who is using Dataverse as well as Finance and Operations, with integrations where some are low in frequency and volume but some are huge; say a retail scenario where they are not using the standard retail functionality of Finance and Operations but a customized POS, and they are trying to send sales orders across. In that case you want the sales orders to be almost real time, so my question is about volume: if the customer has both the Office licences and Azure, which would be your preference in terms of volume and performance? In a retail scenario, say a convenience store, there is a high frequency of sales orders, and with the custom integration you are bringing those sales orders from the custom POS into headquarters. Does that make sense; was I able to explain the question properly? Yes, these are real-time scenarios we face day in and day out. I think it really depends on many factors; as Anita mentioned, Power Automate is coupled with your Dynamics licence, so that is definitely one big factor: if the use cases are not too complex, go with Power Automate, but on the other hand, if you are talking about a retail scenario, Logic Apps is the more mature, enterprise-level integration platform. It is very hard to pick one over the other; both have their own pros and cons, and I am not too sure, from a technology perspective, what the real difference is under the hood. I think Logic Apps gives you more automation capabilities, and Power Automate is part of the Power Platform, which means that if you have to deploy flows from dev to test to QA you have to package them in a Power Platform solution, so their DevOps is entirely different from the way you manage DevOps for your Azure resources; so it really depends on a lot of things. My major concern is just the performance part of it: I work as an architect, and there have been sessions where people from different technologies were part of the discussion and put a valid question in front of me about the metrics of Power Automate versus Logic Apps.
They ask whether, if the number of records is above a certain threshold, it is better to go with Logic Apps rather than Power Automate; I have not been able to find such an answer to date. These are all SaaS-based platforms, so I am not sure what limits Microsoft has published; I think it is better to check with the FastTrack architects or someone at Microsoft, because these are really infrastructure-level questions about these capabilities, and it is hard to get that level of detail. I have one comment: we faced a similar scenario where we were using the basic and standard Power Automate licences, and the recommendation from Microsoft was to go for the premium licences; after moving to premium we were able to handle the performance issues, with the help of the underlying infrastructure. So the recommendation is: check which plans you have, E3 or E5, which plan you are running on, and then on top of that look at what type of Power Automate licences you have; with that combination you will be able to do it. Okay, cool, thanks guys.
I think we have covered almost everyone; Vigneshwar is left, so this is the last question. Thank you all. My question is related to Power Apps: in Power Apps I am using a data entity with cross-company enabled, but I am still getting data only from the particular company that is mapped to the user. Is there any way to get all the company records using the data connector? Have you tried using cross-company in your query? By default it will fetch the data from the default company the user is mapped to. Even with cross-company enabled on the entity, it still gives only the data of the company mapped to the corresponding user. Is it a standard entity or a custom entity? A standard entity. Internally they are saying we can use Power Automate and write an OData query to fetch all the company records, and that it is not possible with the data connector like that. Maybe: I have tried creating a custom entity where there is a field property, the primary company context, which by default is the company ID; if you remove it, the entity would export data for all the companies. But since you are using a standard entity, I do not think that is possible; it will export only for the company you are in, or the default one, otherwise you need to explore other options. Okay, thank you all.
The announcement for the next session: the upcoming session is on landed costing, which will again be hosted by Satya; it is on the finance, functional side. Our presenter will be Saurabh, who is based outside India, in the UK, and he will be delivering that session. Thank you very much for this; we will try to close this call now. I know a lot of people are still joining, but we will try to keep to this timing. Thank you very much, everyone.
Info
Channel: AzureTalks
Views: 2,299
Keywords: Microsoft Azure, Microsoft Dynamics 365, Azuretalks
Id: 0rZdWYNMbtM
Length: 78min 9sec (4689 seconds)
Published: Sat Jun 17 2023