Create Dataverse Table Schemas Using Power Automate And Dataverse Web API (CSV or Excel)

Video Statistics and Information

Captions
Hi there folks, in today's demonstration I'm going to show you how to build a flow that will generate schemas for the tables contained within a solution. I'm going to take a look at the CoE, which has got 48 tables, believe it or not, and with a very basic flow I'll show you how we can generate a CSV schema per table, or in fact an Excel file with all of those tables, like the one you currently see on screen. So if that's something that interests you, please make sure you like and subscribe, and without further ado let's jump into the demonstration.

I'm going to kick things off from within Power Automate, and I have a flow on screen here that accepts a solution ID as part of the trigger input and then invokes an HTTP request in order to get all the tables that are part of a solution. I'm going to go into all of these actions in more detail later on in the video, but as we work through the flow you can see I've got a variable; I'm using that to output all the data into an Excel file, and I do that extremely efficiently using an Office Script. For those that don't have Office Scripts and are quite happy with a CSV per table, you'll see that I have an apply to each loop: it loops through all the tables that are part of the solution, gets the table definition from the Web API, selects some of the fields that we can see here on the left-hand side, creates a CSV table and then creates a file. I've also got a scope within this apply to each, which is the part marked for creating an Excel file. So if you do have access to Office Scripts, or maybe you've got access to the Graph API, you could write all that data to Excel in one go and have a gigantic Excel file with all of your tables, their columns and some of their properties, which is extremely useful if you then want to filter on, for instance, the is-required field or maybe the primary ID. As we work our way to the bottom, the last few actions are all about creating that Excel file, and I have a more detailed video that shows you how to create Excel files quickly and efficiently without using the Excel connector, which I recommend you watch if you're interested in generating Excel files efficiently.

Now, as I mentioned, I'm going to generate the schemas for all the tables in a solution, and it just so happens that I have the Center of Excellence core components installed, and we can see that we've got 48 tables in this solution. Wouldn't it be very useful if we could see all those key properties within those tables in a single source, in this case my Excel file? The key to this flow is getting the solution ID, and you can see it in the URL, so I can copy it now and use it when I trigger the flow. The interesting thing is that I'm currently looking at the solution in make.powerapps.com; if I jump across to make.powerautomate.com and open the same solution, you'll note that the URL does not contain the solution ID. So if you're trying to hunt down that solution ID, go and have a look in make.powerapps.com.
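As a small illustration of that last point, here is a minimal TypeScript sketch of pulling the solution GUID out of a make.powerapps.com URL. It assumes the usual .../environments/{environmentId}/solutions/{solutionId} URL pattern, and the GUIDs shown are purely hypothetical placeholders.

```typescript
// Illustrative only: pull the solution GUID out of a make.powerapps.com URL.
// Assumes the usual .../environments/{environmentId}/solutions/{solutionId} pattern.
function getSolutionIdFromUrl(url: string): string | undefined {
  const match = url.match(/\/solutions\/([0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12})/i);
  return match?.[1];
}

// Hypothetical example:
const exampleUrl =
  "https://make.powerapps.com/environments/00000000-0000-0000-0000-000000000000/solutions/11111111-1111-1111-1111-111111111111";
console.log(getSolutionIdFromUrl(exampleUrl)); // "11111111-1111-1111-1111-111111111111"
```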
Jumping back to the flow, if I put it into test mode it asks me for a solution GUID, and I can go and run it. We can see already that we have an apply to each loop based on 48 individual tables, and we're looping through all of those tables and getting the definitions, based on the various properties that I've selected using the Select action. I'm either creating a CSV file per table or, as we get to the bottom of the flow, generating a brand new Excel file and outputting all of that data into a single source of truth, that single Excel file.

In just under a minute the flow has completed, and if I jump across to my SharePoint site we can see I have a new folder and, after a refresh, an Excel file as well. Within the folder I have the CSVs for all the different tables, and if we have a look at, for instance, the flow table, we can expand some of these columns, because remember this is just a plain text CSV opening up in Excel. We have all the different properties, including the logical name, the schema name, the attribute type, whether it's custom, whether it's the primary ID, and the maximum length. Of course you could expand this to go and get the values of lookups and so on; it's a very basic schema, but you can see it's brought all of the properties for this particular table together into a single CSV. Jumping back over to SharePoint, if I open up the Excel file, it has all the tables from that solution and all of their properties in one file. You could in theory break this up into individual sheets as well using an Office Script, but the great thing about having it all on one sheet is that I can now apply some filtering. I could select, for instance, all of the custom attributes, and that lets me look at all the tables with all of their custom attributes across the CoE solution. I know I've been asked in the past for all of the available fields being captured in the CoE, and all of a sudden I've got that in a single Excel sheet. Imagine if you could do that across all of your solutions using this very easy-to-build flow. I'll upload a copy of this Excel file for you to download, so you can have a look at all of the available columns within the CoE solution.

So, diving into the flow, we start with a manual trigger, and we can see that I have an input property that is a solution GUID; that's what enables me to capture the GUID from the end user when they run the flow. Of course you could use a Power Apps trigger instead and pass in the solution GUID via your Power Apps interface, exactly the same idea. Then I jump straight into invoking an HTTP request. There are a couple of complexities here. First of all, you need to create a connection, and this particular HTTP request action is very useful because you can use it for the Graph API and SharePoint, I believe, but also for authenticating with the Web API using your environment URL. The environment URL that I've just highlighted here I got from the Power Platform admin center, and if I was to change the connection and add a new one, I would supply that as the base resource URL and also the Microsoft Entra resource URI. That then allows me to authenticate as myself, as someone who has permission to access this data in the Dataverse environment, and query the various information about that solution, including the table definitions.
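For readers who want to see the shape of that call outside the flow designer, here is a minimal TypeScript sketch of the equivalent authenticated GET against the environment URL. The environment URL, API version and token are placeholders, and in the flow itself the connector takes care of authentication for you.

```typescript
// Rough equivalent of what the "Invoke an HTTP request" action does once the
// connection is set up: an authenticated GET against the Dataverse environment URL.
// The environment URL, API version and token acquisition are placeholders here.
const environmentUrl = "https://yourorg.crm.dynamics.com"; // from the Power Platform admin center

async function dataverseGet(path: string, accessToken: string): Promise<unknown> {
  const response = await fetch(`${environmentUrl}/api/data/v9.2/${path}`, {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      Accept: "application/json",
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
    },
  });
  if (!response.ok) {
    throw new Error(`Dataverse request failed: ${response.status}`);
  }
  return response.json();
}
```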
Back in that HTTP request, if we have a look at the URL, it's quite a complex one, but ultimately what it's doing is querying the msdyn_solutioncomponentsummaries, and I'm going to show you a very easy way of understanding how to construct these URLs using your web browser, because you're actually doing this all the time without even realising it. There is a filter, and that filter contains a solution ID, which is why I capture the solution ID and pass it dynamically into the filter. You'll also see that there is a select, and that select reduces the amount of data retrieved by the API in order to keep things as efficient as possible.

So how did I construct this URL? Jumping back across to the solution, I actually called that Web API earlier without even realising it, just by using the browser. When I was browsing through the different components within my solution and selected the tables filter, I was calling that Web API. To demonstrate, if I pop open More tools and Developer tools and go to the Network tab, then select the tables filter, we'll see all the traffic on the right-hand side, and one of the calls is to msdyn_solutioncomponentsummaries, which is exactly the same Web API that I'm hitting from Power Automate. You'll see that there is a filter on the solution ID and also a component type, which is equal to 1 in this case, I think, and that returns the tables. Of course you can change that to filter on different components: if you wanted to get back all the flows, or all the dataflows, or all the pages, you could equally construct a flow to work through that data. What this gives us is a response, and here is a sample response, listing all of the different tables within that solution. That sets us up nicely for an apply to each loop, looping through all of those tables and making a secondary API call to get the definition of each table.

Back over in the flow, I then have a Compose with the date and time, which I use in the name of the folder that gets created, and an Initialize variable action that is just an empty array; I'm using that for my Excel solution, so if you don't want to create an Excel file and you're happy with the CSVs, you don't need that variable. In the apply to each, for each object within the value that comes back from that initial HTTP request, remember that is all of the tables in the solution, I invoke another HTTP request, and in that I pass in the name of the table, which then allows me to return all of its attributes. Jumping back to the solution, we can see that property, msdyn_name, which gives me the name of a table, and inside the apply to each we pass that property into the API call.

Interestingly, and again probably without you realising it, if you click on a table, then Advanced, then Tools, there is an API link to the table definition, and if you click on that you can see the URL for the API call. It has the logical name, in this case the Admin AI Builder Model table, and within that we have all the metadata for that table. Within Power Automate that's exactly what I'm doing: I'm hitting that same URL, but passing in the dynamic table name that came from the initial call to get all the tables within the solution, based on the GUID.
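To make those two calls a little more concrete, here is a sketch of the query strings involved, written to be used with the dataverseGet helper sketched above. The entity set, field names and filter syntax are based on what the browser's network trace shows, so treat them as assumptions to verify against your own trace rather than a definitive reference.

```typescript
// Sketch of the two Web API queries the flow makes.

// 1. All tables in a solution: component type 1 corresponds to tables/entities.
const solutionId = "11111111-1111-1111-1111-111111111111"; // hypothetical GUID
const tablesQuery =
  `msdyn_solutioncomponentsummaries` +
  `?$filter=(msdyn_solutionid eq ${solutionId}) and (msdyn_componenttype eq 1)` +
  `&$select=msdyn_name,msdyn_displayname,msdyn_description`;

// 2. The definition of one table, with its attribute (column) metadata expanded.
//    MaxLength only exists on string columns, so it is safer to read it from the
//    response afterwards than to put it in the $select.
function tableDefinitionQuery(logicalName: string): string {
  return (
    `EntityDefinitions(LogicalName='${logicalName}')` +
    `?$expand=Attributes($select=LogicalName,SchemaName,AttributeType,` +
    `IsCustomAttribute,IsPrimaryId,RequiredLevel)`
  );
}

// Used with the helper sketched earlier, for example:
// const tables = await dataverseGet(tablesQuery, token);
// const definition = await dataverseGet(tableDefinitionQuery("account"), token);
```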
After that I'm using a Select, and I'm just grabbing some of those properties; you could use Parse JSON in order to construct all the different expressions. I'm getting the logical name, the schema name, the attribute type, whether it's a custom attribute, whether it's a primary ID, the maximum length, and also the required level, which enables me to see whether it's a system-required field. You'll be pleased to hear that there is a document, which I'll share in the description, that lists some of those properties, so you can see the schema name, logical name, display name, description and other things it might be useful to capture, as well as some of the other properties that are available for you to query via that API. Ultimately, if you look at the definition that comes back, you can study that data and decide what it is you want to retrieve and output in your schemas.

With that in place I now have an array of all those values, so I can easily create a CSV table based on the body of that Select. Then, to make things easy, I compose a header that includes the table name, in this case msdyn_name, and the description, in this case msdyn_description, and append the CSV table underneath. If we remind ourselves of the CSV from earlier, that's exactly why I have the table name and then the description as a header before diving into all of those properties: it enables me to very quickly identify which table schema I'm looking at. After that, all I'm doing is generating a CSV file: I use the date and time generated above to construct a folder based on the solution GUID, and then output a schema file, .csv, named after the table, using the output from that header compose. If you just wanted a plain CSV, you could use the output from the Create CSV table action instead.

For those that are interested in the Excel file, I have just these three additional actions in order to create an Excel file with all of the tables on one sheet, and it's a slightly different setup. You can see I'm using a Select again, over the body value, but rather than creating an object with key/value pairs like I did in the Select above, it's just a nested array, square brackets at the start and finish, containing all of those same properties from the schema definition. I've also included the entity logical name as part of that data, because that allows me to determine which table, or entity, each of those columns has come from, and then of course I can filter the Excel table on that entity name. The next step is to create a copy of the array; this is just a trick to allow me to append an array to an array. By taking a copy of the variable, I can then set the original variable to a union of the Select output and the copy. I explain this in a lot more detail in my video on creating Excel files efficiently, where I write 20,000 rows from a Power BI dataset in about 15 seconds, so if that interests you I highly recommend you go and watch it.

Then, as we jump outside the apply to each, all I want is one Excel file. I have a Send an HTTP request to SharePoint action: based on the path of my SharePoint site, I hit the SharePoint REST API and generate a brand new file named after the solution GUID, and I get the ID of that file. I then compose a header, which is basically a nested array, you'll note there are two opening and two closing square brackets, containing all of the column names I want within the Excel file. Finally, I run the script that I demonstrate in the video on writing data to Excel, which imports all of that data into the Excel file ultra fast.
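As a rough idea of what such a script can look like, here is a minimal Office Script sketch in the same spirit. It assumes the header and data rows arrive as a single JSON string parameter from the Run script action (the script shown in the video may be wired up differently), parses it into a nested array, writes everything in one go and turns the range into a table.

```typescript
// A minimal Office Script sketch: write a header row plus data rows into a
// worksheet in a single call, then format the range as a table for filtering.
function main(workbook: ExcelScript.Workbook, rowsJson: string) {
  // rowsJson is assumed to be a nested array: [[header...], [row1...], [row2...], ...]
  const rows = JSON.parse(rowsJson) as (string | number | boolean)[][];
  if (rows.length === 0) {
    return;
  }

  const sheet = workbook.getActiveWorksheet();

  // One setValues call for the whole block - far faster than writing row by row.
  const range = sheet.getRangeByIndexes(0, 0, rows.length, rows[0].length);
  range.setValues(rows);

  // Convert the written range into a table so the columns can be filtered immediately.
  sheet.addTable(range.getAddress(), true /* hasHeaders */);
}
```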
All the script is really doing is taking a union of that header from the Compose and the array that contains all of the table definitions, as one nested array, and that's what enables me to write all of the data into a table that's already formatted in Excel. From there I can start filtering, looking at just the custom attributes, and browsing through all of the different tables that we have within our solution, in this case the CoE, and looking at all of the flow properties that we have for the flow table. If only it were always that easy to get that information in front of our eyes and start making decisions about what we can build from the data we've captured in our CoE solution.

I'm hoping that from that demo you'll be able to build your own version: you've learned how to use the Web API and call it from Power Automate. I've also used the Web API in the past to edit solutions and create new flows in solutions, so that might be of interest too. As always, if you haven't already, please make sure you like and subscribe, and I hope to see you again sometime soon. Thank you very much for watching, cheers, bye.
Info
Channel: DamoBird365
Views: 1,235
Id: K3MlypTB2EI
Length: 15min 41sec (941 seconds)
Published: Fri Apr 05 2024