How to Import CSV Data to SharePoint List with Power Automate | Tutorial

Video Statistics and Information

Reddit Comments

I didn't finish the whole video; I skipped around to check out some key points. I wanted to share the way I managed to handle CSV data.

I had a use case of pulling down an export from a live data set, done about once a day. Using a document library from a Team, I had the end user add a shortcut to the folder to their drive. I set up a Flow to listen to that folder for .csv files and parse them.

The key part, which probably falls outside the scope of your video, is that I had to use an Office Scripts (TypeScript) script in Excel to invoke stringify. I use the Excel "Run script" action to run that script, pass its output into a Parse JSON action with a schema matching the data I need, and from there mapping the fields to SharePoint is easy.
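For anyone curious, an Office Script along those lines might look roughly like the sketch below. This is a hypothetical reconstruction, not the commenter's actual script; it simply returns the used range of the first worksheet as a JSON string that a Parse JSON action in the flow could consume.

function main(workbook: ExcelScript.Workbook): string {
  // Grab the first worksheet and the range that actually contains data
  const sheet = workbook.getWorksheets()[0];
  const used = sheet.getUsedRange();
  // getValues() returns a 2D array of rows and cells; guard against an empty sheet
  const rows = used ? used.getValues() : [];
  // Stringify so the flow's Parse JSON action can apply a matching schema
  return JSON.stringify(rows);
}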

The problem we ran into was that the source of the data suddenly changed their CSV files from UTF-8 to UTF-16, which the flow treats as binary and cannot parse. So now they have the extra step of opening the file in Notepad, saving it as UTF-8 (thankfully it's super easy), and then dropping it in the shared folder.

This is, as far as I am aware, the only way to manage dynamic data. I would be happy to share any parts of my process if it is at all helpful!

👍 2 · u/UnheardWar · Sep 04 2023 · replies
Captions
Hello everyone, Reza here. In this video I will show you how we can import data from a CSV file directly into a SharePoint list. I will cover scenarios around various types of columns in SharePoint, and also cover scenarios where we could potentially run into challenges and how to handle those exceptions. So let's check it out in action.

I have a CSV file; the data is stored in text format, comma-separated values. I have data around my issue tracking system. The information here includes the ID of the issue, title, description, who the issue is assigned to, status, priority, category and creation date. So let's try and import this CSV file into a SharePoint list.

One option we have in modern SharePoint is to directly create a list from a CSV file. In my SharePoint site I'll click on New list, and one of the options here is From CSV. I'll select this. I can upload from my device, so I'll pick my file. It launches the Customize dialog, wherein I can visualize my data. It's smart enough to interpret that the first row in my CSV file was the headers, and here I have the ability to check the column types and choose a new type if its current selection is incorrect. Issue ID I would like to map to a column of type number. Title is text; in SharePoint we have a column called Title for every list. Description I will make multiple lines of text. Assigned To holds email addresses of users in my Azure AD, so it maps this to a person or group column. Status I would like to map to a choice column; for Priority I'll pick choice; for Category I'll pick choice; and Creation Date we will map to a date and time column. You also have the option here to decide if you do not want to import a specific column. I'll click Next, give a name for my SharePoint list (I'll call it Issue Tracking) and click Create.

Just like that, it creates a new SharePoint list, sets up all those columns based on the customization I mapped during the CSV import, and includes all the data. For the choice columns, if I head over to the column settings, it does not include the choices (it has imported the data, though), so let's add the choices. I have three options here: Open, In Progress and Closed, and for each of these I can add some formatting: Open blue, In Progress gold, Closed green. I'll click Save, and you can see how the Status column is representing that information. I can do the same for Priority, click Save, and the formatting is applied; similar behavior for Category. These choice columns that get generated are set to allow multi-select to begin with if we go back to their settings. In my case there can only be one status, so I'll turn this off and click Save. I'll do the same for Priority, but for Category let's leave multi-select on for now. For the date and time column, if I do not need the time aspect I can remove it.

This is a fully functional SharePoint list at this point. If I click New I can add new rows to my issue tracking list. If I select an existing item I can view it; if I want to modify it I can edit and save, and we can see that the edit has been reflected in my SharePoint list.

This is a one-time import wherein it creates the list and imports the data. But what if I have multiple CSV files that I receive on a periodic basis that I would like to import into that same SharePoint list? For such scenarios, let's leverage Power Automate. I'll launch make.powerautomate.com and go to Create. The automation that I'm creating here I would like to trigger manually, so I'll pick Instant cloud flow,
pick the Manual trigger, give my flow a name and click Create. My flow trigger is a manually triggered flow, and I would like the user to give me an input; that input will be of type File, and I'll make it a required field. I'll click on New step and add a simple Compose data operation action, which I will rename to CSV Data. For the input I will go to Expression and write the expression base64ToString. For the parameter to the base64ToString expression I will head over to Dynamic content, pick Upload CSV and click OK. At this point let's save the flow and test it manually. I'll click Test; we can see how the flow is asking for a CSV file. I'll click Import, select my CSV file, click Run flow and click Done. The flow triggers and reads the data from that CSV file. You can see the information comes in base64 format, and that's the reason why I wrote that expression to convert it into text.

Bear in mind the data here is stored in the form of a string that is comma separated. In my CSV file the data ends in column H, which is called Creation Date. If I look at the output of this Compose action, when I get to Creation Date there is a new-line character being rendered, and then it goes to the next row, and so on and so forth. I'll click Edit flow. I need to split by that new-line character. For that I will add a new step, once again a Compose, which I will rename to New Line, and here I simply press Enter. Then a new step, once again Compose, renamed to Array CSV; its input will be the expression split. For the split I'll go to Dynamic content: I would like to split CSV Data, comma, split it by the new-line character. I'll click OK, and that puts the expression right here. Let's go ahead and test the flow, this time with a previously used trigger. The flow gets triggered, and if I look at Array CSV you can see that the output is in array format. If I click Show raw outputs, each row in my CSV file is available as a string in my array: this is my header, and then these are all my data rows.

Now there could be scenarios where it picks up empty rows, so I would want to remove those. Plus, if you have a header row like in my case, I want to remove that row so I only have the array of CSV data. I'll edit my flow and select Array CSV. Right before my split expression I will use the expression function skip: skip my split expression, comma, 1 to skip the first item in that array, close the function and click Update. Let's test this flow. The flow has run successfully, and if I look at the raw outputs you can see that the array no longer includes my header row.

To remove any empty rows, I'll edit the flow, add a new step and use the Filter array data operation action. The array of data here is the outputs of Array CSV, and my condition will be: item(), which is each item within that array, is not equal to empty. If I test the flow and look at Filter array's raw outputs, we can see it has this property body, which holds the array of all my data rows from my CSV file.

Say you want to only include data where the status is open. In my case that data is in column E, and column E is index 4. Here I can click Edit in advanced mode, copy this function and paste it in Notepad. To this I will add an and condition: comma, split my item (which is my row) by a comma; this gives me an array, and from this array give me the data sitting at index position 4, and I want to check that its value equals Open.
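For reference, the expressions described so far would look roughly like the following. Action names such as CSV Data, New Line and Array CSV follow the renames above (spaces become underscores when referenced); the property name 'file' for the trigger's file input and the exact status value 'Open' are assumptions, not shown verbatim in the video.

CSV Data (Compose):
  base64ToString(triggerBody()?['file']?['contentBytes'])

Array CSV (Compose), splitting on the New Line compose and skipping the header row:
  skip(split(outputs('CSV_Data'), outputs('New_Line')), 1)

Filter array, advanced mode, keeping non-empty rows whose column at index 4 is Open:
  @and(not(equals(item(), '')), equals(split(item(), ',')?[4], 'Open'))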
That completes my formula; I can copy and paste it in here and save my flow. Now if I test my flow, go back to Filter array and look at the raw outputs, it only gives me those rows from my CSV file where the status is open. In this case I'll go back to my initial simple expression, which just filtered out the rows that are empty.

Now that I have this array of data from my CSV file, I want to load it into my SharePoint list. I'll add a new step and use Apply to each to loop through each row of my Filter array action (dynamic content: Body), and in here I will add the Create item action from the SharePoint connector. For the site address I'll pick my SharePoint site, for the list I'll pick my Issue Tracking list, and the moment I do that it lists out all the columns. For each column it expects the data in a format based upon the type of that column.

Let's begin with Title. Title in my CSV file is in column B, so the index is 1. To get the title, in the expression: item() is my row of data; split this item by comma, which gives me the array; and from this array I want the data at position or index 1. I'll click OK, and that should give me the title. Issue ID is numeric, and in my CSV file the column index is 0, so for this one the expression will be split the item by comma and get the data at index 0. I'll click OK. Description is index 2; that's my expression, I'll click OK. Let's handle the choice columns Status and Priority: Status is index 4, Priority is index 5. We have to go to Enter custom value, and there we can write our expression; Status data is at index 4. For Priority I'll go to Enter custom value and put the expression with index 5. Let's focus on Creation Date: my data is already in date format and the column index is 7, so here all I have to do is write that expression pointing to that index.

At this point let's go ahead and test the flow. This time I'll test it manually: I'll click Test, click Continue, import my file and run the flow. The flow triggers, and it has gone ahead and created the rows in my SharePoint list; if I refresh my list you can see it has imported all my data. I'm testing with just eight rows of data in this scenario, but imagine a scenario where you have large files. For eight rows it took about three seconds, but I can speed that up even further: for the Apply to each loop, head over to Settings and turn Concurrency control on, so we have the ability to run items in that loop concurrently up to a maximum of 50. I'll go ahead and do that and click Done.

Let's focus on the column of type Person. In my SharePoint list it is Assigned To, and in my CSV file its index is 3. Once again I can go to Enter custom value. It needs the claims token, and right at the end of the token it needs the email address, which I can simply get from my expression; the index for Assigned To was 3. And for Category, it's a multi-select choice column; in my CSV file Category is at index 6.
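A rough reconstruction of the per-column expressions described above, where item() is the current row inside Apply to each. The column indexes follow the CSV layout in the video; the claims prefix shown is the commonly used SharePoint claims format and is an assumption about what appears on screen.

Title:                    split(item(), ',')?[1]
Issue ID:                 split(item(), ',')?[0]
Description:              split(item(), ',')?[2]
Status (custom value):    split(item(), ',')?[4]
Priority (custom value):  split(item(), ',')?[5]
Creation Date:            split(item(), ',')?[7]
Assigned To Claims:       concat('i:0#.f|membership|', split(item(), ',')?[3])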
I have data in the form of a single choice: one category is provided, then a semicolon, then it can have a second category, a third category, and so on and so forth. Remember this file is CSV, comma-separated values, so we need to ensure that we do not have any commas inside this field. Prior to importing, make sure that you replace all instances of comma with a different character; let's say I replace them with a semicolon in my CSV file.

In the Create item action I'll pick Performance as my first option, add a new item, and for my second option I'll pick Content. Now if I switch to "input entire array", this is the format that a multi-choice column expects. To support this format I need an array, so after Create item I will add the Select data operation action for now. "From" needs to be an array, and this as well I need to split, this time by a semicolon; I'll click OK. Then in "Map" I will use the exact property that the category column expects, which is value (case sensitive), and for the value I will go to Expression and use item(). I will move Create item below Select, and where I had Category (remember, to get here I have to click "switch to input entire array"), I will simply leverage the output from my Select action. Now let's go ahead and test this and import my data. The flow runs, and this time if I look at my SharePoint list, observe how it has the multi-choice column values also mapped from each row.

Next, I will remove one column value at a time from my test data: I'll remove the ID from the first row, then Title, Description, Assigned To, Status, Priority, Category and Creation Date from the following rows. Once again I'll test my flow. This time when the flow runs, if I go to Apply to each we can see that we have errors. The first row (one of eight) has failed, and the error says that the column was expecting a number but it got an empty string. In my second scenario I did not receive an error, meaning if a text value is empty, no problem, it will put empty in your SharePoint list; the same goes for Description, which is my third row of data. My fourth row has failed; this one is because Assigned To, the person type field, is empty. My fifth row was successful, meaning if a choice column is empty, no problem, it will keep it empty; the same thing happened for the Priority choice column being empty, and the same would also work for the multi-select Category column. The last one has failed; this row was related to the date.

So let's fix these issues. Issue ID, the number type column, was failing, so if the split expression returns an empty string, use the function if: if it equals an empty string, put null (that's the expression null), else put the value that you get from that index. I'll click Update. For the date column I will do exactly the same: if it equals empty, then null, else put the value. Then for the Assigned To claims I will write the expression as follows: if it equals empty, then put null, else use the concat function, because I have to take that claims token and concatenate it with the data coming in from that column. Notice that in my case all the rows in my CSV file end with \r, so for my date column, where I am splitting the data right before the equals check, I can also use a replace expression to replace decodeUriComponent('%0D') with an empty string. I'll click Update. Now if I test my flow, the flow triggers, the flow completes successfully, and this time I can look at my data in my SharePoint list.

This flow is triggered manually by me uploading a file, but whenever a CSV file is uploaded to this document library I want the flow to run. So my trigger here will be "When a file is created in a folder": for the site address I'll pick my site, and I'll pick my library.
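A rough reconstruction of the fixes described in this part, under the same assumptions as the earlier snippets; the exact nesting may differ slightly from what is shown on screen.

Select "From":   split(split(item(), ',')?[6], ';')
Select "Map":    value  ->  item()

Issue ID with empty handling:
  if(equals(split(item(), ',')?[0], ''), null, split(item(), ',')?[0])

Creation Date with empty handling and the trailing \r stripped:
  if(equals(replace(split(item(), ',')?[7], decodeUriComponent('%0D'), ''), ''), null, replace(split(item(), ',')?[7], decodeUriComponent('%0D'), ''))

Assigned To Claims with empty handling:
  if(equals(split(item(), ',')?[3], ''), null, concat('i:0#.f|membership|', split(item(), ',')?[3]))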
I need the content of the file, so I'll use "Get file content using path"; I'll pick the site and the file path. And here my expression will be base64ToString of the dynamic content File Content; from the body onwards I need to point to $content. I'll click Update and save my flow. My flow is saved and listening for any new file that's uploaded to this document library. Let's go ahead and upload both the files; I should have flows triggering for both those files. Both those flow runs have succeeded, and if I look at my SharePoint list, it has now been populated with data from both those files.

If you enjoyed this video, then do like, comment and subscribe to my YouTube channel, and thank you so much for watching.
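For the automated version described at the end of the transcript, the content expression would look roughly like the line below; the action name becomes Get_file_content_using_path when referenced, and '$content' is the property that holds the base64 file body in the action's output.

  base64ToString(body('Get_file_content_using_path')?['$content'])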
Info
Channel: Reza Dorrani
Views: 20,129
Keywords: power automate, powerautomate, csv, sharepoint lists, csv import, flow, Import CSV Data to SharePoint List, import csv to sharepoint, power automate csv, csv to sharepoint, sharepoint power automate, power automate tutorial, tutorial, how to, reza dorrani, import data sharepoint list, flow import csv to sharepoint list, power automate import csv, power automate flow sharepoint list, microsoft lists, sharepoint, microsoft, sharepoint list csv, import, data, automation, flow json
Id: b3aQf17ptAs
Length: 25min 36sec (1536 seconds)
Published: Mon Sep 04 2023