IICS | Incremental load using Timestamp Passed by Parameter File in Informatica Cloud

Captions
Hello friends, welcome back to my channel, ETL Station. Please subscribe before we start so that you get all the latest updates. I had created this video once already, but it turned out there was no voice in it, so I am recording it again for you.

In this video we will build an incremental load using a parameter file. We will create a parameter file with a default value like 1970, and then the job itself will override that value on every run. So each run reads whatever value is currently in the parameter file, uses it in the mapping, and writes a new max value back at the end.

I had already created the mapping and mapping task for this, but I will delete both and build them fresh so that you can follow along: click the mapping task, delete it, then delete the mapping itself, and save.

Now, the source and target for this exercise. The source is the same cust.csv flat file from my other incremental-load videos, and the target is the same Oracle table we used in the previous incremental-loading example.

So let's get started. Add the source: it is a flat file, name it something like src_cust_ff, and it reads from the source directory. The key idea is the last update date, the max update date, which we pass in through a parameter, and the parameter file gets a new value after every run.

For that we need a param folder where the file will live, so create one folder, call it param, on the machine where your Informatica Secure Agent is installed. Inside it, create the file the job is going to read; this is a prerequisite, the parameter file must exist before you create the mapping. Name it cust_incremental with the extension .param.

If you have seen my earlier videos on parameterization, how to use parameters in a mapping, in a mapping task, and in a taskflow, there are three or four of them on the channel, you can watch those to see exactly what I am doing here. Basically it is just an input parameter. We could call it max_update_date, but let's call it incremental_filter, and in the file we give it a default value; that default is what the job reads the first time. After that the job itself should override the value, otherwise every run would keep inserting all the rows again.

This same parameter now needs to be created as an input parameter in the mapping. So let's get started: here is our source, and since it is a flat file there is no sort or filter option on the source itself.
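Before the mapping runs for the first time, the parameter file just holds the default. As a sketch (the parameter name matches the one defined above; the exact default timestamp is an assumption based on the date format used later in the video), its single line might look like:

```
$$incremental_filter=1970-01-01 00:00:00
```

The job reads this value on every run, and the post-processing step described later overwrites it with the max update date of the current run.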
In the Snowflake example from the first incremental video we did the filtering in the query option of the source itself, but with a flat-file read there is no query option, so we have to extract first and filter afterwards.

First, create the input parameter: go to the parameter panel of the mapping and add it. Our parameter name is now incremental_filter, so use that; the default value can stay a 1970 date, and the type can be string. Save it. That is the input parameter we will use inside the mapping.

Next, look at the source fields: create_date and update_date both come in as strings, and in our target those columns are date columns, so we convert them first with an Expression transformation. Open the expression and create two output fields, exp_create_date and exp_update_date, both of type date/time, and in each expression convert the input value with TO_DATE using the format 'YYYY-MM-DD HH24:MI:SS'. Validate it, then copy-paste the same expression for update_date so you don't have to type it again.

Now connect the target. Choose the target connection, the Oracle TGT schema, and the target table cust_incremental. Since this is an incremental load, set the operation to update else insert, so a row that already exists is updated with the new value that comes in; right now the table is empty. The update key is employee_id, the unique column in our file. Save it.
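The two conversion ports created above would look roughly like this in Informatica expression language (a sketch; the port names are the ones defined in this example):

```
-- exp_create_date (date/time)
TO_DATE(create_date, 'YYYY-MM-DD HH24:MI:SS')

-- exp_update_date (date/time)
TO_DATE(update_date, 'YYYY-MM-DD HH24:MI:SS')
```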
This is your basic mapping; the error is still showing because we have not mapped the fields yet, and we have not used the parameter anywhere either. Since it is a flat file, the incremental filter has to be applied as a Filter transformation, so add one after the expression.

Make the filter type Advanced, and the condition is that exp_update_date, the converted date column, must be greater than the value of the input parameter. Notice that I am not typing the dollar signs myself; I just click the parameter and it is inserted automatically, and that is how input parameters are always referenced inside a mapping. The format in the condition must be the same format we used in the parameter file, 'YYYY-MM-DD HH24:MI:SS'; whatever format your data comes in, that is the format you need to use. Validate it, it's good.

The mapping is still invalid, and the validation panel tells you why: the columns are not mapped. Go to field mapping and use Smart Map for all the columns. But look at create_date and update_date: that is not correct, because Smart Map connected them straight from the source, where they are string columns, while in the target they are defined as date or timestamp columns, so that will not work. Select those two fields and un-map them.
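The advanced filter condition described above would be, as a sketch (same port and parameter names; the TO_DATE around the parameter reflects that the parameter value arrives as a string):

```
exp_update_date > TO_DATE($$incremental_filter, 'YYYY-MM-DD HH24:MI:SS')
```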
Then manually map the fields we converted in the expression: click the two dots to un-map, then drag and drop the expression fields onto the corresponding target columns. Save it; this is your basic mapping.

Now I will create the mapping task, and first I will show you why this alone is not enough: this mapping is never going to overwrite your parameter file, so the value would stay at 1970 forever and every run would load everything again. That is what I want to demonstrate, and then we will make the corresponding changes.

Create the mapping task, give it a name, choose the runtime environment, my local agent, and click Next. For incremental_filter we do not want the default value, we want it to come from the parameter file. So set the parameter file directory to the path of the param folder we created, paste that path, and give the parameter file name, cust_incremental.param. In Unix copying a path is easy; here it takes me a couple of tries. That is the file the task will read, and in the mapping we already said the value comes from $$incremental_filter. Finish.

Now run the mapping task, the task, not the mapping. Go to the job monitor, refresh: nine records got loaded. In Oracle, refresh the data and you can see all the records arrived. We might think this is incremental loading now, so the nine records should not load again next time if nothing changed, right? But that will not happen, because nothing has overwritten the parameter value: it is still 1970, so the job still reads all the records. If I simply rerun it, all nine records load again; click on the job run and you can see it.

So how do we override the parameter value? We need a second flow in the mapping. If you have seen my video on how to get the first row and the last row, you will know that if you pass a flow through an Aggregator with no group-by, it returns the last row; that is just how the Aggregator behaves. We will use that to produce a one-line flat file holding the new filter value.

After the filter, add an Aggregator. We will also need a Sorter, but let's first add only the Aggregator so I can show you why the Sorter is needed. The existing branch to the Oracle target stays exactly as it is; its field mapping is fine, it keeps loading after the filter. In parallel we pass the same flow through the Aggregator and bring in a second target, which is where we write the new max update date for the next run. You can call this target tgt_param, because it basically creates the parameter file value. In the Aggregator there is no group-by, nothing; it is just a pass-through.

For this flat-file target you can create the file in any folder, the param folder or the TGT files directory; I am creating it in TGT files for now, with the target file created at runtime. (Since my first recording had no sound, I am redoing everything fresh.) Let's call the file cust_incremental_tgt.txt. This file will receive the max update date, and after the run we will overwrite the param file with it. The field mapping is handled at runtime, so nothing more is needed there.

Now run the job and see what happens. We are not done yet, but step by step the understanding is better. It still loads the nine records, but look at what the TGT file received: passing the data through the Aggregator alone gives me the last row without any sorting. It is not finding the maximum, it simply returns whichever record happened to come last in the flat file. That is not what I want: the row with the max update date could be the first row, the second, the third, anywhere. If, say, 7/15 were in the middle of the file, I would still get 7/12 out. We want whichever row has the max date.

To get that, the data must be sorted first, so add one more transformation, a Sorter, before the Aggregator. In the Sorter, sort by exp_update_date, the converted field, in ascending order. Date and integer fields are reliable for sorting; we don't want to rely on a string. Sorted ascending by update date, the maximum row comes last, which is exactly the row the Aggregator returns. In this data the max is 7/13, so that is the row we should get. Save it and run; it still writes all the columns, and next I will show how to remove the other columns and make the file look exactly like the parameter file. And there it is: the row with the max update date, 7/13 20:12.

If we check the source, 7/13 20:12 is indeed the max, and that is the record we got. But my parameter file looks like a single line, $$incremental_filter= followed by a value; I don't want the whole row in the target file, otherwise I cannot overwrite one file with the other. So go back to the mapping and fix the flat-file target. First, under the Advanced properties of the target, turn off the field header, so no column names are written. Then, in the incoming fields, do not take all the fields, select only exp_update_date. Also, in the target formatting options, remove the double quotes, no quoting at all; we want the file to look exactly like the parameter file. Save it and run again.

It is still not incremental, of course, because nothing overrides the parameter yet; it always reads 1970. That overriding step is what we are building now. After this run (the job now shows ten target rows: nine into Oracle plus one into the flat file), go to the target directory and look at the file: only the date value is there, in the right format since we connected the converted column. But what do we want? We want the line to look exactly like the parameter file, $$incremental_filter= followed by the value, so we have to concatenate a hard-coded prefix with the value from the flow.

Create one Expression transformation after the Aggregator. I am not naming the transformations properly here, but in a real project you should use proper names. Add an output field, call it param_filter, type string, length 100 or 200, enough to hold the whole line; 100 is fine. In the expression, take the hard-coded literal, the parameter name with its two dollar signs and an equals sign, as a string, and concatenate it with the date converted back to a string using TO_CHAR. You could use the source column, but I rely on exp_update_date because that is the field flowing through the filter, and we use the same format as the parameter file, 'YYYY-MM-DD HH24:MI:SS'. When I pick the parameter from the list, the $$incremental_filter text is inserted for me. Validate it; this is the line we want. Save.

Now the target file should look exactly like the parameter file, but with the max date of the current run. Run it, go to the target directory and refresh... it is not working, because I forgot one thing; this is all troubleshooting. In the field mapping of the flat-file target we were still sending exp_update_date, which is wrong now; change it to param_filter and save it.
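The concatenation port would be built roughly like this (expression-language sketch, following what is typed in the video; port and parameter names as above):

```
-- param_filter (string, precision 100)
'$$incremental_filter=' || TO_CHAR(exp_update_date, 'YYYY-MM-DD HH24:MI:SS')
```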
param_filter, the field we calculated in the expression, is what should go to the target. Run it again and refresh: now the file looks exactly like the parameter file, but with the new value in it. The mapping part is done; it loads nicely into the target schema with insert/update, and the target file has exactly the line we need.

Now we want this value copied into the param folder: the 7/13 value the target file got should overwrite the parameter file, so that the next run reads the new value and does not pull the same records again. For that we use the post-processing command of the mapping task to copy the file over the parameter file.

On Windows I created a .bat file for this, and I will put its contents in the comment section so you can use it; on Unix it would be a very simple shell script. Let me walk through what it does. It goes to the TGT files directory and picks up the file where the new value is written, cust_incremental_tgt.txt. I am not a pro in DOS commands, I found this on Google and modified it for my requirement, but the logic is: if the file size is less than 1 byte, do not touch the parameter file; only if it is at least 1 byte, overwrite the parameter file with it.

Why the size check? Because if there were no changes in the source, the filter passes no rows and the target file comes out as a zero-byte file. If you overwrite the parameter file with a zero-byte file, its data is gone and the next job run will fail; we will see that scenario shortly. That is why we only copy when there is something to copy.

So edit the mapping task, and in the post-processing command paste the call to the .bat file, in double quotes. You can keep the script anywhere; normally in a project you would create a scripts folder and call it from there, but right now I just have it sitting outside, which also works. Note that this is the post-processing command, not pre-processing: it runs after the job has executed, so after the target file is created and before the next run it updates the value in the param file.

Now everything is set. Let me clear the loaded data first so you can see it from the beginning, the way it would be when you first develop the mapping. Run the job: after it succeeds, the 1970 value in the parameter file gets updated to the max date the target wrote, the 7/13 date in 2023. See, it got copied; the .bat script took the value from the file in the TGT directory and wrote it into the param file. So now the load really is incremental: the nine records got loaded this run, but the parameter has moved forward.
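A minimal Unix sketch of this post-processing step (the video uses a Windows .bat with the same logic; the paths and file names here are illustrative assumptions, and the setup lines only simulate what the mapping run would have written):

```shell
# Illustrative paths; in a real setup these point at the Secure Agent's
# TGT-files and param directories.
TGT_FILE=/tmp/demo_tgtfiles/cust_incremental_tgt.txt
PARAM_FILE=/tmp/demo_param/cust_incremental.param

# Demo setup: simulate the file the mapping wrote and the old parameter file.
mkdir -p /tmp/demo_tgtfiles /tmp/demo_param
printf '$$incremental_filter=2023-07-13 20:12:00\n' > "$TGT_FILE"
printf '$$incremental_filter=1970-01-01 00:00:00\n' > "$PARAM_FILE"

# Overwrite the parameter file only when the target file is non-empty;
# a zero-byte file means no rows passed the filter this run.
if [ -s "$TGT_FILE" ]; then
  cp "$TGT_FILE" "$PARAM_FILE"
fi

cat "$PARAM_FILE"
```

In the mapping task, only the size check and the copy would go into the post-processing command or script; the `-s` test is what prevents a no-change run from wiping the parameter file.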
If I rerun the job now, it reads 7/13 from the parameter file and zero records get loaded; you can see it here. And now the question of why we check for a zero-byte file instead of simply overwriting every time. Suppose some day there are no changes in your source: nothing gets pulled and your target file comes out at zero bytes. Without the check, that empty file would overwrite the param file, the param file itself would become zero bytes, and the next run would fail because you would be passing a null, invalid date into the mapping. Logically it is correct too: if nothing came through, the incremental filter should not move; it should keep the max date from the previous runs. That is why the .bat file says: if the file is zero bytes, do nothing; only if it is more than zero bytes, copy it.

Let's make a change in the source and watch it work. Take one record and set its update date to 7/14, which is bigger than the current max of 7/13 in the file, and change its city, say to Delhi, which is not there yet. Save and rerun the job: only one record should be picked up, and the parameter file should move to 7/14 for the next run. And see, one record got updated, only one record was pulled through. Since the source is a flat file, the job still shows nine rows read, but only one passed the filter.

With Oracle or any other database source you would do all of this, the filter and the sort, in the source qualifier itself, in the query option: put the incremental filter on the update date and the ordering right in the query. Then the Expression, Filter, and Sorter transformations are not needed at all and the mapping becomes very simple. Flat files are the more complex case, which is exactly why I chose a flat file for this video.

Now let's verify: Delhi is updated in the table, and the 14th of July has come in. And if we open the parameter file, the value has changed from 7/13 to 7/14. So that is how you do an incremental load, for flat files or for databases, using a parameter file.

Please don't forget to subscribe to my channel, it motivates me to create new videos for you. I will put the .bat file in the comment section, you can get it from there. For Unix it will be a shell script, and there are lots of examples on Google; just check that the file size is more than zero and only then overwrite the parameter file, otherwise do nothing. See you in the next video, guys!
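For the database-source case mentioned above, the query override in the source might look like this sketch (the table and column names are assumptions based on this example's data, and whether the parameter reference is expanded inside the override depends on your task configuration):

```sql
SELECT employee_id, city, create_date, update_date
FROM   cust_src
WHERE  update_date > TO_DATE('$$incremental_filter', 'YYYY-MM-DD HH24:MI:SS')
ORDER  BY update_date
```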
Info
Channel: ETLSTATION
Views: 3,094
Keywords: powercenter, informatica, iics, cloud, orcale, unix, sql, data, intration, migration, workflow, mapping, mapping task, taskflow, learning, tutorial, beginner, incremental
Id: f8SET-5MLG0
Length: 35min 50sec (2150 seconds)
Published: Thu Sep 07 2023