Azure IoT - End-to-End Project

Captions
Hi everyone, welcome again. This is the third video of our Azure IoT end-to-end project series. In the first video we covered what IoT is in general; in the second we covered the customer situation, because before jumping to the solution we needed to understand the customer's situation in detail and look at the bigger picture. In this video we will cover the solutioning part: we will implement the end-to-end solution using Azure IoT Hub, Stream Analytics, a storage account, a database, and several other services. If you haven't watched the last two videos, I recommend covering them first in the Azure IoT end-to-end project playlist: the first is the project overview, the second is the customer situation. Now let's jump into how we are going to implement the solution.

Since we don't have physical devices or smart meters with us, we are going to simulate a smart meter inside a VM. In that VM we will install Visual Studio, and in Visual Studio we will code a virtual, simulated smart meter that sends two pieces of information: temperature and voltage. We will then ingest that information into IoT Hub, so the whole left-hand-side portion of the diagram happens in the VM. But before we jump to how we simulate it, let's look at how things work in the field. Say you, as a manufacturer or a power company, install a new device (say device ID 1000)
into a customer's premises. How will this device, once powered on, communicate with the IoT hub? We will be installing thousands of devices daily, so we cannot manually feed each one the IoT Hub address. To streamline this there is a service called the Device Provisioning Service (DPS). Let's look at how it works in production; it is basically an eight-step process:

1. The manufacturer creates an enrollment list in DPS. In simple terms, for a new device with ID 1000, the manufacturer makes sure that ID is added to the list.
2. The manufacturer pre-feeds the device with the DPS address, say an API endpoint like https://10.0.10.80/..., so the device knows where to call.
3. Once the device is turned on, it passes its device ID to the DPS API, and DPS matches ID 1000 against its enrollment list. If the ID matches, the device is authenticated. In more formal terms, DPS validates the identity of the device by checking the registration ID and the key against the enrollment-list entry using a nonce challenge, or via standards like X.509.
4. Once validated, DPS registers the device with our IoT Hub, and the IoT Hub creates a digital twin of this device.
5. The IoT Hub returns the device ID of that digital twin to DPS.
6. DPS returns the IoT Hub connection information to the device; the Device Provisioning Service's work is now done.
7. The device connects directly to the IoT Hub and can start sending data.
8. The device gets its desired state from the digital twin (device twin) on the IoT Hub.

To make it simple again: first, the manufacturer makes sure the new device ID is in the enrollment list and the DPS address is in the device. When the device is powered on, it sends its ID to the DPS service; DPS validates it and connects to the IoT hub, the IoT hub creates a digital twin and sends the information back to DPS, and DPS sends the IoT Hub information on to the device. The device then connects directly to the IoT hub and starts sending data. That is how device provisioning works.

That covers device-to-cloud (D2C) communication. Now say we want to send a signal back from the cloud to the device (C2D). It is again a simple mechanism: a service that wants to communicate with the device cannot send a signal to it directly, because the device has no way to validate it. Instead, the service sends its message to Azure IoT Hub and validates itself there, and the device does the same on its side. Once both pathways are validated, the information can flow through.
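The eight steps above can be sketched as a toy simulation. Everything here (class names, the hub hostname, the twin's desired state) is made up for illustration; the real handshake goes over the DPS REST/MQTT endpoints with cryptographic attestation:

```python
# Toy model of the DPS provisioning handshake described above.
# Class names and values are illustrative, not the real Azure SDK API.

class IoTHub:
    def __init__(self, hostname):
        self.hostname = hostname
        self.twins = {}                       # device_id -> digital twin

    def register_device(self, device_id):
        # Steps 4-5: hub creates a digital twin and returns the device id
        self.twins[device_id] = {"desired": {"reportingInterval": 1}}
        return device_id

class DeviceProvisioningService:
    def __init__(self, hub):
        self.enrollment = set()               # Step 1: manufacturer-maintained list
        self.hub = hub

    def enroll(self, device_id):
        self.enrollment.add(device_id)

    def provision(self, device_id):
        # Step 3: validate the device id against the enrollment list
        if device_id not in self.enrollment:
            raise PermissionError("device not enrolled")
        self.hub.register_device(device_id)   # Steps 4-5
        return self.hub.hostname              # Step 6: hub address back to device

class Device:
    def __init__(self, device_id, dps):
        self.device_id = device_id
        self.dps = dps                        # Step 2: device pre-fed with DPS address
        self.hub_hostname = None

    def power_on(self):
        # Steps 3-6 happen inside provision(); steps 7-8 follow once the
        # device knows the hub hostname and can talk to it directly.
        self.hub_hostname = self.dps.provision(self.device_id)
        return self.hub_hostname

hub = IoTHub("test-iot-hub-101.azure-devices.net")
dps = DeviceProvisioningService(hub)
dps.enroll("1000")
device = Device("1000", dps)
device.power_on()
```

After `power_on()`, the device holds the hub address and the hub holds a twin for device 1000, mirroring the flow in the diagram.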
So services and external providers reach the device through this channel. In very simple terms, Azure IoT Hub acts as the communication channel both from device to cloud and from cloud to device. There are other details as well, but this is the simplest form of it. One more thing to cover: most of the communication, in fact all of it, will use one of these protocols: HTTP POST, AMQP, or MQTT. These three are the most important protocols for IoT Hub communication, whether cloud-to-device or device-to-cloud.

Okay, the first thing we need to do is create a VM that will simulate the smart meter. In the Azure portal we will create a resource group, which is nothing but a container for this particular project; let's call it iot-project and create it in North Europe, then hit review and create. Once the resource group is created, we will go into it and create a VM that should come with Visual Studio installed, so we will search directly for a Visual Studio image. We could do all of this on a local machine as well, but I prefer to go with the Visual Studio 2019 image.
Select the 2019 image, hit create, and name the machine something like demo-machine (I have already created it; I'm just showing you the steps). Leave the pre-populated settings as they are, set the username to, say, demo-user, and put in a password; please remember this password. Then create the VM. Since I have already done this, I won't create it again.

Once the VM is created, connect to it over RDP using its IP address. Inside this machine we will set up the simulator project. Don't worry if this doesn't make sense to you yet; I will put the code on my GitHub so you can have a look at it there. Here is how the end result will look: we have simulated nine smart meters, and each sends a temperature, a device ID, and a timestamp (it's not scrolling here, but there is a virtual voltage as well). Let's not jump into the code, because that's not the important part; you just need to download the project and double-click the solution file, and this page will open. We will cover what to do with it shortly.

So, first thing: download the simulator code. I have downloaded it and modified it, and after this video I will share it with you via GitHub; the link will be in the description.
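Each simulated meter emits one reading per second. A minimal sketch of what such a message might look like (the field names and value ranges here are my assumptions, not necessarily what the sample project uses):

```python
import json
import random
import datetime

def make_telemetry(device_id, temp_range=(65.0, 75.0), volt_range=(228.0, 240.0)):
    """Build one smart-meter message like the ones the simulator emits
    every second: device ID, temperature, voltage, and a timestamp."""
    return {
        "deviceId": device_id,
        "temperature": round(random.uniform(*temp_range), 2),
        "voltage": round(random.uniform(*volt_range), 2),
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

msg = make_telemetry("device001")
# IoT Hub messages are sent as UTF-8 encoded JSON bodies
payload = json.dumps(msg).encode("utf-8")
```

This matches what we will later tell Stream Analytics to expect as input: JSON, UTF-8 encoded.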
Once you download the code, double-click the smart meter simulator solution file and it will open in Visual Studio. This is what the smart meter simulator looks like: we are simulating nine smart meters, and every second each meter generates three things: a device ID, a temperature, and a timestamp. We will also need two values: a DPS group enrollment key and a DPS ID scope. Don't worry if you don't understand these right now, and again, don't worry about the code; the only purpose of this VM is to act as our simulated smart meter. If you can't open or run it, contact me in the comment section, but it worked like a charm for me, so it should work for you on the first go. We won't do anything with the simulator just yet; we will use it a little later.

The second thing we will do is create Azure Databricks, which will be used to analyze and process the data. Search for Databricks and create an Azure Databricks workspace called, say, iot-databricks, again in North Europe, with the Standard pricing tier. For networking, leave 'Deploy Azure Databricks workspace with secure cluster connectivity' and 'Deploy Azure Databricks workspace in your own Virtual Network' at their defaults, and skip 'Enable infrastructure encryption' since we are only testing. Tags are mostly for billing, to see how much a particular service costs, and it's always a good practice: add something like iot-project with the value iot. Then hit review and create, and if it
is validated, create it. It will be created in some time, and once it is, we will move on to the next steps. While the Databricks workspace is being created, let's create a few other services, starting with the IoT hub. Search for IoT Hub and create it in our iot-project resource group. Give it a name; a simple one like test-iot-hub is not available, so let's say test-iot-hub-101, which is available. Again create it in North Europe, leave public access as it is under networking, and go to management. Which pricing tier? S1 Standard, because we are only practicing, and we will provision one IoT Hub unit. This part is important, so here is what we need to know about the S1 pricing and scale tier: it allows up to 400K device-to-cloud messages per day, the cost per month is 21.08 euros, and Defender for IoT is available; you can read through the rest yourself.

Next are the shared access signature (SAS) policies and partitions. How many partitions? The number of partitions relates the device-to-cloud messages to the number of simultaneous readers of those messages; in other words, how many parallel processes can read data at the same time while devices are sending it. Let's go with four. Under add-ons, do we need Defender for IoT? For testing, let's not. Add the iot tag again, then review and create. This will create the IoT hub, the second thing we need; it will take some time.
So let's wait for it first; you can see its name starts with test-iot-hub. Meanwhile, since our Databricks workspace has been created, let's go to the Databricks workspace, sign in to the account, and create the cluster first, because these things take time to set up. Go to Compute, then Create Cluster. For access mode, choose either single user or no-isolation shared; I think the default is okay for us. What do we need? Scala 2.12 and the latest version of Spark. How many nodes? The minimum workers are fine, and I'll set max workers to five for now, with the driver type the same as the workers. No need for tags here; hit create. It will take some time, and once the cluster is created we will move on.

Now I can see our test IoT hub has been deployed successfully, so let's go to the resource. It was created in the iot-project resource group, the status is Active, and the pricing tier looks good. The most important thing is the ID scope, so let's look for it here; there's a notice that IoT Hub and DPS are updating their TLS certificates, which is fine, so refresh. Okay, the thing I'm looking for, the ID scope, will actually be in the Device Provisioning Service, which we will create later. Since the IoT hub is created, go to Shared access policies, select the iothubowner policy, and copy the 'Connection string - primary key' field. I'll copy it to Notepad, or in fact
into OneNote; I'll create a new note and paste it there so we can use it later. So, in the IoT hub, what we did was copy the connection string (primary key) and record it. Notice that this iothubowner policy can read, write, connect to the service, connect to devices, everything, which is good. The policy shows a primary key, a secondary key, and a primary connection string; the connection string is the part we need, and it contains the shared access key and everything. Don't share these values with anyone. I'm sharing them only because I will be deleting everything once this video is done.

The next step is device provisioning. Search for 'IoT Hub Device Provisioning Service', select the subscription and the project resource group, and give it a name, say smart-meter-dps. Again, try to keep everything in the same region, so North Europe. Networking is fine as is, and under management it offers the S1 tier, where 1,000 operations cost $0.169; that's okay. I'll add the iot tag again (not needed, but tidy), then review and create. It will take some time to create the Device Provisioning Service.

Meanwhile, back in Databricks, the cluster is up and running (the cluster still carries my name because I haven't changed the default); runtime 10.4, 12 cores of active memory, active DBUs per hour, everything looks good. We will use it in a while, so if you want, you can pause and terminate the cluster.
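The 'Connection string - primary key' value we just copied is a semicolon-separated list of key=value pairs. A small helper can split it apart (the hub name and key below are fabricated for illustration):

```python
def parse_connection_string(cs):
    """Split an IoT Hub-style connection string into its parts.
    maxsplit=1 matters: the shared access key is Base64 and may itself
    contain or end with '=' characters."""
    return dict(part.split("=", 1) for part in cs.split(";"))

# Example with a made-up hub name and key -- never publish a real key.
cs = ("HostName=test-iot-hub-101.azure-devices.net;"
      "SharedAccessKeyName=iothubowner;"
      "SharedAccessKey=bXlmYWtla2V5")
parts = parse_connection_string(cs)
```

This is why the whole connection string, not just the primary key, is the value worth recording: it carries the hostname, the policy name, and the key together.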
One thing I want to check is whether the cluster will stop on its own. I'll hit Edit; keep auto-scaling if you like, but just to save cost, set the auto-termination to 30 or even 15 minutes, then confirm and restart. The cluster restarts with the change.

Now the Device Provisioning Service is done. The most important thing here is the ID scope; copy it. Why? I'll tell you later, just do what I'm doing for now. The second thing: in this DPS, go to 'Linked IoT hubs' and add our IoT hub. Since we created everything in the same resource group, it shows up directly; the location and everything look fine, so save. This means our IoT hub has been linked with the DPS service: once a device communicates with DPS and is validated, DPS can inform this IoT hub.

The next thing is to manage the enrollments. We need to create either an enrollment group or an individual enrollment; remember I told you the device manufacturer needs to create one of these. We'll go for enrollment group creation and call the group smart-meter-device-group. Then comes the attestation type, which is how the Device Provisioning Service will authenticate: we select whether to use a symmetric key or X.509. Let's go with symmetric key, and leave
'Auto-generate keys' checked, so we are not entering anything ourselves. For how to assign devices to hubs, choose evenly weighted distribution, and select the IoT hub this group can be assigned to; it's already listed. We don't want tags, we leave the entry enabled, and save.

So what have we done so far? Going back to the overview and the resource: we have registered our IoT hub with the DPS and created the enrollment group. The last thing we need here: go into Manage enrollments, open this group, and copy its primary key; we need this primary key as well. Paste it into the notes labeled as, say, 'DPS PK', just so you remember what it is.

Now let me recap what we have done so far from the device-provisioning point of view, against the diagram from earlier. We have created an IoT hub; we have created a Device Provisioning Service; we have created the enrollment list; and we have registered the IoT hub with the Device Provisioning Service. So that whole side is done, and we have (not quite one hundred percent) created our VM for the simulated device. What remains is to make sure, as the manufacturer of this device in our simulated environment, that our device carries the information about the Device Provisioning Service. And we have all the details for that: the ID scope of our DPS, the DPS primary key, and even the access key for the IoT hub, so we have everything we need.
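One detail worth noting about symmetric-key group enrollments: each device's individual key is derived from the group primary key by computing an HMAC-SHA256 of the registration ID, keyed with the decoded group key. A sketch of that derivation (the group key below is fabricated; never publish a real one):

```python
import base64
import hashlib
import hmac

def derive_device_key(group_key_b64, registration_id):
    """Derive a per-device symmetric key from a DPS group enrollment key:
    HMAC-SHA256 over the registration ID, keyed with the Base64-decoded
    group key, with the digest Base64-encoded back into text."""
    key = base64.b64decode(group_key_b64)
    sig = hmac.new(key, registration_id.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(sig).decode("ascii")

# Fabricated group key, for illustration only
group_key = base64.b64encode(b"example-group-enrollment-key").decode()
device_key = derive_device_key(group_key, "device001")
```

This is why handing the simulator just the group enrollment key and the ID scope is enough: each simulated meter can compute its own device key from its registration ID.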
Now I go to my VM, which is telling me there's another update; I don't want it. In the simulator's main form, update two things: copy the DPS enrollment group key into the key text field, and copy the ID scope and paste it into its field. What this means, in very, very simple terms, is that as the device manufacturer we are pre-feeding our device with the Device Provisioning Service details. Going back to the diagram: we have now pre-fed this section with the DPS address and credentials, so once we start the device, it will send its ID across, and validation and everything will be taken care of. So we have effectively covered the whole infrastructure.

Great. What else do we need? We need to start the device now. Let's hit Start on the simulator and see if it works; it compiles and brings up a window. Fantastic, that means our program has no errors and is ready to run. Now we need to register the device first. If this window comes up for you, your simulated environment is ready to run: you have put in your DPS enrollment key and your DPS ID scope. Now click one of the devices; it turns yellow, which means it is ready to be registered. Let's check once more where we are on the diagram: we have clicked to power on the device, so let's see how the data will flow. Again (and I'm repeating this so that we are on the same page): clicking the device means we have turned its power on, and yellow means powered on. Now, once we click Register,
let's say we want to register two devices: click two of them, meaning we are registering two new smart meters at the customer premises, and hit Register. If the registration is successful, the color changes to blue; a device turning blue means it has been registered successfully with the IoT hub. Now let me go back to the provisioning flow: the device sent its information to the enrollment list, the enrollment list validated it, and the IoT hub should now have (and I want to verify this) a digital twin copy of this particular device. Let's see: go to the iot-project resource group, open the IoT hub, and go to Devices. Fantastic, device 0 and device 1 have been registered here, the status is Enabled, authentication is via the shared access key we configured, and we are good to go.

Now let's check one more thing: go to the DPS, Manage enrollments, open the device group, and look at the registration records. Fantastic, it shows that device 0 and device 1 came to the DPS service and registered at this particular date. So what we covered in theory has been successfully implemented in the Azure IoT Hub and DPS portals. And I'll emphasize once more: if you're not following exactly what the code is doing, don't worry, we can cover the code later.

So far we have covered device registration; registration done means the smart meters in the VM are properly registered with the IoT hub. Now let's see about sending data to the IoT hub. When we hit Connect, the devices start emitting
data. You can see device 0 and device 1 sending data every second, and the color changes between blue, red, and green based on the temperature; don't worry, this is just a fancy visual. If the temperature of a device is below 68 it shows green, from 68 to 70 it shows blue, and above 70 it shows red. For now the temperatures are okay, and the devices keep emulating data.

The next thing is to ingest and process this data, so we are going to create a Stream Analytics job. Create a Stream Analytics job in the MSDN subscription and the iot-project resource group, named, say, test-stream-analytics-job, again in North Europe. For the hosting environment choose Cloud; Edge would mean computing on the device itself, which we don't want. One streaming unit is okay for now, because we are just testing. For storage, we don't need to secure a storage account in this testing environment. Add the iot tag again, then say review and create; it runs the final validation, and once the job is created we will write our Stream Analytics query to ingest the data coming from the IoT hub.

Okay, the Stream Analytics job (workspace) has been created. Any Stream Analytics job has four things: inputs (where the data is coming from), outputs (where you need to send the data), a query (how you take data from the input, apply any transformation, and send it to the output), and functions (user-defined functions you can write to process the data and use in the query). Let's go for the input first and create an input stream that gets its data from the IoT hub.
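The color coding described above is just a threshold function on the temperature. A tiny sketch (thresholds taken from the description above; the real simulator code may differ slightly):

```python
def meter_color(temp):
    """Color shown by the simulator UI for a given temperature reading:
    below 68 -> green, 68 to 70 -> blue, above 70 -> red."""
    if temp < 68:
        return "green"
    if temp <= 70:
        return "blue"
    return "red"
```

So as long as the simulated readings stay in the normal range, the tiles stay mostly green and blue.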
We'll give the input a name, say 'temps', since temperature is what we need. The consumer group can stay at the default, we are getting the data from our Azure IoT hub, the shared access signature and endpoint defaults are fine, and we need the data in JSON format with UTF-8 encoding. Save it; our input is created.

Now we need to create the output, i.e. where to send this data. Let's send it to Power BI. In the Power BI section there's a catch: the user creating all these resources should be the same user that creates the Power BI workspace, and it says my user is not licensed for Power BI. So here's what we'll do to keep it simple (and note that this is not how it's done in production; we are improvising just to make Power BI work): go home, go to Azure Active Directory, and go to Users. I have an existing user, but let me create a new one: iot-new-user, with the display name iot-user, everything else as is, and create it. Once created, refresh and it appears in the list. Open it and copy the user principal name. Then open a new incognito window, go to app.powerbi.com, enter this user name, submit, copy-paste the name again where asked, and create a password for the account, one I'll remember. Now refresh the users page to see the new user, copy the name, and
go back, paste it, submit, enter my password, and sign in; it asks me to change the password. Why am I doing all this? If you're getting confused, let me explain it on the whiteboard. What we have done so far: we created a resource group, and inside it the IoT hub, the DPS, and a number of other services. Now, from our Stream Analytics job we need to read data from the IoT hub and send it to Power BI. But the condition is: only if the Power BI user and the Stream Analytics / IoT hub user are the same can the Power BI workspace be used as a Stream Analytics output. In a company's production environment you do have a Power BI user in the same Azure Active Directory, but since I'm using a test account, I don't have one user for both: for Stream Analytics I'm using my personal email ID, and that email isn't registered with Power BI, because Power BI requires a company email address or similar. So what I've done is create a new Azure Active Directory user; I will sign up for Power BI using this user, I will assign this user as an owner of the resource group so it can access all the resources we created there, and then I will log in to the Azure portal through this user so that we can use the Power BI workspace here. It seems complicated, but it's very simple.

Power BI now asks about two-factor authentication, which I don't need right now, so I decline. It takes some time, then I continue, fill in a phone number (123456789), accept the terms, and hit Get Started. The Power BI account will be created in a couple of minutes, and then I will
be creating a workspace. Meanwhile, let me do one thing: go to the iot-project resource group, go to Access control (IAM), and add a role assignment. I want to make someone an owner of this particular resource group, and that someone is my new Azure AD user, iot-new-user; select it, then review and assign. The role has been added.

Now let's test it: go to portal.azure.com and sign in as this new iot user. It asks me about the phone authenticator app, which I have already downloaded, so next it asks me to scan the code using the Microsoft Authenticator app. I scan it, approve the sign-in on my mobile, hit Next and Done, and it brings me to the Azure portal. I go straight to Resource groups, and cool, I can see the iot-project resource group there.

Meanwhile, back in Power BI: go to Workspaces, Create a workspace. It won't allow me; it asks me to buy a Power BI Pro license, so I go for the free trial, enter the phone number again, and start my free trial; it gives me a free trial for 60 days, which is fine. Now I create a workspace and call it test-iot-workspace, then save. So we have created our test IoT workspace.

Now we go back to the Azure portal, to our test Stream Analytics job. Our input is already created (let the page refresh). Go to Outputs, add an output, choose Power BI, and this is what I was talking
And see, this is what I was talking about: the test iot workspace has been populated automatically. Why? Because this Power BI user and this Azure portal user are exactly the same identity, the new Azure IoT user. Now we need to give a dataset name; let's call it averageTemps, and for the table name we paste the same thing. Let's try to authorize. If I simply press save first, it fails: refresh token required. Okay, let the authentication mode be user token, and I'll authorize it: I copy the user ID of the new IoT user, paste it in, authorize, enter the new password I created, and the authorization is done. Save it. That's great, so now we are set.
What's happening here: data is generated every second, but we are not capturing it anywhere yet. We have created the input and the output; now we will write the query. And see, this is the cool stuff: since there is only one input and one output, it wrote the query for us automatically, and the incoming data shows up automatically in the preview. Isn't that great? Obviously we've done a lot of setup, but most of the coding has been done for us. Since it's a simple SQL-like query, let's say we want the average temperature. Where is this temp coming from? From the input: device ID, timestamp, and temperature. One more field that isn't shown is voltage; we don't want to use voltage right now. And since we gave the IoT Hub three partitions, it also shows which partition each event is going into, but forget about that for now. So let's select temperature, aggregated as an average, and we want the device ID as well. Then we need to GROUP BY, and let's use a tumbling window (if you don't know the tumbling window, I've already covered it in my Spark tumbling window sessions, so look for those). We'll tumble every five minutes... actually, let me make it one minute, and group by ID as well. Now let's test this query. Fantastic: every one minute it will push the aggregated result out. Save the query; the settings have been saved. Now let's go to the overview and start the job. We have tested the query and created the input and output; now we start the job, and once it's running we should see some output in a minute or so. One thing I missed: when we go back to the overview window, we have to press the refresh button. Once we press Start, it gives us the start-job blade and asks when to start the job, now or at a custom time; we want now, so let's start. Once this job starts, we will go to Power BI and check whether our averageTemps dataset and table have been created. While the job is starting, let's do one more thing: go to the resource group and create a storage account (you know why we're doing this, right?). So search for storage, choose the plain Storage account Azure service, go for create, and name it something like test-iot-simple.
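Stepping back for a moment, the tumbling-window average the query computes can be sketched in plain Python. The field names (`deviceId`, `temperature`, `timestamp`) are assumptions based on the simulator's telemetry, and this only illustrates the windowing idea, not how Stream Analytics actually executes it:

```python
from collections import defaultdict
from datetime import datetime

def tumbling_window_avg(events, window_seconds=60):
    """Group events into fixed, non-overlapping time windows per device
    and return the average temperature for each (window, device) pair —
    the same shape of result as GROUP BY id, TumblingWindow(minute, 1)."""
    buckets = defaultdict(list)
    for e in events:
        ts = datetime.fromisoformat(e["timestamp"]).timestamp()
        window_start = int(ts // window_seconds) * window_seconds
        buckets[(window_start, e["deviceId"])].append(e["temperature"])
    return {k: sum(v) / len(v) for k, v in buckets.items()}

events = [
    {"deviceId": "device-0", "temperature": 38.0, "timestamp": "2022-10-08T20:00:01+00:00"},
    {"deviceId": "device-0", "temperature": 40.0, "timestamp": "2022-10-08T20:00:30+00:00"},
    {"deviceId": "device-1", "temperature": 37.0, "timestamp": "2022-10-08T20:01:05+00:00"},
]
print(tumbling_window_avg(events))
```

The key property of a tumbling window is that windows never overlap, so every event lands in exactly one bucket.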
Where will we create it? Everywhere else we are using North Europe, so we'll use North Europe here too, with standard performance and local redundancy, since we don't want to increase the cost. Go for next, Advanced: do we need the REST APIs? No. Allow public access is okay, everything here is okay, the access tier is okay, networking is fine, data protection is okay, encryption is okay, and next, tags: we can tag it iot if we want, then review and create. So what are we doing with this? We are sending results to Power BI through the hot path, and through the cold path we will use this storage account to store results. It's being created now. In the meantime, let's go back to our test job and refresh it. It's in the running state; do we see anything coming yet? Still nothing, so it will take some time. Meanwhile our storage account has been created, so let's go to the resource, open Access keys, and copy a key; paste it somewhere for the time being, we'll use it later. (Hmm, it didn't paste; give me a second... okay, that was just a pasting error.) Now let's go back and check whether data has started coming. Let's go to our test workspace. Even though nothing looks refreshed here yet, don't worry; there may be some latency. But when we go to Power BI, we can see that our averageTemps dataset has been created.
So now, using this dataset, we need to create some visualization. We say Create report, go for a bar chart, and maximize it a bit. In this averageTemps dataset we have the average temperature and the ID. This part is important: on the x-axis we put the ID, and on the y-axis we should have the average. This is great; we don't need any filters. So we have device 0 and device 1 showing here. Now let's check whether this data is live. First, let's save this report; call it average temp report and save it. Good. Now let's go to the simulator and register three more devices. Once they are registered, once they change color, we will connect them; let's wait a second. Did we press Register? Yes, we did. Let them change color, then we'll connect them, and then we'll see whether the dashboard picks up the results and refreshes in real time. Meanwhile, let me refresh the IoT Hub overview. Okay, great, we can see the resource utilization: some data is coming, around 600 messages have already arrived, and CPU utilization is about 3.5. It took some time, but it eventually refreshed. Hmm, the new devices aren't connecting; let's refresh and start the simulator again. It can behave like this because generating a lot of data takes resources, so stop it, restart the simulator, and this time we will register five devices. That's a useful trick: if you see the device color is not changing, stop the simulator and restart it, and once all the colors have changed, then go for
Connect. Okay, now let's validate it. First things first: have the devices been registered successfully in our test IoT Hub's device list? Yes, all the devices have been registered successfully. Let's refresh the report as well; it will take maybe a minute, but eventually it will refresh. Meanwhile, we have our storage account created, so let's do one more thing: go to Stream Analytics jobs and create one more job. Keep it in the iot-project resource group, call it something like cold-storage, in North Europe, cloud hosting is okay, and for streaming units I don't need three, I only need one. Review and create, then create it; it won't take much time. Meanwhile, let me refresh the visuals... oh, fantastic. We haven't done anything extra: since this is a real-time dashboard, everything has been generated automatically, and the average temperature for each device is shown here. So far, everything we planned is working.
Now let's go to the new cold-storage job and add an input. Can you guess what the streaming input should be? The IoT Hub, of course, the one we already have; everything else is fine, we don't need to change anything. And what should be the output? The output here should be blob storage or ADLS Gen2. Let's call it blob; the JSON serialization is okay; and for the storage account, the one we created is already listed, since we are in the same resource group. Create a new container, let's say smart-meters-data, so it will create that container for us. And let's set a path pattern: inside this smart-meters-data container, create paths by date and time, meaning a different folder for every date and hour. Everything looks okay. What is this minimum rows per batch? Let's process a minimum of 100 rows per batch, with a maximum time of, say, three minutes. So we have the container created and the path pattern set; save it, and let's look at the query. Checking the output once more: the data is coming in, and the format is okay. Once this job is started it will take another minute or two, so let's wait and go look at the containers. Wow, see, we hadn't created any containers by hand. Inside this container we have the smart-meters-data folder, and inside that the year, the month, the day, and the hour. So in the smart-meters-data container we have the data for the 8th of October 2022, for the 20th hour; the folder structure was created automatically. Let's look at the data; in fact, let me download a file and open it with Excel. What I'm trying to show you is that every three minutes it writes the records: device 0, the temperature, and everything else. So this part is done.
What else do we need to do? A few more things. Let's go back to our project overview document: we have covered the hot path and the cold path; now let's cover the Databricks part. We already created a Databricks cluster at the beginning, so let's go to our iot-project resource group, open the Databricks resource, and launch the workspace. It logs in; the cluster was already created. Let me check Compute... very strange, I had already created a cluster here. Let me refresh it; yes, the cluster is
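The `{date}/{time}` path pattern we configured for the blob output expands roughly like this. This is a minimal sketch assuming the default token formats, `YYYY/MM/DD` for `{date}` and `HH` for `{time}`, and an assumed container/folder name:

```python
from datetime import datetime, timezone

def blob_output_path(event_time, path_pattern="smart-meters-data/{date}/{time}"):
    """Expand a Stream Analytics-style blob path pattern:
    {date} -> YYYY/MM/DD and {time} -> HH, so each hour of
    output lands in its own folder (the structure we saw
    auto-created in the container)."""
    return (path_pattern
            .replace("{date}", event_time.strftime("%Y/%m/%d"))
            .replace("{time}", event_time.strftime("%H")))

t = datetime(2022, 10, 8, 20, 15, tzinfo=timezone.utc)
print(blob_output_path(t))  # smart-meters-data/2022/10/08/20
```

This per-hour layout is also what makes the later wildcard read from Databricks straightforward.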
there. Anyway, let me confirm: remember we set the cluster to terminate if it is inactive for 15 minutes, and that's why it was stopped. Let's wait a minute or two for it to come up, and then we will create a new notebook; let's call it smart-meter-analysis, keep the default language, which is Python, and attach it to this cluster once it's up. We will use the built-in visualization as well. In this notebook we need to read the data from the storage account and do some processing.
The first thing we will do is create two widgets; let me show you how that works. We copy the code in and run it... it's connecting; it will take a minute or so. Okay, the cluster is up. So the first step is to create two widgets, one for the account name and one for the account key. What exactly are widgets? Once we run this cell, two text boxes are created at the top of the notebook, named accountName and accountKey, and they act as placeholders for the account name and account key. We run it, and the two boxes appear. Into these we paste the values: remember we copied our storage account key earlier? Here it is, so we copy the account key into its widget. And what's the account name? Let's quickly check the resource group... yes, that's the storage account, so we put its name in. So now, when we run the next cells and use these variables, they will read the values from the widgets. How does that work? To create a widget you use dbutils.widgets.text (or dbutils.widgets.dropdown, whatever you want), passing the name of the variable and the label of the text box. To read it, you assign a variable from dbutils.widgets.get with the widget's name. Let's run it; cool.
Next we need to mount our smart-meters-data container so we can read the files. So we mount the blob storage at /mnt/smartmetersdata; this assumes your container is named as we created it and has the data folder inside. When we check the containers, our container is smart-meters-data and inside it is a folder of the same name, so we plug those names into the mount command. Let's run it; if it succeeds, we have mounted the container into Databricks at /mnt/smartmetersdata. Then we validate whether the mount is successful by listing it... and we get an error: the path does not exist. There must have been a spelling mistake, since we copy-pasted it. Fix it, run the command again, and... fantastic. The container has been mounted, and we can see the folder and everything.
Now we will create a DataFrame over all the files in the blob storage; it doesn't matter how deep the folder structure is, we will read everything down to the leaf folders. So we read from the mount with a wildcard pattern; the stars mean "go that many levels deep and read all the CSV files", with header set to true, and then we print the schema. I corrected one thing here: instead of three stars I put four, to match the folder depth. Once we run it, we see it has inferred the schema: id, timestamp, and so on. So far so good. Now let's look at the data: df.show() will display the first rows, which means the mount works and we can see the data. Next, let's save this data into a database table called smartmeters. We run the cell, it creates the table, and when we go to the Data tab we can see the smartmeters table in the default database.
But that's not all I want to show you: now we will run SQL. Since we are in a Python notebook, to run SQL the first line of the cell must be %sql; this is called a magic command, and it makes that particular cell run as SQL. So we select the count and the average temperature grouped by device ID. So far we're doing great. Next, we'll query the same data through the DataFrame API: if we want to run a query on the smartmeters table from Python, we use spark.sql and run the same query we ran with the %sql magic command.
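The aggregation described above looks something like this in the notebook (the table and column names are assumptions based on the inferred schema):

```sql
-- Run in a Python notebook cell prefixed with the %sql magic command
SELECT id,
       COUNT(*)         AS reading_count,
       AVG(temperature) AS avg_temperature
FROM smartmeters
GROUP BY id
```

The spark.sql version simply wraps the same statement as a string, e.g. `summary = spark.sql("SELECT id, COUNT(*) AS reading_count, AVG(temperature) AS avg_temperature FROM smartmeters GROUP BY id")`, and returns a DataFrame instead of rendering a result table directly.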
The result should be the same. Since spark.sql returns a DataFrame, we need to call .show() on it as well: summary.show(), and it displays the results. Okay, great. We have done the analysis; now let's save this result into a table again, so a new table will be created called devicesummary. And if we want to run plain SQL on it, we switch the cell with %sql and run SELECT * FROM devicesummary; it's the same data. One more cool thing: if we want to see the results as a visualization, we can switch the cell output to, say, a bar chart and see clearly how the results look. Let's save it. This is super handy. So that's how you connect Databricks to the container, do some processing, and do some analysis. Looking at the overview document, we have now covered this part as well.
The last thing we will do is real-time alerts. For real-time alerts we will be doing multiple things. We know we have our stream analytics job running, consuming data from the IoT Hub, and on that stream we want to do anomaly detection. What does anomaly detection mean here? If the temperature normally sits in a range around 35 and suddenly jumps to 60, that particular point is an anomaly. We want to detect it and send that anomaly to my Gmail account. To do this we will use the anomaly detection feature of Stream Analytics; Stream Analytics has anomaly detection built in, and I'll show it to you. But to send an alert to Gmail we will use a Logic App, and we will call this Logic App from an Azure Function. Actually, one step before the Azure Function: we will first send the stream analytics output to an Event Hub; once the data arrives in the Event Hub, it triggers the Azure Function, the Azure Function calls the Logic App, and the Logic App sends the Gmail. It sounds complicated, I know, but don't worry, we will make it very simple. Also, since we are done with our Databricks computation, I'd suggest stopping the cluster so we don't waste money.
Next, go to the resource group and search for Event Hubs. I'll create a new Event Hubs namespace in my MSDN subscription: let's name it smart-meters, so the namespace address ends in .servicebus.windows.net. I want to create this in North Europe, the pricing tier should be Standard, and one throughput unit is okay. Advanced and networking are okay, tags are okay, review and create; validation passes, and the namespace is being created. Once it's created, we need to create an event hub inside it, so let's wait. Okay, the smart-meters namespace is ready (the spelling came out a bit wrong, but let it be). Let's go in and create
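To build intuition for the spike idea described a moment ago (readings around 35–40 suddenly jumping to 60), here is a toy rolling z-score detector in Python. This is only an illustration of the concept; the Stream Analytics AnomalyDetection_SpikeAndDip function we will actually use relies on a more sophisticated adaptive kernel density estimation model:

```python
def find_spikes(values, window=10, threshold=3.0):
    """Flag readings that deviate sharply from the recent average:
    for each point, compare it against the mean and standard
    deviation of the previous `window` readings."""
    spikes = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        mean = sum(recent) / window
        var = sum((v - mean) ** 2 for v in recent) / window
        std = var ** 0.5 or 1e-9  # avoid division by zero on flat data
        if abs(values[i] - mean) / std > threshold:
            spikes.append(i)
    return spikes

# Steady readings around 37-39, then one spike to 60
readings = [37, 38, 37, 39, 38, 37, 38, 39, 37, 38, 60, 38, 37]
print(find_spikes(readings))  # [10] — index of the 60-degree spike
```

Note that once the spike enters the history window, it inflates the standard deviation, so the detector's sensitivity briefly drops afterwards; production-grade models handle this more gracefully.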
an event hub inside the namespace. We can give it any name; let's call it anomaly-detector. Go for next, the capture settings are okay, review and create, then create it. So now the namespace is created and the event hub is created.
Next, search for Stream Analytics jobs; we have two jobs running already, which is great, and we will create a new one. Let's call it ehub, and since everything is running in North Europe we'll run it there too, with one streaming unit because we don't want to waste money. Create it; it won't take much time. Go to the resource and add the input. What is the input? The IoT Hub, of course; let's call it smartmeter and save it, we don't need to change anything else. And what should be the output? The output should be our event hub. Since we created it in the same subscription, it is picked up automatically; let's call the output ehub, and everything else stays the same. Now go to the query: it selects everything from the IoT Hub input and sends it to the event hub, and the data is coming, which is great.
Now, first things first, why are we doing this? We are doing it for anomaly detection. I have a query prepared; I'll copy-paste it and explain it, don't worry (just changing the input name to smartmeter). Let me explain what it does. We create a CTE, a temperature table: inside it I select the system timestamp, cast the temperature to float, and read everything from the IoT Hub input, windowed so that we detect anomalies every 30 seconds. Then we create one more CTE called AnomalyDetectionStep, and this is the interesting part: AnomalyDetection_SpikeAndDip is a built-in feature of Stream Analytics. It looks at the data over a sliding history, applies the spike-and-dip algorithm, and when an anomaly is detected it flags it in the output. Finally, from AnomalyDetectionStep we select the score (how anomalous the point is) and whether it is an anomaly, every 30 seconds, along with the ID, time, and temperature.
Let's test the query... it's not working; let me check. Ah, the filter is IsAnomaly = 1 and we haven't found any anomaly yet, so let me set IsAnomaly = 0 just to show you that results are coming. We will generate a real anomaly in a while, but for the time being let me keep it at 0 so we can at least see the data: 0 means "not an anomaly", and once we move to production we will set it back to 1. So our anomaly detection step is working fine. Save the query, go to the overview, refresh, and start the job now. What this will do: this stream analytics job will compute the anomalies and send them to the event hub. Let's go home, then to the event hub in the iot-project resource group... there it is, the smart-meters namespace we just created, and fantastic, data has started coming in. So the data is now flowing into our event hub, and this
is great. Now we need to capture these events, and we will do that using an Azure Function. This is the last section of the project. As we discussed at the beginning, we want to send an email when there is an issue, that is, when there is an anomaly. Here is what we have so far: our IoT Hub sends data to a stream analytics job; the stream analytics job detects anomalies; when an anomaly occurs, it sends that record to an event hub; from the event hub we trigger an Azure Function; the function sends the data to a Logic App; and the Logic App sends us an email. Note that this is just one way to get alerts into your mailbox; there are other ways, but this is the easiest and most cost-efficient way that I know. Now, why can't we go directly from Stream Analytics to a Function? As far as I know there is no such functionality, which is why we take this path.
First, the event hub: in the Azure portal I have already created the smart-meters Event Hubs namespace, and inside that namespace an event hub called anomaly-detector; nothing special there. Next, the stream analytics job: I have already created a job called ehub, and when we look at it, we already have an input from the IoT Hub called smartmeter, and for the output I have created one named
ehub, pointing to that same anomaly-detector event hub. One very important thing we need to take care of: the output format should be JSON, and the format should be line separated. Now let's look at the query; let me minimize all the other panes first. What we are doing here is using the anomaly detection feature of Stream Analytics. I haven't written this code from scratch; it is readily available on Microsoft Learn. The function is AnomalyDetection_SpikeAndDip, and what it does is detect temporary anomalies in a time-series of events. Our events are a time series, because as you can see the simulator generates a timestamp every second along with each device's temperature. When there is an anomaly, whether a spike or a dip, it detects it using a machine learning model based on an adaptive kernel density estimation algorithm; I won't deep-dive into that, but I will share the link so you can read about it. The function takes certain parameters (confidence, history size, mode) plus the OVER clause with PARTITION BY, LIMIT DURATION, and an optional WHEN clause; it's quite easy. So I copied the Microsoft Learn sample and changed a few things to fit my query: I used the system timestamp, I used temperature (because that's where we need to measure the anomaly), I changed the tumbling window to 30 seconds, and I selected spike-and-dip mode. At the end, just for testing, I am filtering on IsAnomaly = 0, because as you can see the simulated data is pretty smooth and there is no anomaly in it yet; we keep it at 0 for the time being so we can see data here. This should not be done in
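Putting those pieces together, the query looks roughly like this. It is adapted from the Microsoft Learn spike-and-dip sample; the input/output names (`smartmeter`, `ehub`) and the parameter values (95% confidence, 120-event history, 120-second duration) are assumptions, not necessarily the exact values used in the video:

```sql
WITH TemperatureData AS (
    SELECT System.Timestamp AS time,
           CAST(temperature AS float) AS temp,
           id
    FROM [smartmeter]          -- the IoT Hub input
),
AnomalyDetectionStep AS (
    SELECT time, temp, id,
           AnomalyDetection_SpikeAndDip(temp, 95, 120, 'spikesanddips')
               OVER (LIMIT DURATION(second, 120)) AS SpikeAndDipScores
    FROM TemperatureData
)
SELECT time, temp, id,
       CAST(GetRecordPropertyValue(SpikeAndDipScores, 'Score') AS float) AS SpikeAndDipScore,
       CAST(GetRecordPropertyValue(SpikeAndDipScores, 'IsAnomaly') AS bigint) AS IsSpikeAndDipAnomaly
INTO [ehub]
FROM AnomalyDetectionStep
-- for connection testing this is flipped to = 0 so rows flow even
-- without anomalies; in production it stays = 1
WHERE CAST(GetRecordPropertyValue(SpikeAndDipScores, 'IsAnomaly') AS bigint) = 1
```

The built-in function returns a record with `Score` and `IsAnomaly` fields, which is why the outer SELECT unpacks them with GetRecordPropertyValue.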
production; in production it will be IsAnomaly = 1, and I will change it at the end. So I'll just save it, which means the query is ready. Now go to Stream Analytics jobs, ehub, overview, and start it now; it will pick up the data, do the anomaly detection, and send the results to our event hub (I almost said IoT Hub there). Once started it takes maybe a minute, so let it run.
Meanwhile, we need to create the Azure Function. Search for Function App; we have already created a function app on the consumption plan. One thing to take care of while creating the function app: please choose the Consumption plan, because the enterprise-level plans can be very expensive. Now, inside this function app we need to create a function. There is already one function here; let me disable it for the time being and create another one for you. What do we want? We want to read data from the event hub, so we type "event" in the template search and it gives us the result. (The list isn't refreshing for a second... now it's working.) Select the Event Hub trigger, go to create, and it creates the Azure Function for us; I had already created one, and now it has created a new function called EventHubTrigger2. When we open Code + Test, you can see the scaffolding has already been written for us, so the Azure Function skeleton is done; we just need to write our logic.
But before we get to the logic, we need to make sure this Azure Function is reading from the right event hub. Go to Integration and look at what the function is doing: it reads from an event hub and runs the function, but from which event hub exactly? Let's change the event hub name to our anomaly-detector, check that everything else is good, and save it; it takes a second or so, and it's updated. Now go back and check the pipeline: our stream analytics job is running, so that's step one. Next, before going to the function app, let's validate whether events are actually being generated in the event hub by Stream Analytics. Hmm, the events are not arriving yet; these are my past events. It might take a minute, so let's wait for it... and yes, we can see it has started again: the job is generating data, and in the event hub trigger's logs we can see the data flowing in from the event hub. So this is good; the data is coming in and we are good to go. So whenever an anomaly is detected in the stream analytics job, it sends that anomaly to the event hub, and from the event hub we read the data using our EventHubTrigger1 Azure Function. Now let's stop the job and fix the query, because right now, just to make sure the data was flowing, I had
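A minimal sketch of the logic the Event Hub-triggered function needs: parse one anomaly event and shape the JSON body to POST to the Logic App's HTTP trigger. The event field names and the forwarding step shown in the comments are assumptions based on the query's output, not the exact code from the video:

```python
import json

def build_alert_payload(event_body):
    """Turn one anomaly event read from the event hub into the JSON
    body we send to the Logic App (which then emails it via Gmail)."""
    e = json.loads(event_body)
    return {
        "subject": f"Anomaly detected on device {e['id']}",
        "message": f"Temperature {e['temp']} at {e['time']} "
                   f"(spike/dip score: {e.get('SpikeAndDipScore')})",
    }

sample = '{"id": "device-0", "temp": 60.0, "time": "2022-10-08T20:30:00Z", "SpikeAndDipScore": 0.99}'
print(build_alert_payload(sample))

# Inside the real Event Hub-triggered function you would then forward
# the payload, e.g. with urllib.request, where LOGIC_APP_URL is the
# HTTP trigger URL copied from the Logic App designer:
#
#   import urllib.request
#   req = urllib.request.Request(LOGIC_APP_URL,
#                                data=json.dumps(payload).encode(),
#                                headers={"Content-Type": "application/json"})
#   urllib.request.urlopen(req)
```

Keeping the payload-building separate from the HTTP call makes the function easy to unit-test without any Azure resources.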
pasted a simple query just for connection testing. Now we are going to change this SELECT * FROM ehub query into the anomaly detection query using the Spike and Dip method. Okay, so this job is stopping; we cannot change the query until the job has stopped. Now the job is stopped, and we will copy-paste that query, changing the input from the IoT input to smart meter. Once that is done we will test it, and if this query gives a result then we are good to go. It will take maybe a second. The result is not coming because there is no anomaly detected right now; we might need to change the simulator to generate an anomaly. What we see here is that the temperature is in the range of 37 to 40, so there is no anomaly. We can do two things: either tweak the query or tweak the simulator, and I will show you both. Okay, so for now I have tweaked the query: I changed the confidence interval in the query, and it worked. Now we can see the data coming into the Event Hub-triggered function. Since the data is coming in here, we will now create a Logic App, and our Azure Function will send the data to this Logic App. We will create this Logic App in our MSDN subscription and our resource group, and let's call it send-gmail-iot. We will create it in the North Europe region, because every other resource is in North Europe, and we will put the Logic App on the Consumption plan, because the Consumption plan is cheap and good for testing, and then go straight to Review and Create. Once this Logic App is created, we will go to the Logic App designer and get the Logic App URL. Okay, this is done; let's go to
the resource, and then to the Logic App designer. We will use the "When an HTTP request is received" trigger, because the call from the Azure Function is an HTTP request, and we will save it to generate the URL; it won't show us the URL until we save. So let's hit Save, and once it is saved we can see the URL. Great, we will use this URL in our Azure Function. Right now the Azure Function is just pulling data from the Event Hub and printing it to the console; we need to change four or five things to wire up the connection between the Event Hub, the function, and the Logic App. We need a few imports and a variable for the Logic App URI. So we have imported System.Threading.Tasks and System.Net.Http, and we have created a private variable called logicAppUri, which we will update with the URL of our Logic App. That is the first step. The second thing is using that URI to send the message to the Logic App, so let's copy-paste one more line: we create a variable called response, and through it we use the Logic App URI to send the message body to the Logic App. We are using the PostAsync method with the Logic App URI, and the message body is encoded in UTF-8 with content type application/json. We will save it and then validate whether this code works. One thing I did, as I discussed with you, is change both the function and the simulator: I changed the temperature to 100, which is why those values show red. So this is working now. Next we need to use this payload in the Logic App, so we have copied this payload, and we will use the "Use sample payload to generate schema" option here.
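For reference, the Spike and Dip query pasted into the Stream Analytics job might look roughly like the sketch below. The input alias `smartmeter`, the output alias `ehub`, and the parameter values (95% confidence, 120 events of history over a 120-second window) are illustrative assumptions, not necessarily the exact values used in the video:

```sql
WITH AnomalyDetectionStep AS
(
    SELECT
        EVENTENQUEUEDUTCTIME AS time,
        CAST(temperature AS float) AS temp,
        -- Built-in Stream Analytics ML function: flags spikes and dips
        AnomalyDetection_SpikeAndDip(CAST(temperature AS float), 95, 120, 'spikesanddips')
            OVER (LIMIT DURATION(second, 120)) AS SpikeAndDipScores
    FROM [smartmeter]
)
SELECT
    time,
    temp,
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'Score') AS float) AS SpikeAndDipScore,
    CAST(GetRecordPropertyValue(SpikeAndDipScores, 'IsAnomaly') AS bigint) AS IsSpikeAndDipAnomaly
INTO [ehub]
FROM AnomalyDetectionStep
-- Only forward rows the model marked as anomalous, so the Event Hub
-- (and ultimately the Gmail alert) only fires on anomalies
WHERE CAST(GetRecordPropertyValue(SpikeAndDipScores, 'IsAnomaly') AS bigint) = 1
```

Tightening or loosening the confidence parameter (the tweak mentioned above) changes how readily the model flags a reading as a spike or dip.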
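The function-side change described above, a PostAsync of the anomaly message as UTF-8 JSON to the Logic App URI, can be sketched in Python as follows. This is a minimal sketch, not the C# code from the video: the URL is a placeholder for the Logic App's HTTP trigger URL, and the payload fields (id, deviceId, temperature, time) are assumed from the email contents shown later:

```python
import json
import urllib.request

def build_alert_request(logic_app_uri: str, message: dict) -> urllib.request.Request:
    """Build the HTTP POST that forwards an anomaly message to the Logic App.

    Mirrors the C# HttpClient.PostAsync call: the body is the anomaly
    message serialized as UTF-8 JSON with content type application/json.
    """
    body = json.dumps(message).encode("utf-8")
    return urllib.request.Request(
        url=logic_app_uri,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send_alert(request: urllib.request.Request) -> int:
    """Actually fire the request; returns the HTTP status code."""
    with urllib.request.urlopen(request) as resp:
        return resp.status

# Example anomaly payload, matching the fields later mapped into the Gmail
# action's dynamic content (id, deviceId, temperature, time). Illustrative only.
anomaly = {
    "id": 1,
    "deviceId": "device-1",
    "temperature": 101.0,
    "time": "2022-10-09T12:00:00Z",
}

# Placeholder URI; in the real function this is the URL copied from the
# Logic App's HTTP trigger after saving the designer.
req = build_alert_request("https://example.invalid/logic-app", anomaly)
```

Splitting request construction from sending keeps the JSON/headers logic testable without actually hitting the Logic App endpoint.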
Right, so once you paste it here and hit Done, our schema is generated automatically. This is the schema for the anomaly message coming from the Event Hub. The next thing we will do is add a Gmail action here to send an email to us. Since I have already done it, just to show you I'll go for Send email and then Change connection. Why am I using Change connection? You might not see this option the first time; it appears because I have already set this up. I'll go for Add new just to show you: put a name there, say test one or test two, and go for Sign in. When you sign in, you need to allow this Logic App access to your Gmail account. Once this is done, we need to add two more things: the subject and the body. So this is done, which means the connection is established. Now let's add the subject and body, and once they are added, let's use dynamic content. In the dynamic content, when you put ID, it means the schema we generated can be used here. So we add ID, and let's say we also want temperature; that is the body. In the subject we will use ID and time. These fields come from the schema we generated using the sample payload. We will save it and see whether this Logic App can send an email to our Gmail account. Let's wait for it. Oh yes, one more thing: we need to set where to send it, so let's say I'm using my Gmail account, and save it. So it has been saved successfully. The next thing is, first of all, to validate again whether data is still coming: yes, data is coming, anomalies are still coming. Now go to your Gmail and refresh it. Yes, so what we see here is that the anomaly has been raised, and... oh, let's add some spaces so that it's more readable, because it is not
readable: the ID, device ID, temperature, and time are stuck together with no spaces between them. Let's refresh it again, and perfect. What we see is that there is an anomaly: device 1 is reporting a temperature of 101 degrees Celsius, while the ideal temperature is around 37 or 38, so the device has been heating up, and that is an anomaly. We got the email successfully, so we have successfully created an alert using the Spike and Dip anomaly detection algorithm in Stream Analytics, pushed it through the Event Hub, and sent the alert from the Azure Function. Everything is done. Let's see... okay, the data is still coming in. Now we need to stop everything, or our Gmail inbox will be flooded: stop the Event Hub job, stop the simulator, stop everything. This is the best practice once you are done, to cut the cost and the storage usage: just stop everything. These are a few of the best practices you should follow. Now let's revise what we have done so far in this project. First step: we created a simulator in a VM, using C# as the language. In this simulator we produce two readings, a temperature and a voltage; although we only used temperature, voltage could be used for analysis as well. Then we created our IoT Hub, and one more thing we created is the IoT Hub plus DPS, the Device Provisioning Service. In the Device Provisioning Service we created a group enrollment key and an ID scope, and we used these two pieces of information in the simulator for the connection between the simulator and the IoT Hub. So when you register a device in the simulator using the Register button, it uses the group enrollment key and ID scope to register itself with the IoT Hub. Then we created a Stream Analytics job, and in any Stream
Analytics job we need three things: an input, a query, and an output, for almost all jobs. In this project the input is the IoT Hub, because the data flows from the simulator into the IoT Hub. For the first job, the output is the hot path: we need real-time information on a dashboard, and we publish this information to Power BI. Since the restriction with Power BI is that the same user must own both the Stream Analytics job and Power BI, we created a new user in Azure Active Directory, logged into the Power BI workspace with that user, created a new workspace, used the same user to log into the Azure portal, and then created the output; the workspace we had created was automatically populated there. Then we created a table and a dataset and used them to produce a report in Power BI. The second thing we did: we created a storage account and a new Stream Analytics job, again with the IoT Hub as input, and populated this storage account with different folders, such as year, month, day, and hour, and then we used this storage account with Databricks to analyze the data. The third thing is the complicated one: we needed alerting for the anomaly detection. We created an Event Hub and populated it from a Stream Analytics job; in that job we used the Spike and Dip algorithm, wrote a query, and populated the Event Hub. Next, we created an Azure Function that is triggered whenever an event arrives in that Event Hub, and once the events were flowing into the Azure Function, we validated using certain methods that on the console we
have the messages coming in. The next thing we did was create a Logic App and integrate it with the Azure Function using four or five variables, and then we validated that the Gmail part works. So this is what we have done so far: the simulation, the IoT Hub, Power BI, Databricks, alerting, a lot of things. I know this is a complicated project; it took me two or three days to complete, with a lot of debugging and repetition. So don't worry if you are not able to complete it in one go; keep practicing. If you get stuck somewhere, let me know: post a question, comment, email me, or message me on LinkedIn. I'm more than happy to help you. Again, thanks for all the help and support. Please share this video, subscribe to my channel, and like the video. Thanks a lot.
Info
Channel: Data Engineering For Everyone
Views: 15,084
Keywords: Azure IOT, Logic Apps, Event HUb, End to end project, Hands on, function App, Anomaly Detection, real time data processing, IOT simulator, Azure Stream Analytics, College Final Project, Big DAta
Id: IyPF05QKELo
Length: 129min 24sec (7764 seconds)
Published: Sun Oct 09 2022