Microsoft Azure IoT Hub - Helium Hacks Happy Hour

Video Statistics and Information

Captions
This is another fun edition of Helium Hacks Happy Hour, where I get to figure out how to do all of the things that Travis does so seamlessly. So you're at the peak of it right now. We'll go ahead and let people funnel in here, and in traditional Helium Hacks Happy Hour fashion, if anyone's new to the call, go ahead and introduce yourself real quick, open up your video, wave hello. I guess Oscar and I are new, but we'll probably get an introduction afterwards. That's true, we'll give you a very special introduction.

All right, I'm Joey, a long-time community member, working as a technical program manager at the DeWi, and I'm just filling in for Travis today. We have Microsoft with us to show off the brand-new integration that is in the Helium Console. Yannick and Oscar, I'll let you two introduce yourselves, and then we can jump right into the presentation; I know you've got something ready for us. Yannick, would you like to go first?

Sure. My name is Yannick, I'm a cloud solution architect at Microsoft, which basically means that I support our customers in implementing their cloud workloads in the Azure cloud. I have a special focus on IoT and a special interest in LoRaWAN, which is what brought me to Helium. Great, excellent. And my name is Oscar Naim, I'm a principal program manager in the Azure IoT engineering organization. I've been partnering with Yannick for a while, because one of my main focus areas is LPWA and connectivity in general, and my role is to help customers and partners take advantage of our Azure IoT platform and to listen to their feedback so that we can keep improving our product going forward. So back to you, Yannick.

Cool. Should we start right off, or is there anyone in the community who would like to say hello? We did just add a couple more people while you were introducing yourselves; if you're new to this call, please do say hello. We'll give you a couple of seconds to jump in there, don't be shy. In the meantime I will prepare my screen sharing. Don't hesitate to just roll right into it; a lot of times folks aren't in a good setting to say hello. You should already see my Azure portal, but we also have a little presentation ready. OK, you should be able to see my title slide; can someone quickly confirm? Looks great. Sweet.

OK, so we had our introduction. Again, I'm here today with my colleague Oscar and we will guide you through this presentation. We have some slides, we have some hands-on stuff, and if you have any questions during the presentation or the call, drop them into the chat window or keep them until the end; we are more than happy to answer them at the end of the presentation.

Our topic today is Azure IoT, LoRaWAN, and obviously Helium. We want to jump onto this amazing integration that Helium has done, show you an architecture we have built on top of it, show you some devices and some pretty cool stuff we have done, and also give you some insights into Azure. I would like to start with some facts and figures, just as a brief overview of the global network that Microsoft has built with its data centers.
It's a bit like Helium with its gateways: Microsoft is operating more than 200 data centers worldwide, combined into more than 60 regions, which is pretty nice because I can actually decide where in the world I want to host my services. So if I have special requirements regarding latency or data protection, I can host the data in my own country or a specific region. But I don't want to just show off numbers here; the main message I want to share is that if you trust us with your use case, we will make sure it scales, that the services run reliably, and that your data stays secure. It's also mentioned here that this is all powered by more than four million servers.

Servers are actually a great topic, because they lead to my next slide, which is a bit of a history lesson. When we think about cloud, everything started with virtual machines and virtual storage in the cloud as infrastructure as a service, and that's still the first thing a lot of people think about when someone mentions cloud: a server that is just hosted somewhere in a data center. But over the past years Microsoft has invested a ton of money to build platform-as-a-service and software-as-a-service components on top of this existing infrastructure. The goal is to make it as easy as possible for our developers and customers to build their applications on top of these services: they are standardized, you can configure them, and you avoid writing a ton of code because that work has already been done by us, so you can really focus on delivering the best business value to your customers.

I have this nice overview of all the services available in Azure, and you can see there are quite a lot of them; we could probably fill days, if not weeks, highlighting all of them. Today we're going to focus on the analytics and IoT part in the bottom right. We want to show you how you can leverage these platform-as-a-service and software-as-a-service components in Azure to very quickly implement your own IoT data, compute, storage, and visualization pipeline. You don't need to write a ton of glue code; all of these services are integrated with each other, and it's actually great fun to build IoT use cases and applications on top of them.

Looking at these components, the heart of everything is the Azure IoT Hub. It's the endpoint for all the IoT devices, and the Helium Console, for example, is just one potential data delivery system sending data to the Azure IoT Hub. You can connect a ton of other IoT devices, sometimes directly, sometimes through a gateway, but we are really not restricted to one single data delivery platform. That makes it very versatile when it comes to building your use case, because you can combine sensor data from many different connectivity technologies and sensor vendors. After the data has been ingested via the Azure IoT Hub, we have all these services around it that can consume, compute, or store the data.
Once the data has been ingested into Azure, we have these core Azure services, but around them we also have the Office suite from Microsoft and the Dynamics suite, which can then leverage the data stored in the Azure cloud. Business people actually love to use these services, because it's super easy for them to consume the data and generate their own insights. It's this self-service approach that Microsoft is driving here, which allows people to use the tools to generate their own insights, and we will also highlight this during the presentation.

What we have done over the past few months, right Oscar, is build this end-to-end architecture with a special tracker that Oscar will highlight in a few minutes. We ingested its data via Helium into the Helium Console, used the Azure IoT Hub integration to seamlessly forward the data to our Azure IoT Hub, and then used an Azure Function, a serverless function, to decode the data. You might ask why we didn't use the nice decoder that the Helium Console offers; the reason is that we had to make external API calls to a special cloud provided by Semtech to do the position computation, and that's not yet possible, and maybe never will be, from the decoder functions available in the Helium Console. Afterwards, once we had the real sensor data and the tracker location, we used Azure Stream Analytics to distribute the data into different sinks and to do some analytics on top; we will highlight all of this later in the presentation.

So let's start at the beginning of the pipeline, where the data ingestion actually happens. Oscar, you have the device and the gateway with you? Yes, let me share my screen here. OK, great. This is the LoRa Edge tracker by Semtech, the one you are seeing on the screen, the LR1110. One thing that's very interesting about this tracker is that it doesn't have a GPS: it scans for Wi-Fi and other geolocation-related signals in order to find clues about where the tracker is, and it pushes all this information to the cloud, because all the computation of the location actually happens in the LoRa Cloud by Semtech. The idea is to make that computation more cost-effective and to save money, not only in terms of the hardware of the device itself, because it doesn't carry the GPS, but also in terms of energy, because of course GPS consumes a lot of battery. And that's the reason why, as Yannick was mentioning, we needed to make those API calls to get the corresponding lat/long coordinates.

We are also using the Dragino gateway, and the reason we are using that one is that it's very convenient. I got a few of these a couple of weeks ago, and all you need to do is install the Helium software on it, which is a very straightforward process. Once you do that, you just plug it into your router at home and voilà, that's all you need to do. You register the tracker with Helium and it just gets connected to the network via the Dragino gateway. So, is there anything else that you want me to show now, Yannick?
No, just to quickly mention: luckily it's not a full Helium miner, so we're not benefiting from the HNT it would be generating, but for providing connectivity this was perfect. Yes, exactly. I was going to jump in and clarify just in case: this would be considered a data-only hotspot. Correct, correct. Yannick, let me know if you want me to show the Console and the other aspects of the deployment. Yes, sure, please; I think that's a very good tie into the whole story, so people can see there is a real implementation here.

The integration with the Helium Console was very easy. Here you can see the IoT Hub integration; we already have one defined, and you just need to provide a few configuration parameters, and that's all there is to it. Then you define your device; here we have our LoRa Edge tracker, we already have the integration defined, and you just need to connect the tracker with that integration, and that's all you need. You can see that for the past several days we have had messages going through. Here we have the actual IoT Hub instance in the Azure portal showing the messages going through, and we have a Power BI dashboard that we created with an embedded Azure Maps visual. This is an example of how easy it is to visualize lat/long coordinates with Power BI, because all I did was create a tile based on Azure Maps and then import the lat/long values; that was it, nothing else. So, very simple. Back to you, Yannick.

Awesome, thanks. Now that Oscar has shown us the management portal with the nice visualizations, I want to give you some insight into how we implemented this based on the components available in Azure. The first component I would like to highlight is, as Oscar already mentioned, the Azure IoT Hub, basically our main ingestion point for IoT data. It allows us to connect any devices out in the field communicating via MQTT, a very well-known IoT protocol, and we also have fallbacks to HTTPS or AMQP. If a device doesn't speak MQTT directly, we can always use what we call a protocol gateway to translate the protocol into MQTT, and there are blueprints and open-source code available showing how to do this. By the way, you're not sharing your screen. Oh, good that you reminded me, thanks, and sorry about that. We like seeing the Azure Percept behind you, but that's not the topic of today. Maybe once we have the Azure Percept connected with Helium, then we'll talk about it. I was actually just showing this slide, so you didn't miss much yet.

The really nice thing about the Azure IoT Hub is that it scales with the workload. I can start really small with a few devices and pay a correspondingly small price, but I can scale up to millions of devices and scale the IoT Hub along with them, and we definitely have customers in the field leveraging this, with several million devices communicating via the Azure IoT Hub. Next to the IoT Hub we also provide a Device Provisioning Service, which is not super relevant for Helium because this is already handled by the Helium Console, but if you have devices connected directly to the Azure IoT Hub and you just want to deploy them into the field with one single connection string pointing at one central, global service, you can leverage the Device Provisioning Service, which then provisions each device and assigns it to the right IoT Hub.
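For anyone who wants to try the direct-device path described above, here is a minimal sketch of sending one telemetry message to an IoT Hub over MQTT with the azure-iot-device Python SDK. The connection string and payload fields are placeholders, not values from the demo.

```python
# Minimal sketch: send one telemetry message to Azure IoT Hub over MQTT
# using the azure-iot-device SDK (pip install azure-iot-device).
# The connection string and payload fields below are placeholders.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>"

def send_telemetry() -> None:
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()  # connects over MQTT by default
    payload = {"temperature": 21.5, "humidity": 48}  # example fields only
    msg = Message(json.dumps(payload))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message(msg)
    client.shutdown()

if __name__ == "__main__":
    send_telemetry()
```

In the Helium scenario shown in the demo, none of this is needed on the device side; the Console integration delivers the uplinks to the hub for you. The sketch only illustrates the "directly connected device" case mentioned above.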
Good, so let me give you a quick overview of how this component actually feels in the Azure portal when we use it, and I want to show you some quite cool features as well, once my presentation cooperates. Good, OK. As Oscar briefly showed, there is the overview here, where I can very quickly see what's going on in this IoT Hub, and one of the major components is device management: I can see all the devices that are registered here. This is another nice fact about the Helium integration: all the devices are synchronized with the Azure IoT Hub, so if I create a new device in the Helium Console, it will automatically appear in the device registry of my Azure IoT Hub. Once the device is created, it also automatically gets a so-called device twin, which is basically a JSON document that lets me keep static information about the device. I could leverage this to keep metadata about my device and have it as a single source of truth among my applications, using it as a kind of device database that stores data about the device, for example its vendor or manufacturer.

I briefly mentioned before that the IoT Hub allows bidirectional communication, and this is also implemented by the integration, so I can choose a device here and send a message back to it via my IoT Hub integration. I can open this up and create a message body; I hope it works so that it's translated automatically into hex. Let's quickly check: if I send this out to the device, we should hopefully see it in the Console as well. So now a downlink is scheduled on the Helium Console. I have done it via the portal here, which is just for the demo, but there is obviously a fully fledged service API behind the IoT Hub, so I can easily use this feature programmatically in my services. Let me see if a downlink has been scheduled, or let me check the real-time packets here. I think we have some trouble joining the device at the moment, but I have seen it working, I promise.

OK, so this is the data ingestion part. If you want to start using the data generated by your devices in Azure, you have different options. You have the built-in endpoints where you can fetch all the data landing from the devices, but you also have this nice option to do message routing. Message routing allows you to check tags in the message and route based on them: I can define custom endpoints, define a route on top, and the Azure IoT Hub will dynamically analyze the tags; if a tag matches the corresponding route, the message is redirected to that endpoint. This also includes message enrichment: if I want to use the device twin I showed before, I can leverage it through this enrichment functionality, so I can effectively join the data created or added in the device twin onto the message via message enrichment.
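As he notes, the downlink can also be scheduled programmatically through the IoT Hub service API rather than the portal. As a rough illustration, not the code from the demo, a cloud-to-device message can be sent with the azure-iot-hub Python service SDK roughly like this; the connection string, device ID, and payload are placeholders.

```python
# Rough illustration: schedule a downlink (cloud-to-device message) programmatically
# via the IoT Hub service API using the azure-iot-hub SDK (pip install azure-iot-hub).
# Connection string, device ID, and payload are placeholders, not values from the demo.
from azure.iot.hub import IoTHubRegistryManager

SERVICE_CONNECTION_STRING = (
    "HostName=<your-hub>.azure-devices.net;"
    "SharedAccessKeyName=service;SharedAccessKey=<key>"
)
DEVICE_ID = "<helium-device-id>"

registry_manager = IoTHubRegistryManager.from_connection_string(SERVICE_CONNECTION_STRING)

# The message body becomes the downlink payload that the Helium IoT Hub
# integration schedules for the device, as shown in the portal demo.
registry_manager.send_c2d_message(DEVICE_ID, "01", properties={"source": "demo"})
```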
Good, so let's see what's next in our pipeline. Once we have ingested the data into the IoT Hub and it's ready to be consumed by our services, we're now looking at serverless Azure Functions. That's a really nice platform-as-a-service component which allows me to run code without taking care of any runtime environment or even operating system: I just deploy my code, define which language I've written it in, and Azure takes care of the rest. Again, it's fully scalable, so I can start small and scale up enormously. It's function-based programming, so typically an Azure Function should serve one single purpose, and as I mentioned, you don't have to do any infrastructure management; that is all taken care of for you. There are also some really nice pricing models available: the standard one is that you pay per execution, and I think the first 100,000 executions are free, so you can really test a lot before you actually start paying for the service.

Again, I wanted to show you quickly how this feels in the portal and in a development environment. I have Visual Studio Code running, and this is the function that does the decoding of the data we get from our LoRa Edge tracker. I'm not sure how familiar you are with Python, but we are doing some decoding here: we call this API and receive the latitude and longitude based on the data that was sent by the tracker, and that all happens in here. It's actually super nice to develop these Azure Functions locally, because with Visual Studio Code you have a free development environment, and you get the Azure Functions runtime as well, so you can test your function directly on your machine. I can just run it here, it starts up, and you are even able to debug the whole function. It's also configured to use the cloud directly, so my Azure Function running locally connects to my IoT Hub running in the cloud and tries to fetch the messages that have arrived there. Let's see; I don't think we have any messages waiting, because the device's messages have already been processed, and also because I'm not moving, and you really need to move to get the tracker going. Doing this kind of thing with it doesn't help very much. Maybe you need to go for a quick run; you have about 15 minutes until your next part, so we can generate some test data. We should probably also mention that this code will be open-sourced on GitHub in the near future. Absolutely: the code will be open source on GitHub, and the whole infrastructure will also be available as open-source infrastructure as code, so it will be super easy for you to deploy this into your Azure subscription and have a nice foundation to start with.
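While the actual decoder will be published on GitHub, a sketch of an IoT Hub-triggered Azure Function in Python, in the spirit of what is described above, might look like the following. The geolocation URL, token, setting names, and request/response shapes are placeholders and assumptions, not the real LoRa Cloud contract or the demo code.

```python
# Sketch of a Python Azure Function triggered by the IoT Hub's Event Hub-compatible
# endpoint. It decodes an uplink forwarded by the Helium integration and asks an
# external geolocation service for lat/long. Endpoint, token, and payload shapes
# are illustrative placeholders only.
import json
import logging
import os

import azure.functions as func
import requests

# Assumed app settings; the names are illustrative, not from the demo.
GEOLOCATION_API_URL = os.environ.get("GEOLOCATION_API_URL", "https://<geolocation-service>/solve")
GEOLOCATION_API_TOKEN = os.environ.get("GEOLOCATION_API_TOKEN", "<token>")


def main(event: func.EventHubEvent) -> None:
    """Decode one uplink message and resolve its position via an external API."""
    body = json.loads(event.get_body().decode("utf-8"))
    logging.info("Uplink received: %s", body)

    # Hand the tracker's scan data to the external solver and read back lat/long.
    response = requests.post(
        GEOLOCATION_API_URL,
        headers={"Authorization": GEOLOCATION_API_TOKEN},
        json={"payload": body.get("payload")},  # assumed field name
        timeout=10,
    )
    response.raise_for_status()
    position = response.json()
    logging.info(
        "Resolved position: lat=%s lon=%s",
        position.get("latitude"), position.get("longitude"),
    )
```

This assumes a `function.json` with an `eventHubTrigger` binding pointed at the IoT Hub's built-in endpoint, and `requests` listed in `requirements.txt`.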
So now that I have developed my function locally, I decide I'm ready to deploy it to Azure, and the only thing I need to do is choose the right function app in my Azure subscription and click "Deploy to Function App", and my code is seamlessly deployed into this function environment in Azure. If we switch to the Azure cloud again, one second, I can show you how this looks. Here we go: this is my function app running in the cloud. As mentioned, we unluckily didn't receive any events lately, but I have this fully fledged function environment: I can monitor my function and see the number of executions; we had some successes, and we also had some errors because I was testing some things, and it's all visible within this function runtime environment. As I said, I developed this Python function locally, just deployed it to the cloud, and it ran seamlessly, and it keeps running until I say, OK, please stop now.

This Azure Function takes care of decoding our LoRa message. Now let's move further along the pipeline: we want to do some analytics on top, and for this we have the Azure Stream Analytics component. It allows us to do real-time analytics on the arriving JSON documents, because basically all the sensor data we receive is handled as JSON within Azure. We have different inputs; at the moment all our data arrives at the Azure IoT Hub, because it's dynamic sensor data, but I could also consume static data from a storage account, and I can use reference data stored in a SQL database or, again, in blob storage in an Azure storage account, to enhance or enrich the data sent by the sensors. Once I'm done with the analytics part, I can seamlessly feed the data into, for example, a visualization tool like Power BI, a database, or even a data warehouse; there are many different integrations available to seamlessly store or visualize the data. That makes Stream Analytics a super powerful tool for doing real-time analytics on this dynamic data.

Again, let me jump into the portal to show you how this looks and feels. Unluckily, in this project it's not that sophisticated, but I have some other jobs running from other projects I've been doing. This one is quite straightforward, because I just take everything arriving in Stream Analytics and write it into blob storage at the moment, so there's not much going on here. But I can show you some other queries which are a bit more sophisticated. For example, here we have a job running, again a LoRa-based project, monitoring environmental data. What's happening here is that I'm using reference data stored in a SQL database, I do a join on a certain tag in the sensor data, and with this join I'm able to reference the data stored in the reference database and enrich the data arriving in my Stream Analytics job. If you're familiar with SQL, it's really close to SQL, so if you already have some skills there, it's quite easy to apply them in Azure Stream Analytics.
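As a rough sketch of the kind of query he describes, written in Stream Analytics' SQL-like query language: a reference-data join plus the sort of threshold check he shows next. The input, output, and reference-data names and fields are placeholders, not the ones from his jobs.

```sql
-- Illustrative Stream Analytics query (placeholder names): enrich incoming
-- sensor JSON with reference data and write threshold violations to a
-- dedicated output (the "alarming" sink described in the talk).
SELECT
    s.deviceId,
    r.roomName,
    s.temperature,
    s.humidity,
    s.EventEnqueuedUtcTime AS receivedAt
INTO
    [alerts-output]
FROM
    [iothub-input] AS s
JOIN
    [rooms-reference] AS r
    ON s.deviceId = r.deviceId
WHERE
    s.temperature > r.temperatureThreshold
```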
The other query I wanted to show you is even a bit more sophisticated, because it's a kind of rule engine that checks thresholds. This is a smart-building use case where we are monitoring humidity and temperature. Again I'm doing this join of the reference data from the reference database, and then I check whether a certain threshold has been reached; every message is analyzed against this threshold, and if it is reached, I divert that data into another data sink, a different output. With this I can very easily implement a kind of alarming system or pipeline. And we're really just scratching the surface of the functionality of Stream Analytics, because I can also do quite nice things like aggregating data over a certain time frame: I can aggregate data over five or ten minutes, or I think up to seven days, and then do analytics on top of the aggregations that have been built. It's a really powerful tool for this real-time analytics.

OK, and once we are done with this, the next part of our pipeline is to generate the business value, and for this I would like to present Power BI, which is the go-to visualization tool when it comes to actually making use of, or showing, the data. Oscar briefly showed it before with the map; there we were just showing latitude and longitude, but Power BI is a fully fledged business intelligence suite that allows you to import different data sources and build visualizations on top of them. In the architecture we presented here we were only visualizing latitude and longitude on a map, but I also wanted to show you some other cases I've implemented in the past that I'm actually allowed to show, because they are hackathon projects. One is again visualizing environmental data, with a nice map and some sensor data visualized; it's just super easy to create these visualizations on top of data that has been streamed into certain datasets or a database. The other one is a smart-building case: you can see a floor plan implemented here, with a kind of heat map based on the temperature, and you can step into the rooms and check the temperature over time, humidity, and so on. I'm really no business-intelligence expert, and still I managed to build these quite nice-looking reports. I have the desktop application here, but Power BI is also available on mobile, so I can check these reports on my phone, and it's also available in the browser, which makes it a really nice way to consume the data. In fact, the one I was showing was actually the report in the browser, and it's also worth mentioning that on the phone the UI is responsive, so it adapts to the device; you don't need to do anything, all the reports are adapted for that particular screen size. Absolutely, thank you, Oscar. This has mainly been focused on IoT data, but next to the IoT data I could also bring in, for example, weather data, so I can really tap into a lot of different data sources, all consolidated in Power BI.
All right, and that's basically the pipeline we wanted to show you, implemented with the different Azure services. We had to write this code for the tracker decoding, but everything else runs just by configuring the services within Azure, so it was super straightforward for us to create this architecture. As I mentioned before, we will provide this architecture to you in our GitHub repository, and hopefully Helium will let us publish the announcement via Discord so we can let you know once it's ready.

Now that we have had a look at this architecture, I want to show you what's next on our roadmap, and there's one component we can't miss, one big, quite central component called IoT Central, which is quite a highlight in our IoT landscape. If you remember the slide I showed before, we have all these nice services that allow you to build your own IoT architecture, but to be honest, even though they integrate quite nicely with each other, you still need to have the skills to put them together. Our product group realized this and came to the conclusion that we need to make it easier for our customers to build their end-to-end use cases, and they came up with a product called IoT Central. It is built on top of all the services I've shown you before and abstracts them behind a nice dashboard, allowing you to build your applications on this application-platform-as-a-service. It's ready for your end-to-end use cases: you can bring your sensors, connect them, visualize the data, and everything is nicely abstracted in this dashboard. Again, it's enterprise-grade, fully scalable, and ready for prime time. Did I miss anything here, Oscar, do you have any additions?

Well, I guess we should also say that IoT Central is an application-platform-as-a-service kind of offering, and this is a concept that's going to keep evolving: on one hand you have all the different PaaS services, and on the other hand you have everything managed from an application point of view in IoT Central. These are two separate concepts today, but you can expect them to become seamless as time goes by, so it's going to become easier and easier for the user to move from one to the other. That's perfectly summarized, actually, and it also brings us to the next slide. You can already see that we also offer these kinds of blueprints within Azure IoT Central: existing use cases that have been built and can be reused as a foundation for the IoT use case you want to implement. Again, it takes care of all your IoT data and then integrates nicely with the data-consuming applications Microsoft provides on the business side, like Power BI, which I showed before, our no-code development environment called Power Apps, and basically all the other Azure services that help you compute and present the data.

All right, and this brings me to the final part, because Azure IoT Central is also a perfect foundation for Azure Digital Twins. Oscar, you'll give us your insights there, right?
Yes, yes, let me share my screen. There we go; hopefully you can see my screen. Great. So, Digital Twins. The first thing I wanted to say is that the concept of a digital twin is not a new concept; I believe the first to bring it in was NASA, for the Apollo missions, because you can imagine the complexity, from an engineering point of view, of putting a device in space. We see that today with the rovers on Mars; that thing is amazing, even the helicopter. Imagine managing that from Earth: there are so many things that can go wrong, and having the ability to model and simulate that device is critical. Today, mostly, we started with just doing connected assets. You have an asset, say an elevator in a building, and it starts to generate information and push it to the cloud: how many times the elevator goes up and down, what floor it's on, how much weight it's carrying, the temperature, et cetera. That's useful information from the elevator itself, because the elevator company can use it for predictive maintenance and all sorts of other things, and that's great. But let's say you want to start enabling other kinds of scenarios, where you have several people on one floor all calling the elevator to go to another floor. If you had more information about what's happening in the building, all of a sudden you could start controlling the flow of people within the building and making it more efficient. For that to happen you need a connected environment, and of course when you look at a smart city, what you have is really a system of systems, so you need to be looking at connected ecosystems. This is where digital twins come into play, because you need to be able to model all these environments in a way that makes it possible to represent the scale and complexity.

We have identified several areas where we are investing from a digital-twins point of view: scenarios in manufacturing, retail, real estate, citizen government, healthcare, agriculture, education, finance, and energy. Take manufacturing, for example: we have multiple examples where you are monitoring a machine in a plant, and of course the typical use case is predictive maintenance, but if you want to monitor the whole production line, then you need to be able to monitor not only the individual components but the whole line, and in many cases the output of one plant becomes the input of another, so a quality issue in plant one might affect plant two. You need to be able to take all of these interactions into account.

If we look at this nice representation, let's say we have our LoRa Edge tracker on a pallet, and that pallet goes onto a truck. Yes, we can certainly monitor where that pallet is at this moment in time, because we can track its location. But let's say we want to enable a more complicated scenario and look at the whole system. Say this is a supply-chain scenario, and the tunnel right in front of that truck is closed,
so the truck needs to take a detour, and that can affect the supply chain: perhaps now we have to schedule different deliveries from another truck to the warehouse where that shipment is going, so that it arrives on time. The only way to adapt dynamically is if you have information about the whole system. Now, building digital twins is not easy, because there are many issues you typically have to face: the models can be very complex, you're dealing with siloed data all over the place, you need multiple building blocks to consider when building the solution, and what you build has to be scalable, because otherwise it's just a toy. Then you need to sync that digital twin with real-world updates. That's where our Azure Digital Twins product comes into play, and for this I want to show you a very quick video, because it summarizes the vision of Azure Digital Twins. Let me play that.

[Video] Imagine our world today: people, vehicles, businesses, buildings, hospitals, factories, entire cities, and the billions of devices that connect us to this world and to each other. This is the physical world. But what if we could replicate that physical world in its digital twin? A digital twin that is bound to the physical world in real time, that we can experience in mixed reality, that we can collaborate with, that we can run simulations on and find what is important to us, that we can apply AI to, to learn, predict, and act: to save time and money, to reduce carbon, to preserve natural resources, to improve safety, to bring us all closer. This is already happening today. Across the world, organizations are taking advantage of this new wave of innovation that trends toward an interconnectedness that enables the metaverse. Everything modeled in the metaverse mirrors the status of its physical twin, including the interactions and relations with everything else. By applying the power of the cloud, we eliminate the boundaries of what this can do; the potential is limitless. We can now track and analyze data from connected environments to identify patterns, trends, and anomalies. We can now simulate any possibility, evaluate outcomes, and determine the impact of any change or condition. We can now harness AI to perceive the physical world, to improve and automate tasks, to give superpowers to your frontline workers. We can now give everyone in your organization the power to build apps and workflows, to collaborate with each other in this virtual space, to share and receive expertise at the right time, with the right context in front of you, to move through the physical world and get relevant information about its digital counterpart when and where you need it, using mixed reality. The Microsoft cloud brings these capabilities together with the limitless computing power of the cloud and intelligent devices at the edge, and creates this framework for building immersive and impactful digital twin, mixed reality, and metaverse solutions.

By the way, a couple of things about the video: any similarities with the upcoming Matrix movie are purely coincidental. But I also wanted to say that what you see in the video is not made up; it's all based on real use cases and real deployments, which is amazing, because we are beginning to see the realization of that vision, and of course it's going to keep evolving, because things are definitely going that way.
So, with Azure Digital Twins you can today pretty much model any environment and bring it to life with real-world data. You can query the digital twin graph, which is the instantiation of the model you have built, and get answers to your business-related questions. You can connect any data to it, from IoT to business systems; you can route data from the digital twin to any external endpoint for integration with other services; you can manage access and identity with role-based access control; and you can bulk-import for massive deployments. We have a tool called Azure Digital Twins Explorer, which I'll show you briefly in a second, for visualizing and interacting with the graph in 2D, and a 3D version is coming soon as well. We have Azure Data Explorer integration for data history. And the product is enterprise-ready: there are multiple partners and customers today actually using it in multiple ways.

Let me now switch over. Here, let me move this so that I can go through the tabs. This is the portal again, the Azure portal, and this is an instantiation of Azure Digital Twins. I won't go over all of this, because I already talked about most of the details, but basically this is how you first create your digital twins. One reason we are able to model anything is that we use a language called Digital Twins Definition Language, or DTDL, which is based on JSON-LD. It's a very powerful language that allows you to model anything, from people to places, processes, and things. Then you bring that to life by creating a digital representation of that model, which is called a digital twin graph, and in that graph you overlay all the data that's happening in the real world and keep it up to date and in sync. That way you can actually query the graph and get the answers you're looking for. There are also many extensibility features that let you add business logic and data processing and integrate with other business systems.

Now let me actually show you the Digital Twins Explorer. If you go to GitHub and search for Digital Twins Explorer, you'll find the project; I deployed the solution on my machine, and it's very straightforward. This is a very simple graph, but I just wanted to use something simple to make the case: it's a building that has floors and rooms, and if I click on one of the elements of the graph, it shows me its properties; we have two properties here, humidity and temperature. Then I can query the graph with SQL-like queries and find, for example, all the rooms where the temperature is above 75 degrees, or I can change it and look for the opposite, all the rooms where the temperature is below 75 degrees. As you can see, it's very easy to interact with the graph, again with SQL-like queries, and it's a very powerful visualization. So with that, back to you, Yannick, so that we can close.
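The same kind of query can also be run programmatically. A minimal sketch using the azure-digitaltwins-core Python SDK might look like the following, where the instance URL is a placeholder and the `temperature` property is assumed to exist on the twins, as in the demo graph.

```python
# Minimal sketch: query an Azure Digital Twins graph programmatically with the
# azure-digitaltwins-core SDK (pip install azure-digitaltwins-core azure-identity).
# The instance URL is a placeholder; the "temperature" property is assumed to
# exist on the twins, matching the demo graph.
from azure.identity import DefaultAzureCredential
from azure.digitaltwins.core import DigitalTwinsClient

ADT_INSTANCE_URL = "https://<your-instance>.api.<region>.digitaltwins.azure.net"

client = DigitalTwinsClient(ADT_INSTANCE_URL, DefaultAzureCredential())

# The same SQL-like query shown in the Explorer: all twins above 75 degrees.
query = "SELECT * FROM digitaltwins T WHERE T.temperature > 75"
for twin in client.query_twins(query):
    print(twin["$dtId"], twin.get("temperature"))
```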
Absolutely. I think one highlight feature of Digital Twins is that information changes within the digital twin are automatically propagated through the different levels, so if, for example, a temperature changes in one room, that change propagates to the next level, which might be the floor, and with that event trigger you can start to build your logic on top of it.

All right, I guess we can close the presentation. This time I won't forget to share my slides again; one second. Good, here we go, I just need to quickly jump through. And just a quick, well, it's not really a call to action: if you want to get your hands dirty and start testing our services in the Azure cloud, it's basically free. You can sign up, and most of the services even offer a free tier so that you don't ever have to pay. We're not seeing anything, Yannick, by the way. Oh, perhaps you forgot to click share; that happened to me the first time. Yeah, you're right; and I really wanted it to work. Now it works; I messed up my whole nice call to action again. OK, this is just what I wanted to show. There you go. You probably can't click the link, but there should be a link in here; just quickly Google for the Azure free account. You can see that we have a lot of free services, you get 200 dollars of Azure free credit, and we will also publish the architecture I showed before on GitHub, so that you can very easily deploy it into your subscription. So just get started in the Azure cloud and get your hands dirty.

I guess that's it from our side, so let me quickly switch to the last slide. Feel free to shoot any questions at us; you'll find our contact info here. Feel free to connect if you want to discuss any nice IoT cases, or IoT cases in general; we are more than happy to support you. Yes, indeed, we're very interested in LoRaWAN-related cases. Microsoft recently joined the LoRa Alliance, and as you can see from the integration we have done with Helium, this is really a very strategic area for us, so if you have any interesting use cases in this area, by all means let us know; we'll be happy to help.

Oscar, this was a fantastic presentation. If you have a couple of minutes, we usually open things up to questions, if anyone has any right now. Sure. I think everyone's in a bit of a state of information overload; there's so much in there, and you answered so many questions along the way. I quote: "the Matrix is non-fiction." I like that. Oh yeah, if you ever get the chance to play with the HoloLens, maybe at a conference or something, it's amazing, it's really next level; you have the feeling you're already in the future, so the possibilities there are limitless.

Those are certainly really cool, and what's really exciting about all of this is that we have, especially on this call, a lot of folks playing with sensors and discovering all of the opportunities that are available there, and I think the next frontier for a lot of folks is figuring out what to do with that data. The presentation you've put together offers a whole range of solutions with that in mind. I think we're going to see a lot of people starting to hack on the Power BI platform just to start getting these dashboards together. Oh, Travis just reminded me: we can give one of these away, so speak up if somebody has a cool IoT use case that will fit right in with this newfound knowledge.
But yeah, finding ways to get data straight out of the Helium Console and into a database and then representing it through Power BI: I think we're going to see some people playing there. There is a question: does the free trial give you full use? For the most part, yes. It varies from service to service, but in many cases you have a free tier, and in the cases where you don't, the 200 dollars you get when you sign up for Azure goes a long way, because all you need to do is be careful and only select the scale that you need, and for a few devices and a few messages pretty much everything is free, so that shouldn't be an issue.

Can I ask a question? Yes, please. Thanks for the presentation. I actually have to leave shortly, but I was so intrigued by the non-GPS tracker that you showed at the beginning. I have a potential client that's interested in asset tracking, and where I live, in southern British Columbia, there are quite a few areas that don't have cellular data coverage, but there's actually more LoRaWAN coverage, at least from what I can tell when I go around with my mapper. So I'm wondering: could I use your platform to create, just to start with, a simple sort of white-label asset-tracking service for them? And the second question: do I need to be a developer, or can I do most of it with no code, pulling together the widgets and so on?

I think, for the most part, well, let's go one thing at a time. I think LoRaWAN might be a very good choice. At some point, though, you need to connect the gateway to the cloud, so typically you find a location where you can place the gateway and then use cellular to get to the cloud, with all the sensors connecting to that gateway; if the gateway is in a good location it can see up to ten miles or so. We have many use cases in remote locations that do exactly that, and because the LoRaWAN network runs on unlicensed spectrum, it's pretty much like a private network, so you don't need to do much to get that running. Then, from a solution-building point of view, most of the code that was required for this was for the translation of the payload from the tracker into lat/long, and we did that and will make it open source. If you leverage that, it's probably just the application layer that you will need, and if you use something like IoT Central, it will really be just configuration, if anything, that you will have to do.

Actually, this reminds me of a good point, because one thing I missed mentioning is that at the moment we have the IoT Hub integration with Helium, so we can seamlessly send data from the Helium Console to the IoT Hub, but obviously our goal is also to seamlessly integrate IoT Central. We will definitely be working with Helium's product team and our developers to integrate IoT Central with Helium as well, which will make it even easier for you to build your application on top of this. And can it be white-labeled, or would it look like my own solution? Yes, yes, it can. Great, thank you so much. No worries, thank you.

I have a question. I missed quite a bit of the presentation, so forgive me if this was covered, but I have a client that wants daily reports from tracking.
Currently we're using a satellite system, and I have access to the API, but the only report available from that service is just an Excel spreadsheet. The client wants a pretty report with maps and locations of all the different assets, and I've been trying to find a service that could generate a PDF report, emailed daily, where I could break down assets by location, because I have fleeting areas; it's a global barge transport company, so they want to know where all their barges are in the world. I can't find anywhere to create these custom reports that show a map by location and that are also smart enough that, if a barge is not in a fleeting area but in the middle of, say, the Gulf of Mexico, they don't just get a map that's blue water, or brown water, whichever one; we're down here in Mississippi. So, something smart enough to realize it needs a regional map, or something a little more detailed.

Yep, I think this is exactly where Power BI has its sweet spot. You could even start by just importing the data from the Excel sheets into Power BI to visualize the spreadsheet data, and then, as a second step, think about streaming the data from the satellite service directly into Power BI; this is definitely that use case. I get the data from the API. Absolutely. The other thing is the integration with Azure Maps: when I showed my Power BI report, it was an actual map visual, so all the mapping parts come from Azure Maps, and that would be very handy for visualizing any kind of geospatial data. There is also Cosmos DB and the ability to represent and store geospatial data there, so depending on the size and scale of your project it might be worth considering that as well. But from a reporting-tool point of view, Power BI with the Azure Maps integration would certainly be a good choice to get started with. By the way, that integration is in preview, so in Power BI Desktop you first need to enable preview features so that you can see the Azure Maps visual; if you Google it, it's very straightforward, but at Microsoft, for some reason, we like to make things a bit more complicated, so you have to do the extra step to get there.

Is there a pretty robust community, like a forum, where people help each other out? Because on some of the other platforms I use you can go on there, but there might only be one or two posts per day, so you post a question for help and it might be a week or two before somebody answers. There's definitely a worldwide community available. On one side you have Stack Overflow, but we also have a Stack Overflow-like community in our own forums, so you will definitely find help there. And I think your use case is straightforward enough that you should have everything you need for it. It's two positions a day; we'd start out testing with one, and then they want to grow and be able to expand beyond just the large assets to smaller assets as well. Oh cool, very cool. All right, I'll start by Googling and getting my account set up. Sure, and if you need any further information, just feel free to reach out.
Yeah, I wrote down your email addresses. Perfect, there you go, and there's a phone number there as well. Good. Cool. Well, I'll delay us for a little second just in case somebody else has another question, but otherwise, thank you both for jumping on and giving us such an in-depth overview of everything you've been working on and bringing to the Helium ecosystem. I'm really excited about it. We're not done yet! Thank you very much, and thanks to the Helium community for this opportunity; we really appreciate it. Thank you. Great, we look forward to seeing more. Cool, will do. Take care, guys. Thanks, everybody. Take care. Bye-bye.
Info
Channel: Helium
Views: 2,855
Keywords: microsoft azure iot hub, helium azure, console integration azure, azure console, azure iot hub helium console, helium console microsoft, microsoft helium, console integration, helium console iot, helium console 2.0, console 2.0, lorawan, internet of things, integration console, helium console devices, helium console add devices, helium console import devices, Helium, helium hotspot, helium data credits, azure helium hacks, iot hub helium, helium microsoft, hnt microsoft
Id: pjPHVTFkbug
Length: 69min 45sec (4185 seconds)
Published: Thu Dec 16 2021