Azure Serverless Conf - Americas

Captions
Hey everyone, welcome to Azure Serverless Conf! My favorite part about serverless technology is that it's so easy to get started. My favorite part of serverless technologies is the ability to be productive without having to worry about all the underlying bits and pieces. My favorite part about serverless technology is that I can get started super quickly and I only pay for what I use; just let me write my code, build my integrations, spin up databases, and Azure, you take care of the rest. My favorite thing about serverless technologies is no infrastructure maintenance: you don't have to think about compute or storage, you just get started and start building your apps. I'm looking forward to connecting and engaging with our community and all of the viewers that join us today. I'm looking forward to hearing from people all around the globe about what they're doing with serverless, and seeing what I can learn to make my applications even better. Over the next 20 hours we will have some fantastic live sessions showcasing many of the serverless technologies available in Azure, as well as a host of sessions you can watch on demand. Enjoy the event. Enjoy Azure Serverless Conf!

Hello, hello, hello! Welcome everyone to our very first Azure Serverless Conference. My name is Nitya Narasimhan and I'm a Senior Cloud Advocate on the Developer Relations team at Microsoft, and joining me from our amazing MVP and RD community is Melody. Hi Melody, how are you doing today? I'm great, so excited to be here. I'm joining from New York, and Melody, you're joining from British Columbia, Canada, so take a minute and tell us in the chat where you're dialing in from; we'd love to see you from all over the world.

Let's first talk a little bit about what we have in store for you today. Azure Serverless Conf was organized as a stage for the Azure developer communities, for both the community and industry experts to share what they've been doing with serverless in their applications. So we're going to talk about things like Azure Functions, Cosmos DB, Azure SQL; we've even got a battle royale between Logic Apps and Power Automate. Over the next 20 hours you're going to see and hear from community members all around the globe, and they've got amazing sessions. But first, how are we going to start this off, Melody?

Well, the Azure Serverless Conference is three hours of live streaming, as well as an entire track of on-demand sessions that are available now at aka.ms/azureserverlessconf; I'll pop that into the chat to make it a little easier to find. Each of the three live streams will have its own unique content delivered by members of our Azure community, such a great community that we have. If you miss any of these live stream sessions, no worries: all the sessions will be available on demand shortly after each session concludes. Each of the live sessions will have live Q&A with the speakers, so if you'd like to ask questions, be sure to join the live stream on Learn TV and submit questions in the chat. So what do we have going on today, Nitya?

Wow, where do I even start? I'm going to let you talk about the keynote, but we have a number of speakers: we're going to have talks on media processing workflows, a battle royale, as I mentioned, between Logic Apps and Power Automate, an amazing session on how you can use serverless for large-scale gaming, and people talking about Cosmos DB and Azure SQL usage, and a lot more.
But we're kicking this off with an amazing keynote, if I'm not mistaken, right? Yeah, thanks for that great overview. There are so many great things going on today, and I'm super excited about the first session of the day, the keynote: How the .NET Community Team Uses Serverless to Automate All the Things, with Jon Galloway and James Montemagno. Let's take a watch.

Hey everyone, how's it going? Oh, it's going wonderfully. We're super excited to be here and to kick off all the things, so we're ready when you're ready. Let's do this, over to you guys. Thank you; let's see if we can bring up our slides. Awesome. Well, I'm James Montemagno, and, like Melody said, I'm Jon Galloway, and we're on the .NET community team. Jon did come up with that absolutely astonishing title, but it just didn't fit on the slide, I'm not gonna lie. So we're going to show you how our team automates all of the things, so we can do more things and then automate more things.

You might be wondering, who is the .NET community team? It's Jon and I, but we also have amazing team members from all around the globe. Our mission, straight from our OKRs, is to enable, grow, and nurture a diverse and inclusive .NET community, where every single individual feels welcome and has the ability to learn how to build amazing applications with .NET. And of course you can build serverless stuff with .NET, which is absolutely awesome.

So what does a .NET community team actually do? That's a great mission statement, but what do we do? Well, we do a lot of things. We help build and maintain not only the .NET website, where millions of people go every single month to learn about .NET, but also interactive pieces like .NET Live TV that we'll talk about today. We help produce awesome content for Microsoft Learn so anyone can learn for free how to build applications with .NET. We support the amazing user groups and community members around the globe. We present at conferences just like this one, and also put on conferences like the upcoming .NET Conf (dotnetconf.net), which is happening in November, so make sure to check that out. We run workshops for the community, provide architecture guidance, help with the MVP program, and help with the blogs. And that's just eight of the things we do; we do so much awesome stuff, and we love it because we love .NET and we love the community.

Yeah, like James is saying, we have a blast doing all kinds of stuff. Some of the things I've worked on recently are .NET Conf, helping organize that, contributing to the .NET website (we just had some features launch last night), helping run live streaming shows, and so much more. And while we do have an amazing group of six people, we just can't do all that manually; there's too much to organize, handle, and set up. We want to solve that with .NET code, but we don't want to write a ton of extra .NET code either, because then we'd have to write it, debug it, maintain it, and figure out why it didn't run on a leap year. So we really want to be strategic about what we do, and we have built a serverless army powered by .NET.

A great example of this is the .NET website. It gets tens of millions of views in a month, and it's built completely on .NET technologies. On the front end we've got ASP.NET Razor Pages and Blazor (Blazor using WebAssembly), we've got Azure Search, and behind the scenes we've got a ton of backend services.
One of the newest community projects we've launched is .NET Live TV. You can think of it as live news for .NET developers: it's got live streams and events going on all the time, and live chats with us and with the programming and engineering teams. It has a really interesting backstory, too. This started way back when .NET Core was first launching, in the 2014-2015 timeframe, and we started doing these Google Hangouts calls. It was Scott Hanselman, Damian Edwards, and myself, and it was just random Google Hangouts: we'd chat, throw it out on Twitter, and see who showed up. After a while Damian put together a website; it was some of the first stuff that prototyped Razor Pages, and it was a really cool website, but there was still a lot of manual work. If you didn't set things up just right, if you didn't name them just right, the show wouldn't work; it was a catastrophe. It took tons of manual updating, and it was very specific to the ASP.NET Community Standup; it was one show.

And before I was Jon's manager, I was super jealous of everything Jon was doing with the ASP.NET Community Standup. I worked on the Xamarin team and I basically wanted to copy everything he was doing. I set up a call with Jon and said, I want to steal all your code and do all the things, and then we said, hey, why don't we just do this for everything? Because .NET truly is your platform to build anything: not just web apps, not just mobile applications, but truly anything. So we went off and talked to all the PMs and engineers and said, why don't we set up community standups for all of the shows? Granted, we didn't have the infrastructure or really know how to build this thing, but we could do it; I mean, I've never built a website, but Jon has, so we could totally build this, right?

So, funnily enough, we got on Twitch and started live streaming, for about six weeks, one day a week, and we built an entire .NET Community Standup web page, completely open source. It pulled from a single YouTube playlist as the source of truth, it had a countdown timer, and it parsed all the show information into different categories. We were feeling really good; we had people in the studio at Channel 9, and it was like, wow, this is awesome. And then COVID hit, and we thought, well, we've got standups, we've got to figure out what we want to do. All of a sudden team members from other teams said, we love what you're doing, but we want our own shows, we want to do more shows, we want to engage with the community more. So we said, let's go crazy with this thing: not just a .NET Community Standup with a bunch of different shows, but let's build .NET Live TV, which at this point has eight different shows, and we're streaming ten times a week on the .NET YouTube, Twitch, and Learn TV.

So we said, well, how are we going to build this thing? I haven't built a real production application in a while, so let's go into requirements mode. What do we need this thing to do? We need to normalize the data into a single API: YouTube is one source of data where we schedule shows, but we also schedule shows on Twitch and on Learn TV and other places, and we want to highlight other community members, so we need to be able to update that schedule from not only YouTube but also Blob Storage.
We want upcoming shows and completed shows, and we want a live show view so we know if something is streaming right now and can display that information. We need to be able to scale easily: maybe right now we have 10 shows, but we have 20 shows next week because everyone wants to start live streaming, and we want to be able to add those easily. And of course we want metrics: we want to see how our shows are doing, how many comments there are, how the engagement is going, and generate those sweet, sweet reports for all of our bosses.

The first part is the backend for this website. It's an Azure Function, running with .NET, and it's pinging against some backend data sources: it's got the YouTube API and it's also got JSON files. The YouTube API rate-limits us, so by setting this up in a function we're able to consolidate and minimize our calls to it, and then it shares that information over to Azure Search. Azure Search is already being used by the .NET website, so this was a really easy integration. Then, when we want to surface this on the website, we need a backend API, and fortunately that was really easy to set up with that backend infrastructure. We've got a Blazor WebAssembly front end, and it calls into a REST API powered by Azure Functions with .NET, and that calls into Azure Search and gets us a JSON blob of data. What's really cool is that the front end has to do almost no work: it just gets this pre-digested, perfectly set up information and displays it, so no matter how much traffic the front end gets, it scales really well. Once we've got that information, it lights up all this cool stuff on the front end: we've got one API endpoint that tells us when there's a live show, and it'll automatically flash at the top of the website; once you're there, we want you to see our other shows, so it lights up the list of all our different shows; and then we make it easy to find our upcoming and recently completed shows, and that's all right there as well. It's really cool that we started with one function, but it scaled out to around ten different functions that all do different little pieces of functionality, both publicly available and some admin functionality too.

On top of that we have the website, and then the important part is to get all the analytics and all that data. The analytics are actually really interesting, because we're streaming to a bunch of different places, which means we need to integrate with all of the different data sources. We use Power Automate here (and there is a battle royale later; we use both Power Automate and Logic Apps, which is fun, based on the purpose). We have a Power Automate flow that is scheduled to run every night, and what it does is ping a bunch of different services and normalize the data. The first thing it does: we have yet another Azure Function, powered by C# and .NET, that calls into the YouTube API, gets all of the on-demand views and all the chats, likes, and comments, and shoves that into a Cosmos DB database, normalizing the data, so updates, deletes, reads, and all of that.
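The nightly flow just described (call the YouTube API, then upsert the stats into Cosmos DB) is easy to picture as a small script. The team's real version is a C#/.NET Azure Function driven by Power Automate; the sketch below is only a minimal Python equivalent of the same pattern, and the environment variable, database, and container names are placeholders.

```python
# Minimal sketch of the "pull YouTube stats into Cosmos DB" step. Names and
# keys are hypothetical; the real implementation is a C#/.NET Azure Function.
import os
import requests
from azure.cosmos import CosmosClient

YOUTUBE_API = "https://www.googleapis.com/youtube/v3/videos"

def fetch_stats(video_ids):
    """Return view/like/comment counts for a batch of video ids."""
    resp = requests.get(YOUTUBE_API, params={
        "part": "statistics,snippet",
        "id": ",".join(video_ids),
        "key": os.environ["YOUTUBE_API_KEY"],   # kept out of source control
    })
    resp.raise_for_status()
    return resp.json()["items"]

def upsert_stats(video_ids):
    client = CosmosClient(os.environ["COSMOS_URL"], os.environ["COSMOS_KEY"])
    # Assumes a "livetv" database with a "showstats" container partitioned on /id.
    container = client.get_database_client("livetv").get_container_client("showstats")
    for item in fetch_stats(video_ids):
        container.upsert_item({      # upsert = insert or update, so nightly reruns are safe
            "id": item["id"],
            "title": item["snippet"]["title"],
            "views": int(item["statistics"].get("viewCount", 0)),
            "likes": int(item["statistics"].get("likeCount", 0)),
            "comments": int(item["statistics"].get("commentCount", 0)),
        })
```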
We also have some Excel spreadsheets with live information coming in on the shows, so that's just another data source, because everybody loves Excel, and the flow normalizes that data into Cosmos as well. We also have internal data sources, and we use Kusto queries in that Power Automate flow, and again we shove that into the Cosmos DB database and then generate a beautiful Power BI dashboard that we can send to all of our bosses. This is actually what it looks like; this is one of ours. As you can see, I'm a Power BI expert, as I shake my head no (you can't see me, thanks Lamont for hiding my face). But I was able to take all that data, generate it all here, and see how our shows are doing over time. And of course all this data is public data: you could go to every single Twitch and YouTube channel and look it up, but if you're one of our 20 or 30 hosts you don't want to do that; you want to go to one place, and that's here.

While we do have this beautiful Power BI dashboard, you've got to go load up Power BI to see it, so we said, let's automate and make it even easier to give direct feedback to all the show hosts. So we created weekly reports. This is cool because we're reusing all this data and intelligence: we have yet another Power Automate flow that runs every week, pings that Cosmos DB database, normalizes and figures out the last two weeks of show data, and sends a Teams message and an email every single week to all of the live streamers. This is what that looks like; this is actually a screenshot from our Teams channel, where anyone can come in and see the data and, of course, go into Power BI for more information.

So this was one of our first ones and it worked really well for us, and once we had the code and the patterns down, we started seeing opportunities all over the place. One that popped up recently is .NET events. The .NET website, now that we had live shows on it, was starting to become a very dynamic community place to hang out, so we thought it was natural to make it easy to find events going on around the globe as well. First of all we thought we'd need to create our own API, but it turned out there already were some data sources internal to Microsoft where we were tracking who's going to which conference and which conferences we're sponsoring; however, they covered all conferences, not just .NET, so we needed to filter and normalize that data a bit. We also wanted to be able to call out some featured events at the top of the screen, things like .NET Conf that are really important, and we didn't want to be doing all this every time you hit our website, so we wanted a data cache that would keep all that information.

Here's how we went about doing that. First of all, we dusted off our code for our Azure Functions, but this time I decided to use an Azure Function using .NET 5 in isolated mode. One of the reasons for that was that it made it really easy to integrate with SQL Server, so I'm able to use SQL Server with Entity Framework and connect directly. For our featured events we're using a JSON blob that's just over in a GitHub repo, and the reason for this is that it makes it very easy for team members to submit a pull request and update it there.
Then this goes through, gets all that information, and puts it in a JSON blob, using that pattern again. Yeah, we love this pattern, if you can't tell; it's one thing you'll see over and over again. But I think this is really cool because of this isolated mode. Jon, this thing is amazing. Oh, it's crazy. Some of the things I love about isolated mode: I was able to use all my ASP.NET Core skills, so I could use ASP.NET Core configuration, and I could use user secrets, which are awesome. That uses a JSON file on my local disk, but it's not in the source code directory, so I can't accidentally check it into GitHub (hands up if you've ever done that), and it can overlay that when I put it up in production, where those user secrets are overridden by environment variables in Azure. And hooking up Entity Framework is super easy; it's the two lines of code I've written a million times, and I just do that again.

So here's the actual function. This is the thing that runs on a schedule and goes and gets those upcoming events. You'll see the token replacement there, the update events timer, and that's able to read directly from my local settings on my computer and then the Azure settings in production. And here's the workhorse function, the brains of the operation; we've got it separated into its own function so it can be pinged via a REST call or run on a timer. You'll see at the top I've got an EF query with a LINQ filter to make sure that we're only getting the .NET events, and the ones we're ready to put up on the website. After we've got that, it's super easy to process: we serialize it to a JSON blob and save it off to storage, again using that pattern James and I have been talking about, where we have that JSON cache; so, for instance, if we're unable to hit the database, it doesn't matter, we can still respond with the information. Then at the bottom we also quickly run over, grab that JSON blob from the GitHub repo, and store that information too. So then, when it's time to retrieve data, a very simple ASP.NET Core page is able to call into an HTTP endpoint exposed by that .NET 5 REST API, and it grabs the JSON data and just exposes it.

One really cool thing I was able to do as part of this is a new feature they had just enabled: OpenAPI generation. This will generate a Swagger document that is great for documentation and for sharing with clients and other developers, which I did; I sent the team a link to our Swagger. You'll see it's just a few attributes I had to set up at the top of the page; I tell it the body type, and it's able to generate the Swagger from that. A really cool thing I also get out of this is a Swagger page: it actually generates the documentation, so I was able to send it over to the team and say, this is the API we need to support, and when you click on a specific function it shows you all the different details, and you can actually click links to try it out and test it. The end result is this page, and what's really magical about it is we don't have to do any work; it's automatically maintained. You'll see those two called-out events at the top of the page: those are maintained just by putting some information in a JSON blob over in GitHub, and it all just works.
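The function itself appears only on the presenters' slides. As a rough, language-agnostic sketch of the same query-filter-serialize-cache pattern (the real implementation is a .NET 5 isolated-worker function using Entity Framework), here is a Python version; the table and column names, blob names, and the GitHub raw URL are placeholders.

```python
# Sketch of "query, filter, serialize, cache to blob" plus "grab the featured
# JSON from GitHub". Not the team's code; names are illustrative.
import json
import os
import pyodbc
import requests
from azure.storage.blob import BlobServiceClient

FEATURED_URL = "https://raw.githubusercontent.com/example/dotnet-events/main/featured.json"  # placeholder

def refresh_event_cache():
    blob_service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION"])
    container = blob_service.get_container_client("event-cache")

    # 1. Query the internal events database, keeping only approved .NET events
    #    (the .NET code does this with an EF query and a LINQ filter).
    with pyodbc.connect(os.environ["SQL_CONNECTION"]) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT Name, Url, StartDate, EndDate FROM Events "
            "WHERE Product = ? AND IsApproved = 1", ".NET")
        events = [
            {"name": r.Name, "url": r.Url,
             "start": r.StartDate.isoformat(), "end": r.EndDate.isoformat()}
            for r in cursor.fetchall()
        ]

    # 2. Serialize to a JSON blob so the website never has to hit the database.
    container.upload_blob("events.json", json.dumps(events), overwrite=True)

    # 3. Cache the featured-events JSON maintained via pull requests on GitHub.
    featured = requests.get(FEATURED_URL).json()
    container.upload_blob("featured.json", json.dumps(featured), overwrite=True)
```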
I think this is really cool because on the back end we can use that standard ASP.NET logic we know and love, but Azure Functions hides all of the authentication and configuration you need, because you can put all of your connection strings, your username, your password, any of that information into environment variables. None of that stuff is checked into source code or hard-coded; it's all environment variables, so we can easily pivot back and forth. Like we said earlier, the .NET website is ginormous and it's many, many pieces of technology, so we wanted to make sure that if we're having an event and marketing sends hundreds of thousands of people to Live TV or to the events page, the rest of the website stays up and is good to go. These are all deployed into different resource groups in the back end, and they can all communicate with each other, so if one part of the website is getting clobbered with traffic, everything else stays up and is ready to go. I absolutely love this architecture; it really simplifies development, especially when you're building at scale.

Now, we've talked a lot about the .NET website pieces, with .NET Live TV and the events page, but I also wanted to talk about our blogging system, because this is actually a really interesting problem to solve: how do blog articles get onto the blog? Jon, I know you're saying, you probably just log into a CMS, write some blog posts, and schedule them, done, right? No, no, no; we're engineers, it can't be that simple, that's not allowed. The problem is that tons of people are blogging all the time on the .NET blog, and we wanted to create an engineering system for the blog. So I worked with Immo Landwerth and Jamie Singleton from my team, and we asked, how do we make blogging easier for anybody? We want an easy authoring experience: we don't want people to have to log into a system and write a bunch of HTML; we want them to write in a format they feel familiar with. We want the review process to be easy: if you've ever used a CMS, you can't leave comments, you can't make suggestions, and we're engineers, we want to do the things we're already doing in GitHub or Azure DevOps or wherever we're coding. This is a big deal with these .NET blogs; some of them get hundreds of review comments, so we really need a good review system. A lot of times these blogs are scheduled a month ahead of time, so we want that long process to work for everyone involved, we want to make sure blogs are scheduled correctly on the calendar, and of course we've got to have sweet reports.

So this is the actual authoring experience we use internally for the .NET blog: we use GitHub. It is our one source for all things blogs, and it's easy. Every single .NET team member has access to the .NET blog repo, and they just write a markdown file, because that's how we code, right? We write markdown to document things; that is how you blog. Then you make a pull request, and that's it: the review process starts. There's a bunch of automation there: GitHub Actions runs a bunch of .NET code that validates all the blog entries, and I'll show some of that code in a second.
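The validator itself is .NET code run by GitHub Actions with roughly two dozen rules, and it isn't shown in the transcript. Purely as an illustration of what one such rule might look like, here is a small Python sketch that flags locale-specific documentation links (one of the checks mentioned in the talk); the regex, file handling, and rule name are assumptions, not the team's code.

```python
import re
import sys

# Hypothetical rule: docs links should be locale-neutral so readers land on
# documentation in their own language (e.g. no "/en-us/" segment in the URL).
LOCALE_LINK = re.compile(r"https?://(?:learn|docs)\.microsoft\.com/([a-z]{2}-[a-z]{2})/",
                         re.IGNORECASE)

def check_locale_links(markdown: str) -> list[str]:
    """Return a warning for every locale-specific docs URL in a blog post."""
    return [
        f"line {i}: locale-specific docs link ({m.group(1)}); drop the locale segment"
        for i, line in enumerate(markdown.splitlines(), start=1)
        if (m := LOCALE_LINK.search(line))
    ]

if __name__ == "__main__":
    post = open(sys.argv[1], encoding="utf-8").read()
    problems = check_locale_links(post)
    print("\n".join(problems) or "ok")
    sys.exit(1 if problems else 0)    # a non-zero exit fails the CI check
```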
When the blog is validated and ready to go, it hits a Logic App and sends it information to start the scheduling process. What's really cool is that Immo actually wrote a .NET command-line tool that can run anywhere to help streamline this process. Anyone can clone the repo and just say "new post", which runs a .NET script, and you just enter the information: what's the name of the post, what do you want the slug to be, when's the publication date, and which categories, and it scaffolds it out for you. Then you just write the blog post. Here's Stephen Toub's amazing blog post on .NET 6 performance improvements, and it's markdown: you just write markdown, you do the thing, and he created a pull request with it. You can see there are 119 conversation pieces on this single blog post, because it's ginormous (it's amazing; if you haven't read it, it's absolutely astonishing).

When that PR starts, and when it's merged, there's a validation flow that happens. We have a planner workflow that calls the Logic App and a validator, and that validator is just a big .NET script that gets run in GitHub Actions; it's all automated, which is really cool. There are about 24 different rules that get automatically validated: are images accessible, are they the right size, making sure any language-specific URLs aren't in there (so if you're in Spain, it goes to the correct documentation URL for you). It's really nice, it's all automated; Immo wrote that, he's amazing.

Once it's validated, it kicks off this Logic App, and this Logic App is in charge of putting things on our planner, which is like a kanban board but with calendar integration too. There's a lot that happens when this Logic App kicks off, because it needs to connect the GitHub pull request to a Planner ID, and it needs to figure out: do I need to create a task on the planner, update a task, delete a task, or merge it? The GitHub Action sends information about the pull request to the Logic App, which has a webhook it's listening on, and it parses all this information, stores the connection between the Planner task and the GitHub pull request in Table Storage, and then does a bunch of operations on the planner. And this is what that looks like; look at that thing, it's beautiful. I get shivers of joy whenever I see this Logic App that I wrote. I'm not a Logic App pro, but look at this thing.

Okay, let me zoom in and show you what's actually happening here. The first thing is that there are a bunch of different ifs and conditions to figure out what it needs to do. One possibility is that a pull request has been closed and not merged; it figures that out based on the information passed in and whether there's anything in Table Storage already, and if so it deletes the task on the planner board and deletes the entry in Table Storage. Another possibility is that the post has been merged, it's ready to go, and it already exists: then it moves that planner ticket over to another bucket and assigns the task to the individuals who schedule it inside our CMS. It could also just be an update, maybe they changed the due date or the title, and it updates that automatically. Otherwise, if it's brand new, it will go through, create the task, add details, and then update Table Storage automatically. So it creates this beautiful flow for what it's doing.
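The routing itself lives in Logic App conditions rather than code. Purely for illustration, the same branching can be written down compactly; the sketch below is Python with hypothetical field names for the webhook payload, not the team's actual flow.

```python
def plan_action(event: dict, existing_task_id: str | None) -> str:
    """Return which Planner operation the flow should perform for a blog PR event."""
    if event["action"] == "closed" and not event["merged"]:
        # Abandoned PR: remove the Planner card and the Table Storage mapping.
        return "delete-task-and-mapping" if existing_task_id else "ignore"
    if event["merged"] and existing_task_id:
        # Merged and already tracked: move the card to the next bucket and
        # assign the people who schedule the post in the CMS.
        return "move-to-next-bucket-and-assign"
    if existing_task_id:
        # Plain update (new due date or title): sync the existing card.
        return "update-title-and-due-date"
    # Brand-new post: create a card and store the PR -> task mapping.
    return "create-task-and-store-mapping"

# Example: a merged PR that already has a Planner card
print(plan_action({"action": "closed", "merged": True}, existing_task_id="task-123"))
```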
And this is what it looks like. Here, in the example of Stephen's post, we see that a new entry has been added at the top of this blog draft-and-process board. When it's merged, or if it's just been updated, it updates these dates, but once it's merged it moves over to SEO review and attaches the other individuals that need to review it. And then of course there's also this nice planner board, so we can schedule out our blog posts automatically, which is really nice.

Now, once the blog post is up and good to go, we need reports, and there's a whole bunch of really cool things that happen here. Whenever a blog post is published and that ticket is closed on the planner, that kicks off a Power Automate hook that updates our internal Azure DevOps board for tracking. We also use DevOps as sort of a database, in a way, because every night we have a Power Automate flow that calls into our Kusto service to get information about blog views and updates that ticket automatically on DevOps. Then we have another one every week that queries the Azure DevOps board and generates a Teams message and an email. This is what that looks like, very similar to our live stream stuff: we see all the different blog posts, and the authors can click on them and see their views (I've hidden them here because that's internal information).

These are just three examples of things we've automated that you get to see every single day on the .NET website and enjoy on the blog, but we automate all the things because we're ridiculous. We love this automation so much that I can't help myself: I even automate our team stand-up, because I don't think it's fair if you just go alphabetical every single week, so I have a Power Automate flow that calls into an Azure Function that randomly sorts a bunch of names and posts the order to our Teams channel. And to make sure our stand-ups are streamlined and no one talks about the weather every morning, I generate a fully automated weather report, not only with the weather but the moon conditions, the precipitation percentage, and the air quality as well, which all call into different back ends, which is really cool.

And once you've got this, you can apply it to all kinds of things. We have these .NET Conf community events all over the world. In the past there was a lot of manual work to review them, make sure they're in the right date ranges, and get them scheduled. We've updated this so you fill in a Microsoft Form and it automatically goes live: you submit the form, a function runs, it does some validation to make sure it's in the right date range, it checks who to notify, and then it adds it to an Excel spreadsheet (because, as James mentioned, we love Excel), puts it on a planner board for easy management by the people helping with the events, and sends out notification emails. Then we need to geolocate the events to put them on the map, so a function runs, hits the Bing Maps API for geolocation, and stores that information. This is important because we're rate-limited on that Bing Maps API, so we're not hitting it unnecessarily.
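The "geocode once and cache it" step just described can be sketched in a few lines. This is not the team's .NET code; the Bing Maps endpoint and response shape follow the public Locations REST API but should be checked against the docs, and the table name and entity layout are assumptions.

```python
import os
import requests
from azure.core.exceptions import ResourceNotFoundError
from azure.data.tables import TableClient

GEOCODE_URL = "http://dev.virtualearth.net/REST/v1/Locations"   # Bing Maps Locations API

def geocode_event(event_id: str, location_text: str):
    table = TableClient.from_connection_string(
        os.environ["STORAGE_CONNECTION"], table_name="eventlocations")

    # Return cached coordinates if we've already geocoded this event,
    # so the rate-limited Bing Maps API isn't hit unnecessarily.
    try:
        cached = table.get_entity(partition_key="event", row_key=event_id)
        return cached["lat"], cached["lon"]
    except ResourceNotFoundError:
        pass

    resp = requests.get(GEOCODE_URL, params={
        "query": location_text,
        "key": os.environ["BING_MAPS_KEY"],
    })
    resp.raise_for_status()
    lat, lon = resp.json()["resourceSets"][0]["resources"][0]["point"]["coordinates"]

    table.upsert_entity({"PartitionKey": "event", "RowKey": event_id,
                         "lat": lat, "lon": lon})
    return lat, lon
```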
Then, for the website to display it, it's super easy: a .NET function exposes an HTTP endpoint and returns JSON to an ASP.NET web page.

That's right. I literally love this stuff so much that once I got into it, I said, what else can I make serverless? At the beginning of the pandemic I got super into Animal Crossing on my Nintendo Switch, and I needed to figure out how to share my turnip prices with all my friends, so I created a completely serverless application called Island Tracker; it's on the app store, it's free, if you're into Animal Crossing. Every single API, the authentication, the back end, the updates, everything is 100% Azure Functions, using Table Storage on the back end for all of the user accounts and information, one hundred percent written in C# with .NET, including the mobile applications, which are all written with Xamarin.Forms. So truly you can build absolutely anything, and you can go off and build an awesome serverless army powered by these amazing services and .NET. If you want to learn more, go to dot.net for all the .NET goodness, including all of this amazing stuff you saw here today. Thanks for watching, and have a great Azure Serverless Conference.

That was fabulous, guys; that was so exciting. It's a lot of fun. We had a lot of engagement and a couple of questions, and I think we have time to take one. One of the questions was: for real-time API calls, are you using Azure Functions, and what about cold starts? Are you using Premium plans, maybe? Yeah, it's a great question. For the most part it's not really a problem, partly because we also cache our services on the front end, so we allow enough time; but usually these functions are so lightweight, they're doing so little work because they're using this pre-built JSON blob system, that they can return the information very quickly. So because the website or the consumer is caching that information, and we've got long enough timeouts, it hasn't been a problem.

I think if we have time there was one other question: do you have documentation or a blog explaining how to use a service principal to access blob storage? I think that's something you might be able to answer in the chat; that's a long answer, yeah. But I will say that one thing it reminds me of is that we write almost no code to handle authentication, which we love. We handle all that using Power Automate: we just pre-authenticate with our services and then we don't have to write code to handle it. And that's the beautiful part of using Logic Apps and Power Automate: I'm already signed in, I'm already authenticated with Azure or with Power Automate, which means when I'm connecting to Cosmos DB, or to an Excel spreadsheet, or to Teams, I'm already logged in; I literally don't write any code to do that, and it knows everything about me. That's one of my favorite parts about using that type of automation when you're integrating with so many different systems. Could I go and read an Excel spreadsheet and parse it myself? I could, but I'm just going to let Power Automate and Logic Apps do it for me.

You all made me want to come follow your Twitch channel now, because honestly, I'm not even kidding, this is the most amazing keynote talk ever; I'm going to write this whole thing down, I found so many reusable design patterns. I hope you can stick around in the chat and answer a bunch of questions, but I think, Melody, we need to move on and show the interstitial video now. Thank you so much for an amazing keynote. Should we do that? Yeah, I think that would be great.
Awesome. We're going to talk Azure Functions, so let's get a quick review of what that means.

Up until now, the problem with building applications has been that before you could even start, you had to choose a framework, learn how to deploy that framework onto servers, then manage and maintain those servers over time. Not exactly easy, quick, or efficient. Thankfully, now there's Azure Functions, the easy way to build the applications you need using simple serverless functions that scale automatically to meet demand, with no worrying about infrastructure or provisioning servers. Whether you're new to development or a seasoned pro, within a few minutes Azure Functions helps you create applications that accept HTTP requests for a website, process product orders via queue messages, react to data flowing from devices like an IoT hub, or just run simple scheduled tasks. And since Azure Functions scales automatically, all you need to do is write your function logic; Azure Functions handles the rest. What's more, you only pay for what you use. With Azure Functions you'll write less code, manage less infrastructure, and have a lot less upfront cost. See for yourself at functions.azure.com and try it free; you'll see that making applications has never been easier with Azure Functions.
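As a concrete illustration of the "just write your function logic" idea from the video above, here is a minimal HTTP-triggered Azure Function using the Python v2 programming model; the route name and greeting are arbitrary, and this sketch is not part of the conference demos.

```python
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="hello")   # served at /api/hello once deployed
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # Functions handles hosting, scaling, and per-execution billing;
    # the only code we own is this handler.
    name = req.params.get("name", "serverless world")
    return func.HttpResponse(f"Hello, {name}!")
```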
Well, that was really good. Between that keynote of automation and now Azure Functions, I really think we're ready to move on to the next segment: media processing workflows with Azure serverless, with Braden and Jason. Are you guys ready? Yeah, absolutely. All right, we'll get started.

Hi, my name is Braden Riggs; I'm part of the developer relations team here at Dolby.io. Hi, I'm Jason, and I lead that team. Jason is being really modest; he's been a mentor and has really helped me through this project from the beginning, and he's here to help answer questions while I present. Before we get started, if anybody has questions, please post them in the chat as we go along, along with what your use case is, and when we get to the end, or maybe a little during the presentation, we can chat about them.

Before we dive into serverless media workflows, let's quickly talk about what Dolby.io is. Dolby.io is a self-service platform that provides a number of APIs that allow developers to build communications, media, and streaming solutions for things like podcasting and telehealth. The APIs come in two flavors: our Communications APIs, which focus on real-time streaming and communication technology (think building your own live streaming platform), and our media processing technology, which offers solutions such as music mastering, audio analysis, and audio correction and enhancement. Dolby.io is really just taking the signal processing technology that Dolby itself has been using in cinemas and music studios and packaging it up in a way that allows developers to deliver it in their products.

So why are we, Dolby.io, here, and why are we talking about serverless technology? As I mentioned, the Dolby.io team has been developing a series of media processing APIs, some of which can be used for data generation tasks, for things like speaker diarization, number of talkers, and quality scoring by talker. I've been working on a project where I take large collections of sports podcast recordings and use these generation tools to create datasets that can give insight into macro trends in podcast creation. We take these podcasts, analyze them with our media tools, and aggregate the results into datasets we can use to explore trends in podcast production.

Now, this project started locally, but as I'm sure you can guess, the storage and processing requirements quickly ballooned to a size that was no longer appropriate for local development. Additionally, our APIs generate insight specifically from the signals in the audio, and hence we don't create transcriptions of the spoken words; for this project I was also interested in exploring the relationships between spoken words and sports podcasts, and hence I wanted to create transcriptions of the audio data. As the project grew in scope, I began looking at cloud environments where I could scale it up to align with my goals, and this is what got me started with Azure serverless and developing media processing workflows.

To get a little more specific: what were my goals for this workflow, and what did I need to do? First, the workflow needed to have a filter of some sort to distinguish quality audio from poor audio, because sports podcasts especially can vary wildly in quality, particularly when shows have several guest speakers, each calling in from a phone or using microphones of varying quality. It doesn't make a ton of sense to transcribe low-quality audio, as the results will likely be inconsistent and noisy, so the first requirement was a filter to ensure we're putting transcription to good use: media designated as quality would be transcribed and added to our dataset. Secondly, we want to keep the low-quality audio and clean it up using some noise reduction tools, so we can take the designated low-quality audio and manually explore options to trim out the bad sections or review it for quality checks. Third and finally, I wanted this whole process to take place in the cloud, in an effort to save my poor computer from having to store all this media and process the output transcriptions.

Of course, I'm here today because we were able to accomplish this whole process leveraging the power of Azure serverless technology. In fact, as we're going to show, we didn't need to set up a virtual machine; we were able to create a tool that can do either one job a month or scale to handle thousands of jobs a day, without us really having to change anything at all, and at a really reasonable price.

Before I talk about what the serverless process looks like, let's discuss a few of the individual tools we used to accomplish the tasks we outlined. To solve the first challenge, we can leverage a Dolby.io API called Diagnose. The Diagnose API is a tool that can perform lightweight analysis on media for a quick evaluation of the quality of the audio, along with other metrics and metadata. This allows us to route media through the Dolby.io servers to return a quality score for the audio, which we can then use to filter the media into a transcription group and a correction group.
So what does implementing a tool like this look like? In Python, using the Dolby.io Diagnose REST API, it's pretty easy. We start by formatting a header with our Dolby.io Media API key, which you get when you make an account, and we also include the URL that directs the request to the appropriate Dolby.io server, in this case our Diagnose server. Next we format a body, and for this stage we provide a URL that points to the location of the media we want to diagnose; in this case that URL is our pre-signed URL for our Azure media storage. Finally, we post this request. If everything's formatted correctly, you receive a job ID, and after waiting a few seconds to a few minutes, depending on the size of the media, we can use this job ID to get back results relating to specific audio problems, such as speech sibilance and silent channels, as well as a quality score (pictured on screen) that we can use in our workflow to decide whether we want to proceed with a transcription.

The audio score is on a scale of one to ten, where one is poor audio recorded on a low-quality mic and ten is excellent audio quality, likely professionally produced. In this example, six is pretty average: not good, but average. However, since we're spending the computational power to transcribe audio, we want to be accurate, and hence we don't want to transcribe anything with a quality score of less than seven. It's worth noting there's nothing special about this threshold of seven; it's just what seemed to work best in our testing for transcribing sports podcasts.
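The talk describes this request without showing the code; the sketch below is a rough Python reconstruction of the flow just described. The endpoint path, header name, and response shape are assumptions based on the Dolby.io Media API as the presenter describes it and should be checked against the current docs; the pre-signed input URL is a placeholder.

```python
import os
import time
import requests

DIAGNOSE_URL = "https://api.dolby.com/media/diagnose"   # check the current Dolby.io Media API docs
HEADERS = {"x-api-key": os.environ["DOLBYIO_API_KEY"]}  # key from your Dolby.io account

def diagnose(presigned_input_url: str) -> dict:
    """Submit a Diagnose job for a media file and poll until results are ready."""
    # 1. Post the job: the body points at where the media lives
    #    (here, a pre-signed/SAS URL into Azure Blob Storage).
    job = requests.post(DIAGNOSE_URL, headers=HEADERS,
                        json={"input": presigned_input_url})
    job.raise_for_status()
    job_id = job.json()["job_id"]

    # 2. Poll with the job id until the analysis finishes (seconds to minutes,
    #    depending on the length of the media).
    while True:
        status = requests.get(DIAGNOSE_URL, headers=HEADERS, params={"job_id": job_id})
        status.raise_for_status()
        body = status.json()
        if body.get("status") in ("Success", "Failed"):
            return body        # includes audio metrics and a 1-10 quality score
        time.sleep(10)

# In the workflow, the returned quality score is compared against the
# threshold of 7 to decide between transcription and enhancement.
```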
If the audio quality scores below our set threshold, we instead need a tool to clean up the audio so we can experiment with alternatives to transcription. The Dolby.io Enhance API is excellent for this task, as it automatically reduces noise, levels speech, and corrects loudness, among other things. Much like the Diagnose API, the Enhance API is formatted in a similar way. In this case we create a body that includes an input and an output location, because Enhance outputs an enhanced file, as opposed to Diagnose, which just returns data about the media; here both the input and the output URLs are pre-signed Azure Blob Storage URLs. We also include a URL that points to the Dolby.io media enhancement API, we format the header the same way as we did for Diagnose, and we post. In this case we don't need to track a job ID, because when it's done we'll have an updated file in our container.

So what exactly is this Enhance API doing? Here I have an example spectrogram of one of FDR's early speeches, recorded in 1941. A spectrogram is a way of visualizing audio as a series of signal strengths over time. If we take note of this particular segment (let me zoom in so it's a bit easier to see), we can see that in the enhanced audio the spikes of signal are more distinct from the background. Those spikes of signal are spoken words, so what the Enhance API has done is separate the signal from the noise, and this is useful for further experimentation, where we can look to clean up the audio, maybe trim out some of the bad parts, or even explore alternatives to transcription altogether.

Our third tool is the Azure Cognitive Services Speech-to-Text REST API. Once we've filtered our audio and are interested in getting a transcription, we can use this API to generate our text data. Speech-to-Text works by formatting a header with our Speech-to-Text API key and by specifying, in the URL, the region-dependent server we want to post the request to. We also format a body, and in this body we specify some parameters, such as the location of our file and the container we wish to output the results into. Then, finally, we post this request and let the speech-to-text magic happen. Once the transcription is done, the results are output into a JSON file in the specified container, and we get a few different candidate transcriptions and can pick the one that works best for our results.

Okay, so those are our tools; now how do we put them all together in a function? Pictured on screen is a diagram of how all these pieces fit into a serverless function: we have our Diagnose tool, which we use to distinguish quality audio from poor audio; we have our Enhance tool, which we use to clean up the poor audio for further research; and we have our transcription tool, which we use to generate a transcription of the media that has a diagnosed score above that threshold of seven. Using these tools in isolation is one thing, but part of this project is to create an environment that can scale to handle hundreds of jobs, all asynchronously, so let's talk a bit about the tools we used to accomplish the serverless part, and then we can come back and have another look at the diagram.

For this project we decided to wrap it all up in an Azure serverless function, in one part because we were already using the Azure Speech-to-Text tool, and in another part because the process to "hello world", or rather getting started, was so quick and intuitive that I could focus on the complex parts of the project without stressing about building environments. Specifically, we initialized this project in a Python environment and built it using VS Code; very quickly I was able to create the environment I needed to accomplish this media workflow. Some of the tools that became especially handy were triggers for handling the workflow. There are two triggers of specific interest in our use case: the HTTP trigger, which would be perfect if we developed our project to work as a web app, and the blob trigger, which can function as a cloud-based hot folder, where every time I place a file there it automatically starts the whole media workflow. We then manage the job flow asynchronously using callbacks, which can fire the HTTP trigger to move on to the next stage of the workflow, whether that's transcribing the audio or enhancing it.

Great, so now we have all the tools we need to build the workflow: we use triggers and callbacks to help direct the workflow, and we use tools such as Diagnose and Speech-to-Text to accomplish its goals. Using serverless, we can wrap all these tools together into a single function, which can be triggered on a hot folder or via an HTTP trigger, and this workflow can then be integrated into a web app if we want to make it a public tool. You can see on screen we have our app in the bottom left corner. This app either triggers an HTTP request or uploads a file to an Azure hot folder. That event triggers the serverless function, which begins the workflow by diagnosing the audio. As the Diagnose job completes, it triggers a callback, which brings us back around to the media workflow function. Using the diagnosed audio quality score, the workflow function can either decide to enhance the low-quality audio, which again triggers a callback on completion and signals that the job is done, or it will send the audio to the Speech-to-Text tool, which will then provide a notification of job completion upon transcription.
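The function code itself isn't shown in the session. Below is a rough sketch of what the hot-folder entry point and the callback route could look like with the Python programming model for Azure Functions, which the speakers say they used. Container names, environment variable names, the Dolby.io endpoint (carried over from the earlier sketch), and the callback payload field are all assumptions; the enhance and transcription branches are reduced to comments.

```python
import os
import logging
from datetime import datetime, timedelta

import requests
import azure.functions as func
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

app = func.FunctionApp()
QUALITY_THRESHOLD = 7   # scores below this go to Enhance instead of transcription


def read_sas_url(blob_path: str) -> str:
    """Build a short-lived read-only SAS URL so Dolby.io can fetch the blob."""
    account = os.environ["STORAGE_ACCOUNT"]
    container, _, name = blob_path.partition("/")
    sas = generate_blob_sas(
        account_name=account, container_name=container, blob_name=name,
        account_key=os.environ["STORAGE_KEY"],
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=2))
    return f"https://{account}.blob.core.windows.net/{container}/{name}?{sas}"


@app.blob_trigger(arg_name="media", path="incoming-podcasts/{name}",
                  connection="AzureWebJobsStorage")
def start_workflow(media: func.InputStream):
    """Hot folder: every file dropped in the container kicks off a Diagnose job."""
    logging.info("New media file: %s (%s bytes)", media.name, media.length)
    resp = requests.post("https://api.dolby.com/media/diagnose",       # illustrative endpoint
                         headers={"x-api-key": os.environ["DOLBYIO_API_KEY"]},
                         json={"input": read_sas_url(media.name)})
    resp.raise_for_status()
    logging.info("Diagnose job submitted: %s", resp.json().get("job_id"))
    # A callback or poller then reports the result to the HTTP route below.


@app.route(route="diagnose-complete")
def on_diagnose_complete(req: func.HttpRequest) -> func.HttpResponse:
    """Callback target: route the file to transcription or enhancement by score."""
    result = req.get_json()
    score = result["quality_score"]     # field name is illustrative
    if score >= QUALITY_THRESHOLD:
        logging.info("Score %s: send to Azure Speech-to-Text batch transcription", score)
    else:
        logging.info("Score %s: send to Dolby.io Enhance for cleanup", score)
    return func.HttpResponse("ok")
```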
All of this together constitutes the workflow, which can run asynchronously depending on how many files we want to pass through it. To recap, what does the serverless function accomplish? For one, it provides a viable filter to help ensure we're getting the most bang for our buck from our transcriptions. Two, it provides a path for our low-quality audio to be cleaned, and we can review that low-quality audio later with our audio engineers to see if there's a solution. And three, the whole media workflow is entirely serverless and asynchronous, meaning we can run several jobs at once without worrying about anything overlapping or interfering.

So I'd say mission accomplished, but what did we learn from this project? The first thing we learned is that media data is sneaky big data, in that the scale and size quickly sneak up on you, and you have to be prepared to develop an environment that can scale with the data. Two, we learned that callbacks and webhooks are critical to asynchronous processing, especially when you're working with REST APIs: by using webhooks we don't have to leave our functions tied up in jobs when that stage of the job is being handled elsewhere, and this also allows us to run multiple jobs asynchronously, letting Azure serverless adjust resources accordingly. Third, triggers are your friend. I especially want to emphasize this point, because picking the right trigger is critical to creating a tool that you want to use and that works for your circumstances.

With all that said, I want to take this tool and share it with the serverless community so you can experiment and play with it. We have a link to our GitHub here, which hopefully we can drop in the chat, and we're also looking to add our project to the Azure community serverless library; if you follow me on Twitter or reach out via email, I'm happy to let you know when the tool has been approved so you can play with it yourself. And with that said, thank you to everybody who listened to my presentation. If you have any questions, I think Jason and I still have a few minutes left on the clock; otherwise, please feel free to reach out to us on Twitter.

Yeah, absolutely, that was a great session. I particularly liked how you were able to put a couple of those functions together and string them together. When you did that, what did you find were some of the best benefits, and if you were to add something to improve it in the future, what do you think that would be? Well, for the most part our process is really bare-bones right now: we're just taking a file and filtering it into a transcription group and a repair group. I think the biggest benefit of doing that on serverless is this idea of scaling up, especially with a project like this, where we're handling anywhere from a couple of podcasts to thousands of podcasts; it's just so nice to not have to worry about hardware limitations or building these environments, and instead just run it and have it scale accordingly on the Azure side. Now, if we want to build this out even more, I would definitely love to make the workflow a little more advanced and implement things like transcoding tools, so we can handle multiple file formats.
I'd also like to include some sort of apparatus that can take our transcribed files and put them into a SQL database, so we can have a really nice aggregated set of all of our data, from the stuff we're generating with our own APIs, like number of talkers, to all the transcription data we're able to get from the Cognitive Services API.

That makes sense. One of the questions from our audience: Mark asks, what kind of input and output locations are available? In terms of input and output locations, for this project we specifically focused on using Azure Blob Storage; however, Dolby.io also handles something on its end where, when you integrate with our API, you can upload the file locally and it can handle the input and output of the file and then return it locally. I don't know if Jason wants to add anything on that. Yeah, our media processing engine is pretty flexible in that regard, and usually, by being close to where the data is processed, you avoid egress transfer costs, so working directly with Blob Storage made a lot of sense in this project.

Okay, Promonto wants to know: when will Logic Apps variables support getting variables from Key Vault, like Azure Functions can with a Key Vault URI? To be honest, off the top of my head I'm not sure I fully understand the question. Yeah, I was going to say that's probably a question for the Azure Functions team or the Logic Apps team, so we could put that in the chat and see; it may not be specific to your use case here.

I like the next question: Franco is asking, how would we apply your solution to, let's say, the Olympic Games? I want to know this too, because one of the amazing things I liked about the workflow is the ability to do localization, or adapt that transcript, and have that happen simultaneously, whether it's the Olympic Games or the United Nations. How would you do this? Well, it's no coincidence that I was looking at sports broadcasts in a year the Olympic Games were on. Assuming you had either the recordings of the Olympic Games, now that they're done, or some way of collecting them live, the tool we've posted can pretty much handle it as a hot folder: you drop the media into a particular Azure container, which you can set up yourself, and it automatically takes it through the whole workflow. You can set a threshold for generating a transcription or not, and it works pretty seamlessly; you can even have a continuous stream in there if you mess around with it a little.

Yeah, and I think that's one of the reasons contributing it to the community library matters: it lets people have access to it for customization, because it's really about orchestrating a bunch of these functions to produce some outcome, and with the pace of content being created there's high demand, whether it's the Olympics or anything else, and a need to turn content around very quickly, so there are lots of use cases for this type of workflow. I don't know if there are any other questions, but I'll ask one if time permits: can you do real time?
So is it possible to use these APIs in real time to do something like that? Let's take that question. Yeah, sure. The demonstration here uses a file-based media processing approach, so it's not providing a real-time filter, but some of our Communications APIs are more geared toward real-time use cases. Yeah, this specific example is media post-processing, in the sense that you already have the media and then you send it through. We'd love the opportunity to expand the functionality to something a bit more real-time, because I agree, that's a really interesting use case, and that's one of the great things about making this available to the community: it can be expanded that way too. So, if you can, hang out in the chat and answer more questions there; there are a few questions about throttling, tips and tricks to avoid problems with throttling, or whether you've encountered bottlenecks with it. There's a lot of interest in that, so feel free to give a quick response here, but we'd also love to have you hop into the chat. Yeah, absolutely; if you don't mind dropping a link to where the live chat is, I'd love to take a look, because for a lot of these questions I'd really like the opportunity to follow up and give a more detailed response rather than using up everybody's time here. Sounds good. I think we should probably be moving on to our next session now, right, Melody? Yeah, I think it's time. I'm super excited for this one, because we've heard so much great stuff about serverless and Functions, and now we're going to have Davide Mauri come and give us the serverless full stack kickstart, and I'm really excited to hear more about this. Davide, hello! Hi everyone, very nice to see you here. We'll let you take the stage; we're looking forward to your talk. Yeah, sure, let's go. Okay, cool, let me share my screen and let's get started with the kickstart. I loved the keynote because it promised a very easy startup experience, and that's exactly what I want to show you: how to create a full stack application using Azure Functions and Azure Static Web Apps. So that's the goal of the session: create a web client. Well, I will not be writing the HTML myself; I will be using one that's already available, written in HTML and Vue, and that will be our front end. For the back end we'll be using Azure Static Web Apps plus Azure Functions, since they work so well together. We'll use .NET Core as the language, but we could have used Node or Python or any other language we like, and we'll use Azure SQL, specifically the serverless tier, to be completely serverless. So that's what we're going to do: basically take the TodoMVC front end, which I like so much because it shows how you can build the same front end with many different web frameworks, as you wish. I'll choose the Vue version because that's what I'm passionate about. That front end already works by itself, but then we want to save the data somewhere other than local storage, which is where the data is saved in the sample.
For example, we need a database, and a back end providing some REST APIs behind the scenes that will receive the JSON sent from the front end, so we can do whatever we need, like store the data somewhere. That's exactly what we're going to do, using this simple and super nice application. It's simple, so it's also great for people who are just starting to learn to code or are new to this field, but it's complete, because it gives you a full end-to-end experience. This is the architecture of what we're going to build: the client UI sends some JSON to the back end, and as a developer I would love to be able to send JSON straight on to my database as well. That's exactly what we'll be doing, even with Azure SQL: even though it's a relational database, it has been extended and has evolved over the last years to manage JSON documents, graph models, geospatial data, pretty much all the data shapes you can imagine. So we'll send JSON to Azure SQL, and there we decide how to actually store the data. I decided to store it as a regular table, but I could also have stored it as JSON. We do some manipulation, send JSON back to the back-end API, where it's super easy to deserialize what we got into an object, and then send it back to the client UI to visualize the result on the front end. Now, to start it's super easy: you have to download and install the Azure Functions Core Tools, and here's the link and the instructions for that, and you have to download and install the Static Web Apps CLI, because that will make our life super easy. I love this CLI that the Static Web Apps team created, because it makes local development a dream; I'll show you in a second. I can't just type all the HTML right now, because otherwise we'd run out of time very soon, and also I'm not an HTML guy; I'm not super good at creating a front end, I'm better with the back end. So I'll just download the Vue client and start from that, evolving it so it can connect to the back end and send data to the database. Starting from these three simple things, let's summarize the next steps we're going to take in the next 20 minutes: create an Azure Function to implement the TodoMVC API spec. Of course the front end will be sending some JSON, but how should we manage that JSON? Should we support a GET, a POST? How is the JSON we receive shaped, and how should the JSON we return be shaped so the front end can handle it correctly? The Todo-Backend website has the specification you just have to follow in order to make a back end compatible with the front end. Then we'll update the Vue client so it uses the REST API we created, deploy the Azure Static Web App, and then change the API so that we actually store the data in the database. Then we'll do everything with a CI/CD pipeline, because deploying the full working solution I have on my machine to Azure Static Web Apps can be done, and is really easy to do, using GitHub Actions, and that's a dream: at some point you just push your changes and the solution is automatically compiled, built, and deployed to Azure Static Web Apps for production. Then we also want to add authentication support, and finally we can have some good chocolate and enjoy what we have done.
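Before the demo starts, it may help to picture the JSON being passed around. A to-do item maps to a tiny model like the sketch below; the property names are assumptions in the spirit of the Todo-Backend-style spec, not taken from the actual repository.

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

// Minimal to-do item model; property names here are illustrative assumptions.
public class TodoItem
{
    [JsonPropertyName("id")]
    public int Id { get; set; }

    [JsonPropertyName("title")]
    public string Title { get; set; } = "";

    [JsonPropertyName("completed")]
    public bool Completed { get; set; }
}

// The back end deserializes what the front end sends and serializes what the
// database returns, e.g.:
// var item = JsonSerializer.Deserialize<TodoItem>("{\"title\":\"write slides\",\"completed\":false}");
```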
So let's do it, but not before I explain why I decided to use Azure SQL Database and not something else. As I said, I'm really starting from scratch here; I don't know where this application will be in five years, so I need to be super flexible, but I also need to be secure and performant and make sure that our data is consistent. I want to be flexible in the sense that I want to support any kind of model I may want to use in the future: JSON documents, graph, and geospatial. Maybe I need super high concurrency, and Azure SQL provides in-memory, lock-free tables, which are tables that do not take any locks to guarantee transactional consistency, so you can really have thousands, hundreds of thousands, even millions of transactions per second on those tables. We may even want a ledger table to make sure the whole audit trail of our data is provable and consistent. We may want to use the columnstore to quickly aggregate data and provide reports, for example dashboards. We may want to encrypt our data, or apply row-level security. All these features live within a single database; you can pick and choose and combine them as you wish, which makes it super easy for me as a developer to improve my application over time, taking advantage of any of these features. And specifically I will use the serverless tier of Azure SQL, because it provides all the features I just mentioned, but in addition it automatically scales up and down when it detects more workload, and it can automatically pause itself when I'm not using it. So when I stop working on it, especially for development purposes, I don't pay a cent, and it's billed per second, exactly in a serverless fashion: I only pay when I use it and for the seconds I use it, which is great. So let's do that. If you want to follow along, everything I'm showing here is already in the repository you can see on the screen. There are three branches: the first one is the very basic one, the one I'll show you right now; version 2 adds the ability to send data to the database; and the third one also adds authentication, so you can only see the to-do items that you created, really completing the end-to-end story of a professional, fully working solution. So here we go, let's switch to code and build something right now. First of all, I want to show you how easy it is to create the Static Web App. I just want to create a directory here, so let's call it kickstart, and within kickstart let's create two folders: a client folder, where we will store the front end, and an api folder to store the back-end code. Let's see what we have. Perfect. Now I want to quickly download, into the client folder, the HTML provided by the Vue team that already implements the to-do list just using Vue and local storage, because this is exactly what I want to start from. This is what impressed me about Static Web Apps and Azure Functions: it's done. I downloaded something I want to use as a starting point, and all I have to do right now is run swa start. I don't have the API yet, so for now let's
just start the client. Done; the emulator has started, I can just go here and, voilà, my application works. So this is my application and it's working very nicely; the only limitation right now is that everything is stored in local storage. Let me switch to the v1 branch, which is basically exactly this application. What local storage means, if we view the page source, is that this code here, which you'll probably recognize if you're already familiar with Vue, is saving the items I create into the browser's local storage. As you can see, I have a function for when I add a new to-do, remove a to-do, or edit one, so what I really have to do now is change this code so that instead of calling the functions that use local storage, it calls a REST endpoint. And how can I create a REST endpoint? That's super easy again. Let's go to the api folder: all I have to do is initialize the Functions environment there, and then create a function written in .NET. Here we go, everything is done. I create a function called todo using the HTTP trigger, which means that when deployed, this function will answer HTTP calls, and that's where I'll write the code to actually handle the to-do items I receive, save them into the database in the future, and so on and so forth. That's it. Now I can go back to my folder and run swa start again, this time telling it about the api folder as well. This means the local environment will start the front end and the back end, serve them, and make sure they work well together, taking care of otherwise complex configuration like CORS and all the things you usually have to handle yourself. Of course, nothing looks different than before, because my front end doesn't know the back end exists yet, but we can check that the back end is working: it answers at localhost/api/todo, so if I send a GET I should expect to receive something, and that worked, so the function is working. Now, the function was created with some boilerplate code, so it doesn't really do much. For example, if we look at the todo function that was created for me when I ran func new, you can see it's just an example of how to handle GET and POST requests and return something, but that's exactly what I need to get started: I can remove all that code, put in the code that will store the to-do somewhere, and I'm done. These are just the basic steps you need to get going with some working code; now we have to change the code so it does what we want, not what the boilerplate does. So, having said that, let's go to the v1 branch of the repository, and here you'll see exactly what I did. The only difference in index.html between what you saw before and now is that it has been changed to send data to a back-end API instead of using local storage. Specifically, I'm using the fetch method to take the JSON that the Vue.js framework nicely gives me and send it to the API, which is nothing more than localhost/api/todo in this case.
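For anyone following along, the boilerplate that func new generates for a C# HTTP trigger looks roughly like the sketch below (this is the shape of the standard template, not the code from the talk's repository):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class Todo
{
    // Generated sample: answers GET and POST on /api/todo and echoes a greeting.
    [FunctionName("todo")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("HTTP trigger function processed a request.");

        string name = req.Query["name"];
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        name = name ?? data?.name;

        return new OkObjectResult($"Hello, {name}. This HTTP triggered function executed successfully.");
    }
}
```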
So my API, for now, is done this way: I still don't want to connect to the database, I just want to create some back-end code that can handle the JSON sent from the front end, so I'm using a list to simulate an in-memory database. I create a few to-do items, put them in the list, and then I wrote the code to implement the GET handler for the GET method, the POST handler for the POST method, and so on. The GET, for example, is super simple: if you call /api/todo/{id}, first I check whether you specified an id; if not, I send back all the to-dos. If you did specify an id, I look for an item with that id; if it exists I return the item, otherwise I return a Not Found. Super easy. Now that we know how the SWA CLI works, I'm already in the repository, I've already cloned it, and it's already on v1, so all I have to do is start my emulator again as before. This starts the updated HTML, which sends requests to the back end via fetch, and the back end is now able to read the JSON sent from the front end, so everything should work nicely. Perfect: I have "hello world" and "world", which are exactly the sample items I created, and here I can create a new one, "new to do"; this will be stored in memory, and I can mark it as done. This is working; the only problem is that it's saving everything in memory, and now we want to change that and connect it to the database. But this is something I could already deploy, and deploy means that, since I only have ten minutes left, I can just run this ARM template. An ARM template is basically a way to tell Azure what you want created, a description of the resources, and as you can see here I'm creating a static website, and the static website needs a source repository URL. My repository URL will be exactly this repository, which is the one you should fork and then use as your own deployment starting point; it contains the code that is running locally right now. So if I deploy the ARM template using the azure-deploy shell script I provided, which executes the deployment, you end up with a fully working Static Web App, which is exactly this one. Now, to show you everything, I want to move to v2 of this application, so you can see how easy it is to connect to the database. The front end doesn't have to change anymore, because we know it sends and manages the JSON correctly; no matter what happens in the back end, as long as we send back and manage the JSON in the same way, we can do whatever we want in the back end and the front end doesn't have to change. In fact, the only thing I did here is convert the logic that manages the GET, the POST, and the other methods so that, instead of saving into an in-memory list, I send the data as JSON: I take the id, wrap it into a JSON object, and send it to a stored procedure. The stored procedure executes some SQL code, and that SQL code returns a string which is JSON. So I'm doing something on the database, but the output of the database is JSON itself, which makes it super easy to parse; here I'm just returning it as JSON, and I could also have deserialized it into a to-do object if I wanted to enforce the schema in the application as well.
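To make that JSON round trip concrete, here is a minimal sketch of a helper that passes the incoming JSON to a stored procedure and returns whatever JSON the procedure hands back. The procedure name, parameter name, and connection-string handling are assumptions for illustration, not the repository's actual code.

```csharp
using System.Data;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

public static class TodoData
{
    // Sends a JSON payload to a stored procedure and returns the JSON string
    // the procedure produces (for example via FOR JSON on the SQL side).
    public static async Task<string> ExecuteAsync(string connectionString, string payloadJson)
    {
        using var connection = new SqlConnection(connectionString);
        using var command = new SqlCommand("web.get_todo", connection)   // assumed procedure name
        {
            CommandType = CommandType.StoredProcedure
        };
        command.Parameters.AddWithValue("@payload", payloadJson);        // assumed parameter name

        await connection.OpenAsync();
        var result = await command.ExecuteScalarAsync();                 // the procedure returns one JSON string
        return result as string ?? "[]";
    }
}
```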
And how does my database look? Super easy: my table is incredibly simple, just the id, the to-do text, and a completed flag, but my stored procedure takes JSON and returns JSON. So from a developer perspective, even the communication with the database is all JSON, and that means my life is much easier, because I can simplify my code: I don't have to deal with rows and columns and tables and other plumbing, I just have to serialize and deserialize JSON, and that's it. So, for example, what I do here is create a database, and I'll show you how even this can be automated in the process, but just for the sake of the time we have available, I've already created one. Let me connect to it; I'm using the query editor here, and let me clean it up to make sure my demo works as expected. Perfect. Now I want to deploy the database. I have a database, and I just want to deploy this SQL. There are several ways you can do that, and some other ways will be shown by Anna in her session tomorrow; here I want to show a way I like a lot, which is writing scripts and executing them in a smart way. What does smart mean? For now I only have one script, and I'm using a tool called DbUp that will automatically take all the scripts in this folder and apply them, in the correct sequence, to the database. So all I have to do is go to the database deploy folder and run dotnet run: this connects to the database and executes the scripts, and as a result my database is deployed. Nice. If we take a look at what's in the database right now, there is the to-do table I created, but there is also the DbUp journal table, which the DbUp library uses to keep track of which scripts have already been applied. So if I re-execute the deployment, it will not re-run the scripts that have already been executed, which means I can deploy my database incrementally, which is great. Now, having deployed my database in my dev environment, everything is ready, my code is ready, so I can close this, clear things up, and run my solution again. The emulator spins up the server to serve the HTML front end and also the Functions host to serve the back end, and everything works nicely. I can go here, and here we go, these values: I have "slides" and "demos". Now I can go to the portal and run SELECT * FROM dbo.todos, and you'll see we have "slides" and "demos". Let's go back to the same page, add "hi there", and say I'm done with the slides; go back to the database, run the query again, and you'll see that everything has been saved in the database. Perfect. This means I coded my application correctly: my front end is correctly sending data to my back end. And what would a call look like without the front end? Here's another example of using the API directly: if I want all my to-dos, I can just request them, and what comes back is exactly the JSON that is sent to the front end. If I want to create a new to-do, all I have to do is POST to the endpoint.
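As an aside on the deployment step above: the dotnet run he executes in the database deploy folder is essentially a tiny console program built on the DbUp library. A minimal sketch, with the script folder name and connection-string variable assumed, could look like this:

```csharp
using System;
using DbUp;

class Program
{
    static int Main()
    {
        // Connection string for the target Azure SQL database (assumed variable name).
        var connectionString = Environment.GetEnvironmentVariable("TODO_DB_CONNECTION");

        var upgrader = DeployChanges.To
            .SqlDatabase(connectionString)
            .WithScriptsFromFileSystem("sql")   // apply every .sql script in this folder, in order
            .LogToConsole()
            .Build();

        // DbUp records applied scripts in its journal table, so re-running
        // the program only executes scripts that have not run before.
        var result = upgrader.PerformUpgrade();
        if (!result.Successful)
        {
            Console.Error.WriteLine(result.Error);
            return 1;   // non-zero exit code fails the CI/CD step
        }

        Console.WriteLine("Database is up to date.");
        return 0;
    }
}
```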
That will actually create the to-do, and it's exactly what gets sent to the database as well; the database sends the result back, and the back end is basically working as a translator between the front end and the database, making sure the JSON coming in is correct and can be validated, plus all the additional checks you want to keep for security. Now, this is super nice, I would say, but let's make it a little more realistic, because right now anyone can come here and create a to-do, and I would be able to see to-dos created by anyone. I want to be authenticated, and I want my to-dos to be visible only to me. The really beautiful thing about the local emulator, the Static Web Apps CLI, is that it also comes with an integrated authentication system that lets you simulate OAuth2, so you can simulate connecting to an authentication provider like GitHub or Twitter and really have a fully working experience even locally. How can we do that? It's extremely easy: we just switch to the v3 branch, where the authentication mechanism is in the code, and you can see that I am querying /.auth/me, an endpoint provided by Static Web Apps that allows the code to understand whether I'm logged in or not. If I am not logged in, because this doesn't return anything, I just have to log in, for example using GitHub, and this works like magic. Let me show you: let's start my emulator again with this additional code, and you'll see that everything works exactly as expected. Let's go here and refresh, and now you can see that I can log in, so let me try: I can simulate the fact that I'm logging in with GitHub, and then I'm logged in. And how do I know which user is connected to my application right now? Again, Azure Static Web Apps makes everything super easy: in my code, all I have to do is parse a special header that Static Web Apps sets for me, containing the information received from GitHub or Twitter. I decode that header, which contains the principal information, and then I have, for example, the user id; here I extract the information from the context, which contains the user id. This works locally, but now let's pretend I've done everything I need to do and tested it locally, and I want to deploy it. To deploy, as I said, I just have to run the ARM template, so let's stop this and run azure-deploy.sh. This will create a resource group, which in my case already exists, so that part is super fast. It would also need a database; actually, the template is not going to create that, it's something we have to do, but we know we need a to-do database created. What it will do is create the to-do app; I already prepared the to-do app in case we don't have enough time to do everything, so the only thing I need to do here is change the name of the app, let's call it live, and deploy this again. Perfect.
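For reference, the special header mentioned here is x-ms-client-principal: Static Web Apps adds it to requests reaching the linked Functions back end as a Base64-encoded JSON document describing the signed-in user. A minimal sketch of decoding it (the class and method names are mine, not the repository's):

```csharp
using System;
using System.Text;
using System.Text.Json;
using Microsoft.AspNetCore.Http;

public class ClientPrincipal
{
    public string IdentityProvider { get; set; }
    public string UserId { get; set; }
    public string UserDetails { get; set; }
    public string[] UserRoles { get; set; }
}

public static class StaticWebAppsAuth
{
    // Decodes the x-ms-client-principal header; returns null for anonymous requests.
    public static ClientPrincipal Parse(HttpRequest req)
    {
        if (!req.Headers.TryGetValue("x-ms-client-principal", out var header))
            return null;

        var decoded = Encoding.UTF8.GetString(Convert.FromBase64String(header[0]));
        return JsonSerializer.Deserialize<ClientPrincipal>(
            decoded,
            new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
    }
}
```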
Here you have to specify some other information: the environment is something you have to create, but the shell script I prepared helps you; basically you specify where you want to deploy, the connection string to the database, and a few other settings, and then the Static Web Apps deployment will create a workflow in GitHub for you. This is probably already done, it's usually quite fast, and if I go to my GitHub repository I'll see a workflow with the name of the application. My application was created with an internal random name, delightful-iceland, and if I go to the workflows I see delightful-iceland. Perfect; this is what gets kicked off every time we commit something: it builds the application and deploys it to Azure for us. Now, the only limitation of this workflow is that, of course, it doesn't know I'm using a database, and I want to include the database deployment in the process too. To show you how you can do that, I prepared a sample you can use, and literally you just have to copy and paste the database step into the build-and-deploy job. Let me do that: I copy it, place it exactly here, correct the indentation, and commit the changes. Perfect; from now on, every time we commit, the database will also be deployed, and as you saw with DbUp, all the incremental scripts will be applied whenever I add a new one, so I can evolve my database over time as well. If we go to Actions, let's cancel the first run so we can focus on the second one, which also contains the database deployment; in a couple of minutes this will deploy the solution and the database, and we'll have a fully working end-to-end solution, with security, on Azure Static Web Apps, with Azure Functions backing the API and Azure SQL handling the data. This usually takes a couple of minutes, and this is also the end of my demo, so while it deploys, if there are any questions I'm more than happy to answer some of them. Hi Davide, I think you're probably not watching the Learn TV chat, but it's been going crazy; there have been so many questions, and many have been answered already, so I hope you get a chance to go back. Definitely, yeah. I'll bring up one that someone just asked: is this a specific pattern for Static Web Apps? I think they're talking about your whole end-to-end workflow. Yeah, the pattern would work with any technology you want; Static Web Apps just makes it super easy, because otherwise you have to do everything on your own: you have to figure out, for example, how to configure the front end so it can work with the back end, and CORS, which is there to help keep you from getting hacked, is not always easy to configure. Static Web Apps basically removes all that friction and gives you an amazing experience. That was a question from Raj Krishnan, thank you so much for that. I also wanted to say I love, love, love the CLI; this was such an amazing session, you made it look so seamless. But a quick question for you: I think we also have VS Code extensions for folks who prefer that, with SWA? Yes, there is a VS Code
extension for Static Web Apps. I haven't used it myself just because I'm more of a command-line guy, I like typing, but yes, it's there and it makes everything even easier. Absolutely. So I'm going to put a challenge out to people: Davide was using Vue, so you should fork this, do it with other frameworks, and write articles on how you did it, maybe with VS Code; that would be awesome. Thank you so much, this was amazing; I'm going to go back, watch this, and write it up, because this is exactly what we need. Thank you so much, Davide. And I think we have a quick minute where we're going to show you a video about Data Exposed from Anna Hoffman, who will be talking tomorrow. Let's go to it. Hi, I'm Anna Hoffman from the Azure Data product group, here to tell you about a show I host called Data Exposed. We talk about all things data: what's new, deep dives, how-tos, and we even give you a glimpse under the hood from the people who actually build the products. We cover topics like Azure SQL security, running high-performance SQL Server workloads on Azure Virtual Machines, migrating all your database assets to Azure SQL, data science, something old and something new with Azure SQL Managed Instance, developing apps with Azure SQL Database, and more. We post short episodes on Thursdays, and once a month, on Tuesdays, we release a special MVP edition with the community. May the fourth be with you. Catch us live on Learn TV on Wednesdays at 9:00 a.m. Pacific. I love Data Exposed, it's one of my favorite things to watch. Yeah, and Anna is going to be doing a talk at this conference soon too, so I'm looking forward to that; everybody should watch that one, Anna is such a great speaker. But coming up next we have another great speaker, and I'm really excited for this session: highly scalable games with Azure serverless, with LaBrina Loving. It's going to be awesome, so let's watch. Hi LaBrina, are you ready for this? I am ready; let me just share my screen. We don't see it yet... oh, now we do. Yay, we're so excited. So yes, I am so glad to be here. I've done a couple of conferences here before, but this is my first time talking about gaming, so I'm excited to be here and to talk about how serverless can help game development. As mentioned, my name is LaBrina Loving, I work in developer advocacy for our gaming ecosystem organization here at Microsoft, and today we're going to talk all about building highly scalable games with Azure serverless. Over the years, games have gotten more data intensive and more compute intensive, and there are some amazing games out there that are really leveraging the power of the cloud, but I wanted to have a discussion about how we can leverage the power of serverless. Today you might be hosting your game, and the services that support it, on-premises, or you might already be hosting it in the cloud but using something like virtual machines, so we're going to talk about how you can leverage the benefits of serverless to host your game. For today's games there are a lot of online services that power them: games require things like leaderboards, matchmaking, authentication, player messaging, currency, tournaments, and analytics, and on top of that, if you're building a multiplayer game you have to worry about hosting, and as always you have to
worry about the ever-growing scale of player data. So there is a lot to think about, a lot of services, and although you may already have built services for some of these, it's really helpful to leverage serverless as a way to take some of that burden off your development. Hosting options for your game come in a lot of different flavors; it's definitely a choose-your-own-adventure, and the way to think about it is really how much you want to manage versus how much you want Azure to manage, which can vary based on the requirements of your game and on where you are in your cloud journey. We start all the way on the left with infrastructure as a service: that's where we at Azure manage the infrastructure, hardware, networking, and facilities, but you're still managing patching of servers, application code, security configuration, provisioning, and the list goes on. Then there's container platform as a service: your game and its services run inside containers, Azure maintains the container orchestration plane plus the physical hardware, software, and facilities, and you manage the application code, the data source integrations, and those container clusters; for that we have a service called Azure Kubernetes Service, or AKS. Then there's platform as a service, where we continue to handle those same things and you manage the application code and how the web application scales. Then there's functions as a service, which is what we're primarily going to talk about in this session: at this point we at Azure manage everything except your application code, your game development code, so you can focus on building your game and we focus on everything else for you. And the last one in the corner, which is really great, is integration platform as a service: here we have a service called PlayFab, and with it a lot of the services we talked about, like leaderboards or matchmaking, are services you can simply configure, and you can combine functions as a service and integration platform as a service to extend PlayFab as well. So why should you use serverless; why is serverless important for gaming? With serverless you scale automatically, we handle the scaling for you, we manage high availability, we manage the servers, and you focus on code. It's very cost effective, because you pay only for the compute time you use: if you have a leaderboard running in an Azure Function, you pay for the time that leaderboard service is being called and doing work, and then you're done; you're paying just for the compute time to process those particular events, and that's it. Because of this, you can focus on your game rather than on provisioning infrastructure, figuring out how much infrastructure to buy, figuring out how to scale it, or how to
replicate that infrastructure across the globe; we manage all of that for you in Azure. So let's take a look at the inventory of serverless features and options you can choose from for powering your game. We talked a little bit about Azure Functions: Azure Functions lets you run compute, your application and game code, in the cloud, and it is event driven; it can be triggered by things like messages, by an HTTP endpoint, or on a timer. There are a lot of different event-driven triggers, and it feeds really well into an event-driven architecture. API Management is our front door for putting security in front of your environment; it handles things like SSL offloading and other security concerns. Event Hubs and queues can be used for managing your messages at scale, so as you stream telemetry events and player data you can use Event Hubs and queues to manage that. Stream Analytics is our stream processing service. Azure Cosmos DB and Azure SQL are our databases, Cosmos DB for NoSQL and Azure SQL for relational workloads, and Blob Storage is for unstructured data. And to monitor and support all of this we have Azure Monitor and Application Insights. So what I thought I would do is look at a reference architecture for actually doing something in your game: matchmaking. Typically when you're doing matchmaking there are a lot of concerns: figuring out the logic to match users across the globe, figuring out the right latency and how to minimize it, and still handling things like what happens when the user cancels or leaves the game. There's a lot to manage, and we can leverage the power of the cloud and serverless to do all of these things. Let's look at the diagram I have. Imagine a game player comes in through a device client; they get added to a queue, and we use Traffic Manager and API Management to route their request to the closest region, so if you have users coming in from Europe, from Japan, or from the US, they get routed to the Azure region closest to them. That request gets placed on an Event Hub, then an Azure Function takes the message, processes it, creates a ticket for the player, and tries to figure out the appropriate game session for that user. Another function then adds the player to a server. At the bottom there's a layer handling the match request, and that is an orchestration: during the match request we use another Azure Function to determine where the player should be matched, and when we've figured out the right match, we put a message on Event Hub to trigger the next step, and the player is ready to start playing. So I'm going to show a little bit of the code behind the scenes for how to make this work.
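As a rough illustration of that flow (not the speaker's actual project), an Event Hub-triggered function that kicks off a Durable Functions matchmaking orchestration could look like the sketch below; the hub name, connection setting, orchestrator name, and payload shape are all assumptions.

```csharp
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;

public class MatchRequest
{
    public string PlayerId { get; set; }
    public string Region { get; set; }
}

public static class Matchmaking
{
    // Each event on the player-events hub is a match request; for every one we
    // start an orchestration that creates a ticket and finds a game session.
    [FunctionName("AddPlayerToMatchmaking")]
    public static async Task Run(
        [EventHubTrigger("player-events", Connection = "EventHubConnection")] string[] events,
        [DurableClient] IDurableOrchestrationClient orchestrations,
        ILogger log)
    {
        foreach (var json in events)
        {
            var request = JsonSerializer.Deserialize<MatchRequest>(json);

            // The orchestrator (not shown) would pick or create a session and
            // signal Event Hub when the match is ready.
            var instanceId = await orchestrations.StartNewAsync("MatchPlayerOrchestrator", request);
            log.LogInformation("Started matchmaking {InstanceId} for player {PlayerId}",
                instanceId, request.PlayerId);
        }
    }
}
```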
Okay, so here I've got a set of Azure Functions that I've created. I've got the function that adds a player to a match, and as you can see it's triggered from my Event Hub; what it does is kick off an orchestration, and that orchestration starts connecting things up and figuring out where the player should be matched. I've got some processing logic here that takes the player information and deserializes it, and then I create a hash key that I store in a Redis cache to maintain the session information for the player. This code is pretty easy to get started with, and once it's written and set up I can deploy it as an Azure Function running in the cloud; every time a new player requests a game server, requests a match, this code runs, and it runs at scale. Basically, after you write your matchmaking service you deploy it and you're done; you don't have to worry about hosting, other infrastructure concerns, or scaling, it's all done for you. Just to show this, I've got a quick little game that I'm going to play to start firing off some events, and what I'll show is that when my gameplay starts, we'll be able to see it on the server side. Let me get that going; it's taking a little time to start my game. Anyway, what I've done is provision what's called an Event Hub; an Event Hub is what we use to store all of our player data, telemetry, and messaging. The first thing you do is create an Event Hub namespace, and then I went ahead and created an Event Hub instance, which I've called player data, and I've started capturing events. You can see over time the number of requests coming in; I've got over 3,000 requests that I pumped through just from playing, or simulating play in, my game over time, and you can see those events captured here. The code I showed before, my game server, connects to those Azure Functions, which then place the event on an Event Hub and trigger the remaining code to actually do the matchmaking. Okay, so now that I have my player events and telemetry going to an Event Hub, and I've got a lot of player data and matchmaking happening, one of the things I want to do is start doing some analytics. Because of the way game data is growing, there are some challenges with gaming analytics. Data is critical to a game's success, and there's an ever-growing amount of telemetry over time; there's also a lot of siloed data to aggregate: you might have player game data, player profiles, and the actual gameplay itself, so you'll have a lot of data that grows over time, and it drives a lot of important decisions about how you grow your game, how you continue to evolve it, how you monetize and keep players coming back, and you
really want that game data to be available in real time to power decisions, and you want to be able to bring your existing tools to analyze it. So what I want to show next is how you can use the power of serverless to stream data in real time and make decisions from it. Let's talk a little bit about the tools we'll use: we'll stream the Event Hub data we started creating from our player events, we'll use another service called Azure Stream Analytics, and then I'll show a bit of how you can use the reporting tools of your choice to report on and visualize that data. All right, let's go back to the Azure portal. What I've already done is provision a Stream Analytics job, and when you create a Stream Analytics job there are a couple of things you need to do. The first is to create your inputs. I've already created an input data stream, but let's see how I would create an additional one: I can create an input from an Event Hub, an IoT Hub, or Blob Storage, and I'm going to create it from an Event Hub. My job is still running, so I can't actually complete this right now, but it walks you through connecting to an Event Hub; let's look at the one I already created. I created it from my Event Hub namespace, pointing at the player data Event Hub, and it creates the connection and a policy; policies are used for fine-grained access control to your Event Hub and your data. Actually, I'm going to go ahead and stop my job so we can execute a few queries too. The next thing I'm going to do is create an output, and you can add outputs of many different kinds; I created a Power BI output, which lets me connect and stream my data in real time and wire it up to my job. So, at the end of the day, with just a few clicks I can stream player data in and connect it right to a Power BI report. Here are my Power BI outputs: I've created a data set called player data set and created a table, and we'll quickly see how that works; just doing a quick time check here. As you can see, there are various ways to output your data: you can send it back to an Azure Function, store it in Cosmos DB for other scenarios, or put it in a data lake so you can combine it with other data for further analysis, but I've sent mine to a Power BI report. Then, as you can see, I'm capturing things like player country and game counts, and I've created a query; the query is what captures the data as it streams in from Event Hub, and I've used a tumbling window to aggregate it, in this case a tumbling window of five seconds. So if I test my query... oops, it may not work, I need to stream some data, but this
would show our data in real time, and at the end I could go to my Power BI report and show the data being captured in real time. Let me see if I can get this started again; I'll just quickly show it, because it looks like we're running a little over time. I'll quickly show the process of taking that data and combining it with Power BI: I go here and bring up a report, and because I already created and connected my player data, I've now got a data set that I can see through my Power BI report, and I can also create a Power BI dashboard to capture that player data. So really, within a few minutes of simply clicking, you can take player data, ingest it, query it, and output it for real-time streaming to your dashboard. We've run a little long on time, so those are the two scenarios where you can leverage serverless for improving and building your game: using serverless to run all of the APIs and functional code for your game, and using serverless to stream your data in real time. Thanks very much, that was great. Yeah, thank you. We had a quick question for you if you have a minute: Sunita was curious why you chose Event Hub over storage queues or Service Bus. Yeah, so the reason I chose Event Hub over storage queues: when you think about the difference between an Event Hub and queues, you really have to think about the purpose of what you're doing. Event Hubs are best used for events, things like this player is requesting a matchmaking event, or this player quit the game; those are actual events where I want to take further action, versus a queue, which is really more about just reading data off in order. That's why I chose Event Hub over storage queues, though in some cases either could work well; it also depends on the scale you're looking for. Yeah, that makes a lot of sense. We have a lot of good sessions coming up; what's coming up, Nithya? Yeah, and LaBrina, we've got to talk after, that was awesome, so thank you so much, and we are off to the next session. Throughout this conference you've been hearing about Power Automate and Azure Logic Apps, and now we're going to hear about them from Prashant Bhoyar, who is an MVP and is here to contrast Logic Apps with Power Automate. Take it away, Prashant. Thank you, thank you, Melody. Let me start sharing my screen; let me know if you can see it. All right, so good morning, good evening, or good afternoon, depending on your time zone. In this talk we'll focus on the key differences between Azure Logic Apps and Microsoft Power Automate. A quick word about my background: my name is Prashant Bhoyar. I was raised in India, came to the United States in 2007 for my studies, and I'm a University of Maryland, College Park alumnus. I co-authored a book called PowerShell for Office 365, and I was also a technical reviewer for the book Pro SharePoint 2013 Administration. I'm from the Washington, D.C. area, and we organize a lot of community conferences. Before the COVID
pandemic, all of those conferences used to be in person; ever since the pandemic started, we have converted all our conferences to virtual only, so if you are interested, here is the link where you can find more information about registration, the call for speakers, and the rest. I am a recipient of the Antarctica Service Medal; I received this award from the US government for the work I did on the Antarctic continent back in 2008. I have been a recipient of the Microsoft MVP award since 2017, and my current category is AI. For those of you who are not familiar with this award, I don't work for Microsoft; it's a recognition that Microsoft gives to people who make technical contributions in various ways. Currently I work as a Cloud Solution Architect at AIS, where I focus on intelligent business process automation using a variety of products and services in the Microsoft ecosystem, such as Azure AI, bots, Microsoft Power Platform, Microsoft 365, and SharePoint, and I work out of the AIS headquarters in Virginia. This is the agenda: I'll start with a brief introduction to Power Automate, then a brief introduction to Logic Apps, and we'll spend the majority of the time on the key differences between these two products. I'll also talk about how to extend Microsoft Flow, now Power Automate, using Logic Apps, followed by key takeaways and Q&A. Feel free to ask your questions in the chat window; Nithya and Melody are monitoring it and will let me know as questions come in. So let's talk about Power Automate. It's a low-code platform that spans Microsoft 365, Dynamics 365, Azure, and standalone applications. One of the early members of Microsoft Power Platform was Power BI; later Microsoft introduced Power Apps and Power Automate, and in November 2019 Microsoft introduced Power Virtual Agents, which we can use to create intelligent chatbots without writing a single line of code. We use Power Automate to implement end-to-end intelligent automation, and these are its value pillars: automation at scale, seamless and secure integration, built on serverless principles, accelerating productivity, and enabling intelligent automation. There are many connectors available, the list keeps growing, right now more than 300, and you can build your own custom connectors as well; with connectors you can integrate with your data not only in the cloud but also in your on-premises legacy systems. There is also a growing library of templates, so if you'd like to implement a certain kind of automation you can quickly check the template library and find how to implement it, for example reacting to an event in Azure Event Hubs and then performing an operation in a non-Microsoft system such as Google Drive, Dropbox, or AWS; there are a lot of templates available for that. Now let's talk about what Logic Apps is. Logic Apps is a cloud service that helps us schedule, automate, and orchestrate tasks, business processes, and workflows; it is the de facto workflow engine in Microsoft Azure. Just like Power Automate, it's built on serverless principles and uses a containerized runtime in the back end; depending on the usage it can scale up and down
automatically to meet demand. One of the key advantages of going with Logic Apps is that you can focus on implementing your business logic: you don't have to worry about where your Logic App will be hosted, whether that environment is secure, or whether it has the latest patches. You just go to the Azure portal, create your Logic App, and start implementing your business logic. Just like Power Automate, there are many connectors available, and we can use connectors across the cloud as well as on-premises. Everything begins with a trigger, and there are various triggers available, which we'll talk about in a few minutes; triggers can be time based, manual, or event based. Now, if you look at the descriptions of Power Automate and Logic Apps, they sound quite similar, so let's see how these two products are connected to each other. If you had to visualize how they fit together, you could use this diagram: Azure is one of Microsoft's public cloud offerings, and Logic Apps is the workflow engine in Azure. Back in 2016, when Microsoft announced Power Automate, which used to be called Microsoft Flow, it was built on top of Logic Apps, which meant pretty much everything you could do in Power Automate you could do in a Logic App. Over time Power Automate has added its own features, so there is still overlap between the functionality of these two products, but each now also has its own unique feature set. The designer and a lot of the actions and triggers are still common to both Logic Apps and Power Automate, and on top of that, in Power Automate we can create business process flows, desktop flows, and additional actions that are not available in Logic Apps. However, since Power Automate is built on top of Logic Apps, whenever a cloud flow executes, the runtime it uses in the back end is actually the Logic Apps runtime: as you can see here, whenever a flow starts, it uses the Logic Apps runtime, makes the API calls, integrates with all the services used in your flow, and finally the flow gets executed. If I show you this diagram, I have two screenshots here, both implementing the same business logic: we start the automation whenever a new item is created in SharePoint, then we initialize a bunch of variables, and then we have some conditions. If I hide some information from the ribbon and the left-hand navigation, it's very hard to figure out which one is a Logic App and which one is Power Automate. You can take a guess, but let me remove the banners: the one on the left is from Logic Apps and the one on the right is from Power Automate. If you focus on the middle section, where the actual business logic is implemented, it is pretty much identical. That means if you are familiar with Microsoft Flow, or Power Automate, you can implement similar business logic in Logic Apps as well; the only difference is that you'll be going to the Azure portal to do it. Similarly, if you are familiar with Logic Apps and how to build your business logic there, then if you'd like to use Power Automate
you just have to go to a different URL, the Power Automate UI, and start implementing your business logic there. So let's talk about the key differences between these two products. The first one is that Logic Apps is part of the Azure group, while Microsoft Power Automate is part of the Business Applications group, the Dynamics group. You may be wondering why I'm talking about which product group these products belong to; there is a reason, because depending on which group a product belongs to, the other products under the same umbrella tend to have much tighter integration with it. For example, Logic Apps has really good integration with Azure DevOps and other Azure services, while Power Automate has very tight integration with Power Virtual Agents and Power Apps, which are part of Business Applications. When it comes to the offering, Logic Apps is available to us as an integration-platform-as-a-service offering, whereas Power Automate is available as a software-as-a-service offering. The licensing model of Logic Apps is very simple: just like anything in Azure, it's based on consumption, so we pay per usage, whereas in Power Automate everything is subscription based, and there are multiple ways to get a license. The target audience for Logic Apps is IT professionals, and it's built more for enterprise-grade scenarios, whereas Power Automate uses a subset of Logic Apps and targets citizen developers, people who prefer a low-code, no-code environment. With Logic Apps we can only create Logic Apps; with Power Automate we can create cloud flows, which are the classic Microsoft Flow flows, plus business process flows and desktop flows. That's an important thing to consider whenever you're debating which one to use to implement your automation. The maximum run duration for a Logic App is 90 days, whereas for Power Automate it's 30 days; this is true for cloud flows. If you're using business process flows, then yes, you get unlimited run duration, but for typical scenarios, comparing apples to apples, Logic Apps gives you 90 days and Power Automate gives you 30 days, and there is no way to bypass this limitation. What does run duration mean? If an instance of my Power Automate flow starts today, it has to finish its execution within the next 30 days, otherwise it gets terminated automatically. What are some scenarios where your flow or your Logic App may run for more than a few seconds or minutes? Scenarios where you're waiting on user input, for example any kind of approvals, or where your Logic App is waiting on some event to happen; if that takes more than 30 days in Power Automate, or more than 90 days in a Logic App, the instance gets terminated automatically, so keep that in mind while designing your implementation and business logic. When it comes to application lifecycle management, since Logic Apps is part of Azure there is a really good story for integration with Azure DevOps, source control, and Azure Resource Manager: we can design and test in non-production
environments and promote to the production environment when ready. The ALM story in Power Automate is lagging a bit compared to Logic Apps; however, it is improving day by day. When it comes to PowerShell cmdlets, we have cmdlets available for Logic Apps as part of the Az.LogicApp PowerShell module. For Power Automate we have two modules we can use, Microsoft.PowerApps.Administration.PowerShell and Microsoft.PowerApps.PowerShell, and you can go to this link to download them. When it comes to triggers, most of the triggers are identical: we have a manual trigger available in Logic Apps as well as in Power Automate, we have triggers based on an event like created, updated, and deleted, and we can have Logic Apps and Power Automate flows start on a schedule. However, Power Automate has one extra trigger that is not available in Logic Apps, and that is a location-based trigger. It is currently in preview, but there are some really cool scenarios we can implement using the location-based trigger in Power Automate. When it comes to setup and installation, for Logic Apps you just need a modern browser, the code view is also available, and we can use Visual Studio for development as well. For Power Automate we have to use a modern browser, and if you have access to Microsoft Teams and your current Teams license gives you access to Dataverse for Teams, then you can use the Teams client, with some limitations, to create your Power Automate flows too. When it comes to integration with SharePoint, there is no direct integration between SharePoint and Logic Apps, whereas you will see some built-in SharePoint flows already available in Power Automate. If I have to implement any automation that involves SharePoint, with Logic Apps I have to create the automation in the Logic Apps designer, whereas with Power Automate I can do it using the modern SharePoint UI, the Power Automate designer, or the Microsoft Teams client, as long as Dataverse for Teams is available to me. Now what do built-in flows mean? If you are coming from a SharePoint background and you open a modern SharePoint site, in the ribbon you will see an option for Automate, and from there you can start creating your automation directly in the SharePoint UI. When it comes to integration with on-premises data, let's say your data is on-premises and for some reason you cannot move it to the public cloud: there is an option for on-premises data gateways with both products. However, there is one caveat: with Logic Apps you don't need an additional license, whereas if you would like to use the on-premises gateway with Power Automate you need to subscribe to an additional premium license. When it comes to pricing, and that is where the majority of the decision points come in, which one shall I use? With Logic Apps the pricing is very simple: everything is based on consumption, you pay as you go. With Power Automate it is subscription based; however, there is a free version of Power Automate available that anyone can sign up for as long as they have a valid Microsoft account. Now what does a valid Microsoft account mean? An account with outlook.com or hotmail.com; if you have that, you can
sign up for Power Automate for free. Power Automate is also bundled with your Microsoft 365 subscription and with your Dynamics 365 subscription, and you can also purchase additional plans for Power Automate, a per-user plan or a per-flow plan, which also include attended RPA. One of the key decision points I have seen, especially if you are getting your Power Automate license via a Microsoft 365 subscription, is the HTTP action: it is a premium action and is not included in your regular subscription, so if I have to use it in my implementation I have to go with a premium license. Now where would I be using the HTTP action or trigger? If I have to make an HTTP call to the Microsoft Graph APIs from Power Automate, if I have to call external APIs, if I have to call an Azure Function from Power Automate, or if I have to call a Logic App from Power Automate, then I have to use the HTTP action, and depending on how you get access to Power Automate it may or may not be included in your subscription. If you are getting Power Automate as part of your Microsoft 365 subscription, to use the HTTP action you have to go with one of the premium plans. Now I see there's a question: can I export a workflow from Power Automate and import it into a Logic App? Yes, we can do that. Since Power Automate is based on Logic Apps, there is already an option in the UI: if you go to the Power Automate UI, click the More button, and click Export, there is an option to export that particular cloud flow as a Logic App template, which is nothing but a JSON file. We can then take that JSON file to the Azure portal and use it to create a new logic app. However, whenever we do that we have to reconfigure the connections, any connection the flow may be using, and we also have to reconfigure any action that is no longer available in Logic Apps. For example, in one of the demos I will show you, there is a specific action that is available in Power Automate but not in Logic Apps, and you have to do things a little differently. So when it comes to integrating these two, let's say I have some business logic and I would like to use both Logic Apps and Power Automate in it, how can I do that? I can call a flow from a Logic App using the HTTP action, I can call a Logic App from Power Automate using the HTTP action, and I can also export a flow and deploy it as a Logic App, which lets me reuse a lot of the business logic I have already implemented. When it comes to integration with Power Virtual Agents, Power Virtual Agents is the latest member of Microsoft Power Platform; you can use it to create no-code chatbots, and if you haven't played with this product I highly recommend you sign up for the trial: using it you can create a working chatbot within 30 minutes that can be used in your enterprise as well. When it comes to integration between Logic Apps and Power Virtual Agents, if I have to call a Logic App from Power Virtual Agents I have to first call a Power Automate flow, and from that flow call the Logic App.
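Since calling an Azure Function or a Logic App from Power Automate goes through that premium HTTP action, the target side is just a plain HTTP endpoint. Here is a minimal sketch of such an endpoint as an in-process C# Azure Function; the route, class, and payload names are hypothetical and not from the talk.

```csharp
// Minimal sketch of an HTTP endpoint that a Power Automate HTTP action
// (a premium action) or a Logic App HTTP action could call.
// "team-requests", CreateTeamRequest, and the response shape are hypothetical.
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public class CreateTeamRequest
{
    public string TeamName { get; set; }
    public string RequestedBy { get; set; }
}

public static class TeamRequestApi
{
    [FunctionName("SubmitTeamRequest")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "team-requests")] HttpRequest req,
        ILogger log)
    {
        var body = await new StreamReader(req.Body).ReadToEndAsync();
        var request = JsonConvert.DeserializeObject<CreateTeamRequest>(body);
        log.LogInformation("Team request received for {TeamName}", request?.TeamName);

        // Whatever is returned here shows up as the HTTP action's output in the flow.
        return new OkObjectResult(new { status = "accepted", teamName = request?.TeamName });
    }
}
```

From the flow side, the HTTP action would simply POST JSON to this function's URL, which is the pattern described above for calling Functions, Logic Apps, or external APIs.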
And if I do the same thing the other way, for the integration between Power Virtual Agents and Power Automate, there is an out-of-the-box action available where I can call a Power Automate flow directly from Power Virtual Agents. The reason is that Power Virtual Agents and Power Automate are part of the same product umbrella, so there is seamless integration available there, whereas with Logic Apps you have to do some additional steps. Eventually the same kind of integration will be available, but whenever a new product or service gets launched you'll typically see really good integration with the other products in the same product group first. With that, let's go to a quick demo. What I'll be showing you is a logic app that I'm using to create a new Microsoft Team using the Microsoft Graph API. Let me quickly click Edit; the implementation is very straightforward. It starts whenever a new item gets added in SharePoint, and for the trigger you have various options: I can have a logic app start on a specific event, I can run it manually, or I can start it from an HTTP request. Then I'm initializing a bunch of variables, then I have a condition, and then I'm doing an approval. The idea is that whenever a user needs to create a team, he or she submits a request, and that request goes through an approval; here I'm using an email-based approval. Only if the request gets approved do we call the Microsoft Graph API and create a new team. Now, we have done a similar implementation using Microsoft Flow. This is the flow, and you can see you just have to go to flow.microsoft.com to access it. The implementation is fairly identical: the flow starts whenever a new request gets submitted, then we initialize the variables we require to call the Graph APIs from Power Automate, then we do a different kind of approval. The approval action we use here is specific to Power Automate and gives you two options to approve: we get an email, and if I expand the action items in the flow UI, you can also take action on the approval request there. If the request is approved, we do a similar implementation where we call the Microsoft Graph API and create a new team. And if I export this Power Automate flow and deploy it as a logic app, what will happen is that since this approval action is not available in Logic Apps, I have to delete that action and reconfigure it using a similar action that may or may not be available there.
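Both the Logic App and the flow end with the same Microsoft Graph call to create the team. As a rough, hedged idea of what that call looks like when issued from code rather than from a designer action, here is a sketch; acquiring the access token (for example with MSAL) is assumed to happen elsewhere, and the payload follows the public Graph "create team" endpoint.

```csharp
// Hedged sketch: POST to the Microsoft Graph v1.0 "create team" endpoint,
// roughly what the Graph step in the demo workflows does.
// Token acquisition is assumed and not shown here.
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static class GraphTeamCreator
{
    private static readonly HttpClient Http = new HttpClient();

    public static async Task CreateTeamAsync(string accessToken, string displayName, string description)
    {
        var payload = JsonConvert.SerializeObject(new Dictionary<string, object>
        {
            ["template@odata.bind"] = "https://graph.microsoft.com/v1.0/teamsTemplates('standard')",
            ["displayName"] = displayName,
            ["description"] = description
        });

        using var request = new HttpRequestMessage(HttpMethod.Post, "https://graph.microsoft.com/v1.0/teams")
        {
            Content = new StringContent(payload, Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode(); // Graph answers 202 Accepted while the team is provisioned
    }
}
```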
So, if I have to summarize the comparison, and these are my personal thoughts: when it comes to licensing, Logic Apps wins hands down, because everything is simple; you can quickly use the Azure pricing calculator and find out how much it will cost, and it charges you based on your consumption, whereas with Power Automate everything is subscription based, some actions may be premium, and some may not be available in your regular license, so you have to do some checking and some math. When it comes to personal automation, Power Automate (Microsoft Flow) wins hands down, because you can get the service for free, and the free tier also comes with a lot of cool features you can use for personal automation. Now what does personal automation mean? It means I'm using some kind of automation for my own work; for example, if I'm going to a conference, instead of checking all my emails I can have a flow integrated with my mailbox, and whenever a mail comes in flagged as important or from my manager, I get a notification through a different mechanism, so that I only check email when something important is there. When it comes to being power-user friendly, and by power users I mean people who don't want to write code, who don't write code, or who prefer low-code/no-code scenarios, Power Automate is much better: the UI and the ease of use are much higher there compared to Logic Apps, because I have seen a lot of power users struggle to navigate the Azure portal, and you have to use the Azure portal to create logic apps. Again, this is my personal opinion; yours may vary. When it comes to being developer friendly, Logic Apps wins hands down, because there is really good integration with other Azure services. When it comes to integration with third-party systems, both are roughly equal. So overall, both services have their pluses and minuses, and I'm giving you the answer of a typical consultant: depending on the situation you may use one over the other. But the point I would like to make is that if you are familiar with Logic Apps, don't shy away from trying Power Automate, and likewise if you are familiar with Power Automate, don't shy away from trying Logic Apps, because there is not much difference when it comes to implementing your actual business logic. So if you would like to get your hands dirty, how can you get access to Power Automate without paying for it, for your personal development? There is a really good program from Microsoft called the Microsoft 365 Developer Program. You can go to this link and sign up, and once you sign up you can create your personal Microsoft 365 tenant for free. It comes with 25 users with E5 licenses, the highest level of license you can get with Microsoft 365, and in that tenant you will have Exchange, SharePoint, Teams, Power Automate, and Power Apps; using the same tenant you can also sign up for a free trial of Power Virtual Agents. You can sign up for the regular free trial as well, but for that you need to use a credit card. To create a logic app you need access to Azure, so how can you get access to Azure, especially for your personal training and development? There are multiple ways. If you have access to MSDN, that's the preferred way: just go to your MSDN portal and activate your monthly credit; you don't need a credit card, and it's an excellent way to try out Azure and play with its paid services. You can also sign up for the trial; depending on your country the trial amount may vary, but you will need a credit card there. There is another program from Microsoft called Azure for Students, where you don't need a credit card, but you do need a valid edu account from a participating school, and you get 100 dollars of credit, depending on your country; you can go to this link to sign up. So hopefully the content we covered today made you familiar with the key differences between Power Automate and Logic
Apps, and if you don't have access to a personal developer tenant, definitely check out the Microsoft 365 Developer Program. With that we come to an end; these are my contact details, so feel free to email me or connect with me on Twitter or LinkedIn. Thank you very much for coming, and stay safe and stay healthy. Well done, Prashant, thank you. I must say that has to be the most comprehensive coverage of Logic Apps and Power Automate and everything you need to know about them. I really like the fact that you put in those links for personal use, because anytime I look at one of these conferences I know a lot of people want to go play with it, and giving them just enough to go deploy and try stuff out is awesome. So this was amazing. I know there was one question which you answered in line; I don't know if you planned this, but it was a perfect segue. The other comment that I loved is someone in chat, Simon, said sure, Power Automate and Logic Apps are interesting, but I want to hear more about your Antarctica service medal, so you need to write a blog post on that. Sure, will do. There's a whole bunch of questions that were answered, so you should definitely go check out the chat. I had one quick question for you, because we're talking about serverless and I've heard two really good technologies, one coming at it from the Azure side and one from the Power Platform: do you have a resource for best practices for serverless integrations with Logic Apps? I know Power Platform has a practices-and-patterns kind of movement, but are there best practices based on different usage scenarios? What you're talking about is the Center of Excellence starter kit for Power Platform; that's a good way to start with the Power Platform, and if you search on docs.microsoft.com you will find tons of information on best practices, not only for Logic Apps but for other serverless technologies as well, like Azure Functions or Cosmos DB. Well, thank you very much, and a huge shout-out to the two conferences you're running; we should definitely amplify them, can't wait to see that. But I think we are now moving on to the next talk. Yep, and that was actually a good segue, as he mentioned Cosmos DB, because our next segment coming up is surfacing FDA COVID data using serverless functions and Cosmos DB with Charles Chen. It's like we planned that. Charles, welcome. Hi Melody, thank you so much for having me. Today we will be talking about surfacing COVID data from the FDA using serverless functions and Cosmos DB, and we'll be looking at a full serverless data processing pipeline to ingest any sort of data, in this case data from the FDA. All right, let's get started. My contact information is right here, and if you want to check out the app you can go to covidcureid.com to see it running. Like Davide showed earlier, we will be going through the full stack of how to build this type of application. The session objective: we're going to do a deep dive and build an end-to-end serverless data processing pipeline. We'll extract the information from the FDA NCATS CURE ID database, load it into Azure Storage, use Azure Functions to perform the transformation, push it into a Cosmos DB instance, and then serve it, also from Azure Functions. So what is the FDA NCATS CURE ID application? This is a repository that was created by the FDA to house
electronic case report forms for novel use cases of existing therapeutics. In many cases you have doctors all around the world fighting hard to treat diseases; they're working with patients with diseases that are not commonly seen in the Western world, and sometimes they're using treatments they have on hand, or treatments that are available, to try to treat those diseases. The CURE ID application is a community-driven application where physicians all around the world can contribute electronic case report forms. During the beginning of COVID, when the first outbreaks started to happen around the world, this was one of the first places where physicians came to report their treatment regimens and the results they were observing. Again, it doesn't have just COVID data; it has data for all sorts of disease areas. COVID CURE ID is my take on the application, where we extract the information and make it a little more presentable, so that given the age and gender of a patient we can find all the different treatment regimens and drugs that physicians around the world have used, and doctors and patients can potentially find efficacious treatments for COVID. OK, let's take a quick tour. This is the FDA NCATS CURE ID application. You can see it's organized by disease area, and when you click into a particular disease area you can see the different types of drugs physicians around the world have used to treat their patients; as you click through, it lists the different therapies that have been reported. But you can see it's very difficult to find exactly the information you're looking for, and the application is not responsive. So what I did is build covidcureid.com, which takes the same data from the FDA and makes the information a bit more accessible: given the age and gender of a patient, we can find all of the cases in the CURE ID application and view them by the different therapies physicians around the world have tried, and if we click through we can see the regimens given to the patient, quickly filter by whether they improved or deteriorated, and click into each case to find details and additional notes. All right, this application is built as a full-stack serverless application on Azure. At the topmost layer is Azure Storage with static websites, where the HTML, JavaScript, CSS, and image files are hosted. Azure Storage with static websites is an excellent choice for building web applications because it is super scalable, serving your static assets directly from your Azure Storage endpoint. Not only that, it's incredibly cheap, fractions of a penny to run applications on top of Azure Storage static websites, and it also integrates easily with Azure CDN, so you have a true front-end solution for building modern web applications. At the application level we have Azure Functions, which we are using both for the REST API and to perform data processing and ETL. Azure Functions for me is a Swiss Army knife and one of my favorite technologies. I've been working with .NET for 20 years, since the very first beta that came out, and I could not be more excited about building solutions with Azure Functions. And of course Azure Cosmos DB, one of my favorite databases to work with. At the core of it, it is a super
scalable document-oriented database, but you can query it with a SQL-based API; there is a MongoDB-based API, and there is also a graph-based API using Gremlin, so it is a multi-model database that gives you a lot of flexibility in how you work with your data, and it is also super scalable. So why Azure serverless? Some of you may be coming into the serverless world for the first time, some of you are already experienced; I want to give you eight reasons why I love Azure serverless. Number one is the transition from DevOps to NoOps. The beauty of Azure serverless is that you're no longer thinking about building containers or virtual machines; as a one-man team it's incredibly powerful because I don't have to think about DevOps, operational considerations, or deployment. Number two is focus on building code. We saw it right in the introduction video: one of the biggest benefits of Azure serverless is that you can focus on building the code and not worry so much about the deployment, the infrastructure, or all the surrounding aspects of your application. It's event-driven by design, and I think this is one of the trickiest parts for most people to grasp when they first touch Azure serverless: it is inherently event-driven, and we'll see how this works when we walk through the code. And of course, ease of connectivity: it's incredibly easy to build data processing pipelines using Azure serverless because of the triggers we can bind to our code. It's really a perfect solution for modern single-page applications: we'll look at how we can use static websites to host the front end, how we can use Azure Functions REST-based APIs to build the API for the front end, and of course we have a fully serverless database in Cosmos DB where we don't even have to think about a schema, we can push our JSON objects directly into the database. Hands-free scalability: we can scale this application hands-free, without thinking about how many containers we need or how to configure Kubernetes scaling; with Azure serverless the scaling is on demand. And consumption-based pricing: as I've built these applications I've paid fractions of a penny to host them. It's particularly interesting with Azure Cosmos DB because it takes this entire model and turns it on its head. Azure Cosmos DB has multiple pricing models, but the most interesting one is priced by what are called request units per second: you pay on the order of six to seven dollars a month for 400 RU/s, and whatever you build, if it fits inside that envelope of 400 RU/s, you pay that fixed price per month. And of course, what's really great about Azure serverless for anyone coming from a .NET or Microsoft stack background is that the developer ergonomics are really familiar, and we'll see how that manifests when we walk through the code. So let's talk a little bit more about the connectivity. One of my favorite parts about working with Azure Functions is the multitude of bindings you get out of the box to connect different types of applications and systems. One of my favorites is the SignalR binding, which allows you to build real-time applications without a ton of programming whatsoever.
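One concrete way to get a feel for the request-unit model mentioned a moment ago is to read the RU charge that the .NET SDK reports on every response; a minimal sketch follows, with hypothetical database, container, item id, and partition key values.

```csharp
// Minimal sketch: inspect the request-unit charge of a Cosmos DB read.
// The database, container, item id, and partition key are hypothetical.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class RequestChargeDemo
{
    public static async Task Main()
    {
        // Works against the local emulator or a real account.
        using var client = new CosmosClient(Environment.GetEnvironmentVariable("COSMOS_CONNECTION_STRING"));
        var container = client.GetContainer("covid-cureid", "cases");

        ItemResponse<dynamic> response = await container.ReadItemAsync<dynamic>(
            id: "case-001",
            partitionKey: new PartitionKey("COVID-19"));

        // Every SDK response reports how many RUs the operation consumed,
        // which is what the consumption-based pricing is metered on.
        Console.WriteLine($"Request charge: {response.RequestCharge} RU");
    }
}
```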
In this demo we'll look at how we can use Blob Storage, Cosmos DB, HTTP, and Queue Storage bindings to build a data processing pipeline that takes this data from the FDA and pushes it into our Cosmos DB instance. Let's take a look at the data processing pipeline. In the COVID CURE ID application I've built an event-driven serverless pipeline that takes the case data JSON files from the FDA website and moves them into Blob Storage. Once a file is in Blob Storage, it triggers a function that kicks off, performs some light ETL, and distributes that case data to two separate queues: one outbound queue for the different regimens, and a second outbound queue for each of the drugs that are part of those regimens, so that we can view the data by drug and also by regimen. Once the messages are in the queues, we use queue triggers to fire the functions that perform additional transformations or processing and push the results right into Cosmos DB. As we look through this code, what you really want to take note of is that we don't have to write a lot of the underlying connectivity to get these pieces working together. And if you're interested in working with the full database, you can email the CURE ID support address at nih.gov to get it and work with it yourself. OK, let's dive into code. Number one is how we fetch the data. There's a simple script, and the code is fully hosted in a GitHub repo, so you have full access to how all of this is set up. First is fetching the data: we have a simple JavaScript file we can run from the command line to fetch the data from the FDA CURE ID website. Once we fetch the data, we use a set of Azure CLI commands to load it; these commands also create the assets in Azure. You could use technologies like ARM templates, Terraform, or other tools to deploy these assets, but my personal preference is the Azure CLI, and this simple script configures all of the artifacts, whether in your local development environment or in Azure. Once we have this configured, we can start to write our data pipeline. What's really interesting about this data processing pipeline is that it's just a regular class; there's nothing special about it, I don't have to inherit from some base class. Once I've written this regular class, I can decorate it with a function attribute that tells the runtime this is a function entry point, and we have attributes here that indicate an inbound trigger from a Blob Storage endpoint and outbound bindings to two separate queues. When the first file gets dropped into Blob Storage, this kicks off the function, and you can see I can start processing my data right away; there's no connection I have to make to the queue, I simply drop my data into the output collector and that automatically pushes it onto the queue. Once it's in the queue, I can pick it up with a separate function that processes each of the regimen entries, using a queue trigger listening on that queue.
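The attribute-based wiring described here looks roughly like the sketch below, assuming the in-process .NET model; the container, queue, database, collection, and JSON property names are all placeholders rather than the repo's actual names.

```csharp
// Sketch of the event-driven pipeline: a blob trigger fans case data out to
// two queues, and a queue trigger pushes documents into Cosmos DB.
// All names (containers, queues, database, collection, properties) are placeholders.
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public class CasePipeline
{
    [FunctionName("DistributeCases")]
    public void DistributeCases(
        [BlobTrigger("case-files/{name}", Connection = "AzureWebJobsStorage")] string caseJson,
        [Queue("regimen-queue")] ICollector<string> regimenQueue,
        [Queue("drug-queue")] ICollector<string> drugQueue,
        string name,
        ILogger log)
    {
        log.LogInformation("Processing case file {Name}", name);
        var doc = JObject.Parse(caseJson);

        // Light ETL: one message per regimen and one per drug,
        // so the data can later be viewed both ways.
        foreach (var regimen in doc["regimens"] ?? new JArray())
        {
            regimenQueue.Add(regimen.ToString());
            foreach (var drug in regimen["drugs"] ?? new JArray())
            {
                drugQueue.Add(drug.ToString());
            }
        }
    }

    [FunctionName("StoreRegimens")]
    public async Task StoreRegimens(
        [QueueTrigger("regimen-queue")] string regimenJson,
        [CosmosDB("cureid", "regimens", ConnectionStringSetting = "CosmosConnection")] IAsyncCollector<object> documents)
    {
        // Flatten or reshape here if needed; the Cosmos DB output binding
        // handles the connection and the write for us.
        await documents.AddAsync(JObject.Parse(regimenJson));
    }
}
```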
Once I have that information, I can do any additional transformation or processing I need to do on my data before dropping it right into Cosmos DB, and you can see there's no connection code necessary: all of the connectivity between these different elements of the Azure serverless ecosystem is handled by the trigger bindings. This is an incredibly powerful, streamlined way to build data processing pipelines without having to mess with the complexity of connecting a multitude of endpoints in your code, and it really cleans up your code so that you can focus on working with your data. In this case we're just doing a very simple ETL, and for that Azure Functions is a perfect tool: it's lightweight, it's incredibly easy to work with, and you have tons of options for pipelining data in and pushing data out. For any sort of heavy lifting you'll want to consider Azure Data Factory, for example if you're processing large data streams, doing heavy-duty transformations, or you want low-code or no-code transformations. Let's look at how we're doing this transformation in Azure Functions. When we receive the case file, we have a simple function that takes the information in that JSON and flattens it down, so we're doing a very simple JSON transformation that turns a very complex JSON document into a very flat one. Azure Functions is the perfect tool for this because we don't have to learn a multitude of tools or configure a large system to perform these simple data transformations, and we can also run them locally. Now let's talk about querying this data: once we have it in the database, how do we get it out? For this we can use the repository pattern, which is a perfect fit for Azure Cosmos DB because it is a document-oriented database. Let's look at what this looks like. This is my drug repository class, and it simply maps to an entity type I have in Cosmos DB. One of the great things about working with Cosmos DB is that even though it is a document-oriented database, I can work with it using familiar technologies and paradigms like SQL: I can query my Cosmos DB using the SQL API, I can use the MongoDB API, or depending on my workload I could use the graph API called Gremlin. In this case I have a very simple query; I can parameterize it, execute it, and get the results back. If I'm working locally I can also run the emulator: Cosmos DB comes with a local emulator, which you can also run in a Docker container if you're working on a Mac, and with the emulator you basically get a fully functioning Azure Cosmos DB on your local machine where you can build anything you want without paying a single cent. You can see we can write very simple SQL-like queries; in this case we're going to find and count all of the different cases by country, and if we execute the query we can see the cases reported from all the different countries around the world.
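As a hedged sketch of that repository shape over the SQL API, here is a small repository running a parameterized "count cases by country" query with the .NET SDK; the database, container, and document property names are assumptions.

```csharp
// Hedged sketch of the repository pattern over the Cosmos DB SQL API:
// a parameterized query that counts cases per country, similar to the demo query.
// Database, container, and document property names are assumptions.
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class CaseCountByCountry
{
    public string Country { get; set; }
    public int Count { get; set; }
}

public class CaseRepository
{
    private readonly Container _container;

    public CaseRepository(CosmosClient client)
    {
        _container = client.GetContainer("cureid", "cases");
    }

    public async Task<IReadOnlyList<CaseCountByCountry>> CountCasesByCountryAsync(string disease)
    {
        var query = new QueryDefinition(
                "SELECT c.country AS Country, COUNT(1) AS Count " +
                "FROM c WHERE c.disease = @disease GROUP BY c.country")
            .WithParameter("@disease", disease);

        var results = new List<CaseCountByCountry>();
        var iterator = _container.GetItemQueryIterator<CaseCountByCountry>(query);
        while (iterator.HasMoreResults)
        {
            foreach (var row in await iterator.ReadNextAsync())
            {
                results.Add(row);
            }
        }
        return results;
    }
}
```

Pointing the CosmosClient at the local emulator's connection string gives the same experience without paying anything, which is the local workflow described above.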
What about the front end? Azure Functions is not just an ETL or data-processing tool; it can be the foundation for building your REST APIs, and it's a great solution for that for a number of reasons. The first reason, of course, is that you don't have to think about scaling your APIs: even as your number of users increases, the Azure Functions runtime handles the scaling of the runtime instances for you. Number two: if you're like me and build a lot of enterprise applications, Azure Functions is a great solution, because enterprise applications tend to have peaks and troughs during the day. You'll have very high usage when people come online early in the morning, potentially very high usage in the afternoon as people come back from lunch, and then at night your application usage completely drops off, and your servers, if you're using virtual machines, are just sitting there idle. Azure Functions is a perfect fit for these types of enterprise applications because it automatically scales up and down with the usage of your system throughout the day, and if no one is using the application at night it just sits there and you won't be charged. In this case we have two HTTP endpoints receiving requests from the front-end static web app, and what's really nice is that if you're already familiar with .NET web APIs, it's really familiar to write the same applications in Azure Functions. Here we have two APIs that each map to one of the repositories and retrieve the information from Cosmos DB. Let's take a look at these. In the data provider, once again my Azure Function is just a normal class. You can write these classes as static classes, as you'll see in most of the online examples, but you can also write them as instance classes, and if you do, one of the benefits is that you can take advantage of dependency injection. Right at the beginning, in the keynote, John talked about being able to use an in-process and an out-of-process model; I believe with the out-of-process model you can use the full middleware of ASP.NET, whereas if you're using in-process you have to rely on the dependency injection capabilities of the in-process model. In our app startup we can initialize our Cosmos gateway and our repositories, put them into our services container, and access them as injected instances in our Azure Functions. Now you can see the Azure Function is super simple: all we have to do is decorate the function with a function name attribute and an HTTP trigger that says this function will react to an HTTP event from a client. We can use GET or POST semantics, we can declare the route, and within that route we can parameterize it. What's really great about Azure Functions is that if you're using .NET or the Java API you get strongly typed parameters by default, so you don't have to guess or pull this information out of the form or the query string; you can just work with strongly typed data. My personal preference is to keep the functions simple and move most of the data processing logic into domain models, so in this case we have a very simple entry point that reacts to an HTTP event, executes the query, and returns the result. For anyone familiar with .NET web APIs this should look instantly familiar; there's nothing magical here, and that's what makes Azure Functions a great platform to build the next generation of web APIs.
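A hedged sketch of that pattern in the in-process model: a startup class registers the dependencies, and an instance-class function receives them through its constructor, with strongly typed route parameters. The interface, route, and the in-memory placeholder are illustrative only, not the project's actual types.

```csharp
// Sketch: constructor injection in an in-process, instance-class Azure Function
// with strongly typed route parameters. Names and the placeholder repository
// are illustrative, not the project's actual types.
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(Startup))]

public interface ICaseRepository
{
    Task<IReadOnlyList<string>> FindCasesAsync(int age, string gender);
}

// Placeholder so the sketch is self-contained; the real app would query Cosmos DB here.
public class InMemoryCaseRepository : ICaseRepository
{
    public Task<IReadOnlyList<string>> FindCasesAsync(int age, string gender) =>
        Task.FromResult<IReadOnlyList<string>>(new List<string>());
}

public class Startup : FunctionsStartup
{
    public override void Configure(IFunctionsHostBuilder builder)
    {
        // Register the gateway/repositories once; functions receive them via constructors.
        builder.Services.AddSingleton<ICaseRepository, InMemoryCaseRepository>();
    }
}

public class CasesApi
{
    private readonly ICaseRepository _cases;

    public CasesApi(ICaseRepository cases) => _cases = cases; // injected instance

    [FunctionName("GetCases")]
    public async Task<IActionResult> GetCases(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "cases/{age:int}/{gender}")] HttpRequest req,
        int age,            // strongly typed route parameters: no query-string parsing
        string gender)
    {
        var cases = await _cases.FindCasesAsync(age, gender);
        return new OkObjectResult(cases);
    }
}
```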
One of my favorite parts about working with Azure Functions is that you can build and test locally. There's no cost or commitment, you don't need to sign up for anything or provide a credit card; you can build these types of applications completely locally using free tooling, whether you're on Windows, Mac, or Linux, you can run many of these components in Docker, and you can build using the language of your choice. C# is obviously my favorite, but if you like F#, JavaScript, Java, PowerShell, even Python or TypeScript, you can use the language you're comfortable with for building Azure serverless applications. I also want to show you how we can take advantage of the developer ergonomics we're all used to already. For example, within my Visual Studio Code workspace I can debug my application and use an F5 development experience if I want to: just by clicking the start button it builds the application, attaches the debugger, and I'm ready to go. You can see here it's cleaning the application, it has built it, and it has started the Functions runtime, and now if I hit my local instance it steps right into my debugger and I have full access to my local variables and my call stack. Azure Functions for me is a great development experience because you can work with the tools you're already familiar with, with the developer experience you already have decades of experience with, and the ergonomics feel very familiar. If you'd like, you can also run this from the command line; it's as simple as running func start, which also starts the Functions runtime, and once again my functions are running. On the left-hand side you can see there's also an Azure extension you can install with Visual Studio Code that gives you full access to the functions you're working on and also to your local storage, so you can explore the files you're working with. Now, earlier we talked about why Azure serverless is such a great solution, and one of those reasons was DevOps versus NoOps. My favorite part about this is that working with Azure Functions and Azure serverless, I have a really streamlined path to deploying my code right from GitHub. From Visual Studio Code I can build my GitHub Actions, specify them, and from my GitHub Actions I can respond to push and pull requests and automatically deploy the code into Azure. I've separated my deployment into API and web front end, which means that as I'm building the application I can keep the front-end deployment separate from the back-end deployment, and if I make a change all I need to do is commit and push; this triggers the build on GitHub and pushes into my Azure Storage. Let's do a quick example: make a small change, and as soon as I push this into GitHub it triggers an action to start building and deploying the application, and because I've separated my actions for the server side and the front end, I can deploy the two separately, just the front end or just the back end, based on which side of the application has changed. What's really great about this integration depends on whether you're using
Azure Static Web Apps or Azure Storage static websites: Static Web Apps pulls all of this together for you automatically, while with static websites you have to do a little of the work yourself. However, I've taken the initiative and written down how you can set up this type of application yourself from the very beginning, from building the function app itself to the different commands you need to pull it all together. One of the key parts is that during the build process we can inject secrets into the application, including the API endpoint we hit from the static website front end, our Azure credentials, and our Azure Functions publish profile, so we can take all of our secrets, move them out of the code, and inject them at build time. This code is open source, so you can download it and start building with it locally. Again, one of the best parts about working with Azure serverless is that you can build in the language you like and are comfortable with, in any environment you have access to, and there's no cost or commitment to get started and learn these technologies. And of course the best part is that if you're already an experienced .NET developer, it's incredibly easy to make the shift from the ASP.NET web APIs you're building today to Azure serverless APIs that can scale and have all the benefits of serverless solutions. All right, if you're interested you have full access to the application at covidcureid.com, and of course the GitHub repository. Any questions? That was excellent, that's pretty exciting, thank you. I think most of the questions were actually answered either by you in your session or in the chat already; I didn't see any new ones coming in just yet. I really enjoyed how you were saying how easy it is to integrate, and how easy Cosmos DB is to use and integrate with; I think that's one of my favorites. I think the best part is that if you already have very strong back-end skills in SQL, you can take those skills and they translate very easily using the SQL API, and if you want to be adventurous you can start using the Gremlin API and writing graph queries; you can take the skills you're already familiar with and translate them onto the Azure platform, whether that's front end or back end. Exactly. I really liked it also because it's a completely open-source project, so I'm going to encourage people, I know Hacktoberfest is coming up, and I think this is one of those projects where you can learn from how you built the system, use it on open-source data, take that pattern and apply it to a different problem, or maybe just contribute back to the same one. That was definitely one of my drivers: number one, during the spring of 2021, was to find some way I could contribute to how we were handling the COVID situation, but also to contribute this application as a pattern to the community, so that folks who are building data processing pipelines, folks who are already familiar with ASP.NET and want to build the next generation of applications on Azure serverless in the cloud, have a fully functioning application they can download and start building with. So I know this is probably not relevant, but is
there any kind of issue you have to deal with, because this could be sensitive healthcare data? In this particular case the information is fully available from the FDA and from NCATS; you can email the CURE ID support address at nih.gov to get access to the full database and they will email it to you, and you can also access the information through the website. I love this case study, super awesome, and I can't wait to see where it goes next. Thank you. Yeah, that definitely makes it an easy case study to use, because typically when people want to do something like this, getting access to data to get started is the hard part, so this will make it easy for everyone. And I can't believe it, oh my goodness, we're almost at the end, we're at the last talk of the event, are you excited? Yes, but I think we have one interstitial video to go. Yep, so let's talk about it: we just heard about Cosmos DB, so let's roll tape on what Cosmos DB is. Azure Cosmos DB helps you get more value for your money by making it easy to manage the components you pay for: database operations and storage. The cost to perform database operations, including memory, CPU, and IOPS, is normalized and expressed as a request unit; more request units are charged for more demanding activities. For database operations you can select one of two models: provisioned throughput or serverless consumption. Provisioned throughput is the capacity you allocate for database operations, measured in request units per second and billed hourly; it works best for workloads that always have some traffic and require high-performance SLAs. If the traffic is predictable, you can use standard provisioned throughput to manually set and adjust capacity as needed. If the traffic is unpredictable, you can use autoscale provisioned throughput to instantaneously and automatically adjust capacity between 10 and 100 percent of your set limit; autoscale becomes more cost effective than standard when traffic is unpredictable and not close to maximum capacity most of the time. Provisioned throughput may not suit workloads with only occasional database operations and lower performance requirements; those applications can benefit from the serverless model. While it has a higher unit cost, it's consumption based and only charges for the request units used for database operations and the storage consumed. Storage fees are charged for the total gigabytes used per month, for both transactional and analytical storage; you also pay for storage I/O and analytical storage. Get the most value from your workloads by understanding the components you're billed for in Azure Cosmos DB. [Music] Wow, that was cool, I really loved that segment. So, Nitya, what's been your favorite session so far? You know, I have to say I was blown away by the keynote. Maybe I'm a little biased, but I thought every one of these talks was spectacular for a different reason. I was blown away by the keynote because I could get actionable information from it; we could see it actually being built, and I really think the use case they had, which is how do you build an open-source community set of tools and use these workflows and technologies, is very relevant to all the things we're going to try to do now that we're in the virtual event space. The second one, because I personally work in the mobile web space, I really liked Davide's jumpstart, because I think that's a great
pattern. LaBrina's game talk, I need to go talk to her, because I'm very curious about how we can use game analytics to do different things; I tend to think about it as large-scale gaming, but behavioral analytics is an interesting area for us to look at. And then, just because it was the last conversation we had, the COVID data talk: we're doing something with data science, and I find these kinds of examples are things people can take and replicate, fork the project, learn from it, play with the data, and build your own dashboards if you will. So those four stuck out, but all of them were great; we had a great talk from the folks from Dolby as well, I was just tweeting all the time. How about you? You know, it was really hard to pick one in particular, but there was a theme that went through all of them: they were all really practical things we could leverage right now in our workplace, things we could build on and move forward with, pull things out of GitHub, little tips and tricks we could use in our workday right now, or even if it was something you hadn't worked with before, you could learn from it and grab something off of GitHub. The Davide one, with that whole end-to-end, I was listening to that and thinking, this reminds me of years ago when we used to watch cooking shows and he would pull something out of the oven and you had something completely baked; I was like, oh, there's my cake, it's already baked. I know, and shout out to Prashant, because I think there are low-code options as well, but that sets up a big challenge for Carlos, because I think he's our last speaker in this session. Hi Carlos, how are you? Hi Nitya, hi Melody, great to be here, what cool presentations. So I think you're the last speaker, and you're going to talk to us about working with Azure SQL databases and serverless, so take it away. Thank you very much, it's a pleasure to be here, and what cool presentations overall. Let's start. My name is Carlos Lopez, I'm a member of the data platform community and a Microsoft MVP. Today we will talk about working with Azure SQL in serverless mode, taking its considerations into account. The agenda is very small: the Azure SQL overview, provisioning, and the considerations we should keep in mind when we create in serverless mode, because it's a great product and a great fit for serverless, however we have to be aware of the considerations, right? That video about Cosmos was great at explaining that part as well. Checking the overview of Microsoft Azure SQL, the prerequisites for using it are that you have an Azure account and a subscription. You can manage it through the Azure portal, but today we will work mostly in the Azure CLI, because I like that mode, seeing the commands rather than switching around the portal; we will look at the portal a bit, though not much. Using Visual Studio Code and its great extensions you can use the Azure SQL server extension, and you can use Azure Data Studio as well. So those are the ways to manage it, and basically the cloud
computing patterns you see are these: you can have periods of inactivity, which is great for Azure SQL serverless, and when you see compute growing fast with unpredictable bursting, then you can change to another mode. These are the patterns you see in the cloud: on and off, growing fast, unpredictable bursting, and predictable bursting, and that is what we have to be aware of when choosing the mode we need; it is an important consideration. In terms of creating an Azure SQL database, we have three ways to work declaratively. One is a YAML file, another is JSON (when you configure through the portal, the responses you see are JSON), and the other is the Azure CLI commands. Whichever of these interfaces you use, you should know when each one fits; for example, when you're using YAML you're typically preparing for something like DevOps, building a pipeline with all the commands that define your deployment, and that's a good reason to create a YAML file. When you're working with the provisioning model you will use this kind of command: we will create the SQL server, see the response, list the SQL servers, see the databases and their usage, and finally create a database. It is that simple: you just need to create the SQL server itself and then create your SQL database on the server you have just created; that's all there is to it. And finally, you can delete them when you're not using them; since this is designed to be used only when you want it, you can delete it whenever you want, just as easily. This is the JSON response you get when you create your SQL server, so that's another perspective on it. When you're creating your SQL database you have four models: serverless, provisioned, Hyperscale, and managed instance; those are the ways you can work in your Azure SQL environment, and today we will work with serverless. With a managed instance you are thinking in terms of having your SQL Server Agent and all the SQL Server system databases that you know and love; in serverless it's different, you just have your database itself and your master database, you don't have that engine and you don't have the SQL Server Agent. However, you can still automate: you can use notebooks, create a notebook and execute from there, and you can also use an elastic job; those are the good things about the Azure serverless mode. As we said, the way to work with Azure SQL serverless is: first you define your resource group, then you create the network configuration and firewall rules, you define the compute tier you will use, which is provisioned or serverless depending on the resourcing that
you need, and you define a SQL elastic pool if you need one or not; that depends. We will not use an elastic pool right now, because we will work in serverless mode with simple resource usage. You also have to define the storage where you will create the databases and the compute family you will work with; sorry, Gen4 is no longer available, it starts with Gen5 and up. It's a good point to mention that you can use an elastic job agent here as a feature to automate, which is a cool feature and very easy to use. You just have to use the general purpose tier, select the provisioned or serverless mode depending on your intentions, and select the hardware family you need, and that's all you need to start. The basic characteristics are that you can auto-pause, you can select your service objective, you can autoscale, and you can define a pool for the database if you want. This is an important part: when you are looking at Azure SQL Database scenarios for serverless, you have to identify whether your database is going to have intermittent use, with inactivity periods, to save costs. That is very important to know, because then you will be a good fit for serverless. This kind of database is designed to scale up over time, and it is perfect when you are creating a new application database whose resources are not predictable, because you don't know them yet; it is a new, unfamiliar database that you are building through your development. In that case serverless is a great way to save costs while getting to know what to expect and how it will scale. Provisioned, on the other hand, is for a database with predictable long-term use: you know this database, you have predefined resources, and it is a considerably business-important database, in which case you can use Hyperscale or Business Critical depending on the service tier you want to scale to. Think of it this way: you are on serverless, the database has matured to the point where you want to move it to a productive level, and you cannot tolerate delays on auto-resume or auto-pause actions because there are already clients and connections there; you're no longer thinking about developers, you're thinking about users, so you cannot tolerate delays, and you cannot tolerate constant memory reclamation, which is exactly the behavior that lets serverless save cost. In that case provisioned is perfect for you, and different databases within the resource group sharing an elastic pool can be a match for you. With those considerations, in serverless you only pay for what you consume: you work on the general purpose tier, you have a max memory allocation of three gigabytes per vCore, you have a maximum of 40 vCores for serverless, and you may experience a delay when the database has been auto-paused. It is built for saving cost, of course, but you have to keep in mind that the architecture is designed for that.
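Because a serverless database that has auto-paused needs a moment to resume, the first connection after a pause can take noticeably longer or fail transiently, so it is worth letting the client wait and retry. A minimal sketch with Microsoft.Data.SqlClient follows; the server, database, and credential values are placeholders.

```csharp
// Sketch: tolerate the auto-resume delay of a serverless Azure SQL database
// by giving the connection a longer timeout plus the driver's built-in retries.
// Server, database, and credential values are placeholders.
using System;
using Microsoft.Data.SqlClient;

public static class ServerlessSqlConnect
{
    public static void Main()
    {
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "myserver.database.windows.net",
            InitialCatalog = "mydb",
            UserID = "sqladmin",
            Password = Environment.GetEnvironmentVariable("SQL_PASSWORD"),
            ConnectTimeout = 120,       // give a paused database time to resume
            ConnectRetryCount = 5,      // transient-fault retries built into the driver
            ConnectRetryInterval = 10,  // seconds between those retries
            Encrypt = true
        };

        using var connection = new SqlConnection(builder.ConnectionString);
        connection.Open(); // may be slow right after an auto-pause while the database resumes
        Console.WriteLine("Connected; the database is resumed and ready.");
    }
}
```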
In terms of performance, it is important to know about features that can help us, and a good one is SQL Insights, which is specially designed for Azure to monitor your database. How it works is that it uses your DMVs, collects them, and transports the information to a Linux VM, where the DMV data and metadata are stored in a time-series database. Basically, what you pay for with this feature is not exactly the use of the feature itself; you just need to have the Linux VM that hosts it. It's cool because you can capture all these DMVs with the Azure Monitor agent and have the workload handled by the Telegraf agent, which moves the data into your monitor. So, with that, Azure SQL DB comes with a great responsibility to use it well. Let's go to the demo, create a database really fast, and see how it responds. Let's go to Visual Studio Code and look at the commands we were talking about. The first one queries with az group list; you can query the location, you can query the JSON describing your resources, and you can see the naming here. I have already created a resource group for the server; here are its details, and it tells you the resources it uses. In terms of creating a resource group, it's that easy: you just specify the location where you want it and the resource group name, and create it. For the next command, it is almost as easy to delete as it was to create, but you do have to confirm that you want to delete it. OK, for SQL server creation we need to already have a resource group, and then we create a SQL server. Since it is a little slow, I will create a new bash session, and with that I will execute the creation of four SQL servers; let's see how it goes. Let's see if it takes a little time on az group list... oh, right, because I deleted those, sorry about that, I didn't see the delete. No worries, it responds a little slowly but surely; let's change it to one. Just as easily as that you're creating four SQL servers, and the first one responded really fast. While that's cooking, we can look at another operation we can do, which is copying a database. Since the creation is still running, we'll create another bash session and copy one of the databases we already have: we give the resource group, the name of the server and database we have, the database we are trying to move, and the destination to move it to. Let's see, just as simple as that... oh, sorry, it was not done yet; I wonder if I already killed that, so sorry for that, it's still completing the SQL server creation, so we'll move to another example. I'm still not seeing why it's not finding it; I think it is because the name is different, and that is why you shouldn't change your code in the middle of a test. OK, let's try to create a SQL database now, now that we have tested the creation of a SQL server. Just as easily as that, we select the family that we want, the minimum capacity, the
Now that we have tested creating a SQL server, let's try creating a SQL database. It is just as easy: we select the hardware family we want, the minimum capacity, and the compute model, which is Serverless; we set the auto-pause delay to 60 minutes, the service objective, the backup storage redundancy to local, and the collation for the database to SQL_Latin1_General_CP1_CI_AS. With that, we execute and create a SQL database on the server. We have more examples, as you can see; think of this one as a way to figure out how you would create tons of databases and automate it, because in that case you need a way to do it, and the Azure CLI is good for exactly that. Creating databases is just as simple as that, and this is the example.

Now that we have seen how to provision with the Azure CLI, let's move on to connecting to the database. For that we can use the SQL Server connections, declare the configuration for the resource we already have, and run the test. I'll move over to my demo here. Just as we do on SQL Server on-premises, we can see the DMVs; this test is just to show that you can query the same DMVs you would see on a SQL Server instance. You can see the results for the memory clerks, the objects and object locks, the virtual file usage, and the memory, using the same DMVs we have always used in the past, like the resource stats. Sorry, I forgot to show the average and maximum values, the logical writes, the memory utilization, and the waits of the database; we can also see the performance and the max worker operations that already come configured in SQL Server, the total wait times, the parameters already configured for the database, and the plans. If we are DBAs, we are always thinking about these kinds of features and how to look at things.

The other topic is how to create databases in T-SQL, which you can also do (a sketch of these commands appears after this demo). For that I created a script just to show that we can execute the command as T-SQL, as always, and it will create a database by itself in Azure SQL. You can also look at the SLOs and change the service objective if you are trying to switch and scale up, to upgrade your database. Basically, you can alter the database through a command and change it to another service objective; I have put a little bit of code here to show that if you are on Standard you can move to Premium and select which objective you want, so you could put another value here like S1 or S2. Oh, sorry, the request failed; okay, let me change it back, I think it lost the connection. You can see here the elastic pool, the database itself, the General Purpose edition, the service objective, and the SLO it is already running on. These are the kinds of operations you can do when you alter your database through a T-SQL command, connecting to an Azure SQL database and switching it to another service objective. So that is another way to do it if you are connecting through the database engine.

Let's take a look at how the SQL servers and databases we created are doing. Let's refresh and see if the new ones are there; it takes a little time... yes, they have already been created. We can see the databases as well as the SQL servers and the resource groups, like cr01. Basically it has created four SQL servers and two databases.
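For reference, here is a hedged T-SQL sketch of the kinds of commands described above: querying a few DMVs, creating a database, and switching the service level objective. The database name and objectives are illustrative, not the ones from the demo.

```sql
-- Illustrative only; the database name and service objectives are placeholders.

-- The same DMVs used on-premises also work against Azure SQL Database.
SELECT TOP (10) * FROM sys.dm_os_memory_clerks  ORDER BY pages_kb     DESC;
SELECT TOP (10) * FROM sys.dm_db_resource_stats ORDER BY end_time     DESC; -- avg/max CPU, IO, memory
SELECT TOP (10) * FROM sys.dm_os_wait_stats     ORDER BY wait_time_ms DESC;

-- Create a database in T-SQL (run from the master database in Azure SQL),
-- specifying an edition and a service objective.
CREATE DATABASE DemoDb ( EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0' );

-- Scale up by switching the service level objective (SLO), e.g. S0 -> S1.
ALTER DATABASE DemoDb MODIFY ( SERVICE_OBJECTIVE = 'S1' );

-- Check the current edition and SLO.
SELECT DATABASEPROPERTYEX('DemoDb', 'Edition')          AS edition,
       DATABASEPROPERTYEX('DemoDb', 'ServiceObjective') AS slo;
```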
All of that we have done with simple commands in the Azure CLI and T-SQL. This is a cool alternative way to handle SQL Server instead of Management Studio, which is the traditional way to manage it; for me it is more centralized to handle, so it is good. I don't know how we are doing on time.

Yeah, I think we're going to be wrapping up pretty shortly, but I do remember you told me you recorded this talk beforehand, so I'm hoping we can share the link with people; this has been super interesting. There are some questions; would you be able to go on the chat and answer them, Carlos? Sure, of course, I'll be on the chat. You're welcome. Thank you so much.

I think we will be dropping off very soon, so Melody, before we wrap, huge thanks to all our speakers and to our audiences from all over, but I guess we have a few calls to action. Absolutely. We want to make sure that you check out all those great sessions; they'll be on demand, and you can find them on the home page or at aka.ms/azureserverlessconf, which will be put in the chat for you to click on. You can also learn more about the skills challenge and compete to get your name on the leaderboard; visit the home page or aka.ms/azureserverlessconf/challenge. You can also go back to our home page and catch our next stream for the Asia time zone this evening, or get up early tomorrow morning and watch the European stream, or you can watch the live streams on demand. That's right, 20 hours of this; it is going to be awesome, so don't forget to bookmark those sites. We hope you got a lot out of this; share your comments with us on #AzureServerlessConf, and thank you so much. Thanks for joining us; we hope to see you all again next year.
Info
Channel: Azure Cosmos DB
Views: 290
Id: x67DysG4F94
Length: 214min 36sec (12876 seconds)
Published: Wed Sep 29 2021