Day 5 - Sept 17, Part 2: Stratis presents Cloud Summit 2021

Captions
[Music] [Applause] Hi, I'm Anna Hoffman. Hey friends, I'm Nicolas. Hi, I am Toroman. Hi, I'm Tanya Janca. Hello, and I'm excited to be a speaker at Azure Summit 2021. It's a fantastic event that will be 11 days of live streaming with more than 100 speakers from all over the world. I'm excited to speak at the summit about Power BI and Synapse Analytics. And I'm going to talk about security. I'm also a Microsoft MVP for Azure; I'm speaking at Cloud Summit about automated release of .NET applications. And the best part is that this is a free event. Come join me live at Learn TV. Come join me live on Learn TV on September 14th. Come join me on Learn TV on the 14th of September this year, with a bunch of other Microsoft and community speakers. So if you want to learn how to secure Azure, come to my talk. Join us, it will be a lot of fun. See you there. See you there. See you there. See you there. [Music]

Hi everyone, welcome back to Cloud Summit 21. I'm your host, Stephen Simon, and we are back with another part of Cloud Summit. What an amazing first four and a half days we have had; it looks like we have already covered the entire ecosystem of Azure, but there's a lot more to come. I can see the numbers: there are 700 people watching us live from different platforms. We are streaming on C# Corner, we are streaming on C# Live, on LinkedIn, on Learn TV, on the Microsoft Developer YouTube channel, and a bunch more destinations. If you are someone who is joining us for the very first time: I've been looking at the comments, you are all very kind to each other, and that's all we want, so please continue to follow the code of conduct. There are also contests always going on: whenever you ask a question to a speaker or share your thoughts in the comments, please use #CloudSummit and we'll make sure we pick some lucky winners towards the end, kind of a lucky draw. And please feel free to go ahead and take a screen grab, take a selfie, be creative, and tag us on social media at @CloudSummitLive, both on LinkedIn and Twitter, using the hashtag #CloudSummit, and we'll make sure we pick winners from each social media destination towards the end of the day.

Having said that, I'm really, really excited to go ahead and welcome the keynote speaker of the day. She is Anna Hoffman, a data and applied scientist and program manager at Microsoft, and she's going to talk about data. I have had the opportunity to host her a couple of times in the past, at least three times, but I'm always very excited to host her, and I know I've already taken five minutes of the session, so let's go ahead and quickly bring Anna to Cloud Summit 2021. [Music]

Hi Anna, welcome to Cloud Summit 2021. Hi, thanks for having me, I'm so excited to be here. In that intro video, I feel like every time you add some new gems of photos. Yeah, but you have been enjoying summer, so you're going to update it when you host next time. Thank you so much, Anna, for accepting the invitation, I really appreciate it. We have had some amazing sessions around Azure SQL on the first, second, third, and fourth days, and I'm really looking forward to your keynote on Azure SQL. I'm going to quickly share your screen and add it to the stream so everybody can see it; the next 25 minutes are all yours, and we're going to do a chit chat
towards the end of your session awesome thanks so much simon and thank you all for joining me today it seems like there are a lot of people joining in from wherever you're joining in from wherever you're streaming from really appreciate it if you want to add any comments throughout i can see them so i would love to see any engagement you might have um just a tiny bit about me my name is anna hoffman and i work on the microsoft team focused on azure data and azure sql and sql server we're going to be talking about sql from edge to cloud and how some of our customers are really kind of modernizing their existing infrastructure as well as building new exciting applications and of course we we're all familiar that in this day and age the ultimate tool is data and over the past 12 months we've all seen well really more than 12 months we've seen a ton of organizations undergo digital transformation at record-breaking speeds and with our global situation being virtual and being ready to adapt to anything is very important so for the course of this session i want to take a look back in time talk about sql server how far we've come and where we're going next so let's go back in time back to a day where i don't even think i was alive uh for the first version of sql server where we started on desktops now over time we cut our teeth as the number one department server database and then we gained a reputation with enterprises because of our ease of use and price performance in 2008 we went boldly to the cloud as the first platform as a service database and as the years have gone on we've also embraced new operating systems and added new capabilities for example for built-in machine learning for iot scenarios and more really becoming a data platform as opposed to just a database and all the while we've been using the same sql proven engine we've been working on it we've been enhancing it and we've been bringing you all these different flavors of sql and we've come a long way today the 98 of the top fortune 100 companies are using sql server today and while many of these organizations are still running sql server at kind of the core of their business they've also started to adopt new application scenarios and new types of databases to which azure data services have been really great for building on that innovation and what we continue to do as part of our sql strategy really at the heart of our strategy is sql is that sql engine that proven sql enterprise engine and we're bringing sql from edge to cloud including your t-sql skills your favorite tools and everything you know and love about sql we want to equip you to reimagine the art of what's possible so for the rest of this talk i want to focus on some of the latest enhancements we've made to these five sql based offerings and more importantly i want to show you how these enhancements are helping companies on their journey of digital transformation at the end of course i want to share a little bit of a sneak peek of what we're planning to do next so i hope you will stick around so let's start with azure sql edge which is our newest addition to the family of sql based products it's going to offer the full power of sql server running at the edge along with built-in capabilities for streaming time series and artificial intelligence now i want to look at an example by a customer we have called 3m now 3m is a brand uh that is a very familiar household name because it's unusual to find a work desk that doesn't have scotch tape or post-it notes on it 3m 
produces over 60,000 products in various categories, including the N95 respirator, which has been really instrumental in our fight against COVID. Our team partnered with one of the US manufacturing plants and the 3M corporate research lab to improve the prediction of anomalies. This plant's network connectivity was very limited, so it was difficult to gather data, transfer the data, make predictions, and send them back to those devices. The team decided to move to Azure SQL Edge for all the processing and artificial intelligence. So imagine the full power of SQL Server with AI and time series capabilities, because we integrated it with Azure Stream Analytics and Azure Machine Learning services, running on these tiny little edge devices all across the plant. This really helped them with network connectivity, because Azure SQL Edge allows you to run in a connected, semi-connected, or fully disconnected mode, and you can deploy these models to the edge so you can actually make decisions on the fly without being connected. So this is a very powerful example of the way 3M was able to use predictive maintenance to help fix problems before they occur.

All right, moving on from the edge to your on-premises networks, or your on-premises data estates. I'm sure many folks are familiar with SQL Server and how we continue to invest in this platform, but in SQL Server 2019 we enabled things like data virtualization with PolyBase, new intelligent performance capabilities, memory-optimized tempdb metadata, and much, much more. We also removed the chasm that exists today between operational stores and analytics platforms. To talk about a customer, I wanted to focus on Itaú Unibanco, which is one of the largest banks in Brazil and the 10th largest bank in the world, operating in the finance and insurance industry. Their concern was that their infrastructure would not be able to keep up with their growth, and this would affect their SLAs and in turn their business itself. A good example is when they had to perform a rollback operation: this would take a long time, on average four to five hours, and recovery could take as long as 40 hours. So they decided to be an early adopter of SQL Server 2019 to take full advantage of some of the new intelligent query processing and accelerated database recovery benefits in 2019.
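For reference, both of the features mentioned here are switched on at the database level, which is why the gains described next came without application changes. A minimal T-SQL sketch of what that looks like; the database name SalesDb is only a placeholder, and a compatibility-level change should of course be tested against your own workload first.

    -- Turn on accelerated database recovery (ADR) so that long-running
    -- rollbacks complete in seconds instead of hours.
    ALTER DATABASE [SalesDb]
        SET ACCELERATED_DATABASE_RECOVERY = ON;

    -- Run the database under SQL Server 2019 behavior (compatibility level 150),
    -- which enables most of the intelligent query processing features.
    ALTER DATABASE [SalesDb]
        SET COMPATIBILITY_LEVEL = 150;

    -- Check both settings.
    SELECT name, compatibility_level, is_accelerated_database_recovery_on
    FROM sys.databases
    WHERE name = N'SalesDb';

Both are database-scoped options, so the application's queries themselves do not change.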
with accelerated database recovery they were able to drop their rollback time from four to five hours to a matter of seconds we've said things like rollbacks faster than you can react additionally intelligent query processing was able to make most of their queries run about 90 faster and the cool part about that is this was with zero code changes so really powerful stuff coming out of the sql server engineering team especially with regards to performance moving right along in our journey from edge to cloud we move into the hybrid space and azure arc is really at the cutting edge of this technology innovation what we've done here is we've brought the best of sql kubernetes and azure to run fully managed cloud data services in customer data centers or other clouds on top of their choice of hardware and os for a customer example i want to talk about kpmg japan in tokyo there's a team called ignition and they're working on building out a next-gen cloud-native application platform to serve their clients called cloud next now while they want to harness harness all the latest cloud innovation kpmg's clients often require running their applications and storing their data on-prem or in multiple public clouds to comply with government regulations or company policies now azure arc is going to extend azure's management capabilities and data platform services to any infrastructure on-prem or any public cloud this enabled kpmg to basically build applications once and run them anywhere and manage them centrally from azure the built-in automated management services like self-service provision elastic scalability backup restore and automatic updates of their azure sql in a azure arc enabled sql managed instance allowed them to save a lot of time and also keep their customers running on the most secure up-to-date database platform all right so we've kind of been through a whirlwind of things in just a few minutes but we've talked about the edge on-prem hybrid the next thing i want to talk about is moving to the cloud now azure sql which we'll talk about what that means but azure sql is the largest platform as a service in azure azure sql is not just hosting sql server in the cloud but we have built it from the ground up and i want to talk about a few examples of that specifically hyperscale but let's talk a little bit about what azure sql actually entails so azure sql is really a brand that represents our three main deployment options of running sql in azure now the first option is known as infrastructure as a service or is now you saw sql server 2019 and you're probably familiar with a lot of other sql server versions um hopefully you're running on the latest but uh with sql server and azure virtual machine it's essentially just sql server so you pick the version of sql server you pick the os you want to run you pick the version of the os you want to run on you have access to everything the only difference is that we are now managing the infrastructure and hardware underneath that virtual machine so this is going to give you the full flexibility and the full capabilities of sql server running in azure we see a lot of customers using this to migrate very quickly to the cloud for example allscripts was able to migrate about 600 on-prem vms in a matter of weeks and move to sql server azure virtual machine we're also working hard to give you as many benefits that are platform as a service like as possible with sql server and azure virtual machine we recently added ways to make it easier to configure your 
virtual machine and your SQL Server to best fit the type of workload you specify, and we've added new capabilities around creating availability groups and failover cluster instances very easily in Azure. So that's IaaS. Now let's say you never want to upgrade SQL Server again; well, for that we have our platform as a service offerings. These offerings do things like abstract away the OS for you, and by doing that we're able to provide you a versionless version of Azure SQL. The first one is Azure SQL Managed Instance, and we're seeing a lot of customers move here as they migrate to the cloud and want to get more of those platform as a service benefits. With things like Azure SQL Managed Instance and other PaaS services you also get an SLA, a service level agreement, so a guaranteed amount of uptime from us, or you get credit back. Azure SQL Managed Instance is great if you need the full instance-scoped capabilities: for example, if you need Machine Learning Services, Service Broker, Database Mail, SQL Server Agent, all these things that you're still using, that's fine, Azure SQL Managed Instance is going to support them. Now let's say you don't need some of those instance-scoped features, or you're building a new application and you want to take advantage of the latest innovation in the cloud; well, that's where Azure SQL Database comes into play. This is an amazing service, the first service we came to the cloud with back in 2008, on what we then called Windows Azure and SQL Services. But Azure SQL Database is just that: we're going to give you just a SQL database and allow you to take advantage of some of the latest innovations we're working on. One example of that is serverless. You've probably been hearing about serverless APIs and serverless applications, but we've developed a way to give you a serverless, auto-scaling database. You can essentially set the minimum and the maximum of what you want us to scale between, and we'll do that for you on a per-second basis, only charging you for what you actually use when you actually need it. When you're not using the database, you can set up an auto-pause delay, and during the pause time we're actually going to completely separate the compute from the storage and only bill you for the storage, which is going to result in huge cost savings. So this is a great example of how we're innovating, how we're thinking about re-architecting SQL Server to be cloud native.
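For reference, the serverless model described here is provisioned with a couple of settings: a minimum and maximum number of vCores and an optional auto-pause delay. A rough Azure CLI sketch; the resource group, server, and database names are placeholders and the numbers are only examples.

    # Create a serverless Azure SQL database that scales between 0.5 and 4 vCores
    # and pauses itself after 60 minutes of inactivity (billing then covers storage only).
    az sql db create \
      --resource-group rg-demo \
      --server sqlsrv-demo \
      --name AppDb \
      --edition GeneralPurpose \
      --family Gen5 \
      --compute-model Serverless \
      --min-capacity 0.5 \
      --capacity 4 \
      --auto-pause-delay 60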
The other example I wanted to talk about is hyperscale, and instead of me just talking about it, I wanted to tell you about a customer we have called Hi-Rez. Hi-Rez Studios is a leading video game developer; they have more than 70 million people playing worldwide on PC, mobile, and console. Some of their popular games include Global Agenda, SMITE, Paladins, Realm Royale, and Rogue Company; Rogue Company is their latest release, and they use Azure SQL Database Hyperscale for it. The reason they wanted to use Hyperscale is that when a gaming company launches a new game, they don't know how many people are going to play. They could start with five players and then within days get to five million players; this is not out of the ordinary. Hi-Rez was looking for a database that would give them the full T-SQL surface area with extremely low read/write latency and, most importantly, unlimited scale. They took a big bet on SQL Hyperscale, which is our flagship cloud re-architected relational database. We took SQL Server apart and put it back together to be a cloud-native solution that can scale vertically and horizontally to great limits, and hardware is no longer a limit. As a result, Hi-Rez was able to scale the same database from a few gigabytes to hundreds of terabytes in minutes.

Now, if you're thinking about migrating to Azure, we have lots of tools to help you along that journey. I'm not going to spend a lot of time here, but I just wanted to highlight that we have custom guides at aka.ms/datamigration, and you can go there to essentially select a source and a target, for example Oracle to Azure SQL Managed Instance, and we will give you step-by-step guides on how to do that. We are really trying to make this as easy as possible for you. One customer I wanted to highlight here took a phased approach to migration: H&R Block is a great example of this. Amid some changes in the tax industry, H&R Block realized they needed to rethink how they use technology to serve their clients with a seamless experience, so they migrated to SQL Server 2017 running on virtual machines. Once they were up in Azure, they were able to evaluate the platform as a service capabilities, and they made the move from virtual machines to Azure SQL Managed Instance. We're seeing more and more customers move into IaaS initially and then move to PaaS as they feel more comfortable and are ready to get the full power of the cloud.

All right, the final service I want to talk about before we get into what's new and what's coming is Azure Synapse. Azure Synapse is the first and only analytics system to have run all TPC-H queries at petabyte scale. This was made possible because Azure Synapse uses the same query processing engine that ships in all flavors of SQL. Of course you'll have to go check out a Synapse session to get the deep dive on what exactly is going on here, but essentially we are evolving our analytics offering and bringing the next wave of innovation. We've taken our industry-leading data warehouse to a whole new level of performance and capabilities; you can see how we're bringing together enterprise data warehousing and big data analytics and integrating deeply with things like Power BI and Azure Machine Learning services. Recently we even announced a bridge between your data warehouse and your operational databases with Azure Synapse Link for Cosmos DB. So there's a lot of exciting stuff going on here, but really the main point, illustrated with a customer called Grab, is that we're able to help customers reduce the time to insights. We're helping you get intelligence over all of your data, and that's something Grab was able to get, along with added benefits from the investments we're always making across the SQL platform related to security.

All right, we've talked about a lot of things. Before we wrap up, I wanted to give you a little look at the future. There are five areas where we are investing as an overall team. The first is unifying SQL across cloud and on-prem, and over the next few years we intend to totally blur the lines between cloud and on-prem; Azure Arc is a great step in delivering on that promise and part of that vision, but we want to go much further. Today in Azure SQL we run millions of customer databases, processing over 10
trillion databases transactions per day but there's much more work that we want to do to meet the needs of modern developers we're working on building out with modern frameworks we recently announced some support for django we also want to enhance our json and graph capabilities and much much more related to things like iot scaling streaming and more now this is the most exciting one to me we talked about hyperscale which is giving you unlimited scale we talked about serverless that is kind of giving you that auto scaling auto pause delay pay for what you need and we didn't talk a ton about elastic pools but this is a way of resource optimizing and cost sharing between mini databases now we're working to bring the best of all these services into one single product offering so that's something really exciting that we are working on uh fourth we are always going to be investing in security we've always been a leader here just to mention a few things in july we announced the general availability of always encrypted with secure enclaves and in may we announced the public preview for azure sql ledger where we're bringing the power of blockchain into your existing databases without all the complexity and with no extra cost these are just a few examples of how we're investing in security and you'll see more as we build integrations with azure purview our latest data governance solution finally i want to say that sql server is very dear to us we're never going to slow down when it comes to innovating this sql server engine i wanted to share one example of how we are leading innovation and it's a project where we are rewriting parts of the sql engine to remove the static variables in sql code with machine learning models that can help adapt your run adapt to your workloads at runtime this is going to be transformational there's no data platform that offers this level of intelligence and this is just one example of how we're truly truly innovating here now i have a brief video i wanted to share maybe some of you all have seen it and then i will give a few closing remarks so hopefully the sound works let's try it imagine a tiny particle drifting at the edge of the milky way a single element that could solve the next big challenge imagine you had the power to reach across the stars and grab it our world is immersed in a galaxy of data data that connects us cures us enables us to do more data that helps us innovate wherever imagination takes us in 1969 apollo 11's guidance computer was limited to 72 kilobytes as it traveled to the moon now we have exponentially more power in the palm of our hands this precious resource demands a trusted partner to harness and make every bit of it intuitively accessible at the speed of our ideas the speed of sql server in azure sql the latest evolution of data management's highly secure supercharged platform enabling you to break performance barriers and boost agility azure sql databases hyperscale capabilities allow you to auto scale dynamically advanced data security keeps hazards at bay with a single click while built-in cloud intelligence eliminates worry machine learning services enhance the value of your data by importing adjacent code to create models faster than ever and with cross-platform solutions like azure arc sql server and azure sql products can be leveraged virtually everywhere our team is motivated to consistently improve this new breed of database without disrupting the application of our existing technology as many of the world's critical systems and 
most influential corporations deploy it and agencies on our most important front lines depend on it businesses like balzano that leverage sql server big data clusters to help radiologists interpret medical images or zeiss that use azure sql edge to constantly improve the manufacturing of optical lenses and organizations that help us dream bigger like the sloan digital sky survey that built its ever expanding database of 200 million galaxies and 260 million stars on sql server imagine what your data could dream up imagine unparalleled access to that next great innovation together we can reach it at the speed of sql server in azure sql yeah i really like that uh video and i think it kind of illustrates how sql started as a relational engine and over the years we have transformed it into a full-blown data platform as you can see many of these innovations are not restricted to sql server we are taking them beyond sql server and we are really looking to you to help us build the next version of all of these great things with that being said i wanted to say that even with the best technical innovation none of our products can be successful without a strong community our philosophy has always been very community centric and so i want to spend a moment to say thank you to everyone joining to everyone watching to everyone contributing giving feedback and please keep it coming we always love hearing from you and we're always looking for new ways to support you thank you thank you so much anna that was absolutely amazing you setting up the vision of your sequel not actually equal sequel in general talking about how you all started back in 1980s to what lies next definitely it tells that you guys and girls are definitely not going to stop working on sql engine uh thank you so much anna we we couldn't take the q a after the keynotes because that's not how it works because i see many comments coming in people are really excited for the sequel and yeah any final thing you want to plug in before we move to the next session thank you so much simon for everything that you're doing you know this is a great example of how how important the community is for the success of azure overall so thank you so much for everything you do and i hope everyone enjoys the rest of the conference i want to thank you anna for accepting invitation i'm always as i said many times i'm always very excited to have you so is this community i would love to have you back and i no way to reach you out have a great day and thank you so much and see you soon tara bye all right with that keynote we now move to our next session that's that's will be by mustafa thurman is going to talk about azure devops love started mustafa is a microsoft mvp he's also a c-sharp on nvp and a solution architect so without any further ado let's invite mustafa to cloud summit 2021 [Music] [Music] [Applause] [Music] hi mustapha welcome to cloud summation 21. 
hello hello great to be here i'm really excited mustafa uh to be honest devops is something that people are really looking forward to learning this session as we are looking at the registrations forms so uh at this moment the many many people are watching us and i'm really looking forward for your session uh please feel free and to go ahead and share screen and then you can go ahead and get started sure and everyone who is watching uh i see many people are joining from linkedin please feel free to go ahead and ask a question in the comments and after the session ends we would love to take your questions and musa will be there to answer it most of i've added your screen to the stream everybody can see it and next 25 minutes is all yours thank you all right let's kick this off so uh next session is going to be azure devopslavs.net and that was a really nice uh introduction before me uh but i need to speak with simon about some photos and where he actually found them uh for that however uh just a few words i work as a solution architect i'm part of the claudion which is recently acquired by devo team here is also my twitter handle if i can help you in any way feel free to reach out so 25 minutes is not a lot of time so okay let's kick this off uh why are we here so dot net is a very popular platform uh it's uh something an amazing framework that helps us develop software for quite a few years now uh i'm i was amazed by the information that more than 38 million uh websites are in the world are running on asp.net uh popularity of this even even grow uh started growing even more uh when dot net core was uh introduced with a lot of open source components and stuff so it was a very natural way for us to move our asp.net applications no matter if it's classic or core to azure because basically azureloves.net they come from same company microsoft uh created them with a lot of love and these are things that are going well together asp.net and azure are a natural mix but how do we actually get to from from our development environment to azure how do we put our code out there and how do we make it available to our customers well the answer is very simple we're going to do it with azure devops another tool from microsoft family that is really tightly uh put together with all the other things in the ecosystem so it makes a natural thing it makes a natural environment for everything in that uh ecosystem let's basically look at the demo and we are going to basically spend the rest of our uh session in here so what i'm going to do in azure devops i have a devops organization so i'm going to create a new project so let's call it cloud summit and this should be done relatively quickly and of course when you're doing a demo it's taking a bit longer okay here we are so new organization new project was created inside my organization and if i go to the repos i can see that it's empty i will right now initialize my main branch just by adding a readme file this is a default placeholder just for me to know that this repo exists and when i actually connect to it that i know that something is in there so as a next step let's go with the starting up of visual studio and i'm gonna clone a repository i'm gonna connect to my azure develops once the list shows i'm gonna select the organization come on okay here we are i'm gonna select organization i'm gonna select the repository i just created and i'm gonna connect to it so here we are next thing i want to do it will be clone the repository so i will create the clone of my 
repository and if i go to solution i can see the readme file that was generated when i initialized the the repository itself so we know we are connected there now this doesn't mean much let's add some code to it so i will simply add a new project i will select asp.net core web app we will select that one uh let's call the project name cloud cloud summit we want to place it in the same repository and so it's in the folder so we are creating cloud summit let's kick it off we will select.net core 3.1 project is created and here we are we have a bit of code right now uh this is nothing major this is just an empty asp.net core project but it will be enough of a code to showcase what we are trying to to uh demo here so this what's that can you zoom in a little bit okay let's see is it better now better okay perfect so next thing that we're gonna do is we're gonna commit this to our repo so let's go to the kit changes i will exclude the files that were generated by my visual studio so these uh folders that you see with dot vs those are generated locally and you probably don't want them in a repository but the rest of the files i want to commit so i will create initial commit to create a stage commit and then i will commit it i will commit it to my repository and push the changes initiating push and successfully pushed so let's go back to uh let's go back to my repository so if i refresh right here here we are my code showed up everything isn't here so code is there i'm ready to collaborate with my teammates everyone in the team can can can work on this this right now so let's move on and start creating additional components for this so we want a fully functional ci cd for this in order to once the code is in there we want to build it and then push it to to azure so let's start by creating a pipeline and we can create two versions of pipeline we can do either yaml files or we can use a classic editor right now i'm going to use the classic editor as a first instance and then we'll move to the second one a bit later so i'll use a classic editor it offers me to select where my code is so naturally it recognize that it's in uh azure repos and it recognized where my code is but i have other options like git github enterprise server subversion and so on i will select the one and then i have an option to select between the templates i will select asp.net core as my template and it will basically generate everything required for me i will do two additional changes on this one first one is because in this state it will not run why because by default it's using a older version of uh windows agent that doesn't support net core 3.1 like net core 2.1 and all the versions could run on this but this one wouldn't so i'm going to change that to windows server 2022 so basically to ensure that i'm running the latest version that is supported the second thing i'm gonna do is i'm gonna disable test assemblies because i don't have uh that right now in my repo of course you definitely want to have tests you want during your build to run tests but i don't have any in in the repo right now so i will just keep them to not to waste time and i can do save and queue and this will trigger my first build this will create the build on my build agent and will as it execute all the all the steps that were uh defined in in in the build so what it's going to do is basically once the the agent starts and it should be any time now hopefully yeah here we are uh it will initialize job it will collect the code from my repository then do 
the steps to set the proper NuGet version and run a NuGet restore; it will create the build and then basically publish the symbols path and publish the artifact. This last step is very important in our process, because after a successful build we want a package that we can actually deploy to our servers. While we're waiting for this to end, we are going to quickly create a web app where we actually want to deploy this. So I'm going to do a template deployment, and I have prepared templates, so I'm just going to use them. I have a template file and a parameter file; these ARM templates are basically JSON files that contain information about my infrastructure. Everything's in place, I'm just going to change the region to West Europe so it's closer to me and responds faster, I click create, and it should be on its way. Perfect. Okay, where are we with the build? Okay, build completed: we published the symbols path, we are publishing an artifact, then cleanup, and we're done. So as a result we have an artifact, and that's a very important part, because right now if I go to the artifact I can see that I have a web app zip that basically contains the files that I want to deploy, so this is the part that's really important for us. As a next step we are going to set up a release. We did continuous integration, we built the code, we created the package for deployment, but we need additional steps to actually put it somewhere, to actually make it available to people. So let's do a new pipeline, and again here we're going to be offered a bunch of templates, and I'm going to do an Azure App Service deployment, so we're good there. I'm going to select an artifact; for the artifact you basically need to select a build in your project, and I have a single one, so there can't be a mistake there. I have several different options for which version I want to run; obviously I want to do latest, because I want my code to be deployed every time I change it, so we're good there. As a next step I need to set up my Azure subscription, so let's connect to my subscription: I'm going to select this one and I need to authorize it, so I'm going to use my Azure account, I'm going to log in, and obviously I need to approve the MFA notification. I approve the MFA check, we're good to go, and this is now creating a service connection to my Azure subscription so I can actually connect from my Azure DevOps organization and deploy the code. In the meantime, while we're waiting for this one to finish, let's see what's going on over here. So this one finished, and if I go to the resource group there is a Cloud Summit resource group where I have a simple App Service running. Let's check it out: if I go to this web app, this is basically an empty website, just a single HTML page to tell you that your web app is up and running and you can actually access it; it's waiting for your code, so everything is ready. Okay, the subscription was authorized; as the next step I need to select the App Service, I'm going to use the one I created previously, and I'm going to say save, and once I save I'm going to also create a release. And here we are, the release started; hopefully we will not wait much for it and it will start immediately.
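For reference, the template deployment used a few steps earlier to create the web app can also be run from the command line. A rough Azure CLI sketch; the resource group name and the template and parameter file names are placeholders standing in for the prepared files from the demo.

    # Create the resource group, then deploy the prepared ARM template and
    # parameter file into it (the same kind of template deployment used above).
    az group create --name cloud-summit-rg --location westeurope

    az deployment group create \
      --resource-group cloud-summit-rg \
      --template-file azuredeploy.json \
      --parameters @azuredeploy.parameters.json \
      --parameters location=westeurope   # assumes the template exposes a 'location' parameter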
While this is happening, let's move back to the web app and see a couple of things. We mentioned previously that the cloud offers a lot of different things and it's a perfect platform, but when it comes to App Service, which is probably one of my favorite services in Azure, there are some amazing things, and one of my favorites is definitely scaling out. What scaling out does is basically create rules that allow you to create additional instances of your application to handle increased workloads. This actually allows you to save on cost, because you can run a minimum number of instances of the application when you don't have traffic, and as the number of users increases it can grow. So if I create scale rules, I can add a rule based on a metric that says, for example, if my CPU percentage is over 70, automatically increase the count of instances by one, and if I add that rule it will automatically create an additional instance of my application. However, you have to be careful with this one: if you scale out, you have to scale down eventually, and that will never happen if you don't have scale-in rules, so also remember that once the CPU drops you should decommission some of the instances. Okay, this was probably done, let's see. Yeah, the application was deployed. What the release process does is just get the artifact that we created during the build and then deploy it to the Azure App Service. So let's check it out: if I refresh here I should get my app, and here we are. It's really simple, "Cloud Summit, welcome", nothing special in there, but here it is, it's deployed. Now, this allows me to trigger the build manually and push it to Azure whenever I want, but what if I want every change to actually be put in Azure? Well, I can definitely automate that. There are two things I need to do here. The first one would be to go to my build, to my CI pipeline, and under the triggers enable continuous integration, and right now I'm just going to save it. I'm going to do a similar thing with the release: I'm going to edit the release and enable its trigger, so whenever a new artifact is available it should start the process automatically. Let's quickly go back to our Visual Studio; I'm going to open Pages, just edit Index, and change it to "Welcome to Cloud Summit". We're going to save that one and commit the change, so "home page change", I'm going to commit that one, push it, and it's successfully pushed. If I go back to my repository and look at what happened in Pages, in the Index page, we can see that the change was made; I can see the change I just did in my Visual Studio, it was pushed successfully. Now, automatically, once I push this, it should create a new build, and yeah, as you can see the build is running, there is a build in process. It already did a couple of things: it already got the code, set the NuGet version, did a NuGet restore, and it's building the solution, which should be completed relatively fast. Okay, everything done, it's pushing the artifact, and it's completed. We can see that there is a new artifact created; right now we have a new web app zip, and this should automatically start the new release process as well. So let's check it out. Yeah, here it is, the release was just created, so a new release is on the way, it was already queued, and it's actually already running. So we have a new build process automatically triggered, and after a successful build we have an automatic release. The artifact was downloaded and it's being deployed to the App Service right now; it should copy the files, if I remember correctly it's going to be 59 files uploaded, and it should be done relatively quickly, in time for us to do a couple more things (it's always slower when you're waiting for it). Okay, here we are.
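For reference, the scale-out and scale-in rules described a moment ago can be scripted as well as clicked together in the portal. A rough Azure CLI sketch; the plan and autoscale-setting names are placeholders, and the metric name and thresholds are illustrative rather than taken from the demo.

    # Autoscale the App Service plan between 1 and 3 instances.
    az monitor autoscale create \
      --resource-group cloud-summit-rg \
      --resource cloud-summit-plan \
      --resource-type Microsoft.Web/serverfarms \
      --name cloud-summit-autoscale \
      --min-count 1 --max-count 3 --count 1

    # Add one instance when average CPU stays above 70% for five minutes.
    az monitor autoscale rule create \
      --resource-group cloud-summit-rg \
      --autoscale-name cloud-summit-autoscale \
      --condition "CpuPercentage > 70 avg 5m" \
      --scale out 1

    # And the matching scale-in rule, so the plan comes back down once CPU drops.
    az monitor autoscale rule create \
      --resource-group cloud-summit-rg \
      --autoscale-name cloud-summit-autoscale \
      --condition "CpuPercentage < 30 avg 5m" \
      --scale in 1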
So let's check it out and see if it's already pushed. Here we are, the change is there, "Welcome to Cloud Summit" is available. Okay, so as we mentioned at the beginning, we can do two types of pipelines: we can do a classic one, which we showed, and we can do a YAML one. We probably want to do YAML, because we want to use pipelines as code as well; we want our pipelines to be treated the same as the code, because there are multiple benefits. First, change tracking: it's easier to handle, you can revert to the version of the pipeline from the point where you want to run your code, and so on, so it's much easier. What I'm going to do is disable this trigger, because I don't want this one to be triggered automatically anymore, and I'm going to do another pipeline. So I'm going to do a new pipeline, but this time I'm going to do a YAML one. I'm going to choose Azure Repos, I'm going to choose Cloud Summit as the repository, and it automatically recognizes what kind of code I have, so it recognized it's some kind of ASP.NET; I'm going to select the Core one and it's going to generate some kind of pipeline for me. It already creates steps for me as it is, and if I go to save and run it will commit the code. What it does here is either commit it directly to the main branch or create a new branch; I don't have time for creating branches and running pull requests, so I will just commit directly to the main branch, and it's creating a pipeline. If I go to my repos now you can see a new file: there's an azure-pipelines.yml here, which basically describes how my pipeline works, and any kind of change, everything that's happening, we can track in the same way as we track the code. And if we want to return to an instance of the code at some point, we can easily revert to the same version of the pipeline, the pipeline that we were using at the time when that code was actually in use. Okay, so let's see what's going on. This pipeline is running, so it did a couple of steps: the NuGet installer, the command that is going to do the build; I should probably remove the test step because I don't have tests. But you may notice that there are a couple of steps missing here as well: we are missing the publish part, so we will not get an artifact here, it will not get automatically generated for us, so we will need to adjust this pipeline a bit and add those steps. We can do that in two different ways. So I will go to my pipeline, this one is done, so I will go to the same pipeline and I say edit. ("Mustafa, maybe you can wrap up?" "Right, yeah, yeah.") So I will remove this task, and I can either use the assistant, so if I go to publish it will give me an assistant form that I can fill in, or the other option is to go to my classic pipeline (oh, selected the wrong one, that's what happens when you are in a hurry). If I go to my classic pipeline I can get the information: if I go to the task that I'm missing, I can select View YAML and get the code that I'm missing here, so I will just paste it here and do the same thing for the publish. Why am I showing this? A lot of people, when starting out with YAML pipelines, are not quite comfortable with writing them manually, so this is kind of a way to actually help you out. We simply push save, our new pipeline is updated, and as a result of this pipeline we are going to get the same artifact that we had previously. What we need to do is adjust our release, select the new build pipeline to get the artifact from, and we're good to go to release again.
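For reference, once the missing publish steps are pasted in, the resulting azure-pipelines.yml for an ASP.NET Core 3.1 app looks roughly like the sketch below. This is an approximation of the generated pipeline plus the added tasks, not the exact file from the demo; paths and the artifact name are placeholders.

    # azure-pipelines.yml (sketch)
    trigger:
      - main                      # continuous integration on the main branch

    pool:
      vmImage: 'windows-2022'     # agent image that supports .NET Core 3.1

    steps:
      - task: DotNetCoreCLI@2
        displayName: 'Restore NuGet packages'
        inputs:
          command: 'restore'
          projects: '**/*.csproj'

      - task: DotNetCoreCLI@2
        displayName: 'Build solution'
        inputs:
          command: 'build'
          projects: '**/*.csproj'
          arguments: '--configuration Release'

      # The last two steps are the ones missing from the generated pipeline:
      # publish the web app as a zip, then publish it as a build artifact so
      # the release pipeline has something to deploy.
      - task: DotNetCoreCLI@2
        displayName: 'Publish web app'
        inputs:
          command: 'publish'
          publishWebProjects: true
          arguments: '--configuration Release --output $(Build.ArtifactStagingDirectory)'
          zipAfterPublish: true

      - task: PublishBuildArtifacts@1
        displayName: 'Publish build artifact'
        inputs:
          PathtoPublish: '$(Build.ArtifactStagingDirectory)'
          ArtifactName: 'drop'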
Okay, as I said, not a lot of time. Thank you for your attention, and I really hope you enjoyed this session. If there are any questions, feel free to reach out and I will try to answer anything. An absolutely amazing session, I would say a roller coaster session, trying to get as much as possible into 25 minutes; I'm so sorry about it, these time constraints are always very challenging. There are a couple of questions, and I know our next speaker is very kind, so she won't mind if we start a couple of minutes late. Keshav is asking: do we need to set up a trigger in the CI pipeline so that changes are automatically reflected in Azure? Okay, so if you want code to go out automatically, we need to set up a trigger, so that every time we do a certain action it will trigger the chain of actions that we defined. Yeah, that's right. Final question, by Komachi: can we say the artifact basically contains the actual core code which contains the functionality? Yeah, that's basically it: we are verifying the build and then collecting the files that we need to actually deploy, that's completely true. People are saying great session, very informative, people are enjoying your session. Excellent. So that was great, Mustafa, absolutely amazing, thank you so much. And your team is growing, we see that; we had an absolutely amazing opening keynote by Magnus, and your session was absolutely great, thank you so much. Any final thing you want to plug in before we move to the next session? Yeah, thank you for your time, it was a pleasure to have a chance to join you here. Thank you, Mustafa, have a nice day and see you soon, bye bye.

All right, with that we move to our next session, but before we do, I've got to play a video, because we made a video and the next speaker didn't see it, which is so sad. I will play a one-minute video and then we'll be back fast. Hi, I'm Anna Hoffman. Hey friends, I'm Nicolas. Hi, I am Toroman. Hi, I'm Tanya Janca. Hello, and I'm excited to be a speaker at Azure Summit 2021. It's a fantastic event that will be 11 days of live streaming with more than 100 speakers from all over the world. I'm excited to speak at the summit about Power BI and Synapse Analytics. And I'm going to talk about security. I'm also a Microsoft MVP for Azure; I'm speaking at Cloud Summit about automated release of .NET applications. And the best part is that this is a free event. Come join me live on Learn TV. Come join me live on Learn TV on September 14th. Come join me on Learn TV on the 14th of September this year, with a bunch of other Microsoft and community speakers. So if you want to learn how to secure Azure, come to my talk. Join us, it will be a lot of fun. See you there. See you there. See you there. See you there. [Music]

All right, we now move to our next session, and that's by Tanya; Tanya is going to talk about cloud native security. I'm really excited to host this because I'm hosting her for the very first time; she's a community rock star and I follow her on LinkedIn, Twitter, everywhere, so let's welcome Tanya to Cloud Summit 21. [Music] Hi Tanya, welcome to Cloud Summit 2021. Thank you so much for having me. Oh my gosh, what a wonderful intro video, I don't think I've ever had such a great intro before, thank you so much. Yeah, I really appreciate you accepting the invitation; I know you're not well right now, so thank you so much for that. I'm not going to take much of your time, 25 minutes is all yours, I'm going to
share your screen we're gonna do chit chat towards the end of the session everybody can see it and next 25 minutes is all yours thank you so much okay so we only have 25 minutes and i'm going to go through what cloud native security is so although this talk usually has a whole bunch of demos of azure we at 25 minutes i mean i i did see part of i saw most of mustafa's talk and i have to say i think he pulled 400 rabbits out of a hat doing that so fast in azure i was super impressed with him wow um so cloud native security so let me so i like to tell you what i'm going to tell you then i tell you what i told you and then i i tell you the thing and so the idea is is that you you get it so let's talk about what we're going to talk about so we're going to talk about cloud we're going to talk about what cloud native means and then we're going to talk about how we secure a cloud native and basically how we can do security in more innovative and exciting in exciting ways that we couldn't do it before before clouds like azure existed and so i want to warn you a bit because i'm going to ask you a bunch of questions and i'm hoping that you're going to put some things into the chat to answer to share and i can't really interact with you as much as i'd like to as much as i could in person because then i can stare at you and then i could also bribe you with maple candies to get to get answers out of you but i'd really love it if you could share your answers in the chat as we go through this so i'm gonna tell you a bunch of the answers because well i mean like when i teach this live especially if it's at a meet up i'm like i've got all night folks i will just sit here until you start sharing but i don't have all night this time so let's go so this is me i feel like simon's video did like a way better job of explaining that i am a big nerd on the internet and we do these slides as speakers we do these slides so that you think that we are qualified to give our own talk but i'm gonna just assume you're like i'm on learn tv i'm gonna learn stuff let's go with tanya so let's learn some things so what is cloud right and so i have um so let's do the the first like the really dry very formal definition of cloud so this is the wikipedia definition so cloud computing is the on so let's go through this i'm going to read it and then let's dissect it so cloud computing's the on-demand availability of computer systems resources especially data storage and computing power without the direct active management of the user so let's let's tear this apart kind of suss it out because when someone reads me something like this i'm like but that's too many abstract very complex concepts let's dissect it a bit so it's on demand so this is so magical i used to have to when i was a dev i would make a ticket and it would go to the operations team and then they would think about it and then i'd usually have to get approval to get a server and this and that and sometimes i'd have to go over and be like hey buddy like could you make my server like sometime this week please i really need this stuff and i so i i i'm not proud of it but i would bribe them with cookies like a lot there used to be this dva and if i wanted data things happening and i wasn't supposed to access the database myself even though you know programmers can usually get around that i was like colin i have oreos i will give you four if you will do this and i have fried so many people with cookies over my career and a lot of homemade ones but oreos worked particularly 
well on call in the dba but the point is is that you could just ask i like i kid you not like with azure in all the clouds you could just you just put in the thing you say i'm gonna pay and then they just build you infrastructure they give you iot they give you like whatever it is that you've asked for and it takes like a couple minutes it's even like short enough time that like i can't necessarily go get a coffee and come back it's like too fast and that's amazing and so the other part that's really exciting is with without direct active management by the user and so i used to have to care for my servers just like you would care for a pet i mean like you wouldn't give your servers food that's not gonna go well and if you pet them they don't usually make more noise but definitely you'd have to care for them on a regular basis you have to patch them you have to make sure your configurations are okay you have to just like maintain this thing and make sure it's okay and on top of that i'm patching my app and and fixing bugs and making little changes that my clients would ask for and so you don't have to manage your infrastructure anymore most the time because they just do this for you they just create infrastructure for you they will care for a lot of it it's pretty exciting and so what do we mean by all of this like we mean that you have this shared responsibility which we're going to talk about a bit later but it means that basically microsoft or whoever your provider is is doing a bunch of the heavy lifting for you and it's pretty exciting to be able to just like count on other people to help you be awesome and so that's basically what we mean by cloud is that they will provide all these things to you reliably and they will share some of the responsibility with you for maintaining and securing it so we haven't even got to security yet but don't worry we're on our way so what is cloud native security um so my definition is that it's stuff that's specifically made to run in the cloud specifically made for that and like okay so let's give you a more advanced one so this is one that i put together so i used to work at microsoft i actually used to work with frank and like a bunch of the other people that are part of the summit and so it's kind of nice to see familiar faces so i put this together from so many different amazing incredibly brilliant human beings that i had the opportunity to work with especially on the microsoft ignite tour um so then i actually got to spend like quality time with a lot of people and so i came up with this definition from all these brilliant human beings so it's applications and services that automate and integrate the concepts of continuous integration continuous delivery continuous deployment devops microservices and microservices architecture so that means apis serverless and containers and basically like this is stuff you could not do on-prem and you can do things like azure stacks you kind of like bring out your home with you and that's great but still you're doing you're doing cloud do you know what i mean you're doing clap you're doing the new stuff and so the idea of cloud native is not taking all your old legacy things and lifting and shifting it into the cloud that's cool that is good you have way better reliability and resilience from then on right but that's not cloud native you're not taking advantage of all the awesome stuff and that's okay that's okay if you're doing that but what i'm going to talk about in this talk is specifically cloud native and 
like the new ways that you can do things and i do not mean to say that people supporting legacy apps are not fighting the good fight because if anything you have to fight a lot harder because you don't get to do all the fancy new tricks and you're taking care of like an elderly application that's like 16 years old you are still fighting the good fight but i want to talk about the fun new stuff okay so what is the difference between cloud and a traditional data center so we're just gonna talk about general differences and then we're gonna go into security differences and uh and then i'm gonna make fun of myself probably a bit and then i'm going to make fun of cloud providers a bit because teasing and making fun of things is sort of one of my hobbies and then we are going to wrap up but when you think of a traditional data center what do you think of so this is where i would like you to be interactive i would like you to share some of your thoughts about what you think when you think of a traditional data center in the chat seriously simon does not need to read them out to me i'm going to read them all later though you better believe it and so what i my first thought is me as a penetration a junior penetration tester having to go physically into data centers and being so cold i remember my boss sent me once and it was july it was really hot where i live so i was wearing this pretty dress and i go into a data center and i ended up i was only supposed to be there half an hour and then it turned into i was there six hours in a dress not wearing gloves not wearing a hat not even having a sweater freezing um and so that i think of loud cold data centers and so some of the other things that happened with the traditional data center were there was some manual patching there was some manual management of patching there's a lot of like you can use like system center and other things to deploy patches but if things are missing yeah it didn't always tell you like imperfect um it's on-premises which means it's your responsibility i worked somewhere once and we had a flood and part of our data center was in our building and our data center had a flood yeah that sucked um i i was one of those people that had my giant desktop on my desk because i hated crawling around underneath my desk mostly because all my male colleagues would look at my butt and i was just like this is not cool so i just one day redid and so my computer was okay but a lot of people's computers were not okay because they were on the floor underneath their desk and so when your data center is on-premises it is your responsibility like physical responsibility i also think of bare metal um so rather than now like i don't have a specific computer but i used to have computers and they all had names there was scully mulder and pamela and and bessie bessie was one of my favorites and like i'd have physical machines bare metal that i'd run things on i had to run my own network so i only ran my own network a handful of times most of the times i was lucky and i could be on an internet and someone else ran it but like running your own network is work that's effort everything yourself manual stringing cables oh my gosh those pictures when people put pictures on the internet of inside a data center and there's like a million cables and there's this awful mess makes my heart break a little bit makes me stressed like cortisol comes out of me um you run your own air conditioning your own power your own backups your own everything and so a thing that a 
lot of people don't know about power is power is not created in this perfect amount of 120 or 220 volts no no no power comes and surges and it it's not this perfect thing and so you have to regulate power if you're going to go to like this big huge data center and like this is stuff you have to worry about a lot you have to worry about failover well i've had so many things where like a data center just goes down and there's like no failover and i thought we paid for that um you need an operations team and they have to be on call and that is expensive and you have to train them you have to not only recruit them but then you have to retain that staff which means you have to be nice to them and give them training and hopefully give them like career opportunities so they want to keep working for you you need to have an off-site business continuity plan disaster recovery or you should have one i've definitely worked at places where our disaster recovery site was at the same place and i was like then there what you're saying is there will be no recovery if we have disaster you'll see over provisioning this is a huge thing where you buy a lot extra just in case and that as a person that owns us a business like a startup oh my gosh there are so many things that we have accidentally bought too much of because we weren't sure or you didn't get enough and then things crash and so for the cloud this is a thing you don't have to worry about necessarily anymore um and apps are deployed to a specific server so i'm calling this photo op number one if you want to take a screenshot of this list now sort of your time i don't personally love taking notes while i'm watching a talk i love just watching and trying to listen and consume as much information as i can so i took notes for you but apps were deployed to specific servers like bessie had this app pamela had this other app and i remember this one guy he's like blah blah pam i'm like please address her by her full name anyway so what do you think of in cloud so i would like you to put this into the chat what you think of when you think of cloud because i so i had like you know pretty wonderful introduction to cloud because like you know i saw a little aws i saw a little this a little that but then i got to work at microsoft and i got to have as much azure as i wanted and i was like this is ridiculous i have licenses to all the awesome stuff and so um my first foray into cloud was seeing the security center dashboard and how i could see for the first time i could see everything clearly and i don't know how to explain but that visibility and the fact that i could drill down to a specific box and fix it i was just like it was um is pretty magical so that's what i think of when i first think of cloud and i'd love to hear what you think of when you think of cloud in the chat so off-premises not your responsibility so it is not in unless you are a person that works at microsoft and you happen to work in an azure data center it's going to be off-premises for you right like all the cloud customers it's it's off-premises and that's nice it sounds weird but like giving up that responsibility and knowing that it's safely in someone else's hand is awesome means auto scaling so no needing to buy things in advance i know this is going to sound so silly but like i wrote a book and people buy my book and people ask for signed books so i have to ship books all the time i have boxes of boxes and boxes of books in my house and a zillion of those like little bubble wrap 
envelope things and then um someone ordered the wrong size so then i have like 50 of that one and so i have like so much stuff and i have to over provision i have to buy too many and slowly like let them all go one at a time or 10 or 20 or 50 at a time like people buy my book and so auto scaling of technology yes because i used to have to do this with all of my servers too usually the cloud is internet available in my entire career in tech since 1997 once the internet went down but the intranet was still up so we could still work every other time our internet was also down so your internal network and then you know once they realize like it's down down it's not coming back in a couple minutes so they've sent us all to starbucks or timmy's we've all had we've all recaffinated and come back it's still not up there like just go home have the rest of the day off we don't want all of you software developers running around causing trouble um you can do infrastructure as a service so you so i am i was on azure fun bites yesterday uh with my friend jay and basically um like the night before i just did a whole bunch of infrastructures code and built out a bunch of insecure demos then i attacked them because i'm a jerk like that and then i could like demo to him like look here we found this so we did that and i just asked for some infrastructure as a service and then addre just made it for me and that's awesome apps can be on a whole bunch of different servers maintenance is not your problem like i said centralized beautiful like everything you can see and this one's really important geographically distributed um i i have worked out places where like so i live in canada and you know we're physically geographically bigger than the united states but compared to the united states our country's empty we have like for like every 10 12 americans there's one of us and so we like geographically just we don't geographically distribute a lot of things we are all like where's the warmest part of canada i want to move there and so um yeah geographically distributed data centers that the cloud allows is magical because we're mostly in just a couple of cities and i don't mean to talk badly of us it's just that that's where we do a lot of our technology stuff we'll gather in one place and generate the heat so we can stay warm and that's a thing cloud data centers are lights out which means they're run by robots and i have to say i think robots are very cool but also it means that humans don't need to go in and things are just automated and that's kind of amazing and so here is like a little graph that i took off the internet from quora.com and if you want to take a screenshot of this you totally could but it it only says some of the things we talked about and really what i want to talk about is security right that's what i really want to talk about and so what do you think of when you think of traditional data center security because i immediately think of security guards wandering around i think of them having like a giant belt with put it put your thoughts in the chat i think of them having a radio so they can call if they see someone that should not be there and i have to say like being a pen tester i've gotten into a lot of places where maybe i shouldn't have gotten in and being you know a pretty lady really helps like hi uh i'm supposed to be in this data center yeah i supposed to get something from ken but what else do you think of because i think of zoning so we used to have implied trust in data centers so 
we would have you know the data zone and we would have all the databases in there and then we'd have a firewall around it and then we would have you know public access zone so things where it connects to the internet we'd have a demilitarized zone for our very serious stuff we would just have like a bunch of firewalls and everything inside each firewall would trust each other and everything else you'd have to open up a hole one of my old bosses used to refer to our firewall as the swiss cheese and i tried to explain to him that that was embarrassing he should stop saying that but then i went to switzerland and they explained that what we in canada call swiss cheese is like one type of cheese and they're like that's not the best cheese at all we have hundreds of cheeses that are better and then they gave me a bunch of cheese when i visited and i have to say that's probably the best speaker gift ever because i really like cheese um manual patching manual patch management manually having to scan a bunch of areas to see which patches are missing which configurations are wrong what's questionable physical security like i said physical security guards um generally using third-party software so i have made a lot of money in my career from using that nexus a nessus and expose just like scanning infrastructure they're like do you wanna you know we'd like to hire you for a pen test on our infrastructure i'm like no you don't you just want me to do a va scan i'll charge you way less and i will tell you the things you need to fix and then you fix them life will be good you don't want first of all you shouldn't hire me to do a physical like i am i'm a web app patent tester or i was not an infra you don't want that i'm not you maintain not only a trained operations team but a security operations team as well money extra money you have many third third-party monitoring and security tools so this is your third one this is your third one and then cloud native applications and services that automate and integrate the concepts of devops ci cd microservice architecture api serverless containers it is not copying everything your legacy applications lifting and shifting them into the cloud that's okay if you want to do it like it is totally okay if you want to do that absolutely but you're not doing cloud native you're using the cloud to host your legacy apps there is no shame in this it is totally okay but i want to talk about cloud native so what do you think of when you think of cloud native we've got three minutes left so i'm gonna whiz through this zero trust just in time access control closing all your ports all the time except when you need them automation for patching and patch management yes automated va scanning all the time you don't have to set up agents it just does it for you complete visibility threat monitoring monitoring of everything and then adding automated responses using ci cd pipelines like azure devops or oh yeah it's going to say github actions but i guess because one company owns the other company i kind of associate them having amazing resiliency so that your a and the cia availability is always standing strong doing security as code so like once you have people doing infrastructure as code and you have people releasing in pipelines oh yeah i'm gonna do security as code for sure it like there's just so many amazing things there's more there's devsecops so i can add security tooling to your pipeline i can automate a whole bunch of security checks i can use serverless and logic apps to 
protect ourselves so detecting something bad happened and then responding automatically um they call them playbooks in azure you can use um the cloud native tools that come from your provider like their sims sentinel like native threat detection um having native firewalls means like you just press a couple buttons and it's implemented and it actually just works it means less heroics which means happier staff and happier staff means less turnover and that's the most important thing and so um i want to tell you one last important thing and that is that this responsibility of securing your systems is still shared between you and whoever your provider is whether it's microsoft or someone else it is a shared responsibility and so you see like your vendor on one side so physical assets data center operations cloud infrastructure but then on the other side what you need to secure is if you have virtual machines and networks you need to make good choices if you have apps and workloads again you need to make like you need to write secure code and like i know that i am the person that teaches secure coding and so that's why it's like oh well you know she'd boba but you can learn this in many places there are books there are articles there are so many ways to learn and so i'm going to give you a couple resources so microsoft learned there are a whole bunch of different awesome things i know i'm i know i'm right at the cusp so take a screenshot of this some awesome books about devops and cloud natives and about security if you are a woman or a person who identifies as a woman or non-binary i have this international nonprofit where we want basically women and non-binary folks to get to meet each other and be friends i have an online free community at community.wehackpurple.com it's all free you can just join meet people there's articles videos conversations events every monday on twitter i do cyber mentoring monday and basically i use this hashtag to help connect people with professional mentors and the last resource is me she hacks purple on every single platform like twitter instagram uh all the places and um i have a youtube i have all the things and with that i want to thank you and i want to thank the azure summit for having me and i'm sorry i am one minute over um and thank you so very much for having me hi simon hi wow tanya that was absolutely amazing i would say a very well placed empa paste you talked about zones he talked about distributions he did talk about water flowing in the data center and you know what the best part is that people really listen to you and go ahead and check the comments there's somewhere around 100 comments people have literally answered what they feel about cloud what they feel about traditional data centers so that was an absolutely amazing session really enjoyed hosting your tanya um any any final thing you want to it's a lot of community thing that you do i checked the last slides here i also went to this community dot hack purple.com uh any file thing you want to plug in before we move to the next session um i have an azure security course at we hack purple and you should check it out i don't know how to find that link but i would have added it so academy.wehackpurple.com all right thank you simon i'm going to put it in the chat all right i got it so i have dropped in the chats uh so please go ahead everyone and check that out uh thank you so much tanya please take care of yourself um we would love to host you back and yeah see you soon bye take care bye all 
right, with that we move to our next session. Tanya delivered an absolutely amazing session on cloud native security, and I believe that is something many people really wanted to learn about. Now we have another session, by Ferenc, who is going to talk about the Azure Synapse serverless SQL pool. This is, I believe, session number 18 just on Azure SQL — people really want to learn SQL, so we have sessions for you. And if you're watching us on Learn TV — well, we have already dropped from Learn TV, so there's no point in saying that, never mind. Let's go ahead and invite our next Cloud Summit 2021 speaker. [Music] [Applause] Welcome Ferenc, welcome to Cloud Summit 2021. Hello, how are you doing today? Fine, thank you — it's Friday, so in other words it's a really good day. I know, sometimes I'm really lazy on a Friday; to be honest, if someone asked me to work on a Friday afternoon I'd be kind of lazy, but never mind — what time is it at your place, actually? 6:00 p.m. Oh my goodness, I caught you at the wrong time. That's absolutely fine. An absolutely amazing topic, Ferenc — let me tell you, the serverless SQL pool is something people really want to learn. So while registrations are still coming in, please feel free to go ahead and share your screen and then we can get started. And everyone who's watching: there is a Q&A after this session ends, we'll take questions from you then, so please feel free to ask your questions in the comments. Ferenc, I see you have shared your screen, I have added it to the stream, everybody can see it, and the next 40-45 minutes are all yours. Okay — hello, good morning, good afternoon, good evening. My name is Ferenc, and in the next 40-45 minutes I will speak about Azure Synapse serverless SQL, especially as one approach to build logical data warehouses, self-service BI, or BI sandboxes over data lakes and some kinds of databases. First of all, who am I: I'm speaking from Budapest, which is the capital city of Hungary. I have a PhD in computer science, in computer graphics; I have a Data Platform MVP award, and nowadays I'm a public speaker on a lot of Power BI and Azure data platform topics, and a Power BI and Azure expert and trainer. So let's jump into the details. If somebody doesn't know what Azure Synapse Analytics is, or why it is such a big buzzword nowadays, I would like to summarize what it is. It is Microsoft's limitless, unified data platform for your data analytics projects. It is able to ingest data from on-premises, cloud, software-as-a-service or streaming sources into a cloud environment, where the basis of everything is an Azure Data Lake Storage Gen2 account, and you are able to run SQL and Spark workloads over this data lake. Today, in this session, we will focus on the SQL part, but we will mention the Spark part as well. As I mentioned, it is a unified data platform, so it has a unified user interface called Synapse Studio — we will see it several times today — and there are a lot of features that a unified data platform needs: integrating different services, management, monitoring, security, et cetera. So it is a very big deal, and from Microsoft's side I could say it is a one-stop shop: everything is available in this big package. We selected only one component from this big package — it is called the serverless SQL pool — and I have to confess that there are multiple SQL pools within Azure Synapse Analytics.
One of them is called the dedicated SQL pool — formerly it was known as Azure SQL Data Warehouse — and if somebody doesn't know it, it is a massively parallel processing engine, a cloud data warehouse, which is a really good data warehouse in the cloud if you have a lot of data. Besides this, there is the serverless SQL pool. So what is this serverless SQL pool? As the name suggests, serverless — which of course is not literally true: in the background there are lots of servers which handle this SQL pool. Serverless means that you don't have to think about when to pause and resume the pool, as you would with a dedicated SQL pool, or the way you sometimes have to stop and start a virtual machine in Azure. Using this serverless SQL pool you don't have to do anything like that; you pay only for the data that is processed when you execute SQL queries. And what kind of SQL queries are you able to execute against the data lake? It is T-SQL, like in Microsoft SQL Server — if you are a T-SQL person you will really like this technology, because it is essentially the same, with some differences; there are some limitations compared to a normal SQL Server, but it is the same language you use for SQL Server, so you are able to reuse that knowledge with the serverless SQL pool. And why are we talking about the serverless SQL pool, besides the fact that I can run SQL against the data lake? Because if you have ever worked with a data lake, I am sure you have experienced the situation where there are a lot of CSV files, Parquet files or other kinds of files, and you would like to know: okay, what is the schema of this data file? Or: this is a 10-gigabyte CSV file, I don't want to download it to my laptop, so how can I grab some data from this 10-gigabyte CSV file? Such data exploration and data-grabbing scenarios can be fulfilled by the serverless SQL pool. But there is another thing, and for this session this is my topic: besides exploring the data in the data lake, it is very typical nowadays that we have an on-premises Microsoft SQL Server with a lot of databases, and the customers just keep creating new databases for their own self-service BI reasons — for example, creating a BI sandbox to play with the data from different other databases on the same SQL Server — and that's why they have to upload a lot of data into the same SQL Server relational database. On the other hand, nowadays everything is about the cloud, as this Cloud Summit is also about, so we are moving the on-premises databases to the cloud, and we realized that maybe it is not the best approach to reproduce and migrate everything one-to-one from the on-premises environment to the cloud. If we would like a BI sandbox where I can play with my favorite IT-managed data warehouse and enrich that information with some custom data of my own, how can I do it in the cloud in a much better, maybe somewhat less expensive way? Use the data lake — it can handle big data, we can load tons of data, we can process and transform a lot of data within the data lake, so this is the place where, logically, we should handle these kinds of sandboxes, these logical data warehouses. So if I found something interesting in a data lake and I would like to create from this interesting data a star schema, for example, which would feed a Microsoft Power BI report, maybe I could do it
within the data lake, instead of loading the data into a traditional SQL database and creating this BI sandbox inside a SQL database. So the important point, logically, in this self-service BI solution is to use the serverless SQL pool, and within this session I would like to show some use cases for how to do it. The serverless SQL pool is able to execute T-SQL queries against the data lake, and it is useful for data exploration and for creating new databases — like a self-service database with a lot of tables, for example a data mart or a star schema model. Besides this, on the data lake we can find several Spark tables and Spark databases, because Spark also works on the data lake and the Spark component is also available in Azure Synapse Analytics. So another interesting advantage of the serverless SQL pool is that if you have some Spark background, or you like to write PySpark or SQL from the Spark engine, then the data that is available on the data lake via the Apache Spark tables can also be read with T-SQL queries from the serverless SQL pool. How does it work? In the same Synapse Analytics environment we have Apache Spark for Synapse and the serverless SQL pool, and they use the same shared metadata store; the tables are just synced between the two different components, so you are able to query these Spark tables using T-SQL queries. And why is that good? I can grab data from CSV files and Parquet files which are available on the data lake, but I can also grab it from Spark tables on the same data lake, using T-SQL queries, creating some nice solution — for example a star schema — and then I am able to expose this star schema towards Power BI, Excel or some other BI tools. So I believe that's why this serverless SQL is a really important technology within the whole Synapse environment. As I mentioned, it supports T-SQL queries; most things are, interestingly, supported, but there are some limitations, and you will understand why. The most important limitation is that it does not contain data — it doesn't contain any data at all — so you are not able to create tables; rather, you are able to create databases, and within a database you are able to create views, schemas and stored procedures; you can execute SELECT commands and you can execute DDL statements, but you are not able to store data within this serverless SQL pool. Which means we don't have tables, and we are not able to cache query results at this moment: if you execute a complex SELECT command that runs for, say, ten seconds, and then you execute it once again, it will again take ten seconds — and that matters because, as I mentioned, with a serverless SQL pool you pay only for the data processed, so if you execute something two times you will pay double the price. Okay, but the main message is that you are able to execute a lot of T-SQL queries against data lakes, and there are some similar technologies, so maybe this is not a brand-new idea for you: maybe you have heard about Amazon Athena or Databricks serverless SQL — these are very similar competitor technologies. I also have to mention that for this serverless SQL pool Microsoft created an absolutely new SQL engine called Polaris; it is a highly distributed, parallel processing engine, and at this moment it is used only in the serverless SQL pool.
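To make that pay-per-query, exploration-first workflow concrete, here is a minimal sketch of the kind of ad-hoc query the serverless SQL pool runs directly over a file in the lake — the storage account, container and file path below are placeholders, not anything from the actual demo:

-- Peek at a large CSV sitting in the data lake without downloading it;
-- you are billed only for the data this query actually processes.
SELECT TOP 100 *
FROM OPENROWSET(
         BULK 'https://mydatalake.dfs.core.windows.net/raw/sales/sales-2021.csv',
         FORMAT = 'CSV',
         PARSER_VERSION = '2.0',
         HEADER_ROW = TRUE
     ) AS rows;

The same pattern works for Parquet by changing FORMAT, which is what the use cases below build on.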
But maybe later it will be used for other technologies as well. So let's jump in and see what kind of use cases we are using serverless SQL for at our customers. The most typical one is that we have a data lake with different zones — because if you don't organize a data lake into zones it will become a data swamp sooner or later — and we just grab some data from one of the zones and try to reproduce this data in a completely different shape, for example a star schema model, a data mart, or something that serves some self-service BI purpose. For this reason I will show a short demonstration right away. I've got a big, flat, wide table in an Azure SQL database with 5 million rows. I'm using Synapse pipelines — in the background that is nearly an Azure Data Factory — so we just load the data into the data lake, into a raw zone. Then from the raw zone I move it to the enriched zone, which means I am converting from CSV to Parquet, I remove all the unnecessary columns, I convert the columns to the appropriate data types, and I enrich the data with some additional columns. So in the enriched zone the Parquet files are really good for any other purpose — for machine learning, or for creating a data mart, which is our purpose at this moment. Then we will move the data into a so-called curated zone, and in this curated zone I would like to create a BI sandbox and some kind of star schema — but I don't want to use Spark, and I don't want to use Azure Data Factory; rather I would like to use my T-SQL knowledge, the knowledge that comes from my on-premises or cloud SQL Server experience. Then, once we have this star schema — I will show how it looks — it will be exposed to Power BI, and you are able to create nice Power BI reports. This whole process can be done using the serverless SQL pool, because the serverless SQL pool is able to create such star schema models. First I just want to show the original data source: a flat, wide table with 5 million rows. Then let's jump into Synapse Studio. If somebody hasn't seen it, this is the unified user interface where you are able to see all your data, all the development artifacts like SQL scripts, Jupyter notebooks, et cetera, and there is an Integrate hub where I can see the different data pipelines that move and transform data between the different systems. At this moment you can see that I have one data pipeline with two activities. The first activity moves the data from Azure SQL to the raw zone in the data lake. In the second step I grab the data from the raw zone in CSV format and transform that CSV into Parquet format; besides converting to Parquet I select only those columns which are really needed — for example going from 14 columns down to 11 — and then I add two additional columns which enrich the data. Typically we add columns like which data pipeline run this row came from (for example a run id), and the date and time of that run. Finally we deliver this data as a Parquet file into the enriched zone. So using this one data pipeline I am able to grab data from a data source and load it into the enriched zone. And now we are at the point where, from these Parquet files available in the enriched zone, I would like to create a data mart, a star schema of the kind that can feed my Power BI reports. How can I do it using T-SQL? I just want to show some T-SQL scripts — this is the important part.
First of all, to use this serverless SQL you have to create a so-called serverless database. You can see that at this moment I'm using the engine called Built-in — this Built-in engine is the serverless SQL pool — and I'm using the cloudsummit-curated database. How do I make a new database within this serverless pool? I just go to the Data hub, click New SQL database, choose Serverless, and I can create as many databases as I would like to use. Okay, once I've got this database I am able to execute SQL scripts. First of all I have to define an external data source, which will deliver the data for me — as I mentioned earlier, in these serverless databases we don't store anything, we use only external data sources — so here is the data lake where the data will come from. Then, with this data source, I can execute a SELECT query like this: SELECT DISTINCT Region FROM OPENROWSET. This is the trick: within the OPENROWSET you describe where in the data lake the Parquet files are that are useful for us — in the enriched zone there are some Parquet files that were loaded as I showed earlier. If I just execute this SELECT, it takes a few seconds, goes to the data lake, processes the data there, and then gives back the results as if it were a SQL database; I don't see in the background how it communicates with the data lake itself. And because I can write this SQL query, from this query I can create a view, and then I am able to query this view using normal T-SQL queries. But what I don't like in this solution is that every time I want to grab some data from the data lake, I have to write OPENROWSET and define where the data is available, what kind of format it is, et cetera. So the next step I usually do is that instead of using this OPENROWSET I create an external table — and before creating the external table I create some file formats, to define as many things as possible up front. For example, instead of writing every time that I would like a CSV file with a comma delimiter where the data starts on the second row, I just define an external file format for such delimited files; or, if I would like to use a Parquet file which uses snappy data compression, I can define such a format, and then later I refer to this Parquet-with-snappy-compression format by its name. After these definitions I am able to create an external table. What is an external table? You can imagine it like a table schema over the data that is available in Parquet format; you can see here that we say: this is the data source, these are Parquet files, the Parquet files are available at this location, and I would like to see the content of the Parquet files through this table. Because I created such an external table, if I now want to reproduce the same Region dimension view, where I can show the different regions in this big database, it is much easier to read: originally it was a SELECT FROM OPENROWSET, now it is a SELECT FROM this external table, and because there is no OPENROWSET, no Parquet files, no file formats or anything else in it, the whole view definition looks like it belongs to a normal SQL database — so I believe it is much more readable for SQL people. Similarly we can reuse the sales external table for the Country dimension and all the other dimensions; a sketch of these building blocks follows below.
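As a rough illustration of those building blocks — not the exact script from the demo — the following T-SQL shows the shape of a serverless database with an external data source, a reusable Parquet file format, an external table over the enriched zone, and a dimension view on top; every name, path and column here is an invented placeholder:

-- Create a serverless (metadata-only) database and switch to it.
CREATE DATABASE CloudSummitCurated;
GO
USE CloudSummitCurated;
GO
-- External data source pointing at the data lake container.
CREATE EXTERNAL DATA SOURCE LakeSource
WITH (LOCATION = 'https://mydatalake.dfs.core.windows.net/datalake');
GO
-- Reusable file format for snappy-compressed Parquet files.
CREATE EXTERNAL FILE FORMAT ParquetSnappy
WITH (FORMAT_TYPE = PARQUET,
      DATA_COMPRESSION = 'org.apache.hadoop.io.compress.SnappyCodec');
GO
-- External table: just a schema over the Parquet files in the enriched zone, no data is stored here.
CREATE EXTERNAL TABLE dbo.SalesExternal
(
    Region      varchar(50),
    Country     varchar(50),
    OrderDate   date,
    SalesAmount decimal(18, 2)
)
WITH (LOCATION    = '/enriched/sales/',
      DATA_SOURCE = LakeSource,
      FILE_FORMAT = ParquetSnappy);
GO
-- A dimension view that now reads like plain T-SQL, with no OPENROWSET in sight.
CREATE VIEW dbo.DimRegion AS
SELECT DISTINCT Region FROM dbo.SalesExternal;
GO

Once those pieces exist, every further dimension or fact view is just an ordinary SELECT over the external table.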
I also want to show what the view for the fact Sales table looks like: you are able to write inner joins, GROUP BYs and all the typical things that we use in bigger SQL projects. So once you have created the first Lego elements — the file formats, the external data source, the external table — you are good to go to create views, and these views are exposed to Power BI, for example. How does that look? You start Power BI Desktop and you want to connect to the serverless SQL pool, because Ferenc created these nice dimension and fact views. How do you connect to serverless SQL? You go to the Azure portal, and on the portal you can see the serverless SQL endpoint; you copy this endpoint and then connect to it as a SQL Server — from the Power BI perspective it looks like a SQL Server, while in reality it is not a SQL Server, it is a serverless SQL pool — and then you are able to select which views you would like to import and create your nice Power BI report. I just want to show how it looks in Power BI: I imported all the data into Power BI, the relationships were found automatically, and then I can create a nice Power BI report — and this report is based on the serverless pool views, which are based on the data lake. When I loaded the data into Power BI, I loaded it through an interim solution — the serverless SQL pool — from the data lake itself. Somebody could ask whether it is worth playing with this serverless SQL pool at all: I could connect directly to the data lake and load everything from there. My experience is that if you would like to load something where you calculate something bigger, or do some joins or anything like that, the serverless SQL pool is a better solution for loading into Power BI than skipping it — it really performs well. Or — and this is possible because it looks like a SQL Server — you are able to use DirectQuery. For example, if the fact table doesn't have just 5 million rows as in my example, but maybe 100 million rows or even bigger fact tables, then you don't want to load everything into Power BI; rather, with DirectQuery, you grab the data on the fly. That is also possible; but if you use these views from Power BI with DirectQuery, and from the views you go to the data lake and do all the processing on the fly, it can lead to performance issues. So for such situations we suggest that, besides creating views, we always do another thing, the external table. Here an external table you can imagine like a data export, creating a snapshot of your data: whenever I execute this SQL script, I create a new fact sales table in the curated layer using Parquet format, and the same SELECT that defines the view is used to create this external table. So the fact view and the external fact table have the same SELECT, but the view is executed on the fly every time, while the external table's SELECT runs only once, when you create the external table, and it generates an output on the data lake. So instead of using the view, if you would like to use DirectQuery from Power BI through the serverless SQL pool, it is better to create this external table — Fact Sales would be an external table in the serverless SQL pool, and from Power BI you do DirectQuery against it; a sketch of such a snapshot follows below. I just want to show something — Ferenc, can you zoom in a little bit? Zoom — yeah, okay, better, thank you. Okay, so for example this Fact Sales is a DirectQuery table, and all the other tables are imported tables.
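Continuing the illustrative objects from the earlier sketch (none of these names come from the actual demo), this is roughly what such a snapshot looks like as a CETAS statement — CREATE EXTERNAL TABLE AS SELECT — which writes the result once to the curated zone as Parquet:

-- Materialise the fact query once into the curated zone, instead of
-- re-running the joins on every DirectQuery request from Power BI.
CREATE EXTERNAL TABLE dbo.FactSalesSnapshot
WITH (LOCATION    = '/curated/factsales/',
      DATA_SOURCE = LakeSource,
      FILE_FORMAT = ParquetSnappy)
AS
SELECT OrderDate,
       Country,
       Region,
       SUM(SalesAmount) AS SalesAmount
FROM dbo.SalesExternal
GROUP BY OrderDate, Country, Region;

Power BI can then point DirectQuery at the snapshot table while the small dimension views stay as imported tables, which is the split the session describes.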
Fact Sales is now an external table — and an external table you can imagine like a snapshot on your data lake after executing a SQL query. Using this external table with DirectQuery, you are good to go to handle not just 5 million records, as in my example, but many more millions of records from your Power BI report. So, to summarize where we are so far: we have Power BI reports which use serverless SQL, and in serverless SQL there are views and external tables which grab the data from the data lake itself. The typical problem with a data lake is that it does not handle transactions: for example, if you execute something and it fails, it can happen that the written file is incomplete and corrupted; or if you would like to delete one row, that is not possible in a normal data lake — rather, you have to create a new file and overwrite the old file with a new one where the deleted row is no longer present. Several issues like this are well known in the data lake world, and that's why, some years ago, Delta Lake started growing, and this year it reached general availability. Delta Lake is a solution that brings, within the data lake itself, something like the transaction handling and ACID properties we know from relational databases. Delta Lake is very similar to a normal data lake — it has Parquet files — but it also has some descriptor files that handle the transaction log. So the first question is: if I have such Delta Lake files in my data lake, is it possible to query this Delta Lake from serverless SQL? The answer is yes, and I will show very quickly how it looks. The same CSV file that is available in the raw zone I will load into the enriched zone as a Delta Lake, and in the Delta Lake I just remove some columns and some rows; then the same Delta Lake files will be read from the curated zone and exposed to Power BI, similarly to use case one. How does it look? Going back to Synapse: in Synapse I have a data pipeline — I'll show it very quickly — grabbing a file from the raw zone, selecting the appropriate columns, adding additional columns, and finally delivering it as a Delta file. And if it is a Delta file, I want to show how it looks in the lake: in a Delta environment you always have a folder called the delta log (_delta_log), and this folder contains the so-called transaction log — what the transaction log is in a relational database, this is the equivalent solution in the data lake. I have another data pipeline, and if I execute it, it will delete all the rows from this Delta Lake where the country equals Hungary, so no rows remain in this Delta Lake where the country is Hungary. I want to show how it looks from serverless SQL: when I run a SELECT before executing the deletion, I see that in the country dimension table I have one row for Hungary, and the country count is 67; on the other hand, after the deletion — after the delete data pipeline has executed — I run SELECT * from the country Delta table again, and I don't see any row in this country dimension where the country equals Hungary. So the data is deleted. What does that mean? Delta Lake works. And the other thing is that from serverless SQL you are able to execute T-SQL queries against the Delta Lake files, and I'll show how it looks in serverless SQL code — it is very, very similar to the Parquet case, except that the file format is DELTA rather than PARQUET; the sketch below shows the idea.
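A minimal sketch of what such a query can look like — the Delta folder path is a placeholder, and the point is only that FORMAT = 'DELTA' makes the serverless pool honour the transaction log, so deleted rows really disappear from the result:

-- Query a Delta Lake folder directly; the delta log is read automatically,
-- so rows deleted by the pipeline no longer show up here.
SELECT Country, COUNT(*) AS RowsPerCountry
FROM OPENROWSET(
         BULK 'https://mydatalake.dfs.core.windows.net/datalake/enriched/sales-delta/',
         FORMAT = 'DELTA'
     ) AS rows
GROUP BY Country
ORDER BY Country;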
Again, I just created an external table where I'm using the Delta file format, and then, very similar to the previous case, I'm just executing SELECTs which create the views, and based on the views I can create nice Power BI reports, just like in use case one. So the only message here is that Delta Lake can be used within Synapse, within Microsoft environments, and the serverless SQL pool supports this format. So if you are thinking about something like a data lake but you would like to delete or update something, modify data or need transaction handling, then you should use Delta Lake, because Delta Lake is supported by the serverless SQL pool. The next use case I would like to highlight is the Spark tables I mentioned earlier. Because the Spark tables and the Spark metadata information are shared between serverless SQL and Spark, it is very simple to create a logical data warehouse over a Spark database, and I will show it very quickly — again, it will be very similar to the previous one. Now I don't have to define anything special: I have one Spark database — this is its name — and from this Spark database, from this schema, I can grab data from a table; from this table I can create nice dimension tables, and once I have a dimension table I can similarly create fact tables. Then I have the star schema, and again it is exposed to Power BI and I can create Power BI reports over Spark tables. Okay, so this is the third situation where I can grab data: the first was CSV and Parquet files, the second was the Delta Lake, and the third is the Spark tables; a quick sketch of reading a Spark table follows below.
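As a rough, hypothetical sketch of that shared-metadata scenario (the database and table names are invented, and assume they were created from a Spark notebook in the same workspace), reading a Spark table from the serverless SQL pool is essentially just a three-part-name query:

-- 'salessparkdb' is a Spark database and 'sales' a Spark table created from a notebook;
-- the shared metadata store exposes them to the serverless SQL pool automatically.
SELECT TOP 10 Country, Region, SalesAmount
FROM salessparkdb.dbo.sales;

-- Dimension and fact views can then be layered on top, exactly as with the Parquet external table.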
There is also some support for databases — unfortunately not the best, I have to say, at this moment, but as far as I know Microsoft is working on broader database support. Because it looks like a SQL Server and I appear to be writing T-SQL queries, it would make sense that I could run T-SQL queries against the dedicated SQL pool, since it is within the same Synapse workspace, or maybe run T-SQL queries against Azure SQL databases. Unfortunately, at this moment that does not work, but as far as I know Microsoft is really working on it, so in a while it will be available, and then you will be able to combine data from the data lake and from relational SQL databases and create nice Power BI reports over completely different databases and data lakes. Fortunately, there is some support for databases via Synapse Link: you are able to grab data from Azure Cosmos DB databases and Microsoft Dataverse databases, and over these databases you can again create dimension and fact views or external tables and expose them to Power BI — so you can create Power BI reports over Cosmos DB or over the Dataverse, and the interim layer is the serverless SQL pool. I just want to show quickly how it looks to create such Power BI reports, or such logical data warehouse prototypes, over Cosmos DB. If somebody doesn't know what Cosmos DB is: it is a distributed and highly customizable engine for storing not just structured but also semi-structured data. When you store data in Cosmos DB it is stored automatically in the so-called transactional store — like an OLTP application, it stores the data in a transactional way — but it is possible to enable, for Cosmos DB, that it duplicates the data and syncs some of it into a so-called analytical store, and from this analytical store, using this HTAP technology, you are able to execute T-SQL queries against the analytical store, creating in the curated zone, again, dimension tables and fact tables, and based on the fact and dimension views you are able to create Power BI reports — and again we use the serverless SQL pool to do this whole thing. How do you do it? You have to go to Cosmos DB and enable Azure Synapse Link for the whole Cosmos DB account, and you also have to enable the analytical store for each container — if somebody doesn't know what a container is, a container is something like a table or an entity within a database: in Cosmos DB there are databases, and within the databases there are containers, and a container is the entity or table. So you are able to say, granularly, "I would like to move this data to the analytical store", because once it is available in the analytical store it can be queried using T-SQL from serverless SQL. How does it look? I just want to show one script: again I write a SELECT * FROM OPENROWSET, and now we get the data from Cosmos DB — from a Cosmos DB database, from one of its entities — and that's all: again we are able to write T-SQL queries, create views or external tables, and grab data from these views, because we have this Synapse Link to the analytical store, which stores the data in an analytical format in Cosmos DB. So again, the serverless SQL pool supports T-SQL against this database; a rough sketch of such a query follows.
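A hedged sketch of what that OPENROWSET-over-Cosmos-DB query can look like — the account, database, container and key are placeholders, and the exact connection options may differ in your environment:

-- Read documents from the Cosmos DB analytical store through Synapse Link;
-- the serverless SQL pool never touches the transactional store.
SELECT TOP 10 *
FROM OPENROWSET(
         'CosmosDB',
         'Account=my-cosmos-account;Database=RetailDb;Key=<account-key>',
         Orders
     ) AS docs;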
And the fifth use case I will just show on the slides: if you have a Dataverse database — if somebody doesn't know what that is, when you create a Power Apps model-driven application, or you have Dynamics 365, in the background there is a Dataverse database — it is not so easy to connect to this Dataverse database, so grabbing data is not so easy, but it is now possible, using Synapse Link, to continuously export the data from the Dataverse to the data lake; and because it is continuously exported, the data is in the data lake, and you are able to do very similar things as before — again, run T-SQL queries against the data lake. So the final message here is that you are able to grab data from the data lake, from the Delta Lake, from Spark, from Cosmos DB and from the Dataverse, and create views, external tables and ad-hoc queries above these different data sources using the serverless SQL pool; and because we have views and external tables, and serverless SQL presents it all like a SQL Server, we are able to expose it to Power BI and create Power BI reports over this serverless SQL. One last thing I would like to show, and then I will finish my session: I highly recommend this serverless SQL for self-service BI, for BI sandboxes and so on, but if you start to work like this and you end up with tons of different BI sandboxes — one colleague has five, another colleague has ten — then later and later you will not know which BI sandbox holds the information you need for that ad-hoc report that has to be done within a few hours for your boss. So if you need some data governance, Azure Purview is Microsoft's answer for data governance nowadays, and the good news is that Purview is able to scan such Synapse serverless SQL pools and record what kinds of tables, views and external tables are available in a serverless SQL pool, and then you can use Purview for this. I will show immediately how it looks in Purview: when you start scanning with Purview, to see what kind of data assets are available, you are able to see that not just dedicated SQL pool databases but also serverless SQL pool databases are available for scanning, so you can pull metadata from a serverless SQL pool. If somebody hasn't seen Purview: this is the starting screen, you are able to browse assets in Purview, and you can see that this cloudsummit-curated database is a serverless SQL database; I can start to explore what schemas are available, and within the schemas what dimension views and fact views are available, or I can start searching for the data assets within the serverless SQL pool that contain the name "country", and then maybe I find this nice view which would be useful for my Power BI report. So later, if you have a lot of BI sandboxes and logical data warehouses built with this serverless SQL pool, Purview could be a good choice to connect — it can be your data asset management tool, your data governance tool. That's all from my side; I hope I managed to introduce this serverless SQL solution to you, because I believe that for data lakes — and later for databases — it is a good way to build your self-service BI solutions. That was a very, very detailed session, Ferenc — you must look at the comments, people have been asking a lot of questions. I know you might have a dry throat, but maybe let's try to answer a couple of them. The first one is by Sarvaran — I'm sorry for the name — he is asking: can you please share some of the best practices for using the dedicated SQL pool, maybe not all of them, maybe just the first thing you would do? Okay, so a dedicated SQL pool — formerly known as Azure SQL Data Warehouse — is a massively parallel processing engine, which means you have a lot of different nodes where you can execute data processing: you don't have to use one server, rather, like in a big network, you are able to distribute the execution. There are a lot of best practices; I highly recommend the Microsoft documentation on dedicated SQL pool best practices — when I'm teaching this topic I also refer to it, because it has a lot of good ideas. All right, let's quickly answer some more: can Synapse Analytics import data from xlsx files stored on a file system? Okay, so Synapse Analytics has a component called Synapse pipelines, which is essentially Azure Data Factory, and it supports Excel as an input source, so you are good to grab a local xlsx file from your on-premises environment, load it using Synapse pipelines into the cloud, into a data lake, and from that moment it is again the same game: from the data lake you do the transformations et cetera. So yes, it is supported. You're rocking, Ferenc — the questions won't stop coming: Delta Lake cannot overwrite Parquet files, is that correct? Does it apply to all of our files? Okay, so as I understand it — and I have to confess I'm not a Delta Lake expert — as I see it, there are a lot of Parquet files, and yes, it does not overwrite the Parquet files; rather, newer and newer files are created, and in the background the old files are removed after 30 days or something like that, so there is a retention policy; and there is a folder, the delta log, which contains the transaction log to handle all of this — these rows I updated, those rows I inserted — so it ends up working like a relational database. This is the way with the whole so-called lakehouse approach, which is delivered by
the databricks guys all right final two question of the day that i'm gonna throw at you the first one is can we have serverless sql and server required sql for one application at some point of time oh sir server required yeah i don't get that uh maybe comma you like to reframe it or maybe there's a typo let's take a final question from tread if it is saying do you consider this to be in the ms database or is it another product class i think he's talking about uh yeah snaps if i understand that the questions is it uh good to use database or not i believe now it is on the major level that it is good to start to work with previously one year ago two years ago i i was not at that point but nowadays i believe it is arrived to the appropriate stage so i could imagine production projects otherwise so looks like an essay says very well explained what is serverless uh thank you uh so he says more productive info let's see what else we have hasan uh husnan says thanks friends it was informative thank lot of comments coming in we'll have to stop it uh all right uh so okay what what the skisha says keshav says that was really deep dive session but i love each and every simplified detail although i missed quite a bit never mind keshav please register and we'll send you the recordings uh so yeah uh parents that was absolutely amazing session there's a lot of effort that you have added in the slide and there's so much you covered in just about 40 minutes uh thank you so much any final tip you want to plug in before we move to the next session i i would like to thank you for having me and overall in this whole cloud summit uh earlier isro something nowadays cloud summit a is event i just able to watch the videos only in the evening because during the day i have to work and and all the speakers are doing great job so and also you organizers so parents thank you i really appreciate you accepting meditation i know people have work to do but still they developed your time and they do the community contributions thank you so much friends please take care of yourself we would love to have you back uh very soon and yeah have a nice day ahead and thank you so much thank you bye bye all right with that we move to our next session we are already five minutes into her session but i hope she won't mind uh so our next session is by shwita shwata is going to talk about azure cognitive services on how you can detect and analyze face it's always very interesting as your current services are always very interesting uh and yeah without any further ado let's invite shawta to cloud summit 2021 hi welcome you're muted you are muted it's nice to see you again yeah i'm really happy to see you sweetheart uh i i like these live shows you know this helps me connect people although i'm sitting here in india you are in u.s so thank you so much for accepting the invitation and i'm really excited for your session because as i said azure community services are very interesting uh so i won't take much of your time please feel free to go ahead and share your screen and then you can get started sure and everyone who's watching please feel free to go ahead and ask a question in the comments using hash cloud summit and after that session we're gonna do uh a lucky draw uh from the comments live like from the comments and someone will cloud summit swag it i don't have your screen yet okay are you able to see now yeah i can see your inside stream yeah yeah now you move towards your slides everybody can see your slides and shiva next 40-45 
minutes is all yours okay so first of all thank you so much for giving me this owner to so that i can speak in front of you once again and i always enjoy talking in the conferences and c panna provides me a very good platform to work towards this and for more about me like i'm working on the community contributions it's been like many years now and right now i'm holding rank 27 globally on the c-sharp corner and had been c-sharp corner mvp for four times now and professional side i'm currently working with the microsoft as a v dash in azure core compute team so we recently had a power pack session and i hope most of the minds are still working on digesting that so thinking about that aspect i thought i will try to make this session pretty lightweight and very easy going and like everyone need not to think so much about it so if you have any question during this session please feel free to ask me at the end otherwise we can also have a look on my twitter account so let's quickly look at the agenda okay so i will start by setting up the context on what is face analysis and what are the daily challenges or the problems which we can solve it using the space analysis and the recognition techniques we will also have a look at what are the key face attributes which are making this face analysis possible and then we will have a brief discussion on what is your face api which is again an offering from the cognitive services side and we will also have a quick demo and we will see how we can call this face api using client library as well as the rest api and at the end we can have a walkthrough of us very interesting use case without writing even a single line of code so when it comes to phase analysis it all begins with face detection and face detection is one of the major areas which comes under like image analysis and what it does is try to identify a face human face in any image so best example i can take is from the security side is like every day we are using our laptops or the mobiles to unlock the screen but the face analysis is not restricted only to the device security but it also works very well with the public security wherein we can read the feeds of the surveillance cameras to monitor and the track the criminal activities which are happening around and apart from that we can think of banking transactions so there are a few banking transactions and the banking firms which doesn't allow anyone to proceed until the face recognition is done so there could be many more examples we can think of on the health sector side uh the very useful example i can give you is like to detect the diagnosis or we can say that if there is any human or the genetic disorders happen and due to that any facial expressions or the facial appearance has changed so there we can go ahead and use this face analysis as of now it is not that much into uh the industry but still it could be a very good scope for anyone who is looking towards the genetic disorder identification and recently i read a news in which a company is working on making a device which will be like very useful for the blind people to identify the emotions of the person they are talking with so let's say a blind person is talking with a person and that person smiles during the conversation so in that case device can detect the emotions or read the expressions and signal the blind people that oh using the vibration or any other way that oh this person is enjoying the conversation another interesting area is the marketing so in marketing lots of 
researchers are using the face analysis data to read the emotions like whether the customer is happy how the product is doing and here we have noticed that there are many cameras which are running 25 24x7 in the supers supermarkets and many stores so they keep use to track the we can use those cameras or the videos to track this by reading the individual frames out of that and it will tell the product owner or the store owner that this particular person or these many number of people were not happy while leaving the store so this is how they can work towards the improvement of their store as well as the product another interesting thing is to we can use it to track the attendance in the school university or even an employer can do so and last but not the least image content analysis is one of the most common common among all of these now basically in a nutshell we can say faces are everywhere wherever you see you will see faces so and we human are lucky enough to detect any face with blink of eye but is it that easy for a computer to recognize that is it that easy to compute for a computer to identify an image that oh this is a human picture or this is a face of a human or another example we can say is is it that easy for a computer to review and say that okay these both pictures belong to same person i wouldn't say yes here because deciding such things involves so many calculations so many parameters which needs to be read and for that computer needs extra intelligence to it so to add that extra insulins definitely a i will come into the picture and when there is ai ai model and air models and the algorithms play a very vital role there now very interesting question is it that easy to build the ai model may i being a developer go ahead and quickly do it i wouldn't say yes because building a ai model requires lot of knowledge with respect to the data with respect to the many parameters with respect to the domain and when we are saying that i can do i wouldn't say it's hundred percent impossible but it won't be that much effective rather a research scientist or a data scientist working on the same thing because scientists do have a knowledge of how to analyze parameters how to deal with thousands and thousands of parameters in one shot how to conclude the result so all these things requires a very good person having educated towards that artificial intelligence side now question is can i do this of course i can do this because now we have azure cognitive services for a rescue and using those as your cognitive services we need not to work or think that much about what is going behind the curtain or what all parameters are required or what all algorithms are being used so if you can have a look there are around 27 landmarks which face api internally uses to detect a human face so if you look at an eye or a nose they're around six to seven points so landmarks are the points which can be easily recognizable on any phase so using these points many calculations many complex algorithms are being run and then it is saying that oh you can use this face api so now today being a developer without knowing anything about the machine learning ai algorithm still i can go ahead and i can quickly fire one http request and i will have my results ready with me so now face api like i said it falls under the umbrella of the cognitive services and it is like ai based service so what it can do is it can detect faces it can recognize as well as it can verify the faces for us so there is huge difference between 
detection and recognition. Detection only deals with identifying the face within an image, and only once detection is done do recognition and verification take place; unless the human face is identified, we cannot perform the recognition and verification activities. Recognition tells whether this face belongs to a particular person, or who this face belongs to, what the name of the person is, whose face this is. Verification, on the other hand, deals with comparison, with similarity: whether these two faces belong to the same person or not. These are the three different things we can do using the Face API, and for my talk I'm focusing only on the detection part.

With this much introduction, let's have a quick look at the demo. I have already created a desktop-based application, and I will browse for an image. Of course, this is my image, and you can see that as soon as I browsed it I received an output in the form of this red mark: the red square signifies the face identified by the API. If I hover over it, you can see that this image is of a female, with a smile score of 0.9, she's wearing glasses, and the overall expression is 98.2, which is neutral. Let's take one more image, this time with a group of people. In this image you can see that it detected three faces, there are three red marks, and for each of these faces we can see the properties by hovering over it. This is a male with a smile of 1.5; this 1.5 and the 98.3 are the confidence scores returned by the API.

I will quickly show you the code. We are not going through each and every line; I'll just focus on the key things. Here you can see the two libraries which are required whenever you are working with the SDK, or I'd say client library, and I pulled them in using the NuGet package manager. The next thing we need is the key and the endpoint. To generate those, we quickly go to the Azure portal and search for Face. Click on create, and here we can create an instance: select the subscription which works for you, create a new resource group or choose an existing one, choose the region which is closest to you, and supply a name for the instance. Then comes the pricing tier. There are two tiers: if you are just experimenting and nothing is in production, you can go for the free one, and the second is the standard one; each has its own number of permitted calls per second. At the end there is a checkbox; by ticking it you acknowledge that this particular instance will not be used for any police verification activities or to track criminals. Once all of these are populated, you can click on review and create. I'm not going to do that now because it would take a few minutes; instead I'll quickly show you the one I have already created. Once the instance is created you end up on a page similar to this, and on the left-hand side you can click on Keys and Endpoint. This is the key and this is the endpoint, which I have pasted into my code.
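For reference, here is a minimal sketch of what that wiring might look like, assuming the Microsoft.Azure.CognitiveServices.Vision.Face NuGet package; the key and endpoint values are placeholders rather than the demo's actual ones:

    using Microsoft.Azure.CognitiveServices.Vision.Face;
    using Microsoft.Azure.CognitiveServices.Vision.Face.Models;

    // Placeholder values -- paste the key and endpoint from the
    // "Keys and Endpoint" page of your own Face resource.
    const string key = "<your-face-api-key>";
    const string endpoint = "https://<your-resource-name>.cognitiveservices.azure.com/";

    // The client authenticates with the key and talks to the endpoint.
    var faceClient = new FaceClient(new ApiKeyServiceClientCredentials(key))
    {
        Endpoint = endpoint
    };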
Using these two values, I have constructed the face client. Now the important thing: let me show you how and where we make the call. This is the line where I'm calling the function DetectWithStreamAsync. This function takes a lot of parameters, but I have only used the bare minimum. The first and foremost parameter carries the data: the image in the form of a stream. The second parameter is a boolean that says whether you would like the face id to be returned or not. The third one says whether you would like the landmarks returned or not. And the fourth one, which is the most important, describes the attributes we are interested in. You may have noticed that during the demo we were only looking at four of them, gender, emotion, smile, and glasses, but there are many more: around 15 attributes you can track, such as age, gender, blur, mask, and so on. Whatever attributes you need to track, you just list them in this array and the API will return the response accordingly.

(I'm really sorry to interrupt, can you zoom in a little bit? Yeah, a little more. Perfect, thank you.)

Okay, so the rest of the code I won't go through, because it's housekeeping: how to convert the image into a stream, and what to do when the mouse-over happens. We can quickly skip it and move on to the next part, which is the REST API. For that you can quickly search for the Face API reference, because I am very weak at remembering URLs, so I personally do not keep them memorized. Click on the link and you will see a bunch of references for all the different functions and functionalities that are available. Whenever I work with a REST API, one thing I personally do is go to Postman, hit it, try it, and see how the API behaves and what it returns. But in the case of the Face API you don't even need Postman, because everything is built into the browser. You can see there are certain APIs for certain functionalities; these four are for identification, and the operations are segregated into detection, verification, and identification.

Let's quickly jump to the detection part. As soon as you click on it, you will see on the left-hand side all the possible APIs, and, one of the most important things, the regions in which this particular API is available. In order to try it out, you select the region which is closest to you, so I will go with West US. The very cool thing which I really like here is that you get the complete documentation: what this particular API does, what the parameters are, and what image format is expected when performing the face analysis. You can definitely go through all of it, but I would rather highlight the detection models in particular. As of now there are three detection models: detection_01, detection_02, and detection_03. detection_01 works only if you have images taken very close up, or, we can say, near-frontal face detection: the image should be pretty clear, everything clearly visible, and the faces inclined towards the camera. detection_02 you can use whenever you have side faces or somewhat blurry faces. And the third and most recent one, detection_03, is useful for detecting faces wearing masks, which we really need nowadays, so I will quickly experiment with that one and show you how it works. There are also recognition models, which I will not discuss in detail: there are four of them, and the fourth, the most recent one, provides results with the highest accuracy, so it is recommended to use the latest one.
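Pulling the client-library pieces together, roughly what the call described a moment ago could look like; a sketch rather than the exact demo code, with the file name and attribute choices as illustrative assumptions (exact parameter types can vary slightly between SDK versions):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using Microsoft.Azure.CognitiveServices.Vision.Face;
    using Microsoft.Azure.CognitiveServices.Vision.Face.Models;

    // Reuses the faceClient constructed earlier; runs inside an async method
    // or a top-level program.
    using Stream imageStream = File.OpenRead("group-photo.jpg");

    IList<DetectedFace> faces = await faceClient.Face.DetectWithStreamAsync(
        imageStream,
        returnFaceId: true,             // second parameter: return the face id
        returnFaceLandmarks: false,     // third parameter: skip the landmark points
        returnFaceAttributes: new List<FaceAttributeType>
        {
            FaceAttributeType.Gender,
            FaceAttributeType.Smile,
            FaceAttributeType.Glasses,
            FaceAttributeType.Emotion
        },
        detectionModel: DetectionModel.Detection01);   // clear, near-frontal faces

    foreach (DetectedFace face in faces)
    {
        // FaceRectangle is what the demo draws as the red square;
        // FaceAttributes carries the values shown on mouse-over.
        Console.WriteLine($"{face.FaceId}: gender={face.FaceAttributes.Gender}, " +
                          $"smile={face.FaceAttributes.Smile}, " +
                          $"glasses={face.FaceAttributes.Glasses}, " +
                          $"neutral={face.FaceAttributes.Emotion.Neutral}");
    }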
Okay, so here, this is the hostname, which got populated automatically based on the region we selected, and these are the various parameters you can hit and try; whatever you want, you can try. Let's keep most of them at their defaults, but I will change the attributes to mask, because I want to see how it works with mask detection, set the recognition model to recognition_04, and, you'll have noticed, select the detection_03 model, because I am going to input an image that has a mask. On the header side, the Content-Type, application/json, I will keep as is, and here we need a key. For the key we can go back to the resource, click on Keys and Endpoint, and simply copy key 1 (you can take key 2 as well, no worries at all). Once this is done, the next thing it expects is the image URL. For that I have already uploaded a few images to Azure Storage, so we can quickly go and grab those: go to containers, then images, and this is the image with a mask, so let's place its URL over here. Okay, we are good to go; let's click on send.

You can see that the response is returned. Because we selected only two parameters, face id and mask, we received only two things: this is the face id, and this is the mask attribute, which says that the person in the image is wearing a face mask and the nose and mouth are covered. So you have seen how easy it is to verify or try out these REST APIs right in the browser, rather than typing out the entire source code or using Postman to set up the headers and all those things.
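The same request can of course be fired from code as well; a sketch using HttpClient, where the region host, key, and blob URL are placeholder assumptions, not the values from the demo:

    using System;
    using System.Net.Http;
    using System.Text;

    // A sketch of the request the testing console sends.
    var http = new HttpClient();
    http.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your-face-api-key>");

    // Query string mirrors the fields chosen in the console:
    // only the mask attribute, detection_03, recognition_04.
    var requestUri = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect" +
                     "?returnFaceId=true" +
                     "&returnFaceAttributes=mask" +
                     "&detectionModel=detection_03" +
                     "&recognitionModel=recognition_04";

    // The body just points at the blob URL of the uploaded image.
    var body = new StringContent(
        "{\"url\":\"https://<storage-account>.blob.core.windows.net/images/masked.jpg\"}",
        Encoding.UTF8, "application/json");

    HttpResponseMessage response = await http.PostAsync(requestUri, body);
    Console.WriteLine(await response.Content.ReadAsStringAsync());
    // Expect a JSON array with one object carrying "faceId" and a "mask"
    // attribute that reports whether nose and mouth are covered.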
Next, I will take you through one of the interesting use cases. I was inspired to build this because my kid has just joined kindergarten. The idea is that the teacher wants to identify how many smiling faces are there in the classroom and how many faces are really upset. Nowadays there are many kids who are very excited to go to school, but at the same time there are a few kids who would still like to stay back at home with mom. One option is for the teacher to go and ask each and every individual: hello, how are you doing, are you enjoying my class, are you enjoying this activity? The problem with this approach is that there are a few kids who keep crying, or who are not comfortable expressing what they are feeling, so the teacher cannot rely on it. What she can do instead is take pictures at random points in time, during random activities, and later sit down and analyze those pictures: this kid is smiling in this activity, this kid is not enjoying it at all. Now we will see how we can automate this entire workflow without writing even a single line of code.

Let's go back to the Azure portal. I will build this workflow using Logic Apps; I'm not going to discuss the details of every parameter of a logic app, rather simply go ahead and create it. Here I am selecting a resource group which I already have, for the plan type I'm taking Consumption, and here I give the logic app a name; hopefully it is available. For the region, as I'm in Washington, I would go for West US. Let's not enable the analytics, click on review and create, then click on create; it shouldn't take more than a few seconds. Okay, go to the resource. There are many trigger templates defined here, but I would prefer to start from a blank logic app.

The first step is to read Azure Storage whenever an image is uploaded by the teacher. So let's search for the connector that works with Azure blob storage, select it, and select the trigger, which is "when a blob is added or modified". Here we need to provide a name for the connection, so I'll call it blobcon, and we need the storage access key, which we grab from the portal itself. I'll open another tab over here. It is expecting the name of the Azure storage account, so go to the storage account, grab the name, and I'm copying it over here. The next thing is the Azure storage access key, which is under Access keys; the keys are listed there, so I'll click on show and copy the first one. It then expects a few more parameters: the storage account name, which is the one we provided; the container we want to track, and fortunately there is only one container, but if there were multiple it would list them all; the number of items to return, so let's go with one for now; and how frequently to check, so I'll give five seconds, meaning every five seconds it will go and check whether anything new has been uploaded to or modified in Azure Storage.

Once this is done, I click on next step. Up to this point we have read Azure Storage; now we need to perform the face analysis. Let's search for the Face API and it will give you the connector; this is the one to select. Here you can see that all the actions are listed, they are in preview, and I will quickly select the one for detecting faces, so click on Detect faces. Again we need to provide a connection name, so I'll call it faceapicon, and we need the API account key. For the account key we can go to the Face API instance that we created earlier, click on keys, and copy it; this is the key. Next we need the site URL, which is nothing but the endpoint, so I will grab that as well and click on create. Next we need to provide the image URL; all our images are saved in Azure Storage, so let's go back to the storage account, click on containers, this is the images container I want to read, go to properties, and grab the path from there, removing the trailing part. We also need to say what this path refers to, since it contains a list of paths, so from the dynamic content we select "List of Files Path". Once that is done we can click on next step, or rather let's save it once at least. Okay, save successful.

So now images are uploaded and faces are detected; next we need to pull in the result of the analysis and push it to some storage where the teacher can go and have a look at it.
To make it simple, let's push this analysis result into an Excel sheet. For that, search for the Excel connector; you will get two of them, one for Excel Online (Business) and one for Excel Online (OneDrive). In order to use the Business one you should have an Office 365 subscription, so I will go with the OneDrive one. Let me open my OneDrive: this is my OneDrive, and I've already created a folder named summit that contains an empty Excel sheet; let's give it a few seconds to open. The teacher is only interested in knowing whether the kids are smiling or not, and if so, the gender specifically, so I have taken just these two parameters, gender and smile; the face id is just for tracking purposes, and you can definitely skip it. These are the four columns I have created, and I'm going to close this now.

Back in the designer, let's search for Excel Online. If you are selecting it for the first time, it will ask you to log into your OneDrive; I had already done that, so it didn't ask me earlier, but for the action "Add a row into a table" it is asking me now, so let's give it a few seconds. Okay, it signed in. Here we need to provide the file: under summit we have a file named results.xlsx, so I'm selecting that, and we select the table, which is the one I already created in the Excel sheet. Last, and most important, are the parameters: which values to map into the columns from the analysis results. Let's go through all four columns we created. For the timestamp you can map any time value; I will check the dynamic content and pick List of Files Last modified, which gives you the last modified time. Next we need the face id; you can see on the right-hand side there is already a Face Id field, so I'll quickly select it. Then gender, we select it, and then smile; if it is not appearing, we can search for it here and click on smile. Okay, we are good to go; let's save it and see how things behave.

The app is saved. I will go to the overview and keep this open so that we can track whether it triggers. Meanwhile, let's upload an image to our storage so that it can trigger the workflow. I will upload an image, one I took from the internet of a kindergarten class of kids, and click on upload. You can see the image one.jpg is uploaded, and now I need to wait about five seconds so that the flow gets triggered. Let's give it a few seconds. Okay, it got triggered and it is showing successful. To verify it, we can quickly go to OneDrive, refresh, and open the sheet. Cool, we have these six rows added recently, and I can quickly show you the image I uploaded. This is the image I just pushed to Azure Storage, and you can see there are six faces, and here we see six rows recently added. Out of all these kids, one kid is smiling a little less, but the other five are happy, I would say they're laughing, and that's why it is showing 0.98 for one of the kids, probably the first one shown here.

Now the teacher is free to use this data however she wants: she can perform statistical calculations, plot a graph, or do whatever she wants with it. In that case she doesn't need to go to each and every kid and ask how they are doing; she can conclude the status directly from this sheet. One more thing: you may get some other numbers, like 0.03 or 0.13, so the teacher can define a threshold, say that whenever the smile score is more than 0.5 she will consider those kids to be in the smiling category; those criteria she can decide herself.
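As a small illustration of that threshold idea, a toy sketch; the 0.5 cut-off comes from the example just given, while the sample rows are made up for illustration rather than values from the sheet:

    using System;
    using System.Collections.Generic;

    // Made-up readings standing in for rows exported to the Excel sheet.
    var readings = new List<(string FaceId, string Gender, double Smile)>
    {
        ("face-1", "female", 0.98),
        ("face-2", "male",   0.13),
    };

    const double smileThreshold = 0.5;   // illustrative cut-off

    foreach (var kid in readings)
    {
        string mood = kid.Smile >= smileThreshold ? "smiling" : "not smiling";
        Console.WriteLine($"{kid.FaceId} ({kid.Gender}): {mood} (score {kid.Smile})");
    }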
So I think I'm almost done. Okay, so yeah, that's pretty much all I have for you guys.

That was absolutely great. You made an entire project in just 40 minutes, right? That was absolutely amazing. You can come back on the stream. Oh, sure. Actually, I attended many sessions myself and decided that I would not pour in too many technicalities, because my head is also still digesting things and that takes time, so let's keep it simple with everyday examples.

There have been many questions throughout the live show, throughout your session, so let's take a couple of them. The first is by Kumal; I think she asked it while you were giving the demo: do we need to use all three detection models at once, or can they be used independently? Because sometimes that would be good for security purposes. What do you think? Actually, I tried to use all three together, but I couldn't, so I believe at a time you have to choose only one detection model, because the API takes it as a parameter. And if you are not passing the detection model, it will take detection_01 by default. Okay, makes sense. Definitely, there is no way to combine all three models.

Keshav is asking: can we assign more than 15 attributes? I haven't tried; the available attributes are all listed, so I'm not completely sure about that limit. Okay, I mean, 15 is enough, right? Yeah, I think the list shows at most 15.

Gabriel is asking, and thanks to everyone who is asking questions, really appreciate it, what language is best for building with these models? I think any language works, isn't it? There are many languages: you can go with Python, C#, Node.js, even Java SDKs are available, Go is there, so it's pretty much up to the developer, whatever they are comfortable with. But my personal choice, since I have been a .NET developer, would be C#, and second would be Python. Yeah, C# or Python; ML.NET is great too, I've seen people doing live sessions on ML.NET, so that is cool.

Wait, you work at Microsoft? I never knew that. I'm working as a v- (vendor) right now, still not a full-timer, but yeah, it's been close to two years now. Wow, I just checked, and I'm like, wow, Shweta works at Microsoft, I never knew it. And the team I'm working with is pretty amazing: it's only analysis, only data, and that's it; it's not like I'm writing full-time C# code there. Sounds very exciting.

Shivata Sawan has asked: what is the most challenging part of face recognition? You work with it, right, so what are some of the challenges you have faced in the real-time projects you have worked on? One thing is that you should be very cautious when selecting the detection model specifically, and there are images where not everyone is facing towards the camera; somebody is looking off to the side.
Like in my demo, I showed a picture in which there are three faces; two of them it identified correctly, but the third one, although he was a boy, it said she's a girl. So those uncertainties are still there. The recent detection_03 model was released in February 2021, and they keep working on improving these models, but I can still see a few results that are not up to the mark. Yeah, definitely, go ahead. That is one thing, and, like I said, you cannot combine multiple models, so you definitely need to play a trick: first try one detection model, and if it's not going well, or you're getting missing data or errors, then move on to the second detection model. You have to make those strategies yourself as a developer.

Yeah, that makes sense. I mean, even the data we feed machine learning models today, right, we humans are biased anyway. If I go and search on the internet, on Bing or on Google, whichever is your favorite search engine, and I search for "beautiful skin", it's very surprising: it will only show me lighter skin tones. That's true, that's true. We humans are biased anyway, and we give the models biased data, so we cannot expect a machine learning model to perform at its best.

Let's take the final question: can we use this service for voice recognition? Does Cognitive Services have voice recognition? Right, so there are five different tracks; maybe, if you can share my screen, I can quickly show it. Is it visible? Yeah, it's visible. Okay, so when we talk about Cognitive Services, not just the Face API, there are five different tracks: Vision, Speech, Language, Knowledge, as well as Search. Vision covers the Face API, emotions, OCR detection, all of these things, so you need to see where your scenario falls. For voice, it must be under one of the other tracks; I think for voice it would be Speech, it is there, right? Yeah, yeah, so we can go ahead and use that. Perfect, that's great.

Let's see what people are saying. Thank you, Shweta. Gabriel says thanks for answering; someone else just says "wow", I like that. But just one piece of feedback: was it too technical? Because I tried my level best to keep it very lightweight. Now you see, Kirby says that was a deep dive, whereas Ganesh says very simple, simplicity itself. Right, so it depends on whether people have some technical background; they'll feel, wow, this was pretty good, because as you said at the very beginning, you don't have to be a hardcore developer. Even if you just understand how to call APIs and work with them, you don't have to be a data-science developer, you don't have to create a machine learning model; just create a service and start using it. So I think it's a piece of cake and it shouldn't take long for people to get started.

Andrea says thanks, really good session; people have been enjoying it. Naveen says, love the presentation, learned something today, thank you. So I should keep taking such real-life examples rather than customer use cases? Exactly, you know, because when you go ahead and complete a project, it may take 10, 20, 25 minutes, and whenever people come back and watch it, it's a complete thing they can learn, so that really adds value. Someone is asking: is the Azure Face API free?
Well, like I said, both the free and the standard tiers are available, but there are limitations: you cannot go beyond a certain number of calls per month. Okay, so there are some limits. Right, and you will get to know them when you create an instance of the Face API; the dropdown shows whether you would like to go for the free tier or the standard S0 tier. Okay, makes sense. Thanks for the answer, and thanks for asking. A viewer says thank you, Shweta. Thanks, everyone. I can see a lot of comments coming in; I think the longer you stay, the more the questions will keep pouring in, so let's see. So many thank-yous. Thank you, thank you. Thank you so much, Shweta, that was absolutely amazing. Any final thing you want to plug before we move to the next session? Nothing much, but I would say that I was also enjoying today, the fifth day; most of the sessions were pretty good, and I hope to have a few more sessions together with you all. Yeah, thank you so much for taking time out of a workday; I know it's a Friday, so you definitely want to go and enjoy your weekend. Thank you so much, and we would love to have you back. Have a nice weekend. Ta-da, bye, take care. Thank you, bye-bye.

All right, with that we now move to our next session, but before we do, we still have about five to six minutes. I know our next speaker, Rod, is waiting behind the scenes, but Rod, hold on, we have five minutes before we bring you in; we're going to do a winner announcement. But even before we do the announcement, I think it's a good time to play a sponsors video, right? Then you'll all stay back to watch the winners. So I'll be back in a minute. [Music]

All right, I'm back, and now it's time to announce the winners for the live comments. All this time we have been asking everyone that whenever you ask a question to the speaker or share your thoughts in the comments, please use the hashtag #CloudSummit. And everyone who's watching on LinkedIn, please move to azuresummit.live now, because the stream on LinkedIn is going to stop midway; we can only stream four hours on LinkedIn and we are going to cross that during this session, and I don't want you to miss the following session. So please visit azuresummit.live to continue watching the live sessions; this session will end in just a couple of minutes. So let's see who's going to win it. For all of those who have been using #CloudSummit, it's time to do a draw, on the count of three, two, one, let's go. Who wins the Cloud Summit swag kit? Let's see who's going to win it. All right, Aswell Khan, you win the Cloud Summit swag kit. Please take a screenshot of this and drop us an email at simon@azuresummit.live, and we'll make sure that we ship you the swag kit sometime next month; this month is very busy, so we'll ship it next month. And everyone who's watching on LinkedIn, please switch over now, because the stream is going to end in…
Info
Channel: Cloud Summit
Views: 812
Rating: 5 out of 5
Id: yWTlqepoBV4
Length: 193min 14sec (11594 seconds)
Published: Fri Sep 17 2021