Adobe Developers Live | Frontend Code Pipeline

Captions
Okay, let's get started. Hi everyone, and welcome. My name is Vlad Baylesco. I've been with Adobe for almost seven years, working as a software engineer on the Sites side, on HTL, the Core Components, and anything that enables people to develop faster with AEM. Together with me is Ivory, a cloud software engineer, and we're going to talk to you about something our teams have built: a way to deploy front-end code easily with AEM as a Cloud Service, something that's quick and simple to do. We're going to cover the way things work currently and what changes with the new front-end code pipeline, I'm going to show you a small demo, then Ivory will take you through some of the technical details and tell you a bit about how everything was implemented in the back end, and then we'll have some time to answer your questions.

Right now, if you want to use AEM as a Cloud Service, you have to set up your program and your environment, and you have to create a project, or get one created from the archetype. Then, if you want to do any kind of development, you need to install the local SDK, create your templates and content policies, and customize how your website looks. Everything needs to be integrated back into the client libs and deployed to AEM as a Cloud Service, which means pushing to the Cloud Manager Git repository and waiting for quite some time, usually over 45 minutes, to get your change live. And whenever you need to make one small change, for example tweaking the look and feel a bit, you have to go through all of this again and again and again.

If you are a customer and you want to create a website, you need to set up AEM as a Cloud Service, but then you need to hire a back-end developer who knows all the wild tools that AEM uses, like Maven, Java, OSGi and HTL, and they will be the ones doing your code deployments for you. Once you have some sort of website running, if you have a front-end developer who knows how to make your site all jazzy and nice, they will be working with things like HTML, CSS and JavaScript, and they will need to send all their changes back to the back-end developer, who integrates everything into client libs and does the deployment again, and hopefully your website gets updated.

So our idea was: what if you could just create and edit your website right as you start using AEM as a Cloud Service? You don't need to do any kind of deployment; you just start and create a website. Then a front-end developer can see what you've been working on and what content you've typed in, make some local changes, see them live, and when she's happy with the results, push the update directly to AEM as a Cloud Service. Your website is updated and everything looks good.

That led us to the new front-end code pipeline. It's a modern solution, based on the Jamstack model, and it deploys and serves static files from a CDN. To get this working and be able to create a website without having to deploy a project, you seed your website from a site template. The build pipeline is fast and very loosely coupled to AEM; it will use Cloud Manager, but for the beta and for our internal testing we've been using GitHub Actions, and it could basically be anything, because it's just a front-end build. Deployment is just about running the build pipeline, collecting all the output into a target tar.gz file, and uploading it somewhere.
The archive is then processed, and everything in it is made available on the CDN over a static domain as immutable files, and AEM is updated to reference your files from the new location.

If you look at the general model, AEM serves the HTML through the CDN. Once you have your front-end module sources, you can use them to get a live preview of how things would change if you modify something in the front-end module, but you don't need to have AEM running and you don't need an SDK; you can do a live preview on top of your existing live instance. When you're happy, you push your changes into a Git repository, and the pipeline will build your front-end module, collect all the outputs and put them somewhere, from where they get picked up by the front-end code deployment and made available in a blob store. AEM is also notified about that so it can start referencing them, and the files are then made available through the CDN under static URLs. Your JavaScript, your CSS, fonts and whatever static images you want to include in your front-end module will all be served from the CDN, and the whole thing takes only a few minutes.

So let me show you a demo. In order to not have something happen like it did for Facebook today, I've recorded it, but I'll walk you through it. When you have your AEM as a Cloud Service instance, you can go and create a site based on a template. You can import a template; we have one available, the standard site template, on GitHub. Then you type in a name for your website, you click Create, and your website is created, just like that. You can navigate your website: it's a full-fledged multi-language website with some pages, you can edit your content, and it has some basic styling, nothing too fancy, but it still offers some options. It uses the Core Components and the Style System, so you can already start putting in and scaffolding some content, even if you don't have any kind of programming knowledge, and it looks kind of decent, like a website.

When you want to customize it, you can download the theme sources, which is basically just a zip file containing a front-end module. It's just a Node.js project: you run npm install and then npm run build, and it generates all the files needed to make your website look nice: CSS, JavaScript and whatever other resources you, or the template creator, have defined. If we load this into an IDE, we can see it's just JavaScript and CSS, well, Sass in this case.

You can also have a live preview. You run npm run live, and it will build your front-end module and open a proxy to your original live website; basically it replaces whatever JavaScript, CSS or resources your original website has with the ones coming from your local build. You put in the page that you want to see through the proxy, navigate there, and it shows you the website. If you want to make a change, for example, you can change the background to something more jazzy, like pink. Once you've saved, it detects that something has changed, rebuilds the files and triggers a Browsersync reload, and you can already see in the proxy that the color has changed, and I think it looks great.
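As a rough illustration of what an "npm run live" style preview could do under the hood, here is a minimal sketch using Browsersync: proxy the live site, serve the local build, and rewrite references to the deployed theme assets so they resolve locally. The URL, paths and rewrite pattern are placeholders, not the theme's actual script.

```typescript
// Hypothetical sketch of a live-preview proxy; the real "npm run live" script
// shipped with the theme sources may be implemented differently.
import browserSync from "browser-sync";

const bs = browserSync.create();

bs.init({
  // Proxy the live AEM as a Cloud Service site (placeholder URL).
  proxy: "https://publish-pXXXX-eYYYY.adobeaemcloud.com",
  // Serve the locally built theme files under a local /theme route.
  serveStatic: [{ route: "/theme", dir: "dist" }],
  // Rewrite references to the deployed theme assets so they hit the local
  // build instead; the matched pattern is purely illustrative.
  rewriteRules: [
    {
      match: /https:\/\/static-[^"']+\/theme\//g,
      replace: "/theme/",
    },
  ],
});

// Watch the compiled output and reload the browser whenever it changes.
bs.watch("dist/**/*").on("change", bs.reload);
```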
So we should just commit this, add it to Git, and see how we can get it live. Getting it live is just a few steps: you add it to Git, you commit, you push, and once it's in there you can go and trigger your front-end build pipeline. For now I can't yet show you the Cloud Manager pipeline, because that's not yet publicly available, but for the beta, like I said, we use GitHub Actions, which basically just run the front-end build: an npm audit, installing the dependencies with npm ci rather than npm install (npm ci doesn't touch the package.json file, which leads to more repeatable builds), and then, at the end, npm run build. When that finishes, the pipeline collects everything that's in the dist folder, creates a tarball out of it and uploads it somewhere; from there it gets picked up, unpacked, and basically made available to AEM. So AEM, instead of referencing the default theme CSS and JavaScript, will now reference our new CSS, the one with the nice pink background.
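Just to make the collect-and-package step concrete, here is a small sketch of how the dist folder could be turned into a tarball. The file names and the use of the node-tar package are assumptions for illustration, and the real pipeline also uploads the resulting archive afterwards.

```typescript
// Hypothetical packaging step: bundle everything under dist/ into one .tgz
// artifact that the deployment can pick up later.
import * as tar from "tar";

async function packageTheme(): Promise<void> {
  await tar.create(
    { gzip: true, file: "theme.tgz", cwd: "dist" },
    ["."] // archive the whole build output
  );
  console.log("Created theme.tgz from dist/");
}

packageTheme().catch((err) => {
  console.error("Packaging failed:", err);
  process.exit(1);
});
```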
And that's it for the demo. I'm passing over to Ivory, who will tell you about the technical implementation and give you more details about how that works.

Hello everyone, and thank you for the wonderful demo; it's great to see the front-end code pipeline in action. So far we have covered what the front-end code pipeline is and what it delivers to the customer, so now we can talk a bit about the technical details of how we enable this feature. We can start from the user, the customer here. They make a request to AEM over a custom domain, or the author- or publish-specific one; it goes through the CDN network and reaches the AEM instance, which serves the dynamic content, the HTML files. As Vlad mentioned, we are following the Jamstack model here, and we serve static files from a separate domain, in this case one with a static prefix, which we call the static domain. The browser automatically goes and tries to fetch those files, so again the request goes through the CDN network and reaches the Azure Blob Storage where we actually host those files. We call it the workspace here, and it hosts the CSS and JavaScript files.

Now the task is: when a front-end developer makes a code change, we want to get it to the workspace. The way we do that is the developer executes a deployment pipeline, much as we just did with the GitHub Actions, which runs the npm build, generates a tar file, and uploads it to yet another Azure Blob Storage that we call the repository. So now the task is to get the tar files from the repository to the workspace and make them available so we can reference them individually. To do this we introduced a front-end code deployer component, FCDC, which talks with the repository and workspace blob storages, identifies the packages in the repository that we want to get to the workspace, synchronizes the data between the two, and uploads the files to the workspace. Once FCDC has uploaded these files to the workspace, they become available over the static domain, and the last step is to notify AEM about their availability so it can start referencing them.

We can go to the next slide and talk a bit more about what FCDC does internally. FCDC runs a reconciliation loop continuously. The first thing it does is reach the Kubernetes API server and fetch configurations and secrets; these resources encapsulate all the information FCDC needs to identify which front-end code pipelines to enable, which AEM instance to talk to, and so forth. Having grabbed all this data, the next step is to identify which packages need to go from the repository to the workspace. To do that, it first goes to the workspace and finds all the packages that are already installed there: it fetches a state file from the workspace, which holds all the metadata information we need. Then it does a similar operation on the repository side, where it fetches the packages that are available there; remember, these are the tar files, the source package files. With all this information in memory, FCDC has internal logic that decides which packages need to be uploaded to the workspace. These packages are processed concurrently: for each of them, we download it from the repository, unpack it locally, and upload it to the workspace. Once those individual files are uploaded to the workspace, they are already available over the network. We then update the state file in the workspace (the metadata we maintain so it's easy for the component to identify what is installed there), and the final step, again, is to notify AEM about these changes, which we do through a POST endpoint: we run a POST against AEM with all the packages we just processed. At that point, the job of FCDC is done; it runs the same reconciliation loop again, and if there are any changes that need to be made, it performs them. It just repeats these steps.

As you can see, this system is quite decoupled. We have two separate Azure Blob storages, one for hosting the source files (the tar versions) and one for hosting the workspace, the target files, and the component that handles the data synchronization between them, FCDC, is also separate. So we end up with a decoupled system that is separate from AEM, and that was the target we wanted to achieve.
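Since FCDC is an internal Adobe component, the following is only a hypothetical TypeScript sketch of the reconciliation steps as just described; every interface, type and function name here is invented for illustration and does not reflect the actual implementation.

```typescript
// Hypothetical shapes for the two blob storages and the AEM notification.
interface PackageRef { name: string; version: string; tarPath: string; }

interface Stores {
  loadConfig(): Promise<unknown>;              // configs/secrets (Kubernetes API in the real component)
  fetchWorkspaceState(): Promise<Set<string>>; // what is already deployed
  listRepositoryPackages(): Promise<PackageRef[]>;
  downloadAndUnpack(pkg: PackageRef): Promise<string>; // returns a local directory
  uploadToWorkspace(localDir: string, pkg: PackageRef): Promise<void>;
  updateWorkspaceState(done: PackageRef[]): Promise<void>;
  notifyAem(done: PackageRef[]): Promise<void>; // POST to the AEM endpoint
}

async function reconcileOnce(stores: Stores): Promise<void> {
  // 1. Pull configuration and secrets.
  await stores.loadConfig();

  // 2. Read the workspace state file and list the source packages.
  const deployed = await stores.fetchWorkspaceState();
  const candidates = await stores.listRepositoryPackages();

  // 3. Decide which packages still need to reach the workspace.
  const pending = candidates.filter(
    (p) => !deployed.has(`${p.name}@${p.version}`)
  );

  // 4. Process them concurrently: download, unpack, upload.
  const processed = await Promise.all(
    pending.map(async (pkg) => {
      const dir = await stores.downloadAndUnpack(pkg);
      await stores.uploadToWorkspace(dir, pkg);
      return pkg;
    })
  );

  // 5. Update the workspace state file and notify AEM about the new files.
  if (processed.length > 0) {
    await stores.updateWorkspaceState(processed);
    await stores.notifyAem(processed);
  }
}
```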
If we go to the next slide: when designing FCDC, we had some considerations we wanted to implement in the system. The first one was that we wanted a single FCDC instance to handle multiple deployment pipelines, and FCDC achieves this by using dynamic configurations. The first step in the reconciliation loop is to fetch resources from the Kubernetes API server, so we can dynamically add a new pipeline that we want to enable, or remove an existing one, and that takes effect almost instantaneously.

The next thing was that we wanted an advanced filtering mechanism to target the packages that we actually want to expose to the internet and upload to the workspace. By default, FCDC identifies the latest versions of the packages from the repository (the source blob storage) and uploads them to the workspace. However, if your source packages follow semantic versioning, you can use semantic version queries to target packages; for example, you can say you want to upload the 0.x version of a certain package, or use some range query for targeting packages.

Another consideration was that we want our deployments to be versioned, and we want them to be immutable. When we run the deployment pipeline, we want to make sure that the tar files we generate and upload to the Azure Blob storages are never overwritten, and that they eventually make it to the workspace. To do this, we adopted a unique version naming strategy: we combine the package name and the package version with the timestamp at which it was generated and the commit ID, and that gives us a unique name. Now, since we upload the same package to the workspace, we would be exposing that same name in the workspace, and we want to avoid this because it contains some sensitive information, such as the timestamp and the versions of the packages that you work with. To obfuscate that information, we take a SHA-256 hash of that name and use the hash instead. In the URI you can see here, css/theme.css is the path to your file within your package, and the prefix is the hash of that unique package name; that is what you will see if you inspect the HTML content that AEM serves.
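As a small illustration of that naming scheme, hashing the unique package identity could look like the sketch below; the exact separators, timestamp format and URL layout are assumptions.

```typescript
// Sketch of the obfuscated naming scheme: the unique package identity
// (name, version, build timestamp, commit id) is hashed with SHA-256 and the
// hash becomes the path prefix under the static domain.
import { createHash } from "crypto";

function workspacePrefix(
  name: string,
  version: string,
  timestamp: string,
  commitId: string
): string {
  const uniqueName = `${name}-${version}-${timestamp}-${commitId}`;
  return createHash("sha256").update(uniqueName).digest("hex");
}

// Example: this hypothetical package's theme.css would then be served from
// https://<static-domain>/<prefix>/css/theme.css
const prefix = workspacePrefix("my-site-theme", "1.2.0", "20211014T120000Z", "a1b2c3d");
console.log(`${prefix}/css/theme.css`);
```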
Finally, we also wanted to make sure that performance and memory consumption were in line. One of the goals of the front-end code pipeline is to accelerate the deployment of front-end code changes to production, and FCDC is performant from that aspect: it can process 1,000 unique packages within five minutes. What that means is that if you suddenly upload one thousand unique packages to your repository and you want them available right away, it will take less than five minutes for this component to get all those changes into the workspace and make them available over the static domain.

And then, lastly: so far we have been speaking about the repository and workspace Azure Blob storages, and we only ever upload files to them. The deployment pipeline is pushing tar files to the repository constantly, and the FCDC component is constantly moving files to the workspace, so at some point we want to clean these containers up, removing outdated packages that are not referenced anymore. We have separate systems in place that do a periodic cleanup of these containers, so the storage consumption of those Azure Blob storages is kept under control.

That is it for the design considerations for FCDC and for the system that enables the front-end code pipeline overall. The target of the pipeline is to accelerate the deployment of code to production, and we enable this through this system. I hope everyone enjoyed the presentation; thank you everyone for attending and for your attention, and if there are any questions, we will be happy to answer them.

There were a couple of questions, so let's go through them a bit; I saw that Hyman has already answered a few of them. People are asking if this could be used for an on-prem setup. You would have to create the same kind of architecture yourself; at the moment there are no clear plans to add support for an on-prem implementation, but we've seen partners and customers building similar things, and now you have the option to have this running for you automatically on AEM as a Cloud Service.

Another question, from Lamar: this delivery model covers support for static SPAs; will the front-end pipeline support a workflow where we are seeking to achieve server-side rendering with something like Next.js? At the moment this covers static front-end assets; I can't really comment right now on the future, but it's going to be interesting.

Again from Mainland: is there documentation to implement this process in self-hosted AEM? You would have to figure out how you want to do that based on the stack, the deployment and the infrastructure that you have there; what we built here plays nicely with the infrastructure that Adobe has in the cloud.

Another question, from Mir: does this mean we are getting away from client libs? We could, for some websites. You could still run some of the front-end stuff from client libs if you have some base client libs that you want to use. For example, the standard site template uses the Core Components, and those Core Components already come with a base client lib that offers some basic styling, and then the front-end team can add on top of that through the site template theme, which you can customize. Similarly, if you want to create your own site template at some point, or use one created by an Adobe partner, you could still have some client libs that get deployed classically, but then you would have just the framework part, some components and some client libs, and for each website you would have just a small, customizable package that changes the overall look and feel, more like a theme.

Wandering Finishes is wondering if this is an official Adobe development or a third-party tool, and would like to test it today: is it possible, and what are the requirements? Well, it's an Adobe development, built on top of AEM as a Cloud Service, so you would need an AEM as a Cloud Service program and you would need to join the beta program; it's still not available at large right now. You'll need to reach out to your account manager and see how you can join the beta program, or reach out to us directly, and then you can try it out.

Another question from Lamar: if you are using a single deployment model with AMS on AWS or Azure, Cloud Manager will be available, and by extension the front-end code pipeline will be available; is this correct? Right now it's only enabled for AEM as a Cloud Service, and I can't really comment on plans to extend it further to AMS.

Another question, from Matthias: how is this integrated with the back-end AEM code deployment pipeline? QA teams will typically want to test in stage the combined product, the back end plus the front end, which is then rolled out. Well, right now, when you do a front-end deployment, it's made available on an environment on both author and publish, and you can decide later on if you only want to make it available on author, for example, or only on stage. It's still going to be a Cloud Manager pipeline, and it will still run in two steps, for the stage and prod environments: it will first deploy to stage, where you can make it wait for manual approval or run your own custom front-end tests, and then allow it to pass on to production. And for production, if you are still unsure that you want to take these changes that quickly to the publish node, you can configure it to only update your authors, have another check there, and then, when you think it's okay, publish the change and have it live.
Another question, from Dale: is this pipeline assuming hosting via AEM and the AEM CDN for the public-facing pages you'd be deploying? It could work with different pages as well, but it works better if you're using AEM, of course, because it's tightly integrated, so you get the updates automatically when a deployment goes through. It also works with bring-your-own-CDN, in case you want to use your own CDN with AEM as a Cloud Service: you can configure the deployment to use your own domain name or a custom prefix there, and you can reference the files and load them from your own CDN if you want. That CDN will just defer to Fastly, but the content will be served from your own CDN.

In the Q&A tab I see a question from Yuval: are there any links to the configuration? We don't have the feature available for everyone yet, so there are no public documentation links yet, but when it becomes generally available there will be documentation that describes how it works; until then, the beta program will give you more information.

Cool, I guess that's it. We also have a dedicated forum, so if you feel the need to gather more information, send us a message there. Thank you, guys. Thanks, everyone.
Info
Channel: Adobe Developers
Views: 199
Keywords: Adobe IO, Adobe developers, Adobe CC, Adobe I/O, adobe.io
Id: OZn6UBfWKDY
Length: 29min 52sec (1792 seconds)
Published: Thu Oct 14 2021