SAP Cloud Application Programming Model: Consume External Services

Captions
Hello, and welcome to this SAP Tech Bytes video. In this video we want to take a look at the SAP Cloud Application Programming Model "consuming remote services" demo from the SAP TechEd 2021 Developer Keynote. This is an opportunity to examine that short demo, which ran less than four minutes when delivered during the keynote, in much greater detail: to go behind the scenes of how things worked, including topics like authentication that we had to gloss over to fit the keynote's time limit. Consider this the expanded director's cut version of the demo.

The background is the same as in the Developer Keynote: the SAP Cloud Application Programming Model samples and other content usually focus on the ability to model and create OData services, but CAP also has some amazing functionality to help you consume services from other systems, or to create a mashup of local data and remote data fetched via a single service call. That's exactly what we want to do today: take a SuccessFactors OData V2 service, pull it into our CAP service so that we can call it remotely, and enhance it with additional data fields that we model and persist only in our CAP extension project. And finally, something we didn't show in the original Developer Keynote: we'll demonstrate newer CAP capabilities to also expose your service endpoint as OData V2, as GraphQL, and as plain REST.

Let's jump in and look at how this demo was built. Just like in the original keynote demo, we need to start by finding the details of the external API we want to call. In this scenario we want to consume a SuccessFactors Personal Information API and add some of our own fields to the new service. Those new fields will be persisted only locally in our CAP extension application, so we won't be sending the new fields' data back to SuccessFactors; we'll be combining SuccessFactors data with data that we persist locally.

As with any API-based application or extension, we begin at the SAP API Business Hub. Here I am on its start page. I make sure to log on, because that way I can retrieve my API key, which we'll need in order to make the service call. Once you're logged in, you can go up to Settings and click the button that shows your API key. I won't bring mine up right now, but later I'll show you where in our CAP project we use that API key to test against the API Hub in non-productive scenarios.

Once you've grabbed your API key, you need to find the specific API. We go to the SuccessFactors section of the API Hub, under APIs, and look for the Personal Information API; that's what we need to extend in this particular situation. I type in "personal" and there it is: Personal Information. The technical details show that it is an OData V2 API. Clicking on it brings up the API reference, and we can try it out from here as well: run a call in the sandbox environment and see some of the results you would get back from this API.

To start our actual work, though, we go back to the overview, open the API specification, and download the EDMX file. That's what we need to import into our CAP project to bring in the definition of this API, so we download it locally. And that's all we really need from here; of course, you could read more technical information about what this particular API supports, or follow the link to help.sap.com for more details, but we now have what we need to import it into our CAP project.
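Incidentally, the API Hub's sandbox try-out corresponds to a plain HTTP call carrying your API key. As a rough sketch (the URL pattern and the PerPersonal entity set are assumptions based on API Hub conventions; copy the real URL from the try-out page):

    # Hypothetical sandbox call; the sandbox authenticates via the APIKey header
    curl 'https://sandbox.api.sap.com/successfactors/odata/v2/PerPersonal?$top=2' \
      -H 'APIKey: <your-api-key>' \
      -H 'Accept: application/json'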
I've now returned to my development environment; in this demo I'm using Visual Studio Code for local development, but you could do all of these same steps in the very same manner in SAP Business Application Studio. It's nice that we have a choice of environments to work in.

I start by creating a new SAP Cloud Application Programming Model project. I open the command palette and start the template wizard, which shows some of the available SAP project templates: database development, Fiori development, a generic multi-target application generator, and so on. The one we want is the SAP Cloud Application Programming Model project type, so we choose that. Now we give it a project name; I'll just say "extension", though it doesn't really matter what you name your project. I'll include a CI/CD pipeline. We can add SAP HANA configuration later, but for now we'll just do local testing with the SQLite database. I choose an MTA-based deployment project, and I'll have it generate some sample files in case we need to quickly prototype something, although we won't really use them; we'll be adding our own content soon enough. Then I finish and let the project generate and reopen.

What we see is a new workspace with this project, including configuration for a potential user interface, database development, service-layer development, and service enablement. It has also put the bookshop sample content out here: an entity that ultimately defines a database table, and an OData service exposed from it. As I said, we won't use this bookshop content, and we'll get rid of it in a minute, but it shows you what you get from the generator. I don't need the extension recommendations; everything else is ready to go.

Now we want to import the OData service definition into our CAP project, and there is a cds command-line tool that helps us do exactly that. I open a new terminal and use the cds import command, giving it the location of the EDMX file I just downloaded. When we execute it, we see that it has imported the API definition, stored it in the srv/external folder, and also updated our package.json file.
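For reference, the import boils down to a single command; the EDMX file name and download path here are assumptions based on the step above:

    # Run from the project root; point at wherever the EDMX download landed
    cds import ~/Downloads/ECPersonalInformation.edmx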
Let's see what it added. Under srv we now have an external folder containing the service definition as a .csn file: CSN, or Core Schema Notation, a format the Cloud Application Programming Model can process internally. We also see that it altered the package.json file, which already had the basic CAP dependencies in it, by adding a cds "requires" section that names the service (you see the personal information service there), denotes that it is an OData V2 service, and points to where the model definition of that service lives.
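As a sketch, the generated entry looks roughly like this; the exact service name and model path are whatever cds import derived from the EDMX, so treat the names here as assumptions:

    {
      "cds": {
        "requires": {
          "ECPersonalInformation": {
            "kind": "odata-v2",
            "model": "srv/external/ECPersonalInformation"
          }
        }
      }
    }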
So we've added the external service to our project, but that's where the import ends. It hasn't adjusted our CDS definition files or added the service as an entity, because it doesn't know what we want to do with it. Do we just want to call it? Do we want to create a mashup, combining it with some local data? A mashup is precisely what we want, so rather than doing a lot of typing, I'm going to switch over to my already finished version of the project and walk you through all the manual changes I made to use this service and re-expose it as a CAP service with extended fields.

I've switched now to the final version of my project, which is the version available in the SAP-samples repository on GitHub. If you want to check it out and try it on your own machine, you can pull it down; pretty much everything I show here is available there. Beyond what we've seen so far, the project creation and the import of the external service, there are a couple of changes, or rather additions, that I made.

First of all, in package.json I added a few optional features that I'll show you later. By default, the Cloud Application Programming Model creates an OData V4 service, and it's pretty neat: we're calling an OData V2 service, as we saw in the API Hub, but when we re-expose our extended version, CAP automatically converts it to OData V4 and does all the translation in between. We'll see in a minute how simple it is to call the OData service from within CAP. I added the OData V2 proxy, so that from our single endpoint we can also expose OData V2. I also added a few optional dependencies for GraphQL (graphql-tools, express-graphql, and the graphql module); this is an experimental feature that became available in the November 2021 release of CAP, and I'll show you how easy it is to expose GraphQL as part of your service as well. I added the dotenv module, which will help with loading our API key, and a cors module so we can configure CORS, cross-origin resource sharing. That's particularly useful because I know the service will later be consumed by AppGyver, and CORS has to be configured before AppGyver can access our service. Having added these dependencies, I also ran npm install, so I have a full node_modules folder; be sure that even after project generation you run npm install to bring down all the dependencies.

The other thing I did is extend the cds configuration section. Some of this you don't need; for instance, I set the version of the UI5 preview tool you get when testing in CAP, and I turned on some optional features like GraphQL and the new REST adapter, really just to showcase some of CAP's newer capabilities. But the part that I think is really interesting, compared to what the initial wizard created, concerns the service configuration. Remember, when we did the cds import it created the EC personal information section for us, of kind OData V2, pointed it at the model, and that was it. No credentials were provided, and we need to provide credentials and the technical configuration for where to call that service: the EDMX we imported defines the service's interface, not the technical information about how to connect to it.

That's what I've added here, in two different configuration sections, and this is a neat feature: I can configure a completely different set of credentials for the development environment than for production. This way I don't have to maintain two versions of package.json or remember to change it; I define both, and when I deploy to Cloud Foundry or Kyma, the production version is used automatically. For production I use a destination: I configure the credentials with a destination name, which refers to an SAP BTP destination, and it's in the destination that I configure all the technical information about how to connect, what authentication to use, and so on. For development purposes I want to connect to the API Hub sandbox. Remember, back in the API Hub's try-out tool we could see the sandbox environment details and retrieve our API key; that's what we need to call the API from the API Hub. So here I've configured the URL to the API Hub sandbox, telling it to use no authentication scheme, because the API key serves as our authentication.
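Putting that together, a sketch of the extended entry might look like this; in CAP's configuration, bracketed keys such as "[production]" apply only under that profile, and the destination name and sandbox URL shown here are assumptions:

    {
      "cds": {
        "requires": {
          "ECPersonalInformation": {
            "kind": "odata-v2",
            "model": "srv/external/ECPersonalInformation",
            "[production]": {
              "credentials": { "destination": "sfsf-extension" }
            },
            "[development]": {
              "credentials": {
                "url": "https://sandbox.api.sap.com/successfactors/odata/v2"
              }
            }
          }
        }
      }
    }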
That's what I needed to extend in package.json. What did I need to change elsewhere? If we look at the database data model, I've actually done nothing with it; I've left it as is, with just the original bookshop sample. I could take that out, since I'm not using it, but it demonstrates that I've changed nothing at the data-model level.

I've only worked at the service level, where I created a service file to define the OData service I want to expose from my application. I call it the catalog service, and I say that I'm using EC personal information from that CSN file created via the cds import. Normally, when you create a service file in CAP, you point to your data model and import the entity definitions from there; here I'm importing my entity definitions from that external CSN import instead. So now I have a whole service definition, imported as "external", in my CAP application.

Next I create a new entity as a projection on the external service. I take the external entity and limit it down: it had a whole bunch of columns, but as part of the simplification of my particular service I only want a subset, such as first name and last name. I also rename columns: "initials", as the column was named inside the original service, becomes nameHeader, and the title becomes personalTitle. So not only can you reduce the service, you can also change the names of the attributes, or columns, it exposes. And finally, with a constant "as middleName : String", I extend the entity, adding a whole new column that was never there in the original service definition. I'm also telling CAP that yes, I do want to persist this entity locally in my extension itself. Normally an entity that is a projection would not be persisted as a table; here I persist it as a table so that I can store the key values and this middle name, but not all the other data. That is what lets me do a mashup, combining data from the remote service with locally persisted data.

You might also notice that I've taken this extended entity from my catalog service and re-exposed it as another projection, also called perPersonal, under a different service endpoint, a catalog REST service, and I've used the @protocol annotation to expose a pure REST version of the service endpoint. So right now, with this service definition, I have OData V4 and a pure REST version of the service. That's my model for the mashup service.
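A minimal sketch of that service file, assuming the entity, key, and renamed column names pieced together from the narration (the persistence annotation is the documented CAP way to force a table for a projection):

    // srv/catalog-service.cds -- a sketch; names are assumptions
    using { ECPersonalInformation as external } from './external/ECPersonalInformation';

    service CatalogService {
      @cds.persistence: { table, skip: false } // persist keys + middleName locally
      entity PerPersonal as projection on external.PerPersonal {
        key personIdExternal as ID,      // key column name assumed
            startDate,
            firstName,
            lastName,
            initials   as nameHeader,    // renamed from the original service
            salutation as personalTitle, // renamed title column (assumed)
            null as middleName : String  // new column, persisted only locally
      };
    }

    // Re-expose the same projection as a pure REST endpoint
    @protocol: 'rest'
    service CatalogServiceRest {
      entity PerPersonal as projection on CatalogService.PerPersonal;
    }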
Now I need a couple of pieces of code, because the generic handlers of the Cloud Application Programming Model don't know how to combine my local data with my remote data; I have to code that myself to make it work.

First of all, I need a server.js. That's the bootstrap of the whole CAP service, and it allows me to run code on startup of the entire service. For the most part, this is where I add extras: the OData V2 proxy that automatically serves an OData V2 version of my service, the CORS cross-origin support, and a health check. You could leave all of these out; they're optional features. Probably the most important thing, though, is that in a non-production environment I use the dotenv module to load configuration. All that does is look for a .env file, which you won't find in GitHub, because all it contains is my API key (blurred out here, since I don't want to give out my API key; you would go to the API Hub to look up your own). If you create a .env file with an API key entry containing your own key, it is automatically loaded into the environment, and in a second you'll see how we use it when we make our service call. This is only processed in a non-production environment; in production it wouldn't be loaded, because we wouldn't ship an API key as part of our project. We have it only for local testing: .env is excluded via my .gitignore, so it's never pushed to Git and never part of my project downstream; it exists only locally.
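A sketch of such a server.js, assuming the package names that were current in late 2021 and a hypothetical health-check path:

    // server.js -- custom bootstrap, as described above
    const cds = require('@sap/cds')
    const cov2ap = require('@sap/cds-odata-v2-adapter-proxy')
    const cors = require('cors')

    // Load the API key from a local .env file, but only outside production;
    // .env is git-ignored and never deployed
    if (process.env.NODE_ENV !== 'production') {
      require('dotenv').config()
    }

    cds.on('bootstrap', app => {
      app.use(cov2ap()) // serve an OData V2 flavor of the service at /v2
      app.use(cors())   // allow cross-origin calls, e.g. from AppGyver
      app.get('/health', (_, res) => res.send('OK')) // simple health check
    })

    module.exports = cds.server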
The next part, and this is the most important part, is to create a custom service handler for when the READ operation is requested on our perPersonal entity. This is like an exit: instead of the generic event handlers, we code and control everything that happens here, which allows us both to make the external call to the SuccessFactors system and then to combine the resulting data with our locally persisted data.

In the READ handler, we start by building the query we want to forward, and we can do that simply by taking the SELECT statement that was already passed in by the generic handler. The generic handler of the Cloud Application Programming Model still processes the incoming OData, REST, or GraphQL request, takes it apart, and formulates it into an internal query structure. So we're able to use the request's query: its SELECT from, its limit, its where, and its order by. CAP even does the translation between the column names we renamed and the original column names; it does all that work for us. All we have to do is assemble a new SELECT statement from the pieces of the original generated one. But instead of processing it against the original entity, we send it to the remote system, simply by connecting to EC personal information. That connect call goes back to package.json, looks up the definition and the connection information, and connects us to the remote service. We then send the request we built, passing the query through, and we can add headers to it; this is where we add the API key. Once again, the API key is only in our environment if a .env file was loaded by the bootstrap, which happens only in a non-production environment; otherwise this value would be null and we wouldn't pass an API key. It's only needed when calling via the API Hub.

We get the data back as a simple array; we don't have to worry about the fact that it was OData, since it has been translated back into plain data, either an array or a single record if it was a single GET operation. Next I want to load the additional data, the middle name, from the local persistence in my CAP application. I write a little mapping function that executes for every record we got back: for every item passed into the function, take the ID and form a SELECT from the local perPersonal table where the ID equals the ID of the record returned from the external service. As long as we got some data back, take the middle name from the local data and add it into the data set returned from the external service. Having built this mapping function, we can just say "await Promise.all" over the extension-data lookups, and for every record in our result set it performs this operation. It parallelizes the lookups, but it won't move on to the next statement until all the records are done; that's what Promise.all gets us, a very efficient, high-speed retrieval of the data in parallel.

Notice that we're able to use SELECT statements both for the external query to the remote service and for querying the local entity. That's part of the beauty of the Cloud Application Programming Model: we use its built-in query language, which is very SQL-like with its SELECT, UPDATE, and DELETE statements, and we don't have to worry about translating between different communication protocols. Finally, I have a little hack here. Unfortunately, with the way this extend-and-mash-up approach works, we don't get the $count filled in automatically, and Fiori UIs use that count for table pagination, so we need to fill it in ourselves; a little calculation here returns the count. And that's it as far as the custom logic goes. With this logic we now have a service that returns remote data mashed up with our local data.
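A sketch of that handler; the service, entity, and column names follow the narration and the CDS sketch earlier, so treat them as assumptions:

    // srv/catalog-service.js -- custom READ handler for the mashup
    const cds = require('@sap/cds')

    module.exports = cds.service.impl(async function () {
      // Connection details come from the cds "requires" config in package.json
      const sfsf = await cds.connect.to('ECPersonalInformation')
      const { PerPersonal } = this.entities

      this.on('READ', PerPersonal, async req => {
        // Forward the incoming query; CAP has already parsed the OData/REST/
        // GraphQL request into its internal query structure and mapped our
        // renamed columns back to the remote names. The apikey header is only
        // set when .env was loaded (non-production); in production a BTP
        // destination handles authentication instead.
        const result = await sfsf.tx(req).send({
          query: req.query,
          headers: { apikey: process.env.API_KEY }
        })
        const items = Array.isArray(result) ? result : [result]

        // Mash up: for each remote record, read the locally persisted
        // middleName and merge it in; Promise.all runs the lookups in
        // parallel and waits until all of them are done
        await Promise.all(items.map(async item => {
          const local = await cds.run(
            SELECT.one.from(PerPersonal).columns('middleName').where({ ID: item.ID })
          )
          if (local) item.middleName = local.middleName
        }))

        // The generic handler normally fills $count for UI pagination; since
        // we bypassed it, provide a simple approximation ourselves
        items.$count = items.length
        return items
      })
    })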
The only other thing I'd point out is that I can mock my local data for testing purposes. I can just create a CSV file named after the catalog service and the perPersonal entity and mock up some middle names, so I don't need a connection to a database, pre-filled with data or anything like that. I can test whether my mashup service works just by mocking this data with the CSV file, and the Cloud Application Programming Model will expose it automatically.

So let's test all of this together now with npm start, which runs our start script. Oh, no, I'm not in the right folder; let me switch to the project folder. There, now I'm in the right place, and npm start runs "cds serve" with mocks, which means it loads our CSV files automatically and mocks any entities it finds CSV files for. We see that it loads the bootstrap from server.js, that the OData V2 proxy is working (so an OData V2 version of our service is also running at /v2), and that it has loaded our external model and our local models using the service bindings from our configuration. Right now we're just doing local testing with SQLite, so it hasn't had to connect to a remote SAP HANA database for the local persistence. We see a successful connection to EC personal information, the OData V2 service running on the sandbox. We're also exposing the catalog service at /catalog, which is our OData V4 service, and the REST service at the catalog REST endpoint, the second one we defined; both use the same custom implementation exit. And we're serving GraphQL at the /graphql endpoint.

To test this, we can just click on localhost port 4004, where everything is running, and we see the different endpoints. Let's test our OData V4 endpoint first and look at the metadata of our mashed-up service: there it is, OData V4, with our service definition and the additional annotations and supporting information of the service. We can also add /v2 to the path, and now we have our OData V2 metadata exposed. So without needing a second service definition or a second service endpoint, our single CAP application can expose OData V2 and OData V4; it's the same mashup, the same exits, and it all works the same way.

Now let's test the data. We run our perPersonal entity, our extended mashup, and there we are: it has loaded all the data from the remote service, and for some of the records we see the middle name, the data we persisted in the CSV file. For record 100009, for example, we said the middle name was "test", and there it is in our final service endpoint. Start date, first name, last name, ID, nameHeader, and personalTitle all came from the remote service; we have a reduced set of fields with renamed attributes, and finally our extended column as well, using the data from our local persistence. We see the same with George, Sally, Luke, Sue, and Henry, but the rest of the records have no middle name, because we only persisted a handful of records.

We also have the full OData capabilities here. If I request the top 11 records with a count, to check our special logic that returns the count, we now see an OData count of 11 and only the top 11 records. Whatever OData V4 parameters you test with are processed and forwarded to the OData V2 service behind this, and CAP once again takes care of all the translation. If we go back and change the endpoint to the /v2 catalog, we have an OData V2 service, still calling the remote service and still processing the same exit (we didn't have to write the exit multiple times), but now formatting the output according to OData V2. So we support both of those from the same service endpoint.

There's more fun stuff, of course. There's the Fiori preview, running against the OData V4 service, and here we see it loading the same set of data; there's our set of results, and we can drill in and look at one of the records. That's the Fiori preview built into the CAP test framework, showing us our service. We can also test the REST version of our service: if I click on perPersonal under the REST service, it does all the same things, except that what comes back isn't OData, so we don't get the extra OData metadata or header information; the result is just a pure REST endpoint. The same exits are processed, as you can see, and the same remote service call is made. So now we've taken our remote SuccessFactors OData V2 service and re-exposed it as plain REST. Likewise, we added the OpenAPI functionality: this is once again our OData V4 service, but we have an entity data model and can use the Swagger OpenAPI test tool to test our new mashup service, very similar to the tooling we had in the API Hub for the original service, except that we're now testing our live CAP mashup service.

And then finally, the last thing I'll show you: we have an endpoint for GraphQL, and from here we have the ability to write GraphQL queries, with nice code completion for the service endpoints, the service entities, and the various columns.
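A hypothetical query of the shape used in the demo; the field names follow the CDS sketch above, and the exact schema the experimental GraphQL adapter generates may differ:

    # GraphQL query against the /graphql endpoint (shape assumed)
    {
      CatalogService {
        PerPersonal(top: 2) {
          ID
          firstName
          lastName
          middleName
        }
      }
    }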
We run that query, and there we are. Now, this is still experimental support for GraphQL; it's not finished, it's not perfect, and you'll notice that my "top 2" didn't actually get passed through for some reason. But it's still executing our exit, and it really shows the power of this approach: with one exit we're able to take input from GraphQL, REST, OData V2, or OData V4. CAP parses out all the query parameters and gives them to our code, our code forwards them to the source OData V2 service, and we don't have to do any translation or manipulation for the different protocols. Likewise with the results: we get the result data back, we enhance it with our local columns, and we don't have to turn it back into GraphQL, REST, OData V2, or OData V4; the Cloud Application Programming Model takes care of that for us.

I hope you've enjoyed this expanded look at the SAP Cloud Application Programming Model consuming remote services demo that was originally presented as part of the TechEd 2021 Developer Keynote. We've had a chance to go into much greater detail here today, and I hope it has given you some pieces you can build upon in your own journey with the SAP Cloud Application Programming Model.
Info
Channel: SAP Developers
Views: 696
Keywords: API Hub, Developer Keynote, EDMX, GraphQL, OData V2, OData V4, OpenAPI, REST, SAP CAP, SAP TechEd, Service Consumption, SuccessFactors
Id: rWQFbXFEr1M
Length: 38min 0sec (2280 seconds)
Published: Tue Dec 14 2021