Call Amazon S3 APIs with AWS SDK for Rust | Rust Programming Tutorial for Developers

Captions
Hey guys, my name is Trevor Sullivan, and welcome back to my video channel. Thank you so much for joining me for yet another video where we're going to be talking about the Rust programming language. This video will be incorporated into my playlist covering Rust programming, along with many other foundational concepts in Rust.

Now, this particular video covers a slightly different topic outside of just Rust: we're going to take a look at how to interact with the Amazon Web Services SDK for Rust, which is currently in preview. It's not quite in a general availability release yet, so it's not really production ready per se, but you can go ahead and start programming with it today and learn some of the different AWS APIs. I wanted to show you the high-level architecture of how the crates for AWS have been distributed and packaged up by the team running this project, and how to actually call one of the service APIs from a Rust application. If you're working with cloud platforms, especially Amazon Web Services, which is the largest and probably the most popular cloud vendor out there (they really defined what cloud is these days), this is a great platform to work with, and Rust is a great language to use in conjunction with AWS.

We're going to be talking about how to use Rust to interact with the AWS APIs, but there are other ways to use Rust inside of AWS. For example, you could deploy a Rust application onto an EC2 virtual machine, also known as an EC2 instance. You could also package up a Rust application inside of a Linux container image and deploy that to container hosting services like the Amazon Elastic Kubernetes Service (EKS for short), or other services like Elastic Container Service, or the Fargate deployment model, which allows you to create a container without having to deploy, manage, or monitor any of the underlying infrastructure that's running that container. In fact, Fargate is one of my favorite services in AWS, because if you have an application that's already packaged up inside of a container, you can just hand that container image off to Fargate and it'll run that containerized application for you. There are other ways to deploy Rust in AWS as well; you could also use AWS Lambda, for example. Lambda is a very flexible serverless capability in AWS that allows you to define a simple function and then execute that function in the Lambda runtime. I'm not sure if Rust is officially supported there yet or not, but Lambda does also support deploying serverless functions in a container deployment model, so I'm sure there's some way to get Rust running inside of Lambda if it isn't already supported; or it may be supported, just not in a GA or production release yet.

So we're going to be talking about the AWS SDK for Rust here, but before we jump into this topic, I wanted to mention a big piece of news: I'm actually recording this video, the very first video, in my new studio location. I've moved my equipment out of the little tiny bedroom I was recording in, and I've gotten a much larger space where I can record videos. This gives me a lot more flexibility to do things like setting up my green screen, my lighting, my camera, and all of the equipment necessary to record video training content. So this is a really exciting video for me, because it's the first one from my new studio, and that's a really big deal.
I would just be very thankful for all of your support as I continue building up this YouTube channel. If you do want to support the channel, I have some affiliate links down below that you can shop through; I actually have an Amazon store landing page, and if you purchase anything through that landing page, it goes towards helping support this channel and helping me bring you more video content on Rust, along with many other topics.

In any case, let's jump into the topic. The Rust SDK for AWS is an open source project under the AWS Labs organization, and we're going to take a look at how to include this SDK in a Rust project. They do a fairly decent job of documenting how to get started, although there are a lot of edge or fringe use cases that are not currently supported by the Rust SDK. If you're a customer that's currently in AWS and you're using something like single sign-on, that may have certain limitations, and other types of authentication methods may simply not work yet. So just be aware that this is still in preview; it's not production ready, and before it is, there are going to be some potential changes, possibly breaking changes, and there are going to be some features that simply aren't supported in the current version of this SDK.

Now, the way the crates have been segmented by the AWS Rust team is that there's a central crate called aws-config. Within the AWS platform this might be a little ambiguous, because there's actually an AWS service called AWS Config, which is a compliance service that also does inventory of the resources you've deployed into your AWS organization and your individual AWS accounts. But this aws-config crate is not referring to the AWS Config service; it's the shared crate used as the core of all applications incorporating the AWS SDK for Rust. In any application that's calling out to AWS APIs, you're going to see this aws-config crate being imported, and then any specific services within the AWS portfolio that you need to interact with from your application are separate crates. In their example in the README, they import the DynamoDB crate, and you'll see that there's a standard prefix on all of the individual service crates: aws-sdk-, followed by the service name. So just keep that in mind.

We also have a dependency here on Tokio: pretty much everything in the Rust SDK for AWS is asynchronous. I've actually got another video that talks about asynchronous programming specifically with the Tokio executor for Rust, so hopefully, if you've watched that video, you have a fairly good understanding of how Tokio works and how async works in Rust, and you'll be able to leverage that functionality in the AWS SDK as well. Just be aware that adding Tokio as a dependency is necessary.
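As a rough sketch of how those pieces fit together, modeled on the README's DynamoDB example (hedged, since the preview SDK's exact signatures and accessors may have shifted between releases):

```rust
use aws_sdk_dynamodb as dynamodb;

#[tokio::main]
async fn main() -> Result<(), dynamodb::Error> {
    // the shared aws-config crate resolves credentials and region settings
    let config = aws_config::load_from_env().await;

    // each AWS service gets its own client, built from the shared config
    let client = dynamodb::Client::new(&config);

    // every SDK call is async: build the request, then send() and await it
    let resp = client.list_tables().send().await?;
    println!("tables: {:?}", resp.table_names());
    Ok(())
}
```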
Now, as with any application that talks to AWS, regardless of whether it's Python, JavaScript, Go, PowerShell, or anything else, we have to provide a set of security credentials that grant access to AWS services. The way we do that is by creating an IAM user in AWS, which generates an access key ID and a secret access key, and we can use those two credentials together to call various APIs. So we're going to need to generate those credentials and configure them, either by using the default credentials file or by using environment variables, which is just another way to feed your credentials into an application. There are some other ways you can retrieve credentials as well, depending on your deployment: if you're deploying containers into the Fargate service, you can have a task role attached, or with EC2 you can attach what's called an EC2 instance profile, which is basically just a special IAM role with a set of permissions that's directly attached to an EC2 instance, so you can call out to APIs without having to hardcode any credentials. But when you're working on the client side, away from AWS, like on the developer workstation I have in front of me here, I do need to generate some static credentials in order to authenticate to my AWS account. We're just going to use environment variables, because that's how their example code loads credentials in. As you can see, the aws-config crate provides this: we just have load_from_env, which imports the credentials straight from environment variables, and that's about the quickest way to define those credentials so you can move on to actually calling the various APIs for AWS services.

Now, their example code uses the Amazon DynamoDB service, which is a key-value storage database hosted in the cloud. It's a fully managed service; there's no infrastructure to manage or monitor. You literally just create a table, and you can start adding key-value pairs or documents into DynamoDB. But I'm actually going to deviate a little bit from their examples and use Amazon S3 instead, because Amazon S3 object storage is a very popular service and a really great core API to learn, since many other services interact very closely with S3. AWS Lambda, for example, can be triggered based on certain events occurring inside of S3. So S3 is a really core, really essential service, and if I recall correctly, I think S3 (or maybe EC2, I can't remember which) was actually the first service created in AWS, way back in the 2006 time frame; I think S3 might have been first. So again, it's a very core service, and it's essential to know.

This GitHub page is the open source repository for the SDK, but they also have a landing page: if you go out to your favorite search engine and do a search for "AWS SDK for Rust", you'll see it, and if you click one of the links there, it'll take you out to the documentation. They have a separate documentation landing page that goes into more depth on the Rust SDK, and it follows a similar layout; the format, the colors, and everything very closely resemble the structure they use for other documentation in AWS. So whether you're reading developer documentation for S3, EC2, or Lambda functions, all of it follows this standardized format. What we want to do is take a look at getting started, which shows how to set up our credentials and our cargo project for Rust and plug in some sample code. Again, their example uses DynamoDB, but what's really cool is what you'll find under the code examples.
If you go down under code examples and take a look at actions and scenarios, they actually break down code samples for specific actions, specific API calls, for many different services. Of course this isn't comprehensive, but it does contain code samples for many popular services in AWS, like EC2, Elastic Block Store, Amazon Cognito for identity, API Gateway, Amazon Aurora (a managed database service they provide), AWS Lambda, Route 53 DNS, and Amazon S3. If we drill into one of these and look at some of the supported actions, you can see: okay, here's how to create a bucket; or, say I want to delete an object out of a bucket, here's delete an object; or, say I want to put an object into a bucket, well, here's how to upload an object from the local file system into a bucket. It's all broken down into those individual types of actions you can perform, and then it's up to you as the creative Rust developer to mash all of these different actions together to build some kind of meaningful application and store some meaningful data inside of the S3 service. There are a lot of interesting APIs you can call here, and once you start to practice and learn how the different APIs work, they start to all look and feel at least somewhat similar. There is a lot of variance in terms of parameter naming and the object structures you get back from responses and feed into requests, but just play around with the APIs, especially if you're new to AWS, and over time you'll start to see those consistencies and inconsistencies.

One thing I wanted to point out as well: oftentimes when people talk about programming against AWS, they'll be talking about programming against the AWS REST APIs, but it's pretty rare that somebody actually programs directly against the REST APIs. Typically a developer using AWS services is going to use the official SDK for whatever language they're using. So if you're a PowerShell developer using AWS services like DynamoDB or S3, you're not going to program against the REST API endpoint directly, because there are a lot of implementation details that go into that; you're almost always going to use some kind of SDK. The reason I call that out is that AWS, and really all of the main cloud vendors like Google Cloud Platform, Amazon Web Services, and Microsoft Azure, differ here because they're such massive, broad, and deep platforms. A lot of smaller cloud vendors like Linode or DigitalOcean will just expose a REST API and make it very clear how to call it, so you can call those REST APIs directly. But when we talk about programming the AWS REST APIs, we're actually talking about using the SDK, the libraries for a given language. I just wanted to call that out so you don't feel like you have to write all of the boilerplate code you'd need to call the REST API directly using an HTTP client library for Rust.

All right, let's go ahead and jump in and write some code. I've got my Rust virtual machine here; it's running on one of my LXD servers. I have earlier videos (I think it's the very first video in this Rust programming tutorial playlist) that talk about how to set up your Rust dev environment.
We're just going to go ahead and create a new project here. I've got cargo installed, so we'll say cargo new and create a project like s3-sample. That creates a new binary application, and we can go ahead and open it up: I'll do Ctrl+K Ctrl+O, search for s3, and open that directory in VS Code. This is a boilerplate sample, so we just need to get rid of this println! statement; we don't need that. We also need to make sure we install the dependencies for our application. As the documentation indicated, we need to import a few different libraries. I'm going to skip over the credentials briefly, and for starters we'll add the aws-config crate; again, that's not the AWS Config service, it's the generic crate we use to do AWS authentication. We'll say cargo add aws-config, and that imports it into our application. Then we'll import the S3 crate: we want to go out to crates.io and search for AWS S3, and hopefully we can find aws-sdk-s3. As you can see, this follows the standard prefix, aws-sdk-, and then the service name. So we'll say cargo add aws-sdk-s3, and that brings in the S3 service APIs.

It's a really good idea to have the libraries, or crates, structured this way, because we don't need access to the vast majority of services like EC2 and EBS and Lambda; we're not using any of those, so we only have to import the crate for the specific service we're actually calling out to. It's a really nice way they've structured the crates: there isn't some massive monolithic crate we have to compile and parse through whenever we compile our application, and it keeps our dependency chain much lighter than a big monolithic API would.

The other thing we need is Tokio. If you check out the docs, they use Tokio with the feature set of "full", so we'll say cargo add tokio --features full. That should take care of our dependencies, and as you can see, my rust-analyzer is doing some updates down here; it's basically analyzing the crates we've imported so it can provide IntelliSense, or auto-completion, for me directly inside of VS Code. Now, if we take a look at Cargo.toml, you can see we have our dependencies all imported right here, and it looks substantially similar to the structure shown in the documentation. We do have a newer version of Tokio: they're using version 1, and we're using the specific version 1.33.0. Of course, libraries get updated over time, so the documentation isn't always going to be up to date with the latest dependency versions unless it's some kind of dynamic documentation.
Now what we want to do is go up to our main.rs file and start building out our application. The first thing we want to do is import the aws_config crate, so we'll say use aws_config; note that it's not a dash when you import it, it's an underscore instead, so just be aware of that. We're going to use the load_from_env function, so we'll import that with a use statement, and then down here we'll say let my_config = load_from_env(). When we look at that load_from_env function, you can see it says impl Future, and that's an indicator to us that this is an asynchronous function. In order to actually execute the async function and retrieve the result, we have to use the await keyword. When you use await to call an async function, it suspends execution of the current task until that function completes, and then the value the function produces is returned back to the caller; in this case it's assigned to a variable called my_config. That gives us an SdkConfig object that we can then pass around to construct different clients, like an S3 client, a DynamoDB client, an EC2 client, an EBS client, and so on, depending on what service we're interacting with. So the first thing we do is load up our credentials, and then we can go ahead and work with services.

Now, you'll see we have a message here saying await is only allowed inside async functions, and that's happening because we have not annotated our main function as an async main function. What we need to do is specify the #[tokio::main] attribute, and then say async fn main. That turns our main function, our entry point, into an async function, and we're going to be using Tokio as the async executor runtime to invoke our entire Rust application. Again, I have a separate video that covers Tokio, so feel free to go back to that video and check it out.

So now we've got this SdkConfig object; what can we do with it? If we drill into the S3 crate, there's a client object we can use. One thing I like to do is, rather than having this really long repetitive name, aws_sdk_s3, use the as keyword to alias the import as a different name: use aws_sdk_s3 as s3. Now I can use s3 to refer to the crate called aws-sdk-s3, which keeps the references in my code to that crate much simpler, with a lot less typing. So now that I've aliased it, I'll just say s3, and then we can construct this client object: we have this Client struct, and if we construct a new instance of it, you'll notice it takes a reference to an SdkConfig object, which we have right here. So we'll pass a reference to the my_config variable, and that creates the new S3 client. I'll say let s3_client; that'll be my client variable, and it gets assigned the newly constructed struct. So now I've constructed this client from the credentials loaded from environment variables, which we'll set later on in our shell.
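Putting those pieces together, here's a minimal sketch of the skeleton described so far, assuming the preview-era crate layout the video is using:

```rust
use aws_sdk_s3 as s3; // alias the service crate so references stay short

#[tokio::main] // Tokio drives our async main function
async fn main() {
    // await suspends this task until the credential/config load completes
    let my_config = aws_config::load_from_env().await;

    // the S3 client is constructed from a reference to the shared SdkConfig
    let s3_client = s3::Client::new(&my_config);

    // ... S3 calls will go here ...
    let _ = s3_client;
}
```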
We can now use this s3_client object to call various functions. If I type s3_client and then a dot, we should see (if I didn't have this bug here in my editor) references to a bunch of different operations, or actions, that are available for the S3 service. I have things like create bucket, delete bucket, put object, and list objects; there are actually two different versions of list objects, ListObjects and ListObjectsV2; and there's a list buckets operation. S3 is actually a pretty complex service. It has a bunch of advanced features, like running inventory on buckets and spitting that out as a CSV into some separate bucket; there are bucket metrics; there's intelligent tiering to move your hot S3 storage into a colder storage tier to help save on storage costs; there's cross-origin resource sharing configuration if you're doing static website hosting from an S3 bucket; and there are bucket encryption settings. So there are just tons and tons of different operations available in the S3 service, and it's the same with any other service. If we went to the EC2 APIs, for example, there would be tons of different actions we could perform inside the EC2 service, like launching new instances, deleting EC2 instances, creating Elastic Block Store volumes, taking snapshots of block storage volumes, deleting snapshots, or creating new Amazon Machine Images so I can deploy new virtual machines from a custom machine image. The operations you have on the client really depend on which service you're specifically calling out to, and in the case of S3, we have a lot of things dealing with buckets, which are kind of the root object we deal with in S3, and then objects, which are the actual files we create inside a bucket.

So let's say we want to create a bucket in our account. Come up to the create bucket API call right here. When we call one of these operations, we're not actually invoking the API behind the scenes: if I were to run this code right now, well, first of all it wouldn't compile, but even if it did, we would not actually be creating a bucket, because we have not told the SDK to send the request to the S3 API yet. We have to build out the configuration of this request, and once we're done building all of the input parameters, we can call the send function on this fluent builder (you can see we've got a CreateBucketFluentBuilder here). So we basically call a bunch of methods to configure the request, and then call the send function, which is asynchronous, and that actually invokes the API to do whatever operation we told it we wanted to do.

Now, the parameters you specify on each of these operations really depend on the specific operation you're calling, and this is where looking at the raw REST API documentation can be really helpful. So let's take a look at the S3 API docs. If we head out to the raw Amazon S3 API reference and look at the various actions available, sure enough you can see that CreateBucket is one of those operations, and if we look at the request parameters, we can see whether each parameter is required: a yes or no value.
You can see the bucket name is really the only required parameter here; there are a whole bunch of other parameters, but those are all optional, so we don't necessarily have to specify them. Down here we also have the request body: we have URL parameters, and the bucket name is the only URL parameter that's required. Since we're using the SDK, you don't have to worry about whether something is a URL parameter or a body parameter; the SDK handles all of that for you. But we do also have to specify this thing down here called the create bucket configuration, with this location constraint, and this is just an implementation detail in S3. S3 is a little bit weird: even if you call the CreateBucket API against, say, the us-west-2 region, you still have to specify this odd input parameter called the location constraint, and set it to that exact same region again. So it is a little strange, but you do need to specify that location constraint if you're using a region other than Northern Virginia, us-east-1; just be aware of that. Then we get our response documented down here as well. My point is that by looking at the raw REST API reference documentation for S3, we get a good understanding of the request parameters, the request body, and any values we need to feed into the API call for it to be successful.

So what we're going to do is come back over here and say create_bucket; that creates the new request (it's not calling the API yet). Then we can say .bucket, and remember, the bucket parameter is actually the name of the bucket we want to create. I'm going to choose something kind of random, like saturn-jupiter-trevor; I'm pretty sure that doesn't exist, and hopefully nobody else has taken that bucket name already, so we'll use it. I also want to show you what's going to happen if we try to send a request without all of the correct parameters, so I'm going to go ahead and try to send this, and we're going to await the send operation, which invokes that async function. We can capture the result here, so we'll say let create_result equal the result, and if the result is successful, we'll get this CreateBucketOutput indicating that the creation of the S3 bucket actually succeeded.
So I'll say: if create_result is okay, then we'll println! "bucket was created successfully" (let me fix that spelling error there), and else we'll println! "error occurred while creating S3 bucket". Let's go ahead and save this. We could grab the error message dynamically as well, but I'm not going to worry about that for the moment; this is a really simple program that attempts to create a bucket and just prints a line saying whether it was successful or not. So let's see what happens if we try to run this program right now. We're not getting any compiler warnings from our linter, so we should be able to at least attempt to run it; let's actually do a cargo build down here and make sure everything compiles successfully.

While the program is building on my Linux virtual machine, I'm going to go over to the AWS Management Console, to the IAM service, because we need to generate a user account with credentials that have permission to the S3 service. Inside the IAM console we'll go to access management, then down to users, and we're going to create a new user here. I'll call this one s3-sample-delete-me, just as a reminder to myself. We don't need console access (I don't need this user to be able to log in to the AWS Management Console), so I'm not going to check that box. Then we'll say I want to attach a policy; this is the set of permissions this user will have, and I'll just say that this user can do anything inside of S3. That's the permission set we're going to have here, so we'll say create user. Once the user has been created, we'll generate some security credentials for it: right down here under the security credentials tab we've got access keys, and I'll choose to create an access key for some local code. I'm really not sure why they have multiple options here, because CLI and local code should essentially be the same thing. I'll say yes, I understand, set the description to s3-sample, and create the access key.

Now we're going to copy the access key ID and the secret access key and define those as environment variables, so our program can import them using the SDK's load_from_env function. For starters we'll say export AWS_ACCESS_KEY_ID; this is a special variable name that the SDKs for AWS across many different languages, including the AWS CLI tool, look for specifically. Then we'll paste in our access key ID. Next we'll say export AWS_SECRET_ACCESS_KEY and set the value to our secret access key. The other thing we're going to do is say export AWS_REGION and set that to something like us-west-2, because that's the region I want to create our S3 bucket in. So that's our credential setup; we'll say continue, and we should now see that the access key is active for our user. Now let me go back to the S3 service and see if I have any buckets; I honestly don't even remember if I have any buckets in S3 right now, and if I go over to buckets, it looks like this AWS account doesn't have any, so we can go ahead and attempt to create one. Now that we've defined our environment variables down here in our bash shell, we can go ahead and do a cargo run; since we've already built our application, it should run immediately without any additional compilation steps.
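At this point, the first (intentionally incomplete) attempt looks roughly like this, a minimal sketch using the video's example bucket name:

```rust
// first attempt: the fluent builder only configures the request; nothing
// is sent to the S3 API until we call send() and await it
let create_result = s3_client
    .create_bucket()
    .bucket("saturn-jupiter-trevor") // the video's example bucket name
    .send()
    .await;

if create_result.is_ok() {
    println!("bucket was created successfully");
} else {
    println!("error occurred while creating S3 bucket");
}
```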
Now, it looks like we are getting an error while creating the S3 bucket. So what I'm going to do is say: if an error occurs, I want to print out what the error actually is. We'll grab the error out of create_result (via its err() accessor), feed it into this format string, and use the println! macro to print it out to standard out. This should recompile pretty quickly, because we only made a change to our code; we didn't change anything with the dependencies. Once it runs, you can see we get this kind of bizarre message: the unspecified location constraint is incompatible for the region specific endpoint. Essentially, this is saying that you need to specify that location constraint we previously talked about as part of the request body. It says that if you don't specify a region, the bucket is created in Northern Virginia; but we did specify a region, by using the AWS_REGION environment variable up here, and that's us-west-2. Because that's not us-east-1, we need to specify the location constraint. So even though this compiles, it's not technically a valid request.

What I'm going to do up here is put these method calls onto new lines, so we can indent and make things look a little nicer, and then add in this location constraint; I think that's part of this create_bucket_configuration call right here, which takes a CreateBucketConfiguration as its input. So we need to actually create that CreateBucketConfiguration. If I type create bucket configuration here, you can see it's going to use the s3 crate (that's aws-sdk-s3, of course, because we aliased it), and then go down into the types child module; that's where this type is declared. So let's see how to create one. An easy way to figure that out is to look at the documentation: we'll head back to our browser and go over to the SDK for Rust docs, and remember, I told you that under code examples, actions and scenarios, we can find a bunch of different services like S3. If we go to S3 and say I want to create a bucket, this gives us a code sample that demonstrates exactly how to do it. It looks like what we need to do is create this thing called a BucketLocationConstraint from a particular region string. BucketLocationConstraint is also in the s3 types module, so we'll say s3::types::BucketLocationConstraint::from and specify us-west-2, and we'll say let constraint equal that value. After we've constructed that BucketLocationConstraint, we can use CreateBucketConfiguration, which has a builder function that lets us build a new configuration; that has a method on it called location_constraint, which we can use to specify the constraint we constructed up here, and then we can call the build function to actually return the final configuration.
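Assembled, the constraint and configuration steps come out roughly like this (a hedged sketch; the types live under aws_sdk_s3::types, as shown in the docs on screen):

```rust
use aws_sdk_s3::types::{BucketLocationConstraint, CreateBucketConfiguration};

// the constraint has to name the same region the request is sent to
let constraint = BucketLocationConstraint::from("us-west-2");

// build the configuration that carries the location constraint
let bucket_config = CreateBucketConfiguration::builder()
    .location_constraint(constraint)
    .build();

// attach the configuration to the request; this version should satisfy
// the CreateBucket API in regions other than us-east-1
let create_result = s3_client
    .create_bucket()
    .create_bucket_configuration(bucket_config)
    .bucket("saturn-jupiter-trevor")
    .send()
    .await;
```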
Then we can pass that configuration variable into the create_bucket_configuration method in our S3 client's fluent API; the fluent API is basically where we chain a bunch of methods together to configure our request, and then finally send and await the asynchronous operation. So what we'll do is say let bucket_config equal CreateBucketConfiguration::builder(), then .location_constraint, passing in our constraint variable, and then .build(), and that builds the configuration. Finally, we take this bucket_config and pass it in here. Keep in mind that not all APIs in AWS are quite this complex; this one is just a little unique, because S3 is an older service, and some of its APIs can be a little more challenging to work with. Some APIs are really easy to work with, and some have these weird little implementation details like the location constraint. Once we apply that create bucket configuration to our create bucket request, we should fulfill all of the requirements of this API call, and we can do another cargo run. If everything went well this time, it should actually run and create that bucket, and sure enough, we get the message "bucket was created successfully". Now, if we come back over to the S3 service in the console and refresh, we should see that this bucket has been created inside of our AWS account.

So now let's say we wanted to put an object into this bucket; we basically want to write a file into it. How would we do that? This is what's called the put object operation. If you take a look at the AWS S3 documentation, there's an operation, if we scroll down, called PutObject, and that's what we use to write a file into a bucket. We call these keys: they're not really called files in S3, they're keys and values, essentially. We have a key, which is kind of like the file name, and then the actual contents, the value of that S3 key, is the data for that particular file, or key in this case. So we're going to use this PutObject API call. We do need to specify the bucket (you can see the bucket name is required, because we have to tell S3 which bucket we want to create the file inside of); most of the other parameters are not mandatory, but we do have to specify the key, the file name so to speak, for the object, and we also need to specify the contents of the object, which is part of the body. Again, because we're using the Rust SDK, we don't have to worry about whether something is part of the URI parameters or the request body; we just need to know which parameter names to pass into our API call. I already have some code here that attempts to create the bucket if it doesn't exist, and if it does exist already and throws an error (because we can't create a second bucket with the same name), we just print out that error. After all of that logic, we're going to create another request: we'll say let put_object equal, and then we go back to our S3 client once again, but rather than calling the create bucket operation, this time we're calling put_object instead.
So we'll say s3_client.put_object(), and again, this is just the builder for the request; it isn't actually sent until we call the send function. Then we can specify things like the bucket; the bucket is a string value, so we'll specify the exact same name as up here (yes, I know I'm hardcoding this for now, but it's just the easiest way to do it temporarily). Then we also need to specify the key, which is the name of the object that gets created inside the bucket. Let's fix our formatting and put each of these method calls on a separate line, and let's say the key name is going to be trevor-sullivan.txt. Then we also need to specify the body. The body is going to be a byte stream, because the contents, the value, of a file, or an S3 key in this case, is really just binary data. It could be text data, but text is just binary data like anything else: a compiled application is binary data; proprietary formats like Microsoft Word or PDF are binary formats; MP3 files, JPEG files, PNG files, MP4 video files, MOV QuickTime video files, all of those file formats are binary files that need special decoding software to interpret them. That's why the body, or payload, of our S3 object is just a raw byte stream.

Now, this is where we can't just pass in a string directly; we can't just say "hello from Trevor". If I try to do that, it says: hey, I expected a ByteStream, but you passed in a static string, a string slice reference. So we need to turn a string into a ByteStream first. But what even is a ByteStream? If we start typing byte stream here, you'll see that in the s3 crate, under the primitives module, there's a ByteStream type. If we head over to docs.rs, do a search for the exact crate name, aws-sdk-s3, open the exact match, and then search for ByteStream, sure enough you can see that under this crate's primitives module there is a ByteStream struct. So we need to instantiate one of these ByteStreams from a string. How exactly do we do that? We'll say s3::primitives::ByteStream, and let's take a look at the new function down here: it takes this thing called an SdkBody, and it's like, all right, what is one of these SdkBody things? This is where we take a look at SdkBody, which you can see is coming from a different crate altogether. So we're going to try to find a different way of creating one of these ByteStreams, and one way we can do that is by using this from function, which allows us to convert certain data types into a ByteStream. If I just try to specify a string here, like "hello from Trevor", and say let byte_stream equal the result, you can see it says we're not allowed to take a string and convert it directly into a ByteStream, because the appropriate trait has not been implemented. However, it does tell us some other types that do implement the necessary trait for this conversion.
One of the most generic of those is Vec<u8>, a Vec of unsigned 8-bit integers. If we can take this string and turn it into a Vec<u8>, then we can use this from function to convert that Vec into a ByteStream, and thereby successfully convert a string, ultimately, into a ByteStream. So let's declare this variable as input_string, and then we want to convert it from a static string into a Vec of bytes. If we do a dot here, you can see we have as_bytes, which turns it into a slice of bytes; but a byte slice still isn't a Vec, so we can take our slice of bytes and call this to_vec function, which turns the byte slice into a byte Vec, and a byte Vec is a Vec<u8>. As you can see, the compiler inferred that data type here, and we can turn that into one of the ByteStreams we need in order to pass the string "hello from Trevor" into the payload, or body, of the S3 key we're trying to create. So we just pass our Vec<u8>, the input_string variable, into this from function call, which successfully constructs our ByteStream, and the last thing we do is pass the ByteStream into the body function. Now we need to do a .send(), and we also need to .await, because this is an asynchronous operation. That gives us a result with a PutObjectOutput, and we'll say: if our put_object operation is okay, we want to print "successfully created S3 object"; otherwise we print "failed to create S3 object". A failure could happen for a variety of reasons: one of the most common could be that you don't have the necessary IAM policy permissions, or maybe you have internet connectivity issues that prevent you from communicating with the REST API. There are a lot of situations where it could fail, but this is a really simple operation, so I'm kind of expecting it to just succeed.

Right now our bucket, saturn-jupiter-trevor, does not have any objects, but if we run our application again, it'll attempt to create the bucket, fail because the bucket already exists, and then move on to the next phase, which is to create the object inside the bucket. So let's do a cargo run; that recompiles our Rust application, and once it compiles, it should attempt to create our object. The reason you're seeing this error right here is that the bucket already exists and we already own it; it's already inside our AWS account. But after that error message is printed (which we handled properly, so the application doesn't panic and terminate early), you can see down at the very end it says "successfully created S3 object". So we know our put object operation was successful: we told it what bucket to put the object in, we told it what name to give the object, and we gave it the payload, or body, of that object. If we head back over to the AWS Management Console and do a quick refresh, sure enough, that object was created, and if we download that file from Amazon S3 and open it up in a text editor, we can see that it contains the text "hello from Trevor".
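Put together, the object upload described above comes out roughly like this (a hedged sketch; the key name and contents are the video's examples, and ByteStream lives under the crate's primitives module):

```rust
use aws_sdk_s3::primitives::ByteStream;

// convert the string to a Vec<u8>, which ByteStream::from() accepts
let input_string = "hello from Trevor".as_bytes().to_vec();
let byte_stream = ByteStream::from(input_string);

let put_object = s3_client
    .put_object()
    .bucket("saturn-jupiter-trevor") // hardcoded, matching the video
    .key("trevor-sullivan.txt")      // the object's key (its "file name")
    .body(byte_stream)               // the object's contents as a ByteStream
    .send()
    .await;

if put_object.is_ok() {
    println!("successfully created S3 object");
} else {
    println!("failed to create S3 object");
}
```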
So anyway, this has just been an introduction to the Rust SDK for Amazon Web Services, specifically how to use the Amazon S3 APIs. Again, there are lots of other operations available: there are delete bucket operations, copy object operations, list objects, all of that fun stuff you can do with the S3 service (there's one more sketch below as a starting point). So feel free to play around with the S3 APIs in your own Rust application, and I hope this has helped jump-start you toward that objective. In any case, if you enjoyed this video, please consider supporting the channel by shopping through my Amazon store at the link down below in the pinned comment; that really helps me bring you more video content on a variety of software topics, cloud topics, programming, all that fun stuff. Thank you so much for your support of this channel. This is Trevor Sullivan, coming to you from the Rust programming tutorial series, signing off. We'll see you in the next video. Take care.
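As one possible starting point for that experimentation, here's a hedged sketch of a ListObjectsV2 call against the same example bucket; this call isn't shown in the video, and accessor details may vary between SDK releases:

```rust
// list the objects in the bucket created earlier
let list_result = s3_client
    .list_objects_v2()
    .bucket("saturn-jupiter-trevor")
    .send()
    .await;

match list_result {
    Ok(output) => {
        // contents() yields a summary for each object in the bucket;
        // the exact accessor shape may differ between SDK releases
        for object in output.contents() {
            println!("found key: {:?}", object.key());
        }
    }
    Err(e) => println!("failed to list objects: {e}"),
}
```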
Info
Channel: Trevor Sullivan
Views: 2,861
Keywords: rust, rustlang, rust developer, rust programming, rust software, software, open source software, systems programming, data structures, rust structs, rust enums, rust coding, rust development, rustlang tutorial, rust videos, rust programming tutorial, getting started with rust, beginner with rust programming, rust concepts
Id: rXL8i4nGY6s
Length: 50min 34sec (3034 seconds)
Published: Sun Oct 29 2023