PyTorch Tutorial 1: Introduction to Tensors

Video Statistics and Information

Captions
Hello and welcome to the second video in this tutorial series, where I'll give you guys a quick overview of what PyTorch is and show you how to write your first PyTorch code. In the first video I showed you how to install PyTorch and link your PyTorch environment to your IDE, but don't worry if you haven't watched that video; you can follow along in Google Colab instead. PyTorch is an open-source machine learning framework that is widely used to develop and train neural networks. At the core of PyTorch is the concept of tensors, which are multi-dimensional arrays used to store and manipulate data. A tensor, as I said, is a multi-dimensional array that can represent various types of data, including scalars, vectors, matrices, and even higher-dimensional arrays. Tensors are similar to NumPy arrays, but they have additional features that make them suitable for deep learning applications. Here are some of the key features of tensors in PyTorch. First, tensors are efficient: PyTorch uses optimized CPU and GPU implementations to perform tensor operations quickly and efficiently. Second, tensors are differentiable: PyTorch tensors can represent computational graphs that can be differentiated with respect to their inputs, making it easy to compute gradients for neural networks. Basically, PyTorch does all of these gradient calculations in the background for you, which is what makes it so powerful; you don't have to implement the gradient calculations yourself. Lastly, tensors are flexible: PyTorch lets you create tensors from a wide range of data types, including floating-point numbers, integers, and even Boolean values. PyTorch also gives you a lot of different operations that you can use to manipulate these tensors, which I'll show you later in this tutorial. I'll be writing the PyTorch code in Jupyter notebooks. The reason I'm using Jupyter notebooks is that it's easier for us to look at our tensors and see the data in them.
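The features listed above can be sketched in a few lines. This is a minimal example of my own, not code from the video; the variable names are illustrative:

```python
import torch

# Flexible: tensors can hold floats, integers, or booleans
f = torch.tensor([1.0, 2.0, 3.0])      # float32 by default
i = torch.tensor([1, 2, 3])            # int64 by default
b = torch.tensor([True, False, True])  # bool

# Differentiable: requires_grad=True makes PyTorch track operations
# on this tensor so it can compute gradients automatically
x = torch.tensor(2.0, requires_grad=True)
y = x ** 2       # builds a tiny computational graph: y = x^2
y.backward()     # autograd computes dy/dx in the background
print(x.grad)    # dy/dx = 2x = 4.0 at x = 2
```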
Being able to print things out at each step is very crucial when learning this framework, so we're going to want to install Jupyter Notebook. Open up Anaconda Prompt; you can just search for "Anaconda Prompt". In Anaconda Prompt we first want to go to our torch environment, so we run "activate torch". I named the environment torch, which is actually the same as the package name (PyTorch's package name), so if you guys get confused, I probably should have named it a little better. Then let's install Jupyter Notebook by running "pip install notebook". Cool, you can see it says the requirement is already satisfied because I've installed it before. Then I can run "jupyter notebook" and that will start up. Let's create a new notebook and run "import torch"; that should execute. While that's running, I'll show you guys how to set this up in Colab: go to colab.research.google.com, create a new notebook, wait for it to load (it's taking quite a while), and run "import torch" there as well. Okay, cool, it seems to have finished importing; you can see it imported torch too, so you can follow along in either environment and it should be fine. All right, let's focus on the notebook. What you can do with these tensors is add, multiply, and subtract; basically everything you can do with a NumPy array you can do with a tensor, and probably a lot more too. So let's create our first tensor: x, a 2x2 tensor with rows 1, 2 and 3, 4.
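In a notebook cell, the tensor described in this step can be built with torch.tensor (a sketch of the same values the video uses):

```python
import torch

# A 2x2 tensor built from a nested Python list
x = torch.tensor([[1, 2],
                  [3, 4]])
print(x)        # tensor([[1, 2],
                #         [3, 4]])
print(x.shape)  # torch.Size([2, 2])
```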
I'm going to create y the same way. Then let's do some operations. You can write z = x + y and print z. Instead of using the operator, we can also use the built-in function for adding: torch.add(x, y). (I first typed torch.add(z, y) by mistake; with x it gives the same result.) I would recommend using whichever is easier to read. We can also multiply them: printing x * y multiplies them element-wise. There are a whole bunch of other operations you can do: torch.multiply(x, y) gives the same answer, and you could even write x.multiply(y) and that would work too. So there are a bunch of ways to use these operations; you can call the method on the tensor object itself, or you can call the torch function like this. It's just personal preference, so do whatever is more readable for you. Just make sure that when you come back to it, it's easy to read and you can follow what you've done, and other people can too. After that, we can also do matrix multiplication: the @ symbol is for matrix multiplication, so we write x @ y and print that again. If you guys do the matrix multiplication of these two by hand, you'll get the same numbers. You can also do reduction operations like torch.sum(x), which gives you one value, just the scalar 10. You can also choose a dimension: with dim=1 you can see it sums across each row. This is dimension 1, going across; the zeroth dimension goes down the rows.
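The operations walked through here can be sketched as one cell. The exact values of y aren't clear from the captions, so the y below is a stand-in of my own choosing:

```python
import torch

x = torch.tensor([[1, 2],
                  [3, 4]])
y = torch.tensor([[5, 6],    # stand-in values; the video's y may differ
                  [7, 8]])

z = x + y                    # element-wise addition
print(z)                     # tensor([[ 6,  8], [10, 12]])
print(torch.add(x, y))       # same result as x + y
print(x * y)                 # element-wise multiplication
print(torch.multiply(x, y))  # same as x * y
print(x.multiply(y))         # method form, also the same

print(x @ y)                 # matrix multiplication: [[19, 22], [43, 50]]

print(torch.sum(x))          # tensor(10), sum of every element
print(torch.sum(x, dim=1))   # tensor([3, 7]), sums across each row
print(torch.sum(x, dim=0))   # tensor([4, 6]), sums down each column
```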
You can also use torch.max. So z = torch.max(x); that should be 4, right? Yeah, that's 4. We can even do it across a dimension, so let's just run that. You can see you get both values and indices: use .values to get the max values, and the indices show where they are. You can see the index is 1 in both rows (remember indexing starts at zero), because it takes the max of the first row, which is 2, and then the max of the second row, which is 4. You can also do it vertically, which would be dim=0: you can see it takes the max of each column, which is 3 and 4. You can change dim like that for the sum as well. You can also do some other operations, like x.exp(); let's just check what that gives us. It's basically computing e to the power of x for each of x's elements. You can also just write torch.exp(x); that works as well. And yeah, that's a whole bunch of operations you can do in TensorFlow... not TensorFlow, in PyTorch. That's basically all I wanted to show you guys. In the next tutorial, I'll actually show you how to build your own neural network from scratch. Well, not completely from scratch: we'll use PyTorch's autograd, which is basically how PyTorch builds the computational graph. We'll use that to get the backpropagation gradients, so you guys won't have to calculate those yourself, but I'll explain all of that and how it works in the next video. All right, I hope you guys enjoyed; please leave a like and subscribe if you want to see more content like this.
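The max and exp examples from this section, sketched as a notebook cell. One detail worth noting: torch.exp requires a floating-point tensor, so an integer x is converted first:

```python
import torch

x = torch.tensor([[1, 2],
                  [3, 4]])

print(torch.max(x))          # tensor(4), the overall maximum

# Along a dimension, max returns a (values, indices) pair
m = torch.max(x, dim=1)
print(m.values)              # tensor([2, 4]), max of each row
print(m.indices)             # tensor([1, 1]), both maxima sit in column 1
print(torch.max(x, dim=0).values)  # tensor([3, 4]), max of each column

# exp computes e**element for every element; it needs a float tensor
xf = x.float()
print(xf.exp())              # method form
print(torch.exp(xf))         # function form, same result
```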
Info
Channel: Rashaad Meyer
Views: 265
Id: 3w0MeDvQk4I
Length: 11min 1sec (661 seconds)
Published: Mon Mar 13 2023