Functional Analysis Overview

Captions
Alright, thanks for watching! Today I want to give you a glimpse of the beautiful world of functional analysis, also known as infinite-dimensional linear algebra. The difference between finite-dimensional and infinite-dimensional linear algebra is that in infinite dimensions, convergence suddenly comes into play.

The setting will always be as follows; maybe call this Part 1. In functional analysis you do a lot with normed vector spaces, so let V be an infinite-dimensional normed vector space. "Normed" just means we can put a norm on V, so we can measure lengths of vectors and, more importantly, distances between vectors. Using this distance function, V becomes a metric space, and with a metric we can define convergence: a sequence of vectors x_n converges to x if and only if the vectors become closer and closer, namely the norm of the difference, ||x_n - x||, goes to 0 as n goes to infinity. You can write this in epsilon-N notation if you want: for all epsilon > 0 there exists an N large enough such that if n >= N, then ||x_n - x|| < epsilon. This exists in finite dimensions as well, of course, but it's really in infinite dimensions that it becomes interesting.

And not only that: we can define topologies on this space. We can define open balls (say, the ball centered at x with radius r is the set of y such that the distance between y and x is less than r), and once you have these open balls as a base for a topology, you can define topological vector spaces as well.

So, going back to our vector space: as I said, with this distance V becomes a metric space, and one natural thing in analysis is to study complete metric spaces, meaning spaces in which every Cauchy sequence converges.
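The epsilon-N definition of convergence in norm can be sketched numerically. This is my own toy example, not from the video: the sequence x_n = x + (1/n)v in R^3 with the Euclidean norm, where the names `norm`, `x_n`, and `dist` are all made up for illustration.

```python
import math

# Toy sketch: convergence in norm for x_n = x + (1/n) v in R^3,
# with the Euclidean norm.
def norm(v):
    return math.sqrt(sum(c * c for c in v))

x = [1.0, 2.0, 3.0]
v = [1.0, -1.0, 0.5]

def x_n(n):
    return [xi + vi / n for xi, vi in zip(x, v)]

def dist(n):
    # ||x_n - x|| = ||v|| / n, which goes to 0 as n -> infinity
    return norm([a - b for a, b in zip(x_n(n), x)])

# epsilon-N formulation: given eps, any N > ||v|| / eps works
eps = 1e-3
N = math.ceil(norm(v) / eps) + 1
assert all(dist(n) < eps for n in range(N, N + 100))
```

Here N is computed explicitly from the closed form ||x_n - x|| = ||v||/n, which is exactly the "there exists an N large enough" step of the definition.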
If you have a normed vector space that is complete with respect to this metric, then you get the first fundamental definition of functional analysis: it's called a Banach space. In fact, one can say that functional analysis is really the study of the properties of those Banach spaces, which, again, are complete normed vector spaces.

And there is one prime example of a Banach space that's super important, and that all Banach spaces are modeled on: the L^p space from analysis. The prime example (no pun intended) is L^p of R, or of R^n, or something like that: the set of functions f such that |f|^p is integrable, meaning the integral over R of |f(x)|^p dx is finite. This is a vector space: the sum of two such functions still has finite p-th power integral, and so does any scalar multiple. And you can check that if you define the L^p norm to be (integral of |f|^p)^(1/p), it really is a norm. It's slightly trickier to show that with this norm we in fact have a complete metric space, and therefore a Banach space. I would even say that Banach spaces were invented to study those L^p spaces; again, I'm not a historian, so I might be wrong.

OK, so we talked about Banach spaces, the objects: normed vector spaces. As is common in math, the next step is to study functions between objects, and here that means linear transformations. What's new is that there are two kinds of linear transformations: continuous ones and discontinuous ones. Before, this wasn't an issue, because on a finite-dimensional vector space every linear transformation is continuous; but here there might exist discontinuous ones. So the next thing is to study linear transformations, and here's a cool thing: usually, if you've taken analysis, it's pretty hard to show things are continuous; you have to use an epsilon-delta argument.
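Going back to the L^p norm defined above, it can be approximated by a simple quadrature. This is a sketch under my own assumptions (a midpoint Riemann sum, the test function f(x) = e^{-x} on [0, infinity) truncated at 50); for that f, the exact value is ||f||_p = p^(-1/p), since the integral of e^{-px} over [0, infinity) is 1/p.

```python
import math

# Sketch: approximate ||f||_p = (integral of |f|^p dx)^(1/p) with a
# midpoint Riemann sum.  For f(x) = e^{-x} on [0, oo), the exact norm
# is p^(-1/p) because the integral of e^{-px} is 1/p.
def lp_norm(f, p, a, b, steps=100_000):
    h = (b - a) / steps
    s = sum(abs(f(a + (i + 0.5) * h)) ** p for i in range(steps)) * h
    return s ** (1.0 / p)

def f(x):
    return math.exp(-x)

for p in (1, 2, 3):
    exact = p ** (-1.0 / p)
    # truncating at 50 is harmless: the tail of e^{-px} is negligible
    assert abs(lp_norm(f, p, 0.0, 50.0) - exact) < 1e-4
```

The homogeneity property ||c f|| = |c| ||f|| and the triangle inequality are what make this a norm; completeness (the Riesz-Fischer theorem) is the part that needs real work.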
It turns out that in infinite-dimensional linear algebra there's actually an easy way of checking that a linear transformation is continuous: simply check that it's bounded. What that means is that if you take the length of T(x) (where T is a linear transformation from, let's say, V to W), it's always at most a constant times the length of x: ||T(x)|| <= C ||x||, where the constant C is independent of x. Linear transformations with this property are called bounded linear transformations, and it turns out that boundedness is equivalent to continuity. Let me quickly show you this; it's actually a nice proof.

First, let me show you that boundedness implies continuity. It's actually not bad, because you know how to show something is continuous: you take the difference, so calculate ||T(x) - T(y)||, and you want to show that if x and y are close enough, this is less than epsilon. Because T is a linear transformation, that's the same as ||T(x - y)||. (This is not only wrong for nonlinear maps, it's very wrong; functional analysis really works because of this property.) Now, assuming T is bounded, this is at most a constant times the length of the input: ||T(x - y)|| <= C ||x - y||, and from this it follows that T is continuous, basically because it's dominated by a continuous function. But if you want the proof: to make this less than epsilon, it suffices that ||x - y|| < epsilon/C, and that becomes your delta. If ||x - y|| is less than that, then the difference of the outputs is less than epsilon.

More interesting is the converse: does it follow that continuous linear transformations are bounded? It turns out yes. Let's use the definition of continuity at zero, with epsilon equal to 1: so let epsilon be 1 and y be 0. What this means is that if ||x - 0|| < delta, then ||T(x) - T(0)|| < 1; in other words, if ||x|| < delta,
then ||T(x)|| < 1. In other words, if the vector itself is very close to 0, then the length of the output is less than 1. How does boundedness follow from this? Well, let x be any nonzero vector, and from it let's construct a vector of length less than delta. Notice you can write x as follows: take (delta/2) x/||x||, and to make this equal to x, multiply by the reciprocal factor, 2||x||/delta. It's a horrible non-cancellation, but remember that 2||x||/delta is a constant, so it comes out not only of T but also out of the norm, since it's positive. So ||T(x)|| becomes (2||x||/delta) ||T((delta/2) x/||x||)||. Why write it in such a weird way? Well, if you take the length of the inner vector, it becomes (||x||/||x||)(delta/2) = delta/2, which is less than delta. Therefore the length of T of that vector is less than 1, and so ||T(x)|| < (2||x||/delta) * 1 = (2/delta) ||x||. So what have we shown? We've shown that ||T(x)|| <= (2/delta) ||x||, and since 2/delta doesn't depend on x at all, it becomes our C, and therefore T is bounded.

So continuity of a linear transformation is the same thing as boundedness, and the reason I did this proof is that it's a very classical proof in functional analysis; it gives you a flavor of the kind of arguments that happen in the subject. And this is amazing news: before, in analysis, it was hard to check that something is continuous, and now it's easy, which means we can say a lot of things; that's why functional analysis is such a rich subject. Now, of course, you may ask: what is an example of a continuous linear transformation? Let me give you one example and one non-example.
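The bound ||T(x)|| <= C ||x|| can be illustrated in finite dimensions, where (as noted above) it always holds. A sketch with a toy 2x2 matrix of my own choosing: the best constant C, the operator norm, is estimated by sampling directions on the unit circle.

```python
import math, random

# Finite-dimensional sketch: every linear map T is bounded, and
# C = sup over x != 0 of ||Tx|| / ||x|| is finite.  Estimate C for a
# toy 2x2 matrix by sampling unit vectors.
T = [[3.0, 1.0],
     [0.0, 2.0]]

def apply(T, x):
    return [sum(T[i][j] * x[j] for j in range(2)) for i in range(2)]

def norm(v):
    return math.sqrt(sum(c * c for c in v))

random.seed(0)
ratios = []
for _ in range(10_000):
    theta = random.uniform(0, 2 * math.pi)
    x = [math.cos(theta), math.sin(theta)]   # ||x|| = 1
    ratios.append(norm(apply(T, x)))         # equals ||Tx|| / ||x||

C = max(ratios)
# every sampled ratio is below C, illustrating ||Tx|| <= C ||x||
assert all(r <= C for r in ratios)
```

For this matrix the true operator norm is the largest singular value, about 3.26, and the sampled maximum lands very close to it.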
First, let f be a fixed function, and assume f is in L^{p'} of R, where p' is the conjugate exponent, p/(p-1); here p is given, between 1 and infinity, with the interpretation that if p = 1 then p' = infinity. Consider the following linear transformation T from the Banach space L^p to the real numbers: T(phi) = integral from minus infinity to infinity of f(x) phi(x) dx. I claim this is continuous, and for this let's just check boundedness. The norm of T(phi) (here the norm of the output is just the absolute value) is |integral of f phi|, which is at most the integral of |f phi|, and it turns out there is an analogue of the Cauchy-Schwarz inequality for L^p spaces, called Holder's inequality: this is at most the L^{p'} norm of f times the L^p norm of phi. (If you find this confusing, just take p = 2; then it becomes the Cauchy-Schwarz inequality.) But remember, f was fixed, so if you let C be ||f||_{p'}, this chain of inequalities says |T(phi)| <= C ||phi||, where ||phi|| is the norm in the original L^p space. So without even doing an analysis argument, you automatically get that T is continuous. In general, operators defined by integrals are usually continuous; don't quote me on that, but usually integration is a good thing.

Now, on the other hand, let me show you something that's not continuous, and usually that has to do with differentiation. Consider the following space: V is the span of the trigonometric exponentials, the span of e^{inx} where n ranges over the integers, and by span I mean finite linear combinations, so something like 2 e^{i3x} - 5 e^{i4x} is in that space. Consider the differentiation operator, T(f) = f'. I forgot to say: let's assume everything is defined on [0, pi].
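Before moving on to the non-example, here is a numerical sanity check (my own toy functions, not from the video) of the Holder estimate used above: |integral of f phi| <= ||f||_{p'} ||phi||_p, with p' = p/(p-1).

```python
import math

# Illustrative check (not a proof) of Holder's inequality
# |int f phi| <= ||f||_{p'} ||phi||_p, for f(x) = x and
# phi(x) = 1 - x on [0, 1].
def integrate(g, a, b, steps=50_000):
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h

def lp_norm(g, p, a, b):
    return integrate(lambda x: abs(g(x)) ** p, a, b) ** (1.0 / p)

def f(x):
    return x

def phi(x):
    return 1.0 - x

for p in (1.5, 2.0, 3.0):
    pp = p / (p - 1)                      # conjugate exponent p'
    lhs = abs(integrate(lambda x: f(x) * phi(x), 0.0, 1.0))
    rhs = lp_norm(f, pp, 0.0, 1.0) * lp_norm(phi, p, 0.0, 1.0)
    assert lhs <= rhs
```

For p = 2 this is exactly the Cauchy-Schwarz check mentioned above; the left side is the integral of x(1 - x), which is 1/6.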
We have this vector space and we want to put a norm on it; let's just take the sup norm, so the norm of f is the sup over x in [0, pi] of |f(x)| (in fact, it's a compact interval, so let's just take the maximum). I'm claiming that this operator is discontinuous, in other words unbounded. For this, let's take a typical function and calculate T(e^{inx}) and its norm. The derivative of e^{inx} is i n e^{inx}, so ||T(e^{inx})|| is the maximum over [0, pi] of the modulus of that function. The modulus of in is n, and the modulus of e^{inx} is 1, because it lies on the unit circle, so the maximum doesn't depend on x at all: it's n. So on the one hand, the length of T(e^{inx}) is n. On the other hand, let's compare with the input: the length of e^{inx} is just the maximum of |e^{inx}|, which is 1. So we get ||T(e^{inx})|| = n = n * ||e^{inx}||. But now consider the ratio ||T(e^{inx})|| / ||e^{inx}||: it equals n, and as n goes to infinity this blows up. That can't happen for a bounded operator: if T were bounded, we'd have ||T(x)|| <= C ||x||, so the ratio ||T(x)|| / ||x|| would have to stay below C, i.e., stay finite; but here we've shown it becomes arbitrarily large. Therefore differentiation is a discontinuous operation in this case, and not every linear transformation is continuous, which is kind of interesting. (Good luck showing that directly from epsilon-delta; it's pretty hard.) OK, so what have we done so far? We defined Banach spaces, and we defined linear transformations and continuous linear transformations.
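The blow-up of the ratio ||T(e^{inx})|| / ||e^{inx}|| = n can be seen numerically. A sketch using the real-valued analogue sin(nx) on [0, pi] (my own substitution, to stay with real arithmetic): its sup norm is 1, while its derivative n cos(nx) has sup norm n.

```python
import math

# Sketch: the sup-norm ratio ||(sin(nx))'|| / ||sin(nx)|| on [0, pi]
# equals n, so no single constant C bounds the differentiation
# operator: it is unbounded, hence discontinuous.
def sup_norm(g, a, b, samples=10_000):
    h = (b - a) / samples
    return max(abs(g(a + i * h)) for i in range(samples + 1))

def ratio(n):
    f = lambda x: math.sin(n * x)          # ||f|| = 1 on [0, pi]
    df = lambda x: n * math.cos(n * x)     # ||f'|| = n
    return sup_norm(df, 0.0, math.pi) / sup_norm(f, 0.0, math.pi)

rs = [ratio(n) for n in (1, 2, 4, 8, 16)]
assert all(b > a for a, b in zip(rs, rs[1:]))   # the ratios blow up
```

Each ratio comes out (up to sampling error) equal to n, exactly the unboundedness argument above.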
Just like in finite-dimensional linear algebra, the next thing is to define the dual space, and it turns out the dual space is very important and plays a fundamental role here. Namely, given a Banach space V, we can define the "shadow" space V*, the space of linear transformations from V to R (or from V to C). In finite-dimensional linear algebra it was fine to take all linear transformations, but now we don't want all of them: we only want the linear transformations from V to R that are continuous, or equivalently bounded. And it turns out this shadow space is also a Banach space.

You might ask what happens if we do this to our original L^p space. It turns out that, at least for p strictly between 1 and infinity, the dual of L^p is L^{p'}, and it has to do with the transformation I talked about, T(phi) = integral of f phi, where f is a fixed function in L^{p'}: all the bounded linear transformations from L^p to R are of this form. So you can construct an isomorphism (an isometry, in fact) from those linear transformations to the functions f, that is, from (L^p)* to L^{p'}.

But be careful: in general, V and V* are not isomorphic, and especially the double dual. If you take the dual space of the dual space, V**, it's not equal to V; in finite-dimensional linear algebra that was true, but in general it's not. However, it turns out V** at least contains V, in the following sense. Given a vector x in V, you can define a new "super" linear transformation x-hat, which takes linear transformations T from V to R as inputs and spits out the real number T(x). Algebraically, people love that: it's a natural injection that doesn't depend on a basis, and it is a way of going from V to V
double star. But in general they're not equal; if they are equal, that's nice, and the space is called a reflexive Banach space. (Who wouldn't want to be that?) OK, so those are the main definitions.

Now let me talk a bit about the main results, and there are a couple of cool ones; maybe that's Part 2, main results. I defined linear functionals, and the question is: are there any continuous linear functionals at all? In fact there are many of them, so let me first talk about the Hahn-Banach extension theorem. I like this name: "Hahn" means rooster in German, apparently. Anyway, it says the following really cool thing. Suppose you have a Banach space V and a subspace W, and on that subspace a linear transformation T from W to R that is bounded, let's say with ||T(x)|| <= ||x||. Then it turns out we can take this linear transformation T on the smaller space and extend it to the bigger space: there exists a linear transformation T' from all of V to R such that the bound still holds, ||T'(x)|| <= ||x||, and T' equals T on W. So given a linear functional on a small subspace, you can actually extend it to a linear functional on the whole space. It's a really cool thing, not obvious, and it actually requires the axiom of choice; so if you're not a believer, this might not even be true for you, and good luck constructing linear functionals then. So that's the first result: you can always expect linear functionals to exist, and that's cool. By the way, in finite dimensions it's clear how to do this: you pick a basis for W, complete it to a basis for V, and define your transformation that way.

The next one, I also like its name: the Banach-Steinhaus theorem, also called the uniform boundedness principle. "Steinhaus" means stone house, so I always
imagine him sitting in this stone house going "ha ha, you can't get me, I'm in the Steinhaus!" So, the uniform boundedness principle. It's very cool, and it's something that's very much not true in ordinary analysis. Suppose you have a sequence T_n in the dual space, and suppose the T_n converge pointwise to some linear transformation T, meaning T(x) is the limit as n goes to infinity of T_n(x), provided this limit exists for every x. Then, in fact, the limit T is bounded, so T itself is in V*. It's like the (very wrong!) statement in real analysis that a pointwise limit of continuous functions is still continuous, which is outrageous; there you'd need uniform convergence. But it turns out that for linear transformations it's correct. Really, really cool.

And you can have even more outrageous things. One is called the open mapping theorem: suppose you have a linear transformation T from V to W that's bounded (so continuous) and invertible, so T is bounded, one-to-one, and onto. Then it turns out the inverse is continuous as well: T inverse is bounded. Again, I don't think that's true in real analysis (an invertible continuous function need not have a continuous inverse in general), but for linear transformations between Banach spaces it is true.

The next one is a very cute way of checking that something is continuous; it's called the closed graph theorem. Suppose you somehow construct the graph of T, the set of points (x, T(x)), and suppose that, topologically, this graph is closed, meaning the graph contains all its limit points: if you have a
convergent sequence of points on this graph, then the limit is also on the graph. If the graph is closed, then in fact T has to be continuous. And this is actually useful, because a closed graph gives you a bit more control, a bit more leeway, in applications: you get lots of proofs that start "suppose x_n converges to x and T(x_n) converges to y; then just show that y = T(x)". That is an equivalent formulation of the graph being closed, and what's nice is that you get to assume some control over the outputs T(x_n). It gives an easy way of checking whether a functional is continuous or not.

And there are a couple of other results as well. Going back to "can we construct linear functionals?", there's a second Hahn-Banach theorem, the Hahn-Banach separation theorem (before it was extension; now, separation). It's also a very nice theorem. Suppose you have two convex subsets A and B (I think you have to assume something like one being compact and the other closed), and suppose they're disjoint. Then we can actually construct a linear transformation separating them: there exist T in V* and a real number alpha such that T(a) <= alpha for all a in A, and T(b) >= alpha for all b in B. In some sense, you can always construct a linear functional with T <= alpha on one set and T >= alpha on the other, which gives us infinitely many new linear functionals; so, in fact, V* is usually very big. There's also a special case of this where A is a single point: suppose
this is a point x_0 and B is, say, a closed convex set not containing it. Then there exists T in the dual space such that T(x_0) < alpha (you can modify things to get strict inequality) and T >= alpha on all of B. This illustrates that there really are infinitely many such linear functionals: the Hahn-Banach separation theorem shows that, in some sense, the space of linear functionals is very big.

Now let me show you that the space of linear functionals is also, in some sense, very small, and this is what's called (I like that name too) the Alaoglu theorem; you almost have to gargle to say it. So, the theorem. Again we have this shadow space V*; consider the unit ball in that space, the set B* of continuous linear transformations T from V to R such that ||T|| <= 1. By the way, I forgot to say: the length of a linear transformation, ||T||, is the smallest constant C that works in ||T(x)|| <= C ||x||; the smallest such C is called the norm of T. So consider the set of linear transformations from V to R whose norm is at most 1. Question: is this a big set or a small set? It turns out it's small: it's compact. The unit ball of V* is compact, namely: if you have a sequence T_n in that set, then there exists a subsequence T_{n_k} converging to some T. (I'll explain this mode of convergence in a second; it's what's called weak-star convergence, so bear with me.) But that's nice: it says that if you have linear transformations of norm at most 1, then some subsequence converges. Try doing that with, you know, the functions e^{inx}: those have norm 1, and the analogous claim would be that some subsequence of them converges to something, except here we have,
you know, linear transformations instead. Lastly, there's one more important result, called the Krein-Milman theorem. It has to do with something called extreme points, and "extreme points" sounds really extreme, but it's not that crazy. An extreme point is basically defined by its opposite: a point is not an extreme point if it lies strictly inside a segment between two other points of the set. So, for example, anything in the interior of a segment is not an extreme point, but the two endpoints are extreme points, because there's no way for them to lie strictly inside such a segment. In particular, for the unit ball, all the points on the unit sphere are extreme points.

OK, what's the result? It's kind of neat, actually. Suppose you take the extreme points of a set and take convex combinations of them, which just means you take the line segments connecting them: it turns out you get the original set back. So Krein-Milman says: suppose K is (I think) a compact and convex subset of a Banach space. Then if you take the set of extreme points E(K), take its convex hull (again, the set of all segments connecting those points), and then take the closure, you get the original set K back. In other words, this gives a maybe easier characterization of any compact convex subset: just take your extreme points (usually there are not that many), take convex combinations of them, close up, and you get the whole set. And let me remark that this is actually used to prove the Stone-Weierstrass theorem, or at least a very general version of it (I think; don't quote me on that).

OK, so those are the most important theorems. Let me just do two more quick topics. First, let me talk a bit about convergence, because the question came up: what is this weak-star convergence? So let me talk about weak convergence, because
the question is: what should it mean for x_n to converge to x? As I said, the strong way of defining this is to say that ||x_n - x|| goes to zero; that is strong convergence, and it's not only hard to show, it's sometimes just not true. The question is whether there's a more general kind of convergence that works for more sequences, and in fact there is: weak convergence. We say x_n converges to x weakly if (remember, we have this Banach space V with lots of linear transformations on it) no matter which T in V* you apply, T(x_n) converges to T(x). This is extremely useful, for example in partial differential equations, because it lets us define weak convergence of functions: in PDE you usually say f_n converges weakly to f if the integral of f_n phi goes to the integral of f phi, for every phi in your favorite space of test functions (continuous functions, infinitely differentiable functions, L^{p'} functions, anything). It's related to the earlier example, because remember, a typical functional on L^p is integration against some function. So it's a super useful way of converging.

It turns out that for linear transformations themselves there's yet another kind of convergence, called the weak-star topology, or weak-star convergence; this one concerns elements of the dual space. T_n converges weak-star to T if and only if T_n(x) converges to T(x) for every x. So it's a kind of pointwise convergence, which is what's used in the Alaoglu theorem I mentioned. It's a very, very weak kind of convergence, which is bad, but also good, because the weaker the convergence, the more stuff converges. All right, so this is all the theory of Banach spaces.
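Weak convergence without strong convergence can be seen numerically. A sketch with my own choice of test function: f_n(x) = sin(nx) on [0, 2*pi] has norm bounded away from 0, yet its pairing with a fixed smooth phi tends to 0 (the Riemann-Lebesgue lemma), which is exactly the "integral of f_n phi goes to 0" picture above.

```python
import math

# Sketch: f_n(x) = sin(nx) converges weakly to 0 on [0, 2*pi], even
# though ||f_n|| does not go to 0: against a fixed test function phi,
# the pairing  integral of sin(nx) * phi(x) dx  tends to 0.
def integrate(g, a, b, steps=50_000):
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h

def phi(x):                      # an arbitrary fixed test function
    return math.exp(-x) * (1.0 + x)

def pairing(n):
    return integrate(lambda x: math.sin(n * x) * phi(x), 0.0, 2 * math.pi)

vals = [abs(pairing(n)) for n in (1, 10, 100)]
assert vals[-1] < vals[0]        # the pairing shrinks as n grows
assert vals[-1] < 0.05
```

Integration by parts shows the pairing decays like 1/n for smooth phi, which the computed values reflect.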
The last thing I want to talk about is a special case of a Banach space which, I'm sure, lots of you know: the Hilbert space. The prime example of a Hilbert space, before I define it, is L^2, the set of functions whose square is integrable, and what makes this space so amazing is that it has a dot product: you can define f . g to be the integral of f times g-bar (the bar is for the complex case). So what is a Hilbert space, then? I forgot to say: once you have the dot product f . g, you can define the norm of f as the square root of f . f, which in this case becomes the square root of the integral of |f|^2, i.e., the L^2 norm. A Hilbert space is a space with a dot product that is complete under the norm defined this way. So it's like a Banach space, but with dot products, and because we have dot products, we can say more.

In fact, there are a couple of pretty awesome theorems about Hilbert spaces. First of all, there's the Riesz representation theorem, because the question is: what does the dual space of H look like? It turns out duals of Hilbert spaces are pretty easy to characterize: everything in the dual is "dot product with something". If T is in H*, then there exists u in H such that T(x) = u . x; so (at least in the separable case, or really in general) any bounded linear transformation from H to R, or from H to C, is given by a dot product. That's pretty nice, and because of this, H* and H are isomorphic: you can associate T with this u. And because of this, weak convergence can be defined more easily too: instead of quantifying over all linear transformations, just use dot products. x_n converges weakly to x if and only if x_n . y converges to x . y for every y, which turns convergence of vectors into convergence of real
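The Riesz representation theorem has a transparent finite-dimensional sketch (my own toy functional, with the standard dot product on R^3): the representing vector u is just T evaluated on the standard basis.

```python
import math

# Finite-dimensional sketch of Riesz representation: a linear
# functional T on R^3 with the usual dot product is T(x) = u . x,
# where u_i = T(e_i) on the standard basis vectors e_i.
def T(x):                        # a toy bounded linear functional
    return 2.0 * x[0] - x[1] + 0.5 * x[2]

e = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
u = [T(ei) for ei in e]          # the representing vector

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

for x in ([1.0, 2.0, 3.0], [-4.0, 0.5, 7.0], [0.0, 0.0, 0.0]):
    assert math.isclose(T(x), dot(u, x))
```

In infinite dimensions the same idea needs completeness and continuity of T, which is where the actual theorem does its work.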
numbers, which is not too bad. Also, because you have dot products, you have orthogonality, so you can of course define orthonormal bases; I think at least in the separable case, every Hilbert space has an orthonormal basis, so you can actually do some actual linear algebra in Hilbert spaces. And not only that: you can define orthogonal projections, projections onto subspaces, best approximations; the analogue of least-squares approximation also lives in Hilbert spaces. So this is sort of a case apart from Banach spaces, but still a pretty neat theory.

And with this, this concludes our adventure, our exploration into the, you know, space of functional analysis. I hope you liked it, and if you want to see more math, please make sure to subscribe to my channel. Thank you very much!
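The orthogonal projection / least-squares idea mentioned at the end can be sketched concretely in L^2(0, pi) (my own example): project f(x) = x onto the span of sin(x), ..., sin(Nx). The sin(nx) are orthogonal with squared norm pi/2, so the best approximation has coefficients c_n = <f, sin(nx)> / (pi/2), the Fourier sine coefficients, and enlarging the subspace can only shrink the error.

```python
import math

# Sketch: orthogonal projection in L^2(0, pi) of f(x) = x onto
# span{sin(x), ..., sin(Nx)}; c_n = <f, sin(nx)> / ||sin(nx)||^2
# are the least-squares (Fourier) coefficients.
def integrate(g, a, b, steps=20_000):
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h

def f(x):
    return x

def coeff(n):
    inner = integrate(lambda x: f(x) * math.sin(n * x), 0.0, math.pi)
    return inner / (math.pi / 2.0)        # ||sin(nx)||^2 = pi/2

def proj_error(N):
    cs = [coeff(n) for n in range(1, N + 1)]
    def p(x):                             # the orthogonal projection
        return sum(c * math.sin(n * x) for n, c in zip(range(1, N + 1), cs))
    return math.sqrt(integrate(lambda x: (f(x) - p(x)) ** 2, 0.0, math.pi))

errs = [proj_error(N) for N in (1, 3, 6)]
assert errs[0] > errs[1] > errs[2]        # bigger subspace, smaller error
```

The computed coefficients match the classical values 2(-1)^(n+1)/n, and the monotone error decrease is exactly the best-approximation property of orthogonal projection.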
Info
Channel: Dr Peyam
Views: 26,145
Rating: 4.9397593 out of 5
Keywords: functional, analysis, overview, infinite-dimensional, infinite, dimensional, linear, algebra, banach, spaces, vector, continuous, continuity, boundedness, bounded, operator, transformation, discontinuous, alaoglu, open mapping, closed graph, theorem, math, fun, weak, topology, weak*, weak-*, hilbert space, banach-steinhaus, invertible, extreme, extreme points, krein, krein-milman
Id: pTUo1g3kYMw
Length: 49min 35sec (2975 seconds)
Published: Thu Oct 04 2018