Spectral2

Video Statistics and Information

Captions
Last time in lecture we were talking about using the FFT. We talked about the basic structure of the FFT, and the whole idea of the FFT is to say: if I have some differential or partial differential equation, I can expand the function in terms of cosines and sines. Now, any time you do this, the important thing is how to calculate the expansion coefficients, and in particular we showed that generically it's an order N² operation to calculate them. The big idea of Cooley and Tukey was that this can drop down to N log N, which all of a sudden makes it, instead of just another order N² technique, one of the fastest methods on the market.

Now let's step back a minute and ask: is there anything special about what I did there? This expansion in terms of cosines and sines is pretty standard; it's called Fourier analysis, and a lot of people have heard of it. But there's nothing special about those particular functions. I could expand in any set of functions, as long as they form a complete set, which cosines and sines do. In particular, what we're going to talk about today are the Chebyshev polynomials. You'll see two spellings in the literature, one with "Ch" and one with "Tch...sh"; I'll try to stick with the "Ch" spelling, though I found my notes littered with both. They're not encountered frequently; this is probably the first time you've heard of Chebyshev. But they are a type of polynomial with some special properties, and the key is going to be simply that for this specific
example, I can make a transformation which allows me to solve problems in N log N again. That's why Chebyshev gets singled out. Again, nothing special about this type of function except that it goes fast, but there are some others you might want to consider for different problems. For instance, here are some other special functions you can expand a problem in. Bessel functions: they're a really good expansion basis, instead of cosines and sines, when you're dealing with radial problems; that's actually where the Bessel functions come up in the first place. That's such a common problem that people said, let's call the solutions Bessel functions, and now there's all this stuff known about them; MATLAB can access Bessel functions. It's a nice expansion basis because in some sense it's the natural one for that type of problem. Legendre polynomials are really good for 3D Laplace problems; that's where they arise, so in that situation Legendre is the thing you want to think about using. Here are a couple more of the famous ones I wanted to point out. Hermite-Gauss polynomials come up in the Schrödinger equation with a harmonic potential, one of the classic problems of quantum mechanics; if you have a problem closely associated with that, then Hermite polynomials might be the right type of expansion basis, because in some sense you're expanding in the natural solutions of the system. Let me write down a couple more: spherical harmonics, for 3D problems with some kind of radial symmetry, and finally our Chebyshev polynomials, which we're going to use on 1D bounded domains. So when would you use these versus cosines and sines? Well,
you can always use cosines and sines, and your accuracy depends upon how many modes you use. In other words, when you solve the problem you decide on the number of Fourier modes, always a power of two: you chop your domain into so many points, whether it's 32, 64, 128, 256, and the accuracy depends on how many Fourier modes represent your solution. For a generic function you may need quite a number of modes, but the FFT goes so fast that you go, okay, I can take a lot of Fourier modes because I can still solve this fairly quickly. So why would you want to use anything else? You might want to use one of these other polynomial bases if, for instance, you're in a Hermite-Gauss type situation and say: my problem is closely associated with this, so I want to expand in these polynomials. Why would you do that, when all of these are still going to cost you order N², versus your cosine and sine expansion? The reason is that because you're expanding in modes that are in some sense natural to the system, maybe you only have to use 30 modes to get a good solution, whereas over here you'd have to use 256 modes to get the same kind of behavior. Why would you need more modes there? Because cosines and sines are so generic; they're not specific to the problem. The natural basis, in some sense, is specific to the problem, so it gives you a much better representation with a lot fewer modes. So the reason you would use these things is simply that: if you can get away with far fewer modes and still maintain your accuracy, then you're working with a much smaller number of modes, and maybe you win here despite the fact that this
is, again, an order N² computation; if you can go quite a bit smaller in the number of modes, you're in good shape. That hopefully makes sense. So for every problem you should think about these things: if there are natural solutions, a natural expansion basis, you might want to take advantage of that. Now, people have often heard of Bessel functions, or Hermite, or Legendre, but Chebyshev is one of those where you go, I've never heard of that one. So what's special about it? Again, it's the only one of these for which you can make a simple transformation that takes you down to an N log N problem.

So let's start talking about some properties of the Chebyshev polynomials. First, they satisfy the following differential equation on the interval [-1, 1], with this funny structure with square roots of 1 - x² in it: sqrt(1 - x²) d/dx [ sqrt(1 - x²) dT_n/dx ] + n² T_n = 0. Some things to point out about this. First of all, it's a Sturm-Liouville problem. That's a very important fact, because a Sturm-Liouville problem has some really nice properties associated with the differential equation. In particular, if you solve for the eigenvalues of this equation, the eigenvalues are real, and not only real but ordered: the nth eigenvalue goes like n². That's one fact. Two: the eigenfunctions of this problem are real, and the eigenfunctions are these T_n(x). Three: there are orthogonality conditions associated with the eigenfunctions, which are the following: the integral from -1 to 1 of (1 - x²)^(-1/2) T_n(x) T_m(x) dx equals (π/2) c_n δ_nm, where δ_nm is
the Kronecker delta: it's 1 when n = m and 0 otherwise, and here c_0 = 2 and c_n = 1 for n ≥ 1. And finally, these eigenfunctions are complete. So those are mathematical facts; what do they mean for us when we solve something? Go back to what we were talking about. When you do a Fourier expansion, the whole idea is: I take a function and expand it in terms of cosines and sines, and the only thing I have to calculate in that expansion is the a_n's. When you use Chebyshev, you're saying: instead of expanding in cosines and sines, I'm expanding in these polynomials. Same thing; the only thing you have to figure out is how to calculate the a_n's. So fundamentally it's no different from what you do with cosines and sines; you're just changing the function you expand in. And by the way, if you were expanding in Bessel functions or Hermite polynomials or Legendre, it's the same story, except you replace this with a Bessel function or a Hermite polynomial, and the whole game is to calculate those coefficients. Once you calculate them, that gives us the algorithm: you're now working with these a_n's, and that's what gives you the ability to start manipulating things. So if you don't know what a Sturm-Liouville problem is, no big deal; the most important thing about Sturm-Liouville is that to find these coefficients a_n, what you rely on is the orthogonality conditions. It could have been worse.
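The orthogonality condition just mentioned becomes, under the substitution x = cos θ, the ordinary cosine orthogonality integral: the integral from 0 to π of cos(nθ) cos(mθ) dθ = (π/2) c_n δ_nm. A quick numerical check of that claim; the midpoint-rule quadrature here is my own illustration, not something from the lecture:

```python
import math

def cheb_inner(n, m, N=200):
    """Midpoint-rule approximation of the weighted inner product
    integral_{-1}^{1} T_n(x) T_m(x) / sqrt(1 - x^2) dx, computed after
    the substitution x = cos(theta), which turns it into
    integral_{0}^{pi} cos(n*theta) cos(m*theta) d(theta)."""
    h = math.pi / N
    return sum(math.cos(n * (j + 0.5) * h) * math.cos(m * (j + 0.5) * h)
               for j in range(N)) * h

# Orthogonality: zero for n != m, and (pi/2) * c_n on the diagonal
same = cheb_inner(3, 3)   # expect pi/2 (c_3 = 1)
diff = cheb_inner(2, 5)   # expect 0
zero = cheb_inner(0, 0)   # expect pi (c_0 = 2)
```

For trigonometric integrands like these, the midpoint rule on [0, π] is actually exact up to rounding, so the agreement is to machine precision.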
All right, let me show what some of these things actually look like. The definition of the Chebyshev polynomials is: T_n(cos θ) = cos(nθ). That's kind of an odd way to write it, by the way: instead of writing x, I've written cos θ, and the nth polynomial takes that argument. So what does this give? T_0: put in n = 0 and you get cos(0), which is 1, so T_0(x) = 1. T_1: think about it, put n = 1 in there and you get T_1(cos θ) = cos θ, so if I let cos θ be x, then T_1(x) = x. Now it gets a little fancier; this is where you have to start using trig rules. Think about T_2: it means T_2(cos θ) = cos 2θ. And what's cos 2θ? It's cos²θ - sin²θ, and since sin²θ = 1 - cos²θ, this comes out to be 2cos²θ - 1, just using your trig rules. Now let cos θ be x, and this becomes T_2(x) = 2x² - 1. You see you're going to have to get fancier and fancier with your trig rules; do you know cos 3θ off the top of your head? Probably not. Anyway, that's how you generate these higher-order things. Let me write down the first five: T_0(x) = 1, T_1(x) = x, T_2(x) = 2x² - 1, T_3(x) = 4x³ - 3x, T_4(x) = 8x⁴ - 8x² + 1. These are your first five Chebyshev polynomials; you're going to expand a function in the set of these things. So what do they look like on [-1, 1]? First, something to notice from the definition: the value of T_n is bounded between -1 and 1. It has this nice property because it's related to the cosine. So T_0 is just the constant 1; T_1 = x is just a line; and the next one, 2x² - 1, isn't that the equation of a parabola?
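A quick check, my own sketch rather than anything from the lecture, that the definition T_n(x) = cos(n·arccos x) really does reproduce the explicit polynomial forms generated by the trig identities:

```python
import math

def T(n, x):
    """Chebyshev polynomial via the defining relation T_n(cos t) = cos(n t),
    i.e. T_n(x) = cos(n * arccos(x)) for x in [-1, 1]."""
    return math.cos(n * math.acos(x))

# The explicit forms the trig identities produce:
explicit = [
    lambda x: 1.0,                      # T_0
    lambda x: x,                        # T_1
    lambda x: 2 * x**2 - 1,             # T_2
    lambda x: 4 * x**3 - 3 * x,         # T_3
    lambda x: 8 * x**4 - 8 * x**2 + 1,  # T_4
]

# Compare the two on a few sample points in [-1, 1]
max_err = max(abs(T(n, x) - explicit[n](x))
              for n in range(5)
              for x in [-1.0, -0.5, 0.0, 0.3, 1.0])
```

Since T_n is a cosine in disguise, this also makes the bound |T_n(x)| ≤ 1 on [-1, 1] obvious.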
Not only is it a parabola; at x = 0 it takes the value -1, so it goes like this. The next one is a cubic, which goes something like this, and the next is a quartic; I have a picture of them there on the page. But you see, this is the set of functions you're going to start expanding everything in.

[Student question: the differential equation sounds nice, but where did it come from?] Good question. This relation is in some sense the definition of the Chebyshev polynomials; that's just how you define the sequence of what you get. If you make the change of variable, this is a function that lives, in terms of θ, on 0 to π while x goes from -1 to 1, so if you make that change of variable you'd look at the problem on that θ interval. I'm not sure there's a better explanation for you here: basically, the solutions of the differential equation are exactly the functions that satisfy T_n(cos θ) = cos(nθ). They are all solutions to that differential equation, they all satisfy the orthogonality relationship, they're all real eigenfunctions, and the eigenvalue n² is associated with each one. So these two are intimately connected; you can't do one without the other.

All right, now here's the major observation. In some sense you could ask, why would I expand in this? Cosines are sometimes a lot easier. But there are two issues. One is boundary conditions, because these don't have periodic boundary conditions like we had before. Suppose I want to solve a problem on a domain and I don't have periodic boundary conditions; I have something a little bit more difficult. Then
you'd want to use something like this. The second issue is speed: N log N. Why is it N log N? I'll show you. Notice that by definition there's a natural argument change built into the way we defined this function: x = cos θ, with θ on [0, π] for x on [-1, 1]. Now, a couple of things to notice. If we consider a function f(x), under this change of variables it gives me some function of θ: g(θ) = f(cos θ). Let's take the derivative of this thing with respect to θ. Using the chain rule, dg/dθ = -f'(cos θ) · sin θ, because the derivative of the inside is -sin θ. Here's the observation to make: θ goes from 0 to π, and what is sin θ at 0 and π? Zero. So at θ = 0 or θ = π, the derivative of this thing, for an arbitrary function f, is zero: no-flux boundary conditions. And then the light goes on in your head: I get it, because look, no-flux boundary conditions, and there's this cosine transformation over here, so can't I just use a discrete cosine transform to solve this once I make the change of variables? And indeed you can. Notice the no-flux boundary conditions, notice this definition, and notice what the n's are doing: cos(nθ). What did the discrete cosine transform do, fundamentally? It expanded things in cosines, and it did it in N log N. So you see, the Chebyshev polynomial, once you transform it through this change of variables, allows you to use a discrete cosine transform to actually solve for those coefficients, your a_n's. Remember, you're going to expand an arbitrary function f(x) in terms of
these Chebyshev polynomials, and you've got to calculate these coefficients. That's the big deal, and this is what you want to do computationally, as fast as possible. There's the theoretical representation of it, but computing it generically costs order N². What we just did is show that if you make this change of variable, you turn it into a problem where you can use a discrete cosine transform to get the coefficients, and all of a sudden you've got yourself down to N log N. Now, I made the comment in class that the thing that has made the FFT this ubiquitous thing in the sciences is just the speed issue. It's not because everybody loves cosines and sines; it's because using cosines and sines and the FFT algorithm, you're basically going faster than almost anything else out there. The speed pickup you get from it is amazing. And this is why the Chebyshev polynomial, this obscure polynomial that nobody hears about, is a very important one in computational mathematics: the speed brings it back to life. You go, hey, I know nothing about this thing, I've never used it in my life, but if I'm going to do some computing with boundary conditions besides periodic, I know about this. That puts it at the forefront of computing. And this is again a spectral technique, because again you're expanding in terms of a basis and you just have to calculate coefficients. (All right, thanks; well, I'm glad you appreciate it, though I'm not sure my audience here does.)

So here we go: two issues about the Chebyshev transform are important. One, what are some of its properties, and two, how does it take care of derivatives? Remember, one of the other key things about the Fourier transform was that when you Fourier transform the derivative of something, all it does is pop out an ik, so you ended up with an algebraic system.
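To make the coefficient computation concrete: sampling at the Chebyshev points x_j = cos θ_j turns the a_n integrals into plain cosine sums. This sketch, my own illustration, writes them as explicit O(N²) sums; in practice a DCT routine (e.g. scipy.fft.dct) evaluates exactly these sums in N log N:

```python
import math

def cheb_coeffs(f, N):
    """Chebyshev coefficients a_k of f(x) ~ sum_k a_k T_k(x), computed by
    cosine quadrature at the Chebyshev points x_j = cos(theta_j) with
    theta_j = pi*(j + 1/2)/N.  Written as explicit O(N^2) sums here;
    these are exactly the sums a DCT evaluates in O(N log N)."""
    theta = [math.pi * (j + 0.5) / N for j in range(N)]
    fx = [f(math.cos(t)) for t in theta]
    a = [(2.0 / N) * sum(fx[j] * math.cos(k * theta[j]) for j in range(N))
         for k in range(N)]
    a[0] /= 2.0  # the c_0 = 2 in the orthogonality relation halves the k = 0 term
    return a

# Example: f(x) = x^2 = (T_0 + T_2)/2, so a_0 = a_2 = 0.5 and the rest vanish
a = cheb_coeffs(lambda x: x * x, 8)
```

Because x² is itself a short Chebyshev sum, the quadrature recovers its coefficients essentially exactly.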
We have to have the same kind of properties happen here to make this work. All right, some properties of the Chebyshev polynomials; I'm just going to write some of these down. First, you have a very nice relationship, an iterative scheme for generating the next polynomial: T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x). So instead of going through that procedure over there where you go, oh jeez, do I have to know all of my trig rules to get these things? No: starting off with T_0 and T_1, you can generate T_2, then T_3, maybe T_4, and so forth. This gives you an iterative way to get your higher Chebyshev polynomials. Second, the polynomial is bounded between -1 and 1, and you can calculate that its derivative is bounded by n². Also, if you remember the graph I drew, the polynomial at x = ±1 is at either 1 or -1: T_n(±1) = (±1)^n. And if you differentiate, the pth derivative at the boundary is T_n^(p)(±1) = (±1)^(n+p) times the product for k from 0 to p - 1 of (n² - k²)/(2k + 1), so you know what its boundary values are for any derivative; for the first derivative this gives n² at x = 1. Another property: if n is even, the function is even, and if n is odd, the function is odd. How do you see that? Just look at the list we wrote down: when n is even there are only even powers, and when n is odd only odd powers. This is important when you integrate them against each other. To calculate these coefficients you're going to use the orthogonality of the eigenfunctions, and the nice thing about these eigenfunctions, especially when you're mixing even and odd ones, is that those integrals always go to zero.
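The recurrence just mentioned, T_{n+1} = 2x T_n - T_{n-1}, generates all the higher polynomials from T_0 = 1 and T_1 = x without any trig identities. A small sketch of my own, building them as monomial coefficient lists:

```python
def cheb_poly_coeffs(N):
    """Generate monomial coefficient lists (lowest degree first) for
    T_0 .. T_N using the recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x)."""
    T = [[1.0], [0.0, 1.0]]  # T_0 = 1, T_1 = x
    for n in range(1, N):
        prev, cur = T[n - 1], T[n]
        nxt = [0.0] * (n + 2)
        for i, c in enumerate(cur):   # 2x * T_n shifts every degree up by one
            nxt[i + 1] += 2.0 * c
        for i, c in enumerate(prev):  # minus T_{n-1}
            nxt[i] -= c
        T.append(nxt)
    return T

polys = cheb_poly_coeffs(4)
# polys[2] -> [-1, 0, 2]        i.e. 2x^2 - 1
# polys[3] -> [0, -3, 0, 4]     i.e. 4x^3 - 3x
# polys[4] -> [1, 0, -8, 0, 8]  i.e. 8x^4 - 8x^2 + 1
```

Note the even/odd structure is visible directly in the coefficient lists: even n produces only even powers, odd n only odd powers.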
An even function times an odd function on the interval from -1 to 1 gives an odd function, which integrates out to zero, so the only ones you have to worry about are even against even or odd against odd.

All right, those are some properties, and before we get to the derivative properties there's one last issue associated with this problem that we have to take care of. What we really want to do with these Chebyshev polynomials is solve in the cosine basis: we make this transformation and solve on the domain of θ, which goes from 0 to π. Now think about what that means for a minute, because this will be all-important. Let's draw it. When x goes between -1 and 1, θ goes from 0 to π; in fact x = cos θ. So basically the cosine maps the interval [0, π] in θ onto [-1, 1] in x; you can think of it as actually working with θ values from 0 to π as x goes from -1 to 1. Now here's the deal. When you do a discrete cosine transform, you break the problem up into a bunch of points: you break θ into a bunch of evenly spaced points, because you're solving in θ so you can use the discrete cosine transform. But wait a minute: when you come back to x, what does that say about x? How are the points distributed in x? The reason I drew it this way is that if you project those evenly spaced θ points down to the x-axis, what do you get? I don't know if I drew it very well, but you get bunching of points at the edges, and in the
middle, fewer points. There's a figure of it in the notes, Figure 28, which is a little better; if you look at that, you see a lot of bunching of the points at the endpoints. So now when you solve this problem, say you take a function, a Gaussian for instance, define it on the interval [-1, 1], and project it onto this grid: you have a lot more points at the edges than you do in the middle. You have to be aware of that. You get very good definition, increased resolution, at the edges and less in the middle, so if you have a lot of sharp structure in the middle, you might have to take a lot more points to fill it in, and that gives you overkill at the boundary. However, a lot of problems are such that, when we actually solve them computationally, the boundary conditions drive a lot of the dynamics: there are changes near the boundaries, and the boundary conditions are driving the behavior. So you have a lot of changes near the boundary, and Chebyshev is perfect for that, because it puts a lot of data points there. So it's a non-uniform grid when you solve this thing, and by the way, I'm going to provide you with algorithms to do this stuff. So those are a couple of key factors. We're not going to do that much with Chebyshev in class or in our homeworks; mostly what I want to do is introduce the idea, because you should know that there's more than the FFT in life. In computational math life, there's more than the FFT you can use: there are these other ideas out there, and you may have a problem where you have to use them. So I just put the idea out there.
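The clustered grid just described comes from spacing θ evenly and mapping through x = cos θ: uniform in θ, bunched at the endpoints in x. A quick numerical sketch of my own showing the bunching:

```python
import math

# Chebyshev collocation points: equally spaced theta_j in [0, pi] mapped
# through x = cos(theta) onto [-1, 1].  Uniform in theta, clustered in x.
N = 16
theta = [math.pi * j / N for j in range(N + 1)]
x = [math.cos(t) for t in theta]  # runs from 1 down to -1

# Grid spacing in x: small near the endpoints, largest in the middle
gaps = [abs(x[j + 1] - x[j]) for j in range(N)]
edge_gap = gaps[0]          # spacing next to x = 1
middle_gap = gaps[N // 2]   # spacing near x = 0
```

With N = 16 the middle gap is roughly ten times the edge gap, which is exactly the endpoint resolution the lecture says boundary-driven problems benefit from.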
You can come back to it later in life. [Student question: what if the domain isn't -1 to 1?] Typically your generic problem is on some interval from -L to L, and what you do is make a transformation of the variables down to [-1, 1]. It's a lot like when we do Fourier transforms: we're always working on a 2π interval, and that's why, when you build your k values, your wavenumber vectors, you have that 2π/L in front; it's a rescaling down to 2π.

All right, with that established, we have all these properties of Chebyshev; there's a lot known about them. But again, this was obscure math: fifty years ago people were like, here's this polynomial and here are all its properties, who cares, and it was kept as an abstract thing. You don't learn it in most math classes; I never saw it until I came to a computation class, where you get this N log N speed factor going for you. So the only thing you have to figure out next is: how does this thing solve differential equations? This really is why we consider it, because now we say, generically, just like with the FFT, I'm going to have some differential equation and I want to Chebyshev-transform it. When we had a problem like this and you FFT'd it, you Fourier transformed the whole thing, and the derivative just popped out as (iα)² (I shouldn't use k; let's call it α). Then you can solve for û and inverse Fourier transform it; it's as easy as that. So the FFT relied on the fact that there is a very specific relationship between the transform of the derivative and the transform of the function itself. This is a very important property when we come to solving equations like this. The Chebyshev polynomials have some similar properties as well.
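The "pops out an ik" property of the Fourier transform can be seen directly in code. This sketch, assuming NumPy is available, differentiates sin x spectrally on a periodic 2π domain:

```python
import numpy as np

# Periodic spectral differentiation: transform, multiply by i*k, invert.
n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = np.sin(x)

k = 1j * np.fft.fftfreq(n, d=1.0 / n)         # the "pop out an ik" wavenumbers
du = np.real(np.fft.ifft(k * np.fft.fft(u)))  # should match cos(x)

err = np.max(np.abs(du - np.cos(x)))
```

Because sin x is band-limited, the spectral derivative here agrees with cos x to machine precision; this is the kind of derivative relationship the Chebyshev expansion needs an analogue of.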
In particular, here's the high-level abstraction: take some operator L acting on f, and the idea is we can expand both in terms of our Chebyshev polynomials. Let f(x) = Σ a_n T_n(x) and Lf = Σ b_n T_n(x). So here's the big deal: what's the relationship between the a_n's and the b_n's? That's what you try to establish. When we did the Fourier transform, the relationship was: if you take a derivative, you get an ik out, and that gave us the relationship between the coefficients of the function and of its derivative. We have to do the same thing here: I have a function, here's its expansion; if I take, for instance, a derivative of the function, I have a new expansion, but how do the b_n's relate to the a_n's I already know? So let's do one derivative. The relationship looks like c_n b_n = b_{n+2} + 2(n + 1) a_{n+1}. It's clearly not as clean as the Fourier transform, but if you take a derivative, you can find the derivative coefficients by summing up your a's in this fashion (I'll tell you what the c_n's are in a moment). For the second derivative there's a similar relation; I think I forgot to write down what it was, but you can see that in the notes. This basically turns out to be a matrix multiplication: to find the new b values, think of b and a as vectors; b is just some big matrix times a. That's the point of this whole thing. If I want the derivative, I know what the function is, so I can get the new b values from the a's by multiplying by some appropriate derivative matrix A, whose coefficients are determined from this sum; the second derivative is some matrix A2. I'm going to give you this matrix. So you see it works a lot like a finite difference scheme, but I get spectral accuracy here. That's the big important thing.
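The derivative relation above can be run backward from the highest mode, which is how it's implemented in practice (equivalently, applying the derivative matrix). A minimal sketch of my own, using the recurrence in the shifted form c_{n-1} b_{n-1} = b_{n+1} + 2n a_n:

```python
def cheb_deriv_coeffs(a):
    """Given coefficients a_n of f = sum a_n T_n, return coefficients b_n of
    f' via the backward recurrence c_{n-1} b_{n-1} = b_{n+1} + 2 n a_n,
    with c_0 = 2 and c_n = 1 otherwise (in matrix form, b = A a)."""
    N = len(a)
    b = [0.0] * (N + 1)  # b[N-1] and b[N] start at zero
    for n in range(N - 1, 0, -1):
        b[n - 1] = b[n + 1] + 2.0 * n * a[n]
    b[0] /= 2.0  # the c_0 = 2 factor
    return b[:N]

# Example: f = T_3 = 4x^3 - 3x, so f' = 12x^2 - 3 = 3 T_0 + 6 T_2
b = cheb_deriv_coeffs([0.0, 0.0, 0.0, 1.0])
# b -> [3.0, 0.0, 6.0, 0.0]
```

Each coefficient costs O(1) once the recurrence is run top-down, so differentiation in coefficient space is O(N), just as multiplying by ik is for Fourier.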
And then what happens if Lf is multiplying the whole thing by x? Then b_n = (c_{n-1} a_{n-1} + a_{n+1}) / 2, where c_0 = 2, c_n = 0 for n < 0, and c_n = 1 for n > 0. So those are some standard properties associated with these.

All right, that's the Chebyshev polynomial, and in some sense what I've tried to do is introduce it in short order. The things that are most important to realize: one, there is a connection between this and the cosine transform that allows you to pick up tremendous speed, N log N, because you take your problem, transform it over, and solve it in N log N. A consequence of that is that you now work on a non-uniform domain: your points have a higher density at the edges than in the middle, but you can solve in N log N. Two, once you understand that aspect of it, when you want to take derivatives you see that taking a derivative is essentially a mapping from the a_n's to the b_n's through some derivative matrix. I'm going to give you this matrix, and I think we're actually going to play with it in class; I can't remember exactly when, but we will. And that's going to be the most important thing, because just like the Fourier transform, where the ik is the connection between the derivative and the function itself, once we have this derivative matrix we can say: give me a PDE; you need to take a derivative, no problem, hit it with the derivative matrix and expand everything in terms of these Chebyshev polynomials. It's just another basis to work with, and it allows you to consider different boundary conditions than the Fourier transform stuff, so it's a good thing to have in the back of your head. That's it for Chebyshev; we're
done for today.
Info
Channel: Nathan Kutz
Views: 1,437
Rating: 4.84 out of 5
Keywords: Kutz, Nathan Kutz, Scientific Computing, PDEs, spectral methods, Chebyshev Transform
Id: eMFb1JvMFKs
Length: 46min 49sec (2809 seconds)
Published: Fri Mar 08 2019