The Search for Randomness | Jean Bourgain

Captions
Good afternoon. I am Phillip Griffiths, a professor in the School of Mathematics and former director of the Institute, and it is a great pleasure to welcome you here to the Institute this afternoon to hear Jean Bourgain, professor in the School of Mathematics, who will speak on the search for randomness. Following his lecture there will be time for some questions and answers, and after that you are all invited to a wine and cheese reception over in the Fuld Hall common room. The need for pseudorandomness in various parts of modern science, ranging from numerical simulations to cryptography, has challenged our limited understanding of this issue and our mathematical resources. In his talk, Professor Bourgain will explore some of the problems of pseudorandomness, along with some of the tools that are available to address these problems. His work touches on several topics that are central to contemporary mathematical analysis; in addition, it has had important consequences in fields as removed from the topic itself as theoretical computer science and analytic number theory. In 1994 Professor Bourgain was awarded the Fields Medal. He is also the recipient of numerous other prizes and a foreign member of a number of academies, including the French Academy, the Royal Swedish Academy, and the Polish Academy. He received his PhD in 1977 and his habilitation in 1979 from the Free University of Brussels, and he held positions at the Free University of Brussels and at the IHES in France before joining the faculty here in 1994. It is a pleasure to introduce Professor Jean Bourgain.

I would like to thank you for this introduction, Phillip. Let me first tell you that this is not going to be a philosophical talk; it is supposed to be a more mathematical talk. So I want to explain something about the title. What this title is about is the problem of generating, explicitly and within our computational means, models and processes that look like truly random models. What do I mean by a truly random model? Ideal coin flipping, for instance. This problem is obviously of significance in many applications of various types; just think of what is going on in gambling, in a slot machine or in a video game, but there are many other issues as well. It is also a problem that has generated, particularly at the Institute, a certain amount of quite interesting mathematics, also recently, and so what I want to do is to incorporate some comments about this mathematics throughout the talk.

Of course, a process which you can really generate the way I am describing, with easy generation, is going to be far from a truly random process, and you can only hope that there will be some common features with true randomness. More practically, you have the question of derandomization, which is roughly the following. Suppose you have a job which you know can be handled by a truly random process, so that if you had access to ideal coin flipping you would be able to take care of the matter; this matter can be various things, quite different one from the other. Then the question you are facing is how to come up with something which is really practical, some process which more or less accomplishes the same task. I will discuss two examples of that. The first example is that of generating random numbers, random integers; the second is the generation of random graphs. Both of these questions are of great practical importance, and both have also led to
quite a bit of interesting mathematics, I think.

To recapitulate: on one hand we have true randomness, which is generated by a truly random process such as ideal coin flipping. On the other hand there is pseudorandomness, which is the behavior of deterministic processes. It has various appearances. There are physical appearances; one noticeable issue where this randomness is put forward is the Boltzmann ergodic hypothesis, which is the statement that a mechanical system goes through all states compatible with its total energy. That is something I will not discuss here; it is what you may call the physical version of pseudorandomness. What I will discuss here is the other way to observe pseudorandomness, namely by arithmetical means. In fact, on second thought, I would guess that most of the pseudorandomness you encounter in real life (just think of what you see when you walk into a casino room) is either of this physical kind or of the arithmetical kind. This talk deals only with the arithmetical simulation of randomness, and this is already quite a vast topic, as you will see.

Generating random numbers: besides video games, there are many other issues where this is important. There is the issue of computer simulations, Monte Carlo methods, numerical integration, cryptography, and so on. Going back to the history, an early example of such an arithmetical process for generating random integers was put forward by von Neumann, although eventually he was rather negative about it, and rightfully so; I do not think there was ever any kind of rigorous mathematical study made of it. This is the middle-square method; still, historically one should mention it, and we will see something later that looks very much the same but is in fact quite different. So how does the middle-square method work? You start with a four-digit number, say 1111. You take its square and make it an eight-digit number: the square would be 1234321, if I calculated correctly, so you add a zero in front. Then you take the middle digits, 2343, and you repeat; you keep going. (A small code sketch of this procedure follows below.) Now, a posteriori it was found experimentally that as a pseudorandom number generator this does rather poorly, because the periods tend to be relatively short, which means that rather quickly you come back to the same numbers. This is in fact the main enemy when you try to generate pseudorandom numbers: the problem of having short periods. In fact, in the same paper, which was a paper later in his life, von Neumann was rather negative about it, and he made it clear, rightfully so, that using arithmetical methods to produce random digits is absurd.
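Here is a minimal sketch of the middle-square step just described, in Python; the four-digit convention and the seed 1111 follow the example in the talk, and iterating makes the short-period behavior visible.

```python
def middle_square(x):
    """One step of von Neumann's middle-square method for 4-digit numbers."""
    s = str(x * x).zfill(8)   # square, padded with leading zeros to 8 digits
    return int(s[2:6])        # keep the middle four digits

# Iterating from the seed used in the talk; in practice the orbit
# falls into a short cycle rather quickly, which is the short-period
# problem the talk identifies as the main enemy.
x = 1111
for _ in range(10):
    print(x)
    x = middle_square(x)
```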
On the other hand, there is a famous paper by Blum, Blum, and Shub, about which I will say a few things, where they take a more positive stand. Their first statement is more or less a truism, but then they make it more precise. What they say is that ideally we would like a pseudorandom sequence generator to quickly produce, from short seeds, long sequences of bits that appear in every way to be generated by successive flips of a fair coin. Now, of course, that is what you would like, but you have to be more specific, and they were more specific by requiring, first of all, that the standard statistical tests should be satisfied. So if you check, on a long sequence, the occurrence frequencies of the zeros and the ones, they should each be one half, and also, when you take consecutive digits, you should have the right mixing properties. That is what they called standard statistical tests, and there is by now quite an extensive theory of how to justify this kind of thing in the context of sequences which are arithmetically produced; we are going to say something more about that.

Then there is another issue, which is that of unpredictability. Of course, the fact that you have good mixing is already a weak form of unpredictability, but we want to go beyond that, and the statement would be that given a finite segment of the sequence, it is not easy to say what the previous or the next digit is going to be: guessing whether the next element is going to be 0 or 1 cannot be done better than by a random guess. As far as unpredictability is concerned, as we will see, there are in fact very few unconditional results, and this is something for which we are much less well equipped to start really proving theorems, unless these theorems are conditional.

Blum, Blum, and Shub discuss two examples. The second example is somewhat reminiscent of von Neumann's, and it is called the x^2 mod N generator. The x^2 mod N generator is one of the early examples of a pseudorandom number generator, and the way it goes is that you start from an integer N which is a product of two primes of about the same size, P and Q, and you assume that P and Q are congruent to 3 mod 4; that is, when you divide them by 4 the remainder is 3. (This is about as mathematical as this talk is going to get.) When you have that, you look at the quadratic residues mod N: these are the integers, say from 0 to N - 1, which are a square up to a multiple of N. Take the set X of quadratic residues; because of our assumption, every element has a unique square root in X. Now let me stress one fact, and this is quite important: you know the integer N, but you do not know the factorization of the integer. We will come back to that.

So what happens, and what you eventually want to output, is a sequence of zeros and ones. It works as follows. Your mapping goes from the quadratic residues to itself. You start from x_0, say some integer between 0 and N - 1; you take its square, you get another integer, and you reduce it mod N, that is, you look at the remainder after division by N; you get x_1, which is again between 0 and N - 1. Take x_1 squared, reduce it mod N, get x_2, and so on. So the sequence x_0, x_1, x_2, and so on consists of integers between 0 and N - 1, in fact quadratic residues, and to get a bit sequence you just take their parities.

If that is not sufficiently clear, let us go through an example (also sketched in code below). We take P = 7 and Q = 19, so the product N is 133. You start from x_0 = 4, which is an even number, so the first bit b_0 is 0. The next one, its square, is x_1 = 16, so again b_1 = 0. 16 squared is 256; you reduce it mod 133 and get a remainder of 123, so b_2 = 1. Take the square of 123 and reduce mod 133: you get a hundred, so b_3 = 0. Then you get 25, which gives you 1, and then 93, which again gives 1, and after six iterations you are back at the number 4, which gives you 0. So this is the outputted sequence, and its period is 6. That is how these things work.
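A minimal sketch of the x^2 mod N generator with the numbers from the talk (P = 7, Q = 19, N = 133, seed x_0 = 4): the parity of each iterate is the output bit, and the orbit returns to 4 after six steps.

```python
def blum_blum_shub(seed, n, steps):
    """Iterate x -> x^2 mod n, outputting the parity of each iterate."""
    bits, x = [], seed
    for _ in range(steps):
        bits.append(x % 2)   # output bit = parity of the current residue
        x = (x * x) % n      # square and reduce mod n
    return bits

# The example from the talk: n = 7 * 19 = 133, starting from x0 = 4.
# The orbit is 4, 16, 123, 100, 25, 93, 4, ... with period 6, so the
# bit sequence 0, 0, 1, 0, 1, 1 repeats.
print(blum_blum_shub(4, 133, 12))
```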
Now, to address the first question: what about the statistics? How do you express that you have good statistics, a good mixing of the zeros and ones? Well, the crucial concept is the mathematical concept of discrepancy. How does it work? You start from a string omega of zeros and ones of length r, where r is a fixed number, so you have omega_1, omega_2, ..., omega_r. Say here you have r = 3, with omega_1 = 0, omega_2 = 1, omega_3 = 1. So you are fixing a word of zeros and ones, and you look at the occurrence frequency of omega in a long segment of your sequence. Clearly, if you had a perfectly random sequence, this occurrence frequency would be 2^(-r), because every word omega is supposed to occur with the same frequency. So ideally the occurrence frequency would be 2^(-r), and what we are expressing with the discrepancy is just the difference between the two: the actual occurrence frequency and the ideal occurrence frequency.
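A minimal sketch of this discrepancy statistic: for a fixed word omega of length r, compare the empirical occurrence frequency of omega in a bit string with the ideal value 2^(-r). The test string, chosen only for illustration, is the short periodic Blum-Blum-Shub output from the example above, which is of course heavily biased.

```python
def discrepancy(bits, word):
    """| empirical frequency of `word` in `bits`  -  2^(-len(word)) |"""
    r = len(word)
    windows = len(bits) - r + 1
    count = sum(1 for i in range(windows) if bits[i:i + r] == word)
    return abs(count / windows - 2 ** (-r))

# The word omega = 011 from the talk, tested on the periodic bit
# sequence 0, 0, 1, 0, 1, 1, ... produced in the previous sketch.
bits = [0, 0, 1, 0, 1, 1] * 20
print(discrepancy(bits, [0, 1, 1]))
```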
Now, any theory of pseudorandomness has its roots somewhere in mathematics, in the sense that in order to do something we rely on mathematical theories, and often non-trivial mathematical theories. The interest of modular arithmetic is that such theories are available. The theory which expresses these good distributional properties is the theory of exponential sums, developed in particular through the very famous works of Weil, and it applies particularly well to confirming these good statistical properties of arithmetical sequences, for instance of the type I just described. The only trouble with the results used there is that they require very long periods: for instance, to get results about the x^2 mod N random number generator, one needs periods which are large compared with N to the power 3/4. This is a problem, because there we are entering the realm of unproven assumptions, which for mathematicians have to do with the multiplicative order of 2 modulo a prime; proving that this order is large infinitely often is something we cannot do at this point, and it is related to a mathematical problem known as the Artin conjecture. Blum, Blum, and Shub got around it in another way, also a conditional one, by relying on primes which are Sophie Germain primes: primes of the form 2p' + 1 with p' a prime itself. In other words, the integer N we start from is a product of two Sophie Germain primes, and for that kind of primes it is very easy to see that the period is going to be long. So what to do about that? If you want unconditional results there is still a way, and this came with recent developments in combinatorial number theory which allow one to handle such very short periods. Compared with the classical techniques, what you get is less precise information, but it is more broadly applicable, and in particular with these techniques you can prove results about the distribution, and the good joint distribution, of the x^2 mod N random number generator without having to rely on Artin's conjecture. I will come back to all these methods a little later.

Now I have to say a few things about unpredictability issues, as this was really the main emphasis in the Blum-Blum-Shub paper. Regarding unpredictability, as I said, there are few unconditional results. What Blum, Blum, and Shub put forward is an assumption which they call the quadratic residuosity assumption. What is the quadratic residuosity assumption? Remember that I introduced the set X of quadratic residues. The assumption is just the statement that if I give you an integer x, deciding whether x is a square mod N is hard; it cannot be done in an easy way. This is an assumption. Note that it is again very important that N is a product of two primes and that we do not know the factorization of N; otherwise it would be easy (the easy direction, when the factors are known, is sketched in code below). What they show is that under this quadratic residuosity assumption, the x^2 mod N random number generator is unpredictable to the left. What does that mean? Assume you had a better-than-random guess of what the previous digit in such a sequence is going to be, knowing a certain segment. If you had a strategy to do that, then you would automatically have a strategy to decide, at least with very high probability, whether an integer is a quadratic residue mod N or not. Since you assume there is no easy way to decide that with high probability, you conclude, and this is of course a conditional statement, that predicting the next digit to the left is hard. A stronger claim was also obtained: assuming the quadratic residuosity assumption, the sequences produced by x^2 mod N pass every probabilistic polynomial-time statistical test. I will not explain what that is, because it would not be very helpful; it basically tells you what you feel it should tell you.
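A minimal sketch of the easy direction flagged above, assuming the factorization is known: by Euler's criterion, for an odd prime p and x coprime to p, x is a square mod p exactly when x^((p-1)/2) = 1 mod p, and x is a square mod N = PQ exactly when it is a square mod both factors. The primes 7 and 19 from the earlier example are reused for illustration.

```python
def is_qr_mod_prime(x, p):
    """Euler's criterion: for an odd prime p and x coprime to p,
    x is a quadratic residue mod p iff x^((p-1)/2) == 1 (mod p)."""
    return pow(x, (p - 1) // 2, p) == 1

def is_qr(x, p, q):
    """With the factorization n = p*q in hand, x is a square mod n
    iff it is a square mod p and mod q (Chinese remainder theorem)."""
    return is_qr_mod_prime(x, p) and is_qr_mod_prime(x, q)

# The iterates 4, 16, 123, 100, 25, 93 from the earlier example are
# all quadratic residues mod 133 = 7 * 19, as they must be.
print([is_qr(x, 7, 19) for x in [4, 16, 123, 100, 25, 93]])
```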
So what we have seen is one example where we use algebraic complexity as a substitute for true randomness. In other words, we have produced, in a simple way, in fact by simple arithmetic (it is very easy to generate the Blum-Blum-Shub sequence), sequences that have a certain number of random properties, some even provable, some conditional. It turns out that there have been particular advances in this area of using algebraic complexity to generate so-called randomness, because of recent developments in the arithmetic combinatorics of finite fields, and surely some of you may have heard about that, because of some insight which is called the sum-product phenomenon. To repeat the way my colleague Avi Wigderson puts it: what is the sum-product phenomenon? If you go back to early grade school, what you learn is the addition table, and a little later you also learn about the multiplication table. If you write both out for the numbers from 1 up to 99 or 100 and compare them, you will see that there are many more numbers that do not appear in the multiplication table. This is of course well known, but it turns out to be only a manifestation of something much more general, and somehow this was realized only six years ago or so; in all the recent developments I am referring to, this principle, or variants of it, plays quite an important role, and also in many things I will not talk about here.

Another example where this principle plays a crucial role is the derandomization of random graphs. Of course one always has to specify the property at stake, and here it is the expander property: we are looking for graphs which are sparse (I will explain what sparse means) and have high connectivity, which is going to be expressed through certain expansion properties, as I will make precise in a moment. So this is again a derandomization problem, and it was solved by algebraic techniques. There are many applications; this problem is of interest for many reasons. It gives you efficient communication networks, for instance the way the brain works, or at least should work; error-correcting codes; derandomization of random algorithms; problems of quantum computation; group theory; geometry; and so on. The rest of this talk is going to be about some of these applications. I will probably not get to the end, but when the time is over we just stop; we will see how far we get.

The first concept is sparseness. A way to get a sparse graph is to look at k-regular graphs, where k is a fixed number, say 3. The graph is on a large number of vertices, and k-regular just means that every vertex has k neighbors. What you have here is a model of a graph on 80 vertices, if one counts them, and it is 3-regular, because every vertex has three neighbors; it is connected to three other vertices, as one sees.

The other concept is high connectivity. What does high connectivity mean? It means that if you start doing a random walk on the graph, then although the graph is very sparse, very quickly you can end up anywhere; you want to say that very quickly you fill the whole set of vertices. The key concept here is the expansion ratio, and since it is going to be essential for the rest of the talk, let me take a moment to explain it. What we have is a graph, here a graph on six vertices, and a subset S of the vertices, for instance the upper three vertices. What we call the boundary of the set S is just the collection of all the edges that go from S to its complement. For instance, if you look at the set of the three top vertices, what is the boundary? You have these two edges which go from the set to its complement, then these two, and these; so the size of the boundary is 6, and the ratio for that particular set is 2. So what is the expansion ratio? It is the minimal ratio between the size of the boundary of a set and the size of the set, where we restrict, of course, to sets which are not too large, say at most half of the vertices. Expansion means that this ratio remains bounded from below. Roughly speaking, in layman's words, it means that every set has a large boundary, unless the set is basically everything; and if every set has a large boundary, then obviously, if you walk around the graph, you are very quickly going to fill the whole graph. (A small computation of this ratio is sketched below.)

So there is the question about the existence of such graphs, because the two properties may a priori seem somewhat contradictory: on one hand we want a graph which is sparse, on the other hand we want this expansion property. This problem was solved, at least in the random world, by Pinsker in 1973, who proved that a typical k-regular graph, where k is a fixed integer greater than or equal to 3, is an expander graph; so look at a typical 3-regular graph, where typical means a random 3-regular graph on n vertices, and it is clear what that means. As stated this is an ambiguous statement; what it means, of course, is that as n grows, the expansion ratio does not go to 0. This is what you call an expanding family. So here we obviously have a derandomization problem, namely: how do you come up with examples which are really constructive? Because randomness does not belong to the practical world.
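A minimal sketch of the expansion ratio just defined, computed by brute force over all subsets of at most half the vertices of a small graph; the 10-cycle is used as the test graph, anticipating the Cayley-graph example that follows.

```python
from itertools import combinations

def expansion_ratio(n, edges):
    """min over nonempty S with |S| <= n/2 of |boundary(S)| / |S|,
    where boundary(S) = edges from S to its complement (brute force)."""
    best = float("inf")
    for size in range(1, n // 2 + 1):
        for S in map(set, combinations(range(n), size)):
            boundary = sum(1 for u, v in edges if (u in S) != (v in S))
            best = min(best, boundary / size)
    return best

# On the 10-cycle, any interval of consecutive vertices has exactly
# 2 boundary edges, so the minimum is attained by a 5-interval: 2/5.
cycle = [(i, (i + 1) % 10) for i in range(10)]
print(expansion_ratio(10, cycle))   # 0.4
```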
So how does one come up with such concrete expander graphs? Again, algebraic methods did it. In particular, there is the work of Margulis, who produced, also in 1973, explicit expander graphs by looking at Cayley graphs of groups. Cayley graphs on groups are very simple graphs, and they are completely explicit. The way it works (I will give you several examples) is that you start from a group; these are the vertices. Then you take a set S of generators of the group, a set of small size, say k, and one likes to take this set symmetric. You then get a graph, called the Cayley graph corresponding to the pair (G, S), G being the group and S the generators, where the vertices are just the elements of G and the edges are obtained simply by connecting an element of G with the elements you get by multiplying with one of the elements of the generating set. Why do you want the set to be generating? Because you would like this graph to be connected: the minimal condition for a graph to be an expander is, of course, that it be connected, and that is why you assume that your set generates.

Let me give you an example which is also of a very elementary nature and which, unfortunately, does not produce expander graphs. Here we take the group G to be the integers with addition, reduced mod 10. The elements of this group are 0, 1, 2, 3, and so on up to 9; when you keep going you get 10, but 10 is 0 if you reduce mod 10. Now, 1 is certainly a generator, and since you want to make the set symmetric, you take S = {1, -1}; so you connect every element of the group with the previous one and the next one. If you are at 9, then x - 1 gives you 8, and x + 1 is 10, but that is 0, because of the reduction mod 10. So that is one simple example of a Cayley graph (sketched in code below). As you will undoubtedly realize, this Cayley graph, and its extensions when you replace 10 by 10,000 or so, are never going to be good expanders: if you take a large interval, say 8, 7, 6, 5, 4, you only get a boundary of size 2, and no matter how large an integer you take instead of 10, you keep that phenomenon. So you will not get good expanders.
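A minimal sketch of this cycle Cayley graph, the group Z/nZ with the symmetric generating set {+1, -1}; intervals always have boundary 2, so the boundary-to-size ratio of a half-cycle goes to 0 as n grows, confirming that the family does not expand.

```python
def cycle_edges(n):
    """Edges of the Cayley graph of Z/nZ with generating set {+1, -1}."""
    return [(x, (x + 1) % n) for x in range(n)]

def interval_ratio(n, k):
    """Boundary-to-size ratio of the interval {0, ..., k-1} in the n-cycle;
    an interval of consecutive vertices always has exactly 2 boundary edges."""
    S = set(range(k))
    boundary = sum(1 for u, v in cycle_edges(n) if (u in S) != (v in S))
    return boundary / k

# Replacing 10 by 10,000, as in the talk, only makes matters worse:
# the ratio 2 / (n/2) of the largest allowed interval tends to 0.
for n in [10, 100, 10000]:
    print(n, interval_ratio(n, n // 2))
```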
There is a reason for that: if you want good expansion, the way Margulis did it for instance, what is needed are very non-commutative groups, and typical examples are provided by matrix groups. When I told you before that "this is as mathematical as it's going to get", I cheated a little bit; this is probably the most mathematical slide here. We are looking at the groups SL_2(p): 2-by-2 matrices (a, b; c, d), where a, b, c, d are residues mod p, so they range from 0 to p - 1, and the condition is that the determinant should be equal to 1 after reduction mod p, that is, it should be 1 plus a multiple of p. As an example, take p = 5; then I have the group SL_2(5). In general SL_2(p) has p(p - 1)(p + 1) elements, but here we make a small twist: we take the projective version, so we only have 60 elements. What does the projective version mean? It just means that we factor out minus the identity. For instance, if you look at the matrix B there, B is not B-inverse, but it is B-inverse up to minus the identity. So the set I am writing there, {A, A^(-1), B}, is indeed a symmetric set in the group PSL_2(5), which has 60 elements, and what you get is a graph which is 3-regular. When p goes up, you get nice expansion properties, which is quite a non-trivial fact.

Perhaps, instead of trying to draw pictures of this kind, it is better to look at another description. It is not exactly the same thing, but it is very closely related, and the relation comes from linear fractional maps mod p; if you do not know what that is, just ignore it. Let me just tell you that instead of taking SL_2(p), we go back to the old set of residues mod p. This is exactly the example I was describing before, where you look at the Cayley graph of Z mod pZ, but now we add one additional edge: we connect not only x to x plus or minus 1, but also x to 1 over x. The whole thing still remains very simple, we are just adding one more edge connecting x to 1/x, but the structures you get are much, much more complex, and they illustrate that these graphs have a much higher level of connectivity. What you see here are pictures for p = 101, 199, 97, and so on, and as you can see, these things are infinitely more complex than what I was showing you before.

So what is the mathematical theory underlying this? Part of it was provided by Selberg, from whose work expansion followed, at least for a large class of generators, but still for special generators. To illustrate the situation more completely, there is the following problem, which really shows the type of situation one had; it is called the 1-2-3 problem. What you see here are three sets of generators of SL_2(p). The first set consists of the matrices (1, 1; 0, 1) and (1, 0; 1, 1); one has to take the symmetric copies, I did not symmetrize. For the second set you replace the ones by 2, and for the third you replace the 2 here and there by 3. These systems of generators look quite analogous, and they are all generators of SL_2(p), but strangely enough, S_1 and S_2 turn out to lead to expander graphs, as follows from Selberg's theory, whereas for S_3 the problem was solved only quite recently, again with methods from arithmetic combinatorics, because somehow S_3 did not fall under the purview of what Selberg was proving.

The recent advance is that one now has a robust theory of expander Cayley graphs, in particular for the groups SL_2(p), which again came from the progress in arithmetic combinatorics I was referring to before. One particular manifestation of these results is that for such a Cayley graph, to have a good expander it basically suffices that there are no short loops: you do not want loops which are small compared with the logarithm of n. (A sketch of this girth computation appears below.) In particular, this implies that most choices of generators give you good expansion, so what you have is some kind of robust theory. Why is this interesting? Of course it produces large classes of expander graphs, but it has another interest: it explains certain phenomena beyond randomness that one could not explain before. Two examples of such phenomena: one has to do with quantum computation, and the other with quasicrystals. I was weighing these two topics depending on the amount of time remaining, but I think I have enough time to say a little bit about both.
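A minimal sketch, for illustration only, of the quantity behind the short-loops criterion just mentioned: the girth (the length of the shortest cycle) of the graph on Z/pZ with edges from x to x plus or minus 1 and to 1/x, computed by breadth-first search and compared with log p. Two conventions are assumptions made here: 0, which has no inverse, keeps only its cycle edges, and self-loops at the fixed points x = 1 and x = -1 of the inversion are dropped. For this particular family the computed girth can stay small; the robust theory in the talk concerns generic generating sets.

```python
import math
from collections import deque

def inversive_graph(p):
    """Adjacency sets on {0, ..., p-1}: x is joined to x+1, x-1 and,
    for x != 0, to the inverse of x mod p (self-loops dropped)."""
    adj = {x: set() for x in range(p)}
    for x in range(p):
        adj[x].update({(x + 1) % p, (x - 1) % p})
        if x != 0:
            inv = pow(x, p - 2, p)   # Fermat: x^(p-2) = x^(-1) mod p
            if inv != x:
                adj[x].add(inv)
                adj[inv].add(x)
    return adj

def girth(adj):
    """Length of the shortest cycle, found by a BFS from every vertex."""
    best = float("inf")
    for root in adj:
        dist, parent = {root: 0}, {root: None}
        queue = deque([root])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w], parent[w] = dist[v] + 1, v
                    queue.append(w)
                elif w != parent[v]:   # non-tree edge: closes a cycle
                    best = min(best, dist[v] + dist[w] + 1)
    return best

for p in [101, 199]:
    print(p, girth(inversive_graph(p)), round(math.log(p), 2))
```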
So, quantum computation. In quantum computation you want to perform operations on qubit states. These qubit states are superpositions of two basic states, 0 and 1, with coefficients alpha and beta subject to the rule written here: alpha and beta are complex numbers whose squared moduli sum to one, |alpha|^2 + |beta|^2 = 1. The operations on these qubit states are performed by unitary matrices; think, for instance, of Shor's algorithm, of calculating a fast Fourier transform, and so on. In order to avoid faults, what you would like to have are gates for which fault-tolerant constructions exist. So the problem you run into is the following: assume you have an alphabet A of base gates, special gates for which fault-tolerant constructions are available; then, given a general gate, meaning a general unitary matrix, you would like to approximate it well by a word, a product of elements of A and their inverses. From this comes the following natural definition. What is a computationally universal set of gates? It is a fixed finite set A of base gates such that an arbitrary gate can be approximated arbitrarily well by a product of elements of A and their inverses.

Now, the concept would not be interesting unless one could give a promising answer to the question of how well an arbitrary gate can be approximated by a short word in A. This problem is solved, to some extent, by a process which is relatively simple but still has quite a bit of content, since it really uses the structure of the special unitary group: the Solovay-Kitaev type algorithm. The Solovay-Kitaev algorithm tells you that whenever you take a finite set A which is computationally universal, then A fills the special unitary group very quickly. There are several versions of the Solovay-Kitaev algorithm; one of them performs as follows. You start from some gate, some element of SU(2). For any epsilon, no matter how small, you can find a sequence in your alphabet of base gates of length log(1/epsilon) to the power 3.97, say, which gives you an approximation up to epsilon. And there is an extra bonus, which is also quite useful: the sequence does not merely exist, it can be generated in a relatively easy way, by an algorithm with this running time.

Of course, the problem people put forward is whether one can do better. If you want to approximate up to epsilon, then since SU(2) is three-dimensional you need to generate about epsilon^(-3) elements, and you cannot do that unless your sequences are at least of length log(1/epsilon); sequences shorter than that are simply not numerous enough to approximate everything. From this comes another classical notion, that of an efficiently universal alphabet: an alphabet such that any SU(2) gate can be approximated within epsilon precision by a string of length of order log(1/epsilon). Now, I told you that we are going beyond randomness; the truth is that we do not know whether a random choice of elements for A would do this job. On the other hand, there is a famous example, due to Lubotzky, Phillips, and Sarnak, which was in fact quoted several times in the literature on quantum computation, where they produce, based on rather non-trivial theory of the type I described before, an explicit system V_1, V_2, V_3 which gives such a system of efficiently universal gates. (A brute-force illustration of approximating gates by words is sketched below.)
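A minimal brute-force sketch of approximating a target SU(2) gate by words over a finite alphabet. The alphabet here, rotations by 1 radian about the x- and z-axes, is an assumption chosen only for illustration (it is not the Lubotzky-Phillips-Sarnak system); generically such a pair generates a dense subgroup of SU(2), and the best word error decreases as longer words are allowed, though exhaustive search is exponential in the word length.

```python
from functools import reduce
from itertools import product
import numpy as np

def rx(t):   # rotation about the x-axis, an element of SU(2)
    return np.array([[np.cos(t / 2), -1j * np.sin(t / 2)],
                     [-1j * np.sin(t / 2), np.cos(t / 2)]])

def rz(t):   # rotation about the z-axis, an element of SU(2)
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]])

# A hypothetical base-gate alphabet, closed under inverses.
ALPHABET = [rx(1.0), rx(-1.0), rz(1.0), rz(-1.0)]

def best_word_error(target, max_len):
    """Smallest operator-norm distance from `target` to any word of
    length <= max_len over ALPHABET (exhaustive search)."""
    best = np.inf
    for length in range(1, max_len + 1):
        for word in product(ALPHABET, repeat=length):
            w = reduce(np.matmul, word)
            best = min(best, np.linalg.norm(w - target, 2))
    return best

target = rx(0.5) @ rz(2.0)   # an arbitrary gate to approximate
for L in range(1, 7):
    print(L, round(best_word_error(target, L), 4))
```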
One recent piece of progress, to put it in slightly simplified form, is that it has been shown that as soon as you have a system which is computationally universal, this system is already efficiently universal. But there is a catch. With Solovay-Kitaev you have a simple algorithm to produce the approximating sequences, but they are a bit too long: a power of log(1/epsilon) beyond the first. The main open problem is to find a polynomial-time algorithm that generates an approximation by a word of only logarithmic length. So we only know that such short words doing the job exist; we do not know how to produce them efficiently.

The other thing I wanted to say a few things about is how these developments around expanders explain certain things in the theory of aperiodic tilings, which is relevant to the modeling of quasicrystals. We should first go back to the work of Penrose. Here you have an example of a so-called Penrose kite-and-dart tiling. This is a tiling of the two-dimensional plane which uses two figures: one figure is a kite, the purple thing here, and the dark thing here is a dart; and then there are the other kind of kites. So there are two types of tiles, and all the tiles are congruent copies of these two types. The tiling is aperiodic; it is a typical example of what one calls a tiling obtained by substitution. But the rate of mixing, the degree of aperiodicity, is not very high. What I mean by that is that if you take a large box of size n, the number of distinct orientations you will see in the box is small: it is only logarithmic in n, and for some reason, in the plane, you cannot hope to do better. So what we would like are tilings which are much more chaotic; we would like to make sure that in a volume of size n the number of orientations grows much faster, power-like, say, and not at a logarithmic rate.

Such a potential tiling was put forward by John Conway and Charles Radin and is called the quaquaversal tiling of three-dimensional space. For this tiling, where all the tiles are congruent images of the same shape, which I will describe in a moment, the number of orientations grows as a power of the volume, and so does the mixing rate. The picture here is rather dark and confusing, but what happens is easy to explain. What is the construction process? There is a bit of theory of algebraic groups underlying this construction, but I will not get into that; in any case, it does not explain the features we want to explain. You start with a prism which has dimensions 1, 1, and the square root of 3 over 2; this square root of 3 is quite important there, so the prism has very special measurements. What you do is subdivide this prism into eight sub-prisms, which are homothetic images with factor one half, and then you rescale the whole thing by multiplying by two, and you keep going. It is probably not completely clear how you do this subdivision, but you can at least see clearly what the two front tiles are here; then one has to generate six more by splitting up the back. You have the right back and the left back. For the right back, what you have are three tiles, these two and one on top. Now note that the left side is split in a different way, because you get
a lower one, and then the top is split that way. So in each operation you replace a tile by eight tiles, which of course have different orientations and which are homothetic images with factor 1/2, and then you rescale: you have these eight tiles, you bring them back to unit size by multiplying by two, and you keep going. After n steps, what you have generated is 8^n tiles, which fill up an amount of space which is, of course, exponential in n. The point is that what Conway and Radin were hoping is that the orientations should also grow exponentially in n, with a rate of mixing which is exponential in n.

There is a paper by Conway and Radin, in Inventiones, where they make an algebraic study of what really governs this tiling and obtain certain results; in particular, they show that the quaquaversal tiling is, in the limit, statistically invariant under all rotations, without addressing the question of whether the rate of mixing is in fact exponentially fast. Now, extensive numerics were done, in particular by Draco, Sadun, and Van Wieren; they checked what the rate of mixing of these orientations is. The way one does this mathematically is by looking at the representations on spaces of spherical harmonics. You probably do not know what this means, but the only point I would like to make here is that what you see on the right are certain eigenvalues, in fact record eigenvalues, and what matters for a good rate of mixing is that the difference between 1 and the number you have there is quite large. As you can see, already at 258 you have a gap, what one calls a spectral gap, which is at most a few thousandths. Based on these numerics, which only go up to 258, it seems like quite a bold conjecture to believe that there is actually a true spectral gap, which is another way of saying that in the limit the rate of mixing is really exponential. But it turned out to be correct. It was never clear to me how people can make this kind of conjecture based on such numerics, but amazingly, a few years ago the conjecture was proven correct, again using techniques from arithmetic combinatorics. Of course, these arguments are quite complicated, and they give a rate of mixing which is hardly explicit; for once, results which are mathematically non-trivial but of poor quantitative value are excused, because the true experimental values are also very poor. I mean, it would have been a contradiction if eventually something had been proven that is in violation of these very poor numerics. So the moral of the story is that the quaquaversal tiling has an exponential mixing rate, but this mixing rate is very slow. By the way, the question of whether the rate suggested by the numerics is the right one is still open; the only thing that is known now is that there is a certain rate, some number strictly less than one, but very close to one, and you do not know what it is.

So I think I have convinced you that pseudorandomness has many faces. There are many faces that were not discussed here, and that is a good thing, actually: one can give many lectures about this subject which are relatively disjoint from each other. What I have done here is only to describe a few aspects of pseudorandomness
that have to do with arithmetic, and some recent advances in arithmetic. But no matter what type of mathematics you use to understand this pseudorandomness, the theories behind it are typically quite enriching. I would say, for instance, that the problems around the derandomization of the theorem of Pinsker I mentioned earlier have been a source of inspiration for mathematicians, and the subject has many other aspects which I have not touched; that is the point here. So this is the end of the talk. I have to thank two people. The first is Alex Gamburd, who made my job very easy, because basically half of the material presented here was provided by him; the other person is [name unclear in the recording], who also helped me a lot with the talk.

[Moderator] Let us thank Jean for a very interesting talk. The floor is open for questions or comments.

[Question] You mentioned at the beginning, I think, that you could generate random sequences by some sort of physical processes, right?

[Bourgain] Well, the typical mathematical theory for justifying randomness, which as some of you may know is again an unproven theory, is something like the Boltzmann ergodic hypothesis; that is definitely something extremely deep and a topic of research for mathematicians, and whether it will eventually be justified or not is another matter. Now, to the question that was asked: I was mentioning that when you look at real life, say you walk into a casino, the examples you encounter, whether slot machines or other devices, are either arithmetical or physical. For instance, and I was trying to find this on the internet, so I do not know the exact name of the game, to illustrate something that would look like an application of a Boltzmann-type hypothesis: there is this game where you have a device which shuffles balls, and at some point some ball comes out and delivers a number. This I would call not an arithmetical but rather a physical process, because basically the belief is that in this device all possibilities are eventually going to come up. Of course, the most obvious physical device to generate pseudorandomness would be coin flipping, whatever that means, because depending on what your coin is, on the way your coin is biased, and on the way you are going to toss it, its random or pseudorandom properties are going to be affected. In any case, what is kind of interesting is that there are two kinds of studies. On one hand there are studies in the theory of dynamical systems, for instance, where you try to understand certain randomness arising from deterministic systems, and these models are very non-trivial to analyze. On the other hand, there is what you may call dynamics in arithmetic, with similar models which are now arithmetical, working with modular arithmetic, and fortunately there one can analyze them much better, because what you have behind you are very powerful tools, like Weil-type inequalities and things of that sort, which you do not have for the real physical devices. So there is some kind of parallel between true chaos in physics and a kind of arithmetic chaos; these are the obvious things I can tell you about the parallel. In some sense, at this point it looks like one can do more, in terms of proving things rigorously, in the arithmetic world than in the physical world. Von Neumann, in fact, and I looked just yesterday at his paper, was
putting forward as a typical example, and this is a statement maybe some of you will understand, but I did not, a typical way of generating physical randomness: to look, in some sense, at nuclear catastrophes. He was putting this forward as an example of a random process; maybe there is some theory behind it that would explain why that could indeed be a form of pseudorandomness, but he gave no explanation for it. I am just citing one of his thoughts there, which is of course not a very practical way to do it.

[Question, partly inaudible, about using these generators for Monte Carlo methods]

[Bourgain] Well, I can answer you on that. The x^2 mod N random number generator is certainly not good for generating random numbers in Monte Carlo methods; other techniques, for instance Mersenne twisters, are at least experimentally proven to do much better. What the x^2 mod N random number generator is good for is unpredictability. On the other hand, if you go to cheap computers with limited computational capability, then generators of the type considered by Blum, Blum, and Shub (that was the x^2 generator; there is another one, the so-called congruential generator, where you multiply by a number) have the advantage of producing something random while requiring much less memory. So if you go to many devices, say video games, the kind of pseudorandom number generators present there are of this simple type; they perform less well, but they can be implemented with very limited software. At least experimentally, there are much better devices. I am not sure you were expecting that kind of answer, but it is not a lie.

[Question, partly inaudible, about breaking the code]

[Bourgain] Well, this is the issue of how you break the code. What we were discussing here makes the assumption, of course, that factoring an integer is hard. Whether that is really true or not is another matter, but if it is not, then of course the whole thing falls apart.
Info
Channel: Institute for Advanced Study
Views: 21,848
Rating: 4.8914728 out of 5
Keywords: Jean, Bourgain, Math, Lecture, Randomness, Pseudorandomness, Computer Science, Discrete Mathematics
Id: xMWlv_1DqyM
Length: 60min 13sec (3613 seconds)
Published: Wed Apr 25 2012