I'm very happy to be here today to introduce our speaker. He has been one of the people who really promoted and developed probabilistic thinking and the probabilistic method in quite influential ways: he not only co-founded one of the leading journals in the area, he has also written, together with Noga Alon, one of the standard textbooks on the method. Today I think he's going to tell us about two gems which connect the probabilistic method with algorithms. So please welcome him.

Thank you very much. It's always a real pleasure to come back to Atlanta, and it's always a pleasure to talk about Paul, because Uncle Paul, we all called him Uncle Paul, is now a figure of legend. He was a giant of twentieth-century mathematics, he worked in so many fields, he had a unique personality, and there are a million anecdotes about him, and they're all true. I want to talk about a particular area that he worked in. In my prejudiced view, I won't say it's the most important one, though I actually do think that, because it's my area too, but it's the area that most clearly bears his stamp, because it was something that he really invented, in the sense in which anyone can invent mathematics, and he worked at it. It's the area called the probabilistic method, but more and more I like to use the term Erdős magic to describe it. So let's back up. Erdős magic is a way to prove the existence of some combinatorial object, be it a coloring or a tournament or a graph or a ranking or some such thing, and the way you do it is you create a random object.
There are a lot of ways to do that, so that's part of it. And then somehow you show that the random object has a positive chance of being the object with the properties you want, and then the conclusion is not that the object might exist but that the object must exist, in the mathematical sense of existence, because if something has probability greater than zero it can't be the null set, so there's got to be an object in the space of objects. This is what we call Erdős magic. But today I want to talk about modern, well, modern being not that recent. More and more, mathematicians in general are looking at algorithms, and here is what I want to call modern Erdős magic: you create a randomized algorithm which you hope will create the object with the desired properties, and if you can show that the randomized algorithm works with positive probability, then once again the object must exist, so you get the same conclusion. I'm going to give specific examples in a minute. But not only do you get the existence result, furthermore you get an algorithm. Now, I started out in the math department and I'm ending up as a mathematician, but I'm in the math department and the computer science department, and to me the great thing about computer science is that it has led to such interesting mathematics problems. That's a rather parochial view of the world; Stephen Smale said it better. He once said that P versus NP is computer science's gift to mathematics, and here you have this beautiful, beautiful mathematical problem that we never would have thought of as mathematicians, except that we have computer science. OK, so let me look at two examples. The first result is due to me.
And the second one is due to László Lovász, and in both cases the original proofs did not involve algorithms, and now we have new proofs that do involve algorithms. So, first, this gives an algorithm, which a lot of people, maybe not you and me, but a lot of people care about having, and by algorithm I mean an effective algorithm; I won't say much about just how much time they take, but these are all polynomial-time algorithms. But second, in both cases these give new proofs of the existence of these objects, and it's really quite interesting to look at the two of them. So let me turn to my own result from back quite a few years ago; it always gets more and more embarrassing, but there it was. Here's the result. This was a problem of Erdős dealing with discrepancies. You have n sets on n elements, and you want to color the elements plus one, minus one, or red and blue if you want, and you want each set to be as evenly balanced between the red and the blue as possible. So red is plus one, blue is minus one, you add them up over a set and look at the absolute value, and you want that to be as close to zero as you can. The theorem that I proved in 1985 was that there is a coloring of the underlying elements one through n so that all of the sets S_1 through S_n have discrepancy at most 6 root n. The 6 is maybe 5.32, but "Six Standard Deviations Suffice" was the title of the paper, and that sort of stuck. So all of them are within six: they're all less than or equal to 6 root n. Now, it's not hard, well, it's not easy, but it's known that you can't do better than constant times root n.
For one example, if you take the Hadamard matrix and turn it into a bunch of sets on n elements, you can show that the discrepancy is greater than some constant times root n; I forget the constant. So either way, it's known that root n is the best possible, though we don't know the constant. OK, but my proof actually used the pigeonhole principle. I mean, this is my best theorem ever. Everybody should have one best theorem ever, and hopefully you don't have it yet, hopefully it's still to come, still in front of you. But I think I can say that this is my best theorem ever, and the proof used the pigeonhole principle. I'm not going to talk about the proof, but the pigeonhole principle is very non-algorithmic; some logicians can make that statement precise, but the proof just was not algorithmic. Now, of course, you can use brute force, but by algorithm I mean a polynomial-time algorithm. So I immediately conjectured that there is no polynomial-time algorithm that will give this coloring. And then in 2010 Nikhil Bansal, using semidefinite programming, found a polynomial-time algorithm, and then a couple of years later Lovett and Meka also gave an argument, and I want to give you an idea of their argument. So we want to get this coloring.
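To make the Hadamard example concrete at toy scale, here is a minimal Python sketch; the order-4 Sylvester matrix and the brute-force search are my illustrative choices (real lower-bound proofs are analytic, not exhaustive):

```python
from itertools import product

# Order-4 Sylvester Hadamard matrix (entries +-1).
H = [[1,  1,  1,  1],
     [1, -1,  1, -1],
     [1,  1, -1, -1],
     [1, -1, -1,  1]]

# Turn each row into a set: the positions where the entry is +1.
sets = [{j for j in range(4) if row[j] == 1} for row in H]

def disc(coloring):
    """Max over the sets of |sum of colors of its elements|."""
    return max(abs(sum(coloring[j] for j in s)) for s in sets)

# Brute force over all 2^4 colorings: the best achievable discrepancy.
best = min(disc(c) for c in product((-1, 1), repeat=4))
print(best)  # 2
```

Even at n = 4 no coloring balances every row: a parity argument forces discrepancy 2, already on the order of root n.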
And the key thing about it is that all the sets have discrepancy at most a constant times root n. Now why is that, what's the intuition? Suppose you color randomly. You have a set, maybe it has n elements; the bigger the set, the harder. If you color randomly, its discrepancy, before you take the absolute value, is going to be roughly Gaussian with standard deviation root n. So the chance that it's more than 6 root n, that's why I called the paper six standard deviations, the chance that it's bigger than 6 root n is like one in a million. But the thing is, one in a million is a constant, and we've got n sets, so if n is 10^100 we're going to have these one-in-a-million outliers. So the question is, can we eliminate the outliers? The key word is "all": we can get most of the discrepancies, all but the one in a million, less than or equal to 6 root n just by coloring randomly, but we want to eliminate the outliers. OK, let me switch to a vector formulation, which is where all the work is. You have n vectors in n-space that have L-infinity norm at most one. It's easy to make the connection to sets: you look at the incidence matrix of the sets and the elements and take the row vectors, something like that. There's an initial value z; just think of the initial value as zero, because it's only inside the proof that you move it around, which is why we allow a general initial value, but whenever you see z, just think of zero. And now what we want is a coloring, a vector x of pluses and minuses, and we dot it with our r_j's, which are vectors, say with all coordinates plus one or minus one like the Hadamard matrix, and we want each dot product to be at most K root n in absolute value, for all of them; again, forget the z, z is zero.
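The one-in-a-million intuition is easy to check numerically; here is a minimal sketch (the set size n = 400 and the trial count are arbitrary illustrative choices):

```python
import math
import random

random.seed(42)

n, trials = 400, 1000
# Signed sum of a random +-1 coloring over one set of size n, many times.
sums = [sum(random.choice((-1, 1)) for _ in range(n)) for _ in range(trials)]
std = (sum(s * s for s in sums) / trials) ** 0.5
print("empirical std", round(std, 1), "vs sqrt(n) =", math.sqrt(n))

# Gaussian tail: chance one set's sum exceeds six standard deviations.
tail = math.erfc(6 / math.sqrt(2))   # P(|N(0,1)| > 6), about 2e-9
print("six-sigma tail", tail)
```

So a single set fails the 6 root n bound with probability around 2 times 10^-9, tiny but positive, which is exactly why n sets produce a few outliers.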
So this is the formulation I want to look at, and again, like I said, the problem is the outliers: if you just pick the x at random it will work for all but one in a million of the cases, but that's no good, we want all the i's from one to n. OK, so actually I'm only going to do phase one. I'm going to use a notion due to József Beck. We want a coloring of the elements, a vector of n plus ones and minus ones, and we're going to start the coloring with all the colors at zero. So think of the coloring as a point in the n-dimensional cube going from minus one to plus one; it starts at all zeros. Well, that's great, all the discrepancies are zero, because you add up all the colors and you get zero, except we haven't done anything. But now we're going to move it around; we're going to move the coloring around inside the n-dimensional cube. So we're going to start at the center, except this is n dimensions, not two dimensions, and we're going to move around until all the coordinates get to be plus one or minus one. This idea is called floating colors, and it's due to József Beck, who's now at Rutgers. We're going to move them around, and this is the Lovett–Meka idea: it's a Brownian motion, but a controlled Brownian motion. For one thing, we're never going to leave the unit cube.
The way I think of it is, when you reach minus one or plus one you stay there. So the variables are moving around, and when one of the variables reaches minus one or plus one, it stays there. In terms of the algorithm, because we're taking discrete steps, we say that if you get really, really close to plus one or minus one you stay there; if you're within one over n squared it doesn't matter, so it's just a technical thing that we freeze a variable when it gets really close. And we're only going to do phase one: we're going to go until half of them are frozen, and I'll say a few words about the rest later. Now we're looking at L_j, which is r_j dotted with x, and it starts out at zero because x starts out at zero, and the normalization is to divide by root n, so instead of a bound of constant times root n we want a bound of a constant K. So what we want is that all of the L_j's are less than or equal to this constant K in absolute value. Here's what we do. We're at a certain point, and again, think of this in n dimensions: some of the coordinates are frozen, and the others we want to move in a random direction, but a controlled random direction. And here is the control. First of all, once a coordinate is frozen, we're not going to move it; but this is only phase one, so at most half of them will be frozen. There's a second technical thing, which is actually not in the Lovett–Meka approach but I like it: we move orthogonally to where we currently are. But here's the important one: we look at the current values of the L_j's. Remember, we want all of the L_j's to be at most six in absolute value, so L_j says how dangerous each r_j is, because its dot product has gotten big; L_j is the danger, it's r_j dotted with the current value of x. So we look at the
top quarter most dangerous r_j's, and those we freeze; well, those we freeze temporarily, and we freeze them by insisting that we move perpendicular to those r_j's. OK, so when we do that, how many degrees of freedom do we have left? We've taken at most n over two coordinates which were frozen, and we've taken at most n over four of the r_j's and insisted that we be orthogonal to them, but we're still in a subspace of dimension at least n over four, and you could move the constants around to other constants, that would be fine. So the space in which we're allowed to move is still of high dimension, still dimension a constant times n, and now we make a Gaussian move in that subspace, and that's our step. Now, to make it into an algorithm, when we make the Gaussian move there are parameters about just how far to go, because in my mind this is a continuous algorithm, but a continuous algorithm is somewhat of an oxymoron, computers being what they are. So in practice we take the Gaussian and go a certain distance, given by some little number delta, and then we keep moving. So we're moving along, and delta is the distance we move at each step. Since we're moving orthogonally, we're adding delta squared to the squared length at each step, and at the end every coordinate is at most plus one or minus one in absolute value, so the squared L2 norm is at most n, and the total number of steps is at most n over delta squared. Now let's look at a particular L_j and what's happening to it at each step. Well, it might be frozen; OK, that's great, it doesn't move at all. Or we're making this Brownian move, and this d-dimensional Brownian move, when you project down to L_j, is just a Gaussian move. So you're moving it by a Gaussian.
And the Gaussian has variance at most four delta squared over n, where the four reflects the price that instead of making the move in n dimensions we're making it in n over four dimensions; but that's only a constant. And now, if you look at any particular L_j, it's a martingale: at each step it might be frozen and not move at all, but if it moves, it moves as a Gaussian with a certain variance, so it's a martingale. Now, how far it moves depends on the previous history, but that's the whole strength of martingales: your expectation is zero, but your move can depend, and here does depend, on the previous history. So we have a martingale, and in the variance of the martingale the n's cancel out and the deltas cancel out, as they should, and the total variance is at most four, where that four is the expense of going from n dimensions to n over four dimensions. So now you have a martingale with total variance four in which each step is Gaussian, and the large-deviation results on any particular L_j are just as if it were a Gaussian with variance four. Martingales in which every step is Gaussian are particularly pretty; everything works out very nicely. So the probability that L_j is bigger than K is as if it were Gaussian, and we pick K equal to six or ten or whatever the number is, so we make that probability less than a tenth. At this stage it's not at all clear what we've gained, but there's only one line to come after this. At this stage, any particular L_j has probability less than a tenth of being an outlier. Let's define success as saying that fewer than twenty percent of them are outliers. If the expected number of outliers is ten percent, then when we run the algorithm,
at least half the time fewer than twenty percent of them will be outliers. So what have we done? We've made an algorithm with at most twenty percent of them outliers, but I started out saying that just coloring at random would leave only a one-in-a-million fraction of outliers. So what's the last line? None of them are outliers, because we were always freezing the top quarter, and a fifth is less than a quarter; I mean, you could adjust the constants to make this better. We were always freezing the top quarter of them, and each step is very tiny and doesn't have much effect, so if fewer than a fifth of them are outliers, none of them are ever outliers, because as soon as one became an outlier, that is, bigger than K, it was frozen. So there's a funny little piece of logic here: you go from having at most twenty percent outliers, but since you were always freezing the top twenty-five percent, it means you have no outliers. This was the argument, and it gives the algorithm, and it turns out it's really quite a fast algorithm. Indeed, there are technical questions about just what delta to choose: the bigger the delta, the faster the algorithm, but on the other hand you don't want to jump over the boundary, so there's a thickness-of-the-boundary issue; there are a whole bunch of technical things I'll skip. But this was the argument, and I want to add that this is a completely different proof from the proof that I had, which was then improved by many people, especially Ravi Boppana, but still was non-algorithmic. To me this not only gives an algorithm, it actually gives a better proof of the original result. Is it the same constant? No; to tell the truth, I haven't worked it out, but it's not the same constant, and that really gets into the technical matter of choosing delta correctly. OK.
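The whole phase-one walk can be sketched in pure Python. Everything below is an illustrative choice rather than the tuned algorithm: the size n = 16, the step size delta, random plus-minus-one rows standing in for the set-indicator vectors, and a Gram–Schmidt projection playing the role of moving perpendicular to the temporarily frozen rows:

```python
import math
import random

random.seed(0)

n = 16           # toy size: n "sets" (rows) on n elements
delta = 0.08     # illustrative step size
K = 6.0

# Random +-1 rows standing in for the vectors r_j.
R = [[random.choice((-1, 1)) for _ in range(n)] for _ in range(n)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project_out(v, basis):
    # Remove from v its components along an orthonormal basis.
    for b in basis:
        c = dot(v, b)
        v = [vi - c * bi for vi, bi in zip(v, b)]
    return v

def orthonormalize(vectors):
    # Gram-Schmidt, dropping near-dependent vectors.
    basis = []
    for v in vectors:
        v = project_out(list(v), basis)
        nrm = math.sqrt(dot(v, v))
        if nrm > 1e-9:
            basis.append([vi / nrm for vi in v])
    return basis

x = [0.0] * n            # floating colors: start at the cube's center
frozen = [False] * n

for _ in range(30000):                       # phase one only
    if sum(frozen) >= n // 2:
        break
    mask = lambda v: [0.0 if frozen[i] else vi for i, vi in enumerate(v)]
    # Rank rows by danger |r_j . x|; the top quarter are frozen temporarily:
    # the step must be perpendicular to them.
    hot = sorted(range(n), key=lambda j: -abs(dot(R[j], x)))[: n // 4]
    basis = orthonormalize([mask(R[j]) for j in hot])
    # Gaussian step confined to the unfrozen coordinates, orthogonal to the
    # hot rows, scaled to length delta.
    g = project_out(mask([random.gauss(0, 1) for _ in range(n)]), basis)
    nrm = math.sqrt(dot(g, g))
    if nrm < 1e-9:
        continue
    x = [xi + delta * gi / nrm for xi, gi in zip(x, g)]
    # Permanently freeze colors that have (essentially) reached +-1.
    for i in range(n):
        if not frozen[i] and abs(x[i]) >= 1 - 1.0 / n ** 2:
            x[i] = math.copysign(1.0, x[i])
            frozen[i] = True

max_disc = max(abs(dot(R[j], x)) for j in range(n))
print(sum(frozen), "frozen; max |r_j . x| =", round(max_disc, 2),
      "vs K * sqrt(n) =", K * math.sqrt(n))
```

Phase two would repeat the walk on the surviving coordinates with adjusted constants; the point here is only the shape of the controlled Brownian step.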
And then there's the other technical part: once you get down to n over two frozen, you want to move down to n over four frozen, and you do kind of the same thing, but I'm going to skip that.

Now, turning to the Lovász Local Lemma. We think of historical figures like Erdős, but each generation has some people, and for my generation it's László Lovász in Budapest, who has done tremendous work in many, many fields, and in particular there is this result, the Lovász Local Lemma, which has turned out to be a central thing in probabilistic methods. It allows us to, I like the phrase, find a needle in an exponential haystack. Rather than give the general result, which, if you've taken the right courses, hopefully you know, I'm going to give it with a specific example, with Boolean k-SAT. So I want to think of k as fixed, and I'm going to take n Boolean variables x_1 through x_n, and I'm going to look at clauses, k-SAT clauses, that is, disjunctions of k of the x_j's or their negations, and then an instance of k-SAT asks that all the clauses hold simultaneously, that their conjunction holds. For those of you from theoretical CS, 3-SAT is the classic NP-complete problem, to which we contribute nothing here, but it gives you the idea. All right, now I want to add something: say that two clauses overlap if they have a common variable, x_17 say, including if one has x_17 and the other has NOT x_17; they still overlap. So now here is the Local Lemma. Assume that each clause overlaps at most D other clauses, and the clauses all have size k, so when you randomly put in truth values, each clause has probability 2^(-k) of failing. And then, and the e here is actually 2.718, which surprised me too: if e·D·2^(-k)
is less than or equal to one, then the Lovász Local Lemma says that yes, one can satisfy all of the clauses, that is, the conjunction is satisfiable. A key thing about this is that you only put in values of k and D. So, for example, take k equals five and D equals ten: you have an instance of 5-SAT, not a random instance, an arbitrary instance of 5-SAT, where each clause overlaps at most ten other clauses, but you've got 10^7 variables, and yet the Lovász Local Lemma says that it's possible to satisfy all of them at the same time. The reason why it's called the Local Lemma, well, it's a lemma in the original paper, I guess that's the reason, it sounded good, but it also fits, because locally you've got a clause and it's only connected to a small number of other clauses, so locally things are looking pretty good, but you want something that works globally. With the original proof there was no way to actually get an algorithm. The question then came: could you give an algorithm? József Beck, I keep mentioning his name, had some key input: Beck gave an algorithm that worked for some ranges of D and k. And then Robin Moser came along. He was a first-year graduate student at ETH, he teaches in Zurich, a student of Emo Welzl, and Emo said he was a pretty good student. He got this idea, and it totally transformed the entire thing. All right, here's his algorithm. Noga Alon and I and many, many people had looked at this and tried to come up with algorithms; here's his algorithm, I call it the Fix-It algorithm. First, you just randomly assign true and false to everything. Well, if that works, you're done, go home, no problem there.
If it doesn't work, there's some clause that's coming out false. Again, I'm going to leave out a lot of data-structure questions here about making the algorithm efficient, but this is definitely a polynomial-time algorithm; by polynomial time I mean you fix the five and the ten, and then you let n go to infinity. So you select one of the bad clauses, and indeed you can do that in any way you wish, which is the really remarkable part of this algorithm, and you take that bad clause with its five variables and you randomly reassign those five variables to true or false. But my favorite step, some of you have seen this talk before, is the fourth step: there is no fourth step. That was it, that's the whole algorithm, that's all you do, end of algorithm. You just start randomly; if there's a problem, there may be many problems, you pick one of the problems any way you want, you can be clever or dumb, you can do it lexicographically, any way you want. You don't take all the clauses that are a problem at once; you take one clause that's a problem and you fix it, that is, you reassign its variables, and of course you could reassign them the same way. And that's the loop. Now, of course, what happens is that when you fix this clause it breaks that one, and then you fix that one and it breaks another, and maybe it breaks the one you started with, so it's not at all clear why this would work. Indeed, my memory, with Noga Alon, is that we sort of looked at this algorithm and said no, you can't do this, this is ridiculous. It turned out this is the algorithm, and I want to give you an idea of why it works. There are many approaches; I'll give an approach due to Don Knuth.
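Here is a minimal Python sketch of the Fix-It loop; the tiny 3-SAT instance and the pick-the-first-bad-clause rule are my illustrative choices (any selection rule works, which is the remarkable part):

```python
import random

def fix_it(n, clauses, rng, max_steps=10**6):
    """Moser's Fix-It sketch: random assignment, then repeatedly pick a
    violated clause and resample just its variables. Returns (x, log)."""
    x = [rng.choice((False, True)) for _ in range(n)]   # step 1
    log = []
    for _ in range(max_steps):
        # A clause is a tuple of literals (variable, wanted value);
        # it is violated when every literal is wrong.
        bad = [c for c in clauses if all(x[v] != want for v, want in c)]
        if not bad:
            return x, log          # all clauses satisfied: go home
        c = bad[0]                 # step 2: pick any bad clause you like
        log.append(c)
        for v, _ in c:             # step 3: resample only its variables
            x[v] = rng.choice((False, True))
    raise RuntimeError("did not converge")   # there is no step 4

# Tiny instance: (x0 or x1 or x2), (not x0 or x1 or not x3), ...
clauses = [((0, True), (1, True), (2, True)),
           ((0, False), (1, True), (3, False)),
           ((1, False), (2, True), (3, True))]
x, log = fix_it(4, clauses, random.Random(1))
print("satisfied after", len(log), "resamplings")
```

For 5-SAT with overlap bound D = 10, the Local Lemma condition e·10·2^(-5), about 0.85, is at most one, so there is always a satisfying assignment for this loop to find.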
You know, one thing about computer science: a lot of things in math are two hundred years old, but Don Knuth is something like the George Washington of theoretical computer science. In math you look back at the legendary figures; other fields get developed too, they have to start at some time, and there are other people, Karp certainly, and others, but to me it's Don Knuth who really started the subject of theoretical computer science, and this is his argument for this result. You look at this algorithm and you keep a log. If English is not your native language: log has about half a dozen meanings, and what I mean here by log is a ship's log, a diary that you keep, and you just keep track of which clause you fixed. So I'm going to use letters for the clauses; instead of A_1, A_2, A_3 I'm just going to call them A, B, C, D, E, F, G. So maybe I switched clause A, and then I switched clause D, and then clause C, and then clause F, and a priori it could go on forever; indeed, if there is no satisfying assignment it will go on forever, because it won't stop. This is called the log. What we want is to show that the length of the log, which a priori might be infinite, has finite expected size; that means the algorithm can stop, in fact it stops with probability one. And therefore, by modern Erdős magic, there will be a running of the algorithm that ends, and the only way it can stop is when you have the satisfying assignment, because otherwise you keep going, so you'll be done. Furthermore, again the part that I'm leaving out, the expected size is actually quite reasonable, so you get an effective algorithm. OK. So we're going to play Tetris.
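The Tetris picture can be computed directly: drop each logged clause onto the columns of its variables, and it lands one level above the tallest column it touches. A sketch, with variable sets invented for illustration (they are not the talk's actual clauses):

```python
# Invented toy clauses: letter -> its set of variables (illustrative only).
vars_of = {'A': {1, 2, 3}, 'B': {4, 5, 6}, 'C': {3, 4, 5},
           'D': {6, 7, 8}, 'E': {5, 6, 7}, 'F': {7, 8, 9}}
seq = ['A', 'D', 'C', 'F', 'E', 'C', 'B', 'F']   # a log of fixed clauses

def tetris(seq):
    """Drop each clause onto the columns (variables) it touches."""
    height = {}        # variable -> current stack height
    placed = []        # (letter, level) in drop order
    for letter in seq:
        cols = vars_of[letter]
        level = max(height.get(c, 0) for c in cols) + 1
        for c in cols:
            height[c] = level
        placed.append((letter, level))
    return placed

placed = tetris(seq)
for letter, level in placed:
    print(letter, "lands at level", level)
```

Two logs that differ only by swapping adjacent non-overlapping clauses produce the same picture, which is the semigroup fact the talk comes back to.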
So we're going to take this log, A D C F E C B F; these are the clauses. When I say the clause one-two-three, I mean the clause on x_1, x_2, x_3, maybe with some NOTs on it, so it might be x_1 or not-x_2 or x_3. The fact that the variables are consecutive is not part of the theorem; it's just there for display purposes. I've made the clauses use consecutive variables, but in general that won't be the case, so you have to expand your notion a bit; in Tetris usually the pieces descend in a fixed board. All right, so here's this sequence of clauses, A D C F E C B F, and let's start throwing them down and playing Tetris with them: we throw down A, then we throw down D, then C, then F, then E, then C, then B, and then F, and so we get this Tetris pattern; a piece could have gone further down, but it doesn't matter. Now, looking at the last one of the sequence, F, I want to look at what I'll call the pyramid, which is really the set of things that are holding that F up. In this case the B and the first C are not holding the F up, but the E is, and the A is, because the A is holding up the second C, which is holding up the E, which is holding up the F. So we get this notion of the pyramid: all the stuff that supports the last letter. Now, here's a little bit of nice combinatorics. Take any sequence, forget about the probability, and look at the pyramids of the prefixes: you go out one step, two steps, three steps, four steps, and at each step you look at the pyramid. You're not going to get the same pyramid twice. Why is that? Well, the last letter would have to be the same; but if the last letter of this prefix was a D and the last letter of that later prefix was also a D,
then the later prefix's pyramid would include the earlier prefix's pyramid and have at least one more element. So, a nice little piece of combinatorics: the pyramids are distinct. And now what we can say is that the expected length of the log is a sum over pyramids: for each pyramid, we ask for the probability that it comes up as the pyramid of some prefix. That's right because if you have a log of length one hundred, there are going to be one hundred distinct pyramids, so we can turn it around: for each pyramid we look at the probability that it comes up, we don't say when it comes up, and the expectations are equal. Now, here's a powerful idea in looking at randomized algorithms, and I think it's very powerful. I think you're going to take two things away from this talk: first the closing quote, and secondly this idea of preprocessing the randomness. With randomized algorithms you get into all sorts of very subtle questions and mistakes involving conditioning, and it really helps if you can set up the randomized algorithm so that all of the randomness is preprocessed, and to do that you put in more randomness than you might use. How does that work here? Each variable, x_11 say, flips a coin a countable number of times, so it goes true, true, false, false, true, true, false. At the start it uses its zeroth flip; if it ever needs to be reflipped, it just goes to the next one. You see, that's exactly the same algorithm, but somehow, at least for me, this preprocessing is a helpful idea. OK, now, what is the chance that A by itself is a pyramid? Well, if A is a pyramid, it means that A has to be false on the zeroth coin flips, which has probability one in eight; and actually it's at most that, because it might be that A and B were both false on the zeroth coin flips and you actually flipped B, but it's at most one eighth. Now, what is the probability that A, C is a pyramid? A and C overlap.
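The pyramid itself is one backward scan: keep a position if its clause overlaps a kept position that comes later. A sketch, again with invented variable sets (illustrative only, not the talk's clauses):

```python
# Invented toy clauses: letter -> its set of variables (illustrative only).
vars_of = {'A': {1, 2, 3}, 'B': {4, 5, 6}, 'C': {3, 4, 5},
           'D': {6, 7, 8}, 'E': {5, 6, 7}, 'F': {7, 8, 9}}
seq = ['A', 'D', 'C', 'F', 'E', 'C', 'B', 'F']   # a log of fixed clauses

def overlap(a, b):
    return bool(vars_of[a] & vars_of[b])

def pyramid(seq):
    """Indices of the pieces that transitively support the last piece:
    keep position i if it overlaps an already-kept position after it."""
    supp = {len(seq) - 1}
    for i in range(len(seq) - 2, -1, -1):
        if any(overlap(seq[i], seq[j]) for j in supp if j > i):
            supp.add(i)
    return sorted(supp)

pyr = pyramid(seq)
print([seq[i] for i in pyr])   # the letters holding the last F up
```

With these toy sets the first C and the B do nothing for the final F, while everything else supports it through a chain of overlaps.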
So it looks like it shouldn't be an eighth times an eighth, but it is, because in order for A, C to be a pyramid, A has to be false on the zeroth coin flips of coins one, two, and three, and then C has to be false on the zeroth coin flips of coins four and five and the first coin flip, I'm a real mathematician, I start counting at zero, of coin three. So although the clauses overlap, these are independent events, and the probability that A, C is a pyramid is an eighth squared; again, that's an upper bound, because again it may have been that B was flipped first, but it's an upper bound. That's right. So because we're doing this, we get into some nice algebra, beautiful semigroup theory. We have these strings over a finite set of letters, and let's define a semigroup; this has been studied by semigroup theorists. We'll say that when two letters have no overlap, they commute: we take the free semigroup on the finite set of letters, but for certain pairs x, y we declare that x and y commute, and that gives an interesting semigroup, definitely interesting to study just by itself; what do you get, what are its properties? Well, two strings are equal exactly when they have the same Tetris picture; that's the nice thing. So if you don't like Tetris, you can talk about the algebra of these semigroups, but I like Tetris, and it's a nice exercise to prove the equivalence. Just for the example: you see A, D, C; if instead the sequence was D, A, C, first put down D, then put down A, then put down C, we would get the same Tetris picture, and A and D don't overlap, so they commute, so in the algebra ADC and DAC are equal, and in the Tetris picture they're the same. That turns out to be an if and only if, and it's not too hard
to prove. And then we get into a very nice piece of algebraic combinatorics; I think I'll just say this and then pretty much leave it there. Let's look at all of the pyramids. Not everything has to have size k, but here everything has size k, so the probabilities are all powers of 2^(-k): a pyramid of length s has probability at most 2^(-ks). And we know that the expected length, the expected "time" of the algorithm, is at most the sum, over each pyramid, of the probability that you get that pyramid, which is this sum here. So if this sum is finite, then by modern Erdős magic you get that Fix-It takes time, meaning the length of the log, at most this sum. And now you get into a very nice question in algebraic combinatorics. I give you a finite set of letters, A through Z; in this case I assign each letter the weight 2^(-k), but you could give them different values too. And I'm in this Tetris semigroup, where certain pairs of letters commute and other pairs do not, and I tell you which ones commute and which ones don't, and now you sum over the pyramids. Let me take a simple example, because we're just about to stop with this. Suppose you have only two letters, A and B, and they do commute; then everything can be written as A^a B^b, you can put all the A's to the left. Let's give them different weights, say p and q. Then what we want, the Knuthian property, is that the sum of p^a q^b over all a and b is finite, and that holds if and only if p and q are both less than one. But if they don't commute, you're looking at all words, and each A is given a p and each B
is given weight q, and then the condition becomes that p plus q is less than one. Here is a nice, not so easy exercise: try the four-cycle. Suppose you have A, B, C, and D; A and B commute, B and C commute, C and D commute, and D and A commute. What is the range of p for which the sum is finite? If p is too big it will be infinite; how do you find the threshold? You get into some nice generating functions; there's a whole nice area here, which I won't go into. But a partial answer is this: in the case where each letter overlaps with at most d other letters, so that everything commutes with all but at most d other things, one can work things out and get the result. Even more than that: for explicit d and k, the bound is given by an explicit function of d and k, so it's even better than that.

I want to end with an intriguing thing, which I've talked to some people in theoretical computer science about: this algorithm actually works against a prescient adversary. Imagine that you make this countable collection of coin flips at random, and now your enemy decides which violated clause you're going to flip, trying to make the algorithm go on forever. If there's only one bad clause the adversary can't do anything, but if there are ten clauses, the adversary will try to make the algorithm run forever, cycling around, something like that. It's an adversary. Now imagine this adversary actually sees all of the coin flips: for each x_j there is a countable sequence of coin flips, made at random, but the adversary sees them. So when the adversary says "flip this clause," the adversary knows what's going to happen. The adversary is prescient. It turns out that doesn't help.
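The resampling algorithm being analyzed here is the Moser–Tardos algorithm. Below is a minimal sketch for k-SAT; the clause encoding (signed 1-based literals) and helper names are my own choices, and the selection rule is left arbitrary, as the talk emphasizes.

```python
# Minimal sketch of the Moser-Tardos resampling algorithm for k-SAT,
# the randomized algorithm behind "modern Erdős magic". Clause format:
# each clause is a list of signed 1-based literals (v+1 or -(v+1)).
import random

def moser_tardos(n_vars, clauses, rng=random.Random(0)):
    """Resample violated clauses until none remain; returns a satisfying
    assignment as a list of booleans (assuming it terminates)."""
    x = [rng.random() < 0.5 for _ in range(n_vars)]   # fresh random coins

    def violated(clause):
        # a clause fails when every one of its literals evaluates to False
        return all(x[abs(l) - 1] != (l > 0) for l in clause)

    while True:
        bad = [c for c in clauses if violated(c)]
        if not bad:
            return x
        # resample ONE violated clause -- any selection rule works,
        # even (per the talk) a prescient adversarial one
        for l in rng.choice(bad):
            x[abs(l) - 1] = rng.random() < 0.5

# a small satisfiable instance, chosen for illustration
clauses = [[1, 2, 3], [-1, 2], [-2, -3], [3]]
print(moser_tardos(3, clauses))
```

The body never looks at *which* bad clause was chosen, only that one gets resampled with fresh coins; that is exactly why the analysis tolerates an arbitrary, even prescient, selection rule.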
The theorem works against a prescient adversary. So when I say that you can use any selection rule for choosing which clause to pick, it's even stronger than that: it works even against a prescient adversary who, when it says "flip this one," already knows what's going to happen. Even against that powerful prescient adversary, the argument I gave goes through; it's the same argument, you don't change it at all. I think it's really remarkable that you can have this notion of a prescient adversary, and I find it quite intriguing. So I'll just leave you with that.

Paul was known for many things; poetry was not one of them. But there is this letter to Vera Sós, who is certainly his number one female collaborator, maybe his number one collaborator. His letters always started "let S_1, ..., S_n be a family of sets" and went on and on, and I actually did see this letter: that was the whole letter, except that at the very end he added this nice line. "It is six in the morning. The house is asleep. Nice music is playing. I prove and conjecture." He's not known as a poet, but I think this is poetry. Thank you very much.

[Question] Yes. It's really quite striking; I don't know how one would use it, but the notion that you can defeat a prescient adversary seems very striking to me.

[Question: can you derandomize?] In the case of the first result we can derandomize, because the large deviation results on Brownian motion depend on a weight function that you can create, and there it is possible to derandomize. I'm not sure about the full Local Lemma, whether it will work in that case; I don't know yet. That's a good question.
[Question] Yes, that's what was asked, and the answer is yes. I'm not saying the running time is exactly the same, but the bounds on the time are the same: when you look at the proof, the bounds for the prescient adversary and for an arbitrary selection rule are the same.

[Question] It is polynomial time, and in fact it's not only polynomial time, it's actually quite quick, and it's very easy to run. Say you want 5-SAT. One way to create instances is to take a million variables, split the million variables into two hundred thousand sets of five, and that gives you two hundred thousand clauses with each variable in one clause; then you negate variables at random, and you repeat this a certain number of times, depending on how you want to set it up. So it's easy to create random instances, and when you run the algorithm on them, they do run very quickly. OK. Thank you.
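The instance-generation recipe just described can be sketched as follows. The group-splitting scheme (a million variables, two hundred thousand disjoint sets of five) is from the talk; the function names and the seed are my own illustrative choices.

```python
# Easy random k-SAT instances as in the talk: split n variables into
# n/k disjoint groups and negate each variable in a group with a fair coin.
import random

def disjoint_ksat(n_vars, k, rng):
    clauses = []
    for start in range(0, n_vars, k):
        group = range(start + 1, start + k + 1)   # 1-based variables
        clauses.append([v if rng.random() < 0.5 else -v for v in group])
    return clauses

rng = random.Random(1)
k, n = 5, 1_000_000                # a million variables -> 200,000 clauses
clauses = disjoint_ksat(n, k, rng)
x = [rng.random() < 0.5 for _ in range(n)]

# a clause fails only if all k of its literals are false: chance 2^-k
bad = sum(all(x[abs(l) - 1] != (l > 0) for l in c) for c in clauses)
print(len(clauses), bad)   # 200000 clauses, roughly 200000/32 ≈ 6250 bad
```

Because the clauses are variable-disjoint, each one is violated independently with probability 2^-k, so a fresh random assignment already satisfies the vast majority and the resampling algorithm finishes almost immediately.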