[00:00:06] >> [Inaudible introduction.]

>> Cool, thanks for the introduction, and thank you all for coming. I'll be talking about mixing times of Markov chains and their connections to high-dimensional expanders. This is based on joint work with Nima Anari and Shayan Oveis Gharan.

[00:00:58] So suppose you have some non-negative function μ on subsets of {1, ..., n}. The two tasks we're interested in are, first, generating samples approximately, so you want to sample a set S with probability proportional to μ(S); and second, computing this partition function, which is the sum of μ(S) over all sets S.

[00:01:25] Note that μ is not normalized to sum to one, but you can still think of it as a distribution, where the normalizing constant is exactly this partition function. Also assume that μ is given by oracle access: given a set S, you can query the value μ(S), but you're not handed a list of all the values.

[00:01:50] There are lots of examples in practice: the uniform distribution over, say, perfect matchings of a graph, connected subgraphs of a graph, independent sets of a graph, or the bases of a matroid. A lot of things can be cast in this framework.

[00:02:13] These specific examples don't really matter for the purposes of this talk; they're just things you can cast in this kind of light. A natural class of algorithms that people study are Markov chains, and here's a very simple one that people use. A Markov chain is a random walk, so I just have to tell you how to transition from the current state to the next state. Say my current state is some set S. To do the transition, I pick a uniformly random element i,

[00:02:53] and then I resample the status of i: I move to S with i thrown out, or S with i added, with probabilities proportional to the weights of the resulting sets. There's some chance of just staying at S: if i is already in S, then adding i doesn't change anything.

[00:03:15] This simple add-one/remove-one Markov chain is known as the Glauber dynamics, and the main question I'm asking is: under what conditions on μ does the Glauber dynamics mix rapidly? I haven't defined what mixing rapidly means, but roughly it governs the efficiency of your algorithm for solving the two tasks I mentioned on the previous slide, and it is the same thing as bounding the second-largest eigenvalue of the transition matrix of the Glauber dynamics away from one.
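To make this concrete, here is a minimal sketch of the Glauber dynamics as just described, assuming only oracle access to the unnormalized weights; the function names and the 4-cycle hardcore test case are my own illustration, not from the talk.

```python
import random

def glauber_step(S, n, mu):
    """One step of the Glauber dynamics: pick a uniformly random element i,
    then resample its status, moving to S + i or S - i with probability
    proportional to the weights mu assigns those two sets."""
    i = random.randrange(n)
    s_in, s_out = frozenset(S | {i}), frozenset(S - {i})
    w_in, w_out = mu(s_in), mu(s_out)
    return s_in if random.random() < w_in / (w_in + w_out) else s_out

def run_glauber(n, mu, steps, S0=frozenset()):
    S = frozenset(S0)
    for _ in range(steps):
        S = glauber_step(S, n, mu)
    return S

# Hypothetical test case: hardcore model on a 4-cycle with fugacity 1/2,
# i.e. mu(S) = lam**|S| if S is independent in the cycle, else 0.
if __name__ == "__main__":
    edges, lam = [(0, 1), (1, 2), (2, 3), (3, 0)], 0.5
    def mu(S):
        if any(u in S and v in S for u, v in edges):
            return 0.0
        return lam ** len(S)
    print(run_glauber(4, mu, 1000))
```

Starting from the empty set keeps the chain inside the support, so the two candidate weights never both vanish.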
[00:03:51] We propose one criterion, which we call spectral independence. To define spectral independence, let me first define this pairwise correlation matrix. The matrix has rows and columns indexed by the elements of {1, ..., n}, and the (i, j) entry is a difference of conditional probabilities: Pr[j ∈ S | i ∈ S] − Pr[j ∈ S | i ∉ S]. Here I'm using shorthand: if I write j, that's the event that j is in a random set S sampled from μ,

[00:04:27] and if I write ī, that's the event that i is not in a random sample from μ. So this is just shorthand notation. You can think of this as a kind of pairwise correlation between i and j, but also as a kind of influence:

[00:04:50] conditioned on the status of i, in or out of a random set S, how much does the marginal probability of j change?

We say the distribution is η-spectrally independent if this matrix of pairwise correlations has maximum eigenvalue at most η.

[00:05:11] You can think of this as quantifying, in some sense, that the total amount of pairwise correlation is small. For most practical applications it will suffice to bound the maximum absolute row or column sum. Let me show you some examples, just to clarify what this looks like. Say I have the distribution that puts weight 1 on the set {1, 2}, weight 3 on {1, 2, 3}, and weight 4 on the singleton {3}. As written this is not a distribution, but if I normalize, dividing by 8, the sum of these numbers, then it becomes an actual distribution.

[00:05:49] Let's look at the corresponding correlation matrix. Throughout, the correlation matrix is defined to have zero diagonal: the (i, i) entries are just set to 0. Look at the correlation between elements 1 and 2. The thing to notice is that in all of these subsets, either both 1 and 2 are in the set S, or neither of them is. So these conditional probabilities are either one or zero: if I know 1 is in a random set S sampled from μ, then 2 is automatically also in S, and if 1 is not in S, then 2 is also not in S. So these two entries are one.

[00:06:47] Now look at the correlation between 1 and 3. If I know 1 is in a random set S, then I'm only considering the two sets containing 1, and 3 is in the set with probability 3/(1 + 3), which is 3/4. If I condition 1 to be out, then I only have the singleton {3} left, so the probability that 3 is in S given that 1 is not is one. So this entry is 3/4 minus 1, which is −1/4.

[00:07:24] The same holds for the correlation between 2 and 3; and in the other direction, for the correlations of 3 with 1 and with 2, you get −4/7. So notice that this matrix is not symmetric and has both positive and negative entries, but it nevertheless still has real eigenvalues, and I'll tell you more about that fact later.

[00:07:48] Another example: say μ is a product distribution, by which I mean each element is in my set S independently of all the other elements. In this case these marginal probabilities are exactly the same: conditioning on i being in or out doesn't affect j at all. So the correlation matrix is exactly the zero matrix, and

[00:08:14] any product distribution is 0-spectrally independent, which kind of makes sense: the more dependence you have, the less spectrally independent.

>> I guess for most purposes you'd bound the maximum eigenvalue by the maximum absolute row or column sum anyway?

[00:08:46] >> Yeah, but we only need the maximum eigenvalue.
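As a sanity check on the example above, here is a short script of my own that builds the correlation matrix by brute-force enumeration under the stated weights; it reproduces the entries and confirms that the eigenvalues come out real.

```python
import numpy as np

# Example from the talk: weight 1 on {1,2}, weight 3 on {1,2,3},
# weight 4 on the singleton {3}; the normalizing constant is 8.
weights = {frozenset({1, 2}): 1.0,
           frozenset({1, 2, 3}): 3.0,
           frozenset({3}): 4.0}
elements = [1, 2, 3]

def pr(j, i, i_in):
    """Pr[j in S | i in S] if i_in, else Pr[j in S | i not in S]."""
    num = sum(w for S, w in weights.items() if j in S and (i in S) == i_in)
    den = sum(w for S, w in weights.items() if (i in S) == i_in)
    return num / den

n = len(elements)
Psi = np.zeros((n, n))          # zero diagonal by convention
for a, i in enumerate(elements):
    for b, j in enumerate(elements):
        if i != j:
            Psi[a, b] = pr(j, i, True) - pr(j, i, False)

print(Psi)                     # [[0, 1, -1/4], [1, 0, -1/4], [-4/7, -4/7, 0]]
print(np.linalg.eigvals(Psi))  # not symmetric, yet the eigenvalues are real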
At the other extreme, let's say my distribution puts probability one half on each of two disjoint, complementary sets. In this case, knowing the status of a single element actually determines the status of all the elements: if I know 1 is in a random set S, then 2, 3, and so on through that half are also in S, and the other elements are all out; similarly, if I know 1 is out, then those are all out and the others are all in.

[00:09:14] Here you can write down the correlation matrix as a block matrix: all-ones blocks J − I on the diagonal and −J off the diagonal, where subtracting the identity is just to zero out the diagonal. From this form you can easily see that the matrix has maximum eigenvalue n − 1, so this distribution is (n − 1)-spectrally independent and no better.

[00:09:30] And this is the worst possible case: I told you I can always bound the maximum eigenvalue by the maximum absolute row sum, and every entry of my matrix is a difference of probabilities, so it has absolute value at most one. Hence any distribution is at worst (n − 1)-spectrally independent, and this distribution achieves that worst case.

[00:10:01] >> How does this compare to Dobrushin's notion of influence?

>> They have a slightly different notion of influence. It's very related, but it's not exactly the same thing; the matrix they have there is a little more complicated to describe. Morally, intuitively, it's kind of the same idea, but it's not identical.

[00:10:26] And one final example, for a class of distributions that comes up a lot in the literature. Say μ is negatively correlated, meaning that conditioning an element to be in can only decrease the marginal probabilities of all the other elements. In that case all the entries in a row of my correlation matrix have the same sign, so the sum of their absolute values is the same as the absolute value of their sum. If I further impose that my distribution is homogeneous, meaning it's supported on sets of a fixed size k, then you can calculate that this sum is exactly one: the row sum is E[|S| − 1 | i ∈ S] − E[|S| | i ∉ S] = (k − 1) − k = −1. So any homogeneous negatively correlated distribution is 1-spectrally independent.

[00:11:10] Now let me strengthen this definition a little, because I want to encompass all conditional distributions. Not only do I want the distribution itself to be η₀-spectrally independent; I want all distributions obtained by conditioning on a single element to be η₁-spectrally independent, all distributions obtained by conditioning on two elements to be η₂-spectrally independent, and so on. So (η₀, η₁, ..., η_{n−2})-spectral independence asks for spectral independence not only of the distribution itself but of all its conditional distributions. Our first result is that if a distribution is (η₀, ..., η_{n−2})-spectrally independent, then I get a spectral gap bound for the Glauber dynamics for sampling from μ. You should think of it this way: if the η's are constants independent of n, then I get polynomial-time algorithms.

[00:12:03] >> So there is one η for each level of conditioning?

>> Yes: η₀ is for conditioning on zero elements, and η_{n−2} is for conditioning on n − 2 elements. Does that make sense? Ok.

So our main application in this work is to the hardcore model. Let me define the model. The hardcore model is the distribution over independent sets of an input graph G, where each independent set I is weighted by λ^|I|, some parameter λ raised to the size of the independent set.
[00:12:55] Here's a quick example: I have this graph, and its corresponding partition function, the sum of λ^|I| over all independent sets I, is 1 + 4λ + 3λ² + λ³. The 1 comes from the empty set; the 4λ comes from the four vertices, which make up the singleton independent sets; the 3λ² from the three pairs of vertices that don't share an edge; and the λ³ from this one big independent set here. (There's a brute-force check of this example right after this passage.)

[00:13:26] As many of you would expect, exactly evaluating this partition function is hard, and it turns out to be hard even on very restricted classes of graphs, which is why up to this point I've mainly been talking about approximating the partition function or approximately sampling. So why should we care about the hardcore model? This model has a lot of applications in operations research and in statistical physics, where it's used to model things like particles in a gas and communication networks; it has a lot of connections to the Ising model of magnetism, and so on. It's also used to understand so-called phase transition properties. So let me describe this phase transition, and I'm going to illustrate it on the tree. This is a truncation of the infinite Δ-regular tree, and it is rooted at some vertex r.

[00:14:25] Let me describe what happens when I condition vertices to be in. Say I condition a neighbor of my root to be in. Because we're looking at independent sets, this forces my root to be out of the set, so in particular it decreases the marginal probability of the root.

[00:14:41] Similarly, look at a vertex that is, say, a grandchild of the root, and condition it to be in: this forces its parent to be out, and you can think of that as loosening the constraints on the root, so in particular it increases the marginal probability of the root.

[00:15:02] One last example: if I condition a whole bunch of vertices at distance 3 to be in, then a bunch of vertices at distance 2 are forced to be out, which increases the marginal probabilities of the vertices at distance 1, which in turn drives down the marginal probability of the root. The key takeaway is that this dependence on the root depends on the parity of the distance:

[00:15:31] if I condition vertices at odd distance to be in, this drives down the marginal probability of the root, and if I condition vertices at even distance to be in, this drives it up. What you'd want from a nice distribution is that this dependence goes away as the distance gets large.

[00:15:57] It turns out there is a critical threshold, λ_c(Δ) = (Δ−1)^(Δ−1)/(Δ−2)^Δ: for λ below this threshold, the dependence on the parity goes away as the distance goes to infinity, but above the threshold it does not. One way to say this is that above the threshold, the hardcore distribution on the tree truncated at even depth looks qualitatively very different from the distribution on the tree truncated at odd depth.

[00:16:28] And for many years, theoretical computer scientists have had the intuition that this critical threshold is a genuine phase transition.
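Backing up to the partition-function example for a moment, here is a brute-force check. The talk doesn't fully specify the graph; a graph consistent with the description (four vertices, three independent pairs, one independent triple) is the star K_{1,3}, which is my assumption below.

```python
from itertools import combinations

def hardcore_coeffs(n, edges):
    """Brute-force the hardcore partition function: returns c_k, the number
    of independent sets of size k, so Z(lam) = sum_k c_k * lam**k."""
    adj = {frozenset(e) for e in edges}
    coeffs = [0] * (n + 1)
    for k in range(n + 1):
        for S in combinations(range(n), k):
            if all(frozenset(p) not in adj for p in combinations(S, 2)):
                coeffs[k] += 1
    return coeffs

# Assumed example graph: the star K_{1,3} (vertex 0 joined to 1, 2, 3).
print(hardcore_coeffs(4, [(0, 1), (0, 2), (0, 3)]))
# -> [1, 4, 3, 1, 0], i.e. Z = 1 + 4*lam + 3*lam**2 + lam**3
```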
And this phase transition should correspond to a transition in the complexity of approximation. So let me draw a picture, with λ on this axis:

[00:16:54] above the critical threshold you should expect hardness results, and below it you should expect algorithmic results. Roughly speaking, that's because for large λ the distribution puts more and more weight on maximum independent sets, and we know that finding a maximum independent set is hard to approximate, so above some special λ the problem should be hard.

[00:17:23] Here is a rough timeline of results. First, Luby and Vigoda in 1997 studied this problem and showed that for λ below 1/(Δ−3) you get fast mixing of the Glauber dynamics. This was subsequently improved

[00:17:45] in several works, up to 2/(Δ−2) for arbitrary graphs, by Vigoda in 2001; the asterisk here indicates that a result makes certain additional structural assumptions on the graph. Subsequent works could get all the way up to the uniqueness threshold, but they all had to assume additional

[00:18:13] properties of the graph, like large girth or certain growth properties, and so on. On the hardness side, Luby and Vigoda already knew that above some constant over Δ you should have some kind of hardness, and this was improved over time, up until

[00:18:36] the seminal work of Sly, which basically showed that above this uniqueness threshold, this critical threshold, there can be no efficient approximation algorithm of any kind. There were further refinements by Sly and Sun, and by Galanis, Ge, Štefankovič, Vigoda, and Yang, among others.

>> [Question about the matching algorithmic result.]

[00:19:08] >> Yes, Weitz's result is the matching one: there is some approximation algorithm below the threshold, but I'll get to that in a couple of slides.

So this dependence on the parity is a special case of a phenomenon called correlation decay, where correlations decay fast, say exponentially fast, in the distance between two vertices, and that property is crucial to developing these kinds of algorithms. Another way of saying it: it's one way of formalizing when local algorithms will work, because you don't have to care about vertices that are far away from you.

[00:20:04] Does that make sense? Ok. I should also mention that there are other works studying this on restricted classes of graphs, where it's also interesting, and that other algorithms have been studied for this problem, not just Markov chains; but for most of this talk we'll stick to Markov chains.

[00:20:32] So let me formalize this complexity phase transition some more. As I mentioned, the result of Sly, together with its later refinements, says that there is no approximation algorithm of any kind for λ above the critical threshold; and there is a matching result of Weitz from 2006, which showed that for λ below the threshold you can get an efficient approximation algorithm on bounded-degree graphs.
>> What is the δ here?

>> Yes, this little δ measures how far you are below the critical threshold; the closer you are, the more you expect the running time to blow up. The key downside of Weitz's algorithm is that it has an exponential dependence on the log of the maximum degree: the running time is roughly n^O(log Δ), so on arbitrary graphs this is not polynomial time, it's quasi-polynomial.

[00:21:22] Our second result is that for the hardcore model we can obtain fast mixing without this exponential dependence on the log of the maximum degree, and without making any additional girth or degree assumptions, or any structural assumptions at all.

Our proof comes in two steps. The first says that the hardcore distribution is spectrally independent, and the second is what I have already mentioned, which is that spectral independence implies fast mixing.

[00:22:08] So the rough outline is: I'm going to tell you that the hardcore distribution is spectrally independent, and then that spectral independence implies fast mixing.

The talk splits naturally into these two parts, and they are logically independent: if you doze off in one of them, you can wake up and get back on track in the other one.

[00:22:28] Your choice. So let me tell you first about why spectral independence implies fast mixing.

This goes through recent developments in high-dimensional expanders. The rough idea is that this object called a simplicial complex, which I'll tell you about in a second, is a way to decompose one massive Markov chain into many, many little ones. So let me tell you what a simplicial complex is.

[00:23:11] A simplicial complex is a downward-closed collection of subsets of a ground set. My ground set is going to have, for each element i of {1, ..., n}, two copies, i and ī, which, as I said earlier, indicate whether i is in the set S or not.

[00:23:30] Then for each set S in the support of my distribution, I get a corresponding face of my simplicial complex, where I put i for each i in S, and ī for each i not in S.

[00:23:47] It's easiest to see with an example. Say μ({1, 2}) = 1; then I get the simplex {1, 2, 3̄}, because 3 is not in my set S, and I give it weight one. If I have {1, 2, 3}, I get the simplex spanning 1, 2, and 3; and if I have the empty set, I get the simplex {1̄, 2̄, 3̄}, indicating that all my elements are out.

[00:24:19] Let's see what this looks like for the hardcore model. Say again I have this graph. I duplicate each vertex into i and ī at the top, and I have a facet for each independent set. So this facet here corresponds to the independent set {2, 3, 4}, and it has 1̄ to say that vertex 1 is not in my independent set; I give it the corresponding weight λ³.

[00:24:48] I do this for every independent set, and then I take the downward closure to make it an actual simplicial complex. Think of the stuff in the intermediate layers as partial assignments of vertices to be in or out of the independent set.
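Here's a small sketch of this construction in code: one facet per independent set, with the barred copy of i encoded as −i. The star K_{1,3} example graph is my assumption, carried over from before.

```python
from itertools import combinations

def hardcore_facets(n, edges, lam):
    """One facet per independent set S of the graph: the facet contains i
    for each i in S and -i (standing for the barred copy) for each i not
    in S, and carries the weight lam**|S|."""
    adj = {frozenset(e) for e in edges}
    facets = {}
    for k in range(n + 1):
        for S in combinations(range(1, n + 1), k):
            if all(frozenset(p) not in adj for p in combinations(S, 2)):
                F = frozenset(i if i in S else -i for i in range(1, n + 1))
                facets[F] = lam ** k
    return facets

# Star K_{1,3} again (center 1), with lam = 0.5 -- my assumed example:
for F, w in sorted(hardcore_facets(4, [(1, 2), (1, 3), (1, 4)], 0.5).items(),
                   key=lambda t: sorted(t[0])):
    print(sorted(F), w)
# e.g. the independent set {2,3,4} shows up as [-1, 2, 3, 4], weight 0.125
```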
[00:25:12] Now I want to show you how this is related to the Glauber dynamics. The point is that the Glauber dynamics can be realized very naturally as a walk on this simplicial complex, called the down-up walk.

[00:25:27] Let me show you with a picture. At the bottom is what the Glauber dynamics looks like, and at the top I'll show you what the walk on the simplicial complex looks like. Say I start at the independent set {2, 3, 4}.

I pick a random vertex, say 4, and unassign it; this corresponds to

[00:25:54] the first step of the Glauber dynamics, which is sampling a random element. Then I resample the spin of that specific vertex, and this just keeps evolving: say I pick 2, unassign it, resample that spin, then pick again, and maybe I go back to the same independent set.

[00:26:17] It just keeps going like this. The key point is that this natural down-up motion on the simplicial complex is exactly the Glauber dynamics that I care about. So essentially I want to decompose this large down-up walk, and let me now tell you what I decompose it into.

[00:26:45] I'm going to look at, in some sense, a local view of the whole thing from each face of my simplicial complex. Let me show you what the local view from the bottom, from the empty set, looks like. I get this small graph whose vertices correspond to the elements of my ground set, and whose edges correspond to faces of size two in the simplicial complex.

[00:27:08] Let me draw it slowly. First let me tell you which edges are not in this local graph. I'm never going to have an edge connecting 1 and 1̄, just because you cannot assign a vertex to be both in and out, and the same holds for all the other vertices. But I also have these non-edges: I cannot have both 1 and 4 assigned in, because they're connected by an edge in the underlying graph.

[00:27:41] Every other pair gets an edge. That's the unweighted version; I actually need a weighted graph that is consistent with the hardcore distribution, so let me tell you the weights. The weight of an edge is the sum of the weights of the independent sets that are consistent with the two assignments at its endpoints. For this edge {1, 4̄}, I have only one independent set that contains 1 and doesn't contain 4, which also takes 2 and 3 to be out, so the weight here is λ. For this edge {1̄, 2̄},

[00:28:21] conditioned on 1 and 2 being out, since there is no edge between 3 and 4, I can do whatever I want there: I get λ² from assigning both 3 and 4 to be in, plus a λ for each of them alone, plus one. And then there are all the other edges, which I won't do; it would get too messy.

[00:28:41] So the point is that in this big simplicial complex, I showed you a local view from the empty set, from the bottom, but I can also look at it locally from any other vertex. What I do is condition on this vertex being in, and then look at the sub-complex.

[00:29:05] So I condition on a vertex being in, I get a sub-complex, and again I can take the local view by looking at the vertices and edges in there.
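To make the local view concrete, here is a short script, again assuming the star K_{1,3} with center 1 as the running example, that computes edge weights of the local graph at the empty face; it reproduces the λ and 1 + 2λ + λ² weights just mentioned, using sympy for the symbolic λ.

```python
from itertools import combinations
import sympy as sp

lam = sp.symbols("lam")

# Hardcore facets on the star K_{1,3} (center 1), as built before:
# one facet per independent set, with -i standing for the barred copy of i.
edges = {frozenset({1, 2}), frozenset({1, 3}), frozenset({1, 4})}
n = 4
facets = {}
for k in range(n + 1):
    for S in combinations(range(1, n + 1), k):
        if all(frozenset(p) not in edges for p in combinations(S, 2)):
            F = frozenset(i if i in S else -i for i in range(1, n + 1))
            facets[F] = lam ** k

def edge_weight(a, b):
    """Weight of local-graph edge {a, b} at the empty face: the total
    weight of all facets containing both a and b."""
    return sp.expand(sum(w for F, w in facets.items() if a in F and b in F))

print(edge_weight(1, -4))   # lam: only the independent set {1} is consistent
print(edge_weight(-1, -2))  # lam**2 + 2*lam + 1: free choice on {3, 4}
print(edge_weight(1, 4))    # 0: vertices 1 and 4 share an edge in G
```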
>> You just look at two layers at a time?

>> Yeah, exactly. [00:29:26] So this is the picture: I throw away all the stuff at the top, look at the vertices here, and they're connected by these edges.

[00:30:02] Sets of size two in my simplicial complex become edges, sets of size one become vertices, and they get connected with the corresponding weights. The point is that I can do this not just at the empty set, but at any other vertex, or any other face of my simplicial complex: I essentially condition on that face, look at the remaining vertices, and connect them with the corresponding weights.

[00:30:29] >> Does this include faces at the top?

>> Yes. It's not just the bottom: I can also look at, say, a vertex here. Think of it this way: at every level of this simplicial complex I have one of these little graphs, and together all these little graphs cover my complex. And I can also condition a vertex to be out, and so on.

[00:31:13] So the rough theorem to keep in your head is: if all these local walks have a very good second eigenvalue, a very good spectral gap, then you can deduce from that that the big walk you care about in the end also has a good spectral gap.

[00:31:49] What do I mean by very good? Say I have all these second eigenvalues bounded by some alphas.

[00:32:14] This is kind of reminiscent of spectral independence: I want the second eigenvalue of the bottom local graph to be bounded by α₀, and those of the conditionals at level k to be bounded by some α_k. This is what it means for all the local graphs to have good eigenvalues.

[00:32:46] And let me tell you what that gives for the global walk. This is the main theorem of a very recent work of Alev and Lau: from very good local eigenvalues you can deduce a guarantee on the global walk. You should think of it this way: if all these α_k are bounded by some C/(n − k), then I get a spectral gap of roughly 1/n to the power 1 + C.

[00:33:15] There are previous works by Kaufman and Mass and by Kaufman and Oppenheim which give weaker versions of this: there, if you play with the theorem, what you end up needing is, instead of a constant over n − k, something like a constant over (n − k)², so this is a refinement of those.

[00:33:37] So the rough strategy is: I prove this local spectral expansion, I apply the local-to-global theorem I just mentioned on the previous slide, and this gives me fast mixing.

And for us, this is exactly what we do. Our first result, the local spectral expansion we show, is essentially that if you take the correlation matrix, look at its maximum eigenvalue, and normalize by 1/(n − 1) (I'll tell you where that comes from), this corresponds exactly to the second eigenvalue of the local random walk. Plugging that in tells me that spectral independence implies fast mixing.

[00:34:16] Any questions?
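For reference, here is one way to write the local-to-global bound being invoked; this is my paraphrase of the Alev–Lau statement with the parameters above, so take the exact constants with a grain of salt:

$$
\mathrm{gap}\bigl(\text{down-up walk}\bigr) \;\ge\; \frac{1}{n}\prod_{k=0}^{n-2}\bigl(1-\alpha_k\bigr),
\qquad\text{so}\qquad
\alpha_k \le \frac{C}{n-k} \;\Longrightarrow\; \mathrm{gap} \;\ge\; \Omega\!\bigl(n^{-(1+C)}\bigr).
$$

Taking α_k of order η/(n − k), as spectral independence supplies, makes the product telescope to roughly n^(−η), which is where the polynomial mixing bounds described in the talk come from.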
So this is the strategy that we use.

>> [Inaudible question about the 1/(n − 1) factor.]

>> Yes, a little bit; you can think of it that way. Let me tell you; this is what this slide is for, to give you a proof sketch. I have this local graph,

[00:35:02] so let me look at what the entries of the random walk look like. If you do the calculation, it turns out an entry is exactly a conditional probability. Why is that? First look at the weight of an edge: it's given by the sum of the weights of all the independent sets consistent with its two endpoints. So, up to normalization, if I look at, say, the edge between 4 and 3, this is exactly the probability of seeing both 4 and 3 in a random sample; and when I then normalize into a random walk, I divide by the probability of the vertex I'm standing at.

[00:35:43] So the main thing is that I can write down the entries of this random walk exactly as these conditional probabilities, and this is why I defined the correlation matrix as a difference of conditional probabilities. To prove the claim, I basically have two observations. The first is that my random walk matrix has a certain block structure, coming basically from the fact that no element can be both in and out:

[00:36:19] I have these parts with no edges between them. What this implies is that I have a bunch of trivial eigenvalues: not only do I have the eigenvalue one, I also have a bunch of copies of the eigenvalue −1/(n − 1). You can think of this as a generalization of the fact that if I take a bipartite graph and look at its random walk matrix, I always also have the eigenvalue −1.

[00:36:44] The second observation is that I can write the correlation matrix as, roughly, a projection-type operation applied to this random walk matrix. And to fit these two things together, you just have to show that this projection is exactly the projection away from the eigenvectors spanning those trivial eigenvalues.

[00:37:05] But I'm not going to get more into the details. Questions? Ok, that wraps up this part of the talk, so let me go to the second part, which is that the hardcore distribution is spectrally independent.
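Before moving on, here is a self-contained numerical check of my own of the correspondence claimed above, on the three-element example from earlier: it builds the local walk at the empty face and compares its second eigenvalue to the maximum eigenvalue of the correlation matrix divided by n − 1. Both come out equal, about 0.616, and the copies of the trivial eigenvalue −1/(n − 1) = −1/2 also show up in the walk's spectrum.

```python
import numpy as np
from itertools import combinations

# Weights from the earlier example: 1 on {1,2}, 3 on {1,2,3}, 4 on {3}.
weights = {frozenset({1, 2}): 1.0,
           frozenset({1, 2, 3}): 3.0,
           frozenset({3}): 4.0}
n = 3

# Facets of the simplicial complex: i for "in", -i for "out".
facets = {frozenset(i if i in S else -i for i in range(1, n + 1)): w
          for S, w in weights.items()}

# Local (link) walk at the empty face: edge {a, b} is weighted by the
# total weight of facets containing both a and b.
verts = sorted({x for F in facets for x in F})
idx = {v: t for t, v in enumerate(verts)}
W = np.zeros((len(verts), len(verts)))
for F, w in facets.items():
    for a, b in combinations(sorted(F), 2):
        W[idx[a], idx[b]] += w
        W[idx[b], idx[a]] += w
P = W / W.sum(axis=1, keepdims=True)

# Correlation (influence) matrix of the distribution itself.
def pr(j, i, i_in):
    num = sum(w for S, w in weights.items() if j in S and (i in S) == i_in)
    den = sum(w for S, w in weights.items() if (i in S) == i_in)
    return num / den

Psi = np.array([[0.0 if i == j else pr(j, i, True) - pr(j, i, False)
                 for j in range(1, n + 1)] for i in range(1, n + 1)])

lam2 = sorted(np.linalg.eigvals(P).real)[-2]   # second-largest eigenvalue
print(lam2, max(np.linalg.eigvals(Psi).real) / (n - 1))  # both ~0.616
```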
[00:37:36] As I mentioned earlier, what we're really interested in is showing that the simplicial complex for the hardcore model has this local spectral expansion property. So let me first ask why previous techniques for proving local spectral expansion don't work. A lot of previous proofs of local spectral expansion go through the so-called trickle-down theorem, a result of Oppenheim,

[00:37:58] which shows essentially that if I have good eigenvalues at the top, I can get some guarantee on the eigenvalues of the local graphs at the next level below. But the crucial thing is the direction. What you really need, if you want to use this kind of trickle-down theorem, is a second eigenvalue at the top that is like 1/n; then at the next level below you get 1/(n − 1), and from this you deduce, at the bottom, something like one half, roughly speaking.

[00:38:29] But the problem is that in practice it kind of goes the other way around: you expect the eigenvalues at the top to be really bad, like one half, like a constant, and to improve as you go down. Because if I look at

[00:38:46] these local graphs at the top, they're graphs on only a constant number of vertices, so you can't really hope for their eigenvalues to be as good as 1/n at the top.

So what we instead use is the correlation decay I mentioned earlier. This is formalized in the notion of weak spatial mixing, and there's a strengthening called strong spatial mixing. What I'm writing here is not exactly weak spatial mixing, because there you should be able to condition on any set; this is the version just for pairs of vertices. It says that the difference of pairwise conditional probabilities is exponentially small in the distance between the two vertices:

[00:39:32] |Pr[j ∈ S | i ∈ S] − Pr[j ∈ S | i ∉ S]| ≤ exp(−Ω(dist(i, j))).

One way to say it: two vertices next to each other might have a lot of influence on each other, but these correlations disappear as the distance increases. And strong spatial mixing basically says that this holds for every conditional distribution as well,

[00:39:54] but let me not go into that. From this you can already get some kind of guarantee for certain graphs; let me show you what it looks like for the integer lattice ℤ^m. I want to bound the maximum eigenvalue, and it suffices to bound the sum of the absolute values of the entries in a given row or column.

[00:40:14] I can write this as a sum over distances d of the correlations at distance d, and apply weak spatial mixing, saying that at a given distance the correlation is exponentially small in that distance. The crucial thing is that on the lattice, the number of vertices at distance d grows only polynomially fast, so the sum is dominated by the exponential decay of the correlations:

Σ_{j ≠ i} |Ψ_{i,j}| = Σ_d Σ_{j : dist(i,j) = d} |Ψ_{i,j}| ≤ Σ_d C d^{m−1} e^{−cd} = O(1).

So from this you can already deduce an O(1) upper bound on the spectral independence. But the crucial thing is that this argument

[00:41:03] really relies on the number of vertices growing only polynomially, or at least sub-exponentially, in the distance. So this strategy of black-boxing weak spatial mixing would not work for most graphs, like expanders or infinite trees.

[00:41:25] So what we do instead is essentially revisit the proofs of strong spatial mixing. These go in roughly two steps: the first is to reduce the problem to trees, and the second is to apply the so-called tree recurrence. The first part I'm not going to go into; it's a kind of black-box reduction using Weitz's self-avoiding walk tree representation. Essentially, instead of bounding this sum of correlations in a given graph, it suffices to bound the sum of correlations with the root in a given tree:

[00:42:22] I want to say that for every tree rooted at some vertex r, the sum of the correlations with that root vertex is small.
[00:42:56] So my goal is to bound the sum of correlations: I want to bound the maximum eigenvalue, it suffices to bound the row sums, and a row sum corresponds to bounding, for a fixed vertex, the total amount of correlation with it; and, roughly speaking, it suffices to do this just for trees. I'm lying a little bit here, it's not exactly this, but morally it's this.

Ok, so I'm going to assume the reduction-to-trees part and not go into it. Let me tell you about the tree recurrence, how you would analyze this for trees. The basic observation is that for any graph I can always decompose the partition function as a sum of smaller partition functions: for a fixed vertex, according to whether it's in or out, I get a corresponding partition function, and I add them up. If I delete the vertex, this corresponds to conditioning the vertex to be out, and if I delete the vertex and its entire neighborhood, this corresponds to conditioning the vertex to be in.

[00:43:33] For trees this has a very nice form, because if I delete the root of a tree, I get a bunch of independent subtrees that are not connected to each other in any way, and so you get this kind of product-form recurrence.

[00:43:55] So I have this tree, I delete the root vertex, I get these smaller subtrees, and I take the corresponding partition functions of those. If you do the calculation, you get this nice product form for the marginal probabilities; furthermore, this holds even if I condition on some vertex below.

[00:44:25] So I have this tree recurrence, and now I want to use it to bound these pairwise correlations. Let me show you how to bound the pairwise correlation between a vertex u and the root r just by λ to the power of the distance. Say I have this tree, and I'm looking at the correlation between u and r,

[00:44:49] and say v is the unique child of the root on the path down to u. I can give an inductive proof. I take this correlation, apply the tree recurrence, and write out these two conditional probabilities as a difference after applying the recurrence;

[00:45:16] I put them over a common denominator and pull out the common factors of λ. The correlation between v and r is bounded by this expression, and the denominator is always at least one, so I can just throw it away. Applying this inductively all the way down from the root to u itself, I get exactly λ to the power of the distance.

[00:45:48] And this is essentially tight for a graph that looks like a path. Does that make sense?

So I have this bound of λ^d. In a graph of maximum degree Δ, at distance d there can be, in the worst case, roughly (Δ − 1)^d vertices. So if I just apply this bound vertex by vertex, I get a row sum like Σ_d (Δ − 1)^d λ^d, which is bounded only for λ less than 1/(Δ − 1). So this is already kind of interesting, but it doesn't quite get up to the critical threshold, which is roughly e/Δ for large Δ.

[00:46:36] And the reason this is lossy is that the bound λ^d holds with equality for graphs that look like a path, but a graph that has this many vertices at distance d should not look anything like a path.
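Here is a minimal sketch of the tree recurrence in code, written in the standard occupancy-ratio form R_v = λ · Π_u 1/(1 + R_u) over the children u of v, which is equivalent to the partition-function form described above; the names are mine.

```python
def ratio(tree, v, lam, parent=None):
    """Tree recurrence for the hardcore model: the occupancy ratio
    R_v = Pr[v in] / Pr[v out] satisfies R_v = lam * prod(1 / (1 + R_u))
    over the children u of v."""
    R = lam
    for u in tree[v]:
        if u != parent:
            R /= 1.0 + ratio(tree, u, lam, v)
    return R

def marginal_in(tree, root, lam):
    R = ratio(tree, root, lam)
    return R / (1.0 + R)

# Path on 4 vertices rooted at an endpoint, lam = 1 (uniform independent
# sets): 3 of the 8 independent sets of the path contain the root.
tree = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(marginal_in(tree, 0, 1.0))  # -> 0.375
```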
So what we do instead is, rather than bounding each vertex separately and applying this kind of worst-case bound, we amortize over all the vertices at a given distance: instead, what we look at is the sum of the correlations over the vertices at a given distance,

[00:47:09] and you can show, using various prior results on correlation decay, that this total amount of correlation over the vertices at a given distance does decay as the distance gets larger. This statement as stated is not literally true, but it becomes true if you add some technical details, like potential functions and so on;

[00:47:35] morally, this is what's happening. And if you have this statement, then from it you can conclude exactly the bound we want on the spectral independence, for λ below the critical threshold.

>> [Inaudible question about the ingredients.]

>> Yeah, we use strong spatial mixing here as well, and also some other prior results. So this is a very sketchy proof, but these are essentially the ideas that go into it.

>> [Inaudible question about special classes of graphs.]

>> I haven't thought specifically about trying to get beyond this critical threshold, but I imagine you'd just need to bound these sums of correlations, and if, say, the balls grow polynomially fast, you can try to use properties like that. I haven't tried to refine this for special classes.

[00:49:38] So that actually concludes the second part, which is that the hardcore distribution is spectrally independent.

Let me summarize with the rough strategy that we followed to get fast mixing. I want fast mixing; using these local-to-global theorems from the high-dimensional expanders community, it suffices to prove local spectral expansion of the simplicial complex I defined earlier; and what we did in this work is use spectral independence and correlation decay to get that.

[00:50:24] Previous results have used things like the trickle-down theorem, but in general we seem to be a bit lacking in good tools to establish local spectral expansion. So one of the main open-ended problems is just developing new techniques for proving local spectral expansion; hopefully from those you could expand the class of distributions you can analyze this way. For example, maybe you could analyze colorings, which is a big open problem.

[00:51:05] There are also some open problems specifically for the hardcore model. It's a technical deficiency of our proof that this constant C in the exponent of the running time (remember, δ here is the gap, how far you are away from the critical threshold)

[00:51:28] is actually something really horrible in our proof: it's exponential in 1/δ. But we think the right answer is 1/δ: you can do the calculation on just the infinite tree itself, and there it actually is just 1/δ. So this is a technical deficiency of our proof.

[00:51:49] But actually, I think it's believed that you can do even better.
Even faster mixing should hold: you should get something with no exponential dependence on δ at all, but rather some constant depending on δ times n log n.

[00:52:14] As far as I know, current techniques cannot get this kind of fast mixing. The best you can hope for: on the infinite tree, this sum of correlations is not just upper bounded by order 1/δ, it's actually also lower bounded by order 1/δ. So the best you can hope for using these techniques is something like n to the 1/δ, roughly speaking. Again, this probably needs some new, other ideas.

[00:52:41] >> [Question about whether the local eigenvalue bounds can be improved.]

>> Yeah, I mean, these local eigenvalues, I think, can at best be something like (1/δ)·(1/n). Maybe you can hope to improve these local-to-global theorems further, to go from these local eigenvalues to something as good as this, but

[00:53:14] I'm not sure; I think they're pretty tight.

>> [Inaudible question.]

[00:53:33] >> No, it's not coming from the high-dimensional expanders; it's just how we're doing the correlation decay. Yeah, this thing is specific to how we use the correlation decay.

And that's it. Thanks.

[00:54:08] >> [Question: why the Glauber dynamics in particular?]

>> We only covered the Glauber dynamics because it maps so naturally onto this down-up motion.

>> What if you want to analyze some other chain?

>> You'd have to find some way to embed that chain into the simplicial complex if you want to use these techniques. There is a slight generalization where, instead of

[00:54:47] resampling the spin of just one vertex, you resample, say, two or three, or some constant k of them, but that's about as far as it goes. There are other natural dynamics, and other algorithms, and I'm not sure how to realize those in this framework.

[00:55:34] >> [Question about Weitz's algorithm.]

>> So Weitz's algorithm is deterministic, but it only works for bounded degree. There are also other algorithms, based on optimization or on polynomials, but those also seem to be

[00:55:59] limited to bounded degree, and oftentimes that extra dependence on the degree seems almost inherent in the algorithm itself, not just in its analysis.

Ok, so let's thank the speaker again.

[00:56:35] [Inaudible discussion.]