I just want to point out that next time, two weeks from today, in the introduction part of the seminar, we'll be giving a thank-you to all the volunteers we've had over the last couple of weeks with the K-through-12 activities, hosting the high school and middle school kids. So that will happen two weeks from today. Our speaker today is Carson Meredith, who did his undergraduate work here at Georgia Tech, getting his bachelor's degree in engineering, then went to the University of Texas to do his doctoral work, getting his Ph.D. in chemical engineering, and then did a two-year stint as a postdoc at NIST in their polymers division before coming back to Georgia Tech, first as an assistant professor and now as an associate professor in the School of Chemical and Biomolecular Engineering, where he is also a Simmons Faculty Fellow. He directs the Advanced Polymer Thin Films and Coatings Research Group, and he's going to tell us something about that today.

OK, this is a pointer. OK, so it's always a little bit more daunting to speak to people that you know, who are in your neighborhood, than it is to go to Chicago or somewhere and speak to people you may never see again. So feel free, if you have questions later, to contact me at the School of Chemical Engineering; it's just carson.meredith at chbe — well, you know the rest. What I want to talk about today is, first of all, what I think is unique about our work relative to nanotechnology, and that is how we actually do things: how we do experiments and how we analyze the data. Many of our projects involve nanomaterials, nanoscale phenomena, nanoscale composites for example — and many of our projects don't.
So we kind of have a mix of different length scales in our projects, and the topics that I'll discuss may depart somewhat from nanotechnology — in fact you'll notice that I don't have the word "nano" in my title. So maybe I'm an outlier in that sense, but rest assured that the things we're working on are equally applicable to nanoscale materials and microscale materials. What my group is fairly well known for is developing and applying what I'll call high-throughput screening methodology to advanced polymeric materials. And in any talk at your own institution, one thing that's usually on my mind is the opportunity for building collaborations with people, and that's one reason why I wanted to focus a little bit more on the methods that we use for research, so people get an idea of what we're capable of doing. I want to talk about two application thrusts. One is in mixing and looking at fairly complex formulations — I'll explain what I mean by that in a few minutes — and the application there is developing new proton exchange membranes for fuel cells. The second major application I want to talk about is micropatterned biomaterials. OK, so two quite different applications. In terms of the word "high-throughput": another word that people oftentimes use is "combinatorial," and it's a word that you will certainly see in the pharmaceutical industry, and that you're seeing more and more in materials synthesis as well as characterization. One of the goals of our research has been to develop new proton exchange membranes, which are a critical component of a fuel cell — of PEM fuel cells in particular. Essentially this is an electrolyte membrane, usually an organic polymer, that is responsible for transporting protons. It's also responsible for preventing
the actual breakthrough of fuel or air to either the anode or the cathode. So, energy is one of humanity's greatest challenges; I think the motivation is fairly obvious. Some of the challenges in PEM chemistry that I would like to point out — this is one of those slides that comes in one line at a time. So why do we want to do this in a combinatorial or high-throughput method? Why not do it in a conventional way? One of the reasons is that these materials have very complex design goals, in that there are numerous properties we're trying to optimize simultaneously, and in many cases these properties are working against each other. In other words, optimizing conductivity — meaning ionic conductivity, proton conductivity — if you optimize that without regard to the other properties, you generally end up with a material that's not very durable. One approach that's been probably the mainstay of most of the research in this area has been the synthesis of novel materials from the monomer level: developing homopolymers and copolymers that possess all of these properties, generally by combining monomers that would each contribute one or more of these properties, and then searching for the best way to synthesize them — the types of catalysts that are used, the compositions. Another approach is to look at this as a formulation or blending problem, and that's the approach we're actually taking. Here the idea is to use distinct precursors, meaning polymers that are already synthesized — they may be homopolymers or copolymers, they may even be short macromers or oligomers, but they're already synthesized beyond the monomer level — and to combine the properties you want in a blending approach, then do something like cross-linking to stabilize the membrane. An example that I'll talk a lot about is poly(vinylidene fluoride) as an inert phase,
meaning a phase that doesn't transmit protons but is fairly stable towards hydrolysis, which is one of the major mechanisms of degradation, combined with polystyrene sulfonate, which is a very academic, model material to look at in terms of a polyelectrolyte. It's probably not a material that would ultimately find its way into an application, but it's a good one for us to work with, it demonstrates the instrumentation we have, and — plus — I'm allowed to talk about it. Most of this work was developed in a collaboration with a large chemical company, Arkema, who are actually the largest supplier of poly(vinylidene fluoride). It's kind of like Teflon, except it has only two fluorines on each monomer instead of four, so it's a lot cheaper, and Arkema's goal is to find a way to use poly(vinylidene fluoride) in PEM membranes. So the idea was, we want to build a response function. In other words, we have these controllable parameters, things we can go into the lab and actually turn the dials on: the chemistries we select as building blocks; the composition, the ratios we put those in; the way we actually perform the mixing of those various components, the order that they go in and the timing; the temperature, of course, that we carry out the fabrication at; and the dimensions — the thickness, for example, of the membrane. All of these things we can directly control, and they ultimately influence the approximately five properties, or property classes, that we're interested in optimizing. OK, so the way we go about doing this in a combinatorial way is that we don't make a few samples at a time. In the traditional view, you would use some type of design algorithm, and that may be something as simple as guessing,
or looking at the literature and seeing what other people have done, or, if you have a good model, it may actually be running a model — though it's very rare that you have a good model you can design from at that level. So what people would do is pick what they think are the best options and make some samples, meaning a few tens of samples at most, and then spend time characterizing those in great detail. In that case the emphasis is on careful selection of a few compositions and very detailed, very precise, accurate measurements of the physical and chemical properties. We're taking a different viewpoint, another way of looking at it, and that is still selecting carefully the regions that we want to look at, because this is a huge space of variables, with billions of possible combinations. Even if you look at just two or three building blocks, like the poly(vinylidene fluoride) and the poly(styrene sulfonate), there are still on the order of ten to the nine combinations of these five parameters that would be relevant. Graduate students tend to get nervous when you tell them they have to do ten to the nine experiments. So there's still a considerable narrowing down, all of it based upon previous knowledge. Even when we do that, our goal is to make several thousand samples — several thousand combinations of these — and to map out what the properties look like for those combinations, and our goal is to do that quickly: in a few weeks or less. The way that we accomplish that is by building libraries. So samples at distinct temperatures, for example, and distinct compositions are combined in one sample with a small footprint — maybe a few inches square.
And we can do this using gradients — I'll talk to you about how we do that — where the gradients of temperature and composition are combined on one substrate, and you get a two-dimensional library, so to speak, of different conditions. The next step, of course, is to screen this, and to screen it fairly rapidly. The basic idea is to go point to point with some type of measurement technology to measure the properties you're interested in, and there are three properties we've focused on: one is proton conductivity, another is water transport, and another is mechanical properties. And finally, the desire is to build some kind of response function — to take the known compositions (and when I say composition, I usually mean the full set of these parameters, several thousand combinations for example) and relate those in some rational way to the measurements we get; that's the data analysis. So how do we actually make these composition variations? What I was going to point out is that these actually are nanoscale materials, if you look at the domains that are formed in the most optimal materials. You have this inert matrix that provides mechanical stability and resistance to fuel crossover, and inside of that matrix you have phase-separated domains of the ionic conductor, and in the best materials that we've found so far, those phase-separated domains of electrolyte are only on the scale of a few tens of nanometers at most. So this is one case where we are doing nanotechnology. One way that we make composition gradients, to actually make these high-density libraries, is by doing a batch-type mixing process, in which we mix, say, two components — although we can do many of them if we want — in a small vial. What we do is start out with pure B: we have a solution with material B in it,
and pump it into a vial, and then we start pumping in material A — the other component in the blend — mix this vigorously, and pump the mixture back out of the vial. So what happens is we go from pure B toward pure A as time goes on, and as the composition is changing from B to A, we sample it into a small syringe. When you're done with the sampling process, you have a gradient — meaning a column of liquid that has a variation in composition — and that gets deposited on the substrate as a thin stripe. Then we have a knife edge that comes down and coats it, kind of in a doctor-blade type method, and unbelievably — to my amazement — it actually works, simply because you have so many things working against you here in terms of diffusion, or really diffusive mass transfer, which would ultimately wipe out the gradient if you wait long enough, as well as flow of the stripe — pressure-driven flows due to the height differences. But amazingly, it works quite well. Just to give you an example: this is a sample, a blend of two polyesters, which is for a project I'll talk about in a second in biomaterials. You can see along this axis the composition of one of the polyesters, and that polyester is crystalline, so as you go across you can see the appearance of haze — not color, but light scattering — due to the crystallinity varying; and then here's the temperature that we annealed it at, in this range. For some reason, instead of flipping the image over, the student who prepared this decided to make the axes go in the opposite direction from how you would normally put them. One of the things that we discovered about this method is that, although it works really well for some materials, it's very sensitive to viscosity. If the two materials that you're mixing have viscosities that are very different from each other —
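The batch process described above — feeding A into a well-stirred vial of B while withdrawing the mixture at the same rate — is essentially the classic constant-volume stirred-tank dilution problem, so the composition sampled into the syringe follows an exponential in time, not a straight line. Here is a minimal sketch of that model, assuming ideal mixing and constant volume; the vial volume and flow rate are made-up illustration values, not numbers from the talk.

```python
import math

def fraction_A(t, Q, V):
    """Fraction of component A in a well-stirred vial of constant
    volume V, fed pure A at volumetric rate Q while mixture is
    withdrawn at the same rate. The vial starts as pure B."""
    return 1.0 - math.exp(-Q * t / V)

# Hypothetical illustration values: a 2 mL vial fed at 0.2 mL/min.
V, Q = 2.0, 0.2
for t in (0.0, 5.0, 10.0, 20.0):
    print(f"t = {t:5.1f} min   x_A = {fraction_A(t, Q, V):.3f}")
```

The exponential shape is why, as comes up later in the talk, the flow-rate schedule has to be designed if you want a linear gradient rather than whatever the mixing dynamics hand you.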
That is, if the two solutions have very different viscosities, then you get into trouble, because you're trying to coat a gradient where part of it has a much higher viscosity, and what happens is, instead of getting nice straight lines like in this case, you begin to get instabilities: if the viscosity gradient is too steep, the flow becomes unstable. So it turns out this doesn't work very well for the types of materials we're looking at in fuel cell membranes, because generally we're blending non-ideal materials. The perfluoropolymers we're working with tend to have very non-ideal behavior in terms of viscosity, and we're generally blending them with a low-viscosity monomer or a low-molecular-weight compound, and oftentimes we're also looking at composites — nanocomposites, adding a dispersed inorganic phase. So you have three very different rheologies, three different viscosities, sometimes differing by a factor of a thousand or so, and it doesn't work well. Here's another example of one that doesn't work well. This is a library — temperature and composition — of those same two polyesters from the image I showed you before. The poly(caprolactone) is the white phase and the poly(L-lactide) is the dark phase. If you go in with a microscope and look at the detailed areas — even though it's a continuous gradient, we can go in and look at areas — you can see that there is a phase-separated microstructure, and if you go in with AFM, you can show that these features go from tens of nanometers to tens of microns. But in cases where we're looking at PEM membrane formulations, we had to develop a new way of doing this gradient blending, and so we built a more dynamic mixing apparatus where we could achieve high shear.
It looks something like this: we essentially pump in the two solutions we want to look at in our library, and these are pumped at differing rates, so this one might be speeding up while this one is slowing down. They go into the mixer and then they're coated through a knife-edge blade — but we've etched channels into the sides of the knife edge, so it distributes the flow out evenly. So what's coming out now: the solution coming into these channels and getting coated is changing in composition with time, and while it's being mixed and coated, the substrate is moving under it, and the rate it moves at determines the thickness of the film. As you might imagine, this is a continuous mixing process, as opposed to the batch one, and you have to be a lot more careful about what the flow rates are, or else you won't get the nice linear gradient we'd like. In fact we had to develop a model to look at this — which is chemical-engineering bread and butter, mixing models — and we found that for any given combination, any given material viscosity, there's only one solution: only one set of those two pumping rates, one speeding up and the other slowing down, that results in a linear composition at the end. All the other solutions either go up kind of exponentially, or roll over exponentially and approach an asymptote. So every time you do a different library, you have to design how you're going to do it. Fortunately we have a computer program that does this for us. Once we make the library — fortunately, most of the materials we look at have distinct infrared fingerprints, so we can use IR spectrometry to go in and find the distinct peaks. For example, these are blends of Kynar 21 — that's a product number;
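The talk doesn't give the model, but the kind of analysis it describes can be sketched as a well-mixed tank ("CSTR") balance on the mixer: with holdup volume V and constant total flow Q, the outlet fraction x obeys V·dx/dt = Q·(u − x), where u is the inlet fraction of A set by the two pumps. For a linear outlet ramp, the inlet fraction must lead the outlet by a constant offset. Everything below is an illustrative sketch under those assumptions; none of the numbers are from the talk.

```python
# For outlet x(t) = s*t, substituting into V*dx/dt = Q*(u - x)
# gives the required inlet schedule u(t) = s*t + s*V/Q: the pumps
# must "lead" the desired gradient by the mixer residence time.

def design_pumps(s, V, Q, t):
    u = s * t + s * V / Q        # required inlet fraction of A
    qa = Q * u                   # pump rate for solution A
    return qa, Q - qa            # pump B makes up the balance

def simulate(s, V, Q, dt=1e-3, t_end=10.0):
    """Euler-integrate the mixer balance under the designed schedule."""
    x, t = 0.0, 0.0
    while t < t_end:
        qa, _ = design_pumps(s, V, Q, t)
        x += dt * Q * (qa / Q - x) / V   # V*dx/dt = Q*(u - x)
        t += dt
    return x

s, V, Q = 0.08, 0.5, 1.0   # ramp slope (1/min), mixer volume, total flow
print(simulate(s, V, Q))   # outlet tracks the linear ramp: ~0.8 at t = 10
```

Any other pump schedule leaves the first-order lag of the mixer visible in the outlet, which is the exponential rise or roll-off mentioned above.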
it's simply a type of poly(vinylidene fluoride) — and it's blended with a polyelectrolyte, which is poly(styrene sulfonate), and we look at different ratios of the blend going from zero to one hundred percent. This was actually calibration data, so you can calibrate quite nicely, and as soon as you've made the library you can do IR measurements and get the compositions in your library. For example, this is one that was made — this is an acceptable gradient; you can tell it's not absolutely perfect, but for the purposes of screening it works really well, and it takes about three minutes to make. You take the two stock solutions, drop them into the syringe pumps, and three minutes later you have this gradient. That means a student can easily prepare tens of these gradients in a day, and depending on how large the measurement size is — of, say, the conductivity measurement — you can easily generate hundreds of distinct measurements in a day. So we're approaching that limit where we can look at a large parameter space in a short amount of time. So how do we actually generate the data? Proton conductivity is the one I'll talk about the most. Essentially we're doing impedance spectroscopy using a four-point probe. The probe is placed down — it's something I'm sure most of you are familiar with from work in microelectronics, perhaps — but essentially it's a pick-and-place measurement where we have this head with the four tungsten carbide probes, and the head is designed to always apply a pre-determined, constant load, so the needles are loaded with a constant force on the sample, and the membrane is immersed in water.
In other words, we've measured the resistivity of the water, and then we're looking at the intrinsic proton conductivity of the protons in the membrane — the sulfonated polystyrene has been acid-treated so that we get rid of the salts that might be present. Anyway, we're measuring the impedance of this membrane and using the impedance to back out the real part of the conductivity. This is an example of what it looks like: we essentially modified a commercial instrument — you can barely tell where it is anymore; there's the commercial base, and that's the commercial head — and we've added the robotics here to allow very precise placement of the head. This is actually a second head that measures thickness, because the conductivity requires that you measure the resistivity of the film, measure the thickness, and then divide. This one measures the thickness to within plus or minus half a micron — it's a simple gauging probe — and then the conductivity head goes in next. So you end up with, for example, this — a scan of the thickness of a typical membrane, with the height here in micrometers. This is an absolute height, so the variation is approximately fifteen micrometers, and the reason for that variation is that there's some tilt in the instrument and whatnot; but we know what that plane is, and we subtract it out later to get the thickness. So here are actual conductivities, in millisiemens per centimeter, for some of the formulations. These are five different kinds of Kynar — which is the poly(vinylidene fluoride); forgive the trade names — different molecular weights, some of them copolymers with two different fractions of vinylidene fluoride, and we're looking at the content of one of our polyelectrolytes going from twenty to sixty percent.
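The tilt correction mentioned above — knowing the instrument's plane and subtracting it out — can be sketched as a least-squares plane fit over the gauge readings. This is a minimal illustration with synthetic data (a ~50 µm film plus an artificial tilt), not the group's actual processing code.

```python
import numpy as np

def remove_tilt(x, y, z):
    """Fit z = a*x + b*y + c by least squares and remove the
    a*x + b*y tilt, keeping the absolute thickness level."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)   # [a, b, c]
    return z - A @ coef + coef[2]

rng = np.random.default_rng(0)
x = rng.uniform(0, 25, 200)                    # probe positions, mm
y = rng.uniform(0, 25, 200)
thickness = 50 + rng.normal(0, 0.5, 200)       # "true" ~50 um film
measured = thickness + 0.6 * x - 0.3 * y       # add an instrument tilt
corrected = remove_tilt(x, y, measured)
print(corrected.mean())                        # back near 50 um
```

The raw scan spans many micrometers because of the tilt; after the plane is removed, only the real film-thickness variation is left, which is the quantity divided into the resistivity.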
And so each row here comes from one library: on one library we make these measurements along the composition direction, and along the other direction we just have repeated samples for repeatability. You can see there's a wide range of conductivities explored here — this axis maxes out at about one hundred twenty millisiemens per centimeter — so we have some formulations performing much more poorly and some performing better. Another thing that's important is understanding the conductivity as a function of relative humidity, so we take a relative-humidity enclosure, sit it down over this whole apparatus, and then do controlled humidity and look at how the conductivity changes as it goes from a dry state all the way to the liquid-water environment. One other thing I'll say a few words about is mechanical characterization, simply because this is a technique we developed several years ago that seems to attract a lot of interest from collaborators, and it's ideal for looking at, for example, nanocomposites where you want to explore compositional or chemical dependencies. This is an apparatus where we can do rapid screening of properties, and the way it works — you see this grid here: it's about two inches square, and we have one hundred holes in it, a ten-by-ten array. There's another plate just like it on the bottom, and they align well, and in between those two plates we put a film; that film can either be a uniform film or it can be one of the gradients. And down here, although it's hard to tell, there's actually a little indenter needle and a strain gauge, and essentially what happens is this is positioned under each hole and driven down at a nominal rate, and we get stress-versus-strain curves; then it moves to the next hole and does the whole thing over and over again.
So this is a side view — here the needle is going down, and the sample's on the bottom; this is an enclosure to control the humidity, or the environment. All right, just so you know, if you looked at one hole in this grid in cross-section, it would look something like this: you have a frame — the sandwiching plates holding the sample — and then, under deformation, the film looks something like this. In reality there is some curvature to the film; generally it's a small enough amount of curvature that we can treat it as a linear film, and we can calculate the stress and strain that would correspond to a uniaxial deformation. So we're doing a biaxial test, because that's nice and convenient for doing rapid assays, but the stress and strain we calculate are comparable to the standard ASTM uniaxial test. For example, this is nylon stress versus strain, where you can see an initial elastic response — this is yielding behavior — and then some strain crystallization, or toughening, before the material finally fails. This was just a uniform film where we've measured four or more replicate holes, and you can see pretty nice repeatability. So even though this is a high-throughput test — we can do a grid of one hundred in about fifteen minutes — it actually is fairly detailed. You can also look at the holes left behind after you puncture them and get an idea of whether the fracture was brittle or ductile. And we get pretty good agreement with the conventional assays, which are generally uniaxial tests where someone makes a dog-bone type sample and measures the tensile properties along the axis; when we compare that to our measurement, we get a pretty nice correlation.
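For each hole in the grid, a modulus can be pulled out of the measured stress-strain curve by fitting the initial linear (elastic) region before yield. Here's a minimal sketch of that step on a synthetic curve; the strain cutoff, modulus, and yield behavior are illustrative values, not data from the talk.

```python
import numpy as np

def modulus(strain, stress, linear_limit=0.02):
    """Least-squares slope of the stress-strain curve below
    `linear_limit` strain, i.e. the apparent elastic modulus."""
    mask = strain <= linear_limit
    slope, _intercept = np.polyfit(strain[mask], stress[mask], 1)
    return slope

# Synthetic curve: elastic up to 2% strain, then mild hardening.
strain = np.linspace(0.0, 0.10, 101)
E_true = 2000.0                                     # MPa, illustrative
stress = np.where(strain < 0.02,
                  E_true * strain,
                  40.0 + 150.0 * (strain - 0.02))
print(modulus(strain, stress))                      # recovers ~2000 MPa
```

Running the same fit over all one hundred holes is what turns one puncture grid into a map of modulus versus position — and hence versus composition, when the film is a gradient library.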
This is a modulus, for example, for a set of polyurethanes; we're varying the amount of curative in the polyurethane, and by doing so we vary the modulus, and this ratio — the correlation of our assay to the conventional one — is a number that's predicted by theory. So, I'd like to put up a few acknowledgements for that work, and if I have a few more minutes I was going to say a few things about another project area — and it looks like I do. Don't worry, I'm not going to show every slide. Another area that we're working in is biomaterials — polymeric biomaterials — and our goal in this area is to learn to understand, and to utilize, the way that living cells interact with material surfaces. This is relevant to a wide range of applications, and really the key questions in this technology are: how do you control cell responses in a predictable manner when they attach to and grow on a surface, and how do you sense those cell responses? The way we're addressing these questions is by using patterned surfaces — surfaces that have nanoscale to micron-scale relief features on them, where the chemistry may be homogeneous or it might be heterogeneous. So we're looking at chemical and physical patterns, as well as combinations of the two. Essentially we're looking at combinations of multiple length scales — perhaps little islands in the nanometer regime combined with islands in the micrometer regime, in a two-dimensional pattern — and trying to find out what spacings, heights, and lateral length scales of those patterns give you the optimal behavior. For example, what's the spacing, what's the pattern, that gives you the fastest growth rate for bone cells? In doing this we use the library technique I just talked about to make
surfaces that have gradients in composition, and then we anneal them on a temperature gradient. The reason for doing that is that, through the reactions that might be there, or the phase separation or crystallinity that might be there, we can force the materials to phase-separate and achieve structures from tens of nanometers up to as large as we really want to make them. So we have a way to generate, on a microscope-slide-size sample, hundreds of different combinations of surface features. Now, how can we understand the interaction of these many size scales, and the physical chemistry of the surface, and correlate that to the biological response? It's a very complicated problem. If you look at bioinformatics methods that are designed to take the data from high-throughput experiments — things like the genome project, protein sequencing and so on — there are established tools that people in those communities use to analyze data. But if you look at CSI — it stands for cell-surface interactions; I was trying to be cute with the acronym, it never seemed to catch on, though — the bioinformatics problems are all chemical-type problems, where the chemistry generally involves a small, discrete set of combinations, like amino acids or nucleotides. In materials, though, we're looking at structures that are generally non-deterministic and not discrete, meaning that we have gradients: we can adjust composition over a large range, and adjust it smoothly, and we can pick from an almost infinite variety of material chemistries. So we have a huge amount of information and possible combinations to look at, and what we really want to understand is how to relate that information together. So these are slides you may have seen before.
So from a simple experiment we make a library, and then we can characterize the features that are on the library — for example, looking at the diameter of the domains as well as the height, the roughness. We have variations from a few tens of nanometers to hundreds of nanometers in height, and from less than a micron to close to one hundred microns in diameter, and that's on one library. These are little crystalline islands that are popping out of the surface, which is in part a biodegradable polyester. Here's another library where we did a co-crystallization, and the basic idea is to generate a diverse variation in structures — the bright areas are crystalline and the darker areas are amorphous. Then what we do is take cells — usually with a collaborator; in this case these are bone cells — and we culture them on this library all at once, so that all of the cells exposed to the library of surface features come from close to the same cell population, and we run various types of cell assays on the surface of this library to find out what the biological response was of cells attached to different positions on the library. For example, this is circularity, which is an indication of cell shape that has been shown to correlate with certain kinds of cell behaviors: as it goes to zero, the cell becomes an elongated ellipse; if it goes to one, it becomes a perfect circle. It turns out that we see a strong correlation between cell shape and position in the library. This here is the phase transition for the two polymers that are blended together, and one thing we notice over and over again: out here the polymers are two-phase, and over here one phase — there's no microstructure to speak of — but inside you get a range of microstructural sizes and structures, and if we see optimal performance, the cells almost always prefer —
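The circularity measure mentioned above is, in standard image-analysis terms, 4πA/P² for a shape of area A and perimeter P: exactly 1 for a perfect circle, approaching 0 for highly elongated shapes. A minimal sketch of that metric (the rectangle standing in for a stretched cell is purely illustrative):

```python
import math

def circularity(area, perimeter):
    """4*pi*A/P^2: 1.0 for a circle, -> 0 for elongated shapes."""
    return 4.0 * math.pi * area / perimeter ** 2

r = 10.0
print(circularity(math.pi * r**2, 2 * math.pi * r))  # circle -> 1.0

a, b = 50.0, 2.0   # long thin rectangle, a stand-in for a stretched cell
print(circularity(a * b, 2 * (a + b)))               # well below 1
```

In practice the area and perimeter per cell would come from segmenting the stained images, and the circularity per library position is what gets correlated against composition and temperature.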
it's generally in here, in the microstructured regime. Here's another assay, looking at proliferation — the rate at which cells divide. After the culture is over, we can take the library, assay cell proliferation with simple fluorescent staining, and look at the areas where you have the highest proliferation compared to areas where you have no proliferation, no cell growth. So the whole question is: how do you, at the end of the day, relate the experimental details of the library — meaning the temperature and composition — to the cell response? And, importantly, how do we relate the polymer that's under there — with certain size ranges, certain chemistry, sitting underneath the cells — to the behavior of the cells in culture? Consider that we have literally as many distinct spots as we want to measure, which is generally a couple hundred on one library: we have two hundred spots, different compositions and temperatures, but each spot has a different set of microstructures on it and a different ratio of the chemistries. So how do we take all of that information and understand which of those factors are the ones controlling cell behavior? The way we do it is by putting everything in a database — an Oracle database — and carrying out statistical tests to figure out which are the important factors and which are the trivial, unimportant ones. The traditional method that we would use as engineers and scientists is a global approach, where we take an average of each property on the library — from the cells as well as from the surface — and we try to make a plot of those two and see if there is a relationship. When we do that, we get something that looks like this: if we plot the island size of the microstructure versus the cell
It's very useful, as you can tell. So we know that something's happening with island size, but we can't really say what it is. Here's another one, where we looked at proliferation ratio versus cell density; as the cells crowd each other, they're supposed to start shutting down their proliferation, and as you can see, we've got another shotgun plot here. So what we've developed is a new informatics concept for the biological-materials interaction, which we call local metrics. This is an image where the green-yellow dots are the nuclei of cells that have divided, and you can probably barely see the red: that's the cell body. The other large purplish domains are polycaprolactone islands that are raised up above the surface. So we're looking here at how the cells are related spatially to the material they're sitting on, and we have some information: the bright-colored nuclei have divided and the dark-colored ones have not. So how can we correlate that? It's based on the concept that the cells don't respond globally to the microstructure; they respond locally. The hypothesis is that a cell here is going to be most sensitive to the structures in its local neighborhood, not so much to structures that might be over here, a hundred microns away. So what we've devised is a methodology for picking a cell in an image, measuring all the distances to the other cells, classifying those distances as either proliferated or non-proliferated, doing that for every cell in the image, and then doing the same from each cell to the microstructures in that image. From our database we know the positions of all of these microstructures and cell nuclei in each image.
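The cell-to-cell measurement step described above can be sketched roughly as follows; the coordinates and division flags are invented for illustration:

```python
from itertools import combinations
from math import hypot

# Hypothetical cell records from one image: (x_um, y_um, divided_flag)
cells = [(0.0, 0.0, True), (30.0, 0.0, False), (0.0, 40.0, True)]

def labeled_cell_pairs(cells):
    """Measure every cell-to-cell distance and tag each pair by whether
    both cells divided: raw input for the local-metrics statistics."""
    pairs = []
    for (x1, y1, p1), (x2, y2, p2) in combinations(cells, 2):
        pairs.append((hypot(x2 - x1, y2 - y1), p1 and p2))
    return pairs

for dist_um, both_divided in labeled_cell_pairs(cells):
    print(round(dist_um, 1), both_divided)
# prints: 30.0 False / 40.0 True / 50.0 False
```

The same pairing is then repeated from each cell to each microstructure object, with the positions coming from image analysis rather than hand-entered tuples.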
We have a couple of hundred other images, we know where those images were positioned on the library, and for each object in an image we know its size and various parameters from image analysis. We put all of this into a database, and we can build probability functions as a function of the local distance between objects. So instead of taking a microscope image from the library and averaging the data in it for a kind of conventional regression analysis, we take that image and look only at the local interactions, those within about one hundred microns or less. In a way, we're doing a huge histogram normalization of all the data and looking at it in a different way. Surprisingly, as far as we can tell, this hadn't been done before in the analysis of cells interacting with features on a surface. What we notice is that these curves have very distinct features in them. This one is the probability that a cell at a certain distance from another cell would be proliferating, would divide. You notice that as you get within fifty microns, the probability drops below one, and it eventually goes to zero, because cells will not divide if they are overlapping; there are chemical pathways that shut that down when they are very close together, and there's another reason why that happens chemically as well. So we've got this nice tool to detect these kinds of phenomena. This slide is going over some of the statistics; let me go back, I don't want to show all of this. All right, so we've used it to map out cell interactions with the material features. Here, for example, we're looking at the microstructural size, in micrometers, of the little islands on our library, plotted against the center-to-center distance between each island and each cell.
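The probability-versus-distance construction can be sketched as a binned conditional probability. This is a minimal version that uses distance to the nearest microstructure as the local variable; the function name, bin width, and data are all illustrative assumptions:

```python
from math import hypot

def proliferation_vs_distance(cells, structures, bin_um=20.0, max_um=100.0):
    """Estimate P(cell divided | distance to nearest microstructure) in
    fixed-width distance bins out to max_um: a sketch of the local-metrics
    probability functions. cells: (x, y, divided_flag); structures: (x, y)."""
    nbins = int(max_um / bin_um)
    divided = [0] * nbins
    totals = [0] * nbins
    for x, y, flag in cells:
        d = min(hypot(x - sx, y - sy) for sx, sy in structures)
        b = int(d / bin_um)
        if b < nbins:
            totals[b] += 1
            divided[b] += 1 if flag else 0
    # None marks bins that contained no cells (no estimate possible)
    return [h / t if t else None for h, t in zip(divided, totals)]

cells = [(5, 0, True), (15, 0, True), (15, 5, False), (60, 0, False)]
structures = [(0.0, 0.0)]
print(proliferation_vs_distance(cells, structures))
```

With this toy input, the nearest bin (0 to 20 µm) gets probability 2/3, the 60 to 80 µm bin gets 0, and empty bins report None; on real libraries the same histogram is built from hundreds of images in the database.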
The color tells us whether the cells were resting or whether they were proliferating, dividing, and we can see that there are these hot spots of activity. This is actually a very rugged landscape, so to speak, in that there are a lot of peaks and valleys, local minima and maxima, but a couple of areas show up as having high activity, where cells really like to grow, and there are a number of theories about why that might happen. It's a fairly new area of biomaterials research, so we don't completely understand it yet, but the point is that with a conventional analysis, where we might pick a few combinations of these variables and look at them, we could easily miss these two combinations, which show up at essentially unpredictable positions in the phase space. By doing it on a library, we're able to find things like this that show up sporadically. And that's all I want to say, so we can just skip forward here; I think I have an acknowledgments slide, and that'll be the end of the talk. If there are any questions, maybe there's a little bit of time left for those. Thanks. This was very good.

[Audience question, inaudible]

So the variation on a library is always going to be more than in a uniform sample where you've controlled everything really well, but it's still acceptable. It's hard to quote numbers off the top of my head, but it's an acceptable standard deviation, maybe within, say, fifteen percent on the diameter.

[Audience question, inaudible]

Yes. In terms of applications, one is tissue engineering, where you would want to select out certain kinds of cell behaviors for certain applications; you can use the features of the surface, not just the chemical features but the physical ones as well, to select out preferred cell behaviors.
Maybe causing cells not to grow in one area of the device and causing them to grow more in another area; those are things we can learn how to control. Another major application is in diagnostics: designing the surfaces of, say, lab-on-a-chip devices, where those surfaces pre-select certain types of cell behavior in order to perform an assay of some kind.

[Audience question, inaudible]

Certainly. Absolutely. Yeah, there are a lot of applications where a gradient will artificially influence the response. The whole question there: usually these are cases where some kind of instability, a kinetic instability, is created by the presence of a gradient, and usually that instability doesn't become active until the gradient exceeds a certain magnitude. All I can say is that in several cases, like the phase-separating polymer case, as well as polymers that segregate, we've looked at how the gradient affects the segregation across the surface, and there's a large range of gradients that have virtually no effect. But then you hit a maximum gradient above which the phenomenon changes drastically, and usually that's when the gradient changes some feature, like the thickness, significantly over the same size scale at which the phenomenon is occurring.

[Audience question, partly inaudible]

Exactly. And in fact that's exactly what we've done, and as a matter of fact it's very important when you're trying to correlate the cell-material behavior. The fact is there are millions of cells there, and they're all in a sense communicating with each other, and you have to appropriately deconvolute those two effects. The truth is nobody really knows how to do that properly, simply because there is no general mathematical theory that takes into account all the complexities of the cell-cell interaction. But we can do it at least at a numerical or phenomenological level, because we can have a distribution function for the cell-cell interactions, at least for certain properties like proliferation, and a distribution function for the cell-material interactions. And by looking also at comparable uniform surfaces, featureless surfaces compared to the libraries where there are features, we can in essence deconvolute roughly what's going on. Thanks.