Thank you. I'm going to ask you all to turn off your phones if you can. And also, please do remember to toss your trash before leaving. That will help us keep the nice new floors and everything looking great. So I'm going to introduce today's speaker, Dr. Kyle Yoshida. He's currently a postdoctoral fellow at WSU and is an incoming assistant professor at UCLA, in the Department of Mechanical and Aerospace Engineering, but he's more on the mechanical side, so don't ask him about airfoils. He received the NSF Graduate Research Fellowship and also the Washington Research Foundation Postdoctoral Research Fellowship, and he does a lot of really interesting work around soft robotics, haptics, perception, and human-robot interfaces. A lot of it comes from his background, our shared, maybe, heritage as Native Hawaiian folks, Kanaka, and he's going to talk a little bit about that. Something really cool that I want to highlight, and part of the reason I know Kyle, is that he's the executive director of Hona Scholars. That is a STEM mentorship program recognized as one of the top ten Native STEM enterprises by the American Indian Science and Engineering Society. He has done amazing work in that area that goes far above and beyond in how you support your community. So with that, I'll hand it over to you, Kyle. Let's welcome our speaker.

Thanks. All right, thank you for the intro. We're good on the mic; I hear myself. Yeah, thank you all for coming. The lunch is amazing. Today I'm going to talk about some of my work in haptics and soft robotics, and also a flavor of that community integration that was just mentioned. I'll start off with a little bit about me. I was born and raised on Oʻahu. I went to Oma High School. Yep, it's where all the Novans go. There's also Punahou, where Obama went, but that's our enemy school that we don't talk about. Some fun things I like to do are scuba diving; this is one of the shipwrecks in Waikiki.
There's a few that you can get to. I also really like hiking, particularly hikes through forests; I am scared of heights. After graduating from high school, I went on to Harvard, where I did my bachelor's in bioengineering, and I also did a minor in African studies. That also helped shape a lot of the viewpoints that I have in engineering. Then Stanford, where I did my PhD and master's with Dr. Allison Okamura at the Collaborative Haptics and Robotics in Medicine Lab. Currently, I'm a postdoctoral fellow at Washington State, mainly working on soft robots for orchards, like apple picking, and also interaction with humans. And then, as mentioned earlier, I'll be starting as an assistant professor at UCLA this coming July, July 2025.

There are three main pillars of my research in developing new haptic and robotic systems to improve daily life: haptics, soft robotics, and community engagement. In the realm of haptics, some things that I work on are developing new haptic devices, mainly wearable haptic devices, and understanding how people perceive these devices, so conducting user studies around that. In the realm of outreach and community engagement, I work on outreach and education. Some projects that I'm currently working on ask how we can use Native values to inspire the engineering design process, using culturally based ways of knowing and ways of thinking, and how those actually contribute to our work in engineering and science. In the realm of soft robotics, I work on actuator design, trying to build new actuators that do different types of things. Then, at the intersection of these three bubbles, I also have a few more collaborative projects. One that I'll talk about today is these crowd-sourced and clinical perception tests that use smartphones.
The idea is that we can do haptic studies, but also integrate the broader community by having them participate in our experiments, and use it as an educational tool for people who might not have as much exposure to what we do inside the lab. I also work on collaborative robots: robots for agriculture, for environmental monitoring and surveying. This is a project that we're currently working on with apple picking in Washington, working closely with farmers and people who run orchards, and seeing how we can develop really low-cost soft robots to work alongside current farm workers. Also, at the intersection of haptics and soft robotics, there are a lot of new ways that we can build human-robot interfaces to control these robots, such as wearable displays and new tracking systems. Then, at the center, there's this area of social robot interfaces, where we can use tools in haptics and soft robotics that influence things like healthcare and different affective feedback methods. Today I'll mainly focus on these areas, looking at some of my lab work in haptics and soft robotics, and then how I use some of the tools from my background to bring it outside of the lab and into the community. I'll talk about a wearable haptic device; the massive open online platform for haptics, which uses cell phones; fiber-reinforced soft actuators; and some of the current and future work on collaborative robotics.

First, I'll talk about a three-degree-of-freedom pneumatic haptic device that was created for the forearm. This is presented in two papers: a conference paper and a Transactions on Haptics paper. But before getting into the nitty-gritty details of the work, I'll preface this with some background on haptics, because it might not be as familiar to everyone. Haptics relates to the sense of touch. Haptic feedback comes in two main pathways. The first is kinesthetic, and kinesthetic deals with forces that act across joints.
We can think of kinesthetic feedback as the location and configuration of our limbs in space: the motion of those limbs, the large forces that we feel, like carrying a heavy bottle while pouring water, and also the compliance of different systems that we feel through our joints. When we try to move our hands around in space, we use kinesthetic feedback to locate where our limbs are. The other type of haptic feedback is cutaneous, or tactile, feedback, and this deals with skin-level feedback: things like temperature, texture, slip on the skin, vibration, and force. When we think of pouring a glass of water, some of the kinesthetic feedback we might have is the weight of the bottle and the glass and the orientation of our wrist in space, while the cutaneous or tactile feedback would be things like the temperature of the water, the temperature of the glass, the feel that you're holding a glass and not a paper cup, and also the small vibrations as you're pouring: you'll feel that trickle, and it vibrates the tips of your fingers. What mediates the haptic feedback in our skin are mechanoreceptors, and different mechanoreceptors are able to sense different modalities of feedback. There are things like Pacinian corpuscles for vibration, Merkel cells for pressure, and a few other ones that sense things like stroking, normal force, temperature, et cetera. These mechanoreceptors are also distributed differently throughout the body. As someone who works on a lot of wearable haptic devices, I find this a really interesting area, because wearable devices are commonly worn where we would wear a watch, like on the wrist. However, in these areas there's actually a lower mechanoreceptor density, making devices in this space a little more difficult to design. So let's think about haptic devices in daily life.
Some kinesthetic devices are things like the desktop devices that exist today. People have used them for surgical simulators and dental simulators, where you move a pen around in 3D space, and a set of motors lets you interact with a virtual object in 3D space, rigidly touching it and feeling it through the pen. Another example of a kinesthetic device would be an exoskeleton or exosuit, which delivers forces across joints in the body. On the other hand, cutaneous devices tend to be much smaller, because they don't need big, bulky motors to deliver large forces across joints. A lot of these wearable cutaneous devices are often on the hand, where we have that higher density of mechanoreceptors; you'll notice that wrist-worn or arm-worn devices are much larger. A cutaneous device that a lot of you encounter in daily life might be your smartphone or your PlayStation controller that delivers vibration. Many fingertip haptic devices can produce salient tactile cues, but as mentioned earlier, because of the sparse mechanoreceptor distribution, building displays on the arm or other parts of the body creates a few complications. Because of this, most wearable tactile displays on the arm take up a lot of real estate. These displays have been created to deliver social haptic cues, things like haptic emojis, where you can deliver haptic cues that elicit feelings of happiness, joy, or sadness; directional cues for path following and navigation; and consonant articulation, the delivery of language through touch. But you'll notice a lot of these do take up a large amount of real estate. Another method that people have investigated is using multimodal cues: instead of one device across the whole arm that delivers different patterns of vibration over a large space.
What if we had a device that could produce vibration, or a type of skin shear, or normal indentation? Those multimodal cues would, in theory, interact with different types of mechanoreceptors, allowing users to identify different cues. The goal of this first project was to develop a wearable haptic device that uses multiple degrees of freedom for feedback on the forearm. There were two main research questions. The first is, how can we design devices to add more cues? A lot of shear devices were one-directional. What if we could give different paths on the arm, like north, south, east, west, and all the intercardinal directions at 45-degree intervals? Second, how can we use rigidity and softness in designing these haptic devices? Ideally, when we're thinking of wearables, we want something soft and compliant to be worn, but as we'll see in this project, that doesn't always play out so well, and you might need some rigid components in addition to soft materials. This is a video just showing the device working on the arm; we created a device that uses motors and soft actuators to deliver directional shear cues at the wrist. Going over the design of this device: we use soft actuators that are constrained with fiber. These are fiber-reinforced elastomeric enclosures, and they create all the linear movement on the device. There's also a tactor that uses a bubble soft actuator that goes above the skin. All of these soft components create all the linear motion needed. In addition, we use a rigid motor for rotational motion, to orient the direction that the shear will occur at. For this in-lab test device, we also integrated force sensors, position sensors, and, offboard and not shown here, pressure sensors that measure the air pressure in each of the pneumatic components. That was done to characterize how the device actually works on the skin when it's delivering forces. A lot of haptic devices are worn on the skin.
They don't actively measure the force delivery; most of them work in an open-loop paradigm. For this device, we ran the tests open loop as well, but we wanted to measure the forces to see if they were consistent in how they were being delivered. I won't talk too much about the data from those sensors, but there was a whole subset of analysis on the forces, the distance that each cue traveled, and how those match the way people perceive the haptic cues. Using the different degrees of freedom, the device can provide multimodal haptic cues. By combining the normal and lateral soft actuators, we can create shear. By combining normal and rotational motion, the device can create torsion: the bubble soft actuator in the middle bubbles out, touches the skin, and creates a twist on the skin. You can also add vibration to any of these degrees of freedom by creating pulses in the pneumatic lines or by oscillating the motor. This is an example of how we can have more traditional vibration cues, which might be more familiar from a phone, and this could be used for alerts. To test the device, we mainly wanted to analyze how the shear cues on the device worked. To do this, we ran a user study with 28 participants doing two different scenarios, a four-direction task and an eight-direction task. These are analogues of some previous experiments that we did in lab on earlier devices. The study consisted of three phases: a training phase, a practice phase, and finally a test phase. For the four-direction task, we were able to get an accuracy of 85%, and note that chance in this case is 25%: if you had a participant guessing randomly, you would expect them to perform at about 25%. The average angular error was 19 degrees, and shown here is a confusion matrix.
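As an aside, these summary metrics can be computed directly from a confusion matrix. Below is a minimal sketch with made-up numbers, not the study's actual data (the real matrix and the 19-degree figure are in the paper):

```python
import numpy as np

# Hypothetical confusion matrix for a four-direction task
# (rows = delivered cue, columns = perceived cue, in percent).
# Directions are spaced 90 degrees apart.
directions_deg = np.array([0, 90, 180, 270])
confusion = np.array([
    [85,  1, 12,  2],
    [ 2, 86,  2, 10],
    [11,  3, 84,  2],
    [ 1, 11,  3, 85],
]) / 100.0

# Overall accuracy: probability mass on the diagonal,
# averaged over delivered directions.
accuracy = np.mean(np.diag(confusion))

# Mean absolute angular error: weight each (delivered, perceived)
# pair by its probability and by the smallest wrap-around angle
# between the two directions.
diff = np.abs(directions_deg[:, None] - directions_deg[None, :])
ang_err = np.minimum(diff, 360 - diff)
mean_ang_err = np.sum(confusion * ang_err) / np.sum(confusion)

print(f"accuracy: {accuracy:.2f}")             # 0.85 for these numbers
print(f"angular error: {mean_ang_err:.1f} deg")
```

In this made-up example, the opposite-direction confusions (the 180-degree entries) contribute most of the angular error.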
For those who might be less familiar: on the x-axis is the delivered stimulus, and on the y-axis is what was perceived. In an ideal case, you would have 100s going across the diagonal, and you would want it to be all dark blue; that would mean all of the cues delivered by the device were perceived correctly by the user. From this example, we can see that a lot of the confusion comes from cues in the opposite direction. If we're delivering a cue that's up, like in the first column, about 12% of the time participants perceived it as going down. This is an interesting thing. Earlier, I talked about rigidity and softness in these devices. So why are participants feeling this opposite force? It's because the device has to be grounded on the arm, because it's a wearable. When you think of the forces being delivered, each force has an equal and opposite reaction force. So when the cue is delivered, the device grounds that reaction on the arm, and you get that confusion in the force delivery. In the eight-direction task, as expected, accuracy is lower: 43%, which is still significantly better than chance at 12.5%. In this case, we see that a lot of the errors in identification are at angles adjacent to the delivered stimuli. The angular error is 49 degrees, a metric showing that responses were typically within about one boundary of the desired direction. So, in summary, we created this wrist-worn haptic device that uses soft actuators and a motor, and we were able to create a shear display on the arm that gives linear directional cues. I didn't talk about the displacement models today, but those are in the paper. And users can interpret different cues from the device in an effective manner. After my first few years of work in haptics, I'll now talk about this massive open online platform for haptics.
But rewinding to COVID-19: I had some time in lab building these haptic devices, but when I went home during COVID and talked about what I do with my parents, they were like, yeah, I don't know what that is. What are you even doing? So I was thinking of ways that we could bring the work into the community, teach people about it, and maybe use it as an educational tool. Another thing that I worked on during COVID was Hona Scholars. Here's a short clip from some of our scholars: "Take anything from this panel, please take this: come home. Come home. There are jobs for you. There are programs for you. We need you all. Come back to Hawaii." "Your presence in that place is important and valuable." "One very important thing which I didn't do, which I highly recommend doing, is networking and communicating with people. Don't be afraid to ask for help. Don't be afraid to ask your resources, the family and the friends, maybe something like Hona Scholars. It is completely invaluable. That was what kept me going through different things, when you're the only Kanaka in class." "We have a STEM industry here in Hawaii, and it's very important to know that there is a job for you. Thank you so much to Hona Scholars, and for the continued success of this organization, the advancement of STEM opportunities in these islands, and the future of Hawaii and all of our people."

Cool, that was our little commercial snippet. Essentially, when I went back home during COVID, a lot of people were struggling to keep up in school with remote and distance learning. But it's not just about understanding what is going wrong; it's about how we can use COVID to our advantage. Hona Scholars started out as an online STEM mentorship resource, and we realized that because everyone was getting used to Zoom, distance learning, and all these online meetings, it was actually a good way to loop in Kanaka who are not in Hawaii to support people in Hawaii.
It's about figuring out how you can take things that are not ideal and find new ways to benefit the community. Over the past three years, we've grown it. We've gotten a lot of sponsors along the way. Now we hold annual symposiums, and we have a proposal competition; this year, we're giving $5,000 awards to students. Over 100 students in Hawaii have applied each year for the past few years. It's about growing these new ways of thinking and new ways of getting ideas into the community. Using that focus, I also learned about a lot of other projects in the community that were crowd-sourced. There are things in Hawaii like the King Tides project, where they have everyone go out with their smartphones and take pictures of shorelines, and they analyze them to see how climate change is impacting the coast. There are also projects that crowd-source bird watching in Hawaii, where there are databases of birds and they track the frequency with which birds show up. With all of these crowd-sourced tools, I thought, hey, what if we could implement that in the work that we do in haptics? So, along with a few other people at Stanford, I created what we term the massive open online platform for haptics. The premise is that we're doing crowd-sourced haptics experiments that people can do anywhere. The idea is that a participant could just scan a QR code, go to a website, download the app, and do a bunch of user studies, giving us information as researchers, but also learning about the tools that we use in our academic realms. I got a few of my friends to try it, just using the smartphone and walking around campus. There have been three large papers that have come out of this work. The first is a reaction time study, which measured reaction times to different audio, haptic, and visual cues delivered by the phone.
I also did a distraction study, which I'll talk more about today, and then clinical tests where we can use phones to deliver vibrations to monitor peripheral neuropathy. We're currently working with doctors at Stanford, getting a larger data set and matching these to traditional clinical benchmarks. What's really interesting about all of these projects is that in Hawaii, where there are remote communities and people on different islands, having these tools, clinical assessments, and learning apps allows people in more rural areas to connect with some of the things that we're working on at the universities. So today, I'll talk about one of these in a short vignette: the distraction study. This is a paper in Transactions on Haptics. We often design haptic devices for daily life, but a lot of these haptic devices are only tested in lab. There are haptic devices for outdoors, to help people who are blind walk; for steering wheels, for feedback; and for AR and VR. But the real world has a lot of distractions; it's really complex and challenging. We'll see if the video loads. Yeah. Another researcher has talked about how running in-the-wild user studies is inherently challenging. In haptics, we're always building these custom devices, and even when we're using them in lab, they break every few participants and we have to redesign and rebuild. So what if we could use the phone as a platform to test and study these concepts in the greater world? This project had two main questions. The first is, how does our vibration perception change in the presence of cognitive and physical activity? The second is, can we use smartphones to broadly study vibration perception? Some prior work exists on distractions and how we perceive vibrations. It's been found that when you're contracting muscles, you sense vibrations differently.
Also, when people are walking around, their ability to sense vibration changes at different parts of the body. And in some pilot studies, it was found that when people are trying to identify haptic signals, they have a more difficult time under cognitive load. To study this concept more in depth on a phone-based platform, we created a smartphone app that delivers a vibration response task and a shape memory task. The vibration response task consisted of delivering vibrations at nine different intensities, with ten repetitions each. The user simply presses a red button every time they feel a vibration. So you can imagine you're holding the phone, the phone is vibrating randomly, and we just say: press the red button every time you feel a vibration. The shape recall task was done to impart cognitive load. How it works is that four different shapes are shown in sequence, one every 3 seconds, in a pattern such that we ask the user to press a blue button when they see the same shape shown twice consecutively. This is a play on the n-back task, which is commonly used to impart cognitive load. For example, if we had a shape sequence of square, circle, triangle, triangle, square, you would press the blue button when you see the second triangle. The shape is just shown on the face of the phone. In addition, we designed a phone case to be used for the in-lab user studies. The idea is that the phone case helps control where people place their hands. We took vibration measurements with an accelerometer, and the phone case was designed so that no one could squeeze the phone, because imparting extraneous forces would also affect vibration perception. Alongside this study, as I mentioned earlier, we are also doing that clinical study.
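The response rule for the shape memory task can be sketched in a few lines (a hypothetical helper, not the actual study code):

```python
def one_back_targets(sequence):
    """Indices at which the user should press the blue button:
    positions where a shape repeats consecutively (a 1-back rule)."""
    return [i for i in range(1, len(sequence))
            if sequence[i] == sequence[i - 1]]

# The example sequence from the talk, shown every 3 seconds:
shapes = ["square", "circle", "triangle", "triangle", "square"]
print(one_back_targets(shapes))  # [3], the second "triangle"
```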
So we wanted to create standard ways of holding phones, so that we can understand what cues are being delivered to the user. In this experiment, we had 24 participants with four combinations of cognitive and physical activity, denoting a low level of each activity with a lowercase letter and a high level with an uppercase letter. In the top left corner, where a participant is sitting and simply responding to the vibrations, we denote the condition cp. We also had a condition where the participant does the same vibration response task, but on a treadmill while walking. We also tested the case where they're seated but have the n-back shape memory task; this is the case with low physical load but high cognitive load. Finally, there's the case where they're doing the shape memory task while walking and responding to vibrations; this is high cognitive load and high physical load. This slide shows example data from one participant: a psychometric curve, which is just a plot of the proportion of detected vibrations at each amplitude. On the x-axis, we have haptic intensity, which is the vibration amplitude, and on the y-axis, the proportion of vibrations that were detected. As expected, in the bottom left-hand corner you have very few vibrations detected, because they're harder to sense, while large vibrations are almost all detected. For each condition, we can plot the data for each participant and fit it with a function. Then, using where it crosses the 50% detection threshold, indicated by the black dotted line, we can see how the different conditions affect perception. In this case, going from light green to dark green, which is increased physical demand, we can see that the curve shifts to the right, indicating that larger vibrations are needed for detection.
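The threshold extraction described above can be sketched as follows, assuming a two-parameter logistic as the fitted function (the paper may use a different psychometric function, and the data here are made up):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Two-parameter logistic psychometric function:
    x0 is the 50% point, k the slope."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical data for one participant in one condition:
# nine vibration amplitudes, proportion detected out of
# ten repetitions each.
amplitude  = np.linspace(0.1, 0.9, 9)
p_detected = np.array([0.0, 0.1, 0.1, 0.3, 0.6, 0.8, 0.9, 1.0, 1.0])

(x0, k), _ = curve_fit(logistic, amplitude, p_detected, p0=[0.5, 10.0])

# Because x0 parameterizes the curve's midpoint, the 50% detection
# threshold is x0 itself; a rightward shift of the curve under
# cognitive or physical load shows up directly as a larger x0.
print(f"50% detection threshold: {x0:.2f}")
```

Fitting one curve per participant per condition and comparing the fitted thresholds is what supports the rightward shift described above.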
That previous graph was just for one participant. We can plot all the participants on this graph and get averages. In the yellows and reds are individual subject data, and the means and standard errors are shown in the purples and greens. Here we found a significant effect of both cognitive and physical activity on vibration perception. When we go from the low physical activity state to the high physical activity state, we see an increase in the vibration perception threshold. Similarly, when we go from low cognitive activity to high cognitive activity, we also see an increase in that threshold. We also measured how response time to the vibrations changes between conditions. Here, we actually found that the main factor determining response time was the cognitive load on the user. In the low cognitive load conditions, they have shorter response times, but with the dual-task assessment, with the shape memory task, they have slower response times. We also found that participants reported increased demand with task difficulty: the mental and cognitive demand that we imposed on users resulted in an increase in user-reported mental and cognitive demand. In summary, we found that cognitive and physical activity can impair vibration perception, that cognitive activity increases response time to vibrations, and that widely accessible smartphones can be used to study vibration perception. And as a catch-all, I'm really interested in working with future collaborators on bringing other apps or studies onto this platform, and also on using it as an educational tool, where we can teach people about statistics and a lot of the other tests that we do.

So now I'll talk about some work in soft robotics, starting with a fiber-reinforced soft actuator. This was a RoboSoft paper from all the way back during COVID, in 2020.
A lot of times, we use rigid robots, and they work very well in predictable, controlled situations such as manufacturing, but they also pose risks to humans: if you code your robot wrong, or a human goes within the barrier, it can actually cause harm. One of the methods that nature inherently uses is soft-bodied mechanical intelligence. Elephant trunks are flexible; they can move around and do a variety of tasks. We also have animals that can fit into really small spaces by using that inherent soft-body compliance. Soft robots are one avenue we use to mimic this. We have robots such as McKibben actuators, artificial muscles that use fiber patterns on the outside to constrain them, and also soft robots that have gaits and can walk. One key class of robots is called fiber-reinforced elastomeric enclosures. These build off those McKibben muscles, using constraint patterns to dictate how an elastomer deforms. It's just silicone with strings wrapped around it, and you can cause it to extend, twist, expand, or bend. However, similar to rigid robots, a lot of them are one robot for one application. You have one robot that can flip pancakes really well, but it doesn't do laundry. What if we could have soft robots that can change their output and do different things? The idea was: can we create a reconfigurable soft actuator that can produce all of these different output configurations? How can we actively change the configuration of these soft robots, and how can we actively change the fiber patterns to dictate those outputs? To do this, I designed an active fiber-reinforced elastomeric enclosure, which uses active fibers constrained around the soft actuator. How this works is that two motors independently control two sets of fibers; the two motors are shown on the bottom.
We also have a spool at the top, because as you change the fiber pattern around the actuator, the string length will also change. Finally, in the middle is the soft actuator itself, which has a pneumatic input for actuation. Here we can see that when we use different fiber patterns, the actuator produces different outputs. The idea is that you could create an actuator with multiple output configurations that can be used in a variety of scenarios. To characterize this, we pressurized the actuator, tested the different fiber angle combinations, and measured the twist and displacement. We also had two conditions, one where we spooled the slack in the fibers and one where we didn't, with the premise of asking how we could simplify the design. This is data shown at the highest pressure, and we found that the actuator can create varied angles of twist from +60 to -60 degrees, with two to four millimeters of displacement. As expected, there's a predominantly symmetric response, because the fibers are symmetrically patterned. I'll point out a few things. In the bottom left and top right corners of each of the graphs, we have the maximum twist, which also coincides with the maximum displacement; this is expected because those fiber angles are aligned. In the earlier examples with fixed fibers, the fibers that produce predominantly twist are candy-cane striped, and in those cases you would expect maximum twist; when they're cross-patterned, like a double helix, you would expect minimal twist, and we similarly saw that in these plots. We also found that there is a point at which the fibers engage and control the response of the actuator. We took some dynamic response measurements, and we found that at different pressures, you'll get different steady states based on how the soft actuator interacts with the fiber configuration.
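The coupling between fiber pattern, extension, and twist that these plots show is consistent with the standard idealized kinematics of fiber-reinforced actuators. As a sketch (a textbook-style inextensible-fiber constraint, not necessarily the exact model used in this work):

```latex
% A fiber of fixed length b wrapped helically around a cylindrical
% actuator of length L and radius R, subtending total wrap angle \Theta:
b^2 = L^2 + (R\,\Theta)^2
```

Each fiber family contributes one such constraint, and on pressurization the actuator settles at a length and twist satisfying all of them: two mirrored families (the double-helix cross pattern) cancel each other's wrap angles and force near-zero twist, while a single-handed candy-cane wrap converts extension into rotation, matching the corner behavior in the plots.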
To summarize, we found that these active fiber constraints can increase the range of motion and create output patterns beyond what traditional one-to-one mappings of robots to applications allow. Currently, we're working on coupling these actuators and reducing their size. Our current version actually only uses two motors, so we found ways to reduce the complexity of the design.

After designing these soft robotic devices inside the lab, I became interested in the area of collaborative robotics, which I know a lot of people here might be working on. We often see that robots are not well integrated into our communities, to the extent that the way a lot of people see robots is through their vacuum cleaner. But we also have robots that can do a lot of tasks, like manipulation and grasping objects. This could be really critical in home environments or in senior care facilities, so I think there's huge potential for growth. There are a lot of reasons why we don't use traditional rigid robots in daily life. Some of it stems from the technology itself and the dangers it poses, but there are also psychological constraints, like fear of obsolescence, if you're trying to design these robots alongside workers. I recently heard a story where there were robots used in hospitals, and nurses tried to sabotage the units because they didn't want to be replaced. I've also seen, in some airports, robots that pick up trash, and people will intentionally step in front of them just to piss the robot off. There are a lot of these things that people do when they encounter new technologies. Soft robots actually open up a really different avenue: because of that biological inspiration and compliance, they might actually be ideal for collaborating with humans. This soft robot is called the vine robot. It came out of the PhD lab I worked in at Stanford; essentially, it's an everting tube.
You can use air pressure to evert the tube so it grows, which lets it do a variety of things, including growing through really small spaces. The best part is that it's pneumatic and made out of fabric or low-cost plastic. Building on that, we used similar concepts to build fabric-based robot manipulators that can actually be used for collaboration. Currently, we're working on this in a few areas, but essentially we can use these concepts from soft robotics to design new types of robots that interact with people in daily life. In agriculture, we're working on new soft grippers; we have a soft robot system that can grow out toward apples and pick them, which is some of my current work in Washington. We're also studying the psychological aspects of human-robot collaboration. Compared to a traditional rigid-linkage robot, what if we replaced it with a soft-bodied robot that poses less risk to the human? How can we control those robots to make collaboration better, by changing speed or proximity? Does it work in a cooperative or collaborative manner? Those are some of the questions I'm currently tackling, and it shows how carrying out the primary research on the robot itself makes you think about how the robot can be applied in different communities.

As an outline, I covered a few different topics in haptics and soft robotics, and also how I use some of my background to bridge those areas into more community-facing projects. Altogether, this forms some of the future collaborations I'm interested in, in the realm of soft robotics, haptics, and the community-based education approach we can use in research. I'd like to acknowledge the funding sources for the different projects. I've worked with a very large number of people throughout my PhD and at the beginning of my postdoc, but a shout-out to a lot of the really strong undergraduates who made much of the work possible.
I really enjoyed mentoring these students and working alongside them, along with my PhD advisor Allison and my postdoc advisor Ming. And I think, yeah, that is my last slide. So I'm happy to take any questions, and thank you for having me.

Questions? I have a mic if people want to use it, or you can yell out. I do have a few that I can kick us off with if you want. Kyle, one thing I noticed, and I really appreciated the talk, is how you wove some of your own journey and story into it. There are different types of students in the room, and I think something a lot of students ask is: how do you do the scientific, technical, academic work and then also give back to or work with your community? How did you think about that when you were first starting out?

That's a great question. When I was in undergrad, I went on a retreat, and there was a business school professor who gave a talk. We've all come from cultural backgrounds; for me that's Native Hawaiian values, but I'm also part Japanese. I didn't know of it at the time, but there's this Japanese concept called ikigai, which is about the balance of what you like to do, what the community needs, what you're good at, and what you can be paid for. As I've talked to other people, I've come to believe that money isn't a factor in what I do or what I should do, because as long as we focus on the other three areas, what we love, what we're good at, and what the community needs, the money will always come after. So as a student, I was always thinking, oh, I really love some math things (not too much math, but some), so how can I take my engineering and connect it back? How can I use this for the love of community,
while also advancing the body of research I'm working in? So I think, as a student, no matter where you are, high school or undergrad, there's always a project you might be actively working on now, but also ask: how can I use those skills to work on something the community or I might be interested in as a side project? Actually, a lot of those projects, like the clinical perception tests, started off as side projects, and my advisor said, you should try pursuing it and see what happens. Eventually it grew into a large section of the lab's work. So pursuing that passion while using your existing skills, I think, is a great practice. Any other questions?

Thanks for the presentation. My research topic is related to haptic feedback. Have you considered any passive haptic devices for similar applications? Do you have any experience with that?

Those are great questions. One of the people who worked on this, Katy, was a fellow PhD student, and she actually helped a lot with these projects; she works on passive haptic learning. I haven't touched on that specifically, but I am interested in passive haptic solutions, where the haptics aren't too much in your face. There's a project I'm working on now that uses haptic devices with robots, focused on whether people can detect the haptic cues, how they respond to the cues themselves, and how you can use the cues to train them to do certain motions. I think that's a whole other area that's near to me and that I'm really interested in. The best ideas are systems that are unobtrusive and fit naturally into what we're already doing.

Do you see soft-bodied robots changing people's perception in regards to the fear of obsolescence? I know you talked about the fear of harm, but the design looks far more approachable than something rigid.
Do you see that combating that fear as well, the fear of human obsolescence?

Yes. So for the fear of human obsolescence, that's something that I haven't fully tackled yet. I will say that recently I did a trip to Hawaii to see how we can design robots for agriculture, and a lot of people said it's really important to have the kids come out and actually work the land. And I 100% agree. I think the key in using robots is having them work alongside humans, rather than explicitly replacing humans immediately. In the orchard work in Washington, there's a huge labor shortage: there aren't enough people to go and pick, the workforce is diminishing over time, and as climate change advances, conditions for the workers are getting harsher. People have been working for 30 years on these robots, and we haven't found a way to fully automate everything. So the current work is actually trying to see how these robots can work alongside humans. Maybe there's fruit that's harder to reach or more difficult to grow, and the robot can handle that while working alongside the humans. I think the same goes for healthcare: how can robots work alongside a healthcare worker? Like the robotic carts that go around hospitals — the cart cannot really replace a nurse's job; we definitely need them. But there's still this resistance to certain things in robots, and I think the answer is to have a design process that integrates the community and the people you're working with, to have them build their robots or build them with you, and to look at areas where robots and people can coexist. Ultimately, I think in a lot of cases we won't explicitly replace people; in most cases the question is how we can work alongside robots and ease our workload together.

I guess this is a question about the future of haptics.
I feel like we went from a world of digital systems where people were exploring haptic options but nobody really had them, to all of us having vibrating devices, where vibration is a very common output modality for digital appliances. And I often wonder: is it the right haptic output for all of us? Is there a future where there might be something quite different, maybe richer than vibration, which really only lets you pulse — you can get two vibrations versus one or three — and feels fairly impoverished? So I guess I'm wondering where the future of haptics will be. My phone? Simple question.

That is a great question. A lot of you might have seen the haptic gloves Meta came out with; Mark Zuckerberg had a really cool demo where you can grab things in virtual space. Devices like that highlight where the field is, but it's still at a very early research stage, and a lot of those devices right now are super expensive. So there are ideas for how we can make these sorts of devices cheap enough to sit right next to the computer so we can use them all the time. I do think that in the future we will move beyond simple vibrations from phones, but it will take a lot more work because of the energy use. Vibration motors are really small, but to deliver shear forces, or for wearable haptic devices that might go on other parts of the body besides the hands, that kind of force output will use more energy, and it will also heat up. So there's work to do on the materials and energy side, but also on the haptic perception side. But yes, I think there are pathways to move beyond traditional phone vibration. Other areas that I think are very appealing are paper-based haptic devices.
Because paper can go through roll-to-roll manufacturing, it's really easy to mass-produce, so I think that might be one of the intermediate transition steps. You might see it soon — I won't give the years because I might get called out. I'm also really interested in fabrics, which might be the next step: normal indentation is very simple and low cost, and then we can go beyond that. That's a very provocative question; it's fun to muse about what the future holds.

Awesome. Well, we're almost at time, so let me remind you all again to take your trash and put it in the trash can. Kyle will be up here for a few minutes if you want to come by and say hey, but otherwise, thanks for coming.