[00:00:02] >> Well, hello everyone. I am really, really excited about having this conversation with my colleague, Professor Ayanna Howard. It's very interesting that we are having a book conversation in person, with some colleagues joining us in the room, which is refreshing. We were commenting that we've managed to have the first sort of physical book presentation in a long time, but her book is not physical: it's an audiobook, which I downloaded from the cloud and listened to. Just a beautiful book. The other thing that you need to know is one of the things I've learned we have in common, [00:00:47] which is the fact that both of us are introverts. I learned that about you in the book. So here you have a conversation about a virtual book, by two introverts, about very difficult topics. What can go wrong? Well, thank you so much, first of all, for agreeing to discuss it; this has been just a fabulous book to listen to, really. But before I ask you questions about it, I want to get your sense of trust, as someone who has dedicated her life to robots. [00:01:29] I want to get your sense of trust in robots doing things in your own life, so let me ask you about some examples of robots and whether you would trust them or not. Here's an easy one: would you trust a robot vacuuming your house? >> Yes, I have one. >> So you wouldn't mind that. Okay. A robot mowing the grass? >> Yes, I'd be fine with that. >> How about one driving your car? >> Yes. [00:02:31] >> And this may foreshadow a more complex conversation: how about a robot policing your neighborhood? >> Absolutely not. >> So, by the way, just for all of you to know, the book that we're discussing today is an audiobook, beautifully narrated by Amandla Stenberg,
[00:03:02] who is known for her role in The Hunger Games, and Dr. Howard just gave me a hard time for not knowing that before. But I'll tell you, the performance, the reading of the book, is phenomenal. Now, I know that President Obama just announced his book, and he reads his own audiobook. Walk me through the process of having someone else narrate yours. >> So originally the publisher asked who I would want to narrate it, someone who thinks the way the book thinks, and they suggested Amandla Stenberg. [00:03:32] For anyone who isn't into science fiction, you might not know her, but I did, and I was certainly not reading it myself. [00:03:56] >> Actually, it's pretty awesome. I have a little bit of it, so you can hear it; we're going to use some very advanced Georgia Tech technology to play it. [audiobook excerpt] "If I could collect a dollar for every time I hear that AI is going to take all of our jobs, I would probably be a millionaire. Or is it the other question that makes humans worry, that if we make robots smart, they will destroy us? If we only knew how not-so-smart robots really are right now. In these situations I take up the challenge to explain both sides of the story. I feel it's my duty to share my knowledge, while at the same time sharing some tidbits about myself. I'm a roboticist by profession; I'm also the oldest black member of my lab, although I still feel pretty young. To hear the rest, you have to buy the book, [00:04:58] or you can download it from a local library near you." >> So there, you can hear it for yourself.
>> Okay, this is live. We did have a short conversation about this beforehand; you shared with me what your mom said when she heard the voice. >> She actually said, you know, she sounds like you did 20 years ago, which I took as a compliment. And if you listen to it: I have a certain humor, and I'm into science fiction, so if you're into science fiction, you will love it. She totally gets it. [00:05:36] >> It does sound like you, only younger. No, it is a beautiful piece, and I love the way you weave your own personal life together with very profound questions about AI and robots, questions that even people, especially people, who don't know anything about robots should be asking themselves. On that blend: maybe you can talk about your discovery that this is what you wanted to do professionally. You've mentioned the story of when you were spending time at JPL. [00:06:19] Tell us a little bit about how that happened. >> That was my second-ever job, as a student; it was a summer internship. The first task I had was programming up a database, and I think I finished it quickly, because, you know, [00:06:38] they then let me explore. One of the nice things back then was that there was no real security; you could just walk in and open doors. There was one room I went into where I remember seeing one of the very first rovers, and a manipulator arm that was practicing surgery, and I thought, this is just like the stuff in the books, except it's not just in the books. So I said, this is real stuff, not imaginary; I'm going to do this.
[00:07:06] >> And you did. Now, you share that in very few situations in your professional career would you find another black woman doing robotics. For many, many years you would look in the mirror and think, how unusual am I? In fact, you call yourself, for a lot of that time, an anomaly. [00:07:33] Dealing with that, did it give you any advantage? I can imagine the disadvantages, like having to prove yourself, but is there anything that you got from being that one-in-one anomaly? >> Yes, because every time you walk into a room where everyone is thinking one way, you say, I don't know about that. You bring a fresh perspective, precisely because you just don't see things the same way. [00:07:58] >> So the book spends a lot of time on this, and it was an eye-opener in many ways. You would think that once we built AI and robots, we would build them better than us in some ways, and yet somehow we have managed to transfer some of our worst attributes to them: they can be sexist, they can be [00:08:33] racist. How did that happen? There are folks here who are computer science students, but many of us are not. How did we manage to do that? >> One of the things is that when we design robots, we build them in our image, just like Frankenstein, right? We have this concept that the robot is supposed to work with us, or even be like us, and we don't really understand our own biases. So when we're creating AI, when we're creating robots, we're just mimicking us, not realizing that we're also building in
[00:09:10] our own biases. >> But how does that happen in practice? It's not like someone starts programming an AI and says, I want you to be as racist as I am, right? >> Yes, I'll give you a really good example, one of omission in the dataset. Imagine I'm a developer; I'm programming, and I'm building the dataset. Maybe I'm a black female, so I look at the data and say, okay, I'm going to put in women, I'm going to put in Hispanic people, I have black people. I look at this, it's great, I train my algorithm. Well, guess what: there are no 60-year-olds in there, because I'm not that old, and it doesn't even occur to me that older people use these systems too, that maybe they'll be walking outside and need to be recognized. And so then the systems get it all wrong in real-world scenarios, in surveillance, in policing, for the people whose data was left out, that was unseen. [00:10:05] Think of a system deciding whether someone is holding a weapon: if I'm a 65-year-old, I'd better stay home, because the system was never trained on people like me. >> So what's very interesting is that, in a way, that explains why it is so essential that we bring diversity into technology, right? Because if the process you describe means we use the datasets that we're familiar with, well, if AI is driven by the tech profession, which right now tends to be young, [00:10:48] white, and male, then that's what we're going to get. >> It is, and the thing is, and this is why I am so passionate about it, the world is diverse. If you think about what the world looks like, it's not even US-centric. I have my own biases centered on me, and even knowing that, I still think about things centered on me. So if you have a very small group training technology for the rest of the world, we're guaranteed, 100 percent, that part of the world won't be served.
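[Editor's illustration: Dr. Howard's dataset-omission example can be sketched in a few lines of code. This is a toy sketch of my own, not anything from the book: a "recognizer" reduced to a single made-up scalar feature per person, trained on data that omits one age group entirely. The group names and feature values are invented for illustration.]

```python
import random

random.seed(0)

# Toy "recognizer": one scalar feature per person, whose true value clusters
# around a different mean for each (hypothetical) demographic group.
GROUP_MEANS = {"young": 0.0, "middle": 1.0, "older": 2.0}

def sample(group, n):
    """Generate n (feature, group) examples for one group."""
    return [(random.gauss(GROUP_MEANS[group], 0.3), group) for _ in range(n)]

# The developer's training set omits older adults entirely.
train = sample("young", 100) + sample("middle", 100)

# "Train" a nearest-centroid model: one mean feature value per group seen.
centroids = {}
for g in {grp for _, grp in train}:
    vals = [x for x, grp in train if grp == g]
    centroids[g] = sum(vals) / len(vals)

def predict(x):
    """Label a new example with the nearest centroid among groups seen in training."""
    return min(centroids, key=lambda g: abs(x - centroids[g]))

def accuracy(test_set):
    return sum(predict(x) == grp for x, grp in test_set) / len(test_set)

# Deployment sees all groups, including the one that was never trained on.
for group in GROUP_MEANS:
    print(group, accuracy(sample(group, 200)))
```

The model is near-perfect on the groups it saw and fails completely on the group that was left out: it literally has no category for them, which is the "unseen data" failure mode described above.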
[00:11:20] >> So let's talk about some examples. Scare us about what can go wrong. There's of course the case of, oh, Facebook doesn't recognize my face because I am a black woman. Well, it kind of sucks, but okay, Facebook didn't recognize my face; it's not the end of the world. But there are some other examples you mention in the book that are genuinely bad. Give us some. >> I will give you an example in facial recognition, because you've heard of it. Just recently, this summer, there was a black man in Detroit who was falsely identified by facial recognition software and was arrested at his house, I think in front of his kids. You might think, well, that's somebody else's problem. But now imagine that same recognition software pointed at this summer's protests, full of young college students, people of all shapes and sizes. Now think about it: anyone here could be falsely recognized, and arrested. [00:12:27] >> Okay. Let's stay on the scary theme here: the legal system. Are we already, and I think I heard this in your book, using some of these systems to decide whether you get out on bail or not? What can go wrong? >> Yes. In some judicial systems, recidivism-prediction software is being used to decide whether you should stay in jail or not. And we already know that the jail system, at least in the United States and elsewhere, is already biased. We already know that, and now we are
[00:13:17] training on it. You might think, well, maybe the people it flags were already criminals, but many were just young students; the system makes it almost as if, once selected, they have a record, and it feeds on itself. >> So let's talk about solutions, because even if I'm incredibly well-intentioned, if I'm working in the field of AI applications for the legal system, the datasets I have at my disposal already reflect that history of bias, right? Say black people are getting more severe penalties for drug possession, and that's the data that I'm using to train my systems; in a way, I'm going to perpetuate those biases. So what do we do? >> [00:14:23] You can think about it like this: historically, women couldn't vote; today we can. If we know fundamentally what's right or wrong, then we can design the systems so that they are better, so that they learn better. If I'm designing a search algorithm, why is it that most of the doctors and surgeons it shows are male? That's a falsity, because medical school enrollment is now 50/50. It's perpetuating a past, a history, that we want to move out of. If we believed we couldn't change anything, nothing would ever change. And so I think we can [00:15:03] change that. >> How
do you do that? >> Part of it is that we need to take responsibility. I always say: when you search, you choose from the first three options. Do you ever go to the second page, or the third page? Absolutely not. The research shows that nearly everyone uses the top three results. But those top three lock in your bias and the engine's bias; if you dug deeper, you might actually find out something else. These AI systems update on our data: if everyone in the world literally went to the third page, the results would change. But of course people don't, and that's just one example. >> So you're saying that even the users of AI can help make AI better. You gave an example, I don't remember it exactly: if you search for who won the soccer World Cup, you're going to get the men's [00:16:04] World Cup, right, automatically. So how can I, as a user, help shape Google so that Google becomes less biased going forward? >> By giving it data reflecting our actual desires rather than our biases. [00:16:24] You might say "women's soccer World Cup": be very explicit, and it will give it to you, and that tells the system that "soccer" in general won't always refer to the men's game.
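[Editor's illustration: the point that "if everyone literally went to the third page, the results would change" describes a click-feedback loop, and it can be sketched in a toy simulation. This is my own sketch, not how any real search engine works: a ranker that orders items purely by accumulated clicks, under two invented user behaviors.]

```python
import random

random.seed(1)

def simulate(pick_rank, rounds=500, n_items=20):
    """Rank items by accumulated clicks; each round, one user clicks the item
    currently shown at the rank returned by pick_rank()."""
    clicks = {f"item{i:02d}": 0 for i in range(n_items)}
    for _ in range(rounds):
        # Python's sort is stable, so ties keep their previous order,
        # standing in for the engine's arbitrary historical ranking.
        order = sorted(clicks, key=clicks.get, reverse=True)
        clicks[order[pick_rank()]] += 1
    return sorted(clicks, key=clicks.get, reverse=True)

# World A: everyone clicks the first result, reinforcing whatever started on top.
top_only = simulate(lambda: 0)

# World B: users deliberately look past page one, clicking any rank at random.
explorers = simulate(lambda: random.randrange(20))

print("top-only world, permanent winner:", top_only[0])
print("explorer world, current top 5:", explorers[:5])
```

In world A, the arbitrary initial leader collects every click and can never be displaced; in world B, attention (and therefore the ranking) spreads across all items. That is the sense in which user behavior, not just the algorithm, shapes what the engine learns.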
Think about "governor": the engine already knows my affiliation, and it serves me governors accordingly. Now think about it: if I explicitly say "Democratic governor," "Republican governor," "independent governor," it's going to basically unbias itself, because I'm saying, look, I have a bias and I'm going to correct for it, rather than letting it assume there's one answer that's right. >> And that requires a lot of discipline on the side of the user. This is very interesting: so what you're saying is, it's not just that Google search was created biased; it's that we bring our own biases to our searches, and we're making Google biased so that it reflects us. >> It reflects us. And they try: every so often they'll do something like, we're going to fix this, and then what happens? We humans mess it up again, because it's retraining on us. >> So I tested this. You mention it in the book, and it hasn't changed yet, because I tested it yesterday and it still works. I asked, and I would love to hear the story behind this, I asked Google: do black lives matter? Try this at home; you can ask Google, you can ask Siri, any of them. [00:18:03] And I was quite surprised by the answer. But that's not an answer that the system learned to produce on its own; that's a human who has gone in and said, okay, this is not right, I have to fix it. But how do you do that when there are millions and thousands of questions we're asking that are going to produce those biased answers? This one is such a contemporary, big question that I can see someone at Google saying, okay, let's not get ourselves in trouble, let's change it.
[00:18:46] >> That's overriding, when people step in. What I'm always about is empowerment of users, because when we act as a collective, we have a voice, and things change. Again, an example: green. Think about how many companies want to be green. There's no direct incentive; it's not like they're making money from going green, but people said, no, this matters, it's about sustainability. It's the same with these things. If enough of us said, I do not want my voice assistant to default to a female voice, if all of us demanded an immediate choice of male or female, British or American, I think it would change, because the companies want to make sure they get it right. >> So that's one of the things I've been sort of obsessed over since the book. I had never reflected on the fact that the default voice on Google is a female voice, and I had never reflected on the fact that the default voice on Siri is a female voice. Why? First of all, why have all these different companies somehow made the same choice? >> They are catering to us. As you know, in the U.S. most human assistants are female, so it doesn't even occur to you that your assistant could be different; it doesn't occur to you that your AI assistant [00:20:07] could be male. They're catering to what they think we'll like, whereas a male voice in that kind of assistant interaction strikes users as weird, and they want to capitalize on our comfort. >> Okay, this is a very important question for Georgia Tech. In fact, I asked that question to our colleague
[00:20:32] Charles Isbell, dean of computing, who is with us today. Given that state of affairs and all that is at stake, knowing that AI and robots are quite literally penetrating every aspect of our lives, how we get diagnosed with a disease, how we get treated, whether we get arrested, and the list goes on: how do we make sure that more people of different backgrounds join the ranks of AI companies and of programs in the university, so that we start changing things? >> I think there are many ways to reach people who want to join tech, but one of our responsibilities as educators is to ensure access to AI education, and also to ensure it's ethical. It's not just teaching the latest Java code or teaching the latest AI algorithm; it's teaching what happens when that AI algorithm meets a dataset out in the world. I always say, when I talk about these stories of criminal-justice or recidivism AI: those developers went to college, right? So some of the responsibility actually resides with us as educators, because we trained them, and we didn't train that part of them. Now we say, well, those companies... but those companies' developers are our former students. So yes, I think we have to rethink how we teach. >> [00:22:11] Going to your own personal story, which is a big part of the book.
There are some characters who come into the story who had a very positive influence on you, and some other characters, not so much, and they probably didn't even know what they were doing when they reacted in certain ways. So for those of us who teach, or advise students, or mentor students who may be struggling: what can we do to be better, to make sure that we're not short-circuiting what could be incredibly successful careers? I know that's a big question. >> I think one of the things is to always lean in, whatever your work is. Sometimes you might look at a student and think, they just don't get it, they're never going to be a computer scientist. I think that's wrong, in part because some of that judgment comes from seeing someone who doesn't look like what you expect an engineer to look like. [00:23:16] So believe in every single person who comes to you, that this is going to be the person who finds a cure for cancer, because people believed in me that way. That belief means you are taking in all the anomalies, and who knows: maybe we would have found a cure for something by now if that one student 20 years ago hadn't been turned off. We don't know. >> That's right, that's right. I have more questions for you, but first a reminder that if people have questions: if you're watching us online, you can enter them in the chat, and if you're here in the room, you can pass them up, and they will be read out. Do we have any questions yet? [00:24:10] Okay, why don't you go ahead; I'll save mine for later. >> We have one here: Dr. Howard, you are an extraordinary leader, school chair, scholar, teacher, mother, and community champion. How in the world do you do it all? >> I was struck by that as well.
[00:24:33] >> I blend it all with things that I care about, and so it actually doesn't feel like I am doing 10 or 20 different things. I'm an educator, but my education also links to my research; I'm a mother, but my kids are into engineering and science, obviously. [00:24:55] So really, I do one job; it just has different tasks, which is how I see it. >> So, Dr. Howard, what made you decide to get into computer science in the first place? >> I'm a weird hybrid. I was classically trained as an engineer, but I taught myself computer science, because back then I didn't know what computer science was. I wanted to do robotics, and robots were built with hardware, so that meant engineering, but I liked the intelligence side, and so I got into that. My first programming language, I taught myself something called BASIC, and then Pascal, for those out there who know what those are. >> I know both of them. [00:25:42] >> I just aged myself, didn't I? And this was back in pretty much middle school, high school. Next question, please. >> You mentioned the problems of a technology being US-centric. How do you suggest expanding that worldwide? Because diversity is enormous. >> This has to do with the fact that most of the companies are concentrated in basically one area of the United States, with some tendrils elsewhere. What's hopeful is that companies are starting to go into other regions, companies in India, in Africa (which, remember, is a continent), where things are changing because of demographics and interest. So one of the things I do is push for these companies, but also push for programs, you know, for Georgia Tech to have a Georgia Tech Africa, so that we can also influence how students grow, and so there is a workforce there that can work on AI. >> Georgia Tech Africa: you heard it right there. Yes, yes, yes.
[00:26:52] It's very much in my heart. Yeah, please go ahead. >> This question might be for both of you: what steps is Georgia Tech taking to address the issues you're discussing in the curriculum, for example in the online master's programs? >> Being put on the spot completely out of nowhere, I actually have an answer to this question, which is that Ayanna is teaching the first class in the Online Master of Science in Computer Science built around ethics and AI. Is that not correct? Tell us about it. [00:27:30] >> That's smooth; I would have done the same thing to him. So I created a course on ethical AI for the students, and we're ramping it up. One of the nice things is that at some point it might be a requirement, just like algorithms. In it you learn a little natural language processing, a little image recognition, enough to be dangerous, but it's all under the theme of: what does this mean in terms of disparate impact, what does this mean for different groups, and so on. So we're trying to train the students so that they come out as masters of computer science with an understanding of the ethical constraints and implications. >> Let me jump in, then: we recently announced the establishment of [00:28:22] new centers for AI and ethics. And so the broad question is: why does it matter? Why is it so crucial that we devote efforts to that? >> The reason is, it's no secret that the world is adopting AI at an accelerated rate. Like I always say, the train has left the station; people are adopting it, and companies are pushing for it, because people say, I want this quality of life. And so we have to do it.
[00:28:57] And we have to do it so that we can retrain the thinking of engineers and computer scientists. I was trained on gadgets and cool things, and ethics wasn't really part of it. It really requires a concerted effort of engineers and computer scientists, as well as social scientists, to really carve out what this means. >> So, I know I [00:29:24] failed the test on The Hunger Games, yes, but I did read Asimov growing up. And it surprises me that when we talk about ethics of robots, we still go back to Asimov, who, by the way, wrote his laws as part of his fiction; he wasn't even trying to build a theory of ethics for robots. [00:29:53] We don't seem to have made much more progress since. So let's take some examples of why it is so essential that we come up with at least a serious ethical code of conduct for a robot. >> For those of you who don't know the Three Laws of Robotics, the first is, essentially, that a robot may not harm a human being. [00:30:21] An example of where that breaks down: "do no harm" sounds helpful, maybe we shouldn't create killer robots, and that's rule number one. But then what happens if you have a nanny robot in your home, and someone breaks in and is about to do something to your kids? [00:30:40] The robot says, well, sorry, I cannot harm a human, and so it fails to protect the kids. And so this is the dilemma: law number one, on its own, does not work. >> And of course you talk in the book about the classic dilemma of a self-driving car going down the street, and this is not merely hypothetical anymore; we already have Teslas and other cars with some of these systems built in. The car is going down the street [00:31:20] and finds itself in a pickle: if I don't do anything, I'm going to run into this pedestrian; there's a wall on my left, so I can't go that way; if I turn right, I hit a different pedestrian. It has to choose A or B.
Now, in the absence of reflection about ethical principles for that robot, the robot would have to choose at random, or something like it. So how do we get there? Because we don't even have those answers ourselves, as humans. In fact, you mention that different cultures answer the trolley problem differently. >> Which, I'll tell you, a lot of researchers kind of dislike, because they say this is never actually going to happen. But yes, depending on where you are, and ultimately it comes down to: we have to kill someone, that's really what it is. If you're in a place where socioeconomics dominates, you kill the poor person; if you're in a place where women don't have as many rights, you kill the woman. It's actually highly coupled with the values of the region. [00:32:37] >> So that means that now the world is going to be governed by the values of 27-year-olds in Silicon Valley. I mean, this is a frightening prospect. >> I'm not arguing with you; I have no answer, exactly. >> Okay, so how do we do better than that? Give us some hope. >> One hope is that we really think about how to bring individuals worldwide to the table in designing, developing, and deploying these robots; that's one part. The other is ensuring that we all have a voice, even if you're not a developer or a computer scientist. We've had movements this summer, from everyone, that have changed laws and regulations. We all have a voice as long as we speak up, and that applies to AI and robotics. I was so surprised when they banned facial recognition software in some cities, of all places. When I saw that, I was like, my gosh, there's a movement, and it's impacting the world, because people are voicing their opinions and just saying, hey, this is wrong.
[00:33:56] >> And yet you are part of the gang that is developing this technology, and you're telling us we should trust it? Actually, that's where we started this conversation, when you said you would not trust a robot to police your neighborhood. >> Saying that, I'm an optimist; I think the trends are actually positive. If you look at AI for medical diagnosis, yes, there's bias, but compare it to what it was like before: people's lives are being saved. If you look at robots for health care and education, there are more kids being reached, because they have access to AI. Everything is shifting in a positive direction, but we need to make sure that the positive applies to everyone equally, and that's really where my worry is. [00:34:48] >> Okay, I have another set of questions, but let me open it up for another one from the audience. >> "Previously, people who didn't want to go to college went to work in factories and so on, at a decent salary. However, these places of employment are no longer available, and automation is taking over more jobs, so this large group of people is disenfranchised. How can AI not add more division to society?" >> [00:35:16] So, I do worry about that, and there are efforts going on to try to bridge it; it's an old divide and a new divide. One of the things is that AI itself can be used as a partner for retraining. Say I'm not a computer scientist, and I want to be an accountant, but I can't go to school for accounting; still, I'm a smart person, I like numbers, I like math, all those things. Well, if you pair that smart person with an AI agent that does know the domain, [00:35:48] you can actually do training on the job and grow. We need to do a lot more of that. You see it in some places: there are some manufacturing plants, for example, that are using virtual reality and AI as scaffolding for on-the-job training. We just need to do a lot more of it.
[00:36:08] >> Right. A follow-up to that question: how do high school students who did not go to college, for whatever reason, prepare for this world of AI? I think you touched on it there, but would you like to go further? >> Yes. Actually, I'm going to highlight a program by a colleague of mine. [00:36:27] It's called DataWorks, and she is taking individuals who are not necessarily in college and making them data wranglers. AI needs all this data, and someone has to curate the data, label the data, and make sure the AI can run on it. So there is a whole set of new jobs that exists simply because AI is out there. Or think about what happens when your AI software goes wrong: we have the Apple Genius Bar, but we don't have an AI genius bar. We could train a whole workforce on what you do when the AI breaks, step by step. >> You seem to be moving back and forth between AI and robots. Is there something about robots having a physical form that makes them special? [00:37:19] >> The way I define robots and AI, I put them together: a robot is a physical embodiment, but it has an AI brain, while AI by itself is the brain, which can be virtual as well as embodied. That's why I use them interchangeably. In fact, if you Google "robot," some of the first image results will actually be AI virtual agents, not machines. But, given what we do know, and this is some of the research at Georgia Tech as well, when you have a physical embodiment, there is something different about the interaction: people actually bond more with the physical agent. It doesn't even have to be humanoid; it could be something like a Roomba that's automated, and you actually bond with it a little more than with the virtual one, as some of the cues from human-to-human interaction carry over into human-robot interaction.
[00:38:09] >> And if I may add to that, because I have been in your lab, and I love the different shapes of robots that you experiment with. Some of your own research actually deals with that question, right, of what type of robot inspires more trust. What have we learned? Is it that the more human-like the robot is, the more I trust it, or does that just creep me out? >> It's the behavior: if it moves and interacts with me the right way, it doesn't matter what shape it's in. One of the robots I have is basically a tent with arms; it's an emergency evacuation robot, and people trust it just as much as my humanoid robot that actually has a realistic shape and voice recognition, because it moves and gestures the right way. That's all you need. [00:39:03] >> That's very interesting. In my prior job, we actually deployed, we were the first campus to deploy, these little robots that deliver pizza and Starbucks to you. They're like a cooler on wheels: you place your order, the computer knows where you are, and ten minutes later this little guy shows up and you get your pizza. It fascinated me to see how people interacted with them; it was the first time I saw robots interacting with people at scale. And I feel like, since we didn't know what to do with robots, we treated them like pets. People reacted to them like, how cute; we almost wanted to pet them. And it's a cooler on wheels! >> [00:40:07] It is normal.
People are so surprised when I bring my robots out, like at a school or an outreach event; there are, you know, camera shots, and I'm like, okay, I guess this is unusual for a lot of people. But it doesn't matter what it is: if it moves, it's fascinating. And yes, people treated them like pets, even the people who did bad things to these robots, like trying to tip them over; you have this little robot drooping there in silence, and unfortunately that happens. [00:40:47] Give us some examples, because you mentioned some of those in the book too, right? Yeah, I did. These are just a couple of examples. There were robots in Japan, a couple of cases where kids basically, you know, threw soda at them and marked them up with crayon, because you can do anything to a robot; they'd never do that to humans, and they know that. There was a robot that was hitchhiking, and it was fine when it was in Europe, but as soon as it came to America it was beheaded within two weeks. That tells you something about how we are here in the U.S., right? So that's an example of, you know, violence against robots. You've seen engineers kick the, you know, robot dog, and people are like, "This is so horrible, so horrible." And you think: it's a robot, it's mechanical, it doesn't even know it's being hurt, but people still have this reaction of, well, that's just wrong. [00:41:51] Let me ask a couple more. I was very intrigued by how someone who has been trained as an engineer, who teaches in a college of computing...
...that you had the guts to write a book on sex, race, and robots. Like, could I pick any harder topics? Topics that, by the way, take you outside the realm of programming a computer and into the realm of the big questions we're asking ourselves. And you went there; because of the timing, you actually even went into the racial issues we're living with. I mean, you went all out. [00:42:40] Was this book inside of you, like it had to be written? So, I think about it as an engineer. You know, we're introverts, right? We don't really like speaking out; otherwise we would have gone into a different major. [00:42:59] And I think that we have to lead, right? Because as an engineer I'm going to get respect, and as a computer scientist I'm going to get respect; that's just the way it is. And if we don't weigh in, then when someone who's not a computer scientist raises these issues, people say, "Why should we care what you think? You don't understand." And so I think... [00:43:22] Right, you bring that layer of legitimacy. And I guess you check all the boxes. Let's see if there are any additional questions. So we do have a number of questions. Bias is drawing a lot of public attention now; which other ethical issues related to AI do you think are currently under the radar that students need to be aware of? Someone is looking for a dissertation topic right there. [00:43:55] So there are actually a couple. One is that we don't do a good job with people with disabilities and older adults.
I mean, even when we talk about, you know, bias and race, we talk very little about ageism, and rarely about those who have special needs. So that's an area where I think we need to do a lot of work. In fact, if you Google "artificial intelligence and disability," you basically get opinion pieces; you don't actually get hardcore "this is a tool that can enable and engage." So that's one area. The other, which is kind of under the radar, is this whole area of weaponized AI. [00:44:38] We kind of know about things like the bans on killer robots, but now, you know, these drones are out there and people are signing off on those weapons deals. I think a lot more needs to be put out about what's actually going on in the different countries, including the U.S., and what role the public has in what gets deployed. I could literally keep going; those are just two that come to mind. [00:45:10] And you also talk in the book about how, if robots are going to earn our trust, issues of transparency and accountability are essential. The problem, of course... I mean, I have a little bit of experience from the Stone Age of neural networks, back when they never worked.
[00:45:37] But I got enough of an understanding to know how impossible it is to know how they work once they've been trained, right? So you train one of these machine-learning algorithms on a data set, and now it's making decisions that can affect my life, like whether or not I get hired, whether I get a loan, whether I get bail, whatever it is, but there is no way to actually open up the machine-learning system and know why the machine did what it did. When a judge does something I disagree with, at least there is some hope I can get an explanation, but not with these systems. Is that sort of a structural impediment? [00:46:25] So there is a whole field out there called explainable AI. With the AI black box, data comes in, a decision comes out, and we just have to believe in its justice. But this whole world of explainable AI is about giving human understanding to the decisions that are made. So when it denies you a loan, it can tell you that one of the parameters in its decision was maybe gender or age. And that allows people to say, "That's totally wrong," whereas before it was just, "Yeah, that's right." Now you can say, "Your premise is incorrect, and therefore this is wrong." All right, let's see, there are a couple of last questions. Do you think that the way we treat robots reflects the true way we would treat humans if we did not have consequences in our society? [00:47:22] I think it does. Yeah. Ouch. And the very first word of the book's title is a good example: sex.
So, just so you know, there actually are brothels with robot partners, and we do know that in some cases there are incidents of violence. So what does this mean? It means that some people have inner desires that they're not yet willing to act out against humans, but the robot is there. One of the things we worry about is that at some point that might translate to humans. Next question: Do we have a way to measure and assess bias within AI systems, like we do with implicit bias, and how do we track changes in bias? So the community is working on this, on how you determine bias, and there is no one solution. At last count, I think there were like 21 definitions of what bias in AI is, and that was in January, so I'm sure there's double that now. There's no agreement, just like with implicit bias, where there's really no agreement on what it means. The positive is that we're still working on it; the negative is that there's no clear answer. Are there any programs that allow people who are not in an engineering program to tour labs or facilities for their personal educational purposes? [00:48:58] Yes. I will tell you, Georgia Tech traditionally opens up the labs, at least the robotics labs, once a year. There's a week in March or April called National Robotics Week, and during that time Georgia Tech opens up all the robotics labs, and universities wherever you are open up their labs too. So that's when we... [00:49:20] Okay, one last question, really related to that: how do you encourage bright high school students, especially students of color, to pursue degrees in computer science or engineering when they have the skill but not the interest?
Yeah, interestingly enough, at Georgia Tech there's one program out of the College of Computing called Constellations, and one of its focuses is engaging high school students, getting them not just learning computer science but also wanting it and thinking about it as a possible career. It does require you to be in touch, in contact with the younger generation, to say, hey, this is possible, this is good, this is what can happen. And, you know, I always show them how much a Georgia Tech grad makes, even as an undergrad student, and they're like, "What do I have to learn?" So that's another motivation. [00:50:21] Let me begin to wrap this up, first by giving you a chance to add anything else you would love to highlight about the book, or maybe about your journey in writing it, or what you've learned from actually writing it, anything you would like to add. Yes, a couple of things, just as a summary. We didn't talk much about the stories, but one of the things is that a lot of times students or individuals will see someone who's successful and think, "They must have known where they were going, what they were going to do." We all have a story, and a lot of times that story shapes your path and your career. So I always tell the story, because it happens to all of us; we all have stories of being wronged somewhere, of doubting ourselves. [00:51:14] We definitely do. But I'll tell you what: I don't know how many of you watched the latest SpaceX NASA launch the other day. I still get chills every time we send humans into space; it's amazing that we've been able to do that. What was different this time, though, was that the three people commenting on the launch were three women.
[00:51:48] The engineers narrating what was going on, and of course the astronauts we were sending to the International Space Station were also very diverse, and I couldn't help thinking, in those moments, about the message that it sends to kids, back to that last question. In a way, I feel like... [00:52:14] Fair or unfair, people who have been the anomaly, as you have, carry that with them: that sort of added responsibility. But it can also be a lot of fun, encouraging others to follow suit, and I appreciate very much what you do in that regard. Listen, this book is phenomenal, and the narration is absolutely a lot of fun. Let's give our guest a great round of applause. Thank you, thank you, thank you.