Her research is on the good side of this. The focus of this forum was just to get us out of our silos, to sit for an hour or two and talk about a particular subject from different perspectives, and to bring a larger audience together who want to make comments and raise questions. Too often this kind of discussion stays within our own college or our own courses. The College of Design has the architecture school, planning, building construction, industrial design and music, and the notion behind these forums is to bring that together; this first one is on urban automation. There is so much going on in this city, and a lot of people are telling us there is a great deal of work going on that relates to this. We are not the computer scientists or the roboticists, but the work touches the college, and for the first time we have gone outside the college: our first speaker is in the School of Public Policy and also serves as director for the Center for Ethics and Technology at Georgia Tech. It's really nice to have someone who really knows something about values be part of this, to speak first and help focus the discussion, and also to talk about these latest topics. After that, our other panelists, who direct centers here, will each speak from their own perspectives.

Well, thank you for the invitation. Can you all hear me all right in the back? OK. Thank you for the invitation again, and thank you for the introduction. My name is Jason Borenstein. I work for the Office of Graduate Studies along with the School of Public Policy, and a lot of what I do here at Georgia Tech is ethics related, the ethics of technology. One of my main research areas is related to robotics. Within that area I work a fair amount on self-driving cars, autonomous vehicles, but beyond that I work on issues related to health care robots, the ethics of health care robots including robotic exoskeletons. Some of what I look at is trust issues, how people might under- or over-trust robots and what kinds of harms may emerge as a result of that. Now of course today, because the focus is urban automation, I will primarily discuss autonomous vehicles; that will be the primary thing I mention. One of my main goals for this short introduction is to talk about some of the ethical issues at different levels of granularity, or different levels of scale. I'll largely talk about ethical issues in three main realms: first, as related to an individual vehicle, where there is of course a lot of effort
relating to an individual self-driving car. Urban automation means lots of things, but what I'll largely focus on today is a vehicle, and more specifically a vehicle used by passengers; in other words, I'm not largely going to talk about commercial vehicles like trucks, I'll largely talk about what we might all use down the road, so to speak: an individual passenger autonomous vehicle. In terms of the individual vehicle, the literature discusses the trolley problem a fair amount, which I suspect, if you've talked about autonomous vehicles, you have probably run across to some degree. I'm probably not going to focus too much on that particular issue, although of course I'm happy to discuss it later if you'd like; but among the ethical issues emerging is of course the trolley problem, the so-called no-win situation where someone is going to be harmed, a true ethical dilemma. So that's some of what's discussed relating to an individual vehicle. Also, of course, there are issues related to privacy. We're already aware that our data is being shared with lots of different kinds of companies and entities; well, this will likely continue with these vehicles, because there may be a lot of different entities interested in the data about where the car goes, which of course could reveal our travel patterns, et cetera. There are also issues related to how these vehicles are going to interact with pedestrians. For instance, when we're going through a crosswalk we sometimes informally or subtly communicate with drivers of cars; we may give them a nod or other nonverbal communication to go ahead, we may wave them on, so to speak. We may actually have the right of way but want them to go first because we want to take our time, for example. Whether those kinds of communications can be built into a car largely remains to be seen. So that's one level of granularity, or one level of scale, we could talk about. The second level is when these technologies interact with one another. First I was talking about an individual car interacting, let's say, with pedestrians; the second level is different types of self-driving vehicles on the same roads and how they interact with one another. In the literature and in discussions about autonomous vehicles we often refer to levels of driving automation, level 0 to 5 (see the sketch following this talk). For those who are not terribly familiar with these: level 0 essentially means there is no driving automation in the car, a traditional kind of car that we would have had on the road in prior decades; level 5 essentially means that the car can do essentially anything a human being could do in a car, that it more or less mimics or can emulate anything a human driver can do. For the foreseeable future, in the coming years and decades, we'll probably have cars with different levels of driving automation on the same roads, and this adds complexity to trying to predict how these cars are going to interact. For example, around level 2 or level 3 driving automation the car can take over some tasks;
for instance it can manage cruise control, it may be able to parallel park, it may be able to drive on highways, as long as there is some level of human supervision. But level 2 or level 3 driving automation does require a human to supervise it and, especially in safety-critical situations, to intervene. Level 4 and level 5 more or less do not expect a human being to intervene. If you have these different levels of automation on the same roads, it adds layers of complexity in terms of predicting how these cars might interact with one another: one car might not expect a human being to intervene in its driving tasks, but another might, and for the ones that require a human being to intervene we know that human beings are sometimes distracted, sometimes impaired, drinking or doing other kinds of activities that can slow their reaction times. This of course adds complexity to predicting how they may interact on the roads. So that's another layer of ethical issues to look at: the safety risks and related problems that can occur. And then lastly I'll talk about ethical issues at the broader system level. Among the issues I'm getting at here, at this third tier, is how our cities might change due to the introduction of these technologies. For example, how are intersections going to change, where pedestrians can cross, are we going to add lights to intersections or remove them? There are already predictions that we may have centrally managed intersections without the need for lights: if most of an intersection's traffic is autonomous vehicles, then for efficiency and speed maybe you don't need traffic lights if the flow of traffic is centrally managed. So you might ask about the ethical issues emerging there. The issues could pertain to social justice: for example, if you prioritize traffic for the autonomous vehicles and say that's the first priority, then what happens to bicyclists and pedestrians? In centrally managed intersections that are more or less built around a design for the technology, what does that do to other modes of transportation? And there are other kinds of issues relating to infrastructure and the design of cities that may be built around these technologies, again largely autonomous vehicles. What does it do to parking spaces? There are lots of different predictions about whether we'll need more or fewer as a result of these technologies. I would say, at least in the literature I'm reading, it is not straightforward to say parking spaces are going to go away; it really depends in part on the type of ownership that may occur with these vehicles, meaning roughly, are we individually going to own the vehicles or are we going to rent them, what is going to be the ownership model? So again, that's just a broad introduction to different types of ethical issues emerging related to the technology, and hopefully we will discuss these during the discussion period later on. Thank you.
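A minimal sketch of the levels of driving automation described above, expressed as a small Python data structure. This is an editor's illustration; the names and the helper function are assumptions rather than anything from the talk or from the SAE standard itself.

```python
from enum import IntEnum

class DrivingAutomation(IntEnum):
    """SAE-style levels of driving automation, as described in the talk."""
    NO_AUTOMATION = 0        # human does everything
    DRIVER_ASSISTANCE = 1    # a single assist feature (e.g., cruise control)
    PARTIAL = 2              # car handles some tasks; human must supervise
    CONDITIONAL = 3          # car drives in some conditions; human intervenes on request
    HIGH = 4                 # no human intervention expected within a limited domain
    FULL = 5                 # car can do anything a human driver could do

def requires_human_fallback(level: DrivingAutomation) -> bool:
    """Levels 0-3 still rely on a (possibly distracted or impaired) human driver."""
    return level <= DrivingAutomation.CONDITIONAL

# Mixed traffic: vehicles at different levels sharing the same road.
mixed_traffic = [DrivingAutomation.PARTIAL,
                 DrivingAutomation.HIGH,
                 DrivingAutomation.NO_AUTOMATION]
print([requires_human_fallback(lvl) for lvl in mixed_traffic])  # [True, False, True]
```

The point of the helper is the one made in the talk: vehicles at levels 0 through 3 still depend on a human who may be distracted or impaired, which is part of what makes interactions in mixed traffic hard to predict.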
OK, all right, so we're going to shift gears. And full disclosure: in no way am I an expert on ethics and values in urban automation, but I have been doing a lot of research, and have been for some time, on the diffusion of robotics and automation, the implications for local labor forces and local economies, and the implications for our cities and how they develop. So let me share with you some key findings from a key article I've read that tells me a lot about what I think we should be thinking about and many of the unresolved issues, and then I'll switch to talking about some of my own research. The first part really thinks more about public sector perspectives within the urban context, and what these public sector perspectives could be or should be, according in particular to Kitchin, who published a really interesting article in Philosophical Transactions of the Royal Society on the ethics of smart cities and urban science; then I'll move on to talk a little about perspectives from the private sector. So, on the public sector role in urban automation and smart cities: Kitchin defines data-driven urbanism as a computational understanding of city systems that reduces urban life to logic and calculative rules and procedures, and because of that there are serious implications for privacy, datafication, dataveillance and geosurveillance, and for data uses such as social sorting and anticipatory governance. There is a range of entities collecting this urban data, including our utility companies, our transport providers, our mobile phone operators, travel and accommodation websites, social media sites, crowdsourcing and citizen science, as well as government agencies and public administration. So many of those I just listed are not really public entities; however, they rely on public rights of way and are often regulated by the public sector. A few others I should have mentioned: financial institutions and retail chains are all collecting data, as are private surveillance and security firms, emergency services, which is back to the public sector, and, increasingly, even our home appliance and entertainment systems. Because these systems are in many ways governed by the public sector and rely on public rights of way, I believe the public sector has a key role to play in determining how we define and treat ethics and values in urban automation. The promise of urban big data, and what is very exciting about the research and work going on, is that it can produce a highly granular, longitudinal, whole-system understanding of a city, enabling it to be managed in real time; that's the promise, that's what people are trying to achieve. The original realist epistemology driving the production of urban data held that such data can be unproblematically abstracted from the world in neutral, value-free and objective ways. This was subsequently criticized by social scientists for collapsing diverse individuals and complex, multidimensional social structures and relationships into abstract data points and universal formulae, and for ignoring the role of politics, ideology, social structures, capital and culture in shaping urban relations, governance and development. So Kitchin identifies two key ethical issues in what is called data-driven urbanism, and they fall into the categories, first, of datafication and privacy, and second, of data use, sharing and repurposing.
Because of time, I'll just focus, for example, on the privacy issues. There are multiple ways urban data collection can breach privacy. It can be our bodily privacy, for which we have the expectation to protect the integrity of the physical person; territorial privacy, meaning personal space, objects and property; location and movement privacy, which I think is fairly self-evident; communications privacy; and transactions privacy, to protect against the monitoring of queries, searches, purchases and other exchanges. The key point is that all of these can be breached by urban data collection. And in doing so, urban data science itself is not providing an objective, neutral and all-encompassing view of the city, which initially was one of its big promises; rather, it provides a particular view of the city through a specific lens, and this is something that needs to be thoroughly mapped out and addressed. One of the issues, again because of the complexity of the subject, is that some argue these privacy concerns can be interpreted as individuals gifting or sharing their data on a voluntary basis, or that what they get for their data is worth it in tangible returns, so that there is no ethical dilemma in the collection of that data. But the reality, and many of you may have had your own experience with this, is that you click to download an app, or make some purchase, or participate in some other form of what will ultimately be urban data collection, and in order to have access to that app or service you have to agree to their privacy rules; it's compulsory, you don't have any choice. So if you're going to participate in many aspects of the urban economy in our society at this point, it is compulsory that you accept terms which breach your privacy. The benefits of what is called sharing data, and the ways in which it is collected, are stacked in favor of those capturing the data, which can subsequently be monetized, shared with third parties and used against individual interests. Ethical practice would be to require that those giving the data have full details of what data are being generated and what additional data are being inferred from them, and have shared control over, and benefit from, how data relating to them are subsequently used; this requires full notice and consent and the key value of transparency with respect to the actions of data controllers and processors. So that's an overview of some of the dilemmas, with me not providing any particular solutions except to say that this should be the way in which it's approached.

Shifting now to concerns about private sector urban automation, I'm going to briefly finish up with something that illustrates my own particular view through a specific lens, the lens of economic development planning, and briefly talk about how e-commerce, which is very much an urban-data-driven phenomenon, is changing urban supply chain logistics. So, the disclosure on my own particular view is that
I make a distinction between what we call traditional economic development, which is focused on the creation of wealth and jobs, and sustainable economic development, which my values and ethics say needs to be more than economic growth and job and wealth creation, because growth alone doesn't take into account the distribution of that job and wealth creation and who has access to a higher quality of life and standard of living. As probably many of you know, one of the biggest societal issues we have in our own country and around the world is growing inequality, and traditional economic development really does not focus on inequality; it focuses on growth. So my own definition, and that of those who work more from a sustainability perspective, is that economic development is achieved when a community's standard of living can be preserved and increased through a process of human and physical development based on principles of equity and sustainability: it establishes a minimum standard of living for all that rises over time as our economies expand, it reduces inequality between demographic groups based on age, gender, race, ethnicity and other criteria and between spatially defined groups such as indigenous populations versus newcomers or central cities versus suburbs, and it promotes and encourages sustainable resource use and production. Looking through that lens at e-commerce, a very interesting set of issues is arising. E-commerce, for those of you who don't know, is the buying of goods over the Internet. It's one of the biggest forces driving the restructuring of our retail and wholesale chains and outlets, as well as how goods are transported, and it's having a strong footprint on our metropolitan areas. What we're seeing is that there are enormous warehouses, some of them as big as 22 football fields or more, appearing on the metropolitan fringes and impacting smaller communities, and then there is the issue of what we call last-mile delivery into urban centers, because one of the key elements of e-commerce these days is the expectation that you will receive whatever you have ordered potentially in as little as an hour, but certainly within the span of a day or two, and that drives many trips into urban centers. It is also creating a lot of competitive pressure on the supply chain and the warehouses, because they have to handle so much more activity and they are competing with others, the Amazons and the Walmarts; everyone is competing with Amazon, right, so you have Amazon and then the rest of the supply chain, and they're trying to move faster and to do so less expensively, so that packages can be delivered more cheaply. There are a lot of land use implications, but I'm going to focus on the job implications. The issue is that automation is being used heavily to try to make the e-commerce supply chain in our metro area competitive with other major areas, and that in turn is introducing more robotics and automation into warehouses. There is a whole aspect of this that concerns warehouse workers, and then truck drivers, who are the other big group; I'm going to stay away from truck drivers, since Jason was already talking a little about autonomous vehicles and there are autonomous trucks in development. But the issue is what is happening to these workers. Let me back up for a second. Here is an
ethic, or a value, put forward by the United Nations Universal Declaration of Human Rights, which says that everyone has the right to work, to free choice of employment, to just and favorable conditions of work, and to protection against unemployment. That's part of the Universal Declaration of Human Rights, which is not something we see referred to very much as we go through major restructuring such as what's happening in e-commerce; but if we want to employ that lens, then we look at the work and see how it does or does not meet that human right. Basically, the work in warehouses is what we call precarious work: low wages, unstable work arrangements, temporary employment that further reduces wages, underemployment, economic insecurity, no employer-provided benefits, and a real lack of legal and regulatory protections. That is the work in the sector that is growing so much, and for which there is a shortage of workers, at the same time that automation and robotics are being introduced into it. Companies employ this model because they work in a market-based economy with very strong competitive pressures, which I was alluding to; the global competition means they need to outsource a lot of their activities to third parties in order to maintain their competitiveness, and also, in many ways, to shield themselves from some of the labor force issues they are having to deal with, since they are having to offer conditions that are less than desirable. So the jobs themselves have deteriorating quality. A lot of the jobs now go to temporary workers, particularly in the Amazon warehouses going up all over the country; there are several in this metro area and more in process. It is temporary work: they have to ramp up during the high season for e-commerce, which of course is the holiday period, and then there is a core of permanent workers, and the temporary workers are always hoping they will make it into the core jobs. But the only way they do that is by showing, while being monitored continually with sensors and other forms of automation, that they are the most productive workers, and these workers work twelve hours a day on their feet, running, collecting packages and getting them where they need to go. The robots being introduced are mobile robots; we don't yet have the picking problem solved, a robot being able to pick a package off the shelf, but the mobile robots are moving the goods closer to the workers so that they can pick them, and in some ways that's an improvement. But there is a lot of potential for automation to go into the warehouses in ways that may eliminate the jobs or may change them, and that's where the question comes back to human rights and the quality of the jobs that are there. Can we, as we adopt automation in the warehouses, explicitly be thinking about this and looking for ways to adjust this work so that it improves the conditions and parameters of the work, and shares with the workers some of the increased productivity and revenue that comes from it? So that's part of the perspective I would suggest is relevant to thinking about urban automation of work. It also applies very much to manufacturing, which is the other sector I'm looking at, and I will stop there because I think I've taken too much time.

OK.
I think it all converges. I don't know that much about this particular subject, but I do think it's important to care. So, I'm Dennis Shelden; most of you know I run a program around building information modeling and simulation, that whole side of building representation, which is now becoming key to the interaction between information and the built environment. Where I came from was using 3D models and data for design, engineering and construction, and the inevitable consequence of that is that now, when the building is done, we have these digital assets, what's called a digital twin, which is essentially a digital asset that is a mirror, both spatially and in terms of information, of what's happening in the built environment (see the sketch below). And by connecting the digital to the physical through sensors and actuators, there is this increasing convergence between the digital and physical aspects of the built environment. So that's where I've been coming from. Historically this stuff has been fairly dumb, but it has been ubiquitous: there are many sensors in this room, and they are here for one purpose, which is to turn things on and off and optimize the usage of resources, but in fact the capabilities can be used for many other things. I have to say I'm excited about this possibility, the potential impact of digital understanding and its increasing convergence with physical experience and operation in the built environment; this is very much a piece of where things are going for me. I sometimes get a little bit polemical, or overly excited, and talk about the built environment as the next-generation platform for computing, and I don't think that's necessarily just metaphorical. The notion is, again, this phone is a great prop, right: now I have one of these, and pretty soon it will be on my fingernail, but more importantly it's going to be in the world; we're not going to need devices for interaction with information, it's just going to be around us. Some of the applications I'm getting excited about: historically it has been building usage and performance, how do we save energy, how do we make sure the lights aren't being turned on when we're not using the space. Increasingly, in things like the workplace, there is a recognition that most office buildings have lots of names on spaces but the spaces are empty almost all the time; I think utilization is something like 25 or 30 percent, so there are new ways of thinking about work and the relationship to personal space in the commercial or business environment, and a lot of work on tracking and monitoring the usage of space to make it more optimal in terms of consumption of space. But there is something more profound going on, where there is a potential interaction between the ways we use space as workers and productivity, and that is something you can monitor very effectively and start forming hypotheses about.
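A minimal sketch of the digital-twin idea described above: a digital asset that mirrors physical spaces, is updated from sensor readings, and drives actuation such as lighting. This is an editor's addition; all class and field names are illustrative assumptions, not part of any particular BIM product or of the talk.

```python
from dataclasses import dataclass, field

@dataclass
class Room:
    """One space in the digital twin, mirroring a physical room."""
    name: str
    occupied: bool = False
    lights_on: bool = False

@dataclass
class DigitalTwin:
    """Spatial and informational mirror of a building, fed by sensors."""
    rooms: dict = field(default_factory=dict)

    def ingest(self, room_name: str, occupancy_reading: bool) -> None:
        # Update the digital model from a physical occupancy sensor reading.
        room = self.rooms.setdefault(room_name, Room(room_name))
        room.occupied = occupancy_reading

    def actuate(self) -> None:
        # Close the loop: switch lights to match occupancy in the physical building.
        for room in self.rooms.values():
            room.lights_on = room.occupied

twin = DigitalTwin()
twin.ingest("Conference Room 204", occupancy_reading=True)
twin.ingest("Office 310", occupancy_reading=False)
twin.actuate()
print({r.name: r.lights_on for r in twin.rooms.values()})
# {'Conference Room 204': True, 'Office 310': False}
```

The same occupancy stream that drives the lights here is, of course, the data that the later discussion of monitoring and privacy is about.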
There is also a lot of interest, as Nancy said, in retail, and in how we can better understand people's perceptions and experience of commerce and consumer interactions. There is a term now called impressions: if you go into a store, you may not actually buy something, but people can figure out, based on your behavior in that store, how interested you are in something, and people are coming up with dynamic pricing and advertising and things like that. So you don't get very far before the questions of ethics around information usage come up: who owns it, where is it generated, what are my rights around it? All of the things that are happening online start becoming part of the spatial experience. Of course there's Alexa and things like that, but the difference, as people are remarking, is that with a web page you can close your browser, whereas you can't close space; so there is a possible inescapability to it. For a long time, at least mentally, and I don't think I've said this publicly, I've said, well, I'm not an ethicist. But I don't think that's an excuse anymore; even though it's not my area, it's something I have to be interested in. I would also say that one of the things I'm fascinated by is the history and narrative around what's sometimes called cyberpunk architecture. There's an interesting heritage in science fiction about these kinds of things, lots of really interesting imagery in movies and speculation about a future experience of space that is a hybrid of physical and digital, augmented reality, all that kind of stuff. Really, the next generation of architecture has to assume it's a hybrid digital-physical experience, and I think that's probably a fun, although often dystopic, thing to think about. So, as I was realizing over the last couple of days that I had nothing prepared to say about this, at 3 a.m., not last night but the night before, this thing showed up, which is a new book by Shoshana Zuboff. Some say it will be the Thomas Piketty equivalent; it's called The Age of Surveillance Capitalism, and it will be, if it isn't already, a New York Times bestseller; it seems like it will be because it's striking a nerve. I'm going to read a specific passage from it; let me make sure I have the right one. OK: "Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to service improvement", so you get a better experience, "the rest", and this is the really fascinating part, "are declared as a proprietary behavioral surplus." So there's the stuff you're aware you're exchanging within the environment, and then there's the stuff that can be inferred, and current legal and political boundaries have
no recognition even that it exists. There are these ancillary hypotheses being made about your behavior that you don't even know about, and the only people who know about them are the people collecting and inferring from that data. Continuing: "This behavioral surplus is fed into advanced manufacturing processes known as machine intelligence and fabricated into prediction products that anticipate what you will do now, soon and later. Finally, these prediction products are traded in a new kind of marketplace called behavioral futures markets." So this is intended to make predictions, individually and collectively, about what we will do, not just what we're doing already, and the hypothesis of the book is that this is a new model of capitalism. Again, you can't get very far into this stuff without getting into the politics and the economics of it. So really quickly, to wrap up, since I think we're all running long: I haven't read the 600-page book because it just came out, but there's a great synopsis of it in the Guardian, and the Guardian kind of straddles things; they're very liberal but they're also techies, so I think they see a set of possibilities around this. The conclusion, and I don't think it is just their conclusion, may be that the availability and control of personal information isn't a technological inevitability. Privacy can be protected, but it's going to take democracy and the will of the people to take a position on this, because right now part of it we don't even see; there are aspects we don't see. I would point to some positive technologies that deal with this (see the example below). One is encryption: everybody uses encryption; it is a public, open-source technology that is ubiquitous for protecting people's information; it's very simple to do and almost impossible to crack, which is why governments are trying to put back doors into these things, because it actually is fairly bulletproof, at least today. There are cameras that do on-board processing of information, so they may infer behaviors about crowds but not actually be able to do anything with the video itself. And, as my wife mentioned when I was talking to her about this morning, there's Tim Cook; and certainly the big one is the European General Data Protection Regulation, GDPR, which is a set of democracies staking out a legal position on this. There's a lot of pushback in many parts of the world, but Tim Cook of Apple has suggested that GDPR become the law of the land in the U.S., and above and beyond the do-goodness of that, there's an interesting hypothesis behind it, because Apple insists that they don't make money from your data; there's a reason you pay a thousand dollars for their phone and sixty dollars for somebody else's, for Google's: Apple makes money on the phone, they don't make money brokering your data. So I'll just leave it at that, but these are important topics; they're not unsolvable, but they are important for us all to be thinking about and acting on.
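As a concrete illustration of the point about encryption being simple to use and hard to crack, here is a minimal sketch using symmetric encryption from the widely used third-party Python cryptography library. This is an editor's addition; the message and the key handling are purely illustrative.

```python
from cryptography.fernet import Fernet

# Generate a secret key; in practice this must be generated once and stored securely.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypting personal information is a one-liner...
token = cipher.encrypt(b"travel pattern: home -> office -> gym")

# ...and without the key, the ciphertext reveals nothing useful.
print(token)

# Only the key holder can recover the original data.
print(cipher.decrypt(token).decode())
```

The Fernet recipe bundles a standard cipher with an integrity check, which is why a couple of lines suffice; the hard part in practice is managing the keys, not the encryption itself.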
OK, great. I'm going to build upon what Nancy and Jason and Dennis have said, and those are great points, and dive a little bit into the space I'll be looking at specifically: how do we consider all of this in terms of accessibility and inclusion and ethics? This is one of those conversations we've been having for years, yet we're still standing at that crossroads where disability and ethics and technology really intersect, still trying to navigate and figure out what the next steps are. It goes beyond beneficence and justice and autonomy; it really does get into the specifics of what y'all have been discussing: who really does drive the driverless car, who's in charge of that, whose information is being collected and how is it being used, and what does the workforce look like, is there a place for people with disabilities and true inclusion in the workplace as we move forward? So, thinking all of those things through: back in 2003 I did a workshop, and it's been repeated a few times, so this has been one of those issues we've been wrestling with for a long time, looking at ethics, disability and technology. It was called "Click to Continue", and the point was: what information are we giving away, what's the exchange, when you have a disability and you actually need that app or that piece of software in order to live more independently, or to do your job, or to navigate your community? What information are you giving away? Also, and we've talked with folks about this, the language used in those terms of service is exclusionary; it is not written at a third-grade or second-grade level, it's often written at a much higher level and in legalese, so people don't actually know what they're giving away when it comes to their information. That's one of the things we've been struggling with, or thinking about, in the space we're in. We're also finding some big gaps between policy and practice, whether it's looking at autonomous vehicles or autonomous technology and seeing what Section 508 says, which is the law, the Rehab Act: even though it doesn't specifically mention some of the technologies we're looking at in urban technology, it actually does cover them, and it's important to make sure they're accessible and that everybody is able to access them. WCAG 2.0 and 2.1 are web accessibility guidelines that take that conversation even further, and I would say, ethically, and this is me, and I feel we have to continue to push the conversation, that we should be writing to those and thinking about inclusive design every time we design; whether it's FEMA sitting at the table looking at their policies and struggling with the ethics of how accessible Puerto Rico needs to be made as we redesign and rebuild it, I would argue that it needs to be completely accessible and that we have to incorporate universal design principles. Thinking about all of that: this week, on Monday, I was doing a keynote at a conference, and there was a representative there from the President's committee on intellectual and developmental disabilities, a person with intellectual disabilities himself, and as I was talking about some of this, about the ethics, he said, "I'm afraid", and those were actually the terms he was using, "because we have a history of being excluded; so how do we make sure that we are included? If we don't help write the policies and help design, the story is bad for us."
So, thinking about how that layers onto some of what you're saying, Nancy, about the way data is being used and how we collect it, and, Dennis, onto what you're saying about what data is actually being collected with the lights and the sensors around us: if we're not part of that conversation, what does that look like? There's also a history of the design process not including, from the very beginning, folks with disabilities or different abilities, and of problems that could easily be overcome. We're still seeing buildings, I was doing some work up in Canada and there's this beautiful building, only two years old, yet the accessible entrance was still through the back door, and ethically I have trouble with that; we shouldn't have anybody going through the back door. So there are even simple things when it comes to design: yes, it can be beautiful on multiple levels, and so, ethically, should we be designing for exclusion or should we design for inclusion? Elevators: we know a lot about how we could make elevators more accessible, using our voice, using sensors, any number of things, some of what you're talking about, Dennis, when it comes to intuitive shopping and what have you. I bet if I'm standing in front of an elevator you could probably intuit that I want to go somewhere on that elevator, and then figure out where I want to go. And thinking about rebuilding in general, the World Health Organization has jumped in on this; there's an article coming out soon specifically about ethics and widespread global exclusion when it comes to technology, and it's frustrating to see that still happening. But I do believe that we here at Georgia Tech, and a lot of our colleagues in industry, are dancing in a space where we can actually make a big difference. Tim Cook, and I was glad to hear you bring him up: when he was speaking at Auburn University in 2013 he actually said that people with disabilities have been left in the shadows of mainstream technology development. He saw it as an ethical dilemma, and he said, we're going to design for inclusion. One of the great things is that whenever you buy your cell phone and I buy my cell phone, I can use it the same day you use it, even though I use a screen reader; in the past I would have had to download something, or upload something, or any number of things, or sign my rights away. So the main idea there is that inclusive design has served Apple well. With Google, and I'm not knocking Google, you still have to download things and know how to navigate them, and we find it's not intuitive how you get to the accessibility features, so there's something to consider there when it comes to ethics. John Sanford and I, at the very beginning of the new year, had the great opportunity to meet with some of our colleagues at Microsoft and Amazon, and we got to meet one of the very first employees Microsoft has hired specifically focused on disability, technology and ethics; that's her space, that's where she sits. One of the questions she is raising is that we have to question everything, even things as simple as Word. With Word now you can scale the readability of a text: the same text could be written at a college level, but I could also scale it down to a second-grade level. The problem is they're collecting a lot of data on that; they know who's doing it and what level they're doing it at, and that's a problem. So she sat at the table asking: does an employer need to know? A teacher might need to know what level a student is reading at, but maybe the employer does not need to know that. So, thinking ethically, if you have a print-related disability, what does that really look like?
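To make "reading level" concrete, here is a rough, self-contained sketch of the Flesch-Kincaid grade-level formula, the kind of metric behind the readability scaling and the terms-of-service concern discussed above. This is an editor's addition; the syllable counter is a crude heuristic, so the numbers are only indicative.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Approximate U.S. school grade level needed to read the text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59

legalese = ("The undersigned hereby irrevocably consents to the collection, "
            "aggregation, and commercialization of behavioral telemetry.")
plain = "We collect data about how you use this app. We may sell it."
print(round(flesch_kincaid_grade(legalese), 1))  # far above a typical reading level
print(round(flesch_kincaid_grade(plain), 1))     # closer to the early grades
```

The ethical point raised in the talk is separate from the formula: a tool that rewrites text to a chosen grade level also learns, and may log, what level each reader needs.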
They're doing the same thing with PowerPoint: is it ethical that we still allow the development of non-accessible PowerPoints when we could actually design every PowerPoint to be accessible to everybody? What about video: should captions just be somewhat accurate if you're going to caption on YouTube (the law doesn't specify, it doesn't say 32 percent accurate), or should they be required to be 100 percent accurate, as Dr. Ballenger insists upon, which I agree with? Also thinking about gaming and other areas: how do we make sure we're thinking about inclusion from the controller and the console all the way to the avatars, and what would it look like for my avatar, in the game I've been playing with my son, to actually use a wheelchair? Is that OK? It should be OK, right? So I think inclusion ties right back to ethics, and what are our ethics and, just as you were saying, Nancy, what are our values? Thinking about all of that: with Amazon, one of the things we were talking about and considering is that maybe it's about gradations of privacy. When you actually say yes, when you click to continue, how long are you going to keep the data, what are you going to do with the data, where does that data live, and how is it going to be used in the long run, and how do we make sure we have a solution there? There are also very specific safety issues, and some of the folks John works with are actively in dialogue about whether Alexa should be able to unlock your door. Here's the deal: in Georgia right now there are a bunch of folks who cannot unlock the door of their own home, because they can't physically operate the lock. Alexa could actually help: you can use your voice to lock the door, but in some cases you can't use your voice to unlock it. So wrestling with that means thinking about safety in a very tangible way. So, as I'm concluding, I encourage you to think about whether you are employing an ethical framework throughout your design process. What does that really look like? Have you really wrestled with some of these things? If you want to have that dialogue, I'm happy to hang out in that space, and I'm sure John would be too. Is it ethical to design knowing you're going to exclude? That's one of the questions I think we've got to keep asking. As we rebuild areas that have been completely devastated, should we be building with government dollars if we're going to exclude? As we create factories, what would it look like to build them so that a person with a disability could also work alongside the automation, and how do we scale that? We can do that. So, thinking about all of that: are you using UX
accessibility testing procedures? Are you including folks with disabilities to help from ideation to deployment? And how are you staying up to date on the cutting edge when it comes to ethics and to technology and how they intersect? We created a massive open online course; 10,831 folks participated from 176 countries, and one of the biggest areas people really focused on and commented on had to do with ethics and with universal design. So these issues are not just being discussed in this room and within our college; what does it look like across the world? I believe we're collectively brilliant, I really do, and that's not just a nice thing to say, and I do believe that we can create truly inclusive environments for everybody, whether it's in the home, so people can live in communities of their choice, or in the workplace, and there are a lot of folks continuing to push that conversation forward as we move everybody out of the shadows, as Tim Cook would have said, and definitely into the light. Thank you for inviting me.

[Audience question, largely inaudible, about whether our values and ethics are actually written into law and design.]

Yes, it's true, that's actually what I was saying; you're right, absolutely. A lot of the time the values and ethics of our society have had to be prescribed and described in detail within the law, and sometimes the law still misses. John Sanford and I have talked quite a bit about how the ADA, when people tell me they're ADA-compliant, that's not even close to where we need to be; we have got to be far beyond that. Jason, do you want to add anything?

Everything that's being alluded to is a long conversation, but I would at least say that the law helps to set, at least in some contexts, a minimum we should not go below in terms of the design of these technologies. But I think part of what we need to be doing in ethics is reviewing the law and seeing where it needs to be revised and where it is failing, because, and it's much easier of course to see this historically, you can certainly see where the law fails to protect different groups' or individuals' rights, making sure, for instance, as Nancy was alluding to, that we have meaningful employment. The law gives us certain kinds of structure in society, but we need ethics to continue to revise and review those laws over time, and, as you're alluding to, it is not a simple process; it is messy, in part because we carry with us different ethical frameworks and different values. That in part is what requires us to have conversation and review over time, to see what our principles are at a given time, what our values are at a given time, and whether they're accurately and faithfully reflected in our laws. Let me add to that.
This is why I think I can get away with saying I'm not an ethicist, because I'd like to think I'm an ethical person, but parsing all of this, these are deep philosophical, humanistic topics, right? I think the relationship between ethics and law is at least tractable, but at some point you come down to the question of right and wrong, which is a much harder question, although it seems more obvious. When I was thinking about being provocative, and I was talking to another faculty member last night about this book and this question, I thought that, in terms of this question about privacy and all of those kinds of things, it would be risky and inappropriate for people of our generation to make assumptions about what future generations are going to think about this. I don't know if there are undergrads in the room to survey, but I'm actually wondering: how much do you worry about data privacy? Because younger people are sharing this stuff all the time, right, and I think you assume there are boundaries around what you'll share and what you won't, but there clearly is a boundary; I don't put anything on social media about myself, and many people let it all hang out there, and that's just part of how they achieve, well, value and popularity. So can we ask: do you have this background sense that you're being observed all the time, and if you do, are you OK with it? I think so. So again, I'm going on too long, but I raise that because I don't think we can apply early 20th-century values and principles to 22nd-century circumstances; it will be a whole new ballgame, and I'm not sure how that goes.

There are things, though: how can you know, when you're 15 years old, what the implications may be of something you're sharing, and how it could come back to haunt you when you're 40, or whatever age? So there is, I think, a societal obligation to try to understand the risks involved and to be able to train and teach young people, and everyone, about those risks; that's one part of it. The other thing is that laws actually cannot anticipate and keep up with how fast the possibilities for information surveillance have emerged and been developed. The law moves very slowly, for good reason, because when it finally gets into place it is the law; but we can't really rely on the law, in many ways, to resolve some of these issues, without going back to what you were saying: I think of myself as an ethical person, and there are rights and wrongs, there are universal rights and wrongs. That's why I resorted to thinking about the United Nations Universal Declaration of Human Rights: can we have something fundamental that begins to guide this, and then use it as the limit on what goes forward?

[Audience comment, largely inaudible.]
Yeah, that's an interesting observation. One of the things I didn't talk about, and I'm sure Shoshana does in her book as well, is the fact that data gets repackaged with other parties' data, and that creates a much fuller picture of who you are than you would ever have predicted from agreeing to have data collected on you by this one source. So there really are larger issues associated with that as well, and they are in no way being resolved at the moment; that's another part of it.

Well, I was just going to say, Dennis, I think that's a very interesting point, and I'm going to think more about whether or not we should be defining what this looks like for the future. I would agree on some level, but I really do believe that we keep repeating past errors in huge ways, specifically with marginalized communities, and I think that until we have truly equal rights and true leadership we've got to keep pushing and helping to shape things so that people understand what the stakes of the past have been, and all of that.

Are there other comments or questions from the audience? Well, hopefully we've given people something to think about. I want to thank Carolyn and Jason and our other panelists for helping us think about this, and hopefully it's the beginning of a dialogue for the college, and for our colleagues in other parts of the Institute, to pursue these key questions as we advance urban automation and other forms of technology. Thank you very much for coming.