[00:00:05] >> I'd like to introduce the moderator for our next session. It is Cynthia, who is the library's business and entrepreneurship librarian. >> Thank you, thanks Jody. I am delighted to present our next speakers, Peter Swire and Sauvik Das. Peter Swire is the Elizabeth and Tommy Holder Chair of Law and Ethics at the Georgia Tech Scheller College of Business. Swire has been a leading scholar and government official for privacy and cybersecurity since the mid 1990s. He is a senior counsel with Alston & Bird and research director for the Cross-Border Data Forum. In 1998 Swire was the lead author of the book None of Your Business: World Data Flows, Electronic Commerce, and the European Privacy Directive, published by the Brookings Institution. In 1999 he was named Chief Counselor for Privacy, the first person to have U.S. government-wide responsibility for privacy policy. In that role his activities included being White House coordinator for the HIPAA medical privacy rule and helping negotiate the U.S.-EU Safe Harbor agreement for transborder data flows. [00:01:28] In 2013, after the Snowden revelations, Swire served as one of five members of President Obama's Review Group on Intelligence and Communications Technologies, whose report was published by Princeton University Press. Swire is the lead author of the official textbook for certification as a U.S. privacy professional; this summer the International Association of Privacy Professionals published the third edition of this text, U.S. Private-Sector Privacy: Law and Practice for Information Privacy Professionals. Swire is a graduate of Princeton and Yale Law School. [00:02:11] Sauvik Das is an assistant professor of interactive computing at the Georgia Institute of Technology. His research, which intersects human-computer interaction (HCI), data science, and cybersecurity, aims to empower people with novel security systems that mitigate costs in time, effort, and social capital. His work has won three best paper and best paper honorable mention awards at premier venues in HCI, as well as an honorable mention for the NSA's Best Scientific Cybersecurity Paper Award. His work has also been widely covered by the popular press, including features in the Financial Times, Slate, Ars Technica, and The Atlantic. In addition, he was a National Defense Science and Engineering Graduate Fellow, a Qualcomm Innovation Fellow, a Stu Card Graduate Fellow, and an NSF EAPSI Fellow. [00:03:17] He earned his Ph.D. in human-computer interaction at Carnegie Mellon University and his B.S. in computer science at the Georgia Institute of Technology. We'll hear from Peter Swire first. >> Hi everybody, this is Peter Swire. I'm a professor at Georgia Tech, and this is my talk today about privacy and autonomy in the European Union and the United States. As an overview, I'll briefly describe my own background, then I'll talk about the U.S.
approach to privacy and autonomy, including what we call the fair information practices. The default in the United States is that you have freedom to speak about people and freedom to process data unless there's a law against it, and consumer protection laws and privacy laws protect against certain bad acts. The European Union has a somewhat different approach to privacy and autonomy: privacy is seen as an important fundamental right, individuals have control over their data, you have control over your data, and the default is that it's illegal to process data about people unless you have a legal justification. Also, under European law, fundamental rights are strictly enforced, with only narrow exceptions. At the end I'll briefly talk about some litigation led by Max Schrems of Austria and how there could be a possible blockage of transfers from the EU to the United States. I have been working on these issues for quite some time: back in 1998 I wrote a book called None of Your Business: World Data Flows, Electronic Commerce, and the European Privacy Directive. I am a professor here at Georgia Tech in the business school, with appointments in the College of Computing and the School of Public Policy, and I teach privacy and cybersecurity here. I've also been a law professor for many years, I'm senior counsel with the law firm of Alston & Bird, and I'm a research director on these issues for the Cross-Border Data Forum. And if you get certified as a U.S. privacy professional, I'm the lead author of the textbook for that. So I've been in the ins and outs of privacy for quite some time, and I also had the chance to work on these issues in the federal government: under President Clinton I was the Chief Counselor for Privacy in the White House. So, for instance, I was the White House coordinator when we drafted the HIPAA medical privacy rule, and I also helped negotiate an agreement with Europe called Safe Harbor, which went into effect in 2000. [00:05:32] In 2013, after the Snowden releases of information about NSA (National Security Agency) activities, I was named by President Obama to be one of five people on the Review Group on Intelligence and Communications Technologies. Out of that work I ended up with this nice picture; this is in the Situation Room, and I'm the person two to the left of President Obama, either paying attention or falling asleep. We wrote a report of 300 pages that proposed a lot of changes to the NSA surveillance laws, many of which actually came into being. Here's a quick summary of the U.S. approach to privacy and autonomy. The word privacy famously does not appear in the U.S. Constitution, but there have been court cases over the years about physical privacy, the right to contraception, or the right to decide whether a woman is going to have an abortion. There are also privacy issues in things like the Fourth Amendment, which says there are not going to be searches of your home unless there is a probable cause warrant; the government can't just walk into your house, you have a private home. In the 1960s
one of the leading philosophers of privacy, Alan Westin, an American, wrote a book called Privacy and Freedom, and he proposed a definition of privacy as control over your information. But the tricky thing is, because it's my freedom and your freedom, how far does that extend? A few years later there was a process in the federal government to define what were called fair information practices, trying to figure out what's fair for the people who control the data, who run the database, to do with your information. A quick list of these practices might be: notice to you about how your data is handled; some choice in important circumstances; the ability to access your data; keeping the data secure; and accountability, so people actually follow the rules. The basic approach of the United States turns on what really counts as fair. Here's the default: if you're a business or a nonprofit or an individual, you're allowed to research and write about other people; you have your First Amendment [00:07:29] free speech rights. Now, there are some limits defined by law. If you're a doctor, you're not supposed to disclose confidential information; if you're a lawyer, you're supposed to protect your client's confidential information. And ordinary companies often have privacy policies that say we don't do this or we do that, and if they break the promises in their privacy policies there can be enforcement by the Federal Trade Commission and by the states, because these are deceptive trade practices: basically a business misstatement, they promised to do something and they didn't do it. We also have quite a few U.S. privacy laws in specific areas to protect consumers. I mentioned HIPAA for medical privacy; Gramm-Leach-Bliley has privacy rules for banking and financial services; and the Children's Online Privacy Protection Act has rules for children who are under the age of 13. But the general approach in the United States is to think of these as consumer protection laws. When you write regulation, there are costs to regulation, and we're cautious about putting limits on the free market; there are also benefits of regulation, so the U.S. has tended to try to protect the most sensitive data that could cause the biggest harm. And the U.S. approach has been that many times using data in databases is beneficial; it's how growth happens, data is the new oil. So autonomy under the U.S. approach is for individuals, where you get to control your information, and also for businesses and entrepreneurs and nonprofits trying to use the data, including to find new members. We're trying to figure out what's fair to these different kinds of interests in U.S. processing of data. The EU approach is often considered to derive in part from Germany's experience prior to World War II and during World War II under the Nazi years, and then later under the Stasi in East Germany. There was a total absence of privacy, and so Germany, among all countries in the world, has become the strictest in saying we really have to enforce people's privacy rights. By the 1990s, as the Common Market was forming in Europe, there were two reasons to pass privacy laws, or what Europe calls data protection laws.
[00:09:25] One of them is for business, for the common market. The idea was: let's standardize the privacy laws across all the members of the European Union so that there's free flow of data; France and Germany and Italy and the rest can have data move freely in the Common Market. Also, for individuals, there was going to be protection of privacy rights: we're going to have pretty strict privacy laws, and that also enables business. Just as a note on the incentives of business: in the United States, the incentive for business was to have fewer privacy rights, because businesses wanted to see what they could do with the data. For business in the EU, the incentives were more complicated: they wanted to achieve a common market, and they were willing to have some privacy rights as part of the overall package. In 1998 the law that I wrote my earlier book about, called the Data Protection Directive, took effect, and it has a clear default. Under European Union law, processing of personal data, data about people, is illegal unless the people running it, the controller, has a lawful basis, one of the specific exceptions in the law where you're allowed to process personal data, like consent by the individual. This can be seen as the precautionary principle that you see in environmental law: don't use genetically modified organisms unless we're really, really sure they're safe. In privacy: be careful first, use precautions, and only permit data processing once it's safe, because otherwise who knows what people might do with the data. Now, did 1998 shut down most of the data processing in Europe? Not really. As it was originally written, the law was more or less aspirational, a set of goals to head towards; the fines and penalties were low, and many, many companies in Europe didn't really put in place big compliance programs. But what has changed since 1998? One thing that changed is that it's not just the common market anymore: the European Union has constitutional law in the Treaty of Lisbon, which entered into force in 2009, and the right to privacy and the right to data protection are constitutional rights in this Treaty of Lisbon; they are written into their constitution. [00:11:27] Another thing is that Europe toughened up its privacy laws. In 2013 the Snowden revelations happened, and a lot of people in Europe were upset about all the spying by U.S. spy agencies, and they were concerned about the growing role of big U.S. tech companies: Google, Facebook, Microsoft, and all the rest. The GDPR put in place a bunch of quite strict, and stricter, privacy rules than the earlier law. Also, the penalties got serious: now a company that doesn't follow the privacy laws can pay up to 4 percent, not just of its European revenues, but of its global revenues. And so the GDPR went from being an aspirational set of European privacy laws to something companies really have to take seriously. There has also been important litigation in Europe trying to protect the autonomy of individuals under the law. Under the 1998 directive and the more recent GDPR, data can only go to other countries if there is adequate protection; that way, a high level of protection is maintained both when data is in Europe and when it leaves Europe. The U.S.
and Europe negotiated the Safe Harbor and, later, something called Privacy Shield. In these negotiations, the deal was that companies could make promises: hey, we're going to follow all the rules, and if we follow the privacy rules we can transfer the data to the United States. The first of these, the Safe Harbor, got struck down in 2015 by Europe's supreme court, the Court of Justice of the European Union; they said that the Safe Harbor did not have adequate protection for Europeans. And then this July, in a very important and big decision in another Schrems case, brought by that same activist, the European Court of Justice found that Privacy Shield was also inadequate. So today, just a few weeks later, the question is: to protect European fundamental rights, the autonomy in data for Europeans, is it going to be okay to transfer data to the United States at all? The court specifically expressed its concerns about too much surveillance by the NSA, the National Security Agency. [00:13:25] Well, but the United States is not the only place that has security agencies. So if they cut off data to the United States, would they also have to cut it off to India, cut it off to China? Is Europe going to become an island, where they protect privacy and autonomy inside the island but they're not allowed to do a lot of interactions with the rest of the world? So, in conclusion: I don't want to exaggerate the differences. In the U.S. there's a long, long history of protecting rights; Alan Westin was an American law professor, and we protect rights, including in privacy. And Europe is not just about protecting privacy rights; they want to have commerce and an economy, and they're trying to figure out their own way to strike the right balance. So for both systems there's a challenge of how to uphold privacy rights while enabling the right kinds of data uses. But right now, after these court cases, there's a big challenge on whether and how the two approaches can interoperate, and if we can't find a way to do that, the clashing views could lead to big disruptions in the global Internet. You know, I grew up in the last bunch of years assuming there was a global Internet; we might see a more limited Internet going forward, one for Europe, one for China, one for the United States. We're going to have questions and comments after this, and I look forward to that, but I hope this has given you an introduction to privacy and autonomy in [00:14:41] Europe and the United States. Thanks very much. >> We now welcome Sauvik Das to share his talk with us. Thank you. >> To follow up on Peter: my list of accomplishments and recognitions isn't nearly as long; while you were serving under Clinton, I was finishing middle school, so hopefully I have a little bit of time. [00:15:05] Today I want to talk to you about social cybersecurity, some of my work on harnessing social proof in the context of end-user cybersecurity. A little bit about me before I get started: I joined the Georgia Tech faculty in January 2018, and I got my Ph.D. in human-computer interaction from Carnegie Mellon in 2017. Here I direct the SPUD Lab, for Security, Privacy, Usability, and Design, and as you can imagine, calling my lab members spuds gives me all sorts of creative leeway to do interesting things, like having sticker cutouts of potatoes. [00:15:41] Broadly speaking, we work at the intersection of HCI,
cybersecurity, data science, and machine learning. So the things I'm going to talk about hopefully intersect a little bit with what Peter was talking about, but from a pretty different perspective, because I have an orientation towards building consumer-facing systems using the insights from people like Peter. A lot of the questions we try to answer emanate from this one broader question: how can we design systems that encourage better cybersecurity and privacy? I think this is an important question to ask because the decisions we make about whether or not to protect our digital resources ultimately underlie whether things like our intellectual property, our money, and our personal photos are protected or not from the various threat actors that exist in the world. [00:16:40] Right. And so it's really no surprise that an entire industry has been built around exploiting weak cybersecurity practices. One McAfee estimate puts the global annual damages of cybercrime at 600 billion dollars, with as many as two-thirds of all Internet users having had some sort of personal data stolen, and as many as 323,000 new pieces of malware being created every single day. And the kicker of all this is that this massive cybercrime enterprise [00:17:15] would likely be hamstrung if end users, like you and most of the people in the world, adopted basic, pragmatic security and privacy behaviors: things like keeping their software up to date, enabling two-factor authentication on important accounts, using a password manager, or using only authenticated and encrypted services. But that's not really where we're at on the Internet today. Here are some statistics. If you look at end-user behaviors, we have evidence that fewer than 10 percent of consumers enable two-factor authentication on their Google accounts, and for many of these people their Google accounts are attached to their primary email and give access to a lot of other accounts. [00:17:57] Fewer than 12 percent of U.S. Internet users, a number I believe comes from Pew, admit to using password managers; 22 percent of smartphone users both lock their screens and keep their phones completely up to date; and only about 10 percent of Android devices on the market were using the latest major version, and only 2.3 percent the latest minor version. So clearly there's a disconnect between what experts recommend and what end users do. Instead, what we picture in the end-user context is people who put off updates until the very last minute because they're kind of a pain to do, or who prop open electronically locked doors with garbage cans, or who fall for phishing scams. A lot of us think that way, and I want to be fair here: far from holding the dismissive view of users that is arguably the dominant mentality in the cybersecurity and privacy industry, I want to be clear that I think users' behaviors usually make sense in the context in which they sit. The threats are very abstract, [00:19:00] sometimes the solutions are quite confusing, and I have been this person at various times in my life as well, despite being a quote-unquote security expert.
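[Editor's note: since two-factor authentication comes up throughout this talk, here is a minimal sketch of the time-based one-time password (TOTP, RFC 6238) computation behind most authenticator apps. This is illustrative background, not part of the talk; the secret is an invented demo value, and real deployments add clock-drift windows, rate limiting, and secure secret storage.]

```python
# Minimal TOTP (RFC 6238) sketch: HMAC-SHA1 over a 30-second time counter,
# then "dynamic truncation" down to a 6-digit code.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period            # current 30-second step
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # demo secret, not a real credential
```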
So, to be clear, I'm not reciting all these things to cast blame on users, but I am saying that they point to a disconnect between how cybersecurity and privacy systems are designed today and how end users actually use them. [00:19:21] Informed by this observation, I wanted to ask people some questions: What made you enable two-factor authentication? What made you use a particular security product? More generally, what drives a user to be secure or engage in protective behaviors? Maybe the answers could provide us with some insights on how to encourage better end-user security. Here are some of the answers we got: "I started using a PIN because everyone around me had a PIN, so I kind of felt the group pressure to also use a PIN." "My boys wanted to use my phone, so I gave them the passcode, and now that I have things on there that I don't care for them to see, I changed it." [00:20:02] "My friends have a lot of the same accounts as me, and they didn't get into any trouble, so I think maybe it would not be dangerous." Does anybody notice a trend here? You probably did: these answers are all social. In fact, it turns out that human behaviors are generally social, and security behaviors are no exception, as illustrated by this candid-camera clip, where the protagonist faces the camera while everybody else, each a confederate in a social experiment, faces the other way; eventually the protagonist turns around because of peer pressure, so he ends up facing the wrong way just because everybody else around him is. [00:20:35] How does this bear on security and privacy? Well, of the over 100 behaviors that people talked to us about, approximately 50 percent of them seemed to be socially driven in some way: observing other people, having a conversation, things of that nature. And yet we almost never consider the social element in the design, or even the marketing, of security and privacy. Security and privacy, at the consumer level, are treated as things that I do and don't really talk about; they're private. But with better knowledge of how security and social behavior intersect, can we do better? Today I want to share with you primarily two bodies of work, and a little bit about how we can apply them to [00:21:20] design more social security systems. First, work I did in collaboration with Facebook to empirically measure the effect of social influence on security behaviors; the punch line is that social influence does significantly affect security behaviors, but whether the effect is positive or negative is contingent on the design of the tool. Second, that making cybersecurity systems more social can improve security. And third, that there's this vast but largely untapped opportunity to make cybersecurity more social. Now, of course, the very first thing we need to do to make the claim that social influence affects cybersecurity at all is to measure that effect, and that's typically pretty hard to do unless you have access to a lot of data: [00:22:02] you need data on relationships between people, and on how, temporally, my use of a security behavior affects my friends' use of the same behavior.
That sort of data doesn't exist in many places, but it does exist at Facebook, and I was fortunate to be able to partner with them to do this analysis. So the question: is a user's use of optional security tools affected by their friends' use of the same tools? We looked at about 1.5 million Facebook users and three different optional tools that people can enable at any time. Login notifications give you a notification whenever there's a suspicious login to your account. Login approvals are Facebook's version of two-factor authentication. The first two are standard security tools that you'd expect any consumer-facing company to have. Trusted contacts is a little different, a bit more social: you specify three to five of your friends who can attest to your identity if you ever get locked out of your account or email. In that sense, you're enlisting other people in the process of enhancing your security. [00:23:08] What we did was we started by collecting data: we randomly sampled 750,000 users who had newly adopted one of the aforementioned security tools, along with 750,000 so-called non-users who had never adopted any of them. With that data in hand, our question boils down to this: can we distinguish between who is a user and who is a non-user based on the presence of social influence in their network? [00:23:34] To assess that, I used a technique called matched sampling analysis. I won't go into all the details here, but there are a couple of things to keep in mind. First, for a given security tool, we empirically select exposure levels, say 1 percent, 5 percent, 10 percent, meaning 1 percent of your friends use login approvals, 5 percent of your friends use login approvals, and so on. [00:24:02] We then cut the population in two: users who have at least 1 percent of their friends using login approvals versus users who do not. We call the former the exposed and the latter the unexposed. Then, for each exposure level, we compare the adoption rate of the exposed versus the unexposed at that level of exposure, and the difference is the effect of social influence at that level. There's a lot of statistical machinery that goes on in the background to make sure the comparison is fair, so we compare exposed and unexposed users who are otherwise very similar along a lot of other dimensions; happy to talk about that more if you care. For interpretation purposes: say we have a chart where the x-axis is the aforementioned exposure level and the y-axis is the aforementioned difference in adoption rate between the exposed and the unexposed. If social exposure had absolutely no effect, we would see a flat line at zero, meaning there's no difference in the adoption rate of login approvals between people who are exposed at a certain level and people who are not.
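[Editor's note: to make the setup concrete, here is a minimal sketch of that exposure-level comparison. The data frame and numbers are invented for illustration; the actual study additionally matched exposed and unexposed users on confounds such as friend count and activity before comparing.]

```python
# Hypothetical illustration of the exposure-split analysis described above:
# for each exposure threshold, compare adoption rates of exposed vs unexposed.
import pandas as pd

# One row per user: fraction of friends using the tool, and whether the
# user themselves later adopted it (0/1). Values are made up.
users = pd.DataFrame({
    "frac_friends_using": [0.00, 0.02, 0.06, 0.12, 0.00, 0.03, 0.11, 0.01],
    "adopted":            [0,    0,    1,    1,    0,    1,    1,    0],
})

for threshold in [0.01, 0.05, 0.10]:                    # exposure levels
    exposed = users[users.frac_friends_using >= threshold]
    unexposed = users[users.frac_friends_using < threshold]
    diff = exposed.adopted.mean() - unexposed.adopted.mean()
    print(f"exposure >= {threshold:.0%}: adoption-rate difference = {diff:+.2f}")
```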
[00:25:15] Now, more likely, the expected effect is something like a line that goes up and to the right. This encapsulates the intuition that as more and more of your friends do a thing, you're more likely to do that thing yourself: if you have one friend who uses trusted contacts you're slightly more likely to use it, and if you have 200 friends who use trusted contacts you're probably much more likely to use it yourself. If we find that, great: it's the first empirical evidence that social influence affects security behaviors. [00:25:43] And when we actually plot the results for trusted contacts, we see something that looks very similar to that expected effect, which is great. This is the first empirical evidence we have that, for at least one cybersecurity tool, trusted contacts, social influence has this positive effect, and that we should take social influence into account in the design of security and privacy tools. [00:26:05] But what happens when you plot the results for login approvals and login notifications? You see that area at the bottom half of the graph, which I'm highlighting in red: that's the region where social influence has a negative effect. It means that if you have a small set of friends who use login approvals or login notifications, you're actually less likely to adopt the same tool yourself than if you had no friends who use it at all. That's really weird. What's going on there? Why does social influence have this negative effect? After talking to some friends of mine in the social sciences and marketing, it turns out there might be an explanation. Marketers call it dissociative affiliation, and it occurs when a group you don't want to be associated with is seen as using a product. For example, when parents started using Facebook, teenagers flocked away from the platform because they didn't want to post anything where their parents would see it. Now think about the first person who comes to mind who would use two-factor authentication today. [00:27:07] Does that person look anything like this guy? More broadly, early adopters of many security tools are people who, you know, might sound paranoid. They might be a little overzealous about security; they might genuinely have concerns and act on them in ways that other people would not. As a result, early adopters of security and privacy tools can often be perceived as paranoid or uneasy, and in turn this can stigmatize the use of those tools: "Well, I have nothing to hide; I'm not a weird, paranoid person who's afraid of the government, so I don't need two-factor authentication; the only people using it are those weirdos." [00:27:46] However, there are a couple of pieces of good news. The first is that all of the effects go up and to the right, which means that more exposure is good: as you get more and more friends who use a particular cybersecurity tool, you're more likely to use it yourself, and the effect becomes increasingly positive. But for traditional cybersecurity tools like login approvals and login notifications, the effect remains negative until a very high level of exposure.
[00:28:17] The second piece of good news is that the more social tool, trusted contacts, never has this negative social effect. And why is that? Well, it's very different: like I said before, it's a more social cybersecurity tool that enlists a group of people. Talking to the designers and the users of trusted contacts, it seems to boil down to three dimensions: observability, cooperation, and stewardship. Observability captures the idea that when you enroll in the tool, you have to specify three to five of your friends who will serve as your trusted contacts, and in so doing they get a notification that alerts them to the fact that [00:28:56] trusted contacts is a tool, one they may not have known about before. Cooperation captures the idea that people are working together towards a beneficial end for security and privacy: with trusted contacts, your friends are working together to provide you with some security. And stewardship captures the idea of people working on behalf of others: your friends are stewards of your interests. [00:29:19] So if you can design security and privacy tools, consumer-facing security tools, with more observability, cooperation, and stewardship, maybe you can emulate an adoption curve more like the viral curve of trusted contacts than that of login approvals. Okay, so: social influence affects security behavior, and the effect cuts both ways; it's contingent upon the design of the security tool. [00:29:42] Can we use that to make cybersecurity systems more social, to encourage better security behaviors? Well, it's really hard to make existing cybersecurity tools more cooperative or stewardship-based overnight, but we can certainly make them more observable, and we wanted to experimentally test that. That was the subject of my next collaboration with Facebook: can we encourage cybersecurity behaviors with social proof? To find out, we ran a randomized experiment with 50,000 users. The premise of the experiment is very simple. Facebook has an annual security awareness campaign where they put a little notification on top of the News Feed that alerts you to the fact that there are extra security tools you can enable to increase the security of your account. [00:30:20] The vanilla notification, the control intervention, reads something like this: "Keep your account safe. You can use security settings to protect your account and make sure it can be recovered if you ever lose access." That's pretty standard stuff. We came in and integrated seven additional conditions with social proof. I won't go over all of them, but [00:30:44] we went from the highly specific, where the raw-number condition told you exactly the number of your friends that use extra security settings and also mentioned that you, too, can protect your account, all the way to something vague, where the condition just says that some of your friends use extra security settings, without any real number.
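[Editor's note: to judge whether differences in click-through rate between conditions like these are statistically meaningful, a two-proportion z-test is a standard tool. A minimal sketch with invented counts follows; the study's actual results are summarized next.]

```python
# Sketch of comparing a social-proof condition against the control on
# click-through. Counts are hypothetical; the real study used Facebook's
# internal experimentation tooling.
from statsmodels.stats.proportion import proportions_ztest

clicks = [980, 720]     # clicks in social condition vs control (invented)
shown = [6250, 6250]    # users assigned to each condition

stat, pvalue = proportions_ztest(count=clicks, nobs=shown)
lift = (clicks[0] / shown[0]) / (clicks[1] / shown[1]) - 1
print(f"relative lift: {lift:.0%}, z = {stat:.2f}, p = {pvalue:.4f}")
```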
[00:31:05] In total, then, we had eight conditions: seven social, plus the nonsocial control condition that I showed first. We randomly assigned 6,250 participants per condition and ran the experiment for about 3 days, and we measured the click-through rate, the 7-day adoption rate, and the 5-month adoption rate, to see whether our social conditions actually made a difference. Here's what the numbers look like across all conditions: 93 percent of participants saw the announcement, about 13 percent clicked on one of the announcements, about 4 percent adopted one of the promoted tools within 7 days, and about 10 percent adopted one of the promoted tools within 5 months. What do those numbers look like broken down across conditions? Broadly speaking, you can see the values in this stacked bar chart, and there are two things I want you to take away from it. The first is that every single social condition outperformed the control condition in soliciting more clicks on the announcement, so people are more intrigued by the social [00:32:05] framing. And the second is that the top two performing conditions also led to more adoptions of the promoted security tools. The improvements here were about a 36 percent improvement in click-through between the raw-number and the control condition, and roughly a 10 percent improvement in adoptions between the raw-number and the control condition. That's a pretty remarkable result if you think about it: all we did was change one sentence in a small notification on top of the Facebook News Feed, which is a very busy interface. [00:32:33] I don't know if you're familiar with running A/B tests at scale, but you almost never see a 36 percent improvement in click-through rate from such a little change. So this is a very promising result: if we can make more substantive social changes in the design of consumer-facing security and privacy systems, we may be able to see even more widespread adoption. Okay, so making cybersecurity systems more social can encourage better security behaviors. [00:33:15] Finally, I just want to briefly summarize to convince you of this final point, which is that there's this vast, largely untapped opportunity to make cybersecurity social. Remember, we started this talk with the question of how we can design systems that encourage better cybersecurity. This work, I hope, suggests that there's a spectrum of largely untapped opportunity to make cybersecurity behaviors more social, and thereby encourage better ones, specifically along these three design dimensions: observability, cooperation, and stewardship. What does it mean to make these better in practice? [00:33:34] For observability, it comes down to this question: how can we make it easier for people to observe and emulate good security behaviors? In the real world, if there is some sort of physical threat to our being, we can usually see that threat, whether it's [00:33:51] a person or a natural disaster or something; oftentimes we can see the warning signs, a red light for example, unlike, say, computer viruses. And we can also observe and emulate other people's reactions to those threats.
So, you know, if somebody comes in and is a threat, I can see other people running away, I can see other people trying to protect themselves, I can see other people trying to fight the person, and I can choose which of those reactions makes the best sense for me given my unique set of constraints and circumstances. I can't do any of that with cybersecurity and privacy: the threats are abstract and invisible, I can't see what anybody else is doing, and so of course I often get blindsided by cybersecurity and privacy threats. [00:34:34] Cooperation captures this idea: how can we design end-user security systems that make security and privacy something people can work on together? To use a crude real-world analogy again: if I'm walking down a dark alleyway alone at night versus with a friend, I feel safer with a friend, and I think that's because I know an attacker would have to overcome the two of us if they wanted to attack us. [00:34:53] Security and privacy kind of work the other way around: if I have good security behaviors and I'm interacting with poor security behavers, I would actually be disincentivized from sharing any information, because the easiest vector for an attacker to get my information is through them. So there's a perverse disincentive, where people with poor security behaviors do not actually gain anything from interacting with people with stronger security behaviors, and people with strong security behaviors are actually [00:35:20] weakening their defenses by interacting with people with poor security. And finally, stewardship asks: how can we design systems that allow people to act on their concern for the security and privacy of their loved ones? In our research, over and over, what pops up is that people [00:35:34] care. People really want their friends and their families to use two-factor authentication on their bank accounts, but when you press them about why they don't enable it themselves, the answer is kind of, "you know, it's not something that I actively think about." So there's all this unharnessed energy in our social networks, our concern for other people's digital well-being, and none of our security and privacy tools today really capture that concern; there's no easy way for me to act on my concern for, say, my brother's security and privacy. [00:36:04] So can we change that? Even if I don't care about my own security, can we tap into the energy of my concern for others? Okay, so broadly speaking, that's my talk; I can think of examples of these systems if necessary, [00:36:20] but these are the three things that I want you to take away. Thank you. >> We are going to now address the Q&A with both of you. I'm going to start with a question: in the United States, with an economy that embraces capitalism and private ownership, we have effectively not enacted a comprehensive federal law that governs data privacy for citizens in the manner of Europe's GDPR. Does the responsibility for data privacy protections solely lie on the individual in the absence of such a law? [00:37:13] >> Okay, I think that's more for me, but I'm glad for Sauvik to come in, and thanks for that super interesting set of slides about how we actually get people to act more securely. So right now the U.S. is going through a process, being led by California, when it comes to privacy legislation. A lot of you might know that California passed the CCPA, the California Consumer Privacy Act.
[00:37:41] That went into effect in January of this year, and California all by itself is the fifth largest economy in the world, so a lot of businesses that operate in the United States have to operate in California. Beyond that, California has a ballot initiative this November with another round of strengthening for [00:38:00] any business that does a substantial amount of activity in California. So quite likely, if the ballot initiative passes the way people think it will, businesses in the United States, which are already sometimes dealing with Europe and have had to come into compliance with the GDPR, will increasingly also have to comply in California, and I think that's leading to a lot of new pressure on Congress to pass some kind of national legislation. [00:38:31] Even groups you might not expect, the Chamber of Commerce for instance, have come out in favor of federal privacy legislation, and they're doing that because they're worried California and other states are going to go even more regulatory than Congress would. So I think there's a pretty good chance we're on a path, over the next few years, to a federal comprehensive privacy law. I'll stop there for the second question. >> Okay, thank you. This question is directed to Sauvik: how do you conceive of people taking on these protections when many of them have significant costs? [00:39:16] >> That's a great question. You know, I think obviously we're not at a point where we can give up on usability; improving the usability of cybersecurity and privacy systems is still a very active research area, there's a whole conference about it. So one thread is that we need to make these things better and more aligned with how people want to use their computing systems. That said, I don't think usability is the only thing. We also need, as a society, to decide that [00:39:50] some things require individual compromise for the greater good. I'm not saying that mandates, like requiring everybody to use two-factor authentication, are necessarily the way to go here, but certainly if enough people adopted this sort of baseline of security and privacy hygiene, the economics of cybercrime would be very different, because right now a big part of it is just compromising people who are vulnerable, because their passwords are bad or they've got outdated software or something of that nature. If enough people closed down those vectors of attack, it would make the economics of cybercrime so different, because not all these script kiddies could come in and make a massive profit by distributing a bunch of mass-produced malware. So I don't know exactly where the line should be between individual agency and collective good, but certainly there's an element of both, and certainly we can try, as researchers and designers, to make these security- and privacy-preserving [00:41:01] tools as usable as possible as well, to reduce the cost. >> I wonder if I can pick up on that and ask Sauvik: it doesn't seem like it's just whether the individual adopts the tool; it's also whether the system defaults work that way for users. For years you could have done encryption for your email, but it was really hard to do, because the other people receiving it didn't have the encryption in place. [00:41:30] After Snowden especially, Gmail
and Facebook Messenger and a lot of the other services shifted to encryption that they put in as just part of the architecture. If we had waited for individuals to do the encryption, it turned out we never would have gotten there; but encryption in transit became very, very common once the services put it in as the standard way to do things. So, yeah: how much can security by design, or security at the infrastructure level, take care of these things, instead of saying it's all on the user, while the user feels overwhelmed? >> Yeah, good point. I think it all comes down to the threat model. What users can do and what companies can do sometimes overlap, but oftentimes they're different. For example, end-to-end encryption is something companies could eventually roll out, and it was relatively simple for them to do, and the hit to the user experience wasn't as bad as it would have been in the past, because of computational efficiencies and things like that. But ultimately, you can have an extremely secure system and it still needs to align with users' [00:42:43] threats. And so there are situations where a company cannot make the decision for the user: for example, when should I be using Signal to contact a journalist, versus when should I be using Facebook to reach the widest audience possible? [00:43:02] Or, for my phone, when should I update it? Some people hate the updates, but they come with really important security patches. So there are always going to be these touch points where the end user needs to make a decision about security and privacy; we can't completely automate out the user for many of these decisions. And for those decisions: how do we incentivize, how do we make it easier, how do we make it better, how do I get users to take the more secure, private options? [00:43:37] >> All right, thank you. This is a question directed to Peter: are there national moves toward the European model in the U.S.? I live in California, where there is an intentional alignment with some of the EU approach. >> Well, I was talking about that a little bit. A lot of other states besides California had laws that were being considered this year; Washington State almost passed a big new law, and Nevada did pass a new law. [00:44:11] You know, I think when you ask people, voters in America, "do you want to have good privacy or not," you get overwhelming support for it, and that's why the polling on this initiative in California, the last poll that came out, showed 80 percent supporting privacy, even though it's going to put a lot of burdens on industry and a lot of the companies are upset about it. So the more you have popular [00:44:38] views built into the laws, the more the United States population looks like the European population in wanting those protections. And I do think that California passing its laws two years ago really changed the path, because now lots of companies are getting used to having to build in these protections, and now [00:45:04] the people who don't like regulation have to realize that many, many of the companies have sort of gone over to the side of trying to figure out how to live with the new privacy laws. So I think
there really does seem to be a moment where the rules can change here. And there's another reason it might happen. I say this in my privacy class, and I say it even if you don't believe it: politicians are people too. The idea there is, and we've seen this sometimes in privacy votes, that if you let a member of Congress or a state legislator or a senator ask "Gee, what do I want for my family, what do I want for my own privacy," then even if business is saying it's a pain or it's going to be burdensome, the politician, who's a person with a family, says, wait a second, I want this to be protected. Here's one story that has been reported but isn't widely known. When the second President Bush came into office in the early 2000s, the HIPAA medical rules had just been promulgated but hadn't gone into effect yet. [00:46:12] The hospitals and the insurance companies ran this big lobbying campaign saying, hey, we can't have HIPAA, it's too burdensome, it's a really bad idea. It went up to the President of the United States, to George W. Bush, and his answer was: wait a second, this applies to my daughters? Are you saying that when they go to the hospital their medical records shouldn't be protected? And the advisors said, yes, Mr. President, it's too burdensome. And he said, no, it's not too burdensome, I want my family's records to be protected. So he acted as a person and said let's keep these HIPAA rules, even though a lot of his advisors thought they were too expensive. So when you let politicians vote on their personal views on these issues, a lot of times you end up with more privacy protection than you might have guessed. >> Okay. The next question is directed to Sauvik: Google now has a password suggestion feature for when you're creating new accounts; the passwords appear to be long, machine-generated, maybe 16 characters. Can I trust one company to both generate and store my passwords, and to know all my passwords? Is that better than what I do myself? [00:47:29] >> I personally, I mean, so I assume you mean Chrome, or do you mean Google accounts specifically? Let's say Chrome. Certainly there's an element of trust there, because Google is producing Chrome. Are they sharing the encrypted file of your passwords back with their servers? I'm not sure; we'd have to look into the code for that, and since it's not open source, you can't easily look into it, though it's unlikely. I personally don't use browser password suggestions; I've used a third-party password manager for several years, partly because it's open source, so that way you don't have to trust that it's Google who has access to all of your passwords. I'm not saying that Google is doing anything nefarious with the passwords you save in Chrome, but I just like to distribute trust so that no single party is overwhelmingly in control. >> Just one point on that: [00:48:29] if Google gets a court order from a judge, either in the United States or some other country where Google's doing business, and it says "give me Sauvik's or Peter's password," then if Google has the passwords and the judge has the right kind of authority, they'll have to turn them over. One advantage to using a third-party service, and I use a different one for passwords, is that the government would need to go to the third-party service and then to, you know, the bank or the website that you're going to, [00:49:00] before the government can actually see the stuff it cares about. So there are some reasons
to move your passwords off of the biggest platforms, because the biggest platforms are where the government goes first. >> I definitely didn't consider that perspective, and that's definitely true; subpoena powers matter. [00:49:25] Typically, with these password manager services that allow you to back up your passwords in the cloud, the password list should be encrypted with strong encryption, and your master password will be the key there. So even if the government subpoenas the password list, what gets handed over should be encrypted. But that still relies on the security of your master password, so it's always important to have a strong master password even if you use one of these services. [00:49:51] >> Okay, thanks. This next question, I believe, is probably directed towards you, Peter. The question: having been actively involved in data privacy through the inception and evolution of the Internet, what are the most significant transformations you have noticed, predicted, and originally did not expect in data privacy matters? >> Wow. So in 1998, when I wrote the first book, what did I predict or not predict? Well, one thing I predicted is that there was going to be this big mess between Europe and the United States, and now we're getting it; it's been delayed for more than 20 years, but now, with this Schrems case this summer... Just yesterday, the data protection authority for part of Germany came out with guidance that was very strict, saying they didn't see how companies in Germany should be able to use American cloud providers. [00:50:50] That's an official part of the government making that statement, and it's a plausible interpretation of the recent court decision in July. So we've sort of delayed the clash between Europe and the rest of the world for a couple of decades, but now that clash is much more serious, and it really could lead to this splintering of the Internet that people worry about. [00:51:14] I think there were also things we didn't see coming. In the 1990s, none of the social networks were that big a deal, so the kinds of social things Sauvik's talking about were not on our minds. Also, there wasn't much talk about the Internet of Things, the idea of sensors everywhere, and how you manage that data. People thought of data as something you gave to a company, and now data is what happens when a camera sees you walk down the street. So there are definitely technological changes. But I do think this basic question, of whether you're basically allowing companies to process data or whether they basically can't do it except on an exceptional basis, is a huge battle, and that's something I tried to highlight in my slides. We need to have a lot of privacy protections, but I also think we need to have a lot of data for research and for lots and lots of other purposes, and how to balance that remains a really hard problem. I don't know, Sauvik, if you have any thoughts, things you predicted or things you've seen that surprised you, even from middle school till today. [00:52:22] >> I don't know, maybe I wasn't thinking so much about privacy, but certainly, in undergrad, I was a big believer that decentralized computing was going to become bigger than it has, and I still think there's a chance it will in the future. Decentralized computing seemed like the big thing: peer-to-peer like LimeWire, [00:52:49] Bitcoin. Bitcoin is decentralized, though it's becoming a little more centralized in terms of regulation now.
I thought maybe self-hosting was going to be bigger, because of the advantages to privacy and the lack of needing to trust any centralized authority to regulate the Internet. [00:53:07] We've actually been moving more and more toward centralization, you know, massive players like Google and Facebook, and certainly there's a synergy between what a surveillance state might want and allowing these centralized companies to thrive, because of things like subpoena power. So I probably underestimated the standard economic and behavioral benefits of having these centralized powers control the Internet, and I still think there's a chance that decentralization will become more mainstream, but I'm [00:53:40] not that optimistic that it's going to happen anytime soon. >> Maybe to follow up on that, on the banking side, with Bitcoin payments: the bank regulators and law enforcement fear anonymous large payments. They see that as a way to have money laundering at scale, to have, you know, Russian influence on U.S. elections at scale. Unless you have know-your-customer rules and the bottleneck around banks, [00:54:11] defending against those kinds of flows of money becomes impossible. And the degree of intensity that the FBI or bank regulators feel about knowing where money goes, including so that people pay taxes, which is what keeps government going, is so high that, when I think about computer architecture, I think they're just not going to let that happen [00:54:37] when it comes to the payment side of things. >> That sort of leads me into this next question: what is the path of intrusion on privacy that the U.S. and the NSA are going down, compared with Chinese, Russian, and Iranian surveillance practices? Why are Americans not made aware of foreign adversaries' attacks on their personal data? [00:55:10] >> Yeah, I've written a lot about this, and this is part of what happened after Snowden: the United States, in my view, and I've testified under oath on this, has the most substantial set of safeguards against abuse of surveillance of any country in the world. [00:55:30] You know, the NSA does a lot of things, but it does them with court oversight, with all sorts of auditing, and with more transparency about its actions than any other country in the world. China: I did a big study last year about that, and basically, if the Chinese government wants stuff, it gets it; there are no laws, there are no safeguards. The stuff we've seen about the Uighurs in the western part of China is just a total surveillance state. And one of the reasons for the U.S. to continue to do surveillance is that if the U.S. were to close it down, the way some of the Europeans would want, China and Russia would continue to do the surveillance and continue to have that awareness. That really is unilateral disarmament; that really is not knowing what your adversaries are up to, and it's a dangerous world. [00:56:22] I think some of the people who get really upset about the U.S. government doing surveillance sometimes haven't realized how many problems, at the cyber-attack level and other levels, come if you're not playing in that space. I think it remains a dangerous world, and the U.S. should act in a way that recognizes that, recognizing that if you stop things on the U.S.
or European side, it will continue from the other countries you just mentioned. >> Okay, next question: for the paranoid, who is at the head of the cybersecurity system, and how are they held accountable? Thinking here of how companies like Facebook and Snapchat at present offer security options, but they are the biggest violators. [00:57:19] >> Yeah, I think this is probably a question both Peter and I can speak to. There's a difference of threat model here. When you are a Facebook user, the security options that Facebook provides are not there to protect you against Facebook; the security options Facebook provides protect your Facebook account against [00:57:45] people who might want to compromise your account so that they can access your friends, sell scams, things of that nature. So that's the first part of the answer: there's no accountability chain there for what you're speaking about, because that's a different threat model. When you're concerned about Facebook violating your rights, [00:58:11] you're not going to get the solution from Facebook, because they have a disincentive. However, there are two different lines of thinking here: there's the design line of thinking, and then there's the regulation line of thinking; Peter can talk about the regulation piece. On the design side, there are definitely options that exist; they're just not totally usable yet. For example, I have a student currently working on a tool that we call StegaShare, which allows you to share photos on Facebook using deep-learning-facilitated steganography, so that Facebook can see the photo you're sharing, but somebody else who has the extension installed will be able to decrypt the hidden content. So that's a way to use Facebook while keeping Facebook out of your communications. [00:58:58] This is definitely an option, and I do strongly believe that if you can design these sorts of systems so that they're usable and easy to distribute among end users, that solves a big piece of the problem. But design isn't going to solve all of it: if you want large surveillance-capitalist corporations like Facebook and Google to be held more accountable, then you're also going to need a regulation piece, and I'm sure Peter can speak to that part. >> [00:59:30] Boy, a lot of different issues there. You know, one of the things Facebook's been hammered for in the last couple of years is not being good enough at detecting child pornography, for letting it go from one user to another. And the steganography approach that Sauvik just described, maybe I'm wrong, but it sounds like it could be a delivery tool for child pornography, without Facebook being able to tell that user A is sending it to user B. And I know there have been [00:59:57] criticisms, as Facebook has tried to put end-to-end encryption in more places, that it's harming safety because it doesn't protect against child pornography as effectively. So the freedom for you to be able to communicate without Facebook seeing it is also a law enforcement risk, making it harder to detect child pornography. And if you were dropped down and put in the middle of deciding for Facebook what to do there, I admit that it's really tricky, morally, to figure out how to weigh those things, between enabling anybody to communicate secretly and knowing that that might be
[00:59:30] Boy, a lot of different issues there. You know, one of the things Facebook's been hammered on in the last couple of years is not being good enough at detecting child porn, for letting too much child porn go from one user to another, and the steganography approach that Sauvik just described, maybe I'm wrong, but it sounds like a delivery tool for child porn, one where Facebook would not be able to tell what user A is sending to user B. And I know that there have been [00:59:57] you know, criticisms as Facebook has tried to put end-to-end encryption in more places; it's been criticized for adding that security because it can't detect the child porn as effectively. The freedom for you to be able to communicate without Facebook seeing it is also a law enforcement risk, making it harder to detect child porn, and if you were dropped down and put in the middle of deciding for Facebook what to do, I'd admit that it's really tricky morally to figure out how to play between those things, between enabling anybody to communicate secretly and knowing that that might be [01:00:38] creating problems. Just the way that Facebook has gotten criticism for not policing false statements in political advertisements, you know, or not policing the sourcing of political advertisements from Russia or other places, when they're supposed to close things down and when they're not is a really, really hard question. [01:00:58] Yeah, I think reasonable people can fall along different points in that spectrum. This is akin to the going-dark problem with encryption, where law enforcement wants a backdoor, but if they have a backdoor into the encryption, then everybody has a backdoor into the encryption, so it's not encryption anymore. [01:01:24] It's tough to assess that, but I think that you could have client-side protections without requiring centralized access. So, for example, with Stagger Share, whatever Facebook is using to detect child pornography could potentially be installed as that sort of protection at the client level. Now, anything like that can be circumvented, and indeed it can be a cat-and-mouse game all the time, because child pornographers are probably going to modify Stagger Share and try to come up with something that doesn't have those protections, but at the very least it will increase the burden on them, while affording privacy protections to people who otherwise want to be away from the watchful eye of Facebook, and I would argue that that latter group is probably considerably larger than the former. So there's always going to be this tension: if you're building privacy protections, you're potentially increasing the likelihood that bad actors can do bad things away from the watchful eye of law enforcement and the government. There's a tradeoff there: do we care more about individual privacy or about collective security? And at different points our society will care about these things in different ways. Right after 9/11, of course, we cared so much more about collective security than we did about individual privacy, and since then it's been oscillating back and forth, but I think there's always going to be this inherent tension. [01:03:03] But I do think that there are ways we can approach this that allow for individual privacy without as much of a hit to collective security, though there's always going to be that tradeoff.
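One way to picture the client-side protection Sauvik describes: run the platform's known-bad-content check locally, before anything is steganographically wrapped or encrypted. Production systems such as Microsoft's PhotoDNA use far more robust, proprietary hashes; the sketch below substitutes a simple perceptual "average hash," and the blocklist and threshold are hypothetical stand-ins.

```python
# A toy sketch of moving detection to the client: before a photo is
# hidden or encrypted, hash it locally and refuse to send anything
# close to a known-bad hash. The hash, blocklist, and threshold are
# illustrative assumptions, not any real platform's mechanism.
import numpy as np
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """64-bit perceptual hash: downscale, grayscale, threshold at the mean."""
    pixels = np.array(Image.open(path).convert("L").resize((size, size)))
    bits = (pixels > pixels.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def client_side_check(path: str, blocklist, threshold: int = 5) -> bool:
    """True if the image is perceptually close to any blocklisted hash."""
    h = average_hash(path)
    return any(hamming(h, bad) <= threshold for bad in blocklist)
```

The cat-and-mouse dynamic is visible right in the code: the check runs on hardware the user controls, so a determined abuser can strip it out of a modified client. The goal is raising the cost of abuse, not making it impossible.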
I agree, and the term you used is something people in the audience might not have heard: there's a recent book called Surveillance Capitalism, which is the term Sauvik used, by a theorist named Shoshana Zuboff, that I've been reading recently, and I think that book poses the moral questions in a very strong way. [01:03:37] The ability of the biggest platforms to really granularly follow your activities has gone way up in the last 15 years; Facebook didn't exist in a commercial form 15 years ago. And let me just put it this way: if the companies can see all that stuff, then the government can see it, at least with a court order, and so how to think about the overall architecture is super important. [01:04:06] But we're going to have these big fights about, well, yes, but if there's a court order, what should the government see? And that's going to be a great big debate going forward. OK, next question: Peter, how do you rate the U.S. privacy and security laws, in terms of effective enforcement, in comparison to other countries? [01:04:31] Are we leading or are we behind? So for many years after 1998, U.S. enforcement of the things that were on the books was stricter, while the EU laws in theory were stricter, and I think with GDPR and the big fines of the last couple of years, enforcement is getting stricter in practice in Europe also. [01:04:56] Another thing is that the U.S. has a history, since the Fourth Amendment two-hundred-odd years ago, of having pretty substantial protections against government surveillance: people can't break into your house, and they can't listen in on wiretaps unless they get the right kind of court order, and in a lot of ways the U.S. protections against that government snooping are much more extensive than in most of the countries in Europe, something I've written about a lot. But on the private-sector side, right now it's much closer to laissez-faire in the United States than it is in Europe. [01:05:39] And I realize I tend to give short sound bites on these; each of these could be, you know, a two-hour talk, and that's partly deliberate, to touch on different questions and to get different people's views on these things. Yes, I'm going to combine two questions for Sauvik that have to do with awareness and education regarding online privacy for the general population. Where are we behind in general education on privacy? Would awareness campaigns, such as those for drunk driving, smoking, etc., be beneficial in changing people's opinions on online privacy? Yeah, good question. So I think that there are three barriers to better consumer privacy and security behaviors. One is awareness, so in that sense I do think awareness campaigns have the [01:06:47] possibility of helping; like you said, drunk-driving campaigns, anti-smoking campaigns, those sorts of things circulate in everybody's minds, and privacy typically doesn't, so there's definitely a possibility there. But there are two others: motivation and knowledge. You can be aware of a threat and of what you can do to protect yourself against it, but it doesn't matter if you don't know how the defense works or how to use it; and you can be aware and know how it works and how to use the defenses, but it doesn't matter if you're not motivated to use them. These have all been documented in the literature: [01:07:23] it's an awareness, knowledge, and motivation behavioral gap. I think the social stuff that I talked about is strongest at affecting motivation, and I also think motivation is one of the hardest things we need to improve. This is partially because there's a really funny and good paper by Microsoft Research from about 10 years ago that asks what happens if you [01:07:47] monetize people's time, converting it into dollars at minimum wage: the amount of time it would take them to read every privacy policy they came across and engage in all of the expert-recommended security and privacy advice would actually cost them more than the expected loss from, like, one major breach or something like that. So currently, the way our systems and processes are set up, there's a huge disconnect between how much people have to do and are told to do, and what's actually effective at preventing losses. This is why I think the motivation piece is really important, and I think awareness is also really important.
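The Microsoft Research paper described here is likely Cormac Herley's "So Long, and No Thanks for the Externalities: The Rational Rejection of Security Advice by Users" (2009). The sketch below just restates the core argument; every number in it is an illustrative assumption, not a figure from the paper or the talk.

```python
# Back-of-the-envelope version of the argument: value the user's time
# at minimum wage and compare the cost of following all the advice with
# the expected loss that compliance might prevent. All numbers below
# are illustrative assumptions.
HOURLY_WAGE = 7.25            # assumed wage, $/hour
HOURS_READING_POLICIES = 200  # assumed hours/year to read every policy seen
HOURS_ON_ADVICE = 50          # assumed hours/year on expert security advice
BREACH_LOSS = 1_000           # assumed loss from one major breach, $
BREACH_PROBABILITY = 0.05     # assumed annual chance of that breach

cost_of_compliance = (HOURS_READING_POLICIES + HOURS_ON_ADVICE) * HOURLY_WAGE
expected_loss = BREACH_LOSS * BREACH_PROBABILITY

print(f"cost of following all the advice: ${cost_of_compliance:,.2f}/year")
print(f"expected loss it might avert:     ${expected_loss:,.2f}/year")
# ~$1,812.50/year of time spent against ~$50/year of expected loss:
# under these assumptions, ignoring the advice is the rational choice.
```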
[01:08:31] So these campaigns, they might help, I don't know, but only if we address these other pieces as well. There's a question here: as an analytics graduate student, I access government data, presumably under the Digital Accountability and Transparency Act. I'm concerned that findings I may discover from the data may come into conflict with the government. Am I protected? Data can be harmless until it's understood. I'm not sure I understand the context of the question enough to really answer; I'm familiar with a lot of laws, but I don't recognize the law the person just referred to. OK, you can skip it. [01:09:26] I didn't know the context either. On the topic in general, there is a question, and maybe you can type in the chat if I have it wrong, about the dangers of inferring [01:09:56] things from data: if you're working with government data and somehow you infer something maybe you shouldn't have, are you protected or not? Yes, so, I don't know if you were around for the first session with Rachel Cummings, where we talked about differential privacy, [01:10:24] which might be partially an answer to that. I do think that large corporate entities and governments, if they're going to share data, should do so in a way that is differentially private. That means you can make inferences in the aggregate, but you can't specifically re-identify any individual in the data. But that said, not all data sets out there that might contain information about you have been shared in differentially private ways, and in that sense, yes, there are probably inferences that can be made about you that we don't know about yet, that you may not want people to be able to infer about you, and there's no real great way to protect yourself against that if the data are already out there, which is unfortunate.
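Since differential privacy came up without a definition, here is a minimal sketch of its textbook building block, the Laplace mechanism: add noise calibrated to the query's sensitivity and a privacy budget epsilon, so aggregates stay useful while any one person's presence in the data is hidden. The dataset, query, and epsilon below are hypothetical.

```python
# A minimal sketch of the Laplace mechanism for a differentially
# private counting query. Illustrative only; real deployments manage a
# privacy budget across many queries.
import numpy as np

def dp_count(values, predicate, epsilon: float) -> float:
    """Differentially private count of records matching `predicate`.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the true count by at most 1), so Laplace noise with
    scale 1/epsilon yields epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: a hypothetical list of ages; release how many are over 65.
ages = [23, 45, 67, 71, 34, 52, 68, 80, 29, 41]
print(dp_count(ages, lambda a: a > 65, epsilon=0.5))
# The aggregate stays useful, but the noise hides whether any one
# individual's record is in the data -- the property described above.
```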
[01:10:58] She responded back: she means mostly government data, from data.gov. Well, so if you download data from data.gov, [01:11:32] and the government thinks it's good policy to do something, and your analysis shows that it's a bad policy, I think the point of data.gov is that people are supposed to be able to do their own analyses and come to their own conclusions from it. So I hope you have the free speech rights, and the freedom from being treated badly by this government, enough that you can do your good data analysis and come up with the recommendations you come up with, even if the current people in power don't like you. [01:11:46] Now, Sauvik used the term threat model at various points, and I'm not sure what the risks are that the question is addressing. You know, you might come to a conclusion that's unpopular, but that's true for any study that you do. Next question. [01:12:15] We've only got a few minutes left, by the way; I think it's less than five now, we've got three minutes left. What role do you think governments have in protecting their citizens' information, in relation to the release of personal information from longitudinal IRB research? This is, then, a question around what are called longitudinal studies, for health care, you know, where there might be decades of data collected, and then what happened years ago gets revealed later on. [01:12:55] I'm not sure I have a good answer on the level of risk from an IRB. These are supposed to be there to address ethical problems with the data, and they're supposed to be there, hopefully, to stop inappropriate release of the personal data. There may be particular places, particular IRBs, where that protection is not being done well. An IRB is an institutional review board that reviews human-subjects research. [01:13:28] As in any regulatory system, maybe the regulators aren't doing a great job at a particular moment; the hope is that IRBs exist to stop the bad things from being released, but I don't have enough context to say more than that. And I think one other note on that: [01:13:49] our knowledge about what can be done with data is increasing exponentially, so even five years ago, things that we can infer today were just not conceivable. It seems we're going to see more and more of that for the foreseeable future, where data that was thought to be innocuous turns out to be really identifying, or can be used to infer things that you'd never [01:14:13] foreseen. So it's a tremendous challenge, and this is one of the reasons why I think you need a combination of design and regulation, because I think regulation, at least for the foreseeable future, is always going to be a little bit behind the curve. [01:14:25] OK, well, I think we have come to the end of our allotted time. If you have any last comments that you'd like to make? I'll just say it's great to have so many people tuning in today, and perhaps watching in the future. There are a number of different issues about governing data, about how to do things securely and privately; there are just so many of these things, and so I think for all of us it's going to be a lifetime learning process to try to handle this. It won't be solved once and be done with; there'll be plenty left for the current middle schoolers when they reach my stage sometime. [01:15:15] This is going to keep happening throughout our lifetimes. OK, next we're going to have to finish up. We appreciate you sharing your expertise with us today, and thanks for being here. OK, thank you all. Yes, thank you, Professor Swire and Dr. Das, and thank you to everyone who attended the program today. I'd like to invite you to join us tomorrow at 1 p.m. Eastern Standard Time; we'll have more talks, including Alison Macrina from the Library Freedom Project, so it'll be a great day two of the symposium. And thank you to all the professors, especially Dr. Cummings and Dr. Das, for presenting earlier and leading these really interesting conversations about privacy and autonomy. I hope everyone has a great rest of the day.