Welcome everyone to another week of the cybersecurity lecture series here at the School of Cybersecurity and Privacy at Georgia Tech. Thank you for joining us this week. I'm very excited; we're going to have a different kind of talk today, one that I've been looking forward to for a while. I'm very glad to introduce Blake Brannon. He is the Chief Strategy Officer at OneTrust, and before he was Chief Strategy Officer, he was the very first CTO at OneTrust. And even better than all of that incredible success, he's a Georgia Tech alum, right? He got his bachelor's degree here and his master's degree here, both at Tech. So with that, I am very happy to hand it over to him. Thank you so much.

Yeah, thanks for having me. I'm very excited about this, and very privileged to be here. We did not have facilities as awesome as you guys do when I was going through here. I did a Master of Science in Electrical and Computer Engineering, graduated back in '08, but really quickly latched on to enterprise software. I did a lot of autonomous vehicle safety communication research when I was in grad school here, so that's how I ended up where I am. I tell people, Georgia Tech is what put me in Atlanta, and I never left. And OneTrust is right up the road here, founded by Kabir Barday, a Georgia Tech grad as well. We've been on a pretty awesome journey.

I'll say a little bit about the market that we're in as I get through the content today. To set the stage for the core of the presentation, I'll explain what this market is and what has been happening, and then I want to talk about what is going to be really exciting for us as researchers, academics, and commercial businesses to empower what I would call the next generation, the next decade-plus, of innovation. I'll give some context as we go.

So, to set the stage: what is it that you notice about all these companies? The largest taxi company in the world doesn't have any cars. The largest media company doesn't own any content. The largest retailers don't have inventory. The largest accommodation provider doesn't have hotels. So what do they have? When they say largest, that typically means capital, the value of the company. What do they have, if they don't have the typical tangible stuff that we think makes up value? Go ahead. Data, their data, exactly. And they are good at using that data, right? That's why software and tech companies are where so much of the value in the world comes from: they know how to use data. And I think a lot of the innovation and cool stuff that's going to happen, like how we're going to get to Mars, how we're going to terraform planets, is going to be powered by our understanding and use of data and bringing these things together.

I was even talking earlier this week with Medtronic, the medical device company. Well, if you're familiar with Medtronic, they're not a medical device company anymore; they are a health technology company. And that shows the emphasis of an over-100-year-old industry: if you are not a data and software company, you are going to die. That's how the world and the industry think. So that sets the stage. Now, here's another interesting thing that's happening. How many of you have gotten to see these beautiful prompts on your phone, right?
So how is it that, if data is all this powerful and the most valuable companies in the world are the ones that use data, this prompt can cause, poof, multiple billions of dollars of market cap gone over a couple of pixels? What is happening? What's causing this? What's fueling it? Let's break all this up. That is the essence behind OneTrust and our story, which is, we're the fastest growing software company in history. We went from zero to $150 million of revenue faster than a lot of companies you've probably heard of: MongoDB, Slack, Okta, Snowflake. That's what put us on the Inc. 500 last year, ranked as the fastest growing company.

So how did we get here? Let's back it up and explain what's been happening. How does that just happen? Many of you have probably heard luminaries say, and even started to realize yourselves over the past couple of years: if I am using products and services and I'm not having to pay for them, how is this free for me? What is this? Well, it's because you are the product. Your data is the product, because I can take that data and use it for interesting things, and that makes it valuable.

And what have we seen over the past two, three, four years? I've started to lose track, so much is going by. You see this stuff, right, consent pop-ups everywhere. Why are you seeing all this? Is this not annoying as heck, accept, accept, accept? All of that comes from the value of using that data, and we're starting to say, wait a minute, that's my data, and you're making all this money off of it. In some cases I'll say, okay, cool, I'll pay with my data; I'm fine with that. What's the impact? That's fine. I'm not saying one way is right or wrong. But at the end of the day, there is this kind of scale to think about for the use of that data and what you're doing with it. There's "that's cool," "that's creepy," and "oh my God, don't ever do that," right? And these lines are all contextual. So there's this kind of slope, a kind of "whoa, that's way too creepy, can't do that," for uses of data, while some things are okay and fine to do. And as technology becomes more and more invasive, more and more connected, more and more intelligent, what has happened, and what it has caused in society, is that this creepy curve is driving the question of how we think about the relevance and importance of, not the invasiveness of technology, but the protection of the data and the use of the data. And the relevance of data protection, how that ties into the policies that drive countries, the policies that drive companies, the ethics and values that drive companies, all of that is at an all-time high.

So what has been happening on the regulatory front is something I want to walk you through; let's unpack what's happened over the past couple of years. Well, it all started, and I'm going to say it all started here, even though privacy is not something new, it goes back centuries. Does anybody have a clue who this man is? Of course, Snowden, the PRISM program, right, the famous whistleblower who exposed it. What that effectively netted out to was the invalidation of a program called Safe Harbor. Safe Harbor was a mechanism for data, just like the economic trade agreements that let countries buy and sell from each other.
There was a program in place between two big economic powers, the US and Europe, that said: we want the free flow of data to enable commerce and trade. And the mechanism in place at a country level to protect that was Safe Harbor. When the Snowden incident came out, the courts in Europe invalidated Safe Harbor, because they said, whoa, we can't trust our data going over there if the US government is just looking at all of it. I mean, that's the simple version of it. So what did they do? Well, just like every policymaker and lawmaker, they came out with something else. They said, great, let's come out with Privacy Shield. Privacy Shield was better than Safe Harbor; that came out in 2016. The EU also passed a regulation across all of the EU called GDPR. Then, lo and behold, not too surprisingly, they ended up invalidating Privacy Shield as well. So as of even today, there is no government-to-government agreement enabling the free trade of data between Europe and America. There are other ways to do it; most companies rely on standard contractual clauses. But the blanket mechanism has been invalidated.

I give you all that background because it basically kicked off a wave around the world of all kinds of regulations and data protection policies. If you think about it, it's kind of interesting: just as nations would put up security and borders, this is the digital version of that. How do I think about my citizens' data, how we need to govern it, how we need to control other companies from using and abusing that data? So you have Brazil's LGPD. You've got POPIA in South Africa. You've got PIPL in China. Even in the US, in the absence of a federal law, you have lots of states introducing new laws and regulations.

And it's not just data protection and privacy regulations. There are also things that have taken off from that same set of events that are more traditionally, let's say, cybersecurity or compliance-type regulations: the White House coming out saying a software bill of materials has to accompany every piece of critical infrastructure, what chips are in your camera systems, et cetera, et cetera. I won't go into all of it, but there are regulations and laws around things like that, and even around things that are not strictly privacy, data protection, or security, like speak-up cultures and enabling whistleblowing. There's a law actually going into force in the next twelve months in Europe that requires every company of over 250 people to have a whistleblowing hotline and a whistleblowing system in place that enables a speak-up culture: people who say "we're doing something wrong here" have to have an avenue to speak up. That's what's driving all this transformation and change.

Now, one thing I want to set the stage for is that there is a difference here. These are all data protection regulations, basically, but there's a difference between data security and data privacy. Data security, simply put, is how I prevent someone who shouldn't have access from having access, using things like encryption and access control. Data privacy is governing the use of the data. And I want to unpack that, because it's really fascinating when you dive into it.
It goes from policy and culture all the way down to what you do cryptographically. The challenge with data privacy, the unique thing that's very different, is that some things are bad to do and some things are good to do, and here's the wrench in the engine. Let me give you a use case first. Facial recognition on your phone, to unlock your phone: thank goodness. I can't imagine having to sit there and type in a code to unlock my phone. Awesome. The exact same algorithm, the data, the science, the encryption, all of it behind it, is perfectly fine and safe to use there. Now I take that exact same tech, that exact same science, and use it to track someone going through a public space. I start to correlate and say, well, you came by this area two or three times at these timestamps, and you were near this person here, here, and here, so I'm going to associate you two as being friends. And I know that this person is associated with this political party, so there's a good chance of a connection there. This person goes to this place, whatever; you start to build this profile. And society says, are you insane? Absolutely not. This is not okay. You cannot do this. That's the difference between privacy and security, made pretty clear: same tech, same science, but one use is fine and the other isn't.

And here's the twist. Unfortunately, what the world thinks is okay and not okay to do is different here than it is in Europe, than it is in Asia. In fact, sometimes what's illegal to do in one country is illegal not to do in another country. So context is really, really key. But if you boil it down, there are two fundamental takeaways I want you to walk away with from all those laws and regulations you saw happening.

Number one: data is owned by the individual. The weight of all of these things is really driving companies everywhere to say: your personal data is yours. Just like a house you own is your property, your personal data is your property, and you have rights to that property. You have rights like accessing that information from someone that's using it, rights to request that they stop processing it, rights to request that they delete it, forget it, and never use it again. If you want to have fun with this, here's an example: go to Starbucks's website and ask for a copy of your personal data record. Every shot of vanilla syrup, every whipped cream Frappuccino, what street corner you ordered it on, you'll get a copy of that. Creepy and cool at the same time.

The second thing I want you to walk away with is that data has to be processed in the context of a specific purpose, and that purpose has to be limited. What that means is, when I exchange and give you my data, there's a relationship, a contract there, that says: I'm giving you this to use for this purpose, and I need that to be bounded. You can't freely do whatever you want with it, and you can't just change the purpose. So those are the two things that are new and interesting.
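To make those two takeaways concrete, here is a minimal Python sketch of what purpose-bounded processing can look like. The class, field, and purpose names are illustrative assumptions, not any particular product's API:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    # The individual owns the data; consent is granted per purpose, not globally.
    subject_id: str
    purposes: set = field(default_factory=set)

    def grant(self, purpose: str):
        self.purposes.add(purpose)

    def revoke(self, purpose: str):
        # The "right to stop processing": revoking closes off future use.
        self.purposes.discard(purpose)

def process(record: ConsentRecord, purpose: str, action):
    # Processing is bounded: no consented purpose, no processing.
    if purpose not in record.purposes:
        raise PermissionError(f"{record.subject_id} has not consented to '{purpose}'")
    return action()

# Usage: consent given for order fulfillment does not cover ad targeting.
alice = ConsentRecord("alice")
alice.grant("order_fulfillment")
process(alice, "order_fulfillment", lambda: "ship the package")   # allowed
# process(alice, "ad_targeting", lambda: ...)  # -> raises PermissionError
```

The point of the sketch is only the shape of the check: the purpose travels with every use of the data, rather than being a one-time legal formality.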
Now, this almost puts things at odds inside organizations and companies. You've got researchers, data scientists, and sharp engineers wanting to do super innovative new things: how do we move fast, how do we innovate? And you've got the other side saying: you can't do this; there's policy, protection, security. Are you going to slow yourself down, slow innovation down? Can't do that. So how do we fix it? Well, that's where we all come in: how can we use technology, things like privacy-enhancing tech, to actually enable more use of data, but in a way that is safe?

Think about how the profession and the programs that have driven data protection and data policy historically solved this. The way you solved "how can we use this data in this new way?" was you went to the legal team at your company and said, hey, we want to do this; can you just add something to the contracts, to the terms of service? You know, those terms of service everyone gets; I mean, people read all of them, right? There, you're a hero, get a star. That's how you solved it: now we're off to the races, we can do whatever we want. But that is evolving, because with GDPR and these regulations, you can't do that anymore; it's actually illegal now, especially for certain categories of data. The regulations say consent has to be humanly interpretable: it can't be lawyer-speak, it has to be something the common person can understand, and it has to be explicitly given, out of the context of other things. There are even rules like: an employee can't legally give consent to an employer, because there's an imbalance of power, right? So there's all this stuff that says you can't just do this; it's illegal, and your use of that data is gone.

What that has done is drive these companies to say: well, we can't just take all the data, throw it all together, use it for whatever we want, and have fun. We have to start thinking about governing it. We have to really think about the organization, the process, and things like that. And that's what gave life to OneTrust: we build tools and software to help you operationally organize all those workstreams and activities that you have to do.

But where the market is evolving is very similar to security programs 40, 50 years ago, when you had the firewall at the edge of the network and you did whatever you needed to do inside. There wasn't a lot of governing of everything that happened inside, but you were protected because you had the firewall on the outside. Fast-forward, and now you have everything from employee training to phishing simulation attacks, telemetry and sensors all over the place to detect anomalies and threats, for prevention and detection in the network. The equivalent of that in privacy is: how do I evolve to that level of maturity and sophistication, where I'm not asking the lawyers "how are we using this data, what's the purpose of processing?" but instead building technology into the fabric of the data and into the fabric of the systems, to detect new uses of data, to detect changes in the use of data, and these types of things.
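As a toy illustration of that "detect a change of use" idea, my own hypothetical sketch rather than how any specific product implements it: log processing events as they happen, and flag a data-category/purpose pair the first time it appears.

```python
from collections import defaultdict

class UsageMonitor:
    """Toy 'privacy telemetry': flag data-category/purpose pairs never seen before."""
    def __init__(self):
        self.known = defaultdict(set)   # data_category -> purposes observed so far

    def observe(self, system: str, data_category: str, purpose: str) -> bool:
        is_new = purpose not in self.known[data_category]
        if is_new:
            print(f"ALERT: {system} started using {data_category} for '{purpose}'")
        self.known[data_category].add(purpose)
        return is_new

mon = UsageMonitor()
mon.observe("crm", "email", "receipts")           # first sighting -> alert, becomes baseline
mon.observe("crm", "email", "receipts")           # known use, stays quiet
mon.observe("ml-jobs", "email", "lookalike_ads")  # change of use -> alert
```

A real system would sit in the data fabric and infer these events from queries and pipelines rather than explicit calls, but the governing idea is the same: the interesting signal is a new purpose, not just a new access.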
So that's how this whole market is evolving, and there are a couple of really interesting areas of technology evolution and research happening that I think are going to unlock and continue to drive where this market goes. Because of all these regulations, what is it causing companies, feet on the ground, to have to think about and do? It's four things. One, everybody's getting real pressure on how they are using and sharing data, in the sense that you can't just take data and freely share it with people, because that's a privacy violation; and the solution is a lot of interesting privacy-enhancing technologies that are emerging to manage it. Two, data localization, another phenomenon born out of some of those regulations, which say our citizens' data has to reside in our country. Some countries say you can copy it and then transfer it if you have the appropriate controls. Some countries go further; for example, in mainland China today you cannot actually let the data leave the country. So you have to get really creative with things like how you do key management and encryption, and you have to push that to the edge. People think of IoT when they think of edge computing; I think this is going to drive more edge computing than IoT does. You've got all these countries saying our data can never leave our country, or you have to keep a copy of it in our country; a really interesting challenge the world's going through. Three, understanding the flow of your data, so you can put governance policies around it; another interesting one I'll go through. And then the last one: if people give me data, they're giving it to me because they trust me, or they don't. And when you think about that relationship, you have to really think about how you build and maintain the trust of the people giving you that data, and how you allow them to manage and control it. So those are the areas I want to dive into.

First off, privacy-enhancing tech. There are lots of technologies at different layers of the stack being used here, everything from on-device secure enclaves and secure computation environments. You're also seeing this in the cloud, where people are looking for trusted intermediaries and vaults where they can put data between two parties, let that intermediary do the secure computation, and just hand back the results. Some of that ties into MPC, multi-party computation, as well, and I'll go through a use case in a second. And then you've got things like federated learning happening either on devices or in cohorts.

One of the tangible use cases that is a real problem right now is browsers. Let me set the context. When an advertiser, say Procter & Gamble, wants to say, hey, we've got this new toothpaste, we've got this cool new product, they want the world to know. So they put ads on all the stuff you guys see, Nike, whoever it is. What they have historically done is say: we ran these ads, we ran these campaigns; how do we figure out whether that actually closed transactions, that we did this stuff and that many people bought these Air Jordan shoes, et cetera? How do I do that matching and conversion? That has historically relied on things like third-party cookies and app identifiers to do the matching, closing the loop to say: we spent these dollars on these ads, and these worked. Every company in the world that advertises is using that today.
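To see the mechanics, and why regulators care, here is a toy version of that historical matching: both sides derive a shared identifier, commonly a hashed email, and intersect the sets. Everything below is illustrative, and note that hashing a shared identifier is pseudonymization at best; either party can still link individuals, which is exactly the sharing problem being described.

```python
import hashlib

def h(email: str) -> str:
    # Common industry trick: "pseudonymize" by hashing the email before sharing.
    # Hashing alone is NOT anonymization; both sides can still link individuals.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Advertiser's side: who actually bought (from the CRM).
purchasers = {h(e) for e in ["ada@example.com", "grace@example.com"]}

# Ad platform's side: who was shown / clicked the campaign.
clickers = {h(e) for e in ["ada@example.com", "alan@example.com"]}

# Conversion measurement = the raw set intersection.
conversions = purchasers & clickers
print(f"{len(conversions)} conversion(s)")  # 1 (ada)
```

The privacy problem is that computing this intersection naively requires one party to hand its whole list to the other; the MPC-style approaches discussed next aim to get the same count without that handover.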
Now, if you don't know, Google has announced that third-party cookies are going away in Chrome. They tried to do this two years ago; the ad world huddled and pushed back, and Google delayed it. So in 2023 they're going to try again. What that means, just to show the flow, and this is where some of these PETs come into play, is that you have an advertiser, again Procter & Gamble or Nike; you have an ad platform, Amazon, Google, Meta/Facebook, whatever; and you have a publisher, which is just an intermediary. To do that matching, you're having to take personal data: here are the people that bought, from my CRM system; here are the IDs of the people I showed ads to, or the people on Facebook who clicked the link; and I share that information to do the match. What's the problem with that? To share that information, all of these companies have to give their data to Google and Facebook and Amazon to do the matching. One, that's a violation of the end person's privacy, because under those laws you don't have consent to give that data to Facebook. And two, what else are they doing with that data once you give it to them? That's an interesting question, I'll just say, one way or the other. This is where cryptographic tricks start to become even more relevant in this industry, because you've got things like MPC that already know how to solve this. The flow becomes: how do we have this MPC in the middle that takes the two datasets together and lets us say "this matched, this matched," without actually sharing the underlying information? So that's just one example of how privacy-enhancing technology could come into the real world; that's just ad tech, and there are thousands of other use cases.

Next: every company typically either has, or is on a mission to build, more and more data warehouse and data lake type architectures. You get all your data together, you've got infinite cloud storage, infinite cloud compute. And the problem is that you've now created almost another issue, which is that everybody has access to everything and can do anything they want with it. So how do we start to think about protecting that? Well, that's where things like dynamic masking and differential privacy over the datasets come together. Synthetic data generation: generate equivalent data, don't use the actual data. Even isolation policies. For example, if you go into a Walgreens or CVS, whether you give your name and email address at the pharmacy counter or at the front register while checking out, there are two different laws that govern the use of that data, and they can't blend together. There's isolation that has to happen; there are very specific use cases. So if you flip your head back to your engineering world: a join in the database has to have some control to block it. And here's the problem, and why it's not trivial: context matters. It's not that Blake has a role that gives him access to this table; it's that Blake needs this data for purpose A. For purpose A, Blake needs those Social Security numbers, and that's cool, that's fine, there's nothing wrong with that. But for purpose B, I need to understand what purpose B means, and dynamically say: that gets masked, that gets removed, or I have to apply differential privacy.
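Here is a hypothetical sketch of that purpose-based, dynamic treatment of a column. The policy table, purposes, and field names are made up for illustration; the point is that the gate keys off the purpose, not just the caller's role:

```python
# Hypothetical policy table: purpose -> how each field may leave the warehouse.
POLICY = {
    "fraud_investigation": {"ssn": "clear", "email": "clear"},
    "analytics":           {"ssn": "drop",  "email": "mask"},
}

def mask(value: str) -> str:
    return value[:2] + "***"

def query(rows, purpose):
    # The select/join is gated on purpose, not only on who is asking.
    rules = POLICY[purpose]
    out = []
    for row in rows:
        redacted = {}
        for col, val in row.items():
            rule = rules.get(col, "clear")
            if rule == "clear":
                redacted[col] = val
            elif rule == "mask":
                redacted[col] = mask(val)
            # rule == "drop": column suppressed entirely
        out.append(redacted)
    return out

rows = [{"ssn": "123-45-6789", "email": "ada@example.com"}]
print(query(rows, "fraud_investigation"))  # full values: purpose A is allowed
print(query(rows, "analytics"))            # [{'email': 'ad***'}]: purpose B masked/dropped
```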
And this becomes even more complicated when you start joining multiple datasets together, where you don't have problems in the silos, but you have problems once you combine them. Each dataset on its own isn't a privacy problem. But when you combine this census data over here, this COVID data over here, and my CRM data over here, now I've created a privacy problem, even though independently none of them was one. So you need an almost real-time, mathematical way to ensure you're not creating a new privacy problem at the intersection of all those data points. That one's interesting, and not at all trivial.

Another interesting type of technology emerging here is redaction. The reason I call it out as different from plain masking is that redaction has to become much more contextual than just the category or classification of the data, "this is a phone number, this is a credit card." Remember the two points I made: you own your personal data, and you have to understand the context of the data. That means that in all of these datasets, I have to understand what is Blake's data versus what is yours. If you think about something like an email thread that contains my email address, your email address, my phone number, and your phone number: how do I intelligently redact my information from your dataset, to give you a copy of your data, or to omit your data from the learning algorithm we're using because you didn't give consent for that use? You have to have an identity graph, a people graph if you will, of each individual's data, not just the metadata of names, phone numbers, and home addresses. Redacting data in that very intelligent way is another interesting technology that's evolving.

Those are a couple of privacy-enhancing techs. Switching gears slightly: responsible AI. Going back a little to the facial recognition example, this is another huge emerging area where companies, as well as regulations, are driving more scrutiny over what is and isn't acceptable use of AI and automated decision-making. Companies have been very decentralized in IT: let people build their microservices, do their own things, let it run, right? That now has to get reined back in from a tech perspective, to get a governing arm wrapped around it as an organization. I was talking to the Chief Data Officer at Bank of America the other day: they had no idea how many AIs they have, or where they're using AI across all their different products; they didn't have an inventory of it. And how do you have a language that determines the fairness of the algorithms, the bias in the decision-making? You have to do all these things. So when I think about MLOps, this doesn't fit so much in the developing-the-model, building-the-model part; there's this other element of: how do I understand bias in the algorithm, how do I understand the decision-making that is happening in production? And there's lots of evolution in explainable AI, the tech behind how you white-box or black-box a model, poke holes into it, see if you can drift the decision, these types of things.
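As one tiny, hedged example of what a bias probe can look like, here is a simple selection-rate comparison in the spirit of the four-fifths rule used in employment contexts. The data and threshold are invented, and real responsible-AI tooling goes much further than this:

```python
def selection_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def four_fifths_check(decisions, threshold=0.8):
    # Flag disparate impact: worst group's rate must be >= threshold * best group's.
    rates = selection_rates(decisions)
    lo, hi = min(rates.values()), max(rates.values())
    ratio = lo / hi if hi else 1.0
    return ratio >= threshold, rates

decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
ok, rates = four_fifths_check(decisions)
print(rates, "passes 4/5ths rule:", ok)  # A: 0.67, B: 0.33 -> ratio 0.5, fails
```

Running a check like this continuously over production decisions, rather than once at training time, is one way to catch the drift just described.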
There are also lots of regulatory drivers starting to pop up that are specifically targeted at automated decision-making. Finance has had some of these for some time. But, for example, just last year the City of New York put out an explicit law around employment: you cannot use automated decision-making in the software that governs your employment decision-making process. That means the tools that do things like read resumes, look at conversations back and forth, and build sentiment scores about whether someone might be aggressive: you can't use that as part of the decision-making process. So that kind of tech is being driven by that. A massive one coming out of Europe next year is the AI Act, the AIA. It tries to put AI usage into buckets: absolutely prohibited, like real-time remote biometric identification in public spaces, you will go to jail if you do that; high-risk activities, where you can do this, but you've got to have really strong transparency and governance of the process; and limited uses. Some of the tech evolving here is not too dissimilar to the security tech that governs things like vulnerabilities in ML libraries. But it's things more like: do you have gender bias, do you have diversity bias? Let me look at the training dataset; are there sensitive categories of data in the training dataset that create a legal issue? You have to look at the drift of that algorithm in production and see whether you're drifting. And then there are parts of the responsible AI notion that are non-scientific; you contextually have to assess them. But that's another interesting area where evolution is happening.

The third one: data localization. Keep my data in my country. This is creating, like I said, interesting tech around pushing data to the edge, to be stored or copied locally. In most of these regulations, for example today in India and Russia, there are laws that say you must copy any PII to a server in the country before you are allowed to transfer it outside. So when you are doing things like marketing demand-gen campaigns, you have to have a flow of data that lets you copy it before you can export it. That is the copy-first-and-then-it's-okay model, and it is evolving. I think a lot of the world is actually going to move toward what China just released with PIPL, the regulation in China, which says you can't even copy it out. Then you have to start thinking about the software architecture of your applications, where you need to not just copy the data in the country, but keep it in the country, encrypted in the country, and build your application so the browser almost says: hey, this record has to be pulled from this server here, decrypted directly from that in-country server to the client, as opposed to it all going up to one common cloud. That creates another really interesting challenge.
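Here is a toy model of that keep-it-in-country pattern. A stand-in XOR cipher is used because the point is the routing and key placement, not the cryptography; the regions, keys, and names are all hypothetical:

```python
# Toy model of PIPL-style data residency: each record's ciphertext AND key live
# only on in-region infrastructure; a global app layer routes the client to the
# right region instead of pulling plaintext through a common cloud.

REGIONS = {
    "cn": {"keys": {}, "blobs": {}},   # stands in for servers physically in China
    "eu": {"keys": {}, "blobs": {}},
}

def toy_encrypt(data: bytes, key: int) -> bytes:   # illustration only, not real crypto
    return bytes(b ^ key for b in data)

def store(region: str, record_id: str, data: bytes, key: int):
    REGIONS[region]["keys"][record_id] = key        # the key never leaves the region
    REGIONS[region]["blobs"][record_id] = toy_encrypt(data, key)

def client_fetch(client_region: str, record_region: str, record_id: str) -> bytes:
    if client_region != record_region:
        raise PermissionError("record is pinned to its home region; route the client there")
    r = REGIONS[record_region]
    return toy_encrypt(r["blobs"][record_id], r["keys"][record_id])  # decrypt in-region

store("cn", "user-42", b"name=Li Wei", key=0x5A)
print(client_fetch("cn", "cn", "user-42"))   # served from inside the region
# client_fetch("eu", "cn", "user-42")        # -> PermissionError: no cross-border pull
```

In a real architecture the "route the client there" branch is the interesting part: the application stays global while decryption happens only between the client and the in-country edge.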
Next, the flow of data, understanding your data. The biggest challenge here, and I can tell you this from talking to every one of our 13,000 customers, is that nobody has a clue what their data map looks like, what their flows are. The problem is that it changes all the time, and they don't have the tools to really digest it. So there's this huge evolution: if I have to govern all my data, I can't do that unless I know what data I have. And when someone moves data, spins up a new microservice, or starts transferring, I have to detect that to put governance in place. So how do you keep a pulse on what's going on?

There are a couple of different approaches to this on the technology side. One is companies that are evolving around the data pipeline, or the data fabric; think of these DataOps pipelines. They digest all that information and extrapolate the lineage, the flow. That helps you stay on top of it when someone starts taking data and shoving it into the data lake from some new system, transforming it over here, using it in this new report or this new application. It gives you a pulse on what has happened and how to make sense of it. The other approach I've seen is a little more at the network or microservice level. Think about microservice architectures and dev teams, where you say: everybody divide up and build what you want, we'll put the whole thing in the VPC, and we're off to the races. Against the backdrop of everything I just said, you have transfers of data happening. Well, where are the transfers happening if you've got teams building microservices? You don't have that visibility. So there are technologies evolving, some of them open source projects, that tie in at the mesh layer of the VPC, or at the kernel level or the Kubernetes level of the cluster, to track the data and say: you've got categories of this data moving this way, going from this microservice to that microservice.

The other really interesting problem to solve is: how do I understand the data from the code itself? Can I actually read the code, the same way you would do static code analysis for vulnerability management? But rather than saying "this library is old, it needs to be patched" or "this is a SQL injection," I have to understand the variables in the code: what are they actually storing and using, and how would I classify that data? Here's an HTTP call with that variable going out; is that a transfer of data to some third-party system? Where is that third-party system hosted? Does that mean we transfer data from the US to the EU? How do you read the code and derive "here's the data you have"? There are a couple of companies trying to tackle this, but it's definitely not trivial. It's a really interesting problem to think about.
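As a hint of what reading the code for data flows might look like, here is a deliberately crude sketch using Python's ast module: it flags identifiers that look like personal data flowing into an outbound-call-shaped sink. Real tools need genuine data-flow and type analysis; the name heuristics here are invented for illustration.

```python
import ast

PII_HINTS = {"email", "ssn", "phone", "address"}
SINKS = {"post", "put", "send"}   # crude proxies for outbound HTTP calls

SRC = '''
import requests
def sync(user):
    email = user.email
    requests.post("https://third-party.example/collect", json={"email": email})
'''

class PiiFlowScanner(ast.NodeVisitor):
    def visit_Call(self, node):
        fn = node.func
        name = fn.attr if isinstance(fn, ast.Attribute) else getattr(fn, "id", "")
        if name in SINKS:
            # Flag any PII-looking identifier anywhere in the call's arguments.
            for sub in ast.walk(node):
                if isinstance(sub, ast.Name) and sub.id.lower() in PII_HINTS:
                    print(f"line {node.lineno}: '{sub.id}' may leave via {name}()")
        self.generic_visit(node)

PiiFlowScanner().visit(ast.parse(SRC))
# -> line 5: 'email' may leave via post()
```

From a finding like that, a real tool would then try to resolve where "third-party.example" is hosted to answer the cross-border transfer question.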
The other area that's evolving is really centered around ongoing assurance of what you're doing. Here's the way to think about it. Say you want to go start your own company today: I've thought of a really cool idea, I want to go sell this medical thing to the healthcare industry. The first thing you're going to get hit with is: we need you to have what's called a HIPAA certification. If you're trying to sell to finance it might be a little different, or just generically for security they're going to say: you need to have a SOC 2 Type 2 for your data center, you need to have ISO certification. As a first-time founder you're probably going to be like, what do I do, right? What those are, are certain programs and standards of things you need to be doing so that people can trust you: you have authentication systems in place for your employees, you have endpoint protection on your computers, and, depending on your facility or setup, you have locks on the doors, whatever it is. The way that has always worked is that you manually document everything you did, and then you have these auditors come in and review your information. They look at it and say, yep, you're good, or no, you're bad, go change this. It's a little subjective; they have standards and frameworks, and I'm not knocking the profession, but it is a human thing: I'm looking at a screenshot of your AWS config and I'm saying, yeah, this looks good. Or does it?

That is evolving. What's happening now is that people are saying: we need you to not just do that once a year, we need you to do it more frequently; and I don't need you to certify to just one standard, I need you to have five. Well, that would be like me saying I need you to enroll in 20 classes; you're like, how do I scale this? That's when you start to think about automating. So what has happened is you've got all these technologies now saying: you don't have to go get the screenshot from AWS; I'm going to log into AWS and pull the live config file itself. I'm going to connect to your HR system or your endpoint management system, verify that you've got encryption turned on on your laptops, and produce all that evidence for you, so you can present it to the auditor and streamline the process.

But where all of this is going, on the technology side, is massive investment in how we take that policy of "you should have these things" and put it in code. Then the code becomes the language that talks to the systems and machines to say, "here's how you have to be configured," and you've got the tentacles into all those systems to digest and pull out, "this is how they are configured." It almost creates this notion that instead of getting a certification once a year or once every six months saying I'm doing what I claim, I can get an instantaneous one: right now, let me get a cryptographic signature from AWS, from Workday, from this and this, of your company's config, that I can then validate and say, yep, as of this point in time, you are set up and compliant in a way I can trust. That's how the industry is shifting, and there are a couple of players doing this. Some of these you'll hear called continuous compliance tools; some are called cloud security posture management, a broader angle. At OneTrust we actually acquired a startup in this space called Tugboat Logic, and there are lots of other ones attacking it from a little more of a privacy, compliance, or security angle. It's a fascinating concept, getting that instant, real-time assurance.
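A miniature, hypothetical version of that compliance-as-code idea: controls expressed as executable checks over config pulled from live systems, emitting timestamped, fingerprinted evidence instead of screenshots. The control names and config shape below are made up:

```python
import json, hashlib, datetime

# Controls as code: each check inspects live config instead of a screenshot.
CONTROLS = {
    "s3_encryption_on": lambda cfg: cfg["s3"]["default_encryption"] is True,
    "mfa_required":     lambda cfg: cfg["iam"]["mfa_enforced"] is True,
    "disk_encryption":  lambda cfg: all(d["encrypted"] for d in cfg["endpoints"]),
}

def run_assessment(cfg: dict) -> dict:
    results = {name: check(cfg) for name, check in CONTROLS.items()}
    return {
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
        "results": results,
        # Fingerprint of the exact config assessed, so the evidence is verifiable.
        "config_sha256": hashlib.sha256(
            json.dumps(cfg, sort_keys=True).encode()).hexdigest(),
    }

# Imagine this dict was pulled from the AWS / MDM / HR APIs, not typed by a human.
cfg = {
    "s3": {"default_encryption": True},
    "iam": {"mfa_enforced": True},
    "endpoints": [{"encrypted": True}, {"encrypted": False}],
}
print(json.dumps(run_assessment(cfg), indent=2))  # disk_encryption fails
```

The cryptographic-signature idea in the talk goes one step further: the systems themselves would sign the config snapshot, so the point-in-time evidence can be trusted without re-querying them.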
And the last thing I want to talk about is how all of this boils down to the end person, the person who is ultimately going to give you data. Why do you trust a company enough to buy goods and services from them, to give them your home shipping address? Why do you trust them? Well, there's a lot that goes into that. There's a research group called Edelman that puts out, anyone familiar with it, the Edelman Trust Barometer. They measure trust at a macro level; they do this research and analysis. The takeaway point is: we're at an all-time low across the world. Societal trust is at a massive low across the board; nobody trusts anyone the way they used to. And out of all of it, businesses are actually trusted more than any other entity. People obviously don't trust the government, the media, these things; so businesses are under more pressure than ever to be the ones that pioneer and trailblaze into these next areas of trust.

And the economics behind why you should care about that, why you need to be trustworthy, are getting super clear. This is a study from Boston Consulting Group: companies that have higher trust scores, people give them more data, buy more goods and services from them, and let them use their data for more innovative things. And that impacts the bottom line: they're able to innovate, able to outrun the competition, because they are more trustworthy. The way I distill that down: two decades ago, when you shopped for something or bought something, you thought about the value you were getting, and you weighed that as a ratio of price to quantity, am I getting good bang for my buck? Then all of a sudden we had lots of places to get value, so people started asking, how do we differentiate? Experience. You pay $7 for that latte because of the experience of what you're getting. How many times did you hear the world say the word "experience" through the 2010s? It was all about the engagement, the experience. And the next decade, I think, is going to be differentiated on trust. Case in point: we don't talk about the phone's pixel size anymore, not as much as we did five years ago. We talk about privacy: "Privacy. That's iPhone." Right? So that's trust at the level that's right in your face.

Then there's a huge notion, one, at an organizational level, and two, at a broader level, of how you build trust. One way to think about it is that people are going to trust you if they feel you're being transparent, if they feel in control, and if they feel like they have a relationship with you as an entity. This is actually a segment of the market OneTrust is in. It's the notion of a trust center or trust portal: just like you can go into a healthcare portal, learn all about your healthcare, and have that relationship, how do you have a trust portal with an organization? A place where you feel that the information they have about you, they're transparent about how they're using it; where you have choice and control to change their use of that information; where you can give them more information, get copies of your data, request that they do or not do things, and learn about how they protect it when something happens in the world. And this doesn't have to be just about data. This could be about what you're doing with carbon: you said you're going to be net zero, but where's the data behind it? How do you convey that?
Because if you aren't transparent, if you don't allow that level of transparency, you erode trust. And trust is definitely something you lose in buckets and gain back in drops. The importance of this to a brand is really, really key. Now, one thing that goes beyond "here's the tech and here's what has to happen" is that we need a common language in the world for what it means for you to be using my data for X, because that means something different to every single company and every single person. What has to develop over the next decade, I think, is the equivalent of what we have when we go into a store and pick up a food package: we can digest it and know what's in it. Whether it's healthy or not, who cares; I know what's in the box. We don't have that for data, for the use of data, for transparency of data. So this notion of standards and frameworks, a kind of nutrition label for data, is really key. And what should be on that nutrition label? Deloitte put out research at the end of last year around trust indices: what makes up trust, what are the factors and levers? The text is a little small here, but, of course, I'm not going to trust you if you don't have a good product, those things. Right below that, though, are things like ethics, your culture, your purpose, and the data integrity and data protection themes I've been talking about this whole time; those are at the top of these indices of importance. Things like crime, fraud, and cyber are significantly lower. So this becomes a really key thing for people to know about and feel empowered about, and companies are all rapidly embracing it: how do we continue to be trustworthy as an organization? That is what's going on in privacy and data protection; there's a huge amount happening there. I think we've got time for questions, but thank you guys for listening.

Let's thank our speaker. Quite a wake-up call for all of us on privacy problems. Thank you so much. We have time for questions; anything from the audience? Yes, we've got one all the way in the back.

I just had a question about something you said at the end. You said that in the 2010s a lot of the pricing, like paying $7 for a latte at Starbucks, was because of the experience. How effective are those costs now? Because we've seen a lot of companies raising prices and talking about inflation, but they're also making record profits at the same time. How do we determine what costs are actually from security, from data privacy, stuff like that, and not just greed, or inflation, or macroeconomic factors altogether?

That's a fascinating question. On the context of the 2010s and experience: I think everyone realized, I have to have a good experience or people won't want to buy from me; I have to have a good environment. We saw that even with employees, with the work environment and how it had to change. Now, to your question, how do we think about that in terms of profitability: I think this is a very interesting notion of how capitalism and trust don't have to be at odds with each other. And I don't think it will be very uncommon for us to see a future, pretty quickly,
where, just like you have a 10-K and you're reporting your financial data, you have to report your carbon data, your ESG data, maybe your privacy data to Wall Street to be a publicly traded company, et cetera.

Yeah, but several times a year we find out that corporations are mishandling all of our data or just willfully ignoring cybersecurity standards, and they're making huge sums of money. And then the amount that they actually get punished or fined is minimal: they'll make a billion dollars off data misuse and then get fined $50 million. It's nothing. What's the actual cost associated with doing something bad, or what's the incentive to do the right thing? Because it seems like there isn't one.

Fair point; I don't disagree. But I would say, one, there is a demand out there that says the enforcement activities that have happened to date are not enough, and that is why you have seen the growth of these regulations; they do take time to get out there, be interpreted, and be enforced. I think the spirit of what's fueling them is exactly what you're saying: all of this came into existence because people saw the abuse that was happening and wanted to correct it. I do think, though, that regulation is only one way to punish companies for misconduct. The other is the buying power and the expectations of consumers. You see that in the devaluations of, I won't call out companies, but companies where the CEO had to go. You see the market cap reflect that, and that's not because someone got a fine; it's because the customer base, society, employees, they don't want to work for you because of that, and people don't want to buy those goods and services because of that. Not to say everything's an exact science, but yes.

Well, this is a university, so we think everything is an exact science. Joking. Any other questions?

First, that was a great talk. I think with experience, people have some common sense of what a good experience is. Trust is a lot more abstract and so much more personal. In my own household, I would never do things that my wife or daughter do all the time, and I've not been able to convince them not to do those things. When you move up to trust, it's personal, it's cultural. What hope do we have of arriving at some common sense of trust that would work across lots of companies and people?

So you hit on a couple of topics. As I said as well, trust is something that is massively cultural. If you look at any company that is a global multinational, they have to think about their culture code and their engagement with each market, the employees in that market, everything, at a cultural and geographical level, because it is so unique and cultural. And every company has its own culture; even within the US, even within Atlanta, you have different cultures that you have to align those trust initiatives and outcomes to. So it is super subjective and specific. I was talking to the CISO at Dell Technologies recently, and we were talking about security operations, privacy operations, and how these things come together. Whether it happens or not, I think we both agreed
that there's a path where security becomes more and more standardized and consistent, where everyone in the world runs their security program in a similar, not-so-bespoke way. I don't see that future for things like trust and privacy, because of everything you just hit on.

Awesome, thank you. Actually, I have one from an online viewer, who is asking about privacy tools that may be deployed: some of them are well-intentioned but come with vulnerabilities themselves that may jeopardize data security. How do you vet these things when they're being deployed?

Yeah. That's just like any software or tool or system or people or project: you have to think about the holistic picture of everything. There's security, there's ethics, there's privacy, there's, again, carbon implications: if you're using this, whatever it is, what's the carbon impact of doing that? And the threat vector can come from anywhere; in a lot of cases hopefully it's not intentional, but it can come from anywhere. So to answer the question: there's no easy out. Being in the ethics space or the carbon space doesn't mean you are always a good actor.

Yeah, zero trust, I guess. Zero-trust assumptions, and then build from there.

Absolutely.

One more from the room. In the EU primarily, and also in the US to an extent, as you said, we're seeing a movement toward the idea of privacy belonging to the individual, and you're seeing that to an extent elsewhere too. But thinking particularly of China, I think you're seeing that as secondary, the primary being the idea of data belonging to the government. Do you foresee possibly a future where you end up with blocs, where some countries view data as belonging to individuals and others view data as belonging to society? And if so, how might that develop?

Yeah, I think that's a little bit of the future of now, in some cases. But I do think there is this pendulum that will over-swing, where people put blocks around their borders and walls in the context of data, its use, and its ownership, and I think that pendulum will come back. It always over-swings a little bit before it rotates back to where the world really wants it. I personally feel that that is a reality.

And then one more from online, and then we'll set you free, Blake. This person asks, and this is a good one to bring it back to Georgia Tech: as a young technologist, how do you deal with considering all of these privacy implications when you are building new applications or hardware systems?

Yeah, so this is why organizations like ourselves, and I would say industry groups and communities, exist: to help create frameworks and taxonomies to navigate those questions. Just like anything a little more mature in IT, where there's a NIST framework that you run your program off of from a security perspective, there are privacy and trust frameworks that help you navigate those things, bring them down to "I see you doing this or that," and help you weigh them. So the practice, the standards and policies, and the technology all have to come together to really get to the goal, the end state.

Outstanding. Let's thank our speaker again.