But first, I wanted to welcome everyone. My name is Katherine Manti from the Georgia Tech Library. A few things before we get started. We do have the Q and A open today. If you have questions for Professor Wendy Wong as we go, you're welcome to put questions in the Q and A. We'll have a little bit of time at the end. We've also enabled captioning, which you'll find along the bar at the bottom of your Zoom, to be able to follow along with that. I'll pass it to Dr. Asha Johnson from the Georgia Tech Library to welcome us. Hello. Hello everyone, and welcome. I am Dr. Asha Johnson, Associate Dean for Academic Affairs and Outreach for the Georgia Tech Library. Thank you all for being with us this afternoon. I hope you are as thrilled as I am. So let's introduce Dr. Wong. Dr. Wendy Wong is a professor of political science, focusing her studies on global governance. She is the author of the award-winning books Internal Affairs and, with Sarah Stroup, The Authority Trap. As a scholar, she has penned dozens of peer-reviewed articles and chapters, and her work has appeared in outlets such as the CBC, The Globe and Mail, the Toronto Star, and The Conversation. Dr. Wong has also been awarded grants from the Social Sciences and Humanities Research Council of Canada and the Canadian Institute for Advanced Research, among many other granting agencies. Today, we are honored for Dr. Wong to share and discuss her book, We, the Data: Human Rights in the Digital Age, published by MIT Press in October. We, the Data has been shortlisted for the 2024 Lionel Gelber Prize and was named one of the Journal of Democracy's favorite books of 2023. This work serves, of course, as a rallying call for extending human rights beyond our physical selves, delving into the critical need to reboot rights in our data-intense world and exploring the pervasiveness of data collection and tracking.
Professor Wong reminds us that we all indeed are stakeholders in this digital world, yet many are being left out of crucial conversations around technology, ethics, and policy. Let's all welcome Professor Wong. Thank you so much for that, Dr. Johnson. Let me share my screen. Thank you all for being here. Thanks to Katherine Manti for the wonderful invitation. It's so nice to be speaking virtually at a library; as you'll see later today, to my mind, libraries are a core part of helping us adapt to our data-intense reality, and I'll come to that at the end of my talk. I'm also so honored, actually, to be giving this talk for Love Data Week here at Georgia Tech. And of course we all know Georgia Tech is a leader in developing and thinking about tech issues. It's especially important to be thinking about the intersections of tech and society. When I got the invitation from Katherine, I was thinking about what this idea of Love Data Week means. What is loving data? I thought, yes, I think I do love data. But I think there's a caveat to the way I love data. That's really what we're going to be talking about today, because I think that there are things about digital data, not data in general, but digital data, that enable us to record things about people and their activities through online devices. That should give us all pause and make us think about how that capability really fundamentally changes our lives in terms of how we experience them. I also wanted to reflect a little bit on the theme of my kind of data. I think this is such an important idea to think about because, as I hope will become clear today, the idea of my kind of data is not so simple once we think about it in the context of the interconnected societies in which we live. I really like the way Dr. Johnson set up the talk, and I will do my best to convince you, and maybe give people a more concrete framework to think about what data stakeholdership means.
Those are my provisos. But today I will be talking about a book that was published by MIT Press. It has the same name as this talk, and it's really the case that, although I'm a political scientist who does international relations, I was writing a book for a much more general audience. In part because I think that data issues are so important outside of my own home field, but also because so much has been written by non-political scientists. And in fact, a lot of my ideas draw from outside of the field that I'm most familiar with. I looked a lot at law. I looked at people working in tech fields and what they had to say. I looked at some humanities fields. I looked at the social sciences writ large. I thought about this really through the lens of human rights, society, and global governance, because that's really the material I'm most familiar with and that I've been working on for two decades now, going on two decades. In a time when artificial intelligence, or AI, is a topic that gets headlines at least once a week, I think it's really important to highlight the role of data in society, because it's important to remember that AI needs three things to be useful or to be successful. As a technology, AI needs compute, computing resources. AI needs algorithms to give the commands to use the compute resources. And it needs data. Current models of AI, especially the things that are out there right now, like large language models that do make headlines, need lots and lots of data. That's really a general statement for the type of technologies that AI refers to. Even though this is a book about data, it's actually a book about why data are so central to AI and how that centrality is life-changing for us. The book came around, I have to say, right before the pandemic. Actually, the timing was not great. But I also got to see what it was like for all of us living our lives.
We went from mostly face to face to, all of a sudden, being locked down and going onto virtual platforms like Zoom, which we're on today, becoming a lot more dependent on digital technologies in a very explicit way. It was really a good time to reflect on how our lives become data. In the short time we have together today, I really want to just talk through some of the main takeaways from my book, and I'm really looking forward to our conversation after. I like to start by talking about what Amazon.com, the reality show Shark Tank, and basketball legend Shaquille O'Neal have in common. The thing that they have in common is the Ring doorbell. Now, Amazon owns the Ring company. The product was initially featured on Shark Tank. The fortunes of the company really weren't going that well until, among other things, Shaq decided to become Ring's pitchman. This is an interesting intersection of these popular culture items. I'm sure many people today know what Ring is, or other video doorbells like the Nest, the Arlo, or the Blink. These are things that have become very ubiquitous, very pervasive in our lives. About one in five American households have one of these products. During the pandemic, video doorbells and other remote devices also surged in terms of sales. Video doorbells are a product that allows purchasers to monitor and record the area outside of their doors, or wherever they end up placing these doorbells. They pull in audio and visual recordings. They allow the owners of these devices to speak to visitors and answer the door even when they're not home, or at least to know who's coming and to interact. The thing about Ring is that they offer a bunch of different types of products. You can get the doorbell, but you can also get cameras and lights and hook them all together in a security system for your property. Now, why am I talking about Ring? Why is this a human rights issue?
Increasingly, everyday objects like doorbells are raising important questions about how human life has changed and therefore how human rights have changed or need to change. We need to think about these shifts as our devices become more and more data intensive, and as emerging technologies like AI have come to rely on massive pools of data about people in order to work and function in the ways that we've come to expect. More and more of what I call the mundane activities of our lives become data. I'll talk more about this mundaneness in a few minutes, but I want to put it in the sense that data are being taken from us, about us, often in very unexpected and mundane, or everyday, ways. When we think about how the Ring works and the implications of its audio and visual detection and recording abilities, it really changes the terms of who might be surveilling. I think in this day and age we understand the purpose of some of the closed-circuit TV monitoring that happens in public spaces, often by government. We know that there's a lot of surveillance in semi-public places like malls. Those are places where we might expect surveillance. But Ring products are actually placed on private homes, which face streets that are very much public and often not surveilled. The surveillance is being done by homeowners for private purposes. For those of us who walk by these devices, our voices, our images, our gaits are being caught in these snippets through audio and visual data. These data are available for analysis by those who have the data. Actually, Ring has gotten into some controversy because it's signed agreements with thousands of law enforcement agencies to share Ring camera data, often without the knowledge or consent of the doorbells' owners. In turn, doorbell owners, if they are paying for a subscription, can download video and audio that their device has captured. There are a lot of folks who can actually access these data.
Sometimes, though, I think it's also worth thinking about how the capacities of Ring doorbells can be turned against those who are doing the surveillance. There have been a number of high-profile cases where hackers were able to hijack the camera and the two-way microphone on cameras or on doorbells to harass the elderly, to stalk children and other unsuspecting users, because they were able to hook into the online connectedness of these devices. I also want to speak to the fact that we, as the pedestrians or the drivers who are pretty unlikely to be aware of these devices as we're walking around in our neighborhoods or driving around, are not really consenting to those kinds of activities. However harmless, however mundane, we're not consenting to them being captured. Doorbell owners don't always consent to how the footage from their devices is being shared, because I think practically it is very difficult to do all that consenting and all that acknowledgment given how digital data are used. This is just to point out that the Ring is one of many, countless, what we call smart objects recording and sharing data about us with various entities, be they corporate or government. These entities have an interest in collecting data from people. They are data collectors. This raises several questions that I really want to focus on in the rest of my remarks. First, to what extent are human activities and human existence becoming data? I want us all to think about that and reflect on that. Second, critically, from the rights perspective, how are our rights as human beings being tested and perhaps violated in a social context in which data-intensive devices are pervasive? They are ubiquitous. We are all meeting on an online platform today, and I'm sure in your purses, your bags, on your desks, we've got multiple smart devices these days. We hear a lot, actually, about data about people, or data being taken from people.
And we're often confronted with news about how companies are using data in ways we may not like. We might find it troubling, whether that's tracking our behavior across the internet and having a surprise ad show up because of a random conversation we might have had with a friend, or being part of a massive generative AI system like ChatGPT or the various GPTs that are being created. Think about how we are part of the data being used in these AI systems for training. I think it's really easy to then think, well, we can't help but fall victim to these incidents or be part of this process because of how our lives are structured and organized. Many of us have smartphones. We depend on the Internet for day-to-day activities. We engage with digital technologies whether we like it or not. Whether that's taking public transit, or patronizing stores that use loyalty programs, which are, of course, an old way to track people's behaviors, and some of these stores even use facial recognition technology these days; or having our purchase histories analyzed and tracked through websites and apps; or walking down streets with houses that have digital doorbells and feature digital assistants like Alexa, our lives have become datafied. That is, the behaviors that are revealing of our innermost thoughts and personal activities have become digital data. Another way to think about this is that a big part of our lives has become data. That could mean that we feel like data subjects. This is the first takeaway I'm going to be talking about today. We might have a tendency to feel like we're subjected to this data-making process, that we have very little to say about whether the data collection process happens. I wrote this book because I think it's important to become data stakeholders, to recognize that we are actually all data stakeholders.
To think about how human rights can get us to that point of recognition of our stakeholdership, the first step is to recognize the importance of our role in the creation of data. All of us are data sources, right? And I think it's important to think about this role. It's not a passive role. We don't have to lose to powerful data collectors with names like Google, Amazon, Microsoft, Apple, and Meta. It's not necessarily the case that they get to dictate the terms of how data are made. To be a stakeholder really means making our voices heard. And there's a challenge to that, because there are currently billions of us data sources out there, and there are very few companies; they're very wealthy and they have a lot of power in the AI space. But I think part of the issue is that we're not thinking about data as something intrinsically human. Instead, I think we're often told to think about data as a commodity, something to be bought and sold, even something as valuable as oil. We're still being told that data are something that we can trade. Just this morning, as I was prepping for this talk, I was reading on my Twitter feed that the University of Michigan is licensing data from student speeches. Now, there's no mention of whether people know they're in this dataset or if they've consented to having this material be shared. But there are certainly dollar values attached to these datasets that are on offer. This is, I guess, consistent with the idea that we're the product if we're not paying. But this is very much a market-driven idea of what data about people are. By contrast, we're sometimes also told that data are by-products, that they're not important. In fact, I've often heard people talking about data as dust or detritus or exhaust, certainly not things that I would attach huge dollar values to. Now, what if we didn't just think about data as part of the market? What if we thought about data as part of who we are, part of being human?
Because as our lives become datafied, that's exactly what I think is happening. If we do think about data as being part of who we are, then we can also think about how data can be subject to the globally endorsed framework of human rights. Just to restate, stakeholdership means having skin in the game, but it also means having the entitlement to speak up, because we're all part of the game already. Rather than sitting back and being subject to data collectors' policies on how they make, store, and analyze data about us, we need to start directing the conversation to one where we can talk about the future of data in terms of human rights. Now, the second takeaway is that there are some challenges to this idea of using human rights to think about data. That's because data are sticky. This is a framework I introduce in the book because I think it's a really useful image, right? Data are sticky like gum on the bottom of your shoe. We've all had that unfortunate instance where we step on gum, don't realize it, and grind it into our shoe. And then it's really, really hard to get it off afterwards, when we finally realize what we've done. Now, I think in a similar way, it's actually quite easy to generate data about ourselves because of how we live our lives. But again, it's not so easy to be rid of them. And critically, different from gum, we often don't even know we've stepped in it, that we've made those data. Okay, and that's because data are sticky. Now, before I go into the four reasons why I think data are sticky, especially data about people, I do want to point out that one of the great things about digital data is that they are easily copied and transferred, right? They are easy to share and easy to move around. Those qualities make them very useful for us, but they also make them problematic, and they contribute to this idea of stickiness. The first reason why data are sticky is because the great majority of data from people that are being collected are mundane in nature.
It means they're not remarkable. We often can think about them as not being even all that interesting outside of, perhaps, our own preferences. So think about an app like Strava, which records the length of your runs and the time and all that, and maps it out. I mean, those are data that are important perhaps to us, but really are mundane. They're everyday, right? These are things that you do on a daily basis or on a regular basis, and now you're recording them for yourself. There are all these different things that our devices are collecting: about our locations, what we do, our little behavioral quirks. But the mundaneness of data is also really important because it points out some things you can't change. You can change your jog, but you can't change your face that easily. You can't change the way you walk that easily. You can't change your heart rate or your heartbeat patterns that easily. This is why they're mundane: they're everyday, and they're things that we can't, or don't have the capacity to, easily change. The second reason why data are sticky is because they're linked. This is really just referring back to this idea that data are really easy to copy and transfer. They have value for analysis. I think we all know now that when data are created, they don't just sit in a little spreadsheet or a little database that stays closed. In most circumstances, they're being shared and bought and sold. You can think about positive things that might arise from that, thinking in terms of medical data. Oftentimes, medical professionals wish that data could be shared more so that they could learn more about certain diseases and work towards cures. But of course, there are negative externalities, negative effects, when we do share health data, things that are maybe not directly related to the health of a population or individuals. Health data can be used to deny people insurance, or raise insurance rates, or discriminate against them in other ways.
The third reason why data are sticky is because they're effectively forever once created. Partly that's because we often don't know that they're being created, but it's also actually very hard to verify the deletion of data, because they are so easily copied. I think it's safe to assume that data are effectively immortal as soon as they're created. They last for a long time. The last reason data are sticky is that they're co-created. This is something I talk about quite a bit in the book because it does pose some issues for claims of ownership. I often hear people claiming this is my data, or this is your data, or these data belong to you. But actually, data can only exist if there is a data source, that's all of us, and a data collector, that's someone, a company or a government, that's interested in making those data. This is the co-creation process. Without data sources, you, or data collectors, there would be no data, right? Because data don't exist in nature. This is actually a problem if we think in terms of property rights, for example, about whose data they are. How do we determine to whom those data belong? Is it because they're about you, you're the source, so they're your data? Or are they actually the company's data, because they're the ones who are actually collecting information about what you're doing and what you're thinking? That's one way to think about co-creation. I think the other way to think about it is one that forces us to step outside of our individual experience of data collection. I want to draw attention to the fact that data are often actually very collective in nature. Think about explicit examples, like when someone posts a group photo with you in it on social media. Or we can think about the more implicit, background ways that data work. The reason why data are so important to data collectors is because they're able to generalize about people like you.
They make new categories, they make new collectives, from which to sell better products or deliver better services. These data are coming from individuals. We all experience data collection as individuals, but the effects of those data are actually very much collective. They're not just about our experience, but about the experiences of other people who are like us. We probably won't know to what extent the data that are being taken about us are affecting someone else's experiences. When data about us are created, it's important to note that they're not created in isolation and that they have both collective and individual implications. Now I'm going to turn to how human rights can help, because human rights is another framework where we have to think about individuals and collectives. Let me get some water before I start that. We often hear the term human rights. I think it's one of those big, broad ideas that has gotten so much traction in the last 75 to 80 years. I want to point out: what are human rights? How do we think about them? The political scientist Jack Donnelly, who has written one of the most important textbooks out there about human rights, has argued that human rights are not about what humans are right now; the point of human rights is to think about human beings and what they might become. In that sense, we're thinking about human potential. Human rights are safeguarding a potential about human life. Now, some of you might be familiar with the story of international human rights in the 20th century. Let me just quickly go through the highlights here. After World War Two, the UN Charter established human rights as one of the important pillars of how the international sphere was going to operate post-war. There were a bunch of negotiations after the end of the war. These culminated in what's called the Universal Declaration of Human Rights, or UDHR for short.
In 1948, through the years of negotiation, state delegates arrived at 30 articles they saw as key to the establishment of international human rights. They agreed that those rights are universal, unconditional, and interdependent. Now, I want to say here that some people think about ethics and human rights in the same breath. I think they're related, but they're different in an important way, because human rights are entitlements that anyone has, and states are responsible for providing or protecting human rights. That's what makes human rights different from the broader idea of ethics or ethical behavior, which is about morality. Now, some examples of human rights are freedom of expression and freedom from torture. But perhaps the ones that we pay less attention to here in North America are the right to education, the right to choose a marriage partner, the right to a fair trial. Since 1948, there have been, at the UN level, the United Nations level, nine main human rights treaties that have been signed and agreed to by states. If you've heard of the Genocide Convention, or the Convention against Torture, or the International Covenant on Civil and Political Rights, these are some of those. But of course, there are more legal documents and other guides at the UN, but also in regional organizations like the EU or the Organization of American States, the OAS. Okay, there are a lot of human rights out there, right? After telling you all that, I'm going to say that in our datafied world, we want to think about what all these rights are actually doing, what they're trying to tell us about human life. On this slide I have what's called the portico of human rights. After the negotiations over these different rights happened in the 1940s, even the framers of the UDHR wondered, well, how do you fit all these rights together? Because you're talking about education and you're talking about choosing a marriage partner.
And then all of a sudden you're also talking about privacy. What holds these things together, other than that we think they're important? The French diplomat and lawyer Rene Cassin, who later won the Nobel Prize, came up with this idea of the portico of a Greek temple: you have all these rights that can be organized into different pillars of kinds of rights, as you can see on the slide. You have the roof that they're holding up, which is like the duties that we have to one another and why human rights are important for upholding those ideas of a common humanity. And you have the steps that lead up to the pillars, right? This is how we get to these ideas around certain rights. I just want to point us to the foundation here, right? The entire edifice, the entire building, actually rests on four values, and they were called dignity, liberty, equality, and brotherhood at the time of the UDHR. Now, I think these values are really where we need to focus our efforts in thinking about datafication and human rights. Because there have been arguments around how privacy changes, how freedom of expression changes, as a result of AI or as a result of big data. But I really think it's important to think about why we care about freedom of expression, why we care about privacy. And so I think these values are actually what help us understand that better, right? Insofar as thinking about these very important values, I think that dignity is about the worth of a person. Dignity is how people feel as having worth and also how we treat each other as though we have worth. The idea of equality is really important. It speaks to a desire to be treated without discrimination, and that we're all being held to the same baseline, not arbitrary baselines in accordance with individual characteristics. In the book, I talk not about liberty but about autonomy, because I think autonomy is a very important idea.
If liberty is about being able to act, if liberty means not having restraints, I think the idea of autonomy is about agency. It's about actually acting on a lack of restraint, to actually be able to make choices and act on them freely. Autonomy, I think, takes liberty one more step. Finally, I updated brotherhood for the 21st century to think about it in terms of community. Right? Community is really what brotherhood is about. It's about being part of a group. It's about membership in a common society. When we think about dignity and equality and autonomy and community, these things haven't changed. I think a lot of our debates around social media, or about what's going to happen with deepfakes, are all about these four values. I think really what we need to recognize is that when the framers of human rights in the 1940s, the UDHR framers, were writing that document, they were really thinking about physical abuses and physical restraints on human potential. We have to think about how the digital changes that. We have to think about how the digital makes things better or worse in terms of encouraging human flourishing. But we have to embrace the fact that the human is physical and digital; they're not separate. I think that's really a key thing to acknowledge. Now, I also want to spend some time thinking about big tech as governor. I can talk more about this in the Q and A, and I have a special interest in thinking about non-state actors as governors, governing in the world. But let me just start off with what big tech means, right? When we talk about big tech, I think we usually mean that they're big because they are rich, and they're historically wealthy. That's what gives them power. I think that's correct, in a certain way. Money is a way to exercise power, but it's just one way.
And I actually think that, given what big tech is giving us, given the types of technologies they're developing and their role as data collectors, there's far more than economic power at work here. It's not just economics: they're also governing in ways we're used to thinking about states or governments doing. If we think about governing from a political science perspective, we're talking about creating order and shared expectations. That's what governing is all about. That means setting rules and finding ways to enforce those rules. In our day and age, we do expect governments to govern. It's far less expected to see other types of entities doing that work. But I think more and more we cannot dismiss the fact that big tech is governing. They are creating and controlling the digital platforms through which we access services and create social and political connections. They determine how those platforms work. They determine what data they collect, and when and how they use those data. These data are underpinning many AI functions, for sure, but also non-AI computing functions that have value in human societies in terms of creating order around shared expectations. I'll just give you an example from the book. I talk a lot about Meta's Oversight Board. Meta started this board when they were still called Facebook. Basically, they took it upon themselves to recognize the importance of the human right of freedom of expression, and they actually refer to the international law around it. They made it a goal to use freedom of expression analysis to think about what types of content to either keep on their platforms, Facebook and Instagram, or to remove. This is one way where billions of people are affected by the decisions of one company, because the reach of Meta's total set of products, which are not just Instagram and Facebook but also Messenger and WhatsApp, is three to four billion people, depending on who's counting.
I think that is incredible reach. There's no government on Earth that can credibly claim that it governs three to four billion people. When we think about the reach of some of these technologies and the self-designated governing roles that they're playing, I think it says an awful lot about how we need to expand our way of thinking about big tech to go beyond the financial. I want to close by encouraging all of us to think about data literacy as a human right. Data literacy is a set of skills and concepts that I think will be key to creating data stakeholders; here we're circling back to the first takeaway. I argue in the book that data literacy should actually be a part of how we think about the universal right to education. That's because human life will continue to be data centric, as it is already. As such, we have to start thinking about how learning about data, making data, using data, and the implications of creating digital data have become core human experiences. If that's the case, then declaring a human right to data literacy creates a universal entitlement, which allows us to right the imbalance that currently exists between data sources and data collectors in terms of how data are used and perhaps exploited. In our world today, data literacy is about both giving individuals skills in order to function in a data-driven society and creating societies and communities that are collectively informed about data, that can at least understand the conceptual leap from things existing in nature, to things existing as datasets, to things existing as datasets in a digital format. Think here about what it means to realize a right to data literacy. We're not saying everyone should be a data scientist. I think we're saying people need to have basic ideas around the premises of data creation: why our assumptions about the world matter when we make data, and how our choices of data sources matter for how and what we learn from the data.
This involves curricular changes, which I don't want to talk about right now. It involves community organizations that have run pilots on how to teach people about data. But this part is about libraries. This is about reconceptualizing, and I think correspondingly funding, libraries to be stewards of data and data literacy. Libraries are foundational in giving people linguistic literacy skills; they're key parts of literacy campaigns for reading. There's no reason we can't think of new ways to fund libraries as a wide basis for bringing about data literacy. Think about what libraries are for: they gather information, organize it, tag it, and file it, such that when I, as a patron, come in, I can find what I need. Librarians are often very informed about their collections; they are often shortcuts to understanding what's possible to learn in a given library. They are data creators and data collectors in a very direct sense. I want to end with some thoughts about what it means to treat data from people as data from people. We need to really recognize that data from people are from people. These data are taken from living beings, and they should be treated accordingly. I think that's a really important conceptual shift. Data don't come from nowhere; they come from someone. They're made in a co-creative process that we currently don't fully acknowledge. This co-creative process is laden with political, social, cultural, and economic concerns. We also have to think about the power in that process and why it's critical to understanding, in some ways, the current state of affairs with AI and AI governance. When we think about all that, a lot of times people talk about privacy. I don't think privacy is enough, because privacy implies that something already exists and you're doing something to keep others from accessing it. Good, right? You're keeping someone from accessing data. 
But if you follow the argument I've made today, if you've already collected the data and then you're promising to keep someone's privacy, that's actually quite different from not collecting the data at all because of autonomy or dignity concerns. Another thing I hear a lot of people talking about is consent. I have a lot to say about this; let me just briefly run through the idea. As a human subjects researcher, someone who talks to people as part of my work, I have to seek consent all the time from participants, whether it's interview data or survey research data. We're forced as researchers to think about the processes and uses of the data we are trying to collect about people. We go through very rigorous processes. We are reviewed by multiple bodies within the university, and we have to think of ways that data subjects can appropriately give consent and take that consent away, as well as how we manage the data once we collect them and how we destroy them. This is not a process I see replicated at all outside the university context. I think this is a problem, because nowadays it's not just social scientists and medical professionals who are dealing with human subjects; it's all these different companies creating AI products and data-intensive technologies. There is something to be said about the limits of consent, but also about the rigor behind the consent processes we've gotten used to as researchers, and what that means for the digital age. To conclude, I don't think human rights are a silver bullet; the challenges of datafication for humanity are broader than that. But if we bring the values of autonomy, community, dignity, and equality into our ways of thinking about data, I think it will get us down a path of really valuing human potential in ways that differ from how we currently talk about data. Thank you for listening, and I'll end my remarks there. Thank you so much, Professor Wong. 
That was absolutely informative, and an amazing approach to data, data literacy, and AI as we change even how we perceive information in the 21st century.