Season 1 – Episode 16 – Privacy and Ethical Considerations for Extended Reality Settings
Discusses privacy and other ethical considerations for extended reality settings.
Daniel Smith: Welcome to On Tech Ethics with CITI Program. Our guest today is Mihaela Popescu, who is a professor of communication studies and the faculty director of the Extended Reality for Learning Lab at California State University, San Bernardino. Mihaela’s research focuses on privacy on sociotechnical platforms and in emerging mediated contexts such as virtual reality, algorithmic media, and human machine communication. Today, we are going to discuss privacy considerations for extended reality settings. Before we get started, I want to quickly note that this podcast is for educational purposes only. It’s not designed to provide legal advice or legal guidance. You should consult with your organization’s attorneys if you have questions or concerns about the relevant laws and regulations that may be discussed in this podcast. In addition, the views expressed in this podcast are solely those of our guest. And on that note, welcome to the podcast, Mihaela.
Mihaela Popescu: Thank you so much, Daniel. It is a pleasure and an honor to be here.
Daniel Smith: I’m excited to learn more about extended reality and how we should be thinking about privacy in this context. But first, can you tell us more about yourself and what you currently focus on at California State University, San Bernardino?
Mihaela Popescu: Well, you said it very nicely. So, I am a professor of media studies. I’m also the faculty director of our VR lab. And our focus right now is to bring together faculty, programmers, and students to try to reimagine how we can teach with these new technologies that are making so many waves in education such as virtual reality or any other emerging technologies. So, we are trying to make sense of them, and we are trying to find pedagogically sound ways to introduce them in the classroom so that students can relate better to content, to each other, and to the world around them.
Daniel Smith: So, we’re here today to talk about extended reality. And I know that term encompasses a few experiences or environments. So, can you start by providing a brief overview of what extended reality entails?
Mihaela Popescu: Sure. So, extended reality is an umbrella term that pertains to various ways of blending the physical world with digital objects to create some kind of cohesive experience for an intended user. And although the abbreviation XR, extended reality, emerged around the turn of the 21st century, I have to emphasize that these are by no means new technologies. In fact, elements of these technologies have been present ever since the 19th century. But as we understand it today, extended reality pertains to technologies such as mixed reality, which is a form of interacting with both digital and physical objects in real time. Augmented reality, which creates a sort of hybrid of physical objects and digital information, and creates a way of interacting with these digital objects. Virtual reality, which entails various forms of digital world creation. And sometimes I hear people including 360-degree video, or spherical video, under this umbrella. And that’s a technology of extending the field of vision by recording in every direction.
Daniel Smith: So, what are some ways in which people are using virtual, augmented, and mixed reality?
Mihaela Popescu: Well, increasingly, people are using these technologies in marketing. Gaming has been around for a while. And of course, VR games are a really booming area in the industry. Social media. So, for example, VRChat is an application that enables people to inhabit these virtual worlds by means of an avatar. Increasingly, museums are finding ways to redefine the idea of a museum tour as a virtual experience. And obviously in education. So, VR simulations are extremely powerful in teaching learners how to interact, how to talk, how to observe various things that they might not be able to observe. And so, VR simulations are used in a variety of industries, like aviation, for example, or police training.
Daniel Smith: That’s really interesting. So, I know these extended reality settings can raise some unique ethical considerations related to things like user identity and privacy. So, can you also talk a bit about some of these issues that people should be aware of?
Mihaela Popescu: Oh, this is a really nice question that plays right into my area of research. Right now, I’m a privacy scholar. So, I’m fascinated by the many ways in which current technologies, platforms, and applications manage to collect data from people. So, first of all, I think we should keep in mind that XR creates an entirely new ecology of data collection. So, if you think about how these experiences are reaching users, there are mediating devices that enable us to perceive these digital objects. For example, headgear like the Oculus Quest 2. And these mediating devices have their own policy of data collection. So, there is a lot of data that is being collected by the device itself. And then of course the operating system of the device collects data. And then the various gatekeepers, for example, the app store from which you download something, collect data. And of course the app or the game collects data as well.
So, I think it’s very important to think about which companies own all these gatekeepers and to be very careful about the privacy policies of these companies. But let’s talk for a little bit about the head-mounted gear, which is something unique to VR. That raises privacy issues that are perhaps more urgent than in other settings. Because the headgear is actually able to track your eyes, to actually record where you are looking, and sometimes there are other biometrics that are being collected. For example, VR includes forms of spatial computing that have to figure out the position of your body in relation to the space around you and, in the process, actually collect data about how you move, data that’s actually unique to you. And this biometric information is actually very helpful for companies to make all sorts of inferences about what you might like, about your mood, and so on.
So, my fear here is that we don’t really think about privacy in a complete way. So, right now, if we look at regulations such as GDPR, it’s all about what data are being collected and how the data are being managed. But it’s not just about the type of data, the nature of the data being collected, it’s also about what kind of inferences you can make about the person behind the data. So, for example, eye tracking enables us to make a lot of inferences about how we think or what we are interested in and so on. Not to mention that there is also data collected about the context in which I am right now. Which is also extremely important to make inferences about what kind of persuasive messages might work with me right now.
So, this idea of actually reflecting on what could be inferred about the user, not just what can be collected, but what could be inferred about the user by means of these data, and what kind of supplementary information you can add. For example, if you are Facebook, you not only have information now about where the user is looking, but also information about the friends of the user. So, what kind of inferences can we make based on all this information linked by the company that owns the device? So, I think that that’s actually something very sobering to think about when it comes to these new technologies.
Daniel Smith: So, I want to talk a bit about what companies, and researchers, and educators, and also just individual users can do to mitigate some of those issues. But first I was just wondering, are there any other ethical considerations that we should be aware of when interacting in these extended reality settings?
Mihaela Popescu: Sure, there are a few. So, one, and an obvious one that I think a lot of attention has been paid to, is the issue of motion sickness, especially when we are using these technologies in education. If the resolution is not good enough, it could induce some kind of motion sickness in learners, in users. So, it’s important to figure out how to mitigate that. Also, there is the issue of accessibility. Let’s keep in mind that all the technologies that we are talking about right now are technologies of vision. So, what happens if the learner or the user is partially blind or completely blind? What happens if the user doesn’t have sufficient motor control to actually manipulate all the tracking equipment available? So, there is the issue of accessibility. I would also list the issue of impact.
So, one of the things that fascinate people about VR, for example, is the fact that the experiences in VR are very powerful because they trick the brain into believing that the person is right there, embodied in that environment. Well, if that’s the case, then I think it’s a fair question to ask whether these powerful experiences actually have effects on users and what kind of effects we are talking about. Then there is the issue of marketing in VR. So, if indeed it’s true that many companies are collecting information that enables them to find out what the users are thinking about, or what they care about, or their emotions, that means that information is actually very useful in crafting very persuasive messages for the users.
Now, I ask you to imagine receiving ads while you have the headgear on. You can’t look away. You are, in a sense, forced to actually consider the ads, so that in a way you are a captive audience. So, I think that is something to consider, whether it’s entirely ethical to subject users to that kind of captive audience situation. And last but not least, there are various worlds in VR that enable social experiences. And any social experience comes with ethical challenges. Bullying, for example, might be one, or exposure to offensive content. So, as you can see, there is a whole host of ethical issues that we have to take into account when designing experiences for our students.
Daniel Smith: I want to quickly tell you about CITI Program’s podcast On Campus with CITI Program, which explores current issues affecting higher education institutions. You can subscribe to On Campus with CITI Program wherever you listen to podcasts. And now back to the conversation with Mihaela.
So, now I want to talk a bit about what we can do to mitigate some of those issues. But I’m going to break it down into first, what can institutions, and researchers, and educators who are utilizing these technologies in their work, what can they do to mitigate some of these issues for users?
Mihaela Popescu: I know that a common refrain is better education. But I’m going to put that last because I think that it is quite unreasonable to ask users, students, for example, to educate themselves in this area. So, let me go elsewhere. So, first of all, as an institution, I think that it’s very important to look at the procurement process. So, first of all, to curate the kind of technologies that we are adopting in the classroom via instruments such as the Higher Education Community Vendor Assessment Toolkit. It’s a questionnaire that is asked of the vendor or the industry partner. And it actually covers issues such as how do you collect data, how do you store it, with whom are you sharing the data, and so on. So, that is one thing that could be done. Another thing is actually a reevaluation of the kind of contracts we establish with the providers of these technologies.
So, many times institutions are more worried about how their own data assets are being stored or handled rather than the data from students. So, let me give you an example so that it’s clearer. So, for example, in a contract, you could have a clause that says you are not allowed to share university data with a third party. But it’s very unclear what university data represents, because universities collect a whole lot of data from students. So, what is it that you are not sharing? So, it’s not very clear what constitutes university data and what constitutes student data that the university might not collect, but the vendor collects and therefore shares. So, there is this distinction to keep in mind.
Also, I think that universities have to have more meaningful data governance policies. That, to me, again goes back to the idea of what constitutes data that the university is collecting because it must collect it, such as a student’s address, and what is data that the university is collecting because it feels like at some point in the future, maybe it’ll be useful data to find out things about how students learn, for example. And do students have the option of not providing that kind of information?
And last but not least, lobbying the state for better privacy laws that actually apply to education. For example, the California Consumer Privacy Act, which is a very comprehensive privacy act. But it has one big problem: it doesn’t actually apply to universities. So, how about we start admitting that universities actually are consumers of big data, that they are collecting a lot of data, that the technologies used in the classroom are indeed collecting a lot of data, and that we need more meaningful regulation about that. Okay, so after I said all of that, there is also the issue of educating students. And that’s, again, I mean, it seems common sense that universities and K-12 schools should actually have some type of media literacy education for their learners. But if you start thinking where exactly in the curriculum a course like that might enter, it’s not very obvious. And also if you think about who should take that course and at what point in their journey, again, there is a lot of curriculum politics around that. So, I think that one thing that we could be doing better is educating our students about these dangers.
Daniel Smith: So, then also from the other perspectives, as an individual user, is there anything that I should be doing to help address some of these issues?
Mihaela Popescu: Well, I think that the first question I would ask you is to reflect on whether the benefits that you are deriving from the experience of using these technologies are actually worth your privacy risks. And it could be that the answer is yes. So, the problem with this question, though, is that sometimes people don’t have a very good sense of what privacy risks entail. So, I guess one thing that you could do as a user is to inform yourself about what kind of data are really regularly collected. So, I would encourage any user to Google, “How do I find out what Google knows about me?” and have a look at the kind of data that a browser collects and a search engine collects. And then if they still want to use devices that collect even more data, then that’s fine.
And you’ll read things like, “Information may be collected over time and combined with information collected on different websites.” All right, so now I’m starting to get a sense that it’s not just the data I’m providing right now, but also the history of the data collected through other websites. So, again, I might decide that I don’t particularly care about that. But I think it’s always nice to decide knowing exactly what you are subjecting yourself to. Okay, other things that could be done. And here I would admit that these are things that I do and I encourage my students to do, is to lobby the FTC, the Federal Trade Commission, for better consumer laws and antitrust enforcement.
I know that here we are getting into the weeds a little bit. But I think it’s very important that we don’t have companies that are able to collect data in so many ways through so many platforms, thus creating such powerful images of the users. So, I think that a little bit more competition is good. I think that a little bit more privacy would be very welcome. So, we are not powerless, but sometimes we are overwhelmed by the amount of information that we have to digest in order to protect ourselves.
Daniel Smith: Absolutely. So, on that note, are there any resources that are currently available that you would suggest to your students or others on navigating these issues, essentially helping them digest all of that information a bit better?
Mihaela Popescu: Sure. I would suggest two. One would be EDUCAUSE in the education area. EDUCAUSE is a nonprofit organization that brings together educators and industry practitioners. So, it actually has quite a lot of articles about the ethics of using VR, privacy in VR, as well as various things that educators as well as students should take into account when using these technologies. But also privacy advocates, such as the Electronic Frontier Foundation, can be excellent sources of information about emerging technologies.
Daniel Smith: Wonderful. And I’ll certainly include links to both of those resources in our show notes so that our listeners can learn more. And I guess my final question for you today, Mihaela, is do you have any final thoughts you would like to share that we have not already touched on?
Mihaela Popescu: Even knowing all these risks, I think that XR offers unique pedagogical affordances. And sometimes it’s interesting to think, what can I do with VR or XR that I cannot do with anything else? And I guess this is the trajectory that we adopted at CSUSB, the idea of thinking about, especially during the pandemic, what are the things that would be wonderful to have that we cannot have because we can’t interact in the physical space with other human beings? And lots of those things we were able to create via emerging technologies. So, I wouldn’t throw the baby out with the bathwater and say, “Okay, we shouldn’t be using any of these technologies because they are bad for our privacy.” I think it’s important to actually consider when these technologies are essential and when other ways of teaching are actually more appropriate. So, I guess I’ll leave it at that.
Daniel Smith: And I think that is a wonderful place to leave our conversation for today. So, thank you again, Mihaela.
Mihaela Popescu: Thank you, Daniel. It’s been a great pleasure talking to you.
Daniel Smith: And I also invite everyone to visit citiprogram.org to learn more about our courses and webinars on research ethics and compliance. You may be interested in our Technology, Ethics, and Regulations course, which covers various emerging technologies such as wearables and biometrics, and their associated ethical issues and governance approaches. And with that, I look forward to bringing you all more conversations on all things tech ethics.
How to Listen and Subscribe to the Podcast
You can find On Tech Ethics with CITI Program available from several of the most popular podcast services. Subscribe on your favorite platform to receive updates when episodes are newly released. You can also subscribe to this podcast by pasting “https://feeds.buzzsprout.com/2120643.rss” into your podcast app.
- Season 1 – Episode 15: Considerations for Using AI in IRB Operations
- Season 1 – Episode 14: Recent Developments in AI Regulation
- Season 1 – Episode 13: Impact of Generative AI on Research Integrity
- Season 1 – Episode 12: Ethical and Policy Issues for Xenotransplantation Clinical Trials
Meet the Guest
Mihaela Popescu, PhD – California State University, San Bernardino
Mihaela Popescu is a Professor of Digital Media in the Department of Communication Studies at California State University, San Bernardino (CSUSB) and the Faculty Director of CSUSB’s Extended Reality for Learning Lab (xREAL). She holds a PhD in Communication from the University of Pennsylvania.
Meet the Host
Daniel Smith, Associate Director of Content and Education and Host of On Tech Ethics Podcast – CITI Program
As Associate Director of Content and Education at CITI Program, Daniel focuses on developing educational content in areas such as the responsible use of technologies, humane care and use of animals, and environmental health and safety. He received a BA in journalism and technical communication from Colorado State University.