
Gemma Galdon-Clavell: The Legal, Social, and Ethical Impact of Data and Data Technologies (Part One)

Episode Summary

One theme we return to again and again on the Inside Out Security podcast is the tension between law, privacy, and security. When we create new technologies, we want security and privacy, economic prosperity and sustainability, accountability alongside confidentiality. At the same time, we recognize the pressure businesses face to be first to market. The reality is that it is difficult to embed all of these values in one pass, and as technologies get built, it becomes clear which values we hold in higher regard than others.

Episode Notes

I wanted to better understand how to manage our moral and business dilemmas, so I enlisted data and ethics expert Dr. Gemma Galdon-Clavell to speak about her leadership in this space. As founding partner of Eticas Research & Consulting, she navigates this world every day, working with innovators, businesses, and governments who are weighing the ethical and societal ramifications of implementing new technology in our world.

In the first part of our interview, Gemma explains why we get ethics fatigue: unfortunately, those who want to improve our world are consistently told that they're not doing enough. She also gives us great tips on creating products that combine desirability, social acceptability, ethics, and good data management practices.

On Ethics Fatigue

Gemma Galdon-Clavell: My name is Gemma Galdon-Clavell and I work on the legal, social, and ethical impact of data and data technologies. I started a company six years ago that works precisely on this. And so, we're one of the very few companies that have been helping the public and private sectors over the last six years to develop better technologies, but also to better understand how technology has impacted societies. So, taking the point of view of the consumer and the citizen into the design of the technology, and avoiding the bad data practices we see all the time, unfortunately.

Cindy Ng: Welcome, Gemma. What caught my eye was a quote of yours: that if we keep talking about our moral obligations and ethical concerns in technology without offering solutions, people are gonna zone out. We want security and privacy. We want economic prosperity and sustainability. We want safety, but we're not willing to sacrifice our freedoms for it. Can you talk a little bit about ethics fatigue, or what some might also call moral overload?

Gemma Galdon-Clavell: Working in this field, we've been saying what's wrong for a long, long time. But you don't see that many voices out there offering solutions. And it seems like any effort you make is never good enough. And that is really frustrating. That can be really frustrating for someone who has really good intentions and a willingness to improve their practices. So, I think that when you're in academia or just commenting on things, it's easy to take that position. I think it's really good to have people who say what is going wrong, but it's also important that we have ways of defining what it means to do it well, and spaces, organizations, and individuals that help you do it better. And hopefully over time, as a society, we will agree on what kind of compromises we want to make, or whether we wanna make those compromises at all.

But I think that there has to be some ability to improve and not just always be subject to criticism. When you speak out and you're willing to recognize that you have vulnerabilities, if everyone comes down on you, then you will not be motivated to improve your practices. And I think that's what ethics fatigue is. People are like, "Listen, you had my ear for some time. I was willing to do things better, but all you have is just more criticism." If there's no way to improve, then people are just gonna shut off. And I think that's the worst possible outcome. So, I'm hoping that by presenting actual practices and ways of doing things better, and doing things well, we can avoid that scenario.

Recovering from Privacy Mishaps

Cindy Ng: I like what you created with Eticas because you've gone beyond the philosophy of ethics and created a framework and methodology that the public and private sectors can really get behind. And you recognized that your expertise is sought out after a privacy disaster happens. By the time they seek out help, they'll probably have collected all the data and analyzed the data, so the opportunities are already diminished by the time there's an intervention. And you've mentioned in a previous talk that the media already knows the creepy thing you've done, and clients want help publicizing what they're already doing well, not addressing what they've failed tremendously on. Can you speak to some of your experiences when businesses seek out your counsel after a privacy disaster?

Gemma Galdon-Clavell: Sure. I think we're very lucky that at the very beginning of my work, I was asked by actual people with real problems to see how my knowledge and experience could help them improve their understanding and practices. And so, that forced me to become very practical from day one. And initially, when I started working with those actors, I thought, you know, "I'm sure there's gonna be some methodology out there in some book, or people who know a lot more than me who have done this before." And what I found was that there was no methodology that was adequate to assess data and privacy risk. There was nothing structured enough to fit these needs. There was a lot on the impact of technology, on environmental impact assessment, as well as things that were loosely related, things that I could learn from, but not something that I could readily implement in my work.

So, I basically had to design my own methodology, listening to these private and public actors, but also reading a lot of the literature and talking to a lot of people. So, I think that in the end, what we have is a robust way of addressing social, ethical, and privacy risks when developing any data process or data policy. And that's been the outcome of a lot of work by a lot of people, and of getting the best of great minds together. And I think that's what makes our work different from a lot of what you find out there. We actually deal with the actual problems. But there are shortcomings in what we do. I often say that ideally, you would wanna think about the desirability of what you're doing before you actually start conceiving a new project. In the real world, when people come to you, someone out there has already decided that that new product or new policy is desirable. And so, you cannot really have an impact on that part of the process, but then we can look at their data management practices, and their ethics, and their legal compliance, and the desirability issues, and we can incorporate the stakeholders.

So, even if we are contacted later on or after a privacy disaster, there's still a lot that we can do. But of course, in the future, what we would like is for no agency or actor to get into developing anything without having some safeguards in place. If you're developing new chemical materials, you would never think of selling something before it has gone through the appropriate safeguards. If you're developing a new drug, you would not dream of putting it on the market and selling it before you've shown that you've applied the precautionary principle and gone through the relevant agencies to have your initiative validated. How come in engineering, that is not the case? How come all these things are making it to the market with no way for society or the regulator to see what the social or legal impact of those new devices, initiatives, or data processes will be? That is what we're trying to solve.

So, even if we're contacted later on, there's still a lot that we can do. But also, we hope that over time, we'll convince our clients that they need to come to us right at the beginning of developing their new ideas.

The Eticas Methodology

Cindy Ng: Right. And based on some of the analysis you've done, you recommend that businesses not wait for a disaster, but approach you before they build anything. And you found four silver bullets to prevent some of these privacy disasters. You've mentioned them a little bit, and I'd like to go into them a little more. You said: first, desirability, meaning why should we even be creating this new service or product; second, social acceptability; third, ethics; and fourth, data management. Can you go into each of them and talk about why they are significant, and why you consider them to be four major touch points?

Gemma Galdon-Clavell: Sure. I mean, these are the four steps that we think are necessary not only to assess the risks of your product, but also to accompany the process of developing it. So, this goes from the very conception of the project or initiative all the way through implementation. And the first step is to consider the desirability, like, does society need this? We think that in technology, there are a lot of people who come up with new technologies and then look for problems for their technology to solve. And that should not be the case. You should develop technology because you have a problem that you wanna solve. That is the productive, innovative way of doing things.

There's something you wanna solve, like the quality of life of older people, or the quality of life of people with disabilities, or addressing discrimination in a sector or industry. So, you have a problem and then you look for a solution, which may be technological. That's the train of thought you wanna follow, but in technology, it's not always followed. So, we wanna make sure that it is.

But we also wanna make sure that you go beyond having a great idea. I always say that having a great idea is just the beginning. It's only the starting point. You need to plan all the way through implementation. We see so many technologies that are great ideas, but once they're out there, no one has planned for, for instance, training the people who are gonna be using that technology if you're offering it to third parties. If you don't take that into account, the people who are mediating your relationship with the client have no idea what they're doing. And so, of course, the way the client sees that technology is not gonna be as good as if the person in the middle knew what they were talking about and understood the possibilities of that technology. So, are you planning for implementation and not just having the great idea?

So, ideally, in desirability, you would go through all these, making sure you have a thorough plan that actually solves the problem. So, that's the first pillar.

The second pillar is social acceptability. I think we've all learned from design thinking and service design that clients and stakeholders and people are just really important. You don't wanna develop technologies that no one really uses. I mean, think Google Glass. You have this great idea in the lab, but then actually no one thinks it's useful. You wanna avoid spending all that money on a pilot that, in the end, no one's gonna use. And you do that by talking to people. There are methodologies and ways of making sure that you understand your potential client and that you're building trust in your system. You may even go as far as having mechanisms for them to intervene even after the technology is on the market, so you see what their feedback is. So, it's some techniques from marketing, but adapted to this understanding of the social impact of technology.

Then we also wanna look at legal compliance, of course. You need to comply with the law, and here, there are so many industries that have suffered for not doing this. Think about the drone industry, for instance. A few years ago, everyone thought drones were gonna be the next big thing, that your home deliveries were gonna come to you by drone, and companies had to invest a lot of money in this new technology because everyone was gonna be using drones. That didn't happen, because we didn't have a legal framework. If you had come to us five years ago, we could have told you the most likely outcome is that drones get used in some rural areas and for crisis management, but they're not gonna be an everyday thing in the next 10 years. And then you would have saved a lot of money, and you might have invested in another technology that was more promising. So, fulfilling the law is really, really important.

But there are also things that are not part of the law but inform our laws, like social cohesion or trust. So, we also look at how those things are impacted by your technology. What is the impact of your technology or data process on social cohesion and trust between actors? Because these are not things... there are no laws about trust, but it's such an important part of how our societies work. And it's important to place a specific emphasis on that. So, that is the legal pillar of our assessment.

And then finally, data management. I always say that data management is the source of all problems, but also the source of all solutions. So, what we do here is map the data life cycle: for any piece of data that gets into your system, we look at its vulnerabilities and its moments of vulnerability. There are five moments of vulnerability for every piece of data that gets into your system: the moment of collection, the moment of storage, the moment of sharing, the moment of analysis, and the moment of deletion. In those five moments, things can go wrong. And so, what we do is take what we've learned in the other pillars and build specific mitigation measures into your system to make sure that you have encryption mechanisms in place, that you anonymize if that is needed, that you minimize the data you use in your system, and that you have the relevant contracts with your processors to protect your liability in case there's a data breach. So, in the end, we make your system more robust by following these four main pillars of emphasis.
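To make the idea of mapping the data life cycle concrete, here is a minimal sketch of how the five moments of vulnerability could be audited in code. This is an illustration only, not Eticas's actual methodology or tooling; the stage names, the expected mitigations (such as "consent" or "retention_schedule"), and the DataAsset structure are assumptions made for the example.

```python
# Illustrative sketch only: map a data life cycle and flag missing mitigation
# measures at each of the five moments described above (collection, storage,
# sharing, analysis, deletion). All names and checks here are hypothetical.

from dataclasses import dataclass, field
from enum import Enum


class Stage(Enum):
    COLLECTION = "collection"
    STORAGE = "storage"
    SHARING = "sharing"
    ANALYSIS = "analysis"
    DELETION = "deletion"


@dataclass
class DataAsset:
    name: str
    # Mitigations recorded per stage, e.g. {"storage": {"encryption"}}
    mitigations: dict = field(default_factory=dict)


# Example expectations per stage, drawn from the measures mentioned above
# (encryption, anonymization, minimization, processor contracts).
EXPECTED = {
    Stage.COLLECTION: {"minimization", "consent"},
    Stage.STORAGE: {"encryption", "access_control"},
    Stage.SHARING: {"processor_contract", "anonymization"},
    Stage.ANALYSIS: {"purpose_limitation"},
    Stage.DELETION: {"retention_schedule"},
}


def audit(asset: DataAsset) -> dict:
    """Return the mitigations still missing at each life-cycle stage."""
    gaps = {}
    for stage, expected in EXPECTED.items():
        present = set(asset.mitigations.get(stage.value, set()))
        missing = expected - present
        if missing:
            gaps[stage.value] = sorted(missing)
    return gaps


if __name__ == "__main__":
    crm_exports = DataAsset(
        name="crm_exports",
        mitigations={"storage": {"encryption"}, "collection": {"consent"}},
    )
    # Prints the stages where this hypothetical data set still lacks safeguards.
    print(audit(crm_exports))
```

Running the example prints the stages where the hypothetical crm_exports data set still lacks safeguards, which mirrors the kind of gap analysis described above.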