"Technology Can Go Both Ways"
As part of the Global Governance Futures program, 25 young professionals from various fields and five countries come together. Over the course of a year, they work in three groups on scenarios that the world of tomorrow may face. We spoke with GGF fellows Cathleen Berger and Reirui Ri in New Delhi about the topic of their working group: data governance. The group is looking ten years into the future and developing scenarios for dealing with increasing digitalization.
Cathleen Berger is a consultant and political expert who focuses on Internet freedoms, human rights, and foreign policy in the digital age. She has already worked for the Federal Foreign Office and the Stiftung Wissenschaft und Politik (German Institute for International and Security Affairs), among other organizations. She currently works for Mozilla.
Reirui Ri has already worked for Google Asia-Pacific, Google Japan, and the Boston Consulting Group. She is currently a fellow of the Stanford Program in International Legal Studies at Stanford Law School.
Read the interview
Big Data has been described as the "new oil" - once refined properly, it has immense value and greatly empowers the holder. What dangers arise when governments, companies and criminal networks all use the same information source?
Cathleen Berger: Whether I am worried about Big Data depends on how you approach it. Big Data is anonymized and gives you a broader societal, environmental, or political picture. It can be used for a lot of good things: for example, I see many opportunities in technological development for crisis management, climate change, smart cities and their fight against pollution, as well as health analysis. What I am worried about are safeguards, the centralization of data, and the overall implications for the individual. I really hope that databases remain separate and that personal information is treated as privately and confidentially as possible.
As one of the many potential fields of application for Big Data, "predictive policing" has fueled the imagination of many, bearing resemblance to the sci-fi thriller Minority Report. What exactly does predictive policing entail, and what are its benefits and drawbacks?
Berger: What is happening, mainly in US law enforcement, is that algorithms have been developed to determine whether a certain area might be particularly affected by crime. In simple terms, it is about preventing crime before it happens, with the help of an algorithm. I do think this can be very tricky, because many discriminatory aspects can be built into an algorithm - which, after all, depends on the data it is fed. For example, when it comes to policing, if the data is already discriminatory, then this is replicated by the technology. I find this very concerning from a human rights perspective. However, I would strictly separate that from Big Data as such, because it is the interpretation of the data that is flawed here, not the data itself. It has not actually been applied anywhere in Europe, but some law enforcement agencies are pushing for it despite prevailing human rights concerns.
The recent US elections have shown how vulnerable societies have become to cyberattacks. Many governments are trying to close these security holes now. Are there any risks connected to this?
Berger: The biggest problem is that many governments are looking inwards right now. From a data governance perspective, that is a big threat to the global public resource that is the Internet. As soon as governments start taking national measures, putting up legal frameworks, or closing their borders to control the infrastructure, that is a massive threat to the Internet as we know it.
"Technology is a neutral force"
What are some of the other aspects of data governance you are discussing that are currently not dominating the headlines?
Berger: I am working on the problem of access in terms of infrastructure, devices, and the skills to use them. There are many political declarations about getting the next billion people online, but it is often private actors who take action. That can in certain instances lead to infringements on freedom of expression, because there is a market interest in limiting access to certain services. For example, many people in the Global South face the issue that content is simply not available in their languages. Trying to actually be inclusive, and not just talk about it, is something that I feel is currently missing from the discourse.
Reirui Ri: What is also missing is an in-depth debate on consumer choice and data ownership, and on what tools and mechanisms we can employ to achieve it. Another topic that comes to my mind is the self-regulation standards of corporations. The media often report on whether Internet giants delete certain things, but by deletion, do you mean in local domains or global domains? Why do corporations readily delete imagery that infringes upon copyright, but not personal information? These corporate enforcement standards are fundamental to consumers and to Internet rule-making, but they are not covered in the headlines.
What are some of the factors that your Working Group is taking into account when planning scenarios for the next decade of the digital age?
Ri: One of the factors our group is looking at is how governments are continuously losing regulatory and enforcement power, making Internet regulations de facto ineffective. Ultimately, they are becoming less influential, both as actors and within the Internet governance debate. Another trend would be the rise of the new citizens, the next generation of people coming online. They will be actively voicing their needs, both political and economic, and in our working group, we see Internet services changing to accommodate them.
Berger: For us, what is very important in scenario planning is that technology, as a neutral force at the center, can always go both ways. Either it can empower people to regain control of their data, or it can be used against them, rendering them powerless and controlled by forces bigger than themselves. In brief, the ethics of technology usage is a very important factor. Another thing we are looking at is the data economy. With the lack of government enforcement, it is difficult to control investments and market growth, as companies can act faster than regulatory frameworks. How to handle that and how to go forward is a big challenge that strongly influences our scenarios.
Regional Harmonization and Cultural Differences
While data protection legislation is highly fragmented, attempts at harmonization such as the EU’s General Data Protection Regulation (GDPR) are already underway. Does such regional harmonization make sense or do we need a new global treaty protecting the privacy of groups against the potential misuses of Big Data?
Berger: I would love to see privacy respected everywhere, since it actually is part of international human rights law. However, I do not think it is possible to enforce it on an international level. Elements like the GDPR are fantastic developments and I have been using it a lot as a best practice example to advocate for regulatory change in other regions. The African Union, for instance, is also looking into similar efforts and I am happy that this serves as an inspiration and that something is already on the table.
During your discussions and scenario planning, have you noticed cultural differences in regional approaches to data governance?
Ri: I think Asian nations are aware that we are the developing nations in terms of Internet markets and Internet technology, and so I think we have a very different sense of privacy and a very different sense of what Internet regulations should look like. In some countries, there is definitely more room to allow for innovation and more room for data utilization compared to the EU.
Change in perspectives
Both of you are bringing a wealth of expertise to the table. What new impulses have you been able to take away from the GGF sessions?
Ri: What I really gained is the perspective of individuals and civil society, which Cathleen actually represents. Another huge gain is insight into how the Internet operates in China and India, and how people are using technology to grow when technical capacities and infrastructure are limited.
Berger: For me, the corporate view is actually very helpful, just to understand these dynamics better. Because of my experience in the Foreign Service, I have always had a strong focus on foreign policy. Understanding the domestic drivers that many of the fellows have brought to the table, particularly from the Asian region, was extremely interesting, and I have learned a lot about how different civil society actors work, what drives them, and what their motivations are. People share their personal stories and then show you how they see their world. That has been a major takeaway for me.