Wendy Sherman on the United States’ priorities as it takes the helm of the Freedom Online Coalition
US Deputy Secretary of State Wendy Sherman outlined the priorities for the world’s democratic tech alliance, from protecting fundamental freedoms online to building resilience against digital authoritarianism.
Event transcript
Uncorrected transcript: Check against delivery
Introduction
Rose Jackson
Director, Democracy + Tech Initiative, Digital Forensic Research Lab
Opening Remarks
Wendy Sherman
Deputy Secretary of State, US Department of State
Panelists
Boye Adegoke
Senior Manager, Grants and Program Strategy, Paradigm Initiative
Juan Carlos Lara
Executive Director, Derechos Digitales
Alissa Starzak
Vice President, Global Head of Public Policy, Cloudflare
Moderator
Khushbu Shah
Nonresident Fellow, Digital Forensic Research Lab
ROSE JACKSON: Hello. My name is Rose Jackson, and I’m the director of the Democracy + Tech Initiative here at the Atlantic Council in Washington, DC.
I’m honored to welcome you here today for this special event, streaming to you from the middle of the Freedom Online Coalition’s, or the FOC’s, first strategy and coordination meeting of the year.
For those of you watching at home or on many screens elsewhere, I’m joined here in this room by representatives from thirty-one countries and the civil-society and industry leaders who make up the FOC’s advisory network. They’ve just wrapped up the first half of their meeting and wanted to bring some of the conversation from behind closed doors to the community working everywhere to ensure the digital world is a rights-respecting one.
It’s a particularly important moment for us to be having this conversation. As we get ready for the second Summit for Democracy later this week, the world’s reliance and focus on the internet have grown, while agreement [on] how to further build and manage it frays.
I think at this point it’s a bit of a throwaway line that digital tools mediate every aspect of our lives. But the fact that most of the world has no choice but to do business, engage with their governments, or stay connected with friends and family through the internet makes the rules and norms around how that internet functions a matter of great importance. And even more because the internet is systemic and interconnected, whether it is built and imbued with the universal human rights we expect offline will determine whether our societies can rely on those rights anywhere.
Antidemocratic laws have a tendency of getting copied. Troubling norms are established in silence. And a splintering of approach makes it easier for authoritarians to justify their sovereign policies used to shutter dissent, criminalize speech, and surveil everyone. These are the core democratic questions of our time, and ensuring that the digital ecosystem is a rights-respecting one requires democracies [to row] in the same direction in their foreign policy and domestic actions.
The now twelve-year-old FOC, as the world’s only democratic tech alliance, presents an important space for democratic governments to leverage their shared power to this end, in collaboration with civil society and industry around the world.
We were encouraged last year when Secretary of State Antony Blinken announced at our open summit conference in Brussels that the US would take over as chair of the FOC in 2023 as part of its commitment to reinvest in the coalition and its success. And just over an hour ago, the US announced a new executive order limiting its own use of commercial spyware on the basis of risks to US national security and threats to human rights everywhere, which really brings home the stakes and potential of this work.
So today we’re honored to have Deputy Secretary of State Wendy Sherman here to share more about the US government’s commitment to these issues and its plans for the coming year as chair.
We’ll then turn to a panel of civil-society and industry leaders from around the world to hear more about how they view the role and importance of the FOC in taking action on everything from internet shutdowns to surveillance tech and generative AI. That session will be led by our nonresident fellow and the former managing editor of Rest of World Khushbu Shah.
Now, before I turn to the deputy secretary, I want to thank the FOC support unit, the US State Department, and our Democracy and Tech team here for making this event possible. And I encourage you in Zoomland to comment on and engage liberally with the content of today’s event on your favorite social media platforms, following @DFRLab and using the hashtags #SummitforDemocracy, #S4D2, or #PartnersforDemocracy.
For those tuning in remotely in need of closed captioning, please view today’s program on our YouTube channel through the link provided in the chat.
It is now my distinct honor to pass the podium to Deputy Secretary of State Wendy Sherman, who needs no introduction as one of our nation’s most experienced and talented diplomats.
Deputy Secretary, thank you so much for joining us.
WENDY SHERMAN: Good afternoon. It’s terrific to be with you, and thank you, Rose, for your introduction and for all of the terrific work that the Freedom Online Coalition is doing.
It is fitting to be here at the Atlantic Council for this event because your mission sums up our purpose perfectly: shaping the global future together. That is our fundamental charge in the field of technology and democracy: how we use modern innovations to forge a better future.
That’s what the DFRLab strives to achieve, through your research and advocacy, and that’s what the Freedom Online Coalition, its members, observers, and advisory network seek to accomplish through our work. Thank you for your partnership.
More than five decades ago—seems like a long time ago, but really very short—the internet found its origins in the form of the first online message ever sent, all of two letters in length, delivered from a professor at UCLA to colleagues at Stanford. It was part of a project conceived in university labs and facilitated by government. It was an effort meant to test the outer limits of rapidly evolving technologies and tap into the transformative power of swiftly growing computer networks.
What these pioneers intended at the time was actually to devise a system that could allow people to communicate in the event of a nuclear attack or another catastrophic event. Yet what they created changed everything—how we live and work, how we participate in our economy and in our politics, how we organize movements, how we consume media, read books, order groceries, pay bills, run businesses, conduct research, learn, write, and do nearly everything we can think of.
Change didn’t happen overnight, of course, and that change came with both promise and peril. This was a remarkable feat of scientific discovery, and it upended life as we know it for better, and sometimes, worse.
Over the years, as we went from search engines to social media, we started to face complicated questions as leaders, as parents and grandparents, as members of the global community—questions about how the internet can best be used, how it should be governed, who might misuse it, how it impacts our children’s mental and emotional health, who could access it, and how we can ensure that access is equitable—benefitting people in big cities, rural areas, and everywhere in between. Big-picture questions arose about these tectonic shifts. What would they mean for our values and our systems of governance? Whether it’s the internet as we understand it today or artificial intelligence revolutionizing our world tomorrow, will digital tools create more democracy or less? Will they be deployed to maximize human rights or limit them? Will they be used to enlarge the circle of freedom or constrain and contract it?
For the United States, the Freedom Online Coalition, and like-minded partners, the answer should point in a clear direction. At a basic level, the internet should be open and secure for everyone. It should be a force for free enterprise and free expression. It should be a vast forum that increases connectivity, that expands people’s ability to exercise their rights, that facilitates unfettered access to knowledge and unprecedented opportunities for billions.
Meeting that standard, however, is not simple. Change that happens this fast in society and reaches this far into our lives rarely yields a straightforward response, especially when there are those who seek to manipulate technology for nefarious ends. The fact is, where all of us may strive to ensure technology delivers for our citizens, autocratic regimes are finding another means of [repression]. Where democracies seek to tap into the power of the internet to lift individuals up to their highest potential, authoritarian governments seek to deploy these technologies to divide and disenfranchise, to censor and suppress, to limit freedoms, [to] foment fear, and [to] violate human dignity. They view the internet not as a network of empowerment but as an avenue of control. From Cuba and Venezuela to Iran, Russia, the PRC, and beyond, they see new ways to crush dissent through internet shutdowns, virtual blackouts, restricted networks, blocked websites, and more.
Here in the United States, alongside many of you, we have acted to sustain connections to internet-based services and the free flow of information across the globe, so no one is cut off from each other, from the outside world, or from the truth. Yet even with these steps, none of us are perfect. Every day, almost everywhere we look, democracies grapple with how to harness data for positive ends while preserving privacy; how to bring out the best in modern innovations without amplifying their worst possibilities; how to protect the most vulnerable online while defending the liberties we hold dear. It isn’t an easy task, and in many respects, as I’ve said, it’s only getting harder. The growth of surveillance capabilities is forcing us to constantly reevaluate how to strike the balance between using technologies for public safety and preserving personal liberties.
The advent of AI is arriving with a level of speed and sophistication we haven’t witnessed before. It will not be five decades before we know the impact of AI. That impact is happening now. Who creates it, who controls it, [and] who manipulates it will help define the next phase of the intersection between technology and democracy. By the time we realize AI’s massive reach and potential, the internet’s influence might really pale in comparison. The digital sphere is evolving at a pace we can’t fully fathom and in ways at least I can’t completely imagine. Frankly, we have to accept the fact that the FOC’s absolutely vital work can feel like a continuous game of catch-up. We have to acknowledge that the guidelines we adopt today might seem outdated as soon as tomorrow.
Now let me be perfectly clear: I am not saying we should throw up our hands and give up. To the contrary, I’m suggesting that this is a massive challenge we have to confront and a generational change we have to embrace. We have to set standards that meet this moment and that lay the foundation for whatever comes next. We have to address what we see in front of us and equip ourselves with the building blocks to tackle what we cannot predict.
To put a spin on a famous phrase, with the great power of these digital tools comes great responsibility to use that power for good. That duty falls on all our shoulders, and the stakes could not be higher for internet freedom, for our common prosperity, for global progress, because expanded connectivity, getting the two billion unconnected people online, can drive economic growth, raise standards of living, create jobs, and fuel innovative solutions for everything from combating climate change to reducing food insecurity, to improving public health, to promoting good governance and sustainable development.
So we need to double down on what we stand for: an affirmative, cohesive, values-driven, rights-respecting vision for democracy in a digital era. We need to reinforce rules of the road for cyberspace that mirror and match the ideals of the rules-based international order. We need to be ready to adapt our legal and policy approaches for emerging technologies. We need the FOC—alongside partners in civil society, industry, and elsewhere—to remain an essential vehicle for keeping the digital sphere open, secure, interoperable, and reliable.
The United States believes in this cause as a central plank of our democracy and of our diplomacy. That’s why Secretary Blinken established our department’s Bureau of Cyberspace and Digital Policy and made digital freedom one of its core priorities. That’s why the Biden-Harris administration spearheaded and signed onto the principles in the Declaration for the Future of the Internet alongside sixty-one countries ready to advance a positive vision for digital technologies. That’s why we released core principles for tech-platform accountability last fall and why the president called on Congress to take bipartisan action in January.
That’s why we are committed to using our turn as FOC chair as a platform to advance a series of key goals.
First, we will deepen efforts to protect fundamental freedoms, including human rights defenders online and offline, many of whom speak out at grave risk to their own lives and to their families’ safety. We will do so by countering disruptions to internet access, combating internet shutdowns, and ensuring everyone’s ability to keep using technology to advance the reach of freedom.
Second, we will focus on building resilience against the rise of digital authoritarianism, the proliferation of commercial spyware, and the misuse of technology, which we know has disproportionate and chilling impacts on journalists, activists, women, and LGBTQI+ individuals. To that end, just a few hours ago President Biden issued an executive order that for the first time will prohibit our government’s use of commercial spyware that poses a risk to our national security or that’s been misused by foreign actors to enable human rights abuses overseas.
On top of that step, as part of this week’s Summit for Democracy, the members of the FOC and other partners will lay out a set of guiding principles on government use of surveillance technologies. These principles describe responsible practices for the use of surveillance tech. They reflect democratic values and the rule of law, adhere to international obligations, strive to address the disparate effect on certain communities, and minimize the data collected.
Our third objective as FOC chair focuses on artificial intelligence and the way emerging technologies respect human rights. As some try to apply AI to help automate censorship of content and suppression of free expression, FOC members must build a consensus around policies to limit these abuses.
Finally, we will strengthen our efforts on digital inclusion—on closing the gender gap online; on expanding digital literacy and skill-building; on promoting access to safe online spaces and robust civic participation for all, particularly women and girls, LGBTQI+ persons, those with disabilities, and more.
Here’s the bottom line: The FOC’s work is essential and its impact will boil down to what we do as a coalition to advance a simple but powerful idea, preserving and promoting the value of openness. The internet, the Web, the online universe is at its best when it is open for creativity and collaboration, open for innovation and ideas, open for communication and community, debate, discourse, disagreement, and diplomacy.
The same is true for democracy—a system of governance, a social contract, and a societal structure is strongest when defined by open spaces to vote, deliberate, gather, demonstrate, organize, and advocate. This openness could not be more important, because when the digital world is transparent, when democracy is done right, that’s when everyone has a stake in our collective success. That’s what makes everyone strive for a society that is free and fair in our politics and in cyberspace. That’s what will give everyone reason to keep tapping into the positive potential of technology to forge a future of endless possibility and boundless prosperity for all.
So good luck with all your remaining work; lots ahead. And thank you so much for everything that you all do. Thank you.
KHUSHBU SHAH: Hello, everybody. Thank you so much for joining us. I’m Khushbu Shah, a journalist and a nonresident fellow at the Atlantic Council’s DFRLab.
We’re grateful to have these three experts here with us today to discuss rights in the digital world and the Freedom Online Coalition’s role in those rights. I’ll introduce you to these three experts.
This is Adeboye Adegoke, who is the senior manager of grants and program strategy at Paradigm Initiative. We have Alissa Starzak, the vice president and global head of public policy at Cloudflare, and Juan Carlos Lara, known as J.C., who’s the executive director of Derechos Digitales. And so I will mention that both J.C. and Adeboye are also on the FOC’s Advisory Network, which was created as a strong mechanism for ongoing multi-stakeholder engagement.
And so I’ll start with the thirty-thousand-foot view. We’ve just heard about the FOC and its continued mission with the United States at the helm as chair this year in an increasingly interconnected and online world. More than five billion people are online around the world. That’s the majority of people [on] this planet. And we spend nearly half of the time we’re awake online, more than 40 percent of it.
We as a global group of internet users have evolved in our use of the internet, as you’ve heard, since the creation of the FOC in 2011.
So Adeboye, why do you think so many people are now suddenly focused on technology as a key democratic issue? And, speaking, you know, from your own personal experience in Nigeria, should we be?
ADEBOYE ADEGOKE: Yeah. I mean, I think the reasons are very clear, not just [looking out] to any region of the world, but, you know, generally speaking, I mean, the Cambridge Analytica, you know, issue comes to mind.
But also just speaking, you know, very specifically to my experience, to my reality as a Nigerian and as an African: I mean, we just concluded our general elections, and technology was [meant] to play a huge role in ensuring transparency and, you know, the integrity of the elections, an objective which unfortunately wasn’t achieved.
But besides that, there are also a lot of concerns around how technology could be manipulated, or has been manipulated, in order to literally alter potential outcomes of elections. We’re seeing issues of microtargeting; you know, misinformation campaigns around [the] election period to [denigrate], you know, certain candidates.
But what’s even most concerning for me is how technology has been sometimes manipulated to totally alter the outcome of the election. And I’ll give you a very clear example in terms of the just-concluded general elections in Nigeria. So technology was supposed to play a big role. Results were supposed to be transmitted to a central server right from the point of voting. But unfortunately, those results were not transmitted.
In fact, three or four days after the election, 50 percent of the results had not been uploaded. As of the time the election results were announced, less than 50 percent of the results had been transmitted, which then begins to, you know, lead to questioning of the integrity of those outcomes. Results are supposed to be transmitted, like, on the spot. So, you know, it becomes concerning.
The electoral panel [gave] an excuse that there was a technical glitch around, you know, their server and all of that. But then the question is, was there actually a technical glitch, or was there a compromise or a manipulation by certain, you know, bad actors to be able to alter the outcome of the election? [This] used to be the order of the day in many supposedly, you know, democratic countries, especially from the part of the world that I come from, where people really doubt whether what they see as the outcomes of their election is the actual outcome or somebody just writing something that they want.
So technology has become a big issue in elections. On one side, technology has the potential to improve [the] integrity of elections. But on the other side, bad actors also have the tendency to manipulate technology to make sure that the opinions or the wishes of the people do not matter at the end of the day. So that’s very important here.
KHUSHBU SHAH: And you just touched on my next question for Alissa and J.C. So, as you mentioned, digital authoritarians have used tech to abuse human rights and limit internet freedoms. We’re seeing this in Russia, Myanmar, Sudan, and Libya. Those are some examples. [The] deputy secretary mentioned a few others. For example, in early 2022, at the start of its invasion of Ukraine, Russia suppressed domestic dissent by closing or forcing into exile the handful of remaining independent media outlets. In at least fifty-three countries, users have faced legal repercussions for expressing themselves online, often leading to prison terms, according to a report from Freedom House. It’s a trend that leaves people on the frontlines, of course including journalists and activists alike, defenseless.
And so, J.C., what have you seen globally? What are the key issues we must keep an eye on? And what—and what are some practical steps to mitigate some of these issues?
JUAN CARLOS LARA: Yeah. I think it’s difficult to think about the practical steps without first addressing what those issues are. And I think Boye was pointing out basically what has been perceived as a problem by many in the body politic, or even many activists, throughout the world. But I think it’s important to also note that these broader issues about the threats to democracy, about the threats to human rights, [they] manifest sometimes differently. And that includes how they are seen in my region, in Latin America, where, for instance, the way in which you see censorship might differ from country to country.
While some have been able to pass laws, authoritarian laws, that restrict speech and that restrict how expression is represented online and how it’s penalized, some other countries have resorted to the use of existing censorship tools. Like, for instance, some governments [are] using [Digital Millennium Copyright Act] notices and technical mechanisms to delete or to remove some content from the online sphere. So that also becomes a problematic issue.
So when we speak about, like, how do we get into, like, the practical ways to address this, we really need to identify… some low-level practices [that] connect with the higher-level standards that we aspire to for democracies; and how bigger commitments to the rule of law, and to fair elections, and to addressing and facing human rights threats go down to the lower level of what governments are actually doing, what people are actually doing when they are presented with the possibility of exercising some power that can affect the human rights of the population in general. So to summarize a bit of that point, we still see a lot of censorship, surveillance, internet blockings, and also, increasingly, the use of emerging technologies as things that might be threatening to human rights.
And while some of those are not necessarily exclusive to the online sphere, they have certainly been evolving [for] several years. So we really need to address how those are represented today.
KHUSHBU SHAH: Thank you. Alissa, as our industry expert I want to ask you the same question. And especially I want you to maybe touch upon what J.C. was saying about low-level practices that might be practical.
ALISSA STARZAK: You know, I think I actually want to step back and think about all of this, because I think one of the challenges that we’ve seen, and we certainly heard this in Deputy Secretary Sherman’s remarks, is that technology brings opportunities and risks. And some of the challenges, I think, that we’ve touched on are part of the benefit that we saw initially. So the drawback that comes from having broad access is that it can be cut off.
And I think that as we go forward, thinking about the Freedom Online Coalition and sort of how this all fits together, the idea is to have conversations about what it looks like long term, what the drawbacks are that come from those low-level practices, making sure that there is an opportunity for activists to bring up the things that are coming up, and for industry, sort of folks in my world, to do the same. And making sure that there’s an opportunity for governments to hear it in something that actually looks collaborative.
And so I think that’s our big challenge. We have to find a way to make sure [that] those conversations are robust, that there is dialogue between all of us, and [that] we can both identify the risks that come from low-level practices like that and then also figure out how to mitigate them.
KHUSHBU SHAH: Thank you. And so, back to you both. I’d like to hear from you both, as part of civil society—we can start with you, Adegoke—what role can an organization such as the Freedom Online Coalition play in all of these issues that we’re talking about as it expands and grows its own network?
ADEBOYE ADEGOKE: Yeah. So I think the work of the Freedom Online Coalition is very critical in such a time as this. When you look at most international or global [platforms] where conversations around technology and its impact are being had, human rights is rarely at the center of the issues. And I think that is where the advocacy comes in, in terms of highlighting and spotlighting, you know, the relevance of human rights to this issue. And as a matter of fact, not just the relevance but the importance of human rights to this issue.
I think the work of the FOC is relevant even more to the Global South than probably it is to the Global North, because in the Global South our engagement with technology, and I mean at the government level, is likely only from the perspective of… economics and… security. [Human rights] is, sadly, in an early part of the conversation. So, you know, with a platform like the FOC, it’s an opportunity to mainstream human rights into the technology conversation generally, and it’s a great thing that some of us from that part of the world are able to engage at this level and also bring those lessons back to our work, you know, domestically, in terms of how we engage the policy process in our countries.
And that’s why it’s very important for the work of the FOC to be expanded—you know, to have real impact in terms of how it is deliberate in influencing not just regional processes but also national processes. Because the end goal—and I think the beauty of all the work that is being done by the coalition—is to see how that reflects in how governments are engaging technology, in terms of how governments are consciously taking into cognizance the human rights implications of, you know, new and emerging technologies and even existing technologies. So I think the FOC is a very, very important stakeholder in the technology conversation globally.
KHUSHBU SHAH: J.C., I want to ask you the same question, especially as Chile joined the FOC in recent years. And I’d love to hear what you think.
JUAN CARLOS LARA: Yeah. I think it’s important to also note what Boye was saying in the larger context of when this has happened for the FOC. Since its creation, we have seen what has happened in terms of shutdowns, in terms of war, in terms of surveillance revelations. So it’s important to also connect what the likemindedness of certain governments and the high-level principles have to do with the practice of those same governments, as well as their policy positions both in foreign policy forums and internally, as the deputy secretary was mentioning.
I think that vital role that Boye was highlighting is a key role, but it’s constantly a work in progress. In which way? It was throughout the process of the FOC meeting and producing documents and statements that the advisory network that Boye and I are members of was created. Throughout that work, we’ve been able to see what happens inside the coalition, the discussions they’re having, to some degree, because I understand that some of them might be behind closed doors, and how the process of those statements comes to be.
So we have seen that very important role [in] how it’s produced and how it’s presented by the governments and their dignitaries. However, I still think that it’s a work in progress, because we still need to be able to connect that with the practice of governments, including those that are members of the coalition, including my own government that recently joined, and how that is presented in internal policy. And at the same time, I think that key role still has a big part to play in terms of creating those principles; in terms of developing them into increasingly detailed points of action for the countries that are members; but also then trying to influence other countries, those that are not members of the coalition, in order to create, like, better standards for human rights for all internet users.
KHUSHBU SHAH: Any thoughts, Alissa?
ALISSA STARZAK: Yeah. You know, I think J.C. touched on something that is probably relevant for everyone who’s ever worked in government, which is the reality that governments are complicated and there often isn’t one voice; frequently what you see is that the people who are focused on one issue may not have the same position as people who are working on it from a different angle. And I think the interesting thing for me about the FOC is not that you have to change that as a fundamental reality, but that it’s an opportunity for people to talk about a particular issue with a focus on human rights and take that position back. So for everybody sitting in this room who has an understanding of what human rights online might look like, to be able to say, hey, this is relevant to my government in these ways if you’re a government actor, or for civil society to be able to present a position, that is really meaningful, because it means that there’s a voice into each of your governments. It doesn’t mean that you’re going to come out with a definitive position that’s always going to work for everyone or that it’s going to solve all the problems, but it’s a forum. And it’s a forum that’s focused on human rights, and it’s focused on the intersection of those two, which really matters.
So, from an FOC perspective, I think it’s an opportunity. It’s not going to ever be the be all and end all. I think we all probably recognize that. But you need—I think we need a forum like this that really does focus on human rights.
KHUSHBU SHAH: An excellent point, and it brings me to my next question for you three. Let’s talk specifics, speaking of human rights: internet shutdowns. So we’ve mentioned Russia. Iran comes to mind as well in recent months, during protests. And very recently, the Indian government cut tens of millions of people off in the state of Punjab as it searched for a Sikh separatist.
So what else can this look like, J.C.? Those are some really sort of very basic, very obvious examples of internet shutdowns. And how can the FOC and its network of partners support keeping people online?
JUAN CARLOS LARA: Yes, thank you for that question, because specifically for Latin America, the way in which shutdowns may present themselves is not necessarily a huge cutting off of the internet for many people. It sometimes presents in other ways. For instance, we have seen the case of one country in South America in which the telecommunication networks have been basically abandoned, and therefore all of the possibilities of using the internet are lost, not because the government has decided to cut the cable, but rather because it has let it rot. Or it presents in the form of partial, locally focused cutting off of services for certain platforms.
I think the idea of internet shutdowns has provided awareness about the problems that come with losing access to the internet, but that framing can also be taken up by governments to say they have not shut off access to the internet; it’s just that there’s either too much demand in a certain area, or a certain service has failed to continue working, or it’s simply failures by telecommunication companies, or a certain platform has not complied with its legal or judicial obligations and therefore needs to be taken off the internet. So it’s important that when we speak about shutdowns we consider the broader picture and not just the idea of cutting off all of the internet.
KHUSHBU SHAH: Adeboye, I’d like to hear what your thoughts are on this in the context of Nigeria.
ADEBOYE ADEGOKE: Yeah. It’s really very interesting. And to the point, you know, J.C. was making in terms of when we talk about shutdowns, I think the work around [understanding shutdowns] has been great, and it’s really helped the world to understand what is happening globally. But just as he said, I think there are also some other forms of exclusion that [happen] because of government actions and inactions that probably wouldn’t fall under that thematic topic of shutdown, but it, in a way, is some sort of exclusionary, you know, policy.
So an example is in some remote areas in Nigeria, for example, for most of the technology companies who are laying cables, providing internet services, it doesn’t make a lot of business sense for them to be, you know, present in those locations. And to make the matter worse for them, the authorities, the local governments, are imposing huge taxes on those companies to be able to lay their fiber cables into those communities, which means that for the businesses, for the companies, it doesn’t make any economic sense to invest in such locations. And so, by extension, those [kinds] of people are shut out from the internet; they are not able to access communication networks and all of that.
But I also think it’s very important to highlight the fact that—I mean, I come from the continent where internet is shut down for the silliest reason that you can imagine. I mean, there have been [shutdowns] because [the] government was trying to prevent students cheating in exams, you know? Shutdowns are common during elections, you know? [Shutdowns] happen because [the] government was trying to prevent gossip. So it’s the silliest of reasons why there have been internet [shutdowns] in the area, you know, in the part of the world that I am from.
But what I think—in the context of the work that the FOC does, I think something that comes to mind is how we are working to prevent future [shutdowns]. I spoke about the election that just ended in Nigeria. One of the things that we did was, shortly before the election, to organize, like, a stakeholder meeting of government representatives, of fact checkers, of, you know, the platforms, the digital companies, civil society [organizations], and electoral [observers]… because, if you are from Africa, any time an election is coming you are expecting a shutdown. So it’s to have a conversation and say: An election is coming. There is going to be a lot of misinformation. There’s going to be heightened risk online. But what do we need to do to ensure that we don’t have to shut down the internet?
So, for Nigeria, we were able to have that conversation a few weeks before the election, and luckily the [internet was] not shut down. So, I mean, I would describe that as a win. But just to emphasize, it is helpful when you engage in a platform like the FOC to understand the dimensions that [shutdowns] take across the world. It kind of helps you to prepare for a potential shutdown, especially if you are in the kind of situation that we are in. And I think it’s also good to spotlight the work that Access Now has done on the issue of shutdowns, because it helps to get that perspective.
So, for example, I’m from Nigeria. We have never really experienced a widespread shutdown in Nigeria, but because we are seeing it happen in our neighboring countries, we are kind of conscious of that and were able to engage ahead of elections to see, oh, during [the] election in Uganda, [the] internet was shut down. In Ethiopia, [the] internet was shut down. So it’s likely [the] internet will be shut down in Nigeria. And then to say to the authorities: No, you know what? We don’t have to shut down the internet. This is what we can do. These are the mechanisms on [the] ground to identify risks online and address those risks. And also holding technology platforms accountable, to make sure that they put mechanisms in place and that they communicate those mechanisms clearly during elections.
So it’s interesting how much work needs to go into that, but I think it’s… important work. And I think for the FOC, it’s also very important to continue to communicate the work that the FOC is doing in that regard, so that more and more people become aware of it and more people are prepared, you know, to mitigate it, especially where you feel the risk of shutdown is highest.
KHUSHBU SHAH: Thank you. I’m going to jump across to the other side of that spectrum, to surveillance tech, almost literally the opposite. And I wanted to start with the news that Deputy Secretary Sherman mentioned, the news that the Biden administration announced just this afternoon: a new executive order that would broadly ban US federal agencies from using commercially developed spyware that poses threats to human rights and national security.
The deputy secretary also mentioned, Alissa, some guiding principles that they were going to announce later this week with the FOC. What are some principles or ambitions that you would hope to see later this week?
ALISSA STARZAK: So I think there’s a lot coming is my guess. Certainly the surveillance tech piece is an important component, but I think there are lots of broad guidelines.
I actually want to go back to shutdowns for a second, if you don’t mind… because I think it’s a really interesting example of how the FOC can work well together and how you take all of the different pieces—even at this table—of how you sort of help work on an internet problem or challenge, right? So you have a world where you have activists on the ground who see particular challenges, who would then work with their local government. You have industry partners like Cloudflare who can actually show what’s happening. So is there a shutdown? Is there a network disruption? You can take the industry component of it, and that provides some information for governments, and then governments can work together to sort of pressure other governments to say these aren’t acceptable. These norms mean, no, you can’t shut down because you are worried about gossip, or cheating on an exam, right? There’s a set of broad international norms that become relevant in that space, and I think you take that as your example. So you have the players—you have the government to government, you have the civil society to government, you have the industry, which provides information to government and civil society. And those are the pieces that can get you to a slightly better place.
And so when I look at the norms coming out later this week, what I’m going to be looking for is that same kind of triangulation: using all of the players in the space to come to a better outcome. So whether that’s surveillance tech, sort of understanding from civil society how it has been used, how you can understand it from other tech companies, how you can sort of mitigate against those abuses, working with governments to sort of address their own use of it to make sure that that doesn’t become a forum—all of those pieces are what you want from that model. And so that’s what I’m looking for in the principles that come out. If they have that triangulation, I’m going to be very happy.
KHUSHBU SHAH: What would you both be looking for, as well? J.C., I’ll start with you.
JUAN CARLOS LARA: Yeah, as part of the [FOC advisory network], of course, there might be some idea of what’s coming when we speak about principles for governments for the use of surveillance capabilities.
However, there are two things that I think are very important to consider for this type of issue. First of all is which principles and which rules are adopted by the states. I mean, it’s very good news that we have this executive order as a first step towards thinking about how states refrain from using surveillance technology disproportionately or indiscriminately. That’s a good sign in general. That’s a very good first step. But secondly, within this same idea, we would expect other countries to follow suit and hopefully to expand the idea of bans on spyware, or bans on surveillance technology that by itself may pose grave risks to human rights, and not just in this case or that, or because it’s commercial spyware, which is a very important threat, including for countries in Latin America that are regular customers of certain spyware producers and vendors.
But separately from that, I think it’s very important to also understand how this ties into the purposes of the Freedom Online Coalition and its principles, and how to have further principles that hopefully pick up on the learnings that we have had from several years of discussion on the deployment of surveillance technologies, especially by academia and civil society. If those are picked up by the governments themselves as principles, we expect those to exist in practice.
One of the key parts of the discussion on commercial spyware is that I can easily think of a couple of Latin American countries that are regular customers. And one of them is an FOC member. That’s very problematic when we speak about whether they are abiding by these principles and by human rights obligations or not, and therefore whether these principles will generate any kind of restraint in the use and the procurement of such surveillance tools.
KHUSHBU SHAH: So I want to follow up on that. What are the dangers and gaps of having this conversation without proposing privacy legislation? I want to ask both of our—
JUAN CARLOS LARA: Oh, very briefly. Of course, enforcement, and the fact that rules may not have the institutional framework to operate, I think is a key challenge. That is also tied to capacities, like having people with enough knowledge and, of course, enough exchange of information between governments. And resources. I think it’s very important that governments are able to enact the laws that they put in the books, that they are able to enforce them, but also to train every operator, every official that might be in contact with any of these issues, so that that kind of principle is not just adopted as a common practice but also gets into the books and into the enforcement of the law. Among other things, I think capacities, resources, and collaboration are key for those things.
KHUSHBU SHAH: Alissa, as our industry expert, I’d like to ask you that same question.
ALISSA STARZAK: You know, I think one of the interesting things about the commercial spyware example is that there is a government aspect on sort of restricting other people from doing certain things, and then there is one that is a restriction on themselves. And I think that’s what the executive order is trying to tackle. And I think that the restricting-others piece, sort of building agreement between governments that this is the appropriate thing to do, is clearly the objective here, right?
So, no, it’s not that every government does this. I think that there’s a reality of surveillance, foreign or domestic, depending on what it looks like. But it’s worth thinking about building rulesets of when it’s not OK, because I think there can be agreement, if we work together, on what that ruleset looks like. So, again, we have to sort of strive for a better set of rules across the board on when we use certain technologies. And clearly, I think, the executive order we’ve heard about is the first step in that process. Let’s build something bigger than ourselves. Let’s build something that we can work across governments for. And I think that’s a really important first step.
ADEBOYE ADEGOKE: OK. Yeah, so I think the executive order is a good thing. Because I was thinking to myself, you know, looking back to many years ago in our work, when we started to engage our government regarding the issue of surveillance and, you know, its human rights implications and all of that, I recall very vividly a government minister at the time saying that even the US government is doing it. Why are you telling us not to do it? So I think it’s very important.
Leadership is very key. If you look at the FOC and the principles the founding members adopted and all of that, those texts are beautiful. Those texts are great. But then there has to be a demonstration of, you know, the application of those texts, even by the governments leading, you know, the FOC, so that it makes the work of people like us easier, to say these are the best examples around, and you don’t get the kind of feedback you got many years ago, like, oh, even the US government is doing it. So I think the executive order is a very good place to start from, to say, OK, this is what the US government is doing right now and this is how it wants to define its engagement with spyware.
But, of course, like, you know, he said, it has to be expanded beyond just, you know, concerns around spyware. It has to be expanded to the different ways in which advanced technology [is] applied in government. I come from a country that has had to deal with the issue of, you know, terrorism very significantly in the past ten years or thereabouts, and so every justification you need for surveillance tech is just on the table. So whenever you want to have the human rights conversation, somebody’s telling you, what, you want terrorists to kill all of us? You know? So it’s very important to have some sort of guiding principle.
Yeah, we understand [the] importance of surveillance to security challenges. We understand how it can be deployed for good uses. But we also understand that there are risks to human-rights defenders, to journalists, you know, to people who hold [governments] accountable. And those have to be factored into how these technologies are deployed.
And in terms of, you know, the peculiar issues that we have to face, basically you are dealing with issues around oversight. You are dealing with issues around transparency. You are dealing with issues around [a] lack of privacy frameworks, et cetera. So you see African governments, you know, acquiring similar technologies—and I don’t want to say in the guise, because there are actually real problems where those technologies might be justified. But then, because of the lack of these principles, because of these issues around transparency, oversight, legal oversight, human-rights considerations, it then becomes problematic, because these tools then get turned on people: it’s true that they are used against human-rights defenders. It’s true that they are used against opposition political parties. It’s true that they are used against activists and dissidents in the society.
So it’s very important to say that we look at the principles that have been developed by the FOC, but we want to see FOC governments demonstrate leadership in terms of how they apply those principles to reality. It makes our work easier if that happens, to use that as an example, you know, to engage our governments in terms of how it is done. And I think these examples help a lot. It makes the work much easier; not very easy, but much easier.
KHUSHBU SHAH: Well, you mentioned a good example in the US. You reminded me of the biometric data that countries share in Central and North America as they monitor refugees, asylum seekers, and migrants. Even the US partakes. And so, you know, what can democracies do to address the issue when they’re sometimes the ones leveraging these same tools? Obviously, it’s not the same as commercial spyware, but what are the boundaries of surveillance and appropriate behavior for governments?
J.C., can I throw that question to you?
JUAN CARLOS LARA: Happy to. And we saw a statement by several civil-society organizations on the use of biometric data with [regard] to migrants. And I think it’s very important that we address that as a problem.
I really appreciated that Boye mentioned, like, countries leading by example, because that’s something that we are often expecting from countries that commit themselves to high-level principles and that sign on to human-rights instruments, that sign declarations by the Human Rights Council and the General Assembly of the [United Nations] or some regional forums, including to the point of signing on to FOC principles.
I think it’s very problematic that things like biometric data are being collected from people who are in situations of vulnerability, as is the case of many migrants and many people who are fleeing situations of extreme poverty and violence. And I think it’s very problematic that this also leads to [the] exchange of information between governments without proper legal safeguards that prevent that data from falling into the hands of the wrong people, or even that prevent that data from being collected from people who are not consenting to it or without legal authorization.
I think it’s very problematic that countries are allowing themselves to do that under the idea that this is an emergency situation, without proper care for the human rights of the people who are suffering from that emergency, and that situations of migration are being treated like something that must be stopped or contained or controlled in some way, rather than addressing the underlying issues, or trying to promote ways of addressing the problems that come with them without violating human rights or infringing upon their own commitments to human dignity, to privacy, and to the freedom of movement of people.
I think it’s part of observing legal frameworks and refraining from collecting data that they are not allowed to collect, but also of obeying their own human-rights commitments. And that often means refraining from taking certain actions. In that regard, I think the discussions that there might be on any kind of emergency still need to take a few steps back and see what countries are supposed to do and what obligations they are supposed to abide [by] because of their previous commitments.
KHUSHBU SHAH: So thinking about what you’ve just said—and I’m going to take a step back. Alissa, I’m going to ask you kind of a difficult question. We’ve been talking about specific examples of human rights and what it means to have online rights in the digital world. So what does it mean in 2023? As we’re talking about all of this, all these issues around the world, what does it mean to have freedom online and rights in the digital world?
ALISSA STARZAK: Oh, easy question. It’s really easy. Don’t worry; we’ve got that. Freedom Online’s got it; you’ve just got to come to their meetings.
No, I think it’s a really hard question, right? We’ve built something that is big. We’ve built something where we have sort of expectations about access to information, about the free flow of information across borders. And I think that, you know, what we’re looking at now is finding ways to maintain it in a world where we see the problems that sometimes come with it.
So when I look at what it means to have rights online, we want to have that thing that we aspire to, which I think Deputy Secretary Sherman mentioned: the idea that the internet builds prosperity, that access to the free flow of information is a good thing that’s good for the economy and good for the people. But then we have to figure out how we build the set of controls that go along with it, that protect people, and I think that’s where the rule of law does come into play.
So thinking about how we build standards that respect human rights when we’re collecting all of the information about what’s happening online, right? Like, maybe we shouldn’t be collecting all of that information. Maybe we should be thinking of other ways of addressing the concerns. Maybe we should be building [a] framework that countries other than us can use, right, or at least so that people don’t point to the things that a country does and say, well, if they can do this, I can do this, right, using it for very different purposes.
And I think that’s the kind of thing we want to move towards, but that doesn’t really answer the underlying question, which is the problem, right? So what are the rights online? We want as many rights as possible online while protecting security and safety, which are, you know, also individual rights. And it’s always a balance.
KHUSHBU SHAH: It seems like what you’re touching on—J.C., would you like to—
JUAN CARLOS LARA: No. Believe me.
KHUSHBU SHAH: Well, it seems like what you’re talking about—and we’ve talked around this—is that there’s a sense of impunity, right, in the virtual world, and that has led to what we’ve talked about for the last forty minutes, right, misinformation and disinformation. And if you think about what we’ve all been talking about for the last few weeks, which is AI—and I know there have been some moments of levity. I was telling Alissa about how there was an image of the pope wearing a white puffer jacket that’s been going around the internet, and I think someone pointed out that it was fake, that it was AI-generated. And so that’s one example. Maybe it’s kind of a fun example, but it’s also a little bit alarming.
And I think about the conversation we’re having, and what I really want to ask all of you is, how might these tools—like AI—further help or hurt [human rights] activists and democracies as we go into uncharted territory, as we see the impact in real time while the conversation around it evolves and it’s utilized by journalists, by activists, by politicians, by academics? And what should the FOC do—I know I’m asking you again—what can the FOC do? What should we aim for to set the online world on the right path in this uncharted territory? I don’t know who wants to start and attempt.
ADEBOYE ADEGOKE: OK, I’ll start. Yeah.
So I think it’s great that, you know, the FOC has different task [forces] working on different thematic issues, and I know there is a task force on the issue of artificial intelligence and human rights. So I think for me that’s a starting point, you know, providing core leadership on how emerging technology generally impacts… human rights. I think that’s the starting point in terms of what we need to do because, like the deputy secretary said, you know, technology’s moving at such a pace that we can barely catch up with it. So we cannot afford to wait one minute, one second before we start to work on this issue and begin to, you know, investigate the human rights implications of all of those issues. So it’s great that the FOC’s doing that work.
I would just say that it’s very important, and I think this [speaks] generally to the capacities of the FOC: I think the FOC needs to be further capacitated so that this work can be brought to bear on real-life issues, in regional and national engagement, so that some of the hard work that has been put into those processes can really be reflected in real, you know, national and regional processes.
ALISSA STARZAK: Yeah. So I definitely agree with that.
I think on all of these issues we have a reality of trying to figure out what governments do and then what private companies do, or what sort of happens in industry, and sometimes those are in two different lanes. But in some ways figuring out what governments are allowed to do, so thinking about the sort of negative potential uses of AI, may be a good start for thinking about what shouldn’t happen generally. Because if you can start with a set of norms about what acceptable behavior looks like and where you’re trying to go, you’re at least moving in the direction of the world that you think you want together, right?
So understanding that you shouldn’t be generating it for the purpose of misinformation or, you know, for a variety of other things at least gets you started. It’s going to be a long road, a long, complicated road. But I think there are some things that can be done there in the FOC context.
JUAN CARLOS LARA: Yes. And I have to agree with both of you. Specifically because the idea that we have a Freedom Online Coalition to set standards, or to set principles, and a task force that can devote some resources, some time, and discussion to that, can also identify which part of this is actually the promise and which part is the peril. And how governments are going to react in a way that promotes prosperity, that promotes interactivity and commerce, the exercise of human rights, the rights of individuals and groups—and which sides of it become problematic, from the use of AI tools, for instance, for detecting certain speech for censorship, or for identifying people in the public sphere because they’re out on the streets, or for collecting and processing people’s data without consent.
I think because that type of expertise and that type of high political debate can be held at the FOC, it can promote the type of norms that we need in order to understand, like, what’s the role of governments in steering this somewhere. Or whether they should refrain from certain actions that—with the good intention of preventing the spread of AI-generated misinformation or disinformation—may end up stopping these important tools from being used creatively or in constructive ways, or in ways that can allow more people to be active participants in the digital economy.
KHUSHBU SHAH: Thank you. Well, I want to thank all three of you for this robust conversation around the FOC and the work that it’s engaging in. I want to thank Deputy Secretary Sherman and our host here at the Atlantic Council for this excellent conversation. And so if you’re interested in learning more about the FOC, there’s a great primer on it on the DFRLab website. I recommend you check it out. I read it. It’s excellent. It’s at the bottom of the DFRLab’s registration page for this event.