360/OS 2019: Ethical Norms and Methods in Open-Source Research

Part of a series of posts highlighting key themes at the DFRLab’s 360/OS 2019 summit

(Source: @sarahphotovideo)

In our inaugural post for 360/OS 2019, we welcomed you to our community of #DigitalSherlocks. In this post, we highlight two panels from the first day of discussion that underscored a common theme: one of the key challenges facing #DigitalSherlocks studying disinformation and online influence today is developing a set of methodological best practices and ethical norms to guide our research. These standards would ensure greater transparency, replicability, and accountability on our part; they would also help us guard against targeted efforts to undermine our work.

Powered by Proof

On the morning of the first day, June 20, DFRLab Senior Fellows Andy Carvin and Ben Nimmo explained that, because #DigitalSherlocks come from such a variety of backgrounds, we all have different definitions of what constitutes “open-source research.” Nevertheless, Carvin and Nimmo suggested that most definitions of the term share three primary criteria:

  1. Transparency;
  2. Replicability;
  3. Accountability.

These criteria serve as a basis for a “scientific method” for open-source research. “I think one of the most important aspects of the work we do is that we don’t stand up and say, ‘trust us,’” Carvin noted. “We are willing to say, ‘here is the evidence, here is our proof, here is our methodology — walk yourself through it.’”

DFRLab Senior Fellows Andy Carvin and Ben Nimmo discuss the parallels between Sherlock Holmes and the #DigitalSherlocks, as well as the criteria that indicate good open-source research. (Source: @sarahphotovideo/SarahHalls.net)

We share a certain affinity for Sherlock here at the DFRLab — in many ways, Carvin and Nimmo observed, he was the original open-source researcher. But it is equally important, they argued, to note where we diverge from him. Sherlock was… well, a misanthrope. He preferred to work in isolation; at the DFRLab, in contrast, collaboration is at the heart of our model.

Within the open-source environment in which we operate, our work is continuously subject to open peer review by a network of digital forensic analysts. In announcing our partnership with Facebook on election integrity in 2018, for example, DFRLab Director and Managing Editor Graham Brookie wrote: “[u]sing open source means that we do not ask our readers to take our credibility for granted: we present our findings in a way that anyone with access to the internet can verify them.” Collaboration also allows us to engage in a dialogue with our peers and the public, rather than impose our perspective on the issues at hand.

Can We? Should We?

After Carvin and Nimmo established working definitions, Alexa Koenig and Kate Starbird picked up with a discussion of the ethical dilemmas facing open-source researchers.

Dr. Koenig, who directs the Human Rights Center at the UC Berkeley School of Law, stressed the importance of privileging psychosocial security in the research community. “When we talk about psychosocial security,” she noted, “a big piece of what we’re doing is looking at very large quantities of sometimes very graphic content, and that gets to the best of us — and it’s not a question of strength.”

In “Can We, Should We?” Alexa Koenig and Kate Starbird homed in on the ethical challenges and personal security risks open-source researchers face. (Source: @sarahphotovideo/SarahHalls.net)

At the DFRLab, we continuously grapple with how best to balance the duty to inform with a duty of care for both our research community and the broader public. When do the risks associated with republishing graphic content as evidence outweigh the benefits for transparency and replicability? How do we educate others about the risks of prolonged exposure to graphic content?

In March, a live-streamed video of a horrific terrorist attack against two mosques in Christchurch, New Zealand, went viral on social media. In response, the DFRLab’s Andy Carvin urged readers not to view the violent video, drawing on his own experience of developing vicarious trauma through repeated exposure to graphic content online in the course of his reporting. Unnecessary exposure to graphic imagery “numbs our sensitivity, damages our psyche, and distorts our worldview in ways that helps no one except the people who want us to fall prey to online violence and violent propaganda,” Carvin argued.

Dr. Starbird, who directs the Emergent Capacities of Mass Participation Lab (emCOMP Lab) at the University of Washington, discussed the importance of developing ethical guidelines for our work. She explained that the nature of our research renders us particularly vulnerable to targeted attacks from bad actors. We should thus be especially diligent in building trust between researchers and their audiences, so that our research can withstand bad-faith efforts to undermine our credibility.

Kate Starbird explains how open-source researchers can defend themselves from efforts to undermine their work. (Source: @sarahphotovideo/SarahHalls.net)

We keep these ethical concerns in mind in our effort to build a community of #DigitalSherlocks. In addition to holding us accountable with regard to our sources and methods, as Carvin and Nimmo underscored, establishing this network allows us to defend one another in the face of strategic and coordinated efforts to delegitimize our work and silence our voices.


Follow along for more in-depth analysis from our #DigitalSherlocks.