BKC affiliate Ryan Merkley on the importance of reliable access to knowledge when making important choices:
“...getting it wrong will have potentially catastrophic effects for our families...for your health and the way we live.”
Dozens of public and private Facebook groups totaling hundreds of thousands of members have become a haven for conspiracy theories, medical equipment promotion and unproven cures.
“Manipulators will continue to use keyword squatting in private groups to seed health misinformation and scams,” BKC's Joan Donovan said. “The platform is too big for proper moderation, especially when most people posting about coronavirus are seeking information and asking about potential cures.”
Sasha Costanza-Chock discusses their new book, Design Justice.
“Technological innovation and design processes are quite messy, and...people are often marginalized from the stories we hear about the creation of new tools. Social movements are often hotbeds of innovation, but their contributions aren't always recognized.”
Google's decision to drop labels like "man" and "woman" from their image tagging AI pressures other companies to do the same. But Sasha Costanza-Chock argues that it isn't enough: "All classification tags on humans should be opt-in, consensual, and revokable."
Julie Owono explains the concern over the law:
“In some countries, like Ethiopia, insufficient and inadequate content moderation is becoming a danger for freedom of expression.”
Bruce Schneier explains how threats to health surveillance systems pose an even larger risk to the public than misinformation, and why governments should prepare.
"With coronavirus on the verge of becoming a pandemic, the United States is at risk of not having trustworthy data, which in turn could cripple our country’s ability to respond."
BKC faculty associate Zeynep Tufekci on why being prepared for the global spread of coronavirus is “one of the most pro-social, altruistic things you can do in response to potential disruptions of this kind.”
“We should prepare, not because we may feel personally at risk, but so that we can help lessen the risk for everyone. We should prepare not because we are facing a doomsday scenario out of our control, but because we can alter every aspect of this risk we face as a society.”
BKC fellow Christo Wilson explains that while the convenience of skipping the checkout line comes at the expense of privacy, so does shopping through Amazon at all.
BKC affiliate Joan Donovan spoke at the Knight Media Forum on the growing challenge of hateful content online.
Clinical instructor Kendra Albert talks about their time as a student in the Cyberlaw Clinic, their current work, and what they’re proudest of in their time at the Clinic:
“I think the thing that I’m most proud of is the experience we can provide to students and watching them get excited about doing real legal work that helps solve people’s problems, rather than more abstract or academic work," says Albert. “And I think that’s a really important part of what the Clinic brings to the technology law space at HLS and also to Berkman Klein.”
In a wide-ranging conversation with Nick Gillespie, Taylor Lorenz talks about how TikTok, the ultra-short video platform out of China, enhances self-expression, why government regulation of online speech is always ultimately doomed to fail, and how the future depends on all of us developing media literacy in a hurry.
The 2019–2020 academic year marks the twentieth anniversary of the Cyberlaw Clinic, which is based at the Berkman Klein Center. To commemorate the occasion, we spoke with Kendra Albert (Harvard Law J.D. ‘16), clinical instructor in the Cyberlaw Clinic and former student in the Clinic, about their takeaways from that experience, their current work, and what they’re proudest of in their time at the Clinic.
BKC faculty associate Francine Berman on the question that keeps her up at night: Is the Internet of Things a future utopia, or is it a future dystopia?
“How do we promote ethical behavior in autonomous systems? How do we promote privacy and protections? Because if we are not doing that, then the technology is not serving us. And at the end of the day, we want the technology to serve us; we don’t want to serve the technology.”
Desmond Patton joined the MIT CMSW podcast to discuss the promise and challenge of eliciting context in social media posts with natural language processing.
BKC faculty associate Danielle Citron joins former Florida gubernatorial candidate Andrew Gillum, director of the ACLU’s voting-rights initiative Dale Ho, and election law professor Rick Hasen to discuss can-do fixes for threats to the integrity of U.S. elections.
Desmond Patton discusses the importance of meaningful interaction with the people behind data with the authors of Data Feminism.
“It became really clear to me that we needed to create a new approach to social media data that could really grasp culture, context and nuance. For the primary reason of not misinterpreting what’s being said.”
BKC fellow Beatriz Botero Arcila on why getting compensated for your personal data is not a good idea.
"This is why 'data as labor' or property would reinforce current patterns of inequality, perhaps aggravate them, and legitimize some of the uses of personal data that we today find concerning, as companies would be able to say 'but I paid for it.'"
As some countries restrict and replace content, are we headed toward a world of multiple internets? David Weinberger argues that we are already there.
“So on one hand, yes, the dream [of an open internet] is dead. On the other hand, that statement causes us, I think, to overlook the quite often very positive transformation that the internet has brought about: in our sense of what it means to speak in public, what it means to connect in public.”
Pew Research Center and Elon University’s Imagining the Internet Center canvassed technology experts in the summer of 2019 to gain their insights about the potential future effects of people’s use of technology on democracy.
One of the most extensive and thoughtful answers to the canvassing question came from Judith Donath. She chose not to select any of the three possible choices offered in this canvassing, instead sharing two possible scenarios for 2030 and beyond. In one scenario, she said, “democracy is in tatters.” Disasters created or abetted by technology spark the “ancient response” – the public’s fear-driven turn toward authoritarianism.
In the second scenario, “Post-capitalist democracy prevails. Fairness and equal opportunity are recognized to benefit all. The wealth from automation is shared among the whole population. Investments in education foster critical thinking and artistic, scientific and technological creativity. … New voting methods increasingly feature direct democracy – AI translates voter preferences into policy.”
Zeynep Tufekci on why China’s use of surveillance and censorship makes it harder for Xi Jinping to know what’s going on in his own country.
“It’s not clear why Xi let things spin so far out of control,” writes Tufekci. “It might be that he brushed aside concerns from his aides until it was too late, but a stronger possibility is that he did not know the crucial details.”