Investigating AI and Surveillance Technology in Your Community [Sponsored]

  • Friday, Oct. 9 – 12:15 PM – 1 PM ET (16:15 – 17:00 UTC)
  • Not available virtually
  • #ONA20
To access this session, register for ONA20 Everywhere. Already registered? Log into your account.

The use of algorithmic decision-making tools is on the rise across our institutions, from criminal justice and education to public benefits and health care. Take facial recognition as one example: Police can use a facial recognition app, built on a database of more than three billion images scraped from social media, to try to identify protesters in a crowd or a person accused of a crime. And while some cities have banned the use of facial recognition, others are installing multi-million-dollar systems aimed at real-time tracking of individuals, similar to the mass surveillance systems in China.

Meanwhile, communities most impacted by these technologies have little power over the algorithms that judge them. This panel will discuss how to investigate and report on the use of AI and surveillance technologies in your own community for greater transparency and accountability.

This session is designed for:

  • Journalists looking for ways to cover the increasing prevalence of artificial intelligence systems in governance and policing, and their impact on communities
  • Newsroom leaders looking to develop new story ideas and angles in their criminal justice and government coverage
  • Anyone who is interested in data journalism, systemic bias, and future trends

This event is supported by the John D. and Catherine T. MacArthur Foundation.

Speakers

Hannah Sassaman
Policy Director, Movement Alliance Project
@hannahsassaman

Rashida Richardson
Visiting Scholar, Rutgers Law School

Inioluwa Raji
Fellow, Mozilla Foundation
@rajiinio

Moderator

Kashmir Hill
Reporter, The New York Times
@kashhill