Photo: Mayor de Blasio at a 2016 crime statistics briefing. (Michael Appleton / Mayoral Photography Office)

The use of hidden algorithms in everything from online shopping to social media to election redistricting already shapes our daily lives. Some financial experts have argued that the reckless use of algorithms crashed the economy, and tech journalists have investigated whether algorithms used by social media giants like Twitter and Facebook helped elect Donald Trump to the presidency. Former Wall Street quantitative analyst and Harvard Ph.D. data scientist Cathy O’Neil has called these algorithms “Weapons of Math Destruction.” In her book of the same title, she warns that large decision-making datasets and secret algorithms together have the potential to harm people in ways that lawmakers and courts cannot detect, let alone redress.

Government agencies are also using hidden algorithms to make decisions that will shape the lives of their constituents. A recent New York Times Magazine piece questioned the use of an algorithm by a child-welfare agency in Pennsylvania to screen child-abuse calls for more intensive investigation. New York City uses a “deferred acceptance algorithm” to match children to schools in its school-choice process. New Jersey just overhauled its entire criminal justice system and is now relying on an algorithm to help determine who should be incarcerated before they have even been convicted of a crime.
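For readers wondering what a “deferred acceptance algorithm” actually does, here is a minimal sketch of the classic student-proposing version (Gale-Shapley), the mechanism on which school-matching systems like New York City’s are based. The students, schools, preferences and capacities below are hypothetical, and the city’s production system is considerably more elaborate.

```python
# Illustrative sketch of student-proposing deferred acceptance
# (Gale-Shapley). All names and preferences here are made up;
# this is not NYC's actual matching code.

def deferred_acceptance(student_prefs, school_ranks, capacity):
    """student_prefs: {student: [schools in preference order]}
       school_ranks:  {school: {student: rank}}, lower rank is better
       capacity:      {school: number of seats}"""
    next_choice = {s: 0 for s in student_prefs}   # pointer into each preference list
    held = {sch: [] for sch in capacity}          # tentative acceptances per school
    free = set(student_prefs)                     # students not yet tentatively placed

    while free:
        student = free.pop()
        prefs = student_prefs[student]
        if next_choice[student] >= len(prefs):
            continue                              # list exhausted; student stays unmatched
        school = prefs[next_choice[student]]
        next_choice[student] += 1
        held[school].append(student)
        # The school keeps only its highest-ranked applicants, up to capacity,
        # and rejects the rest, who then propose to their next choice.
        held[school].sort(key=lambda s: school_ranks[school][s])
        while len(held[school]) > capacity[school]:
            free.add(held[school].pop())

    return {sch: sorted(accepted) for sch, accepted in held.items()}

students = {"ana": ["north", "south"], "ben": ["north", "south"], "cy": ["south", "north"]}
ranks = {"north": {"ana": 1, "ben": 2, "cy": 3}, "south": {"cy": 1, "ana": 2, "ben": 3}}
print(deferred_acceptance(students, ranks, {"north": 1, "south": 2}))
# {'north': ['ana'], 'south': ['ben', 'cy']}
```

The mechanism itself is public and well studied; the policy questions raised in this piece concern how such tools are configured and whether their inputs and outcomes are open to scrutiny.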

Civil rights advocates are concerned about both the fairness and the lack of transparency of these algorithms. In a widely read piece about a sentencing algorithm used in Florida, data scientists found that the algorithm was biased against Black defendants. The Legal Aid Society has litigated the fairness of algorithms used in DNA testing. The Brennan Center recently won partial disclosure of materials related to the NYPD’s predictive policing pilot, a technology that data scientists at the Human Rights Data Analysis Group found discriminated against poor communities of color in Oakland.
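The bias finding in that Florida analysis rested on comparing error rates across racial groups. Below is a minimal sketch of that kind of audit, using fabricated records rather than the Florida data: for each group, it computes how often people who did not reoffend were nonetheless flagged as high risk.

```python
# Sketch of one disparity test used in such audits: comparing false
# positive rates across groups. These records are fabricated for
# illustration only.

def false_positive_rate(records, group):
    """Share of non-reoffenders in `group` whom the tool flagged high risk."""
    innocent = [r for r in records if r["group"] == group and not r["reoffended"]]
    flagged = [r for r in innocent if r["high_risk"]]
    return len(flagged) / len(innocent) if innocent else 0.0

records = [
    {"group": "a", "high_risk": True,  "reoffended": False},
    {"group": "a", "high_risk": False, "reoffended": False},
    {"group": "b", "high_risk": False, "reoffended": False},
    {"group": "b", "high_risk": False, "reoffended": True},
]
for g in ("a", "b"):
    print(g, false_positive_rate(records, g))
```

A large gap between groups means the tool errs against one group more often, even when its overall accuracy looks acceptable, which is precisely the kind of harm a hidden algorithm can inflict undetected.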

In December, the New York City Council passed a bill that requires Mayor de Blasio to create a task force to examine the use of algorithms by New York City agencies. The task force will draft a report identifying where city agencies use algorithms and assessing the risk that those algorithms are harming our communities. It will then issue recommendations on oversight, fairness and transparency.

This task force is an opportunity for the mayor to lead the country. By appointing an independent commission, the mayor would take the first important step toward dramatically changing the way algorithms affect our lives. The makeup of the task force must be diverse. A commission influenced by corporate interests will fail to give algorithms the critical review they deserve. Likewise, a commission made up only of data scientists will not sufficiently voice the concerns of impacted communities.

Ideally, the task force will prioritize voices from impacted communities, particularly communities of color, who may suffer outsized harm from the use of algorithms. People like Khalil Cumberbatch of the Legal Action Center and Monique George of Picture the Homeless should be appointed, along with organizers and data scientists from Data for Black Lives. The task force must also include representation from nonprofits like the Legal Aid Society and the New York Civil Liberties Union, which serve communities impacted by these tools and have previously criticized the use of algorithms.

The mayor must take care to avoid appointing representatives of agencies and private corporations that have a clear conflict of interest in the task force’s recommendations. Task-force members should be required to disclose, publicly and in advance, professional affiliations with organizations that promote or sell algorithms. Those who have any stake in the task force’s recommendations should be excluded. Government agencies like the New York City Police Department, the Mayor’s Office of Criminal Justice and the Office of the Chief Medical Examiner, which currently use or are seeking to use a variety of algorithm-based technologies, are not independent; their representatives would defer to fellow agencies about the algorithms they employ, hoping to avoid scrutiny of their own models.

An earlier iteration of the bill sought to directly open up algorithms to greater fairness and accountability; that must remain the goal of this task force. It must put the priorities of the public first, and it must include voices from the communities that will be impacted the most. This bill presents an opportunity to shape one of the most important conversations of the digital age, and the mayor must ensure that the priorities of the people prevail.

Joshua Norkin is the Project Coordinator for the Decarceration Project of the Legal Aid Society of New York City.