
New York City may soon gain a task force dedicated to monitoring the fairness of algorithms used by municipal agencies. Formed from experts in automated systems and representatives of groups affected by those systems, it would be responsible for closely examining algorithms in use by the city and making recommendations on how to improve accountability and avoid bias.

The bill, which doesn’t have a fancy name, has been approved by the city council and is on the Mayor’s desk for signing. The New York division of the ACLU has argued in favor of it.

Say, for instance, an “automated decision system” (as the law calls them) determines to a certain extent who’s eligible for bail. Biases inherent in the training data that produced the system may tend to result in one group being unjustly favored for bail over another.
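To make that concrete: one common way an audit quantifies this kind of skew is a disparate impact ratio, the rate at which one group receives the favorable outcome divided by the rate for another, with ratios below 0.8 (the “four-fifths rule” borrowed from US employment law) often treated as a red flag. The snippet below is a minimal sketch of that check; the field names, the threshold and the data are illustrative assumptions on my part, not anything the bill specifies.

```python
from collections import defaultdict

def disparate_impact(records, group_key, outcome_key):
    """Favorable-outcome rate per group, plus the ratio of the lowest
    rate to the highest (1.0 would mean perfectly even treatment)."""
    totals = defaultdict(int)
    favorable = defaultdict(int)
    for rec in records:
        totals[rec[group_key]] += 1
        favorable[rec[group_key]] += rec[outcome_key]
    rates = {g: favorable[g] / totals[g] for g in totals}
    return rates, min(rates.values()) / max(rates.values())

# Hypothetical audit log of bail determinations (illustrative data only).
decisions = [
    {"group": "A", "granted_bail": 1},
    {"group": "A", "granted_bail": 1},
    {"group": "A", "granted_bail": 0},
    {"group": "B", "granted_bail": 1},
    {"group": "B", "granted_bail": 0},
    {"group": "B", "granted_bail": 0},
]

rates, ratio = disparate_impact(decisions, "group", "granted_bail")
print(rates)  # group A granted bail ~67% of the time, group B ~33%
print(ratio)  # 0.5 -- well under the common 0.8 threshold
```

The 0.8 cutoff is just a convention from another domain; where to draw that line for something like bail is exactly the sort of judgment the task force would have to make.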

The task force would be required to author a report that lays out procedures for dealing with situations like the one above. Specifically, the report would make recommendations regarding the following:

  • How can people know whether they or their circumstances are being assessed algorithmically, and how should they be informed about that process?
  • Does a given system disproportionately impact certain groups, such as the elderly, immigrants, the disabled, minorities, etc.?
  • If so, what should be done on behalf of an affected group?
  • How does a given system function, both in terms of its technical details and in how the city applies it?
  • How should these systems and their training data be documented and archived? (One possible record format is sketched after this list.)
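On that last point, one lightweight approach to documentation (purely a sketch, not a format the bill prescribes) is to archive a small structured record per deployed system, including a cryptographic fingerprint of the exact training data, so that a later audit can verify what a model was actually built from. Every name and field here is hypothetical.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class SystemRecord:
    """Archival record for one automated decision system (illustrative schema)."""
    name: str
    agency: str
    purpose: str
    deployed: str              # ISO date the system went live
    training_data_sha256: str  # fingerprint of the exact training set used

# Stand-in bytes for the real training file; in practice you'd hash the file itself.
training_data = b"age,priors,released\n34,0,1\n22,2,0\n"

record = SystemRecord(
    name="bail-risk-v2",                      # hypothetical system name
    agency="Department of Correction",
    purpose="Pretrial release risk scoring",
    deployed="2018-01-01",
    training_data_sha256=hashlib.sha256(training_data).hexdigest(),
)

# Serialize to JSON for the public archive.
print(json.dumps(asdict(record), indent=2))
```

A hash matters more than a prose description here: “what data was this model actually trained on?” is otherwise very hard to answer years after the fact.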

The task force would need to be formed within three months of the bill’s signing, and importantly it must include “persons with expertise in the areas of fairness, accountability and transparency relating to automated decision systems and persons affiliated with charitable corporations that represent persons in the city affected by agency automated decision systems.”

So this wouldn’t just be a bunch of machine learning experts and a couple of lawyers. You need social workers and human rights advocates as well, something I’ve certainly argued for in the past.

The report itself (which would be public) wouldn’t be due for 18 months, but this isn’t the kind of thing you want to rush. Assessing these systems is a data-intensive task, and creating parallel municipal systems to make sure people don’t fall through the cracks is civically very important.

