
Facing Facts at the Airport

Photo: ViSmedia


Facial recognition is already being used to identify suspects, with systems that can match a face in a photo against millions of images from driver's licence databases and other archives. And by 2021, US authorities plan to use facial recognition to identify 100% of international passengers at US airports. One problem here is false matches, explains associate professor Nicholas Diakopoulos in this presentation for the ViSmedia Conference. For example, facial recognition systems have been shown to perform less accurately for people of colour.
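
To make the false-match problem concrete, here is a minimal sketch of one-to-many matching against a large photo archive. The embedding representation, the cosine-similarity metric, the archive size, and the threshold are all illustrative assumptions, not details from the presentation; deployed systems vary on every one of these.

import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(probe: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Cosine similarity between one probe vector and every gallery vector."""
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return gallery @ probe

# Stand-in "archive": 100,000 hypothetical 128-dimensional face embeddings
# (e.g. derived from driver's licence photos). Values are random placeholders.
gallery = rng.normal(size=(100_000, 128))
probe = rng.normal(size=128)  # face captured at the airport gate

scores = cosine_similarity(probe, gallery)
THRESHOLD = 0.35  # hypothetical operating point

matches = np.flatnonzero(scores >= THRESHOLD)
print(f"{matches.size} archive photos matched above the threshold")
# Even a tiny per-comparison false-match probability, multiplied by millions
# of archive photos, produces spurious candidates: the false-match problem.

The threshold is the key design choice here: lowering it catches more true matches but inflates the number of innocent people flagged, and the archive's sheer size amplifies any error rate.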

This is not just a question of how algorithms work. Algorithms are woven together with designers, operators, owners, and maintainers in complex sociotechnical systems. Algorithmic accountability is about understanding how those people exercise power within and through the system. The designers and owners are ultimately responsible for the system’s decisions.

A study done by the ACLU in 2018 audited the accuracy of Amazon's Rekognition system, which is being sold to various authorities across the USA. For some shock value, they ran the software on portraits of members of Congress, matching them against a database of mugshots, and found that the false matches were disproportionately of people of colour.
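
The arithmetic behind such an audit is simple: run every portrait through the matcher and compare false-match rates across demographic groups. The sketch below uses invented group labels and outcomes purely to show the computation; it is not the ACLU's data or code.

from collections import Counter

# Each audited portrait: (demographic group, was it falsely matched?).
# Labels and outcomes are made up for illustration.
results = [
    ("group_a", False), ("group_a", False), ("group_a", True),
    ("group_a", False), ("group_b", True), ("group_b", True),
    ("group_b", False), ("group_b", False),
]

totals = Counter(group for group, _ in results)
false_matches = Counter(group for group, matched in results if matched)

for group in sorted(totals):
    rate = false_matches[group] / totals[group]
    print(f"{group}: false-match rate {rate:.0%}")
# A markedly higher rate for one group is the disparity the audit surfaces.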

Transparency is about information: it concerns both the outcomes of a system and the procedures an actor uses, and it is relational, involving the exchange of information between actors. Accountability, in turn, is about the relevant entity answering for and taking responsibility for a lapse in apt behaviour, such as a violation of an ethical expectation or societal standard. To tackle the accountability of algorithmic systems, there must therefore be some way to know whether such a lapse has occurred.

Nicholas Diakopoulos - Transparency and the Ethics of AI