The rapid emergence of biometric solutions has improved the security and usability of countless digital devices. A quick glance can unlock a smartphone with facial recognition, and voice recognition can make accessing telephone banking simpler than ever before.
Despite these advancements, not all people have benefited equally from biometric innovations. People with disabilities, for example, face challenges when interacting with many biometric solutions. Bank customers with a stammer or voice tremor may find voice recognition services stressful and difficult to use. Someone born with visible facial differences, or who has scarring or injuries, may not be able to use face recognition or fingerprint recognition at all.
Recent studies have also shown how accessibility issues with biometric solutions hinder people with disabilities from accessing digital services. A report from the non-profit MITRE found that biometric services requiring a device to be held at a certain angle to the face can cause problems for people with limited vision or those who are blind.
Inclusive product development
To overcome accessibility issues in biometric solutions, these factors need to be considered at every stage of the development process. In practice, the creators of these services should consult organisations representing people with disabilities, minority groups and other vulnerable groups.
Without hearing first-hand from the people who are most impacted by the failures to make biometric solutions accessible, it is difficult to ensure these products will actually meet the needs of these groups.
Leaving accessibility as an afterthought in the development process makes it far harder to incorporate features that support marginalised groups. An effective way to immediately improve the experience for all users of biometric solutions is to offer a choice of biometric services, instead of allowing only one to be used.
Users can then select the option they are most comfortable with, rather than being forced into a single service that may not work for them.
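As a rough illustration of this "offer a choice" recommendation, an authentication service can be modelled as a registry of methods from which users exclude modalities that do not work for them, with a non-biometric fallback always present. This is a minimal sketch, not any real product's design; all class and method names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class AuthMethod:
    name: str
    description: str

@dataclass
class AuthService:
    # Registry of authentication options offered to users. An accessible
    # deployment should always include at least one non-biometric fallback
    # (e.g. a PIN) so no user is locked out by an unusable modality.
    methods: list = field(default_factory=list)

    def register(self, method: AuthMethod) -> None:
        self.methods.append(method)

    def available_choices(self, excluded: frozenset = frozenset()) -> list:
        # Users opt out of modalities that do not work for them; the
        # remaining options are offered instead of a single forced path.
        return [m for m in self.methods if m.name not in excluded]

service = AuthService()
service.register(AuthMethod("face", "Facial recognition"))
service.register(AuthMethod("voice", "Voice recognition"))
service.register(AuthMethod("fingerprint", "Fingerprint scan"))
service.register(AuthMethod("pin", "Six-digit PIN (non-biometric fallback)"))

# A customer with a voice tremor simply excludes voice recognition and
# still has three ways in:
choices = service.available_choices(excluded=frozenset({"voice"}))
print([m.name for m in choices])  # ['face', 'fingerprint', 'pin']
```

The design choice worth noting is that exclusion is user-controlled: the service never decides which modality a person "should" be able to use.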
Addressing biometric bias
Inclusivity issues can also arise from the biases embedded in biometric solutions. Evidence of racial bias in facial recognition has been widely reported in recent years, with a study of 189 algorithms finding that these technologies are least accurate on women of colour.
Both conscious and unconscious biases have entered the biometric development process, resulting in minority groups facing potential discrimination. The first step to addressing these inequities is to undertake a thorough evaluation of current biometric solutions to understand where discriminatory processes exist and remove them.
Consulting ethical guidelines on AI usage is a good place to start when embarking on a new biometric solution, so that potential pitfalls can be avoided. As biometric solutions grow more popular and become increasingly essential for accessing vital services and products, companies must take renewed action to ensure accessibility issues are genuinely tackled.
Written by Finbarr Toesland, Editorial Contributor, VC Innovations