Can I see some ID? Advances in age verification online


Laura Camplisson, Future Identity Portfolio Lead, explores the growing demand for age verification technology to protect consumers and businesses in an online world.
__________________________________________________________________

If you're an adult, it's likely you can remember lying about your age at least once as a teenager.

Maybe you were the last of your friends to turn 18, so you nabbed an older sibling's ID to get into a nightclub. Maybe you snuck into an R-rated movie, sheepishly convinced a shopkeeper you were old enough to buy a top-shelf magazine, or even managed to get hold of a fake ID to buy alcohol.

A few times you may have gotten away with it. But often the person checking your identification was able to use their judgement to determine whether you were the age you claimed.  

Crucially, the margin for error in a face-to-face interaction was limited – at age 17 you may have passed as 18, but it’s very unlikely that at age 10 you could have. 

In a digital world, things aren't quite as simple. Without the right verification technology in place, online users can be any age they claim to be. So how can online businesses ensure age-restricted content and products aren't available to vulnerable, underage users?

Purchasing age-restricted goods online

UK law is clear that the sale of tobacco, e-cigarettes, and alcohol to anyone under the age of 18 is illegal. Merchants selling these goods online need a reliable age verification process in place to ensure legal compliance and avoid penalties.

During the pandemic and subsequent lockdowns, many non-essential retailers were forced to shift operations online. This put pressure on small-scale businesses such as independent craft breweries and start-up vaping e-stores to implement remote identity proofing technology.

There is no shortage of age verification solutions on the market. Credit references, mobile provider data, even face verification combined with an identity document scan can all offer businesses greater confidence in a customer's age.

But the key is to implement technology which provides the right level of assurance, without requesting any more data than necessary. Businesses want to avoid being responsible for safely storing their customers' personal information and being liable if anything goes wrong.

And that's before considering the pressure for efficiency. Avoiding overly cumbersome identity verification processes is crucial to minimising the number of customers who abandon account opening or fail to complete a transaction.

Data privacy concerns 

But no matter how easy the verification process, not every customer will feel comfortable disclosing their identity to access age-restricted services online.

Individuals from religious or cultural backgrounds that prohibit the consumption of alcohol or tobacco, for example, may want to avoid associating themselves with purchases of these goods. Those looking to access online pornography might find technology such as facial biometric verification far too intrusive.

Iain Corby, Executive Director of the Age Verification Providers Association, notes: "I don't really have a problem with sharing my data online and in fact, I kind of gave up a long time ago trying to keep anything secret. But there are people transacting online who, whether because they're in repressive states or because of religious or cultural sensitivities, are nervous about sharing their identity."

In many cases these concerns are misplaced, and solutions are designed with privacy and minimal data collection in mind. But consumers can still be discouraged from completing an age check if they have even the slightest doubt that their online behaviour is being tracked and, critically, linked to their offline identity.

Iain Corby explains that, unless there is just legal cause, users should be able to retain some aspect of anonymity online. "I always say that age verification is the ability to prove your age online without disclosing your identity," he notes.

And solutions are emerging which provide ‘Age Tokens’ to individuals, allowing them to prove they are 18+ with a cryptographically secured token without giving away any other information, such as date of birth, credit card details, or contact information. 

This would not only put consumers at ease, but also make business sense. For businesses, the only attribute that really needs to be verified is that an individual is legally able to purchase their products. And from a data management and compliance perspective, the less information they are required to handle, the better.
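To make the idea concrete, the sketch below shows roughly how such a token could work, assuming a hypothetical issuer who has already checked the customer's age: the issuer signs a claim stating only that the holder is over 18, and the merchant verifies that signature without ever seeing a date of birth or any other identity attribute. (The example uses the Python cryptography library; the names and payload format are illustrative assumptions, not any particular provider's implementation.)

    # A minimal sketch of an "Age Token" flow, assuming a hypothetical issuer
    # who has already verified the customer's age out of band. Names and the
    # payload format are illustrative only.
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Issuer side: sign a claim containing only the single attribute needed.
    issuer_key = Ed25519PrivateKey.generate()
    issuer_public_key = issuer_key.public_key()      # published for merchants to use

    claim = json.dumps({"over_18": True}).encode()   # no DOB, name or contact details
    age_token = (claim, issuer_key.sign(claim))      # handed to the customer

    # Merchant side: verify the issuer's signature and learn nothing else.
    def check_age_token(token) -> bool:
        claim_bytes, signature = token
        try:
            issuer_public_key.verify(signature, claim_bytes)  # raises if tampered with
        except InvalidSignature:
            return False
        return json.loads(claim_bytes).get("over_18", False)

    print(check_age_token(age_token))  # True – the sale can go ahead

A real deployment would also need token expiry, replay protection and a trusted way to distribute issuer keys, but the core privacy property is the same: the merchant can confirm 'over 18' without ever learning who the customer is.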

Protecting minors from adult content 

Making it easy for customers of legal age to engage with services is only one piece of the puzzle. Businesses also want to ensure minors are restricted and shielded from harm – if not as a moral responsibility, then at the very least to protect brand reputation.

At present children in the UK can access adult content online with relative ease, either by searching for pornographic websites or by stumbling across them by accident. The government has come under increasing pressure to pass legislation in this area. And according to a June 2021 poll commissioned by CARE, 81% of UK adults agree that age verification should be implemented for online pornography.

Last month the All-Party Parliamentary Group on Commercial Sexual Exploitation (APPGCSE) launched an inquiry in response to growing concern over the extreme nature of commercial pornography in the UK. APPGCSE Chair Diana Johnson MP believes compulsory age verification should be made part of the government's 2021 Online Safety Bill, to prevent young people who go looking for soft adult content from instead stumbling upon extreme or violent material without restriction.

But the issue remains that previous attempts at mandating age verification for adult content have been met with strong privacy pushback. The 2017 Digital Economy Act proposed age checks on adult websites, but after numerous delays and criticism from privacy campaigners, the plans were officially dropped in 2019.

The Open Rights Group was one organisation particularly vocal in its opposition to the proposal, arguing that, without stronger privacy protections, the Act "would have prompted the creation of vulnerable records of the public's porn preferences", which "could lead to people being outed, blackmailed or having their careers destroyed."

Clearly, strictly enforced privacy policies, based around minimal data collection, are critical if age verification is to become a requirement for accessing adult content online.

Social media and data driven harms 

Social media platforms have also come under increasing pressure to protect their younger users. From content promoting eating disorders and suicide, to paedophile activity, it’s shocking the harms children can be exposed to via these channels.  

The 2021 Online Safety Bill introduced a new duty of care for websites and search engines which allow user-generated content to be shared. This includes a requirement to assess the risk of children being exposed to content which is illegal or harmful – in other words, content which risks having an adverse physical or psychological impact.

Without being able to identify the perpetrators, and the vulnerable victims, of dangerous or violent content, it becomes challenging for social channels to moderate it effectively. As Dr Rachel O'Connell comments, "If platforms know the ages of their users, they can create safer spaces for them to participate in."

YouTube is one example of a content sharing platform rolling out age verification technology. From the end of 2020, all new YouTube users have been asked to provide photographic identification to prove they are over the age of 18. 

But as Dr O'Connell highlights, data-driven harms are also a crucial issue to address if we are to fully protect children online. "From age 13 in the UK," she explains, "children are deemed to be proficient at reading privacy policies, understanding cookies, and the whole nature of the data ecosystem surveilling them."

Online media sites can then collect data points about these young users, find patterns in their viewing habits and push specific content to what has been identified as a captive audience. This has the potential to be detrimental for young, vulnerable individuals. For example, predictive analysis could be used to surface content promoting body dysmorphia to young people who have engaged with dieting tips.

Dr O'Connell suggests that education has an important role to play here, in preparing children to interact safely online. "How do we empower and enable young people with the tools, knowledge and skills so they can actively participate in the digital playground designed for them?" she asks.

Standards and progress 

In short, the protection of vulnerable identities online is a complex dynamic to navigate. Businesses want to preserve their commercial interests, ensuring they remain compliant, without deterring customers from using their services with overly invasive identity checks.  

Privacy advocates want individuals to retain the right to online anonymity, while those advocating for the rights of children and young people feel strongly that much more stringent age verification should be enforced.

All these competing interests exist within the wider identity ecosystem and an evolving landscape of assurance levels, regulation, liability and redress measures should anything go wrong. Industry standardisation is likely to have a key role to play in driving progress. 

And the age verification sector is already well advanced in creating recognised standards. Iain Corby points to the importance of industry assessment bodies such as the ACCS (Age Check Certification Scheme) in certifying how accurate solutions are. "Standardisation has allowed for a clear delineation between good quality age verification providers and those who ask users to tick a box or type their date of birth," he comments.

Standardisation could also be key to gaining the public's trust in age verification checks. Mark Cooley, Head of Sales at ACCS, highlights the importance of their GDPR certification mark in letting users know their data will be protected. "The public are unsure what happens to their data," Cooley notes, "and they're not sure what would happen if they ever asked for their digital ID to be deleted." Restoring this confidence could go a long way towards supporting adoption.

Perhaps Nick Mothershaw, Chief Identity Strategist at the Open Identity Exchange, summarises the challenge ahead best: "We need to implement a set of rules, procedures and codes that sit across the ecosystem." Of course, this is no easy task, but a focus on user trust, data protection and standardisation could be the key to driving progress in how we verify age online and, in turn, making it safer for young and vulnerable people to interact in today's online world.

__________________________________________________________________

The comments shared in this post were made in the panel, ‘Protecting Vulnerable Identities Online’, which took place at the Future Identity and Fintech Talents festival 2021.

Author

  • Laura Camplisson

    Laura Camplisson has led on the Future Identity portfolio of content and events since the brand’s launch in 2021. Laura regularly contributes to articles, reports and blogs exploring the latest initiatives, technologies and concepts in the identity space. She is particularly excited about the potential for digital identity to enable greater inclusion and make everyday life more seamless.
