Yesterday, the ICO published its draft Age Appropriate Design Code setting out guidance for online services likely to be accessed and used by children. The draft Code is open for consultation until the end of May, signalling a clear intention to get the Code up and running as soon as possible.
Whilst the desire to implement measures that create safer experiences for children and young people without delay is understandable, this extremely tight window for seeking views on complex issues is disappointing, particularly given the low rate of initial consultation responses (you can read our own initial response here). techUK hopes that this short formal consultation window will be supplemented by extensive consultation and work with those in scope to ensure the Code is proportionate, workable and effective.
It is important not to lose sight of the bigger picture during this consultation. The internet and the services built on it can bring numerous benefits to users of all ages, whether to educate, to connect or to play. The ICO's draft Code rests on the broad assumption that data collection and processing are inherently harmful, and in doing so disregards the positive behaviour they can help drive: helping a child develop a hobby by suggesting more interesting content, connecting users with like-minded communities, or developing their critical thinking.
As with the Online Harms White Paper, launched last week, the Code will apply to a far wider range of services and companies than simply those social media companies in the spotlight at the moment. The draft Code explicitly states that it applies to anyone providing online products or services that process personal data and are likely to be accessed by children, even where the service is not aimed at, or exclusively for, young people.
The Code will apply regardless of a company's size or how likely young people are to access the service, running the risk of entrenching those companies that have the resources and capability to meet the Code's demanding requirements for the five different age groups set out.
While many companies are implementing or working towards some of the 16 standards the ICO sets out – from increasing transparency to robust enforcement of community standards – we should be conscious of the different capabilities of companies. A lack of proportionality will mean that a start-up near Silicon Roundabout bears the same level of responsibility as a multinational company. Similarly, an online news site, which young people may access, would be required to implement the same standards as a service explicitly designed for, marketed to and used by children.
Other standards, however, fail to address the concerns raised in the initial call for evidence and are in fact at odds with each other. Take, for example, the standards on age-appropriate application. The ICO recommends that, in order to ease the burden of essentially creating different services for different users, a service should "provide a child-appropriate service to all users by default with the option of age-verification mechanism to allow adults to opt out".
This could possibly be workable, although it is arguable whether it is desirable, especially when set against the Code's own principle of data minimisation. In order to tailor design or implement robust age verification as the Code demands, many websites and services will need to collect additional information on their users that they previously never needed to collect or store, contradicting the very principle of data minimisation.
This approach presents many data protection risks, particularly for children, who may now have their personal data linked to their verified identity. It is worth noting here that age-verification requirements for pornographic websites, to be regulated by the BBFC, have already been delayed twice over more than a year amid data protection concerns – it is difficult to see the same concerns for children's data being addressed in the next few months.
An overemphasis on the design of online services, at the expense of looking at the whole picture, runs the risk of hindering children's online development. Moving away from the specifics in the Code – which techUK will respond to in full – more attention should be paid to the enormous role of education and parental oversight in teaching children how online services work, and how they can manage their online lives safely and securely. Industry already funds a range of initiatives to empower parents and children on this front, and more focus should be given to them.
With only a six-week consultation period, it is critical we get these decisions right and create a design code that is effective at protecting children while limiting unintended consequences. The current proposals create many challenges, including potentially increasing the risk of children's personal data getting into the wrong hands, while also creating significant barriers to entry for the many start-ups seeking to challenge existing players. We look forward to engaging with the ICO over the coming weeks.