My blog

Safer Generational Childhoods Online or Not...

20 Oct 2020

The Information Commissioner's Office (ICO) is set to introduce an age-appropriate design code derived from the Data Protection Act 2018. This Code of Practice sets out 15 standards of age-appropriate design for online services which process personal data and are likely to be accessed by children. The code is intended to serve as practical guidance on data protection safeguards so that online services are appropriate for use by children. Consider, first, the definition of "appropriate": suitable or proper in the circumstances. This alone groups all children into one pool of users. How do we know whether such a service is appropriate if its designers have not consulted with children, or assessed the maturity and understanding of the user?

I will argue that web-based services need to encourage further evidence-based research into a child-centred approach to data protection and to emerging technologies that contribute to safeguarding children, both in terms of children managing their own risks and in how policy and practice must shift away from an adult-as-gatekeeper paradigm towards one driven by children's own data capture and communication. The child need not be an afterthought but an upfront partner and voice within the complexity of the world wide web. The child's voice and the child's user journeys need to be viewed as a socio-technical, generational safe place to explore, learn, grow and reach out if and when they feel connected to do so. Children now interact in a cyborg-childhood space (Mya-Chahal et al., 2014), and central to this I argue that successful socio-technical processes must be designed with and for those who use them. Whilst it is accepted that, theoretically, childhood is a social construct (James and Prout, 1997) that changes over time, what is most significant is the role that technology and the online world play within the childhood space.

Article 12 of the United Nations Convention on the Rights of the Child (UNCRC) (UNICEF, 1989) describes a state's duty to take into account that children's views matter in decision-making about their lives. Yet as child welfare policies have evolved, the rhetoric of greater child participation has not been matched by children's presence in debates about the processes of governance and safeguarding. The process remains largely adult-focused, and a view persists that children are talked about rather than talked to and fully participating in the process (Gilligan and Manby, 2008; Munro, 2011; Carlick, 2018). Evidence suggests that children resent tokenism in consultation for the development of services, and the way in which such consultation is conducted is not yet sufficiently advanced to centralise the child (Woolfson et al., 2010). Given the dearth of literature on children having their voices heard for consultation purposes, research into their views on the use of technology for child protection is virtually non-existent. A child's identity offline and their journey through social networks are connected, but questions remain about differences in communication between the on/offline worlds and, importantly for safeguarding, how distinctions are made in what information children share in public and private domains (Carlick, 2018).

As part of my PhD research project I undertook various workshops, grounded in the theoretical framework of human-computer interaction, with children and young people aged 9–16 years. Thematic analysis of what they shared concluded that the current policy direction, and developments such as age verification requirements, are contradictory to the design of technology solutions. Their agency and enactment showed that technology design in this space must recognise diversity across age groups (Carlick, 2018). Design must be child-focused, meaning children's power and agency are central, rather than excluding children and forgoing a model of the online world as a generational technical space. How do we reposition data protection and safety within a child's world? There needs to be a reconstruction of childhood within the digital world. My findings suggest that:

  • Very few providers consider design and content as co-operative inquiry with children as equal partners.
  • There are very few direct provisions for children in existing regulatory frameworks and data protection directives. Where they exist, such as age verification requirements, they tend to rely exclusively on parental consent and/or make little distinction between older adolescents and younger children.
  • Risks are frequently defined by adults and often reflect pre-established norms and adult concerns regarding the nature of these risks.
  • Little has been done to unpack the younger generation’s concepts of privacy, expectations of use and notions of public information. This situation is compounded by the limited understanding of the array of risks and opportunities of big data.
  • Education is needed on the rights of children and adolescents: their right to express themselves and to be heard and, importantly, their right to privacy and confidentiality – including from their parents as well as from more traditional players in the data cycle.
  • The true essence of the UNCRC needs to be at the forefront of emerging tech.
  • The age-13 threshold seems incompatible with the core of the UNCRC: it assumes every child is capable at age 13, and takes no account of children developing at different stages and paces.
  • Parental consent does not necessarily imply that a child will receive more guidance regarding data processing online.

My research suggests that the ICO and the safety tech sector need to rethink their understanding and use of technology, to become more child-centred in its application, and to look for new ways of incorporating technology into safeguarding practice, data protection and best practice. This means providing information in a format that children understand, that is truly child-centred, and that ensures children are equal partners within these processes (Carlick, 2018).

Author: Dr Sarah Carlick

www.drsarahcarlick.co.uk

Further Information:

https://ico.org.uk/for-organisations/guide-to-data-protection/key-data-protection-themes/age-appropriate-design-a-code-of-practice-for-online-services/

 
