The Information Commissioner's Office (ICO) is set to introduce an age-appropriate design code derived from the Data Protection Act 2018. This Code of Practice sets out 15 standards of age-appropriate design for online services that process personal data and are likely to be accessed by children. The code is intended to serve as practical guidance on data protection safeguards so that online services are appropriate for use by children. Consider first the definition of "appropriate": suitable or proper in the circumstances. This alone groups all children into one pool of users. How do we know whether such a service is appropriate if its designers have not consulted with children, or assessed the maturity and understanding of the user?

I will argue that web-based services need to encourage the collection of further evidence-based research towards a child-centred approach to data protection and to emerging technologies that contribute to safeguarding children — both in terms of children managing their own risks, and in how policy and practice must shift away from an adult-as-gatekeeper paradigm to one driven by children's own data capture and communication. The child should not be an afterthought but an upfront partner and voice within the complexity of the world wide web. The child's voice and the child's user journeys need to be viewed as a socio-technical, generational safe place to explore, learn, grow and reach out in if and when they feel connected to do so. Children now interact in a cyborg-childhood space (May-Chahal et al., 2014), and central to this I argue that successful socio-technical processes must be designed with and for those who use them. While it is accepted that, theoretically, childhood is a social construct (James and Prout, 1997) that changes over time, what is most significant is the role that technology and the online world play within the childhood space.
The United Nations Convention on the Rights of the Child (UNCRC) (UNICEF, 1989), Article 12, describes a state's duty to take into account that children's views matter in decision-making about their lives. Evidence suggests that children resent tokenism where consultation in the development of services takes place, and the way in which such consultation is conducted is not yet sufficiently advanced to centralise the child (Woolfson et al., 2010). However, as child welfare policies have evolved, the rhetoric of greater child participation has not been matched by children's presence in the debates about the processes of governance and safeguarding. The process remains largely adult-focused, and a view persists that children are talked about rather than talked to and fully participating in the process (Gilligan and Manby, 2008; Munro, 2011; Carlick, 2018). Given the dearth of literature that focuses on children's voices being heard for consultation purposes, the research is virtually non-existent on their views of the use of technology for child protection. A child's identity offline and their journey through social networks are connected, but questions remain about differences in communication between the online and offline worlds and, importantly for safeguarding, about how distinctions are made in what information children share in public and private domains (Carlick, 2018).
As part of my PhD research project I undertook a series of workshops, grounded in the theoretical framework of human-computer interaction, with children and young people aged 9–16 years. The thematic analysis of what they shared concluded that the policy direction, and developments such as the age-verification code, runs contrary to the design of technology solutions. Their agency and enactment showed that there must be diversity with regard to age groups within this space and within technology design (Carlick, 2018). Design must be child-focused, meaning that children's power and agency are central, rather than excluding children and forgoing a model in which the online world is a generational technical space. How do we reposition data protection and safety within a child's world? There needs to be a reconstruction of childhood within the digital world.
My research suggests that the ICO and the safety tech sector need to rethink their understanding and use of technology, to become more child-centred in its application and to look for new ways of incorporating technology into safeguarding practice, data protection and best practice. This means providing information in a format that is understood by children, is truly child-centred and ensures that children are equal partners within these processes (Carlick, 2018).
Author: Dr Sarah Carlick