Education: Legal Issues

The safety concerns and standards discussed in other sections provide an initial foundation for legal protections. However, the proliferation of neurotech devices must be accompanied by stricter consumer protection laws. Special privacy laws must be promulgated to ensure “cognitive privacy” (Farahany, 2012, 2023) [25] and educational autonomy. Raw brain data is uniquely sensitive: an individual’s brain pattern may be even more distinctive than a fingerprint. That distinctiveness makes such data potentially identifiable, creating legal, security, and ethical challenges.

Current regulations in Europe already govern certain aspects of neurotechnology and mitigate harms related to privacy and surveillance.

For example, the UK Children’s Code, formerly known as IEEE P2089™, the ‘Standard for Age Appropriate Digital Services Framework – Based on the 5Rights Principles for Children’ [26], establishes a set of processes by which organizations seek to make their services age appropriate. The code was written into law as part of the Data Protection Act 2018, which also implemented the General Data Protection Regulation (GDPR) in the UK. It requires websites and apps published after September 2021 to take the ‘best interests’ of their child users into account or face fines of up to 4% of annual global turnover.

European Union data privacy law defines biometric data as a “special category of personal data” and prohibits its “processing” except under narrow conditions. The combination of brain Big Data and AI will impact educational neurotechnology. The EU resolution on artificial intelligence in education, culture, and the audiovisual sector [27] calls, in section 30, for a future-proof legal framework for AI that provides legally binding ethical measures and standards. Data produced and used by AI applications must be treated as especially sensitive where minors are concerned. Children constitute a vulnerable group who deserve particular attention and protection from data mining and software manipulation. For example, AI systems should not make decisions about learning modalities and educational opportunities without full human supervision.
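
As an illustrative sketch only, the following Python fragment shows one way such a human-in-the-loop requirement could be expressed in an educational AI system. The names (Recommendation, apply_recommendation, approved_by) are hypothetical and not drawn from any real product or from the resolution itself.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Recommendation:
    """An AI-generated proposal about a learning modality (hypothetical)."""
    student_id: str
    proposed_modality: str   # e.g. "visual-first lesson plan"
    model_confidence: float


def apply_recommendation(rec: Recommendation, approved_by: Optional[str]) -> str:
    """Apply a modality change only after a named educator approves it."""
    if approved_by is None:
        return f"PENDING: proposal for {rec.student_id} awaits educator review"
    return (f"APPLIED: {rec.proposed_modality} for {rec.student_id}, "
            f"approved by {approved_by}")


# No code path changes a student's learning modality without a human approver.
rec = Recommendation("student-042", "visual-first lesson plan", 0.87)
print(apply_recommendation(rec, approved_by=None))           # held for review
print(apply_recommendation(rec, approved_by="Ms. Alvarez"))  # takes effect
```

The point of such a gate is structural: the system’s output is always a proposal, and no code path applies an educational decision without a named human approver.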

Devices approved for clinical use should not automatically carry over to consumer use without separate testing on the consumer population. Only when a technique has been experimentally shown to improve learning in typical populations should it be regarded as a candidate for a learning-aid device.

One scandal over neurotech in the classroom has already erupted in China (2019), where schools used EEG monitoring headbands to collect and store data that “scored” students’ attention levels during class [4], [28].

The Ryder Review (2022) [29], an independent legal review of the governance of biometric data in England and Wales, developed recommendations directly relevant to the IEEE Neuroethics Framework, including:

  • a new, technologically neutral, statutory framework;
  • consolidated, clarified, and properly resourced regulation and oversight of biometrics;
  • further research on private–public partnerships and organizations gathering and processing biometric data, developing tools, and accessing datasets.


Around the world, several key areas of legal concern can readily be anticipated:

Data ownership. A legal struggle can be expected between users and manufacturers over “head space” (an individual user’s capacity and potential to generate data through neurotechnological devices) as cognitive performance becomes a commodity.

Responsibility and accountability. Manufacturers will naturally bear the burden of refuting claims of mental and learning harms, although users must show that their implementation and usage were appropriate. In July 2022, the UK agreed to a US plan for sharing police-held biometric data with US border officials. While this agreement specifically covers police data, businesses and organizations can share the data they hold with the police. Consequently, biometric data collected during an academic study might also cross international borders and become subject to foreign legal protocols.

Security. Educational neurotechnology has the potential to violate neuro-privacy in a unique way because it involves minors at early formative ages. Devices that not only register but also preserve and store usage data deserve the closest scrutiny. Existing privacy laws in almost all countries do not clearly apply to neurotechnology data collection, so new jurisprudence must emerge.

Data storage. The problem of ownership and storage of biometric data is paired with the issue of data location. Even “cloud” data resides on servers in particular countries, each with its own jurisdiction. Storage duration and data availability to third parties (commercial or governmental) will raise legal and regulatory issues. Vast amounts of individual data are recorded within educational settings; Livingstone refers to this as ‘datafication’ (Livingstone, 2018) [30] and calls for specific legislation, policy, and codes of practice to protect young people and their data.

Device transfer. Devices shared between users, especially those that must be calibrated for a particular user, will continue to be problematic. One safeguard is to design devices so that they work for only one user unless fully reset, as sketched below.
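
A minimal sketch of that single-user binding, assuming a hypothetical on-device API; the class and method names (NeuroDevice, enroll, start_session, factory_reset) are illustrative and not from any real SDK.

```python
import hashlib
import secrets
from typing import Optional


class NeuroDevice:
    """Hypothetical single-user device binding (illustrative sketch only)."""

    def __init__(self) -> None:
        self._salt: Optional[bytes] = None        # per-enrollment random salt
        self._owner_hash: Optional[bytes] = None  # salted hash of the owner's ID
        self._calibration: Optional[dict] = None  # locally stored calibration data

    def enroll(self, user_id: str, calibration: dict) -> None:
        """Bind the device to one user; refuse re-enrollment without a reset."""
        if self._owner_hash is not None:
            raise PermissionError("Device already bound; factory reset required.")
        self._salt = secrets.token_bytes(16)
        self._owner_hash = self._hash(user_id)
        self._calibration = calibration

    def start_session(self, user_id: str) -> dict:
        """Return the stored calibration, but only for the enrolled user."""
        if self._owner_hash is None:
            raise RuntimeError("No enrolled user; call enroll() first.")
        if not secrets.compare_digest(self._hash(user_id), self._owner_hash):
            raise PermissionError("Device is bound to another user.")
        return self._calibration

    def factory_reset(self) -> None:
        """Wipe both the user binding and all locally stored calibration data."""
        self._salt = None
        self._owner_hash = None
        self._calibration = None

    def _hash(self, user_id: str) -> bytes:
        return hashlib.sha256(self._salt + user_id.encode()).digest()
```

Storing only a salted hash of the enrolled identity, rather than the identity itself, keeps the binding mechanism from becoming yet another store of identifiable personal data.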