Are your child’s data safe in the digital classroom?

It always takes a while for the side effects of new trends to show. It took years before parental controls were introduced on the internet. The first content filters for children on mobile phones only appeared in 2007, with Apple building in safety mechanisms in 2009. Addressing the risks to children often comes as an afterthought. This is also the case for the use of technology in education, even though it exposes children to the misuse of their personal data, invasion of privacy, abuse, identity theft, offensive messages and images, cyberbullying, scams, fake news and misinformation. It is time for a rethink.

First, the facts

Like it or not, we have already stepped over a cliff edge on this front. Digital technology providers, including those producing education technology products, collect and store data on their users, including information that is sensitive. One study found that 89% of the 163 education technology products recommended during the pandemic could surveil children, and that 39 of the 42 governments providing online education during the pandemic fostered uses that risked or infringed on children's rights.

While devices provided by schools can have safeguards installed, students' own laptops and devices often lack the controls or filters that would regulate data use, leaving students exposed. Despite this, one in five countries still provides some form of subsidy encouraging students to purchase their own devices.

And we should not forget that the introduction of artificial intelligence in education opens the door to additional risks. To give just one example, simple facial recognition systems already in use have been shown to be biased against specific races. In Texas, at least eight school districts use facial recognition technology that is also used for law enforcement and can reproduce or deepen inequality.

It is not just companies collecting data that we should be concerned about, but also schools, which hold sensitive data of their own, including students' biometric and health data and dietary requirements that can be used to make assumptions about religion. Yet regulation of the use of data remains rare. In the European Union, public schools are covered by the General Data Protection Regulation and must appoint data protection officers. The regulation determines how and when data can be lawfully processed; for example, schools have a lawful basis to process data because the task is in the public interest, but those data should not be recycled for another purpose.

The issue with this sort of approach is that it is based on risks rather than rights. It does not provide the same assurances as human rights or child rights due diligence processes. Supervision and oversight must ensure that education technology companies adhere to standards and do not extend their power without limits.

In addition, complaint mechanisms and administrative or judicial remedies tend not to be tailored for children. Australia, Brazil, France, Ireland, Singapore, South Africa and the United Kingdom have entrusted a regulatory authority with the power to bring administrative actions against parties who have breached data laws. But the extent to which these authorities can investigate, impose civil liability and issue fines varies by country. Article 69 of the Chinese Personal Information Protection Law puts the burden of proof on the handlers of personal information, making them liable to the extent that they cannot prove they are not at fault. However, the mechanism is complex and it may still be difficult to hold such actors accountable.

How can we put better safeguards in place?

For a start, we need to ramp up our calls for this to become a priority. This is quite new territory for legal experts, and we need to emphasize the need for stronger capacity to protect children's privacy online, particularly in education settings. The complication lies in the fact that the harm from this type of privacy violation is hard to define. It extends into the future. Its negative consequences are spread across many people and may be minimal, or merely inconvenient, for any single individual, even as they bring large benefits to companies. All of this challenges courts' traditional understandings of harm. The vagueness of these definitions may be the reason why analysis of our PEER country profiles on technology in education found that only 16% of countries explicitly guarantee data privacy in education by law. It also goes some way to explaining why further analysis of 10 countries found that, despite existing legislation, children's rights were still not protected.

And while we wait for legislative protection, education systems need to strengthen preventative measures. For starters, they need to enable members of the education community to understand the implications of their online presence and digital footprint. Brazil's National Common Curricular Base for Basic Education recognizes understanding and using digital ICT in a critical, meaningful, reflective and safe way as one of the essential skills schools should develop. More than 50% of schools included elements of safe, responsible and critical internet use in the content of several subjects.

Rectifying the situation also requires greater oversight of the education technology industry, something that we hope the #TechOnOurTerms campaign launched alongside the 2023 GEM Report on technology in education will help advance. Giant technology firms use their dominant position to enter education and further strengthen their near monopoly on the market. Google Workspace for Education and Google Classroom, for instance, still widely used by schools the world over, have been used to extract students' personal data for advertising purposes. Amazon Web Services is increasingly influencing education through cloud computing, data storage and platform technology services, taking advantage of the growing use of data in system management. Empowering countries to stand up to such influential players is important. Some are already banning such applications because of privacy concerns, as have some states in Germany, which banned Microsoft products that do not comply with the General Data Protection Regulation. It should be a case of no entry unless proved safe, rather than allowing players in and then trying to remove them when violations are discovered.

Regulating the private industry does not remove the responsibility from the companies themselves to apply sound privacy and data protection to their products, services and systems. For instance, this could include something as simple as setting privacy by default in applications and devices, without requiring manual input from the user. Users, or children in this case, would then have to opt in to being tracked by third-party applications, as has been the case since Apple's iOS 14.5. Alternatively, companies can ensure privacy by design, which is now a legal requirement under the GDPR.
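
To make the idea of "privacy by default" concrete, here is a minimal, purely illustrative sketch in Python, not taken from any real product or law: every data-sharing setting ships in its most protective state, and tracking can only be switched on through an explicit opt-in. All names used here (PrivacySettings, opt_in_to_tracking and so on) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical settings for an education app, shipped in their most
    protective state so nothing is shared until someone opts in."""
    third_party_tracking: bool = False   # tracking is off by default
    ad_personalisation: bool = False     # no ad profiling by default
    share_usage_analytics: bool = False  # no analytics sharing by default
    data_retention_days: int = 30        # keep data only briefly by default

    def opt_in_to_tracking(self, guardian_consent: bool) -> None:
        """Tracking can only be enabled with explicit (guardian) consent."""
        if guardian_consent:
            self.third_party_tracking = True


# A new account starts with every data-sharing option switched off.
settings = PrivacySettings()
assert settings.third_party_tracking is False
```

The point of the sketch is simply that protection is the starting state and consent is an action the user takes, rather than the reverse.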

This all refers back to the fourth recommendation in the 2023 GEM Report: that governments ask whether the uses of technology they are choosing are sustainable – for our budgets, for our planet and our wellbeing.
