What organisations must do to protect children’s data privacy

We live in a world of “datafied” children – whether we’re posting their photos on our social media pages, storing baby monitoring data in the Cloud or letting our kids use online services.

Indeed, the likes of Facebook, Snapchat and TikTok have been scrutinised for the way they use children’s data – often resulting in legal action.

But these tech firms aren’t the only concern we should have. Children use technology for multiple reasons, including gaming, studying and social interaction.

The data gathered through these tools accumulates into a large body of personal information about an individual, building a detailed profile of that child.

The lasting effects of this have been evident in employers and universities rescinding job offers and course places after scrolling through applicants’ social networking profiles.

Compliance vs opportunity

The UK’s Children’s Commissioner announced in 2018 that it was driving a strategy to help address the privacy risks associated with children’s data – the same year that the GDPR (General Data Protection Regulation) took effect, which has specific requirements regarding the use of such information.

Although those rules go a long way towards protecting children, organisations are still entitled to attract customers of any age, provided they meet their compliance requirements.

In some cases, this will mean toeing the line as closely as possible, in a way that may be technically legal but may not feel entirely satisfactory to parents and guardians.

TikTok provides one notable example of this. Earlier this year, the former Children’s Commissioner for England, Anne Longfield, sued the video-sharing app on behalf of 3.5 million children.

She alleged that TikTok violated the GDPR by collecting excessive data and failing to explain what it’s used for.

The lawsuit notes that TikTok collects vast amounts of data from users – data that isn’t necessarily needed to provide the service itself but is instead used by advertisers.

However, TikTok has continually defended the way it processes children’s personal data. In response to the lawsuit, the organisation said:

“Privacy and safety are top priorities for TikTok and we have robust policies, processes and technologies in place to help protect all users, and our teenage users in particular.

“We believe the claims lack merit and intend to vigorously defend the action.”

The outcome of the lawsuit is pending, but whether or not TikTok is found liable, data privacy advocates are looking beyond punitive action and towards stronger controls that prevent such practices occurring in the first place.

The best chance of that happening is if organisations approach the rules not as something to skirt around but as a way to gain a competitive advantage.

Using data privacy to gain a competitive advantage

For an example of an organisation using improved children’s privacy controls to gain a competitive advantage, let’s look at Google’s recent overhaul of its data privacy settings.

The organisation confirmed earlier this month that it is blocking ad-targeting based on the age, gender or interests of people under 18.

It added that it would also turn off its “location history” feature, expand the types of age-sensitive ad categories that are blocked for users up to 18 and turn on safe-searching filters for children.

Google is also introducing a new policy that enables under-18s and their parents or guardians to request the removal of the child’s images from Google Images search results.

But Google isn’t the only organisation taking action. TikTok, which as we mentioned earlier is currently being sued for the way it processes children’s personal data, recently announced that it was overhauling its privacy controls for children.

The organisation said a pop-up will appear requesting that creators under the age of 16 choose who can watch their videos.

In a blog post, the company’s head of child safety public policy, Alexandra Evans, and its global head of privacy, Aruna Sharma, said:

“The process of making a TikTok is fun and creative – choosing music, picking effects, and getting the transitions right – but it is just as important to choose who that video will be shared with.”

They added that users aged 16 to 17 can turn on a feature that lets them choose who can download their public videos, and confirmed that downloads are permanently disabled on accounts belonging to creators under the age of 16.

TikTok said the changes will be rolled out globally over the coming months.
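The age-based rules described above amount to a simple policy table. As a minimal sketch only – the function name, setting names and behaviour are illustrative assumptions, not TikTok’s actual implementation – the logic might look like this:

```python
def default_privacy_settings(age: int) -> dict:
    """Illustrative default privacy settings for a creator of a given age.

    Hypothetical sketch of the age-based rules described in the post;
    not TikTok's real code or configuration.
    """
    return {
        # Under-16s are prompted to choose who can watch their videos.
        "prompt_audience_choice": age < 16,
        # Downloads are permanently disabled for under-16s.
        "downloads_allowed": age >= 16,
        # 16- and 17-year-olds can opt in to allowing downloads
        # of their public videos.
        "downloads_toggleable": 16 <= age < 18,
    }

print(default_privacy_settings(15))
# {'prompt_audience_choice': True, 'downloads_allowed': False, 'downloads_toggleable': False}
```

The point of a table like this is that the most restrictive options are the default: younger users must actively choose a wider audience rather than opt out of one.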

Implementing privacy controls

The examples we’ve outlined in this blog demonstrate that organisations are starting to recognise the benefits of prioritising children’s privacy. A few simple controls can help build trust and give the organisation a competitive advantage.

If you’re considering how you too can reap these benefits, GRCI Law can help. With our Privacy as a Service solution, you can receive the tools and advice you need to create robust data privacy measures.

Our team of experienced lawyers, barristers and information and cyber security experts will work with you to review your organisational set-up and make recommended changes.

This includes help with compliance monitoring, breach notification processes and data privacy management.