Deterrent rather than punishment: What does Instagram’s $403m fine mean for children’s data privacy?


On 2nd September, Instagram and its parent company Meta were fined $403m for inadequate handling of children’s data under the EU GDPR. The fine was the culmination of a long-running investigation by the Irish Data Protection Commission (DPC) into the social media company. It is the largest fine Meta has been issued to date and the second largest ever issued by a European data protection authority, behind only the €746m fine issued by the Luxembourg data protection authority against Amazon last year. The hope of privacy activists is that this fine signals a shift towards viewing fines as a deterrent rather than as a punishment.

$403m worth of problems


The complaint centred upon Instagram’s failure to provide adequate protections for children’s data processed on its platform. Until recently, Instagram had allowed users aged between 13 and 17 to operate business accounts, which publicly displayed the users’ phone numbers and email addresses. It was also found that the platform’s registration system set these accounts to “public” by default, allowing anyone to access them.

Whilst the exact reasoning for the fine is yet to be released, it is likely that Instagram and Meta will have been found to have violated the integrity and confidentiality principle. This principle is one of the seven fundamental principles of the GDPR and requires businesses to ensure that adequate protections are put in place for the data they process. By allowing children (anyone under the age of 18) to create these business accounts, Instagram would have violated this principle and put vulnerable data subjects, as the GDPR terms them, at risk.

Unsurprisingly, Meta is not happy.

Following the announcement of the fine, a spokesperson for the social media giant stated: “While we’ve engaged fully with the DPC throughout their inquiry, we disagree with how this fine was calculated and intend to appeal it. We’re continuing to carefully review the rest of the decision.” An appeal will likely extend the process even further.

Instagram, TikTok, and illegal data processing

This fine is the latest in a series of data protection complaints centring upon the processing of children’s data.

In September 2021, Frances Haugen (a Meta whistleblower) leaked a wealth of internal documents, many of which centred upon Meta’s negligent approach to protecting its younger users. One key document showed that internal research into the effect of social media (Instagram in particular) on the mental health of young people had been suppressed and ignored. The resulting backlash forced Meta to pause work on its planned new product, ‘Instagram Kids’.

Also last year, it was revealed that the Irish DPC had opened two investigations into the video-sharing giant TikTok. The first is examining whether TikTok handles children’s data in a compliant fashion; the second is assessing whether TikTok’s transfers of personal data to its parent company in China constitute an illegal data transfer. These investigations were opened of the DPC’s own volition (meaning no external complaint was required) following pressure from consumer groups and other European data protection authorities. As with the Instagram investigation, they are likely to be long-running.

In January of last year, the Italian data protection authority forced TikTok to check the age of every user in the country after the watchdog instigated an emergency procedure, using GDPR powers, following child safety concerns. TikTok complied, and more than half a million accounts were removed after their owners’ ages could not be verified.

In short, TikTok, Instagram, and other social media companies that process children’s data are finding themselves increasingly in the crosshairs of European data protection authorities.

"A valuable but untapped audience"

The current generation of young people are digital natives: individuals who have grown up surrounded by technology and whose personal lives are integrated across both the physical and virtual domains.

Consequently, the digital footprint (the information trail left behind as we use the web) of younger users is likely to be significantly larger and more detailed than that of their parents’ generation, or even of slightly older generations such as millennials. Children have often spent their whole lives on the internet, have operated social media profiles from an early age, and are adept at using (and signing up for) services across the web.

The size of this audience and the detail of its digital footprint are therefore an enticing prospect for social media companies. One leaked internal Facebook memo from 2020 asks: “Why do we care about tweens?”

The memo answers its own question: “They are a valuable but untapped audience.”

Lawmakers agree, and have implemented various legal protections for children and their data. Because children are impressionable, marketing to them is tightly regulated under the GDPR and by national regulators such as the ASA in the UK. Children are also afforded additional individual rights, including the right to have information they provided before turning 18 erased.

The GDPR also treats children as ‘vulnerable data subjects’. This designation recognises that the personal data of children carries a higher risk than the personal data of others, and that further protective measures (such as the safeguards Instagram lacked) must therefore be implemented before this data can be processed safely and compliantly. The United Kingdom (which retained the regulation as the UK GDPR) is also planning further children’s data-focused legislation in the Online Safety Bill.

But the speed of legislation and the speed of technology are not in concert.

Legislation is often too slow in its production, and online trends too fast in their emergence. The result is legislation that cannot initially meet the requirements for protection, and businesses that are able to exploit the loopholes at the expense of children’s data. By the time the law has caught up and been enforced, the necessary protections were never built into the technology being used. Fines (and the protections they encourage) are thereby rendered retroactive: punishment rather than deterrent.

Deterrent is better than punishment

The route to better protection of children’s data must start with a paradigm shift: from viewing fines as a punishment to viewing them as a deterrent.

The size of the fine issued to Meta is a positive step in this direction. $403m is a significant amount, and one which other social media companies (especially TikTok) will note. Whilst it is likely too late for TikTok, whose own investigation is already a year deep, other social media companies should view the fine issued against Meta as a stark warning to adopt a ‘privacy by design and default’ approach. The term is used in the privacy sphere to denote the ‘baking in’ of protective measures, so that new applications and services are constructed with privacy at their heart.
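In concrete terms, ‘by default’ means the protective setting is the one a user gets without taking any action. Below is a minimal sketch in Python of that idea; the class and function names are hypothetical illustrations, not Instagram’s actual implementation.

    from dataclasses import dataclass, field

    @dataclass
    class PrivacySettings:
        # Privacy by default: the safest value is the one applied
        # when the user does nothing.
        account_public: bool = False              # private unless explicitly changed
        show_email: bool = False                  # contact details hidden by default
        show_phone_number: bool = False
        allow_messages_from_strangers: bool = False

    @dataclass
    class Account:
        username: str
        age: int
        # Every new account starts from the safe defaults above.
        settings: PrivacySettings = field(default_factory=PrivacySettings)

    def upgrade_to_business_account(account: Account) -> None:
        """Privacy by design: risky transitions are gated, not merely reversible."""
        if account.age < 18:
            # A child's contact details should never become public as a
            # side effect of switching account type.
            raise PermissionError("Business accounts that publish contact "
                                  "details are unavailable to under-18s.")
        account.settings.show_email = True
        account.settings.show_phone_number = True

Under a pattern like this, the failures identified in the Instagram complaint (public-by-default registration and exposed contact details on teen business accounts) cannot happen without an explicit, gated opt-in.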

In their response to the fine, a spokesperson for Meta clarified that the issues have now been rectified: “This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private. Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them.”

Meta had indeed changed its settings following the announcement of the investigation. However, this was a case of too little, too late, and the nature of the changes reflects the reactive, rather than proactive, approach to children’s data protection that should be stamped out of the industry.

As the bite of data protection authorities begins to match the bark of data protection legislation, we must hope that the threat of non-compliance acts as the necessary deterrent to illegal data processing, rather than merely as a punishment after the fact.

About Us: Tacita are GDPR compliance experts, helping clients achieve and maintain GDPR compliance. Get in touch to explore our range of GDPR services, including the Tacita GDPR Audit, GDPR Consultant Service and the GDPR Toolkit.
