
TikTok has until Friday to respond to Italy’s order to block users it can’t age-verify after girl’s death


TikTok has until Friday to respond to an order by Italy’s data protection agency to block users whose age it cannot verify, TechCrunch has learned.

The GPDP made an ‘immediate’ order Friday in response to the death of a 10-year-old girl from Palermo who died of asphyxiation after participating in a ‘blackout challenge’ on the social network, according to reports in local media.

The agency said the ban would remain in place until February 15 — suggesting it would make another assessment about any additional action at that point.

At the time of writing it does not appear that TikTok has taken action to comply with the GPDP’s order.

A spokeswoman told us it is reviewing the notification. “We have received and are currently reviewing the notification from Garante,” she said. “Privacy and safety are top priorities for TikTok and we are constantly strengthening our policies, processes and technologies to protect all users, and our younger users in particular.”

The GPDP had already raised concerns about children’s privacy on TikTok, warning in December that its age verification checks are easily circumvented and raising objections over default settings that make users’ content public. On December 22 it also announced it had opened a formal procedure — giving TikTok 30 days to respond.

The order to block users whose age it cannot verify is in addition to that action. If TikTok does not comply with the GPDP’s administrative order it could face enforcement from the Italian agency, drawing on penalty powers set out in the GDPR.

TikTok’s spokeswoman declined to answer additional questions about the order — which prohibits it from further processing user data “for whom there is no absolute certainty of age”, per GPDP’s press release Friday.

The company also did not respond when we asked if it had submitted a response to the agency’s formal procedure.

In a statement last week following the girl’s death the company said: “Our deepest sympathies are with the girl’s family and friends. At TikTok, the safety of our community — in particular our younger users — is our priority, and we do not allow content that encourages, promotes, or glorifies dangerous behaviour that might lead to injury. We offer robust safety controls and resources for teens and families on our platform, and we regularly evolve our policies and protections in our ongoing commitment to our community.”

TikTok has said it has found no evidence of any challenge involving asphyxiation on its platform.

However, in recent years there have been a number of reports of underage users hanging themselves (or attempting to) after trying to copy things they saw on the platform.

Users frequently create and respond to content challenges, as part of TikTok’s viral appeal — such as (recently) a trend for singing sea shanties.

At the time of writing, a search on the platform for ‘#blackoutchallenge’ returns no user content but displays a warning that the phrase “may be associated with behavior or content that violates our guidelines”.

Screengrab of the warning users see if they search for ‘blackout challenge’ (Image credit: TechCrunch)

There have been TikTok challenges related to ‘hanging’ (as in people hanging by parts of their body other than their neck from/off objects) — and a search for #hangingchallenge does still return results (including some users discussing the death of the 10-year-old girl).

Last year a number of users also participated in an event on the platform in which they posted images of black squares — using the hashtag #BlackOutTuesday — which related to Black Lives Matter protests.

So the term ‘blackout’ has similarly been used on TikTok to encourage others to post content — though not, in that case, in relation to asphyxiation.

Ireland’s Data Protection Commission, which has been lined up as TikTok’s lead data supervisor in Europe — following the company’s announcement last year that its Irish entity would take over legal responsibility for processing European users’ data — does not have an open inquiry into the platform “at present”, per a spokesman.

But TikTok is already facing a number of other investigations and legal challenges in Europe, including an investigation into how the app handles users’ data by France’s watchdog CNIL — announced last summer.

In recent years, France’s CNIL has been responsible for handing out some of the largest penalties for tech giants for infringing EU data protection laws (including fines for Google and Amazon).

In December, it also emerged that a 12-year-old girl in the UK is bringing a legal challenge against TikTok — claiming it uses children’s data unlawfully. A court ruled she can remain anonymous if the case goes ahead.

Last month Ireland’s data protection regulator put out draft guidelines on what it couched as “the Fundamentals for a Child-Oriented Approach to Data Processing” — with the stated aim of driving improvements in standards of data processing related to minors.

While the GDPR typically requires data protection complaints to be funnelled through a lead agency, under the one-stop-shop mechanism, the GPDP’s order to TikTok to cease processing is possible under powers set out in the regulation (Article 66) that allow for ‘urgency procedures’ to be undertaken by national watchdogs in instances of imperative risk.

Any such provisional measures can only last for three months, however — and only apply in the country where the DPA has jurisdiction (Italy in this case). Ireland’s DPC would be the EU agency responsible for leading any resulting investigation.




Facebook will pay $650 million to settle class action suit centered on Illinois privacy law


Facebook was ordered to pay $650 million Friday for running afoul of an Illinois law designed to protect the state’s residents from invasive privacy practices.

That law, the Biometric Information Privacy Act (BIPA), is a powerful state measure that’s tripped up tech companies in recent years. The suit against Facebook was first filed in 2015, alleging that Facebook’s practice of tagging people in photos using facial recognition without their consent violated state law.

Under the final settlement ruling in California federal court, 1.6 million Illinois residents will receive at least $345 each. The final figure is $100 million higher than the $550 million Facebook proposed in 2020, which a judge deemed inadequate. Facebook disabled the automatic facial recognition tagging features in 2019, making them opt-in instead and addressing some of the privacy criticisms echoed by the Illinois class action suit.

A cluster of lawsuits accused Microsoft, Google and Amazon of breaking the same law last year after Illinois residents’ faces were used to train their facial recognition systems without explicit consent.

The Illinois privacy law has tangled up some of tech’s giants, but BIPA has even more potential to impact smaller companies with questionable privacy practices. The controversial facial recognition software company Clearview AI now faces its own BIPA-based class action lawsuit in the state after the company failed to dodge the suit by pushing it out of state courts.

A $650 million settlement would be enough to crush any normal company, though Facebook can brush it off much like it did with the FTC’s record-setting $5 billion penalty in 2019. But the Illinois law isn’t without teeth. For Clearview, it was enough to make the company pull out of business in the state altogether.

The law can’t punish a behemoth like Facebook in the same way, but it is one piece in a regulatory puzzle that poses an increasing threat to the way tech’s data brokers have done business for years. With regulators and legislators at the federal and state level proposing aggressive measures to rein in tech, the landmark Illinois law provides a compelling framework that other states could copy and paste. And if big tech thinks navigating federal oversight will be a nightmare, a patchwork of aggressive state laws governing how tech companies do business on a state-by-state basis is an alternate regulatory future that could prove even less palatable.

 



AWS reorganizes DeepRacer League to encourage more newbies


AWS launched the DeepRacer League in 2018 as a fun way to teach developers machine learning, and it’s been building on the idea ever since. Today, it announced the latest league season with two divisions: Open and Pro.

As Marcia Villalba wrote in a blog post announcing the new league, “AWS DeepRacer is an autonomous 1/18th scale race car designed to test [reinforcement learning] models by racing virtually in the AWS DeepRacer console or physically on a track at AWS and customer events. AWS DeepRacer is for developers of all skill levels, even if you don’t have any ML experience. When learning RL using AWS DeepRacer, you can take part in the AWS DeepRacer League where you get experience with machine learning in a fun and competitive environment.”

While the company started these as in-person races with physical cars, the pandemic forced it to run the league as a virtual event over the last year — but the new format seemed to be shutting out newcomers. Since the goal is to teach people about machine learning, getting new people involved is crucial to the company.

That’s why it created the Open division, which, as the name suggests, is open to anyone. You can test your skills, and if you’re good enough — finishing in the top 10% — you can compete in the Pro division. Everyone also competes for prizes, such as vehicle customizations.

The top 16 in the Pro division each month race for a chance to go to the finals at AWS re:Invent in 2021, an event that may or may not be virtual, depending on where we are in the pandemic recovery.



Free 30-day trial of Extra Crunch included with TC Sessions: Justice tickets


TC Sessions: Justice is coming up on Wednesday, and we’ve decided to sweeten the deal for what’s included with your event pass. Buy your ticket now and you’ll get a free month of access to Extra Crunch, our membership program focused on founders and startup teams with exclusive articles published daily.

Extra Crunch unlocks access to our weekly investor surveys, private market analysis and in-depth interviews with experts on fundraising, growth, monetization and other core startup topics. Get feedback on your pitch deck through Extra Crunch Live, and stay informed with our members-only Extra Crunch newsletter. Other benefits include an improved TechCrunch.com experience and savings on software services from AWS, Crunchbase and more.

Learn more about Extra Crunch benefits here, and buy your TC Sessions: Justice tickets here.  

What is TC Sessions: Justice? 

TC Sessions: Justice is a single-day virtual event that explores diversity, equity and inclusion in tech, the gig worker experience, the justice system and more. We’ll host a series of interviews with key figures in the tech community. 

The event will take place March 3, and we’d love to have you join. 

View the event agenda here, and purchase tickets here.

Once you buy your TC Sessions: Justice pass, you will be emailed a link and unique code you can use to claim the free month of Extra Crunch.

Already bought your TC Sessions: Justice ticket?

Existing pass holders will be emailed with information on how to claim the free month of Extra Crunch membership. All new ticket purchases will receive information over email immediately after the purchase is complete.

Already an Extra Crunch member?

We’re happy to extend a free month of access to existing users. Please contact extracrunch@techcrunch.com and mention that you are an existing Extra Crunch member who bought a ticket to TC Sessions: Justice.

