The popular and increasingly controversial social media app TikTok must pay a £12.7 million fine (roughly $16 million) in the UK for failing to protect children's data.
The UK's data protection authority, the Information Commissioner's Office (ICO), announced this week that TikTok had allowed up to 1.4 million children under the age of 13 in the country to open accounts in 2020, despite the app's own rules prohibiting it.
Children's personal data had also been used without parental consent, the ICO said, despite UK law requiring such consent.
“TikTok also failed to implement adequate controls to identify and remove underage children from its platform,” an ICO statement added.
While some senior TikTok executives had raised concerns internally, the company had not responded appropriately, the ICO said.
Meanwhile, a new report from Pixalate, which analyzed the privacy policy of every US-registered, child-directed app in the Apple App Store, found that the majority (54%) of those apps appear to violate the Children's Online Privacy Protection Act (COPPA).
COPPA is a US federal law enacted in 1998 to protect the online privacy of children under the age of 13. It applies to websites and online services that collect personal information from children, such as names, home addresses, email addresses, and phone numbers.
Covered operators must obtain verifiable parental consent before collecting that information. They must also post a clear and comprehensive privacy policy on their website or online service and give parents the option to review and delete their children's personal information.
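The age threshold these rules turn on is simple to express in code, which is part of why regulators treat its absence as an enforcement failure rather than a technical hurdle. The sketch below is a minimal, hypothetical illustration of the under-13 gate; the function name, consent flag, and example birthdate are assumptions for the example, not any platform's actual implementation, and real COPPA compliance additionally requires verifiable parental consent mechanisms this toy check does not provide.

```python
# Hypothetical sketch of a COPPA-style age gate. Illustrative only:
# a real system must also verify the parental consent it relies on.
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

def may_collect_personal_data(birthdate: date, has_parental_consent: bool) -> bool:
    """Block data collection for under-13 users without parental consent."""
    today = date.today()
    # Subtract one year if this year's birthday hasn't happened yet
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age >= COPPA_AGE_THRESHOLD or has_parental_consent

# A child well under 13 with no consent on file: collection must be blocked
print(may_collect_personal_data(date(2020, 1, 1), has_parental_consent=False))  # False
```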
Privacy for Kids
With online activity by children and teens skyrocketing, protecting their privacy and security online has become one of the most discussed technology policy topics of 2023.
The Biden administration has called on Congress to strengthen privacy protections, ban targeted advertising to children, and demand technology companies stop collecting personal data on children.
In children’s privacy enforcement actions against publishers and advertising platforms, the FTC has imposed hefty fines, customer refunds, and lengthy compliance and auditing obligations.
App Violations Widespread
According to the Pixalate report, 21% of apps in the Apple App Store don't even have a privacy policy, despite Apple's stated requirement that every app in its store have one.
Among the US-registered, child-directed apps that do have a privacy policy, 34% are missing a Children’s Privacy Disclosure, and 13% are missing contact information.
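Pixalate has not published its crawler, but a rough spot check in the same spirit is possible with Apple's public iTunes Lookup API. The sketch below is a hypothetical heuristic, not Pixalate's methodology: it pulls an app's store listing and simply searches the page for a privacy policy mention, so a miss means "review manually," not "proven violation." The example app ID and the string-matching approach are assumptions for illustration.

```python
# Hypothetical spot check, not Pixalate's methodology: fetch an app's
# public App Store listing and look for a privacy policy link. Apple's
# page markup can change, so treat a negative result as inconclusive.
import sys
import requests

LOOKUP_URL = "https://itunes.apple.com/lookup"  # public iTunes Lookup API

def listing_mentions_privacy_policy(app_id: str) -> bool:
    """Crude heuristic: does the app's store page mention a privacy policy?"""
    meta = requests.get(LOOKUP_URL, params={"id": app_id}, timeout=10).json()
    if not meta.get("results"):
        raise ValueError(f"No App Store record found for id {app_id}")
    page_url = meta["results"][0]["trackViewUrl"]  # the app's store page
    page = requests.get(page_url, timeout=10)
    return "privacy policy" in page.text.lower()

if __name__ == "__main__":
    app_id = sys.argv[1] if len(sys.argv) > 1 else "284882215"  # arbitrary example ID
    print("privacy policy link found:", listing_mentions_privacy_policy(app_id))
```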
Jalal Nasir, CEO of Pixalate, explains there are multiple risks for app developers who ignore the regulations and names three that stand out.
“The first involves FTC fines, settlements, oversight and other similar remedies, and the second concerns losing the ability to monetize, for example undergoing business practices scrutiny, or being dropped by ad partners as they look to shed risk,” he says. “The third involves losing your customers’ trust.”
Krishna Vishnubhotla, vice president of product strategy at Zimperium, says the penalties for violating the COPPA Rule can be substantial.
“The FTC has the authority to bring enforcement actions against violators, and it may seek civil penalties of up to $43,280 per violation,” he explains.
In addition to monetary penalties, violators may also be required to take corrective action, such as deleting children's personal information that was collected in violation of COPPA.
“The larger issue is that it can also harm a company’s reputation and lead to a loss of trust among customers and partners,” Vishnubhotla says.
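Those per-violation figures compound quickly at app-scale user counts. The arithmetic below is illustrative only: how the FTC counts individual violations varies case by case, actual penalties are negotiated rather than mechanically multiplied, and the violation count shown is hypothetical.

```python
# Illustrative only: upper-bound exposure at the per-violation ceiling
# cited above. How the FTC counts "violations" varies by case.
PER_VIOLATION_CEILING_USD = 43_280  # figure quoted by Vishnubhotla

def max_exposure(violation_count: int) -> int:
    """Worst case if every violation drew the statutory ceiling."""
    return violation_count * PER_VIOLATION_CEILING_USD

# e.g., 1,000 improperly collected child records treated as violations
print(f"${max_exposure(1_000):,}")  # $43,280,000
```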
Holes in Apple Oversight
Nasir says that for smaller developers, lack of awareness and lack of resources are likely two of the biggest factors behind failed COPPA compliance. But the biggest factor may be Apple's failure to provide reasonable oversight.
Apple, as the gatekeeper of its app ecosystem, needs to take a much stronger stance in protecting children’s privacy, he says.
“Apple not only needs to do a much better job of giving its developers the resources necessary to be compliant with the latest privacy laws, but it also seems to need to step up monitoring and enforcement of its own app store policies,” Nasir says.
He adds that if app developers work with any third parties — including any advertising partners — they must ensure that those partners also have compliant privacy policies and practices.
“It creates a complicated web,” Nasir says.
The real problem, he says, is the vast number of apps and the frequency of releases, which make it difficult for the Federal Trade Commission or any other organization to check for noncompliance.
“Stores must provide these regulatory organizations with a dashboard or notifications in order to make it viable,” he notes. “Public stores may not allow that. It’s almost impossible to accomplish this in a proactive, feasible, and structured manner.”
“Technical Debt” for App Developers
Melissa Bischoping, director of endpoint security research at Tanium, says application development teams must balance available resources, regulatory requirements, and competing business priorities in their planning and execution.
“While almost no one would disagree that privacy and protection of children’s data is always a priority, the technical debt and workload can make engineering the compliance a long and expensive process,” she says.
She adds that complying with these and other regulations, as well as security best practices, requires not only the talent to implement the solutions but also adequate staff to test and verify that the designed solution meets the desired outcome.
“This is, effectively, engineering a safer airplane while it’s in flight,” she explains. “It takes the work of multiple teams to engineer and assure. This effort, and others like it that are centered around protecting vulnerable populations, cannot be ignored.”
Source: www.darkreading.com