The Problem of TikTok

I have been mulling over this piece since the parallel rise, over the past couple of years, of TikTok and of concerns about the service’s connections to China — or, more specifically, to its intelligence and military arms. I have dropped bits and pieces of it before, but a letter sent jointly by FCC commissioner Brendan Carr to Tim Cook and Sundar Pichai makes now a good time to bring all those thoughts together. Unfortunately, the letter has only been posted to Twitter as a series of images without descriptive text because I guess Carr hates people who use screen readers:

I am writing the two of you because Apple and Google hold themselves out as operating app stores that are safe and trusted places to discover and download apps. Nonetheless, Apple and Google have reviewed and approved the TikTok app for inclusion in your respective app stores. Indeed, statistics show that TikTok has been downloaded in the U.S. from the Apple App Store and the Google Play Store nearly 19 million times in the first quarter of this year alone. It is clear that TikTok poses an unacceptable national security risk due to its extensive data harvesting being combined with Beijing’s apparently unchecked access to that sensitive data. But it is also clear that TikTok’s pattern of conduct and misrepresentations regarding the unfettered access that persons in Beijing have to sensitive U.S. user data — just some of which is detailed below — puts it out of compliance with the policies that both of your companies require every app to adhere to as a condition of remaining available on your app stores. Therefore, I am requesting that you apply the plain text of your app store policies to TikTok and remove it from your app stores for failure to abide by those terms.

As a reminder, Carr works for the FCC, not the FTC. Nor does Carr work for the Department of Commerce, which was most recently tasked with eradicating TikTok from the United States. While frequent readers will know how much I appreciate a regulator doing their job and making tough demands, I feel Carr’s fury is misplaced and, perhaps, a little disingenuous.

Carr’s letter follows Emily Baker-White’s reporting earlier this month for Buzzfeed News about the virtually nonexistent wall between U.S. user data collected by TikTok and employees at ByteDance, its parent company in China. The concern, Baker-White says, is over claims of persistent backdoors, connected to Chinese military or intelligence agencies, that allow access to users’ “nonpublic data”. The ostensible severing of ties between ByteDance and TikTok’s U.S. users is referred to internally as “Project Texas”:

TikTok’s goal for Project Texas is that any data stored on the Oracle server will be secure and not accessible from China or elsewhere globally. However, according to seven recordings between September 2021 and January 2022, the lawyer leading TikTok’s negotiations with CFIUS and others clarify that this only includes data that is not publicly available on the app, like content that is in draft form, set to private, or information like users’ phone numbers and birthdays that is collected but not visible on their profiles. A Booz Allen Hamilton consultant told colleagues in September 2021 that what exactly will count as “protected data” that will be stored in the Oracle server was “still being ironed out from a legal perspective.”

In a recorded January 2022 meeting, the company’s head of product and user operations announced with a laugh that unique IDs (UIDs) will not be considered protected information under the CFIUS agreement: “The conversation continues to evolve,” they said. “We recently found out that UIDs are things we can have access to, which changes the game a bit.”

What the product and user operations head meant by “UID” in this circumstance is not clear — it could refer to an identifier for a specific TikTok account, or for a device. Device UIDs are typically used by ad tech companies like Google and Facebook to link your behavior across apps, making them nearly as important an identifier as your name.

It has become a cliché by now to point out that TikTok’s data collection practices are no more invasive or expansive than those of American social media giants. It is also a shallow comparison. The concerns raised by Carr and others are explicitly related to the company’s Chinese parentage, not simply the pure privacy violations of collecting all that information.

But, you know, maybe they should be worried about that simpler situation. I think Baker-White buried the lede in that big, long Buzzfeed story:

Project Texas’s narrow focus on the security of a specific slice of US user data, much of which the Chinese government could simply buy from data brokers if it so chose, does not address fears that China, through ByteDance, could use TikTok to influence Americans’ commercial, cultural, or political behavior.

This piece is almost entirely about users’ private data being accessible by staff apparently operating as agents of a foreign government; almost none of it is about its algorithm influencing behaviour.1 So it is wild to read, in the first half of this sentence, that a great deal of the piece’s concerns about TikTok collecting user data can be effectively undone if its management clicks the “Add to Cart” button on a data broker’s website. Those are the privacy concerns churning away in the back room. What is happening in the front office?

From Carr’s letter:

In March 2020, researchers discovered that TikTok, through its app in the Apple App Store, was accessing users’ most sensitive data, including passwords, cryptocurrency wallet addresses, and personal messages.

This seems to be a reference to Talal Haj Bakry and Tommy Mysk’s research showing that TikTok and a large number of other popular apps were automatically reading the iOS clipboard — or “pasteboard” in iOS parlance. It sounds bad, but it is not clear that TikTok actually received any of this pasted data.

There are also many non-insidious reasons why this could have been the case. In a statement to Ars Technica, TikTok said it was related to an anti-spam feature. I am not sure that is believable, but I also do not have a good reason to think it was secretly monitoring users’ copied data when there are other innocent explanations. Google’s Firebase platform, for example, automatically retrieves the pasteboard by default when using its Dynamic Links feature. TikTok probably does not use Firebase; all I am saying is that pasteboard access is not, in and of itself, a reason to think the worst. At any rate, suspicious pasteboard access is one reason pasting stuff in iOS has become increasingly irritating, and why there is a whole new paste control in iOS 16.
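For anyone curious about the mechanics, here is a minimal Swift sketch — my own illustration, not TikTok’s actual code — of how any iOS app can read the shared pasteboard. Since iOS 14, a read like this triggers the system’s “pasted from” banner, and iOS 16 added UIPasteControl so apps can get at the pasteboard only through an explicit user action.

```swift
import UIKit

// A minimal illustration (not TikTok's actual code) of how any app can read
// the shared pasteboard. On iOS 14 and later, the read itself triggers the
// system's "pasted from …" banner.
func readClipboardText() -> String? {
    let pasteboard = UIPasteboard.general

    // hasStrings only checks whether text is present; it does not read the
    // contents, so it does not count as pasteboard access.
    guard pasteboard.hasStrings else { return nil }

    // This is the call that actually reads whatever the user last copied —
    // a password, a cryptocurrency wallet address, a private message, anything.
    return pasteboard.string
}
```

From the outside, there is no way to tell whether an app that makes this call merely checks the contents or transmits them somewhere, which is why the behaviour looked alarming even if the explanation really was an anti-spam check.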

Also Carr:

In August 2020, TikTok circumvented a privacy safeguard in Google’s Android operating system to obtain data that allowed it to track users online.

This appears to reference TikTok’s collection of MAC addresses — a behaviour that, while against Google’s policies, is not exclusive to TikTok. It may be more concerning when TikTok does it, but the company could obtain the same information from data brokers (PDF).

That is what this is really all about. The problem of TikTok is the problem of worldwide privacy failures. There are apps on your phone collecting historically unprecedented amounts of information about your online and offline behaviour. There are companies that buy all of it — often nominally de-identified — and many of them offer “enrichment” services that mix information from different sources to create more comprehensive records. Those businesses, in turn, provide those richer profiles to other businesses — or journalists — which now have the ability to identify you individually. Doing so is often counterproductive, and these companies promise they never would, but it is completely possible and entirely legal — in the U.S., at least. It should be noted that, while the whole world is grappling with privacy problems, some of these failures are specific to the United States.

One of the concerns Carr enumerated in his letter is TikTok’s collection of “biometric identifiers, including faceprints […] and voiceprints”. When this language was added last year, it seemed to give the company legal cover for features like automatic captions and face filters, both of which involve things that are considered biometric identifiers. But the change was only made to the U.S.-specific privacy policy; TikTok has three. When I looked at them, the U.S. one seemed more permissive than those for Europe or the rest of the world, but I am not a lawyer, and things may have changed since.2
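To see why those features implicate biometric identifiers, consider the kind of on-device analysis a face filter needs. The sketch below is purely illustrative — TikTok’s actual implementation is not public — and uses Apple’s Vision framework to extract facial landmarks from a frame; that extracted geometry is the sort of data privacy laws and policies describe as a “faceprint”.

```swift
import Vision
import CoreImage

// Illustrative only: extract facial landmarks (eyes, nose, face contour) from
// a single frame. A face filter needs this geometry to anchor its effects;
// stored or transmitted, the same geometry is what biometric-privacy rules
// tend to treat as a "faceprint". This is not TikTok's code.
func faceLandmarks(in frame: CIImage) throws -> [VNFaceObservation] {
    let request = VNDetectFaceLandmarksRequest()
    let handler = VNImageRequestHandler(ciImage: frame, options: [:])
    try handler.perform([request])
    return request.results as? [VNFaceObservation] ?? []
}
```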

One of the frustrating characteristics of Carr’s letter is that he is, in many ways, completely right — and I just wish he had raised these concerns about everything else to which they apply. From the perspective of a non-American, his concerns about intrusive surveillance reflect those I have about my data being stored under the control of American companies operating under American laws. Sure, Canada is both an ally and a participant in the Five Eyes group. But it is hard to be reassured by that when the U.S. has lost its moral high ground by wiretapping allies and entire countries.

It is worse that an authoritarian surveillance state may be doing the snooping. But the moral and ethical problems are basically the same, so it is hard to read even the most extreme interpretation of Carr’s letter without detecting some amount of hypocrisy. The U.S. decided not to pass adequate privacy legislation at any point in the past fifteen years of accelerating intrusive practices — by tech companies, internet service providers, ad networks, and the data brokers tying them all together — and it has its own insidious and overreaching intelligence practices. Even if Carr has a point, TikTok is not the problem; it is one entity taking advantage of a wildly problematic system.


  1. On the question of influence, there is room for nuance. A response to that is a whole different article. ↥︎

  2. TikTok is far from the only company to have region-specific privacy policies. I would love to see a legal analysis and comparison. ↥︎