TIKTOK v. GARLAND
Supreme Court Cases
24-656 (2025)
NOTICE: This opinion is subject to formal revision before publication in the United States Reports. Readers are requested to notify the Reporter of Decisions, Supreme Court of the United States, Washington, D. C. 20543, pio@supremecourt.gov, of any typographical or other formal errors.
SUPREME COURT OF THE UNITED STATES
_________________
Nos. 24–656 and 24–657
_________________
TIKTOK INC., et al., PETITIONERS (24–656)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
BRIAN FIREBAUGH, et al., PETITIONERS (24–657)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
ON APPLICATIONS FOR INJUNCTION PENDING REVIEW TO THE UNITED STATES COURT OF APPEALS FOR THE DISTRICT OF COLUMBIA CIRCUIT.
[January 17, 2025]
PER CURIAM.
As of January 19, the Protecting Americans from Foreign Adversary Controlled Applications Act will make it unlawful for companies in the United States to provide services to distribute, maintain, or update the social media platform TikTok, unless U. S. operation of the platform is severed from Chinese control. Petitioners are two TikTok operating entities and a group of U. S. TikTok users. We consider whether the Act, as applied to petitioners, violates the First Amendment.
In doing so, we are conscious that the cases before us involve new technologies with transformative capabilities. This challenging new context counsels caution on our part. As Justice Frankfurter advised 80 years ago in considering the application of established legal rules to the "totally new problems" raised by the airplane and radio, we should take care not to "embarrass the future." Northwest Airlines, Inc. v. Minnesota, 322 U.S. 292, 300 (1944). That caution is heightened in these cases, given the expedited time allowed for our consideration.[1] Our analysis must be understood to be narrowly focused in light of these circumstances.
I
A
TikTok is a social media platform that allows users to create, publish, view, share, and interact with short videos overlaid with audio and text. Since its launch in 2017, the platform has accumulated over 170 million users in the United States and more than one billion worldwide. Those users are prolific content creators and viewers. In 2023, U. S. TikTok users uploaded more than 5.5 billion videos, which were in turn viewed more than 13 trillion times around the world.
Opening the TikTok application brings a user to the "For You" page—a personalized content feed tailored to the user's interests. TikTok generates the feed using a proprietary algorithm that recommends videos to a user based on the user's interactions with the platform. Each interaction a user has on TikTok—watching a video, following an account, leaving a comment—enables the recommendation system to further tailor a personalized content feed.
A TikTok user's content feed is also shaped by content moderation and filtering decisions. TikTok uses automated and human processes to remove content that violates the platform's community guidelines. See 1 App. 493–497. TikTok also promotes or demotes certain content to advance its business objectives and other goals. See id., at 499–501.
TikTok is operated in the United States by TikTok Inc., an American company incorporated and headquartered in California. TikTok Inc.'s ultimate parent company is ByteDance Ltd., a privately held company that has operations in China. ByteDance Ltd. owns TikTok's proprietary algorithm, which is developed and maintained in China. The company is also responsible for developing portions of the source code that runs the TikTok platform. ByteDance Ltd. is subject to Chinese laws that require it to "assist or cooperate" with the Chinese Government's "intelligence work" and to ensure that the Chinese Government has "the power to access and control private data" the company holds. H. R. Rep. No. 118–417, p. 4 (2024) (H. R. Rep.); see 2 App. 673–676.
B
1
In recent years, U. S. government officials have taken repeated actions to address national security concerns regarding the relationship between China and TikTok.
In August 2020, President Trump issued an Executive Order finding that "the spread in the United States of mobile applications developed and owned by companies in [China] continues to threaten the national security, foreign policy, and economy of the United States." Exec. Order No. 13942, 3 CFR 412 (2021). President Trump determined that TikTok raised particular concerns, noting that the platform "automatically captures vast swaths of information from its users" and is susceptible to being used to further the interests of the Chinese Government. Ibid. The President invoked his authority under the International Emergency Economic Powers Act (IEEPA), 50 U. S. C. §1701 et seq., and the National Emergencies Act, 50 U. S. C. §1601 et seq., to prohibit certain "transactions" involving ByteDance Ltd. or its subsidiaries, as identified by the Secretary of Commerce. 3 CFR 413. The Secretary published a list of prohibited transactions in September 2020. See 85 Fed. Reg. 60061 (2020). But federal courts enjoined the prohibitions before they took effect, finding that they exceeded the Executive Branch's authority under IEEPA. See generally TikTok Inc. v. Trump, 507 F. Supp. 3d 92 (DC 2020); Marland v. Trump, 498 F. Supp. 3d 624 (ED Pa. 2020).
Just days after issuing his initial Executive Order, President Trump ordered ByteDance Ltd. to divest all interests and rights in any property "used to enable or support ByteDance's operation of the TikTok application in the United States," along with "any data obtained or derived from" U. S. TikTok users. 85 Fed. Reg. 51297. ByteDance Ltd. and TikTok Inc. filed suit in the D. C. Circuit, challenging the constitutionality of the order. In February 2021, the D. C. Circuit placed the case in abeyance to permit the Biden administration to review the matter and to enable the parties to negotiate a non-divestiture remedy that would address the Government's national security concerns. See Order in TikTok Inc. v. Committee on Foreign Investment, No. 20–1444 (CADC, Feb. 19, 2021).
Throughout 2021 and 2022, ByteDance Ltd. negotiated with Executive Branch officials to develop a national security agreement that would resolve those concerns. Executive Branch officials ultimately determined, however, that ByteDance Ltd.'s proposed agreement did not adequately "mitigate the risks posed to U. S. national security interests." 2 App. 686. Negotiations stalled, and the parties never finalized an agreement.
2
Against this backdrop, Congress enacted the Protecting Americans from Foreign Adversary Controlled Applications Act. Pub. L. 118–50, div. H, 138 Stat. 955. The Act makes it unlawful for any entity to provide certain services to "distribute, maintain, or update" a "foreign adversary controlled application" in the United States. §2(a)(1). Entities that violate this prohibition are subject to civil enforcement actions and hefty monetary penalties. See §§2(d)(1)(A), (d)(2)(B).
The Act provides two means by which an application may be designated a "foreign adversary controlled application." First, the Act expressly designates any application that is "operated, directly or indirectly," by "ByteDance Ltd." or "TikTok," or any subsidiary or successor thereof. §2(g)(3)(A). Second, the Act establishes a general designation framework for any application that is both (1) operated by a "covered company" that is "controlled by a foreign adversary," and (2) "determined by the President to present a significant threat to the national security of the United States," following a public notice and reporting process. §2(g)(3)(B). In broad terms, the Act defines "covered company" to include a company that operates an application that enables users to generate, share, and view content and has more than 1,000,000 monthly active users. §2(g)(2)(A). The Act excludes from that definition a company that operates an application "whose primary purpose is to allow users to post product reviews, business reviews, or travel information and reviews." §2(g)(2)(B).
The Act's prohibitions take effect 270 days after an application is designated a foreign adversary controlled application. §2(a)(2). Because the Act itself designates applications operated by "ByteDance, Ltd." and "TikTok," prohibitions as to those applications take effect 270 days after the Act's enactment—January 19, 2025.
The Act exempts a foreign adversary controlled application from the prohibitions if the application undergoes a "qualified divestiture." §2(c)(1). A "qualified divestiture" is one that the President determines will result in the application "no longer being controlled by a foreign adversary." §2(g)(6)(A). The President must further determine that the divestiture "precludes the establishment or maintenance of any operational relationship between the United States operations of the [application] and any formerly affiliated entities that are controlled by a foreign adversary, including any cooperation with respect to the operation of a content recommendation algorithm or an agreement with respect to data sharing." §2(g)(6)(B). The Act permits the President to grant a one-time extension of no more than 90 days with respect to the prohibitions' 270-day effective date if the President makes certain certifications to Congress regarding progress toward a qualified divestiture. §2(a)(3).
C
ByteDance Ltd. and TikTok Inc.—along with two sets of TikTok users and creators (creator petitioners)—filed petitions for review in the D. C. Circuit, challenging the constitutionality of the Act. As relevant here, the petitioners argued that the Act's prohibitions, TikTok-specific foreign adversary controlled application designation, and divestiture requirement violate the First Amendment.
The D. C. Circuit consolidated and denied the petitions, holding that the Act does not violate petitioners' First Amendment rights. 122 F. 4th 930, 940, 948–965 (CADC 2024). After first concluding that the Act was subject to heightened scrutiny under the First Amendment, the court assumed without deciding that strict, rather than intermediate, scrutiny applied. Id., at 948–952. The court held that the Act satisfied that standard, finding that the Government's national security justifications—countering China's data collection and covert content manipulation efforts—were compelling, and that the Act was narrowly tailored to further those interests. Id., at 952–965.
Chief Judge Srinivasan concurred in part and in the judgment. Id., at 970. In his view, the Act was subject to intermediate scrutiny, id., at 974–979, and was constitutional under that standard, id., at 979–983.
We granted certiorari to decide whether the Act, as applied to petitioners, violates the First Amendment. 604 U. S. ___ (2024).
II
A
At the threshold, we consider whether the challenged provisions are subject to First Amendment scrutiny. Laws that directly regulate expressive conduct can, but do not necessarily, trigger such review. See R. A. V. v. St. Paul, 505 U.S. 377, 382–386 (1992). We have also applied First Amendment scrutiny in "cases involving governmental regulation of conduct that has an expressive element," and to "some statutes which, although directed at activity with no expressive component, impose a disproportionate burden upon those engaged in protected First Amendment activities." Arcara v. Cloud Books, Inc., 478 U.S. 697, 703–704 (1986).
It is not clear that the Act itself directly regulates protected expressive activity, or conduct with an expressive component. Indeed, the Act does not regulate the creator petitioners at all. And it directly regulates ByteDance Ltd. and TikTok Inc. only through the divestiture requirement. See §2(c)(1). Petitioners, for their part, have not identified any case in which this Court has treated a regulation of corporate control as a direct regulation of expressive activity or semi-expressive conduct. See Tr. of Oral Arg. 37–40. We hesitate to break that new ground in this unique case.
In any event, petitioners' arguments more closely approximate a claim that the Act's prohibitions, TikTok-specific designation, and divestiture requirement "impose a disproportionate burden upon" their First Amendment activities. Arcara, 478 U. S., at 704. Petitioners assert—and the Government does not contest—that, because it is commercially infeasible for TikTok to be divested within the Act's 270-day timeframe, the Act effectively bans TikTok in the United States. Petitioners argue that such a ban will burden various First Amendment activities, including content moderation, content generation, access to a distinct medium for expression, association with another speaker or preferred editor, and receipt of information and ideas.
We have recognized a number of these asserted First Amendment interests. See Moody v. NetChoice, LLC, 603 U.S. 707, 731 (2024) ("An entity 'exercising editorial discretion in the selection and presentation' of content is 'engaged in speech activity.'" (quoting Arkansas Ed. Television Comm'n v. Forbes, 523 U.S. 666, 674 (1998); alteration omitted)); City of Ladue v. Gilleo, 512 U.S. 43, 54–58 (1994) ("Our prior decisions have voiced particular concern with laws that foreclose an entire medium of expression."); Rumsfeld v. Forum for Academic and Institutional Rights, Inc., 547 U.S. 47, 68 (2006) ("We have recognized a First Amendment right to associate for the purpose of speaking, which we have termed a 'right of expressive association.'"); Martin v. City of Struthers, 319 U.S. 141, 143 (1943) ("The right of freedom of speech and press . . . embraces the right to distribute literature and necessarily protects the right to receive it." (citation omitted)).[2] And an effective ban on a social media platform with 170 million U. S. users certainly burdens those users' expressive activity in a non-trivial way.
At the same time, a law targeting a foreign adversary's control over a communications platform is in many ways different in kind from the regulations of non-expressive activity that we have subjected to First Amendment scrutiny. Those differences—the Act's focus on a foreign government, the congressionally determined adversary relationship between that foreign government and the United States, and the causal steps between the regulations and the alleged burden on protected speech—may impact whether First Amendment scrutiny applies.
This Court has not articulated a clear framework for determining whether a regulation of non-expressive activity that disproportionately burdens those engaged in expressive activity triggers heightened review. We need not do so here. We assume without deciding that the challenged provisions fall within this category and are subject to First Amendment scrutiny.
B
1
"At the heart of the First Amendment lies the principle that each person should decide for himself or herself the ideas and beliefs deserving of expression, consideration, and adherence." Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622, 641 (1994) (Turner I). Government action that suppresses speech because of its message "contravenes this essential right." Ibid. "Content-based laws—those that target speech based on its communicative content—are presumptively unconstitutional and may be justified only if the government proves that they are narrowly tailored to serve compelling state interests." Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015). Content-neutral laws, in contrast, "are subject to an intermediate level of scrutiny because in most cases they pose a less substantial risk of excising certain ideas or viewpoints from the public dialogue." Turner I, 512 U. S., at 642 (citation omitted). Under that standard, we will sustain a content-neutral law "if it advances important governmental interests unrelated to the suppression of free speech and does not burden substantially more speech than necessary to further those interests." Turner Broadcasting System, Inc. v. FCC, 520 U.S. 180, 189 (1997) (Turner II).
We have identified two forms of content-based speech regulation. First, a law is content based on its face if it "applies to particular speech because of the topic discussed or the idea or message expressed." Reed, 576 U. S., at 163; see id., at 163–164 (explaining that some facial distinctions define regulated speech by subject matter, others by the speech's function or purpose). Second, a facially content-neutral law is nonetheless treated as a content-based regulation of speech if it "cannot be 'justified without reference to the content of the regulated speech'" or was "adopted by the government 'because of disagreement with the message the speech conveys.'" Id., at 164 (quoting Ward v. Rock Against Racism, 491 U.S. 781, 791 (1989)).
As applied to petitioners, the challenged provisions are facially content neutral and are justified by a content-neutral rationale.
a
The challenged provisions are facially content neutral. They impose TikTok-specific prohibitions due to a foreign adversary's control over the platform and make divestiture a prerequisite for the platform's continued operation in the United States. They do not target particular speech based upon its content, contrast, e.g., Carey v. Brown, 447 U.S. 455, 465 (1980) (statute prohibiting all residential picketing except "peaceful labor picketing"), or regulate speech based on its function or purpose, contrast, e.g., Holder v. Humanitarian Law Project, 561 U.S. 1, 7, 27 (2010) (law prohibiting providing material support to terrorists). Nor do they impose a "restriction, penalty, or burden" by reason of content on TikTok—a conclusion confirmed by the fact that petitioners "cannot avoid or mitigate" the effects of the Act by altering their speech. Turner I, 512 U. S., at 644. As to petitioners, the Act thus does not facially regulate "particular speech because of the topic discussed or the idea or message expressed." Reed, 576 U. S., at 163.
Petitioners argue that the Act is content based on its face because it excludes from the definition of "covered company" any company that operates an application "whose primary purpose is to allow users to post product reviews, business reviews, or travel information and reviews." §2(g)(2)(B); see Brief for Petitioners in No. 24–656, pp. 26–27 (Brief for TikTok); Brief for Petitioners in No. 24–657, p. 26 (Brief for Creator Petitioners). We need not decide whether that exclusion is content based. The question before the Court is whether the Act violates the First Amendment as applied to petitioners. To answer that question, we look to the provisions of the Act that give rise to the effective TikTok ban that petitioners argue burdens their First Amendment rights. The exclusion for certain review platforms, however, applies only to the general framework for designating applications controlled by "covered compan[ies]," not to the TikTok-specific designation. §§2(g)(3)(A)–(B). As such, the exclusion is not within the scope of petitioners' as-applied challenge.
b
The Government also supports the challenged provisions with a content-neutral justification: preventing China from collecting vast amounts of sensitive data from 170 million U. S. TikTok users. 2 App. 628. That rationale is decidedly content agnostic. It neither references the content of speech on TikTok nor reflects disagreement with the message such speech conveys. Cf. Ward, 491 U. S., at 792–793 (holding noise control and sound quality justifications behind city sound amplification guideline were content neutral).
Because the data collection justification reflects a "purpos[e] unrelated to the content of expression," it is content neutral. Id., at 791.
2
The Act's TikTok-specific distinctions, moreover, do not trigger strict scrutiny. See Brief for TikTok 26–27; Brief for Creator Petitioners 24–26. It is true that "[s]peech restrictions based on the identity of the speaker are all too often simply a means to control content." Citizens United v. Federal Election Comm'n, 558 U.S. 310, 340 (2010). For that reason, "[r]egulations that discriminate among media, or among different speakers within a single medium, often present serious First Amendment concerns." Turner I, 512 U. S., at 659. But while "laws favoring some speakers over others demand strict scrutiny when the legislature's speaker preference reflects a content preference," id., at 658, such scrutiny "is unwarranted when the differential treatment is 'justified by some special characteristic of' the particular [speaker] being regulated," id., at 660–661 (quoting Minneapolis Star & Tribune Co. v. Minnesota Comm'r of Revenue, 460 U.S. 575, 585 (1983)).
For the reasons we have explained, requiring divestiture for the purpose of preventing a foreign adversary from accessing the sensitive data of 170 million U. S. TikTok users is not "a subtle means of exercising a content preference." Turner I, 512 U. S., at 645. The prohibitions, TikTok-specific designation, and divestiture requirement regulate TikTok based on a content-neutral data collection interest. And TikTok has special characteristics—a foreign adversary's ability to leverage its control over the platform to collect vast amounts of personal data from 170 million U. S. users—that justify this differential treatment. "[S]peaker distinctions of this nature are not presumed invalid under the First Amendment." Ibid.
While we find that differential treatment was justified here, however, we emphasize the inherent narrowness of our holding. Data collection and analysis is a common practice in this digital age. But TikTok's scale and susceptibility to foreign adversary control, together with the vast swaths of sensitive data the platform collects, justify differential treatment to address the Government's national security concerns. A law targeting any other speaker would by necessity entail a distinct inquiry and separate considerations.
On this understanding, we cannot accept petitioners' call for strict scrutiny. No more than intermediate scrutiny is in order.
C
As applied to petitioners, the Act satisfies intermediate scrutiny. The challenged provisions further an important Government interest unrelated to the suppression of free expression and do not burden substantially more speech than necessary to further that interest.[3]
1
The Act's prohibitions and divestiture requirement are designed to prevent China—a designated foreign adversary—from leveraging its control over ByteDance Ltd. to capture the personal data of U. S. TikTok users. This objective qualifies as an important Government interest under intermediate scrutiny.
Petitioners do not dispute that the Government has an important and well-grounded interest in preventing China from collecting the personal data of tens of millions of U. S. TikTok users. Nor could they. The platform collects extensive personal information from and about its users. See H. R. Rep., at 3 (Public reporting has suggested that TikTok's "data collection practices extend to age, phone number, precise location, internet address, device used, phone contacts, social network connections, the content of private messages sent through the application, and videos watched."); 1 App. 241 (Draft National Security Agreement noting that TikTok collects user data, user content, behavioral data (including "keystroke patterns and rhythms"), and device and network data (including device contacts and calendars)). If, for example, a user allows TikTok access to the user's phone contact list to connect with others on the platform, TikTok can access "any data stored in the user's contact list," including names, contact information, contact photos, job titles, and notes. 2 id., at 659. Access to such detailed information about U. S. users, the Government worries, may enable "China to track the locations of Federal employees and contractors, build dossiers of personal information for blackmail, and conduct corporate espionage." 3 CFR 412. And Chinese law enables China to require companies to surrender data to the government, "making companies headquartered there an espionage tool" of China. H. R. Rep., at 4.
Rather than meaningfully dispute the scope of the data TikTok collects or the ends to which it may be used, petitioners contest probability, asserting that it is "unlikely" that China would "compel TikTok to turn over user data for intelligence-gathering purposes, since China has more effective and efficient means of obtaining relevant information." Brief for TikTok 50 (internal quotation marks omitted). In reviewing the constitutionality of the Act, however, we "must accord substantial deference to the predictive judgments of Congress." Turner I, 512 U. S., at 665 (opinion of Kennedy, J.). "Sound policymaking often requires legislators to forecast future events and to anticipate the likely impact of these events based on deductions and inferences for which complete empirical support may be unavailable." Ibid. Here, the Government's TikTok-related data collection concerns do not exist in isolation. The record reflects that China "has engaged in extensive and years-long efforts to accumulate structured datasets, in particular on U. S. persons, to support its intelligence and counterintelligence operations." 2 App. 634.
Even if China has not yet leveraged its relationship with ByteDance Ltd. to access U. S. TikTok users' data, petitioners offer no basis for concluding that the Government's determination that China might do so is not at least a "reasonable inferenc[e] based on substantial evidence." Turner II, 520 U. S., at 195. We are mindful that this law arises in a context in which "national security and foreign policy concerns arise in connection with efforts to confront evolving threats in an area where information can be difficult to obtain and the impact of certain conduct difficult to assess." Humanitarian Law Project, 561 U. S., at 34. We thus afford the Government's "informed judgment" substantial respect here. Ibid.
Petitioners further argue that the Act is underinclusive as to the Government's data protection concern, raising doubts as to whether the Government is actually pursuing that interest. In particular, petitioners argue that the Act's focus on applications with user-generated and user-shared content, along with its exclusion for certain review platforms, exempts from regulation applications that are "as capable as TikTok of collecting Americans' data." Brief for TikTok 43; see Brief for Creator Petitioners 48–49. But "the First Amendment imposes no freestanding underinclusiveness limitation," and the Government "need not address all aspects of a problem in one fell swoop." Williams-Yulee v. Florida Bar, 575 U.S. 433, 449 (2015) (internal quotation marks omitted). Furthermore, as we have already concluded, the Government had good reason to single out TikTok for special treatment. Contrast Brown v. Entertainment Merchants Assn., 564 U.S. 786, 802 (2011) (singling out purveyors of video games for disfavored treatment without a persuasive reason "raise[d] serious doubts about whether the government [wa]s in fact pursuing the interest it invoke[d], rather than disfavoring a particular speaker or viewpoint"). On this record, Congress was justified in specifically addressing its TikTok-related national security concerns.
2
As applied to petitioners, the Act is sufficiently tailored to address the Government's interest in preventing a foreign adversary from collecting vast swaths of sensitive data about the 170 million U. S. persons who use TikTok. To survive intermediate scrutiny, "a regulation need not be the least speech-restrictive means of advancing the Government's interests." Turner I, 512 U. S., at 662. Rather, the standard "is satisfied 'so long as the regulation promotes a substantial government interest that would be achieved less effectively absent the regulation'" and does not "burden substantially more speech than is necessary" to further that interest. Ward, 491 U. S., at 799 (quoting United States v. Albertini, 472 U.S. 675, 689 (1985); alteration omitted).
The challenged provisions meet this standard. The provisions clearly serve the Government's data collection interest "in a direct and effective way." Ward, 491 U. S., at 800. The prohibitions account for the fact that, absent a qualified divestiture, TikTok's very operation in the United States implicates the Government's data collection concerns, while the requirements that make a divestiture "qualified" ensure that those concerns are addressed before TikTok resumes U. S. operations. Neither the prohibitions nor the divestiture requirement, moreover, is "substantially broader than necessary to achieve" this national security objective. Ibid. Rather than ban TikTok outright, the Act imposes a conditional ban. The prohibitions prevent China from gathering data from U. S. TikTok users unless and until a qualified divestiture severs China's control.
Petitioners parade a series of alternatives—disclosure requirements, data sharing restrictions, the proposed national security agreement, the general designation provision—that they assert would address the Government's data collection interest in equal measure to a conditional TikTok ban. Those alternatives do not alter our tailoring analysis.
Petitioners' proposed alternatives ignore the "latitude" we afford the Government to design regulatory solutions to address content-neutral interests. Turner II, 520 U. S., at 213. "So long as the means chosen are not substantially broader than necessary to achieve the government's interest, . . . the regulation will not be invalid simply because a court concludes that the government's interest could be adequately served by some less-speech-restrictive alternative." Ward, 491 U. S., at 800; see ibid. (regulation valid despite availability of less restrictive "alternative regulatory methods"); Albertini, 472 U. S., at 689; Clark v. Community for Creative Non-Violence, 468 U.S. 288, 299 (1984); Members of City Council of Los Angeles v. Taxpayers for Vincent, 466 U.S. 789, 815–816 (1984). For the reasons we have explained, the challenged provisions are "not substantially broader than necessary" to address the Government's data collection concerns. Ward, 491 U. S., at 800. Nor did the Government ignore less restrictive approaches already proven effective. Contrast McCullen v. Coakley, 573 U.S. 464, 490–494 (2014) (state law burdened substantially more speech than necessary where State had not considered less restrictive measures successfully adopted by other jurisdictions). The validity of the challenged provisions does not turn on whether we agree with the Government's conclusion that its chosen regulatory path is best or "most appropriate." Albertini, 472 U. S., at 689. "We cannot displace [the Government's] judgment respecting content-neutral regulations with our own, so long as its policy is grounded on reasonable factual findings supported by evidence that is substantial for a legislative determination." Turner II, 520 U. S., at 224. Those requirements are met here.
D
In addition to the data collection concerns addressed above, the Government asserts an interest in preventing a foreign adversary from having control over the recommendation algorithm that runs a widely used U. S. communications platform, and from being able to wield that control to alter the content on the platform in an undetectable manner. See 2 App. 628. In petitioners' view, that rationale is a content-based justification that "taint[s]" the Government's data collection interest and triggers strict scrutiny. Brief for TikTok 41.
Petitioners have not pointed to any case in which this Court has assessed the appropriate level of First Amendment scrutiny for an Act of Congress justified on both content-neutral and content-based grounds. They assert, however, that the challenged provisions are subject to—and fail—strict scrutiny because Congress would not have passed the provisions absent the foreign adversary control rationale. See Brief for TikTok 41–42; Brief for Creator Petitioners 47–50. We need not determine the proper standard for mixed-justification cases or decide whether the Government's foreign adversary control justification is content neutral. Even assuming that rationale turns on content, petitioners' argument fails under the counterfactual analysis they propose: The record before us adequately supports the conclusion that Congress would have passed the challenged provisions based on the data collection justification alone.
To start, the House Report focuses overwhelmingly on the Government's data collection concerns, noting the "breadth" of TikTok's data collection, "the difficulty in assessing precisely which categories of data" the platform collects, the "tight interlinkages" between TikTok and the Chinese Government, and the Chinese Government's ability to "coerc[e]" companies in China to "provid[e] data." H. R. Rep., at 3; see id., at 5–12 (recounting a five-year record of Government actions raising and attempting to address those very concerns). Indeed, it does not appear that any legislator disputed the national security risks associated with TikTok's data collection practices, and nothing in the legislative record suggests that data collection was anything but an overriding congressional concern. We are especially wary of parsing Congress's motives on this record with regard to an Act passed with striking bipartisan support. See 170 Cong. Rec. H1170 (Mar. 13, 2024) (352–65); 170 Cong. Rec. S2992 (Apr. 23, 2024) (79–18).
Petitioners assert that the text of the Act itself undermines this conclusion. In particular, they argue that the Government's data collection rationale cannot justify the requirement that a qualified divestiture preclude "any operational relationship" that allows for "cooperation with respect to the operation of a content recommendation algorithm or an agreement with respect to data sharing." §2(g)(6)(B); see Brief for Creator Petitioners 48–49. We disagree. The Government has explained that ByteDance Ltd. uses the data it collects to train the TikTok recommendation algorithm, which is developed and maintained in China. According to the Government, ByteDance Ltd. has previously declined to agree to stop collecting U. S. user data or sending that data to China to train the algorithm. See 2 App. 705–706. The Government has further noted the difficulties associated with monitoring data sharing between ByteDance Ltd. and TikTok Inc. See id., at 692–697. Under these circumstances, we find the Government's data collection justification sufficient to sustain the challenged provisions.
* * *
There is no doubt that, for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement, and source of community. But Congress has determined that divestiture is necessary to address its well-supported national security concerns regarding TikTok's data collection practices and relationship with a foreign adversary. For the foregoing reasons, we conclude that the challenged provisions do not violate petitioners' First Amendment rights.
The judgment of the United States Court of Appeals for the District of Columbia Circuit is affirmed.
It is so ordered.
Notes
[1] Applications for an injunction pending review were filed on December 16, 2024; we construed the applications as petitions for a writ of certiorari and granted them on December 18, 2024; and oral argument was held on January 10, 2025.
[2] To the extent that ByteDance Ltd.'s asserted expressive activity occurs abroad, that activity is not protected by the First Amendment. See Agency for Int'l Development v. Alliance for Open Society Int'l, Inc., 591 U.S. 430, 436 (2020) ("[F]oreign organizations operating abroad have no First Amendment rights.").
[3] Our holding and analysis are based on the public record, without reference to the classified evidence the Government filed below.
SUPREME COURT OF THE UNITED STATES
_________________
Nos. 24–656 and 24–657
_________________
TIKTOK INC., et al., PETITIONERS (24–656)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
BRIAN FIREBAUGH, et al., PETITIONERS (24–657)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
ON APPLICATIONS FOR INJUNCTION PENDING REVIEW TO THE UNITED STATES COURT OF APPEALS FOR THE DISTRICT OF COLUMBIA CIRCUIT.
[January 17, 2025]
Justice Sotomayor, concurring in part and concurring in the judgment.
I join all but Part II.A of the Court's per curiam opinion. I see no reason to assume without deciding that the Act implicates the First Amendment because our precedent leaves no doubt that it does.
TikTok engages in expressive activity by "compiling and curating" material on its platform. Moody v. NetChoice, LLC, 603 U.S. 707, 731 (2024). Laws that "impose a disproportionate burden" upon those engaged in expressive activity are subject to heightened scrutiny under the First Amendment. Arcara v. Cloud Books, Inc., 478 U.S. 697, 704 (1986); see Minneapolis Star & Tribune Co. v. Minnesota Comm'r of Revenue, 460 U.S. 575, 581–585 (1983). The challenged Act plainly imposes such a burden: It bars any entity from distributing TikTok's speech in the United States, unless TikTok undergoes a qualified divestiture. The Act, moreover, effectively prohibits TikTok from collaborating with certain entities regarding its "content recommendation algorithm" even following a qualified divestiture. §2(g)(6)(B), 138 Stat. 959. And the Act implicates content creators' "right to associate" with their preferred publisher "for the purpose of speaking." Rumsfeld v. Forum for Academic and Institutional Rights, Inc., 547 U.S. 47, 68 (2006). That, too, calls for First Amendment scrutiny.
As to the remainder of the per curiam opinion, I agree that the Act survives petitioners' First Amendment challenge.
SUPREME COURT OF THE UNITED STATES
_________________
Nos. 24–656 and 24–657
_________________
TIKTOK INC., et al., PETITIONERS (24–656)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
BRIAN FIREBAUGH, et al., PETITIONERS (24–657)
v.
MERRICK B. GARLAND, ATTORNEY GENERAL
ON APPLICATIONS FOR INJUNCTION PENDING REVIEW TO THE UNITED STATES COURT OF APPEALS FOR THE DISTRICT OF COLUMBIA CIRCUIT.
[January 17, 2025]
Justice Gorsuch, concurring in judgment.
We have had a fortnight to resolve, finally and on the merits, a major First Amendment dispute affecting more than 170 million Americans. Briefing finished on January 3, argument took place on January 10, and our opinions issue on January 17, 2025. Given those conditions, I can sketch out only a few, and admittedly tentative, observations.
First, the Court rightly refrains from endorsing the government's asserted interest in preventing "the covert manipulation of content" as a justification for the law before us. Brief for Respondent 37. One man's "covert content manipulation" is another's "editorial discretion." Journalists, publishers, and speakers of all kinds routinely make less-than-transparent judgments about what stories to tell and how to tell them. Without question, the First Amendment has much to say about the right to make those choices. It makes no difference that Americans (like TikTok Inc. and many of its users) may wish to make decisions about what they say in concert with a foreign adversary. "Those who won our independence" knew the vital importance of the "freedom to think as you will and to speak as you think," as well as the dangers that come with repressing the free flow of ideas. Whitney v. California, 274 U.S. 357, 375 (1927) (Brandeis, J., concurring). They knew, too, that except in the most extreme situations, "the fitting remedy for evil counsels is good ones." Ibid. Too often in recent years, the government has sought to censor disfavored speech online, as if the internet were somehow exempt from the full sweep of the First Amendment. See, e.g., Murthy v. Missouri, 603 U.S. 43, 76–78 (2024) (Alito, J., dissenting). But even as times and technologies change, "the principle of the right to free speech is always the same." Abrams v. United States, 250 U.S. 616, 628 (1919) (Holmes, J., dissenting).
Second, I am pleased that the Court declines to consider the classified evidence the government has submitted to us but shielded from petitioners and their counsel. Ante, at 13, n. 3. Efforts to inject secret evidence into judicial proceedings present obvious constitutional concerns. Usually, "the evidence used to prove the Government's case must be disclosed to the individual so that he has an opportunity to show that it is untrue." Greene v. McElroy, 360 U.S. 474, 496 (1959). Maybe there is a way to handle classified evidence that would afford a similar opportunity in cases like these. Maybe, too, Congress or even the Standing Committee on Rules of Practice and Procedure would profit from considering the question. Cf. United States v. Zubaydah, 595 U.S. 195, 245 (2022) (Gorsuch, J., dissenting). But as the Court recognizes, we have no business considering the government's secret evidence here.
Third, I harbor serious reservations about whether the law before us is "content neutral" and thus escapes "strict scrutiny." See ante, at 9–12; Brief for Petitioners in No. 24–656, pp. 25–31; Brief for Petitioners in No. 24–657, pp. 24–26; Reply Brief in No. 24–656, pp. 10–12; Reply Brief in No. 24–657, pp. 8–11. More than that, while I do not doubt that the various "tiers of scrutiny" discussed in our case law—"rational basis, strict scrutiny, something(s) in between"—can help focus our analysis, I worry that litigation over them can sometimes take on a life of its own and do more to obscure than to clarify the ultimate constitutional questions. Riddle v. Hickenlooper, 742 F.3d 922, 932 (CA10 2014) (Gorsuch, J., concurring).
Fourth, whatever the appropriate tier of scrutiny, I am persuaded that the law before us seeks to serve a compelling interest: preventing a foreign country, designated by Congress and the President as an adversary of our Nation, from harvesting vast troves of personal information about tens of millions of Americans. The record before us establishes that TikTok mines data both from TikTok users and about millions of others who do not consent to share their information. 2 App. 659. According to the Federal Bureau of Investigation, TikTok can access "any data" stored in a consenting user's "contact list"—including names, photos, and other personal information about unconsenting third parties. Ibid. (emphasis added). And because the record shows that the People's Republic of China (PRC) can require TikTok's parent company "to cooperate with [its] efforts to obtain personal data," there is little to stop all that information from ending up in the hands of a designated foreign adversary. Id., at 696; see id., at 673–676; ante, at 3. The PRC may then use that information to "build dossiers . . . for blackmail," "conduct corporate espionage," or advance intelligence operations. 1 App. 215; see 2 App. 659. To be sure, assessing exactly what a foreign adversary may do in the future implicates "delicate" and "complex" judgments about foreign affairs and requires "large elements of prophecy." Chicago & Southern Air Lines, Inc. v. Waterman S. S. Corp., 333 U.S. 103, 111 (1948) (Jackson, J., for the Court). But the record the government has amassed in these cases after years of study supplies compelling reason for concern.
Finally, the law before us also appears appropriately tailored to the problem it seeks to address. Without doubt, the remedy Congress and the President chose here is dramatic. The law may require TikTok's parent company to divest or (effectively) shutter its U. S. operations. But before seeking to impose that remedy, the coordinate branches spent years in negotiations with TikTok exploring alternatives and ultimately found them wanting. Ante, at 4. And from what I can glean from the record, that judgment was well founded.
Consider some of the alternatives. Start with our usual and preferred remedy under the First Amendment: more speech. Supra, at 2. However helpful that might be, the record shows that warning users of the risks associated with giving their data to a foreign-adversary-controlled application would do nothing to protect nonusers' data. 2 App. 659–660; supra, at 3. Forbidding TikTok's domestic operations from sending sensitive data abroad might seem another option. But even if Congress were to impose serious criminal penalties on domestic TikTok employees who violate a data-sharing ban, the record suggests that would do little to deter the PRC from exploiting TikTok to steal Americans' data. See 1 App. 214 (noting threats from "malicious code, backdoor vulnerabilities, surreptitious surveillance, and other problematic activities tied to source code development" in the PRC); 2 App. 702 ("[A]gents of the PRC would not fear monetary or criminal penalties in the United States"). The record also indicates that the "size" and "complexity" of TikTok's "underlying software" may make it impossible for law enforcement to detect violations. Id., at 688–689; see also id., at 662. Even setting all these challenges aside, any new compliance regime could raise separate constitutional concerns—for instance, by requiring the government to surveil Americans' data to ensure that it isn't illicitly flowing overseas. Id., at 687 (suggesting that effective enforcement of a data-export ban might involve "direct U. S. government monitoring" of the "flow of U. S. user data").
Whether this law will succeed in achieving its ends, I do not know. A determined foreign adversary may just seek to replace one lost surveillance application with another. As time passes and threats evolve, less dramatic and more effective solutions may emerge. Even what might happen next to TikTok remains unclear. See Tr. of Oral Arg. 146–147. But the question we face today is not the law's wisdom, only its constitutionality. Given just a handful of days after oral argument to issue an opinion, I cannot profess the kind of certainty I would like to have about the arguments and record before us. All I can say is that, at this time and under these constraints, the problem appears real and the response to it not unconstitutional. As persuaded as I am of the wisdom of Justice Brandeis in Whitney and Justice Holmes in Abrams, their cases are not ours. See supra, at 2. Speaking with and in favor of a foreign adversary is one thing. Allowing a foreign adversary to spy on Americans is another.