
Inside the information war on Black voters

In August, about 12,000 cell phones with the Detroit area code 313 received recorded messages from “Tamika Taylor.” She claimed to be a member of a civil rights organization called Project 1599, and said she was calling to warn that applying to vote by mail could lead people’s personal information to be entered into a public government database. Her tone and language led listeners to believe she was a Black woman, and she told the largely Black listeners that the database would be used to reconcile outstanding police warrants, collect credit card debt, and track people ahead of the release of a mandatory vaccine.

But Tamika Taylor was not real, and the warnings were fake too. They were part of a campaign to suppress the Black vote in swing counties across five Rust Belt states. Earlier this month, the Michigan attorney general pressed charges against the founders of Project 1599, the notorious right-wing political operatives Jack Burkman and Jacob Wohl, for voter suppression in violation of the 1965 Voting Rights Act. They pled not guilty. 

Black voters in the Midwest, particularly in Rust Belt states such as Wisconsin, Minnesota, and Michigan, are crucial to the outcome of the 2020 election. And those communities are being deliberately targeted with messages of confusion, intimidation, and disinformation, all meant to dissuade them from voting. 

Black Americans are typically very politically engaged and form a voting bloc that leans left, but turnout in this population lagged in 2016, putting previous Democratic strongholds like Minnesota up for contention. Today, as in 2016, while one side is trying to mobilize the Black vote, the other side is trying to suppress it via information warfare. Such activity can be illegal under the Voting Rights Act of 1965, and some experts have called for new legislation against voter suppression in the age of the internet.

“It’s been tough”

The robocall campaign in Detroit preyed on existing suspicions felt by Black voters in Michigan and throughout the country. “They were just saying things that have been said on the streets, condensed in a phone call,” says Rai Lanier, an organizer at Michigan Liberation, an activist group and super PAC focused on criminal justice reform in the greater Detroit area.

For the better part of a year, Lanier has been reaching out to the same predominantly Black neighborhoods that the callers targeted in Wayne and Oakland Counties, attempting to convince people to turn out and vote. She says many voters had seen effective disinformation messages or received robocalls aimed directly at the fears of Black Americans—especially during a time when the government’s concern for their well-being feels in doubt. Black Americans are over 2.5 times more likely than whites to contract covid-19, and twice as likely to die from it. 

“It’s saying all of these insecurities that are floating around in folks’ heads, like ‘Can I trust my government to make sure that they’re not going to force a vaccine on me when we have in the Black collective memory the Tuskegee Institute experiment?’” says Lanier. “Here’s this robocall that’s saying all of this stuff out loud.”

Michigan is a political battleground in most years, but Lanier says 2020 has been particularly testing. “With the amount of misinformation that’s out there, and really just people feeling like they don’t know who to turn to, there’s not a precedent that we, as a society, are holding on to. It’s been tough,” she says.

“Deterrence”

Across Lake Michigan, Wisconsin is also a hot spot. In 2016, Black turnout dipped almost 20% from the previous election, one part of a swing that helped make Donald Trump the first Republican to win the state since Ronald Reagan. At the end of September, a report by Britain’s Channel 4 News revealed that a data set obtained from Cambridge Analytica had marked 3.5 million Black Americans in battleground states as ripe for “deterrence.” The report says Americans were sorted into eight groups that indicated how the Trump campaign should target them via social media. Over 54% of those marked with “deterrence” were people the database determined belonged to minority groups. Of Wisconsinites in the data set, 17% of those marked with “deterrence” were labeled as Black, though Black people make up only 5.4% of the state’s population. 

It’s unclear whether the campaign was at all successful in deterring Black voters in Wisconsin or elsewhere in 2016, or whether this type of individual propagandizing is happening in 2020. But what is clear is the intention of the Republican campaign last time around to strategically suppress the Black vote.

Stricter voter ID laws and increased requirements for early voting were passed by the Republican-controlled state government before 2016, in what it claimed was an attempt to reduce voter fraud; in practice, such measures lower Black turnout. Milwaukee, Wisconsin, is one of the most important cities in the 2020 race and almost 40% Black. In the primary this spring, many Milwaukee residents said they didn’t receive their requested mail-in ballots, and a shortage of poll workers led to the closing of all but five of the city’s 180 polling locations. And a challenge to a Republican-sponsored effort to purge voting rolls, an error-ridden process meant to update a voter registration database, is still before the state supreme court.

The manipulation of trust

Lanier says her most effective tool for countering this type of manipulation is trust. Michigan Liberation and TakeAction Minnesota, both part of the Win Black network, rely heavily on strategies like in-person visits, one-on-one video calls, and partnerships with local churches and community leaders. The campaigns have seen success. Kenza Hadj-Moussa, director of public affairs at TakeAction Minnesota, says that repeated door-knocking in minority communities was key to recruiting first-time voters, the winning strategy behind the election of Congresswoman Ilhan Omar. 

But online, fake versions of trust-building tactics have been used to infiltrate Black digital spaces. In 2016, the Russian Internet Research Agency (IRA) used trolls misrepresenting themselves as Black, liberal Twitter users to engage with mainstream “Black Twitter” ahead of the election. A recent study from the University of North Carolina found that tweets from Black-presenting IRA accounts got far more engagement than non-Black-presenting trolls. Last week Twitter suspended a set of accounts claiming to be Black Trump supporters for violating its rules on spam and platform manipulation.

Black women, specifically, are often disproportionately targeted by disinformation. At a congressional hearing last week, Nina Jankowicz, a disinformation expert at the Wilson Center, shared an analysis her team performed during the vice presidential debate on October 7. Tracking the number of messages on the platforms Parler and 4Chan relating to sexualized disinformation or violence against Senator Kamala Harris, the Democratic candidate, they saw increases of 631% and 1,078% respectively. This kind of online abuse, Jankowicz testified, is “meant to affect American women’s and minorities’ participation in the democratic process.”

Legal suppression and campaign strategy

Both the Biden and Trump campaigns have implemented specific communication strategies for Black voters. Notably, the Biden campaign more often talks to Black communities, and the Trump campaign tends to talk to white voters about Black communities. Perhaps President Trump’s biggest advertising focus has been on “law and order”; through September 1, more than 60% of his ads were about crime. Though Biden has also spent advertising money to condemn looting and rioting, most of his ads through the same time period centered on the coronavirus crisis. 

“Law and order” is a direct reference to the protests that rippled through the US after the murder of George Floyd by a police officer in Minnesota on May 25, the police shooting of Jacob Blake in Kenosha, Wisconsin, in August, and the subsequent fatal shooting of protesters by a white 17-year-old, Kyle Rittenhouse. Lanier says that racialized political messaging is reverberating most strongly with white people. There have been lots of calls to white people about how they “need to be ready to do what it takes to defend their families and defend their communities,” she says, “which seems frightening and a little more intense than saying, ‘Hey, can you go vote on November 3?’”

In Minnesota, the summer of protests has similarly been used in messaging predominantly to white people. Hadj-Moussa says that for Black voters, “the issues with policing and the murder of George Floyd are very independent of what’s going on in the electoral world.” But she acknowledges that the two are interrelated, and she says the events of the summer have contributed to an “edginess” that she feels on the ground.

“Making sure we can get ahead of this”

Though the Voting Rights Act protects against intimidation tactics such as armed citizens at polling places, it doesn’t specifically protect against voter deception, one of the most common tools of suppression online. 

In 2006, as a US senator, Barack Obama introduced the Deceptive Practices and Voter Intimidation Prevention Act, which never became law. Voter deception online increased dramatically in the run-up to the 2016 election and has continued, in no small part because of President Trump’s actions. In March 2019, the House passed the “For the People Act,” which includes many of Obama’s original protections against deception, but it has yet to be passed by the Senate.

Last June, Senators Amy Klobuchar and Ben Cardin filed a standalone bill, which has also yet to pass, called the Deceptive Practices and Voter Intimidation Prevention Act of 2019. “Updating federal law makes [existing] bans clearer, potentially broader, and would provide federal prosecutors the mandate to criminally enforce voter deception,” says Ian Vandewalker, senior counsel at the Brennan Center for Justice at New York University. Currently, 15 states have laws that specifically protect against voter deception.

Lanier and Hadj-Moussa both say encouraging early, mail-in voting is key to avoiding a likely onslaught of disinformation, intimidation, and suppression aimed at Black voters from now until the election. Activists remain vigilant: just before Election Day four years ago, citizens from Somali immigrant communities in the Minnesota suburbs received text messages containing disinformation about how to cast their vote. But with about 3.3 million ballots already cast in all three states, both Lanier and Hadj-Moussa are confident in Black turnout numbers.

Lanier says her confidence comes partly from the events of this summer and the lessons they have reinforced about being Black in America. 

“I think we’re definitely going to turn out. We talk a lot about why Black folks are so engaged, and I think it’s because ultimately we have to be,” she says. Black political participation happens “not because it’s a cute thing to do, but literally because our survival has always depended on the rule of law.”


GDPR enforcement must level up to catch big tech, report warns

A new report by European consumer protection umbrella group Beuc, reflecting on the barriers to effective cross-border enforcement of the EU’s flagship data protection framework, makes awkward reading for regional lawmakers and regulators as they seek to shape the next decades of digital oversight across the bloc.

Beuc’s members filed a series of complaints against Google’s use of location data in November 2018 — but some two years on from raising privacy concerns there’s been no resolution of the complaints.

The tech giant continues to make billions in ad revenue, including by processing and monetizing internet users’ location data. Its lead data protection supervisor under the GDPR’s one-stop-shop mechanism for dealing with cross-border complaints, Ireland’s Data Protection Commission (DPC), did finally open an investigation in February this year.

But it could still be years before Google faces any regulatory action in Europe related to its location tracking.

This is because Ireland’s DPC has yet to issue any cross-border GDPR decisions, some 2.5 years after the regulation started being applied. (Although, as we reported recently, a case related to a Twitter data breach is inching towards a result in the coming days.)

By contrast, France’s data watchdog, the CNIL, was able to complete a GDPR investigation into the transparency of Google’s data processing far more quickly last year.

This summer, French courts also confirmed the $57M fine the CNIL issued, slapping down Google’s appeal.

But the case predated Google coming under the jurisdiction of the DPC. And Ireland’s data regulator has to deal with a disproportionate number of multinational tech companies, given how many have established their EU base in the country.

The DPC has a major backlog of cross-border cases, with more than 20 GDPR probes involving a number of tech companies including Apple, Facebook/WhatsApp and LinkedIn. (Google has also been under investigation in Ireland over its adtech since 2019.)

This week the EU’s internet market commissioner, Thierry Breton, said regional lawmakers are well aware of enforcement “bottlenecks” in the General Data Protection Regulation (GDPR).

He suggested the Commission has learned lessons from this friction — claiming it will ensure similar concerns don’t affect the future working of a regulatory proposal related to data reuse that he was publicly introducing.

The Commission wants to create standard conditions for rights-respecting reuse of industrial data across the EU via a new Data Governance Act (DGA), which proposes oversight mechanisms similar to those involved in the EU’s oversight of personal data — including national agencies monitoring compliance and a centralized EU steering body (which they’re planning to call the European Data Innovation Board, a mirror entity to the European Data Protection Board).

The Commission’s ambitious agenda for updating and expanding the EU’s digital rules framework means criticism of GDPR risks taking the shine off the DGA before the ink has dried on the proposal document — putting pressure on lawmakers to find creative ways to unblock GDPR’s enforcement “bottleneck”. (Creative because national agencies are responsible for day-to-day oversight, and Member States are responsible for resourcing DPAs.)

In an initial GDPR review this summer, the Commission praised the regulation as a “modern and horizontal piece of legislation” and a “global reference point” — claiming it’s served as a point of inspiration for California’s CCPA and other emerging digital privacy frameworks around the world.

But it also conceded that GDPR enforcement is lacking.

The best answer to this concern “will be a decision from the Irish data protection authority about important cases”, the EU’s justice commissioner, Didier Reynders, said in June.

Five months later European citizens are still waiting.

Beuc’s report — which it has called The long and winding road: Two years of the GDPR: A cross-border data protection case from a consumer perspective — details the procedural obstacles its member organizations have faced in seeking to obtain a decision on the original complaints, which were filed with a variety of DPAs around the EU.

These include concerns about the Irish DPC making unnecessary “information and admissibility checks”, as well as its rejection of complaints brought by an interested organization on the grounds that such an organization lacks a mandate under Irish law, which does not allow for third-party redress (even though the Dutch consumer organization had filed its complaint under Dutch law, which does).

The report also queries why the DPC chose to open an own volition enquiry into Google’s location data activities (rather than a complaint-led enquiry) — which Beuc says risks a further delay to reaching a decision on the complaints themselves.

It further points out that the DPC’s probe of Google only looks at activity since February 2020, not November 2018, when the complaints were made — meaning there’s a chunk of Google’s location data processing that isn’t being investigated at all yet.

It notes that three of its member organizations involved in the Google complaints had considered applying for a judicial review of the DPC’s decision (NB: others have resorted to that route) — but they decided not to proceed in part because of the significant legal costs it would have entailed.

The report also points out the inherent imbalance of GDPR’s one-stop-shop mechanism shifting the administration of complaints to the location of companies under investigation — arguing they therefore benefit from “easier access to justice” (vs the ordinary consumer faced with undertaking legal proceedings in a different country and (likely) language).

“If the lead authority is in a country with tradition in ‘common law’, like Ireland, things can become even more complex and costly,” Beuc’s report further notes.

Another issue it raises is the overarching one of rights complainants having to fight what it dubs ‘a moving target’ — given well-resourced tech companies can leverage regulatory delays to (superficially) tweak practices, greasing continued abuse with misleading PR campaigns. (Something Beuc accuses Google of doing.)

DPAs must “adapt their enforcement approach to intervene more rapidly and directly”, it concludes.

“Over two years have passed since the GDPR became applicable, we have now reached a turning point. The GDPR must finally show its strength and become a catalyst for urgently needed changes in business practices,” Beuc goes on in a summary of its recommendations. “Our members experience and that of other civil society organisations, reveals a series of obstacles that significantly hamper the effective application of the GDPR and the correct functioning of its enforcement system.

BEUC recommends to the relevant EU and national authorities to make a comprehensive and joint effort to ensure the swift enforcement of the rules and improve the position of data subjects and their representing organisations, particularly in the framework of cross-border enforcement cases.”

We reached out to the Commission and the Irish DPC with questions about the report. But at the time of writing neither had responded. We’ve also asked Google for comment.

Beuc earlier sent a list of eight recommendations for “efficient” GDPR enforcement to the Commission in May.

Equity Dive: Edtech’s 2020 wakeup call

Hello and welcome back to Equity, TechCrunch’s venture capital-focused podcast (now on Twitter!), where we unpack the numbers behind the headlines.

This week, we’re doing a first-ever for the show and taking a deep dive into one specific sector: Edtech.

Natasha Mascarenhas has covered education technology since Stanford first closed down classes in the wake of the coronavirus pandemic. Amid the historic shuttering of much of the United States’ traditional institutions of education, the sector has formed new unicorns, attracted record-breaking venture capital totals, and, most of all, enjoyed time in a long-overdue spotlight.

For this Equity Dive, we zero in on one part of that conversation: edtech’s impact on higher education. We brought together Udacity co-founder and Kitty Hawk CEO Sebastian Thrun, Eschaton founder and college dropout Ian Dilick, and Cowboy Ventures investor Jomayra Herrera to answer our biggest questions.

Here’s what we got into:

  • How the state of remote school is leading to gap years among students
  • A framework for how to think of higher education’s main three products (including which is most defensible over time)
  • What learnings we can take from this COVID-19 experiment on remote schooling to apply to the future
  • Why edtech is flocking to the notion of lifelong learning
  • And the reality of who self-paced learning serves — and who it leaves out

And much, much more. If you celebrate, thank you for spending part of your Thanksgiving with the Equity crew. We’re so thankful to have this platform and audience, and it means a ton that y’all tune in each week.

Finally, if you liked this format and want to see more, feel free to tweet us your thoughts or leave us a review on Apple Podcasts. Talk soon!

Equity drops every Monday at 7:00 a.m. PDT and Thursday afternoon as fast as we can get it out, so subscribe to us on Apple Podcasts, Overcast, Spotify and all the casts.

TikTok’s epic rise and stumble

TikTok’s rise in the West is unprecedented for any Chinese tech company, and so is the amount of attention it has attracted from politicians worldwide. Below is a timeline of how TikTok grew from what some considered another “copycat” short video app to global dominance and eventually became a target of the U.S. government.

2012-2017: The emergence of TikTok

These years were a period of fast growth for ByteDance, the Beijing-based parent company behind TikTok. Originally launched in China as Douyin, the video-sharing app quickly became wildly successful in its domestic market before setting its sights on the rest of the world.

2012 

Zhang Yiming, a 29-year-old serial entrepreneur, establishes ByteDance in Beijing.

2014

Chinese product designer Alex Zhu launches Musical.ly.

2016

ByteDance launches Douyin, which is regarded by many as a Musical.ly clone. It launches TikTok, Douyin’s overseas version, the following year.

2017-2019: TikTok takes off in the United States

TikTok merges with Musical.ly and launches in the U.S., where it quickly becomes popular, the first social media app from a Chinese tech company to achieve that level of success there. But at the same time, its ownership leads to questions about national security and censorship, against the backdrop of the U.S.-China tariff wars and increased scrutiny of Chinese tech companies (including Huawei and ZTE) under the Trump administration.

2017

November

ByteDance buys Musical.ly for $800 million to $1 billion. (link)

2018

August

TikTok merges with Musical.ly and becomes available in the U.S. (link)

October

TikTok surpasses Facebook, Instagram, Snapchat and YouTube in downloads. (link)

November

Facebook launches TikTok rival Lasso. (link)

2019

February

TikTok reaches one billion installs on the App Store and Google Play. (link)

The U.S. Federal Trade Commission fines TikTok $5.7 million over violations of children’s privacy law. (link)

May

TikTok tops the App Store for the fifth quarter in a row. (link)

September

TikTok is found censoring topics considered sensitive by the Beijing government. (link)

October

TikTok bans political ads (link) but does not appear to take action on hashtags related to American politics. (link)

TikTok taps corporate law firm K&L Gates for advice on content moderation in the U.S. (link)

U.S. lawmakers ask intelligence chief Joseph Maguire to investigate if TikTok poses a threat to national security. (link)

TikTok says it has never been asked by the Chinese government to remove any content and would not do so if asked. (link)

November

The Committee on Foreign Investment in the United States reportedly opens a national security probe into TikTok. (link)

Instagram launches TikTok rival Reels. (link)

TikTok apologizes for removing a viral video about abuses against Uighurs. (link)

December

The U.S. Navy reportedly bans TikTok. (link)

The first half of 2020: Growth amid government scrutiny

The app is now a mainstay of online culture in America, especially among Generation Z, and its user base has grown even wider as people seek diversions during the COVID-19 pandemic. But TikTok faces an escalating series of government actions, creating confusion about its future in America. 

A man wearing a shirt promoting TikTok is seen at an Apple store in Beijing on Friday, July 17, 2020. (AP Photo/Ng Han Guan)

2020

January

Revived Dubsmash grows into TikTok’s imminent rival. (link)

March

TikTok lets outside experts examine its moderation practices at its “transparency center.” (link)

Senators introduce a bill to restrict the use of TikTok on government devices. (link)

TikTok brings in outside experts to craft content policies. (link)

April

TikTok introduces parental controls. (link)

TikTok tops two billion downloads. (link)

June

TikTok discloses how its content recommendation system works. (link)

YouTube launches TikTok rival. (link)

July

Facebook shuts down TikTok rival Lasso. (link)

Secretary of State Mike Pompeo says the U.S. is looking to ban TikTok. (link)

TikTok announces a $200 million fund for U.S. creators. (link)

Trump tells reporters he will use executive power to ban TikTok. (link)

The second half of 2020: TikTok versus the U.S. government

After weeks of speculation, Trump signs an executive order in August against ByteDance. ByteDance begins seeking American buyers for TikTok, but the company also fights the executive order in court. A group of TikTok creators also file a lawsuit challenging the order. The last few months of 2020 become a relentless, and often confusing, flurry of events and new developments for TikTok observers, with no end in sight. 

August

Reports say ByteDance agrees to divest TikTok’s U.S. operations and Microsoft will take over. (link)

Trump signals opposition to the ByteDance-Microsoft deal. (link)

Microsoft announces that discussions about the TikTok purchase will be completed no later than September 15. (link)

Trump shifts tone and says he expects a cut from the TikTok sale. (link)

TikTok broadens fact-checking partnerships ahead of the U.S. election. (link)

August 7: In the most significant escalation of tensions between the U.S. government and TikTok, Trump signs an executive order banning “transactions” with ByteDance in 45 days, or on September 20. (link). TikTok says the order was “issued without any due process” and would risk “undermining global businesses’ trust in the United States’ commitment to the rule of law.” (link)

August 9: TikTok reportedly plans to challenge the Trump administration ban. (link)

Oracle is also reportedly bidding for the TikTok sale. (link)

August 24: TikTok and ByteDance file their first lawsuit in federal court against the executive order, naming President Trump, Secretary of Commerce Wilbur Ross and the U.S. Department of Commerce as defendants. The suit seeks to prevent the government from banning TikTok. Filed in the U.S. District Court for the Central District of California (case number 2:20-cv-7672), it claims Trump’s executive order is unconstitutional. (link)

TikTok reaches 100 million users in the U.S. (link)

August 27: TikTok CEO Kevin Mayer resigns after 100 days. (link)

Walmart says it has expressed interest in teaming up with Microsoft to bid for TikTok. (link)

August 28: China’s revised export laws could block TikTok’s divestment. (link)

September

China says it would rather see TikTok shuttered than sold to an American firm. (link)

September 13: Oracle confirms it is part of a proposal submitted by ByteDance to the Treasury Department in which Oracle will serve as the “trusted technology provider.” (link)

September 18: The Commerce Department publishes regulations against TikTok that will take effect in two phases. The app will no longer be distributed in U.S. app stores as of September 20, but it gets an extension on how it operates until November 12. After that, however, it will no longer be able to use internet hosting services in the U.S., rendering it inaccessible.  (link)

On the same day as the Commerce Department’s announcement, two separate lawsuits are filed challenging Trump’s executive order against TikTok. One is filed by ByteDance, while the other is by three TikTok creators.

The one filed by TikTok and ByteDance is in U.S. District Court for the District of Columbia (case number 20-cv-02658), naming President Trump, Secretary of Commerce Wilbur Ross and the Commerce Department as defendants. It is very similar to the suit ByteDance previously filed in California. TikTok and ByteDance’s lawyers argue that Trump’s executive order violates the Administrative Procedure Act, the right to free speech, and due process and takings clauses.

The other lawsuit, filed by TikTok creators Douglas Marland, Cosette Rinab and Alec Chambers, also names the president, Ross and the Department of Commerce as defendants. The suit, filed in the U.S. District Court for the Eastern District of Pennsylvania (case number 2:20-cv-04597), argues that Trump’s executive order “violates the first and fifth amendments of the U.S. Constitution and exceeds the President’s statutory authority.”

September 19: One day before the September 20 deadline that would have forced Google and Apple to remove TikTok from their app stores, the Commerce Department extends it by a week to September 27. This is reportedly to give ByteDance, Oracle and Walmart time to finalize their deal.

On the same day, Marland, Rinab and Chambers, the three TikTok creators, file their first motion for a preliminary injunction against Trump’s executive order. They argue that the executive order violates freedom of speech and deprives them of “protected liberty and property interests without due process,” because if a ban goes into effect, it would prevent them from making income from TikTok-related activities, like promotional and branding work.

September 20: After filing the D.C. District Court lawsuit against Trump’s executive order, TikTok and ByteDance formally withdraw their similar pending suit in the U.S. District Court for the Central District of California.

September 21: ByteDance and Oracle confirm the deal but send conflicting statements over TikTok’s new ownership. TikTok is valued at an estimated $60 billion. (link)

September 22: China’s state newspaper says China won’t approve the TikTok sale, labeling it “extortion.” (link)

September 23: TikTok and ByteDance ask the U.S. District Court for the District of Columbia to grant a preliminary injunction against the executive order, arguing that the September 27 ban removing TikTok from app stores will “inflict direct, immediate, and irreparable harm on Plaintiffs during the pendency of this case.” (link)

September 26: U.S. District Court Judge Wendy Beetlestone denies Marland, Rinab and Chambers’ motion for a preliminary injunction against the executive order, writing that the three did not demonstrate “they will suffer immediate, irreparable harm if users and prospective users cannot download or update” TikTok after September 27, since they will still be able to use the app.

September 27: Just hours before the TikTok ban was set to go into effect, U.S. District Court Judge Carl J. Nichols grants ByteDance’s request for a preliminary injunction while the court considers whether the app poses a risk to national security. (link)

September 29: TikTok launches a U.S. election guide in the app. (link)

October

WASHINGTON, DC – AUGUST 07: In this photo illustration, comedian Sarah Cooper’s page is displayed on the TikTok app. (Photo Illustration by Drew Angerer/Getty Images)

Snapchat launches a TikTok rival. (link)

TikTok says it’s enforcing actions against hate speech. (link)

TikTok partners with Shopify on social commerce. (link)

October 13: After failing to win their first request for a preliminary injunction, TikTok creators Marland, Rinab and Chambers file a second one. This time, their request focuses on the Commerce Department’s November 12 deadline, which they say will make it impossible for users to access or post content on TikTok if it goes into effect.

October 30: U.S. District Judge Wendy Beetlestone grants TikTok creators Marland, Chambers and Rinab’s second request for a preliminary injunction against the TikTok ban. (link)

November

November 7: After five days of waiting for vote counts, Joe Biden is declared the president-elect by CNN, followed by the AP, NBC, CBS, ABC and Fox News. With Biden set to be sworn in as president on January 20, the future of Trump’s executive order against TikTok becomes even more uncertain.

November 10: ByteDance asks a federal appeals court to vacate the U.S. government’s divestiture order that would force it to sell the app’s American operations by November 12. Filed as part of the lawsuit in D.C. District Court, ByteDance said it asked the Committee on Foreign Investment in the United States for an extension, but hadn’t been granted one yet. (link)

November 12: This is the day that the Commerce Department’s ban on transactions with ByteDance, including providing internet hosting services to TikTok (which would stop the app from being able to operate in the U.S.), was set to go into effect. But instead the case becomes more convoluted as the U.S. government sends mixed messages about TikTok’s future.

The Commerce Department says it will abide by the preliminary injunction granted on October 30 by Judge Beetlestone, pending further legal developments. But, around the same time, the Justice Department files an appeal against Beetlestone’s ruling. Then Judge Nichols sets new deadlines (December 14 and 28) in the D.C. District Court lawsuit (the one filed by ByteDance against the Trump administration) for both sides to file motions and other new documents in the case. (link)

November 25: The Trump administration grants ByteDance a seven-day extension of the divestiture order. The deadline for ByteDance to finalize a sale of TikTok is now December 4.

This timeline will be updated as developments occur.
