Supporters of the Texas law and a similar law in Florida say the laws prevent tech companies from engaging in censorship by removing posts that reflect political views they disagree with. But the wording of the Texas law would also bar companies from blocking or removing almost any content that isn’t illegal, experts say, including terrorist recruiting, white-supremacist organizing, egging on people with eating disorders, vaccine misinformation and other harmful content that many websites currently ban.
Although the laws in both states were the product of conservative legislatures, the Fifth Circuit’s ruling on the Texas law contradicts some longstanding Supreme Court opinions that uphold First Amendment protections for corporations — opinions that conservatives once praised. In May, the U.S. Court of Appeals for the 11th Circuit reached the opposite conclusion, blocking Florida’s similar law. The split means the issue will likely be heard by the U.S. Supreme Court, where conservative justices have repeatedly upheld corporations’ First Amendment rights in cases like Citizens United, the court’s 2010 decision that lifted longstanding limits on corporate political spending on the grounds that they restricted corporations’ right to participate in political speech.
Despite the hope that the Supreme Court will eventually reject the law, Silicon Valley companies are beginning to prepare for the worst, playing out responses in planning exercises called “sandboxing,” said Carl Szabo, vice president and general counsel of NetChoice, a tech industry lobbying group that opposed the Texas law. Members of the group include Meta, TikTok, Google, Nextdoor and dozens of other services.
The strategies fall into four broad areas, the most radical of which would involve companies shutting down their services entirely in Texas and possibly other states where similar bills have been introduced.
Tech companies could build “pop-up screens” that greet users, warn them that what they are about to see could be disturbing, and give them the option to opt in to a moderated environment, said Daphne Keller, director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center.
Companies have explored the perilous idea of ending all moderation, following the law to a T, and waiting for a public outcry or for people to flee their products. And some have floated the idea of “lobotomizing” the content on their services, making it so bland that no one could argue there is a reason to remove anything, said Matt Schruers, president of the Computer and Communications Industry Association (CCIA), another technology industry group fighting the law.
“What all these options have in common is complete confusion,” Schruers said.
Szabo said tech companies “have tried to sit down and try to figure out how to actually implement the Texas law,” but at the moment most of the options seem impractical, legally dubious or likely to cost tens of millions of customers.
“Some of the greatest technical minds on the planet have come together, but they can’t do it, because what Texas and Florida are asking them to do is square a circle,” he said.
Experts liken the laws to forcing the bookstore chain Barnes & Noble to carry Adolf Hitler’s manifesto “Mein Kampf,” or forcing newspapers like The Washington Post to publish op-eds by neo-Nazi candidates.
Tech companies built up their ability to remove, demote and otherwise moderate content on their services only reluctantly, first in the United States to comply with laws barring services from hosting copyrighted material or child pornography, and then in Europe with laws banning pro-Nazi speech. In its early years, Facebook tried to distinguish itself from rival MySpace by setting a higher standard of propriety, banning nudity and violent speech and hiring moderators to enforce its rules.
But the company soon ran into content moderation complications when it mistakenly took down a famous Vietnam War photo of a naked girl fleeing napalm bombs dropped by South Vietnamese planes. After protests, the company reinstated the photo and added a newsworthiness exception to its policy against nudity.
In 2017, Silicon Valley’s social media companies were hauled before Congress after it was revealed that Russian operatives had planted widespread misinformation on their services during the previous year’s presidential election. In response, companies like Facebook and Google-owned YouTube hired tens of thousands of moderators, essentially spawning a content moderation industry overnight. With each new law, the tech companies hired more moderators and built more software to filter out problematic content.
The pandemic brought more rules and more takedowns, by humans and algorithms alike, as companies banned vaccine misinformation such as anti-mask posts and fake cures.
The growth of content moderation reached a peak after Jan. 6, 2021, when tech companies banned former president Donald Trump’s social media accounts following the riot at the U.S. Capitol. Trump’s ban sparked a conservative backlash, leading to the passage of the laws in Florida and Texas.
Concerns that social media companies are too slow to act against misinformation and calls for violence have also prompted legislative responses from liberals. A California law passed last month requires platforms to file reports twice a year with the state attorney general about their content moderation policies regarding hate speech, misinformation and extremism.
There are no similar federal laws.
Experts say the Texas law applies to any technology service with more than 50 million users, including companies like Pinterest, Etsy and Yelp that have little to do with political speech. Those companies are in a tougher position than the larger platforms because they don’t have the financial resources to deal with all the legal challenges they might face, said Alex Feerst, a former head of legal at the social media platform Medium who now counsels technology companies on content moderation issues.
In theory, he said, the law could prevent a company like Etsy from removing pro-Nazi statements posted as part of a listing for a custom crib. The law also allows individuals to sue over alleged discrimination, leaving midsize companies vulnerable to a wave of lawsuits.
“It’s a nail-biter for the smaller companies, because they don’t have the resources of the bigger companies, but they can still be sued by anybody,” Feerst said.
Keller said each of the alternatives tech companies are weighing is a minefield: technically, legally and in terms of the impact on their businesses.
A strategy of shutting off service in just one state is technically possible, but Texas is the nation’s second most populous state (Florida is third), so it would be painful and expensive. It can also be hard for companies to tell whether a Texas resident is logging in from another state.
The pop-up option may not hold up legally, because authorities in Texas could argue that users aren’t meaningfully consenting to the moderation, Szabo said.
Removing all political content from social media services may not work either, because almost anything can be construed as a political viewpoint, Schruers said.
Experts said it is also risky to assume the court will strike down the law, particularly after Dobbs, the judgment that overturned Roe v. Wade, the landmark abortion decision. A Supreme Court ruling that struck down some aspects of the law but allowed other parts to go into effect would send shock waves through Silicon Valley.
Keller said leaving even some parts of the law intact would change how technology and media companies do business, perhaps forcing them to rewrite the algorithms that serve up content, fire thousands of moderators and retreat from their practices of policing speech.
“There is a very messy legal landscape ahead,” she said. “It’s like Dobbs, because everyone feels that the law is up for grabs, that judges act on their political beliefs and are willing to ignore precedent.”