Section 230

All articles tagged with #section 230

technology · 5 months ago

Steve Wozniak Warns Against Internet Scams

Steve Wozniak discusses the impact of the internet on society, highlighting how it has enabled scams such as Bitcoin fraud schemes that used his image in YouTube videos. He has sued YouTube for failing to remove the fraudulent videos, criticizing Section 230 for shielding platforms from liability. Wozniak emphasizes the need for better online scam prevention and reflects on the internet's original goal of democratizing information, which he believes has been exploited for profit.

politics · 1 year ago

Trump's FCC Nominee Targets Social Media Legal Protections

Brendan Carr, Donald Trump's pick for FCC chair, aims to eliminate Section 230 of the Communications Decency Act, which protects social media companies from legal consequences for user-generated content. Carr argues that Big Tech has too much power without accountability and suggests the FCC should reinterpret Section 230 to limit these protections. Both Trump and President Biden support repealing Section 230, albeit for different reasons, with bipartisan agreement on the need for reform. However, progress has been slow, despite a new House bill proposing to phase out Section 230.

politics · 1 year ago

Trump's FCC Nominee Targets Big Tech and Media in Censorship Crackdown

Incoming FCC Chairman Brendan Carr, appointed by President-elect Trump, is challenging Big Tech firms over their alleged involvement in a 'censorship cartel' with NewsGuard, a fact-checking firm accused of bias against conservative outlets. Carr's inquiry could impact Section 230, which protects tech companies from liability for third-party content. He demands responses from tech leaders like Sundar Pichai and Mark Zuckerberg by December 10, questioning their partnerships with NewsGuard and other media monitors.

politics · 1 year ago

Trump's FCC Pick Brendan Carr Signals Shift in Tech Regulation

President-elect Trump has nominated Brendan Carr as the chairman of the Federal Communications Commission. Carr, a senior Republican at the FCC since 2012, is known for his stance against Big Tech's content-moderation immunity and has authored the FCC chapter of Project 2025. He aims to reform Section 230 to limit Big Tech's immunity and advocates for increased transparency and accountability in tech operations. Carr also supports a ban on TikTok and further decoupling from Chinese tech influence, citing national security concerns. His nomination requires Senate approval.

technology · 1 year ago

Google's AI Search Faces Backlash Over Bizarre Answers

Google's new AI Overview feature generates written answers to user searches, raising questions about legal responsibility if the AI provides incorrect or harmful information. The legal protections under Section 230 of the Communications Decency Act, which shield companies from liability for third-party content, may not clearly apply to AI-generated content. The reliability of AI Overview's answers varies, and the feature's impact on the creation and recognition of reliable information is also a concern.

legal-technology · 1 year ago

"Social Media Giants to Face Lawsuit Over Buffalo Shooter Radicalization"

A New York state judge has ruled that Reddit and YouTube must face a lawsuit alleging that their algorithms played a role in radicalizing the shooter responsible for a racist shooting in Buffalo. The lawsuit, filed by survivors of the shooting, challenges the limits of Section 230, a law that shields internet platforms from lawsuits over user-posted content. The decision allows the claims against the tech companies to proceed to the discovery stage, while Reddit and YouTube have expressed disagreement with the ruling and plan to appeal.

technology · 2 years ago

Judge Rules Social Media Giants Must Face Child Safety Lawsuits

A federal court has ruled that social media giants Meta (formerly Facebook), ByteDance (owner of TikTok), Alphabet (parent company of Google and YouTube), and Snap must face a lawsuit alleging that their platforms have adverse effects on the mental health of children. The court rejected the companies' motion to dismiss the lawsuits, which accuse them of running addictive platforms that cause physical and emotional harm to children. The ruling states that the First Amendment and Section 230 do not shield the companies from liability, as many of the claims relate to alleged defects on the platforms, such as insufficient parental controls and age verification systems. While some claims were thrown out, the ruling could pave the way for more safety claims against social media platforms.

law · 2 years ago

Supreme Court Refuses to Hold Social Media Liable for Child Exploitation

The US Supreme Court has declined to hear a case brought by a victim of sex trafficking who sought to hold Reddit responsible for hosting images of child pornography on the website. The case targeted Section 230 of the Communications Decency Act, which offers broad immunity to online platforms. The court's decision suggests that it is willing to leave any changes to Section 230 to Congress. Reddit argued that it works hard to locate and prevent the sharing of child pornography on its website and that it should not be treated as the creator of unlawful content.

technology · 2 years ago

SCOTUS Upholds Internet Shield for Big Tech, Leaves Section 230 Untouched

The Supreme Court declined to take up two high-profile cases on the tech industry's liability protections, leaving Congress to decide whether or how to revamp Section 230. While there is broad bipartisan support for chipping away at or revoking Section 230 protections, there is limited agreement on how to do it. The court's decision puts pressure on Congress to settle any lingering concerns about Section 230, which protects digital services from lawsuits over user content. However, partisan disagreements over whether platforms take down too much or too little misleading content have hindered progress.

technology · 2 years ago

Supreme Court Upholds Protections for Internet Companies and Speech Moderation Online

The US Supreme Court has upheld legal protections for internet and social media companies, refusing to weaken Section 230 of the Communications Decency Act, which safeguards internet companies from lawsuits over content posted by users. The court also shielded Twitter from litigation seeking to apply the Anti-Terrorism Act, which enables Americans to recover damages related to "an act of international terrorism." Families of people killed by Islamist gunmen overseas had sued to hold internet companies liable for the presence of militant groups on their platforms or for recommending their content.

lawtech · 2 years ago

Supreme Court's Decisions on Online Speech and Liability

The US Supreme Court has ruled in favour of Google, Twitter, and Facebook in lawsuits seeking to hold them liable for terrorist attacks. However, the court avoided the larger question of whether the federal law that shields social media companies from being sued over content posted by others is too broad. The court unanimously rejected a lawsuit alleging that the companies allowed their platforms to be used to aid and abet an attack at a Turkish nightclub that killed 39 people in 2017, and it returned to a lower court the case of an American college student killed in an Islamic State terrorist attack in Paris in 2015.

lawtech · 2 years ago

Supreme Court Upholds Protections for Google, Twitter, and Facebook from User-Generated Content Liability

The US Supreme Court ruled in favor of Google and Twitter in two liability cases brought by families of terrorism victims, finding that the companies did not aid and abet terrorist attacks. The decision avoided limiting Section 230, a law that protects social media platforms from lawsuits over user-generated content and has been a flashpoint in the polarized debate over online speech. The ruling was a victory for tech companies, which have lobbied to defend Section 230, and it puts the onus back on Congress to take action on the issue.

politics · 2 years ago

Supreme Court Protects Social Media Companies From Liability in Terror-Related Content Cases

The US Supreme Court has ruled in favour of social media companies Twitter and Google in two separate cases brought by families of victims of terrorist attacks. The cases marked the first time the court considered the scope of Section 230 of the Communications Decency Act, which protects internet companies from liability over content posted by third parties. The court unanimously ruled that the families failed to state a claim under the Anti-Terrorism Act, which allows US nationals injured in international acts of terrorism to sue for damages.