2025-06-09 21:30:00| Fast Company

YouTube is reportedly giving creators more leeway about what they say in videos, easing up on some of the rules it has set in the past. The user-generated video platform, owned by Alphabet, has adjusted its exception rule, allowing videos that might have been removed nine months ago for promoting misinformation to remain on the platform.

The New York Times reports that if a video is considered to be in the public interest or has EDSA (educational, documentary, scientific, artistic) context, up to 50% of it can be in violation of YouTube's guidelines on misinformation or violence, versus 25% before the policy change. That change, which was reportedly made about a month after Donald Trump was elected but was not publicly announced, followed pandemic-era rules under which a video of Florida Governor Ron DeSantis that shared some COVID misinformation was removed from YouTube. The new rule could benefit creators whose videos blend news and opinion.

YouTube spokesperson Nicole Bell told Fast Company in a statement: "These exceptions apply to a small fraction of the videos on YouTube, but are vital for ensuring important content remains available. This practice allows us to prevent, for example, an hours-long news podcast from being removed for showing one short clip of violence. We regularly update our guidance for these exceptions to reflect the new types of discussion and content (for example emergence of long, podcast content) that we see on the platform, and the feedback of our global creator community. Our goal remains the same: to protect free expression on YouTube."

Free expression is the reason other social media companies have given for relaxing or eliminating their content moderation programs in recent months. X long ago handed over the responsibility of flagging inaccurate content to its users. Meta eliminated its fact-checking program in January, shortly after Trump took office.
Trump and other conservatives have long accused social media sites of "censoring" conservative content, saying content moderation, as practiced by social media companies, violated their First Amendment right to free speech.

YouTube said it regularly updates its Community Guidelines to adapt to content on the site. Earlier this year, it sunsetted all remaining COVID-19 policies and added new ones surrounding gambling content. Changes, it said, are reflected in its Community Guidelines Transparency Report.

The new rules largely revolve around content considered to be in the public interest, defined as videos in which creators discuss a range of issues, including elections, movements, race, gender, sexuality, abortion, immigration, and censorship. The Times reported it had reviewed training material that gave examples of videos that might have been flagged and taken offline in the past but are now allowed. Among them was one that incorrectly claimed COVID vaccines alter people's genes but mentioned several political figures, increasing its "newsworthiness." (That video has since been removed for unclear reasons.) Another video, from South Korea, involved a commentator saying they imagined former president Yoon Suk Yeol turned upside down in a guillotine so that the politician could see the blade coming down. The training material said the risk of harm was low because a wish for execution by guillotine is not feasible.

The policy change is, in some ways, a big shift for YouTube, which less than two years ago announced a crackdown on health misinformation. That same year, though, it also said it would stop removing misinformation about past elections, saying the policy could have the unintended effect of curtailing political speech. YouTube has been criticized in the past for not doing enough to curb the spread of misinformation, from 9/11 "truthers" to false-flag conspiracy theories tied to mass shootings.
Some reports have even suggested its algorithm can lead users down a rabbit hole of extremist political content. YouTube says it still actively monitors posts: in the first quarter, removals were up 22% from the year prior, with 192,856 videos removed for violating its hateful and abusive content policies. The number of videos removed for misinformation, however, was down 61% in the first quarter, in part because of the removal of the COVID-19 policies.

