The video of the Christchurch mosque killings portrays the murder of innocent people from the perspective of their killer, who also used it to disseminate his racist motivations and genocidal worldview. The recording was made with that intention — to spread. This, Facebook said, was among the reasons the company couldn’t quickly eliminate the footage from its platform, which the killer chose as his medium for his broadcast.

“In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload,” Facebook said publicly on March 16. On March 20, the company elaborated on its efforts, explaining that existing “content matching” systems and artificial intelligence hadn’t been able to stop the video’s spread because the content itself had morphed so many times. (The company also acknowledged criticism that it should have done a better job.)

Facebook can explain why such a video isn't welcome on its platform, and how it removed this one. It can assign blame, as it did, to "coordination by bad actors" who seek to re-share the video with as many people as possible. But its other explanations suggest the company was also thwarted by a much larger and less organized group: the Facebook users behind the rest of that 1.5 million — the people who, as the company said, might have been "filming the broadcasts on TV, capturing videos from websites, filming computer screens with their phones, or just re-sharing a clip they received."

People wanted to see this. People wanted to share this.

Elsewhere online, other platforms were also scrambling. Reddit banned a community called WatchPeopleDie, which had been active for seven years and had attracted more than 400,000 subscribers, after some of its volunteer moderators, already under increased scrutiny, refused to take down copies of the Christchurch attack. Liveleak, a YouTube-style video site, compared the shooting video to the "glossy promo videos for ISIS" and said that it wouldn't "indulge" the shooter by hosting his recording.

Liveleak, however, was far more frank about its users’ desires. “We’ve received no small number of complaints regarding the fact we will not carry the video,” the website said in a post. “We fully understand some people will be very unhappy with this decision.”

Liveleak isn’t the sort of site where you just happen upon something horrific; horrific videos are what its users, dedicated or casual, come there to see. Since 2006, under the tagline “Redefining the Media,” the site has functioned as a smaller, grislier YouTube, with a stated emphasis on news-adjacent footage that depicts war, crime or terrorism. More than a decade on, it still gets between 16 and 20 million unique visitors a month, with surges accompanying a rare viral sensation (a video of a plane landing sideways in heavy wind) or, more often, a spasm of well-documented violence.

It positions itself as an unfiltered companion to the mainstream media, but Liveleak has its roots in the culture of “shock” or “gore” sites. It launched on Halloween of 2006, just as its predecessor, Ogrish, shut down; Ogrish had, along with sites like Rotten.com and Stile Project, openly reveled in the prurience of what it was sharing. Some shock sites presented photos and videos of death and violence in the manner of (and sometimes alongside) pornography. Rotten.com accompanied shocking imagery with morbid news, essays and a reference “library” of dark esoterica. In its early days, Ogrish hosted footage of the aftermath of the 9/11 attacks.

A ‘Desire’ to Transmit Trauma

What were viewers getting out of videos of death? Of murder and massacre? Of car crashes, medical mishaps and workplace accidents? In a 2008 study of the Ogrish forums, Sue Tait, a professor formerly at the University of Canterbury, in Christchurch, New Zealand, identified four different “spectatorial positions” for viewers. There were those for whom the suffering on display was a source of stimulation, for whom shock and horror amounted to a form of pleasure. There were viewers who expressed vulnerability, sadness or empathy. There were viewers who said they were watching to prepare themselves for something — a deployment, a difficult job — and thought they could usefully desensitize themselves. And finally, there were viewers who seemed to see what they were doing as necessary, as a courageous or somehow countercultural act — against the media, against censorship — or in the service of witnessing some sort of truth.

In an interview, Dr. Tait recalled how some users would judge videos in shallow aesthetic terms, and describe how they would get more of a “rush” from certain types of videos than from others. “We knew people could desensitize themselves over time,” she said. “But here we saw people doing that intentionally.”

“Some people would talk about the way they enjoyed them, and how their enjoyment reduced over time,” she said. “But the things they were saying they enjoyed were symptoms of post-traumatic stress.” They were describing anxiety. They were reexperiencing their time on the site, as one might after a trauma, but describing this with a sense of accomplishment. Further, Dr. Tait said, “I noticed a desire to transmit that trauma to other people, so that you could have other people to talk about it with.”

This called to mind recent conversations she’d had with fellow Christchurch residents, one of whom had told her in a brief encounter at the supermarket that he’d watched the shooter’s video twice. He spoke abstractly about how it hadn’t affected him as much as he had expected. “It reminded me of people on Ogrish,” Dr. Tait said. “It felt to me that this guy who was watching it was a bit disappointed.”

Experts almost universally advise against casting the consumption of violent footage as a fringe phenomenon. Jennifer Malkowski, an assistant professor of film and media studies at Smith College, who uses they/them pronouns and is the author of “Dying in Full Detail: Mortality and Digital Documentary,” pointed out that Liveleak, which is just one of many sources for such footage, is ranked by the web tracking firm Alexa as the 695th biggest site in the world, right alongside The Onion, Jezebel and Forever21. Mainstream internet platforms have thrown vast amounts of money and labor (much of it invisible) at the problem, hiring thousands of content moderators to identify and remove often traumatic and illegal content. But “they’re circulated by many, many people,” they said. “I think when you see those numbers from Facebook, you’re confronted with that reality.”

“You realize that these videos aren’t circulated by a few maladjusted individuals,” they added.

“To be focusing on the tech platforms is kind of like an imported crisis,” said Barbie Zelizer, a professor at the University of Pennsylvania and author of “About To Die: How News Images Move the Public.” She said: “You can’t extricate one part of the media environment from the rest.” Conversations and norms around representing death, violence and terrorism in media span generations and mediums. (She notes in her book that Google experienced an extended surge of searches for footage of a 2004 beheading of an American in Iraq.) Norms about what should be shown on television and in newspapers — which Dr. Zelizer says have become more conservative — have given way to debates about tech platforms. “There’s no question that images have impact,” she said. “But we don’t know what that impact is, not in a way that could propel some sort of reasoned set of responses.”

The big tech platforms, in other words, are inheriting, along with much else, a problem that was once understood as the media’s. But services like Facebook are far larger than any individual newspaper. Big social media platforms have inherited much of the rest of the web and its users — including the ones who might have spent time on a site like Ogrish.

Violence and More, Made for Sharing

There are still plenty of videos of viscerally awful things on Liveleak. There are also a lot of videos about immigration, about how the media is attacking Donald Trump, about “political correctness” and about Islam. It’s one of the few platforms that still hosts videos from Infowars, which was banned from YouTube and Facebook last year, although they do not appear to garner many views.

The remaining founder, Hayden Hewitt, who made the decision not to host the Christchurch video, acknowledged that Liveleak’s audience leans right, and that a racist contingent has found a home on the site. “Yes, people who want to see these things will be drawn to it,” he said. “It’s as plain as the nose on your face.”

He described racism as “the epitome of stupidity,” but recited familiar complaints about the extremes of “both sides.” (“If you criticize the Israeli government, some people will claim you’re anti-Semitic because of that. And if you criticize radical Islam, you’re often accused of being Islamophobic,” he said.) He hosts an internet show called “Trigger Warning,” on which he laments the rise of political correctness. He told me he believes that conservative speech is being suppressed on larger internet platforms, and that as long as that’s true, its users will trend even further to the right.

On shock sites, extreme violence often pairs with extreme politics. Whether witnessing the worst breeds particular views, or whether spaces catering to one taboo invite others, is less obvious. But Dr. Tait remembers racism as central to the community on Ogrish, as well. “One of the most popular threads was a thread about white supremacy,” she recalled. “It was a lengthy debate that was really trying to give a scientific basis for racism.”

“A lot of people talked about the reason for watching beheading videos, for example, as really wanting to see what the enemy was capable of,” she said. “And it was very associated with the right, and with the hatred of Arabs.”

In a 2017 article for Participations, a media studies journal, the writer Mike Alvarez wrote of BestGore, another large shock site: “It is apparent from the data that BestGore users do not view all human lives as equal.” Comments, often presented as jokes, “betray a view of humanity in which the life of the ‘other’ is deemed less valuable, if not valueless,” he wrote. On many shock sites, videos portraying police interactions, including shootings that elsewhere have inspired activist movements and debates about racism in law enforcement, are viewed as a case of the victims getting what they deserve.

After it was branded an alternative to mainstream news, Liveleak became popular with American soldiers deployed in Afghanistan and Iraq, and accumulated a library of footage of violence unfolding in the Middle East, often posted with minimal context. There is footage from drones; of the aftermath of firefights; of mangled bodies. There is a lot of footage of cartel violence, though, as is common on such sites, attribution is sparse and often unreliable, and the dead are treated as interchangeable sources of content. (The most popular post of all time on the site, according to Mr. Hewitt, was the cellphone video of the execution of Saddam Hussein.)

In 2008, when the anti-immigrant Dutch politician Geert Wilders wanted to distribute his short film, Fitna, which intercut gory footage of the aftermath of terrorist attacks with an anti-Islam polemic, he chose Liveleak. (“I put my life in danger for a man I disagree with on every level,” Mr. Hewitt said of his decision to host Mr. Wilders’s film.)

But if sites like Liveleak ever had a case to make for showing audiences what the rest of the media won’t, their argument has, in the last ten years, become more complicated. In 2019, videos of human suffering are recorded, like many videos, with a clear intention to share — and with a specific plan to do so, outside of any sort of professional or shared ethical context. Shock imagery, and violent imagery, isn’t just sought out, it’s hurled at people as a tool of harassment or intimidation, in service of ideology or of misinformation. Today, Liveleak is less a wire service than a macabre aggregator. Rotten.com is offline, perhaps forever. On anonymous message boards such as 4chan — or 8chan, where the Christchurch shooter shared his manifesto — images of human death are deployed frequently, and easily, for shock value.

And while millions of people still decide to go out of their way to shock themselves, to “bear witness” or revel in the pain of others, it’s also true that recorded death and murder have new ways to find them. Contrary to suggestions that the Christchurch shooter enacted a sophisticated media plan based on some keen sense of how the internet works, what he really did was horribly simple: he opened the most popular app in the world, pressed a button and shared.

(CNN) A group representing Muslims in France is suing Facebook and YouTube over the broadcast of a video showing the mass shootings at two mosques in New Zealand on March 15.

The French Council of the Muslim Faith (CFCM) has filed legal papers against the two tech companies over their response to a video of the terror attack in the city of Christchurch, in which 50 people were killed.
CFCM President Ahmet Ogras told CNN that the organization is taking legal action against Facebook for not removing the video fast enough.
'This is not admissible. Facebook must take their part of responsibility in this and must do everything to anticipate these livestreams, as much as [they do with] hate messages and Islamophobia on their networks,' Ogras told CNN.
Abdallah Zekri, president of the Observation Center Against Islamophobia, which is part of the CFCM, confirmed the legal action targets the French offices of both Facebook and YouTube.
'We can't have these videos online just like movies of shootings ... YouTube and Facebook must take measures to avoid this in the future,' Zekri told CNN.
The council has filed a complaint with prosecutors in Paris and said it is suing Facebook and YouTube for 'broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor,' according to the AFP news agency, which received a copy of the complaint.
Under French law this is punishable by up to three years' jail time and a €75,000 ($85,000) fine.
Facebook didn't immediately respond to CNN's request for comment Monday. In a statement in the wake of the attack, Mia Garlick, Facebook's director of policy for Australia and New Zealand, said: 'New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video.'
A spokesman for YouTube declined to comment on the complaint and referred CNN to its previous statements. Following the attack, a Google spokesperson told CNN that YouTube removes 'shocking, violent and graphic content' as soon as it is made aware of it. YouTube declined to comment at the time on how long it took to first remove the video.
A spokesman for the CFCM told CNN that if the tech companies do pay a fine as a result of the complaint, they would like families of the victims of the Christchurch attack to share the money.
The footage was widely shared on social media and tech companies came in for criticism over their handling of the video.
In a statement on its website, Facebook said it had removed 1.5 million videos of the attack in the first 24 hours after the shooting. It blocked 1.2 million of them at upload, meaning they would not have been seen by users. Facebook didn't say how many people had watched the remaining 300,000 videos.


On March 18, New Zealand's Prime Minister, Jacinda Ardern, said tech companies have 'a lot of work' to do to curb the proliferation of content that incites hate and violence.