The first thing you see in the video for “They Ain’t 100,” a song by the British rapper Fredo, is a title card that reads: Disclaimer: The content in this video is an expression of art and should not be taken literally. K-Trap’s “David Blaine” opens with a similar prologue: All characters in this visual are entirely fictional. The events that occur are purley [sic] symbolic and should not be taken literal. The video then repeats the warning in audio.
The warnings are these rappers’ insurance against claims by British law enforcement that their genre of rap, called drill (whose popularization is largely attributed to the rapper Chief Keef, and which was seized on by British youth, driving an explosion of the style in that country), has been a contributing factor in a rising, metastasizing trend of violence there.
“Sadly, I have had to look at quite a lot of drill music,” Metropolitan Police Commissioner Cressida Dick, London’s highest-ranking police official, said in an interview on May 18. “You can access it, sadly, very easily … Very quickly you will see that these are associated with lyrics that are about glamorizing violence — serious violence, murder, stabbings, they describe the stabbings in great detail, with great joy … But most particularly, it is often — we have gangs who make drill videos and in those videos they taunt each other and say what they’re going to do with each other.” A “gangs database” established by the Metropolitan Police, which lists, among other things, the musical tastes of the people on it, was called “deeply flawed” by Amnesty International last month.
Scotland Yard’s diagnosis of the genre, and the information in that database, have now resulted in the removal of several drill videos from the world’s most-used video-streaming platform, YouTube. A YouTube spokesperson confirmed in an email to NPR that videos had been removed (more than 30, according to the BBC) and clarified in a statement that the removals were carried out in collaboration with law enforcement.
“We have a dedicated process for the police to flag videos directly to our teams because we often need specialist context from law enforcement to identify real-life threats. Along with others in the U.K., we share the deep concern about this issue and do not want our platform used to incite violence,” a statement from YouTube reads, in part. The company also pointed to a policy established in the U.K. 10 years ago addressing the depiction of weapons in videos uploaded to the site, “amid growing concern about knife and gun crime among young people,” as the BBC wrote at the time. Two years before that, just a year after YouTube launched and one month before its purchase by Google, news outlets were already reporting concern over the depiction of violence in music videos on the platform.
Opinion appears split on whether the videos themselves cause violence or merely reflect lives lived under pernicious socioeconomic conditions. In an op-ed in January, London Mayor Sadiq Khan pointed to budget cuts for public services as driving the problem. Racial disparities around those services are pronounced, especially in the capital. “Ethnicity can be a marker of these circumstances, but it is not a cause,” The Guardian’s editorial board wrote, starkly, last November.