A few weeks ago, Vice reported on a Beverly Hills police officer who played Sublime’s “Santeria” on his phone while being filmed, presumably so an automated copyright filter would flag the song and take the video down, along with the Instagram page:
In a video posted on [LA activist Sennett Devermont’s] Instagram account, we see a mostly cordial conversation between Devermont and BHPD Sgt. Billy Fair turn a corner when Fair becomes upset that Devermont is live-streaming the interaction, including showing work contact information for another officer. Fair asks how many people are watching, to which Devermont replies, “Enough.”
Fair then stops answering questions, pulls out his phone, and starts silently swiping around—and that’s when the ska music starts playing…
Assuming that Fair wasn’t just trying to share his love of ’90s stoner music with the citizens of Beverly Hills, this seems to be an intentional (if misguided) tactic to use social media companies’ copyright protection policies to prevent himself from being filmed.
A few days later, Vice reported on another BHPD cop pulling the same stunt. Apparently the practice is not uncommon within this particular department.
To be clear, the main focus of this story should be on the cops who used a known feature of social media platforms to avoid accountability for their actions. Copyright filters just happen to be that feature.
But at the same time, this is a teachable moment for those who aren’t as worried about automated copyright filters as they should be. It’s extremely tempting to believe that YouTube, Instagram, or any other large social media platform can simply upload a protected work to its database, filter out all the pirated copies of that song, and call it a day.
To the extent that filters like Instagram’s can identify copyrighted content, they are highly prone to false positives. The algorithms are far from perfect at catching piracy, but they are quite good at flagging videos. They are, however, atrocious at recognizing fair use.
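The mechanism helps explain why. Matching systems (YouTube’s Content ID is the best-known example) compare a fingerprint of the upload against a database of reference recordings, and a match only says “this recording is present” — it says nothing about context. The toy Python sketch below illustrates the idea, using hashed windows of raw samples as a crude stand-in for real audio fingerprints (which are far more sophisticated); the names and numbers here are purely illustrative:

```python
# Toy sketch of fingerprint-style matching. Real filters hash spectral
# features of audio, not raw sample values; this just shows the logic.

def fingerprint(samples, n=4):
    """Hash every contiguous n-sample window into a set of fingerprints."""
    return {hash(tuple(samples[i:i + n])) for i in range(len(samples) - n + 1)}

def match_score(reference, upload, n=4):
    """Fraction of the reference's fingerprints found in the upload."""
    ref, up = fingerprint(reference, n), fingerprint(upload, n)
    return len(ref & up) / len(ref)

# A "song" in the reference database, as a short sample sequence.
song = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]

# A livestream in which the song merely appears partway through --
# say, played from an officer's phone in the background.
livestream = [0, 0] + song + [7, 7, 7]

print(match_score(song, livestream))  # -> 1.0: flagged as a full match
```

The point of the sketch: the score is identical whether the upload is a pirated copy or a newsworthy livestream that happens to capture the song, which is exactly why these systems cannot distinguish infringement from fair use.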
There are also instances, like those involving the Beverly Hills cops, where music playing in the background of a politically relevant event, such as the protests in the wake of George Floyd’s murder last year, gets the video taken down. As I wrote last summer:
[O]verly aggressive enforcement of their copyrights would do serious harm to the proliferation of information that’s vital for the functioning of a democracy.
Indeed, copyright filters and DMCA takedown notices have removed content essential to understanding the protests and the issue of police misconduct more broadly. Live streams of protests have been removed due to music playing in the background, and uploads of police manuals are hit with takedown notices.
Some examples are best understood as bugs in the way allegedly infringing content is policed online: even public domain content can be flagged.
Given the high rate of false positives associated with automated filters (which would be functionally required under the notice-and-staydown regime proposed by the Digital Copyright Act), it’s essential that the harm to fair use, and to politically relevant content sabotaged by background music, be weighed against any lost royalties.
In any event, we cannot pretend that flagging is a minor occurrence, or that it comes without serious consequences for people who are clearly not the nefarious thieves or pirates plaguing traditional copyright-intensive industries.