Writer and artist James Bridle has a long, but rather amazing and disturbing, piece arguing that “Something is wrong on the internet.” Specifically, he’s talking about how kids’ videos on YouTube have turned super-strange and -dark, thanks to the weird profitability of kids’ videos, their low production standards, efforts to hit the right SEO and keyword notes, etc. The result, he says, is that

Automated reward systems like YouTube algorithms necessitate exploitation in the same way that capitalism necessitates exploitation, and if you’re someone who bristles at the second half of that equation then maybe this should be what convinces you of its truth. Exploitation is encoded into the systems we are building, making it harder to see, harder to think and explain, harder to counter and defend against. Not in a future of AI overlords and robots in the factories, but right here, now, on your screen, in your living room and in your pocket.

I’ve been thinking a lot recently about the future of automation and work, and whether it’s possible to avoid the kinds of race-to-the-bottom, exterminate-the-worker imperatives that seem to be implicit in so many automation projects today, so this is a bracing argument.

It goes on, after walking through a number of examples of videos that are literally nightmarish:

To expose children to this content is abuse. We’re not talking about the debatable but undoubtedly real effects of film or videogame violence on teenagers, or the effects of pornography or extreme images on young minds, which were alluded to in my opening description of my own teenage internet use. Those are important debates, but they’re not what is being discussed here. What we’re talking about is very young children, effectively from birth, being deliberately targeted with content which will traumatise and disturb them, via networks which are extremely vulnerable to exactly this form of abuse. It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives. It’s down to that level of the metal.

This, I think, is my point: The system is complicit in the abuse.

And right now, right here, YouTube and Google are complicit in that system. The architecture they have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale. I believe they have an absolute responsibility to deal with this, just as they have a responsibility to deal with the radicalisation of (mostly) young (mostly) men via extremist videos — of any political persuasion. They have so far showed absolutely no inclination to do this, which is in itself despicable. However, a huge part of my troubled response to this issue is that I have no idea how they can respond without shutting down the service itself, and most systems which resemble it. We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I’ve used in this essay.

I spent a little time looking at some of these videos, and they are beyond weird. They combine Second Life-level clunky animation; the kinds of repetition that adults find irritating and toddlers love; that distinctive kids’ music; and extremely strange cuts and changes of scene. About four minutes into one of the videos, the scene shifted from a totally anodyne house to a graveyard in which familiar toys sang a song about how sugar is bad, only they had flayed zombie heads; it was exactly the kind of thing that your mind would cook up as a nightmare.