Be your own algorithm
2025-07-13
A few days ago, I watched this video:
When I finished it, I had a lot of possible thoughts and responses bubbling through my mind; I even recorded a video on my phone just rambling about my opinions on it (which I did not upload anywhere). Despite what the title may suggest, this isn't just a generic "stop using twitter" video; it's a detailed critique of the book "Filterworld: How Algorithms Flattened Culture" by Kyle Chayka. I'd never heard of this book before and I don't intend to read it, so I can't comment on the accuracy of Pagemelt's reading. I'll just be giving my thoughts on the video on its own. Now I should say, this video is over an hour long, and I watched it almost a week ago. Because of that, this post may end up focusing on a few fairly minor points in a bit of a scattered way. Of course, as you would expect if you've kept up with this blog at all, I am generally in agreement with Pagemelt's views. Take, for example, the following quote: "We have a culture of tech today where these details have been deliberately obfuscated, designed to make outsiders like you think you're too stupid to build your own internet, but you're not". It should come as no surprise that I'm extremely happy to see a video getting popular which encourages people to move away from megacorp-owned "platforms" and build their own personal websites. That being said, I do take issue with a few points in the video.
Pagemelt frames algorithmic feeds, in particular short form video platforms, with a sort of "both sides" device that goes something like this: "Yes, these programs are tracking you and selling your data, yes they are designed to psychologically manipulate you so that you stay on the site for as long as possible, yes they disempower their users at every opportunity, yes they funnel huge amounts of wealth into the pockets of capitalist oligarchs. However, I've also found some great art on there, so you know, who can say if they're bad." Obviously I'm being hyperbolic in my phrasing here; they do seem to come down pretty heavily on the "negatives outweigh the positives" side of things by the end of the video. However, this particular point, which does come up a few times, that despite all of their flaws these platforms do produce genuinely good art, rubbed me the wrong way a little. I can't help but see this as a kind of artwashing. There are really two propositions here: "algorithmic feeds enable the production of unique new art", and "unique new art is always good". Humans have been producing art for as long as we've been humans, and in fact even earlier than that if by "human" you only mean Homo sapiens. Shouts out Pseudodon shell DUB1006-fL. The fact that we manage to make art no matter what situation we're in (whatever medium we have to work with) is not special. One would expect any new medium to produce unique art; this doesn't say anything about the medium itself, it only points to the fact that it's very hard to stop us from producing art. The two sides of this scale, "medium causes active harm" on one side and "medium enables the production of new art" on the other, are not actually balanced, because a medium which does not cause that harm would also enable the production of new art. You can't really have a medium which doesn't enable new art. Because of this, I find the framing flawed. I'm going to use a much more severe example just to drive my point home. As much as the Nazi regime produced dogshit like "Triumph of the Will", they were also able to produce very effective art-as-propaganda in the form of uniforms and graphic design. It would be absurd to say "well, the Nazis were bad, but we also got some good art out of it". This kind of both-sidesing makes two mistakes. It presents both sides of the "but" as if they have equal weight, and it ignores the fact that it is specifically the same structures which produced the bad outcomes that also enabled the production of art. Without the genocidal Nazi regime we would not have access to a lot of art, and that would be a good thing. The surveillance, exploitation, and Skinner box mechanics of a platform like TikTok cannot simply be forgiven because they enabled the production of the culturally sacred object "art".
In fact, I might even suggest that the artists using these platforms are, to varying degrees, partly responsible for their perpetuation. They occupy a unique class position: on the one hand as gig workers, contractor employees of the company, but on the other hand as mini entrepreneurs, promoting their brand and profiting off of their ownership of their means of production as members of the petite bourgeoisie. It is for this reason that users of social media sites have been called digital sharecroppers, a term coined by Nicholas Carr. As he put it way back in 2006, "One of the fundamental economic characteristics of Web 2.0 is the distribution of production into the hands of the many and the concentration of the economic rewards into the hands of the few. It’s a sharecropping system, but the sharecroppers are generally happy because their interest lies in self-expression or socializing, not in making money, and, besides, the economic value of each of their individual contributions is trivial. It’s only by aggregating those contributions on a massive scale – on a web scale – that the business becomes lucrative. To put it a different way, the sharecroppers operate happily in an attention economy while their overseers operate happily in a cash economy. In this view, the attention economy does not operate separately from the cash economy; it’s simply a means of creating cheap inputs for the cash economy." To what extent are the artists, artisans, entrepreneurs, the content creators not only exploited by this system but also reproducing it through their participation and benefiting from the arrangement? This seems like a topic too detailed to cover here, so I'd like to dedicate a full blog post to it in the future. I'll just state my position for now as "a little bit sus".
As I understand it, the key point in Filterworld is the idea that algorithms have worsened art and culture, and this is a point that Pagemelt takes issue with. They are of the opinion that it's not that art is generally worse, just that there is more being produced and shared, and since some portion, maybe even most art, is always going to be bad, we see more bad art simply as a function of total quantity, not a general decrease in quality. While I for the most part agree with this assessment, I once again think they're letting megacorps get away with a bit too much here. I'm going to make a claim which I have no direct evidence for, but I hope that by showing you my train of inferences you'll be able to accept it as plausible. "Social" media algorithmic feeds are generated using extremely large and detailed datasets, in combination with highly advanced behavioural modelling software. All of this is in service of figuring out how to keep you on the platform for as long as possible. Note that this is different from showing you things you like. We know that these megacorps hire behavioural psychologists and experts in similar fields. It's reasonable to assume that they are aiming to design their platforms so as to produce a specific behaviour. If I were to use slightly less charitable language, I might even say they are trying to manipulate their users into performing certain behaviours. So if these platforms have the expertise and resources to create algorithmic feeds which show us what we want to see, why don't they do it? Why is it the case that our feeds are full of not only posts we have no interest in, but also posts which are actively the opposite of what we want to see? My claim is that this is not an accident but by design, to produce a sort of "slot machine" effect, where each scroll is a re-roll in the hopes of winning an actually good post. If our feeds were just what we wanted to see, what would happen when that ran out? We'd scroll further, see that we'd passed the end of what we wanted to see and that the algorithm was left serving us less relevant content, and then we would leave the platform to go do something else. It makes more sense that these feeds are designed to drip-feed desirable content in between chunks of noise, to keep us on the platform for longer. In this case, if you start seeing posts which aren't relevant to you, you have been taught to just keep scrolling, rather than go do something else. If my theory is correct, it would mean that these algorithms aren't as innocent as Pagemelt assumes when it comes to the low quality of online art and culture. It may be the case that "social" media algorithms are intentionally serving up low quality or irrelevant art as part of their design, not merely as a consequence of higher quantity.
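To make the shape of that claim concrete, here's a toy sketch in Python. Everything in it is hypothetical: the function name, the numbers, and the structure are mine, not anything from a real platform, and actual recommender systems are vastly more complicated than this. It just illustrates what a feed optimised for retention rather than relevance might look like, spacing out the "wins" on a variable-ratio schedule, the same reinforcement pattern a slot machine uses.

```python
import random

def retention_feed(good_posts, filler_posts, min_gap=2, max_gap=6, seed=None):
    """Toy model of a feed tuned for retention rather than relevance.

    Rather than ranking purely by predicted interest, it spaces the
    genuinely relevant posts out on a variable-ratio schedule: each
    "win" is separated by a random-length run of filler, so you never
    know how many more scrolls away the next good post is.
    """
    rng = random.Random(seed)
    good = list(good_posts)
    filler = list(filler_posts)
    feed = []
    while good and filler:
        # a random-length run of noise before the next reward
        for _ in range(rng.randint(min_gap, max_gap)):
            if not filler:
                break
            feed.append(filler.pop(rng.randrange(len(filler))))
        feed.append(good.pop(0))  # the intermittent "win"
    feed.extend(good)  # any relevant posts left over
    return feed

if __name__ == "__main__":
    relevant = [f"relevant post {i}" for i in range(3)]
    noise = [f"filler post {i}" for i in range(15)]
    for item in retention_feed(relevant, noise, seed=42):
        print(item)
```

A feed optimised purely for relevance would just sort by predicted interest and come to an end when the good stuff ran out; this one never gives you a clean stopping point, which is exactly the point.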
As the video points out, there is a community of people who create independent websites on places like neocities (hey, that's me!). However, the video makes the point that this alternative, personal independent websites, will never be viable as a true competitor to social media, due to the technical barrier to entry. I found this a bit strange for two reasons. First, there already is mass adoption of personal websites and blogs. There are more good, active blogs than I could ever hope to find. It seems like nearly every time I go looking, I find three or four more amazing websites to follow via RSS which I'd never heard of before. Just yesterday I found this cool online art magazine, for example. This is the major benefit personal websites have over "alternative platforms" like Bluesky or the fediverse or whatever the new thing will be next week: "the web" already has mass adoption. There's no need to convince your friends to jump ship and make an account, no need to deal with network effects or any of the sorts of problems these platforms typically have. Everyone already has a web browser. It's the same reason I say email is underrated as an alternative to communication platforms like Discord or Slack: everyone already has email. That being said, having access to the web is different from creating your own website, which does indeed have a technical barrier to entry. This leads me to my second point: we are not a social media company. We don't need to consistently report growth to our investors to justify burning runway. We don't need to grow at all. As I have said before, the widespread adoption of personal computers and the internet was not some sort of democratising progressive force; it was the cynical business strategy of corporations trying to maximise their customer base. If the web is to continue existing in any form other than corporate walled gardens, it must degrow and rewild. We win the war by outmanoeuvring capitalists on the axis of our advantage: not being capitalists, and not being bound by the laws of growth, profit, and competition.
I am now done with my minor critique of what I want to remind you is an excellent youtube video which you should all watch. Pagemelt is clearly not only well read but also a good reader and a good writer. They are able to read deeper into the text and draw nuanced conclusions, relating them to other works (in this instance especially the work of Marshall McLuhan) in novel ways, and also present this information in a way that hits the perfect balance of being informative and entertaining. It's a video of one person talking into a camera for an hour and 20 minutes, and I wasn't bored for even a second. I was very glad that youtube's algorithm showed me their channel and I definitely look forward to watching whatever they make next. I even went back and watched their previous video about a gay romance novel, which is not typically a video I would have sought out myself, and I thought that was great too. I don't want to come off as if I'm only critical of this video; I thought it was very good and I highly recommend you watch it. There was only one short section which I strongly disagreed with, and that was this:
That stupid David Graeber quote.
“The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.”
I hate this quote. I hate it so much. I actually like some of Graeber's work. I thought Debt: The First 5,000 Years was amazing until the end, when he starts talking about the modern economy; Bullshit Jobs was pretty good; and The Dawn of Everything had some genuinely good parts but was held back by precisely the same issue in Graeber's ideology which is present in that stupid quote. Like, oh, so I guess enslaved people just weren't trying hard enough to make their world differently then. No, obviously we can't just arbitrarily make the world however we want it, because we do not have power. We are bound by historical and material circumstances outside of our control. I'd like to counter with a better quote, from Marx in The 18th Brumaire: "Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past." Graeber has this non-materialist view of history, that anything could just as easily be any other way if we had the right ideas. I'm sorry, I wish that were true, it would certainly make things easier, but it is not the case. We are where we are because of real, material historical processes, and we can't pretend they don't exist, or we end up with a bizarre, victim-blaming magical thinking reminiscent of The Secret: if you don't get what you want, it's because you weren't manifesting hard enough. I understand that Pagemelt was trying to tie up the video with an optimistic message by including this quote, but it really just came off as at odds with the rest of their commentary; it has no place in a systemic critique of the capitalist internet. The web has been in large part enclosed and colonised because of capitalist property relations ultimately enforced through state violence, not personal consumer choice.