When did the LULZ become a menace to society?

Perhaps a stupid question to try to pinpoint it but I found this article really interesting about how memes and satire transitioned to become all the worst of things about the modern world.

3 Likes

When facebook/youtube/twitter started monetising them tbh and slowly forming a culture of political discourse based on likes, engagement and escalating division rather than deep consideration and cooperation

5 Likes

I blame Family Guy

2 Likes

It started in 4chan, and then was multiplied by the monetisation of those platforms, and the incompetence of the algorithm

3 Likes

It’s literally just there to drive engagement, so at doing its job the algorithm rules; it just needs to be outlawed in the interest of (inter)national security

1 Like

I feel like it’s probably a bit like how there were punk musicians who grew up listening to punk bands and focused on the speed and intensity, whereas the original punk bands had been using like the Beatles or whatever as a springboard.

Like ironic humour was once defined by some kind of a point of view, or a response to the difficult absurdity of the world. But then there’s more and more layers of irony and you just had ppl being bigoted and acting as if that was the whole point of the joke (remembering the “make me a sandwich” and rape jokes that were everywhere when I was a student) and it all just swirled together in some weird meta-bigotry thing.

3 Likes

and then we’re ruled over by tech libertarians who don’t care to properly understand the concepts that they’re shitting and pissing on for profit

and also political libertarians who profit from not caring to properly understand the tech that is pissing and shitting over the above cited for profit

think a lot of people don’t understand why it’s harmful to make problematic jokes and assume people only object to them because they’re ‘offended’

3 Likes

I’m thinking specifically of the original YouTube algorithm which boosted right wing engagement and all that because it just wanted to max out view time. Not Good

Kind of feel that saying ‘in the 2010s, Hitler memes and “ironic” racism filled the internet’ is a willful forgetting of how those kind of things were more prevalent in print and broadcast media during the 00s.

2 Likes

Thing is, 4chan was largely an isolated corner of the internet; a fucking horrific one, but most people wouldn’t ever come across it. See also Stormfront (existed since the 90s), the darker corners of Reddit and so on.

Facebook, Twitter, YouTube and all those never intended to monetise controversy - it’s not a goal, but rather a side effect of the conflict between one ideal that the American tech industry holds dear and one goal:

  1. We are a platform or aggregator, not a publisher. We believe in free speech and do not curate the content that our users choose to publish on us.
  2. Our main goal is to drive engagement. This means we will surface the content that will most likely drive you to stay on our site and take an action; preferably a stronger one such as a reply rather than a like.

There’s actually nothing inherently wrong with the second goal here (addiction worries aside) - on the face of it it’s a good and pure goal; show people things that they find engaging.
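To make that second goal concrete, the ranking logic boils down to something like this toy sketch (not any platform’s actual code; the weights and field names are made up for illustration) - once a “stronger” action like a reply is weighted more heavily than a like, content that provokes arguments naturally floats to the top:

```python
# Hypothetical engagement-weighted feed ranker. The weights are invented:
# the only assumption is that a reply signals "stronger" engagement than
# a like, as described above.

def engagement_score(post):
    # A reply is worth more than a like, so posts that provoke
    # arguments score higher than posts people quietly enjoy.
    return post["likes"] * 1 + post["replies"] * 5

def rank_feed(posts):
    # Surface the highest-scoring content first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"title": "nice holiday photos",   "likes": 40, "replies": 2},
    {"title": "inflammatory hot take", "likes": 5,  "replies": 30},
])
# The divisive post wins: 5 + 150 = 155 vs 40 + 10 = 50.
```

Nobody has to *intend* to promote the inflammatory post; the weighting does it on its own.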

The problem is the utter abdication of responsibility in the first part - the failure to acknowledge that these platforms are in fact publishing the material their users push on there, their unwillingness to even try to engage with complex social norms and the inherent power their algorithms wield as they grow into behemoths that would have been unthinkable in the early 90s.

The tipping point where they began to collide in a horrific way was probably somewhere around the beginning of the decade, when Facebook increasingly selected content to put in the News Feed rather than just giving you a largely uncurated timeline of what your friends were up to. Similarly, Twitter switching to a curated timeline in 2015 made a hell of a difference in surfacing “engaging” stuff rather than a largely unfiltered experience.

Algorithms and AI are inherently neither good nor evil, but there’s a massive failing in much of my industry: those who define and develop them rarely think through the wider implications of what are usually fairly narrow goals in the wider context they’re being placed in, or take responsibility for the massive social upheaval caused as a result.

16 Likes

This kind of stuff worries me a lot. I genuinely think that, if I’d been born just a few years later, there’s a chance I’d be borderline alt-righty.

1 Like

What I was trying to say with 4chan is that it was relatively small, but was very influential in terms of the shape of the internet in the 10s and how politics shaped itself. Not many came across it, but it was the engine of lots of what came after.

Another issue with the platform was that it had some very good forums for discussing music and film, which most likely radicalised many.

1 Like

I do often worry that it wouldn’t be impossible for DiS to have shifted that way too, and whether, when it was my only income, I would have closed it down… there’s definitely parts of the old forum that I’m glad aren’t easily accessible any more.

2 Likes

Regarding responsibility though, this was a larger problem. The same mind that says this kind of algorithm has no drawbacks figures you can beat fascism by intellectual arguments.

Obviously all of that was somewhat upheld by how things stayed fairly centrist for so long previously, so it was just accepted that fringe stuff would die out.

1 Like

I’m not sure things were centrist from the end of history in the 90s until recently, beyond a surface sheen of respectability. All that laid the foundations for the ‘alt-right’, it’s all cumulative.

You mean the period where the BNP were a joke who got like one tiny seat then lost it the next election as they were rubbish?

What I mean is that in general economic prosperity was just enough, and people who’d lived through Hitler still significant enough, that we didn’t see fascist populist stuff take a big hold. Hence people had this idea it just couldn’t happen, and so they would never consider that the algorithm needed moderating: the views would wither under the scrutiny of an assumed somewhat educated userbase

2 Likes

In the time you mention the DUP were in the ascendancy over here, religious zealots calling Islam satanic and advocating for homophobia. UK politics has basically just become Stormontised in the last few years (or the Conservatives have become… DUP-ed)

What I mean is, if you scratch below the surface of the UK post 1997 you’ll see that these ideas were never too far from the mainstream. Of course we all just rested on our laurels and believed in the market place of ideas that the blessed algorithm would allow for.

I think it really doesn’t help that there’s a whole pipeline these days from isolated teenage boy whose only outlet is the worst parts of internet → computer science degree → very well paid job in tech industry bubble working with pretty much identical people

Earning large amounts of money straight out of university in a very undiverse workplace doesn’t really give people life skills or insight into how life is for other people, and then they’re the people creating algorithms and systems that end up having a lot of power over other people’s lives. I saw it put once as “tech bros come up with things like an app to get someone to do your laundry, or apps to get takeaways delivered because they can’t cook, rather than say an app to help single parents co-ordinate childcare”

7 Likes

I just read / listened to Abolish Silicon Valley by Wendy Liu, which covers this sort of thing really well

Really recommend it for understanding how the tech industry as is can even warp people who go in with principles

4 Likes