Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

  • BlueMonday1984@awful.systems · 4 months ago

    Not a sneer, but an observation on the tech industry from Baldur Bjarnason, plus some of my own thoughts:

    I don’t think I’ve ever experienced before this big of a sentiment gap between tech – web tech especially – and the public sentiment I hear from the people I know and the media I experience.

    Most of the time I hear “AI” mentioned on Icelandic mainstream media or from people I know outside of tech, it’s being used to describe something as a specific kind of bad. “It’s very AI-like” (“mjög gervigreindarlegt” in Icelandic) has become the talk radio shorthand for uninventive, clichéd, and formulaic.

    Baldur has pointed that part out before, noting how it’s kneecapping the consumer side of the entire bubble, but I suspect the phrase “AI” will retain that meaning well past the bubble’s bursting. “AI slop”, or just “slop”, will likely also stick around for those who wish to differentiate gen-AI garbage from more genuine uses of machine learning.

    To many, “AI” seems to have become a tech asshole signifier: the “tech asshole” is a person who works in tech, only cares about bullshit tech trends, and doesn’t care about the larger consequences of their work or their industry. Or, even worse, aspires to become a person who gets rich from working in a harmful industry.

    For example, my sister helps manage a book store as a day job. They hire a lot of teenagers as summer employees and at least those teens use “he’s a big fan of AI” as a red flag. (Obviously a book store is a biased sample. The ones that seek out a book store summer job are generally going to be good kids.)

    I don’t think I’ve experienced a sentiment disconnect this massive in tech before, even during the dot-com bubble.

    Part of me suspects that the AI bubble has spread that “tech asshole” stench to the rest of the industry, with some help from the widely-mocked NFT craze and from Elon Musk becoming a punching bag par excellence for his public breaking-down of Twitter.

    (Fuck, now I’m tempted to try and cook up something for MoreWrite discussing how I expect the bubble to play out…)

    • YourNetworkIsHaunted@awful.systems · 4 months ago

      The active hostility from outside the tech world is going to make this one interesting. Unlike crypto, this one seems to have a lot of legitimate energy behind it in the industry, even as it becomes increasingly apparent that, even if the technical capability were there (i.e. if the bullshit problems could be solved by throwing enough compute and data at the existing paradigm, which looks increasingly unlikely), there’s no way to do it profitably given the massive costs of training and running these models.

      I wonder if we’re going to see attempts to optimize existing models for the orgs that have already integrated them, the same way caching a web page or indexing a database can improve performance without a whole rebuild. Nvidia won’t be happy to see the market for GPUs fall off, but OpenAI might have enough users of its existing models to keep operating even while dramatically cutting down on new training runs? Does that even make sense, or am I showing my ignorance here?
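      For what it’s worth, the caching analogy does hold in principle: the cheapest inference call is the one you never make, since identical prompts can be answered from a stored result instead of re-running the model. A minimal sketch of the idea (the `expensive_generate` stub below is hypothetical, not any real provider’s API; a real deployment would cache across processes in something like a shared key-value store):

      ```python
      from functools import lru_cache

      call_count = 0  # tracks how often the "model" actually runs

      @lru_cache(maxsize=1024)
      def expensive_generate(prompt: str) -> str:
          """Hypothetical stand-in for a costly model inference call.
          lru_cache memoizes results, so repeated prompts are free."""
          global call_count
          call_count += 1
          return f"response to: {prompt}"

      expensive_generate("what is a stubsack?")  # computed, costs a "model" run
      expensive_generate("what is a stubsack?")  # served from cache, no new run
      print(call_count)  # 1 -- the second call never reached the model
      ```

      None of this reduces the cost of *training*, of course, and it only pays off when queries repeat, but it is the kind of serving-side optimization an org could do against a frozen model without any new training runs.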