A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

  • funkless_eck@sh.itjust.works · ↑77 · 9 months ago

    “eh I’ll take a look”

    first thing I see is a woman on her back with her legs behind her head, smooth skin where her genitals should be and nipples in the middle of her buttocks.

    “alright then”

  • sramder@lemmy.world · ↑37 ↓6 · 9 months ago

    People who insist on real-flesh porn will ultimately be viewed as weirdos out of touch with reality, like people who insist everything sounds better on vinyl.

    Fast forward 25 years past the first AI war, and a ragged but triumphant humanity must rediscover the lost art of waxing.

    • Harpsist@lemmy.world · ↑1 · 9 months ago

      Why would I want to encourage the flesh trade, where real women are hurt and are limited to what humans are physically capable of?

      When I can have AI-generated people who are able to do anything imaginable and no one gets hurt?

      There’ll be arguments that ‘once people get used to the fantasies they’ll want to try it in real life,’ but we all know that just isn’t true from 40 years of video games. There hasn’t been any uptick in people eating mushrooms and jumping on turtles, or whatever the fuck a goomba is.

  • just_another_person@lemmy.world · ↑29 · 9 months ago

    At what point was porn NOT graphic, but now this thing IS GRAPHIC? Are we talking all caps, or just a small difference between the live stuff and the AI shit? Inquiring minds want to know.

  • cley_faye@lemmy.world · ↑20 · 9 months ago

    “Are we ready,” in the sense that for now it’s 95% garbage and 5% completely generic but passable-looking stuff? Eh.

    But as this increases in quality, the answer would be… who cares. It would suffer from the same major issues as other large models: sourcing data, and how we decide the rights to the output. As for it being porn… maybe there’s no point in focusing on that specific issue.

  • randon31415@lemmy.world · ↑19 · 9 months ago

    When I first heard Stable Diffusion was going open source, I knew this would happen. The only thing I’m surprised at is that it took almost 2 years.

    • lloram239@feddit.de · ↑9 · 9 months ago

      It went quite a bit faster than that. StableDiffusion has only been out for about 13 months and this started about three months after that with Unstable Diffusion. What this article is reporting on is already quite a few months old and quite a bit behind what you can do with a local install of StableDiffusion/Automatic1111/ControlNet/etc. (see CivitAI).

  • Sume@reddthat.com · ↑21 ↓4 · 9 months ago

    Not sure how people can be so into this shit. It’s all so generic-looking.

    • BreakDecks@lemmy.ml · ↑26 · 9 months ago

      The actual scary use case for AI porn is that if you can get 50 or more photos of the same person’s face (almost anyone with an Instagram account), you can train your own LoRA model to generate believable images of them, which means you can now make “generic looking” porn with pretty much any person you want to see in it. Basically the modern equivalent of gluing cutouts of your crush’s face onto the Playboy centerfold, only with automated distribution over the Internet…

      • lloram239@feddit.de · ↑20 · 9 months ago

        Using a LoRA was the old way; these days you can use Roop, FaceSwapLab or ReActor, which not only work with as little as a single good photo, they also produce better-looking results than a LoRA. There is no time-consuming training either: just drag & drop an image and you get results in a couple of seconds.

        • pinkdrunkenelephants@sopuli.xyz · ↑11 ↓7 · 9 months ago

          So how will any progressive politician be able to be elected then? Because all the fascists would have to do is generate porn with their opponent’s likeness to smear them.

          Or even worse, deepfake evidence of rape.

          Or even worse than that, generate CSAM with their likeness portrayed abusing a child.

          They could use that to imprison not only their political opponents, but anyone for anything, and people would think whoever is being disappeared this week actually is a pedophile or a rapist and think nothing of it.

          Actual victims’ movements would be chopped off at the knee, because now there’s no definitive way to prove an actual rape happened since defendants could credibly claim real videos are just AI generated crap and get acquitted. No rape or abuse claims would ever be believed because there is now no way to establish objective truth.

          This would leave the fascists open to do whatever they want to anybody with no serious consequences.

          But no one cares because they want AI to do their homework for them so they don’t have to think, write, or learn to be creative on their own. They want to sit around on their asses and do nothing.

          • hyperhopper@lemmy.ml · ↑7 · 9 months ago

            People will have to learn to stop believing everything they see. This has been possible with Photoshop for well over a decade. All that’s changed is that it now takes less skill and time.

            • pinkdrunkenelephants@sopuli.xyz · ↑1 ↓5 · 9 months ago

              That’s not possible with AI-generated images that are impossible to distinguish from reality, or even with expertly done photoshops. The practice, and generative AI as a whole, needs to be banned. They’re putting AI in Photoshop as well, so ban that garbage too.

              It has to stop. We can’t allow the tech industry to enable fascism and propaganda.

                • pinkdrunkenelephants@sopuli.xyz · ↑2 ↓5 · 9 months ago

                  Nah, that Thanos I-am-inevitable shit doesn’t work on me. They can ban AI; you all just don’t want that because generative AI lets you steal other people’s talent so you can pretend you have your own.

              • CoolCat38@lemmy.world · ↑3 ↓2 · 9 months ago

                Can’t tell whether this is bait or if you are seriously that much of a Luddite.

                • pinkdrunkenelephants@sopuli.xyz · ↑3 ↓2 · 9 months ago

                  Oh look at that, they just released pictures of you raping a 4-year-old, off to prison with you. Never mind they’re not real. That’s the world you wanted and those are the consequences you’re going to get if you don’t stop being lazy and learn to reject terrible things on ethical grounds.

          • Silinde@lemmy.world · ↑4 · 9 months ago (edited)

            Because that’s called libel, and it’s very much illegal in practically any country on earth. Depending on the country, it’s either easy or trivial to bring and win a libel case in court, since the onus is on the defendant to prove that what they said was entirely true, and “just trust me and this actress I hired, bro” doesn’t cut it.

              • Silinde@lemmy.world · ↑3 · 9 months ago

                The burden of liability will then fall on the media company, which can then be sued for not carrying out due diligence in reporting.

          • Liz@midwest.social · ↑4 · 9 months ago

            We’re going to go back to the old model of trust, before videos and photos existed. Consistent, coherent stories from sources known to be trustworthy will be key. Physical evidence will be helpful as well.

            • pinkdrunkenelephants@sopuli.xyz · ↑2 ↓1 · 9 months ago

              But then people will say, “Well, how do we know they’re not lying?” and then it’s back to square one.

              Victims might not ever be able to get justice again if this garbage is allowed to continue. Society’s going so off-track.

    • Psythik@lemm.ee · ↑18 ↓6 · 9 months ago

      AI is still a brand-new tech. It’s like getting mad at AM radio for being staticky and low quality. It’ll improve with time as we get better tech.

      Personally I can’t wait to see what the future holds for AI porn. I’m imagining being able to get exactly what you want with a single prompt, and having it look just as real as reality. No more opening 50 tabs until you find the perfect video. Sign me the fuck up.

  • themeatbridge@lemmy.world · ↑16 ↓1 · 9 months ago

    Does it say something about society that our automatons are better at creating simulated genitals than they are at hands?

    • douglasg14b@lemmy.world · ↑8 ↓1 · 9 months ago (edited)

      It says that we are biologically predisposed to sex, which we are, like animals, which we are.

      It doesn’t say anything about society, it just confirms the human condition.

    • lloram239@feddit.de · ↑4 · 9 months ago (edited)

      They suck quite a lot at genitals too. But what makes hands especially tricky is simply that they are pretty damn complex. A hand has five fingers that can all move independently, it can rotate in all kinds of ways, and the individual parts of a hand can all occlude each other. There is a lot of stuff you have to get right to produce a good-looking hand, and it is especially difficult when you are just a simple 2D algorithm that has little idea of 3D structure or motion.

    • Bop@lemmy.film · ↑2 · 9 months ago

      On a visual level, we are more interested in genitals than hands? Also, faces.

  • joelfromaus@aussie.zone · ↑12 · 9 months ago

    Went and had a look and it’s some of the funniest stuff I’ve seen all day! A few images come close to realism, but a lot of them are the sort of AI fever-dream stuff that you could not make up.

  • RBWells@lemmy.world · ↑11 ↓1 · 9 months ago

    Meh. It’s all only women and so samey samey. Not sexy IMO, but I don’t think fake is necessarily not hot; art can be, certainly.

    • Zerfallen@lemmy.world · ↑4 · 9 months ago

      You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.

  • Armen12@lemm.ee · ↑13 ↓3 · 9 months ago

    AI porn for the longest time has just looked so off to me, idk what it is

    • Buddahriffic@lemmy.world · ↑29 ↓1 · 9 months ago

      Some things I was able to put my finger on after looking at a bunch of the images in the feed:

      • It doesn’t do skin well, treating it more like a smooth plastic than a surface with pores, wrinkles, and fine hairs.
      • It doesn’t understand lighting, so shadows don’t agree with highlights or even each other.
      • It doesn’t understand anatomy. A lot of the images were fine in this regard but others had misplaced muscles, bones, and impossible limb positioning/truncation.
      • It has no idea how to draw vaginas. Nipples are also not well understood, though it does better on average with those. They still look more like a plastic than skin, but most of them were passable at least, while I didn’t see a single vagina that looked even close to right.

      • Cethin@lemmy.zip · ↑20 · 9 months ago

        To be fair, so many humans drawing porn don’t understand anatomy or what vaginas look like. It’s hard to train when your input data is already bad.

        • Buddahriffic@lemmy.world · ↑7 · 9 months ago

          Yeah, a lot of the vaginas on that site look like hentai vaginas. I can understand it more with AIs since vaginas have a ton of variance to them, so trying to make an “average” vagina will end up not looking like any actual one.

          But the artists who draw them like that just disappoint me (though with the caveat that I have no reason to believe I’d do any better if I were to draw that). There’s a ton of inspiration out there, and many do an amazing job with the rest of the body, so why do they make one of the important parts (for porn) look like an afterthought?

          Though the anime-style ones aren’t as bad as some others, where the AI seems like it’s trying to average boy and girl parts lol.

          • Spaz@lemmy.world · ↑6 · 9 months ago

            Tbh, if people don’t know the difference between a vagina and a vulva, I don’t expect AI to do a great job generating any good porn.

            • Buddahriffic@lemmy.world · ↑1 · 9 months ago

              Using a term casually rather than medically won’t affect the quality of AI porn. Though maybe ensuring it knows the difference between a labia, clit, urethra, vagina, and asshole will produce better results.

          • Piecemakers@lemmy.world · ↑3 · 9 months ago

            To be fair, the focus of this era’s porn has hardly ever been the vagina, under any amount of scrutiny. A few aspects of sex in porn prioritized higher are, in no particular order: bounce, sounds, phrasing, setting, texture, animism/passion, power roles, etc. Hell, I’d say more effort has been made to visually hide prophylactic use than to focus on the vagina itself.

            I’m not in any way saying I agree with this, simply pointing out the facts as they are these days.

    • lloram239@feddit.de · ↑3 · 9 months ago (edited)

      The images on the site aren’t very good (the typical low-detail airbrushed look), nor are they generated very fast. See the examples here (mostly SFW) for what you can actually do; it takes about 15 sec per image on a mid-range gaming PC.

      That said, one big limit of current AI models remains: it’s always images of a single subject; they can’t do multiple subjects or complex interaction. Also, facial expressions still always look quite bland. It can be worked around with inpainting and such, but plain text prompts have a hard time generating interesting images.

    • drekly@lemmy.world · ↑9 · 9 months ago

      CivitAI is a pretty perverted site at the best of times. But there’s a disturbing number of age-adjustment plugins for making images of children on the same site that has plugins for sex acts. It’s clear some people definitely are.

      • oats@110010.win · ↑4 · 9 months ago

        Some models also prefer children for some reason, and then you have to put mature/adult in the positive prompt and child in the negative.

        • lloram239@feddit.de · ↑3 · 9 months ago (edited)

          I think part of the problem is that there is a lot of anime in the models, and when you don’t filter that out with negative prompts it can distort the proportions of realistic images (e.g. everybody gets huge breasts unless you negative-prompt it away). In general, models are always heavily biased towards what they were trained on, and when you use a prompt or LoRA that worked well on one model on another, you can get weird results. There is always a lot of nudging involved with keywords and weights to get the images to where you want them.

    • inspxtr@lemmy.world · ↑9 ↓1 · 9 months ago
      9 months ago

      I remember reading that this may already be happening to some extent, e.g. people sharing tips on creating it on the deep web, maybe through prompt engineering, fine-tuning or pretraining.

      I don’t know how those models are made, but I do wonder whether the ones that need retraining/fine-tuning on real CSAM can be classified as breaking the law.

        • JackbyDev@programming.dev · ↑1 ↓1 · 9 months ago

          If a search engine cannot index it then it is the deep web. So yes, Discord chats are technically part of the deep web.

            • JackbyDev@programming.dev · ↑2 ↓1 · 9 months ago (edited)

              Wikipedia on the deep web

              The deep web,[1] invisible web,[2] or hidden web[3] are parts of the World Wide Web whose contents are not indexed by standard web search-engine programs.

              Try accessing a Discord channel through your browser without being logged in. They aren’t indexed by search engines because you have to be logged in.

                • JackbyDev@programming.dev · ↑1 ↓1 · 9 months ago

                  I don’t care about some arbitrary challenge to get money from you. I’m trying to get you to think critically. If search engines like Google don’t index it then it’s part of the deep web. Just because things like Discord aren’t what people typically mean when people talk about the deep web doesn’t make Discord chats not part of the deep web.

    • Rustmilian@lemmy.world · ↑8 ↓1 · 9 months ago (edited)

      Hentai, maybe. But realistic shit is 100% illegal; even just making such an AI would require breaking the law, as you’d have to use real CSAM to train it.

    • mrnotoriousman@kbin.social · ↑5 · 9 months ago

      There was an article the other day about underage girls in France having AI nudes spread around, based on photos of girls as young as 12. Definitely harm there.

      • Jesus_666@feddit.de · ↑4 · 9 months ago

        Typically, the laws get amended so that anything that looks like CSAM is now CSAM. Expect porn generators tuned for minor characters to get outlawed very quickly.

    • Knusper@feddit.de · ↑11 ↓8 · 9 months ago

      Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.

      Legality for your viewers will also differ massively around the world, so your target audience may not be very big.

      And you probably need investors, which likely have less risky projects to invest into.

      Well, and then there’s also the factor of some humans just not wanting to work on disgusting, legal grey area stuff.

      • Womble@lemmy.world · ↑19 · 9 months ago

        yup, just like the AI needed lots of pictures of astronauts on horses to make pictures of those…

        • JonEFive@midwest.social · ↑6 · 9 months ago

          Exactly. Some of these engines are perfectly capable of combining differing concepts. In your example, it knows basically what a horse looks like and what a human riding on horseback looks like. It also knows that an astronaut is essentially a human in a space suit, and it can put the two together.

          Saying nothing of the morality, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of people who are of age but are very slim or otherwise have a youthful appearance.

          While I think it’s repugnant in concept, I also think that for those seeking this material, I’d much rather it be AI-generated than an actual exploited child. Realistically though, I doubt that this would actually have any notable impact on the prevalence of CSAM, and it might even make it more accessible.

          Furthermore, if the generative AI gets good enough, it could make it difficult to determine whether an image is real or AI generated. That would make it more difficult for police to find the child and offender to try to remove them from that situation. So now we need an AI to help analyze and separate the two.

          Yeah… I don’t like living in 2023 and things are only getting worse. I’ve put way more thought into this than I ever wanted to.

          • Ryantific_theory@lemmy.world · ↑3 · 9 months ago

            Aren’t AI-generated images pretty obvious to detect from noise analysis? I know there’s no effective detection for AI-generated text, and it’s not that there won’t be projects to train AI to generate perfectly realistic images, but it’ll be a while before it does fingers right, let alone invisible pixel artifacts.

            As a counterpoint, won’t the prevalence of AI-generated CSAM collapse the organized abuse groups, since they rely on funding from pedos? If genuine abuse material is swamped out by AI-generated imagery, that would effectively collapse an entire dark-web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.

            That’s pretty good for 2023.

            • JackbyDev@programming.dev · ↑2 · 9 months ago

              With StableDiffusion you can intentionally leave an “invisible watermark” that machines can easily detect but humans cannot see. The idea is that in the future you don’t accidentally train on images that are already AI-generated. I’d hope most sites are doing that, but it can be turned off easily enough. Apart from that I’m not sure.
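
              For reference, here’s a minimal sketch of what checking for that watermark can look like with the invisible-watermark package, assuming the image was stamped the way the original Stable Diffusion reference scripts do it (the 136-bit string “StableDiffusionV1” embedded with the ‘dwtDct’ method); other tools use different payloads or turn it off entirely, and the file name below is just a placeholder:

              ```python
              import cv2
              from imwatermark import WatermarkDecoder

              # Load the image in BGR channel order (OpenCV's default).
              # "generated.png" is a placeholder path for illustration.
              bgr = cv2.imread("generated.png")

              # The original Stable Diffusion scripts embed the 17-byte (136-bit)
              # string "StableDiffusionV1" using the 'dwtDct' method; we assume
              # the same settings here.
              decoder = WatermarkDecoder("bytes", 136)
              payload = decoder.decode(bgr, "dwtDct")

              try:
                  print(payload.decode("utf-8"))  # "StableDiffusionV1" if the mark survived
              except UnicodeDecodeError:
                  print("No recognizable watermark (never embedded, or stripped away)")
              ```

              Decoding an unwatermarked image just returns noise bytes, so anything beyond this quick check would need a proper allow-list of known payloads.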

              • Ryantific_theory@lemmy.world · ↑1 · 9 months ago

                I could have sworn I saw an article talking about how there were noise artifacts that were fairly obvious, but now I can’t turn anything up. The watermark should help things, but outside of that it looks like there’s just a training dataset of pure generative AI images (GenImage) to train another AI to detect generated images. I guess we’ll see what happens with that.

      • d13@programming.dev · ↑5 · 9 months ago

        Unfortunately, no, you just need training data on children in general and training data with legal porn, and these tools can combine it.

        It’s already being done, which is disgusting but not surprising.

        People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).

    • 👁️👄👁️@lemm.ee · ↑2 ↓1 · 9 months ago

      You’d also have to convince them that it’s not real. It’ll probably end up leading to new laws, tbh. Then there are weird things like Japan, where lolis are legal but uncensored genitals aren’t, even drawn.