• TommySoda@lemmy.world · 13 days ago

    Let’s not do anything about the unregulated technology that can spread lies faster than ever before as websites get absolutely flooded with believable bots that outnumber the actual users. Let’s make secret passwords and handshakes like we’re in a clubhouse.

    Regardless, it’s not a bad idea, since it’s probably not gonna get better for a while, if at all.

    • Ledivin@lemmy.world · 13 days ago

      The technology is already out. While something should be done on the regulatory side, regulation doesn’t remove the technology from existence - you will still need other protections.

    • rudyharrelson@lemmy.radio · 13 days ago

      Regulations virtually always lag years behind technology, don’t they? In the interim period with absolutely no regulations, we must take it upon ourselves to protect ourselves and our loved ones from being exploited.

      Given just how wealthy the AI bubble is making some people, we may not see any common sense regulation for quite some time. Best to adapt to that reality imo. Gonna tell my friends and family to call me by my hacker alias, “X360N0_sc0peX” on the phone or I’ll assume they’re a bot.

    • zecg@lemmy.world · 12 days ago

      What can be done? You can download an LLM and run it locally; they’re not going away.

      • kibiz0r@midwest.social · 13 days ago

        vigilance

        Vigilance is like, not drinking the water that comes out of a nuclear reactor.

        What we’re talking about here is letting everyone run their own reactor and dump the waste into the street.

        You don’t gain vigilance, you lose all habitable public space.

        • TechLich@lemmy.world · 12 days ago

          It’s a bit late for that. This particular nuclear reactor is open source, free to download, and runs on consumer hardware. Can’t really unfry that egg, and the quality is getting better all the time. Identity fraud is already illegal in most places, so I’m not sure exactly what regulation would be appropriate here.

          • phneutral@feddit.org · 12 days ago

            First of all: you need giant data centres to train the models.

            Identity fraud is illegal, and copyright theft is illegal as well — put the blame on the owners of the data centres.

            I know from reliable sources that governments know who these folks are.

            • TechLich@lemmy.world · 12 days ago

              Not entirely true. You don’t need your own personal data centre; you can use GPU cloud instances for a lot of that stuff. It’s expensive, but not so expensive that it would be impossible without being a huge tech company (thousands of dollars, not billions). This can be done by anyone with a credit card and some cash to burn. Also, you don’t need to train a model from scratch; you can build on existing models that others have published to cut down on training.
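              As a rough order-of-magnitude check on that claim (the GPU count, hourly rate, and run length below are illustrative assumptions, not real price quotes):

```python
# Back-of-the-envelope cost of fine-tuning on rented cloud GPUs.
# All figures are illustrative assumptions, not real price quotes.

def cloud_finetune_cost(gpus: int, usd_per_gpu_hour: float, hours: float) -> float:
    """Total rental cost in USD for a multi-GPU fine-tuning run."""
    return gpus * usd_per_gpu_hour * hours

# Example: 8 rented data-centre GPUs at ~$2/GPU-hour for a week-long run.
cost = cloud_finetune_cost(gpus=8, usd_per_gpu_hour=2.0, hours=7 * 24)
print(f"${cost:,.0f}")  # thousands of dollars, not billions
```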

              However, to impersonate someone’s voice you don’t need any of that. You only need about 5-10 seconds of audio for a zero-shot impersonation with a pre-trained model. A minute or so for few-shot. This runs on consumer hardware and in some cases even in real time.

              Even to build your own model from scratch for high-quality voice audio, there doesn’t need to be a huge amount of initial training data. Something like XTTS was trained with about 10-15K hours of English audio, which is actually pretty easy to come by in the public domain. There are a lot of open, public research datasets specifically for this kind of thing, no copyright infringement necessary. If a big tech company wants more audio data than what’s publicly available, they just pay people to record it; there’s no need to steal it or risk copyright claims and breaking surveillance laws when they have a budget to pay people to record whatever they want.

              This tech wasn’t invented by some evil giant tech company stealing everybody’s data, it was mostly geeky computer scientists presenting things at computer speech synthesis conferences. That’s not to say there aren’t a bunch of huge evil tech companies profiting from this or contributing to this kind of tech, but in the context of audio deepfakes being accessible to scammers, it’s not on them and I don’t think that some kind of extra copyright regulation on data centres would do anything about it.

              The current industry leader in this space, in terms of companies trying to monetize speech synthesis, is ElevenLabs, which is a private start-up with only a few dozen employees.

              The current tech is not perfect, but it’s definitely good enough to fool someone who isn’t thinking too hard over a noisy phone call, and a scammer doesn’t need server time or access to a data centre to do it.

    • arglebargle@lemm.ee · 12 days ago

      Websites have been full of shit, bots or not, since forever. Nothing new here.