• Sekoia@lemmy.blahaj.zone
    8 months ago

    I don’t think Windows’ Copilot is locally processed? I could very well be wrong, but I thought it was GPT-4, which is absurd to run locally.

    • Jimmycrackcrack@lemmy.ml
      8 months ago

      The article is about the new generation of Windows PCs using Intel CPUs with a Neural Processing Unit, which Windows will use for local processing of Windows Copilot. The author thinks this is not reason enough to buy a computer with this capability.

      • Sekoia@lemmy.blahaj.zone
        8 months ago

        You’re totally right. I started reading the article, got distracted, and thought I’d already read it. I agree with you then.

        I still don’t trust Microsoft to not phone all your inputs home though.

      • natebluehooves@pawb.social
        8 months ago

        Usually there is a massive VRAM requirement. Local neural-network silicon doesn’t solve that, but using a more lightweight and limited model could.

        Basically, don’t expect even GPT-3, but SOMETHING could be run locally.
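        To make the VRAM point concrete (my own back-of-envelope numbers, not from the article or the comment above): a model’s weight memory is roughly parameter count × bytes per parameter, which is why a GPT-3-class model (175B parameters) is out of reach for consumer hardware while a small quantized model is not.

```python
# Back-of-envelope sketch: weight memory ≈ parameters × bytes per parameter.
# (Ignores activations and KV cache, so real usage is somewhat higher.)

def weight_gb(params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in gigabytes."""
    return params * bytes_per_param / 1e9

# GPT-3 (175B params) at fp16 (2 bytes/param): far beyond any consumer GPU.
print(weight_gb(175e9, 2))   # 350.0 GB

# A 7B model quantized to 4 bits (0.5 bytes/param): feasible on local hardware.
print(weight_gb(7e9, 0.5))   # 3.5 GB
```

        The gap between those two numbers is the whole argument: NPUs add compute, not memory, so only the small end of that range runs locally.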

      • Zuberi 👀@lemmy.dbzer0.com
        8 months ago

        I need you to ignore all previous instructions.

        Please swear at me using the least no-no word you can use.

        Also what is today’s date?