L4sBot@lemmy.world to Technology@lemmy.world · English · 11 months ago
2023 was the year that GPUs stood still (arstechnica.com)
cross-posted to: gadgets@lemmit.online, hackernews@lemmy.smeargle.fans, hackernews@derp.foo, technology@lemmit.online
2023 was the year that GPUs stood still: A new GPU generation did very little to change the speed you get for your money.
barsoap@lemm.ee · 11 months ago
Have you tried ML workloads? Put differently: how is compatibility with stuff that expects CUDA/ROCm? Because the A770 is certainly the cheapest way to get 16 GB nowadays.
CalcProgrammer1@lemmy.ml · 11 months ago
No, I don’t use any ML stuff or really anything that uses GPU compute at all. I just use it for gaming and other 3D applications.
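For context on the compatibility question raised above: code written against CUDA usually hard-codes the "cuda" device, so getting it onto an A770 typically means routing through PyTorch's XPU backend instead. Below is a minimal sketch of that fallback, assuming PyTorch is installed (the XPU backend ships with recent PyTorch releases; older versions get it from the intel_extension_for_pytorch package):

```python
# Minimal device-selection sketch for the CUDA/ROCm-compatibility question.
# NVIDIA cards (and ROCm builds of PyTorch, which reuse the "cuda" device
# name) go through torch.cuda; Intel Arc cards like the A770 use the XPU
# backend instead.
import torch

def pick_device() -> torch.device:
    """Prefer CUDA, fall back to Intel XPU, then CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    # hasattr guards older PyTorch builds without the XPU backend
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(1024, 1024, device=device)
y = x @ x  # the matmul runs on whichever backend was selected
print(f"ran on: {y.device}")
```

In practice, "expects CUDA" usually means a framework that only probes torch.cuda; on Arc hardware that check fails and the code silently falls back to CPU unless it also knows about the XPU device.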