  • Not quite. The issue is that LLMs aren’t designed to solve math; they are designed to “guess the next word,” so to speak. So if you ask a “pure” LLM what 1 + 1 is, it will simply spit out the most common answer.

    LLMs with integrations/plugins can likely manage fairly complex math, but only problems that something like Wolfram Alpha could already solve, because the LLM is essentially just polling an external service to get the answer (a rough sketch of that hand-off is below).

    At no point is the LLM going to start doing complex calculations on the CPU currently running the LLM.
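
    To make that concrete, here is a minimal, purely hypothetical sketch of the hand-off: the model only emits a structured tool call, and separate host code (standing in for something like Wolfram Alpha) does the actual arithmetic. The tool-call format and the `solve_math` helper are invented for illustration and don’t reflect any particular plugin API.

    ```python
    import ast
    import operator

    # Tiny arithmetic evaluator standing in for an external solver
    # (e.g. Wolfram Alpha) in this illustration.
    _OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
            ast.Mult: operator.mul, ast.Div: operator.truediv}

    def solve_math(expression: str) -> float:
        """Evaluate a basic arithmetic expression outside the model."""
        def _eval(node):
            if isinstance(node, ast.Constant):
                return node.value
            if isinstance(node, ast.BinOp):
                return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
            raise ValueError("unsupported expression")
        return _eval(ast.parse(expression, mode="eval").body)

    def answer(llm_tool_call: dict) -> str:
        # The LLM's only job is to produce something like
        # {"tool": "calculator", "input": "1 + 1"}; the host code runs it.
        if llm_tool_call.get("tool") == "calculator":
            return str(solve_math(llm_tool_call["input"]))
        return "(otherwise the model just predicts the most likely next words)"

    print(answer({"tool": "calculator", "input": "1 + 1"}))  # -> 2
    ```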

  • Let’s say notifications are like walkie-talkies. You push a button, and it sends an alert or your voice to the paired device. Neither one stores the information; they just relay it to each other. Now, in this case the government has issued a court order stating that a third party must be given a walkie-talkie able to understand the information transmitted by the first. There is still no storage being done, but that third party now receives everything being broadcast.

    It’s not about not having the information. You don’t actually need to store it anywhere to facilitate communication, at least beyond holding it briefly in memory, which most would agree doesn’t constitute storage in this situation (see the sketch below).

    Now, could that third party store the information? Absolutely.
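
    To illustrate the relay-only point, here is a purely hypothetical sketch (not any real push-notification API): messages are handed to every registered receiver and never written to storage, and adding the court-ordered listener is just registering one more receiver.

    ```python
    from typing import Callable, List

    class Relay:
        """A walkie-talkie style relay: it forwards messages, it doesn't store them."""

        def __init__(self) -> None:
            self._receivers: List[Callable[[str], None]] = []  # no message log anywhere

        def pair(self, receiver: Callable[[str], None]) -> None:
            # Register a device -- or, under a court order, a third party.
            self._receivers.append(receiver)

        def push(self, message: str) -> None:
            # The message exists only in memory while it is handed off.
            for receiver in self._receivers:
                receiver(message)

    relay = Relay()
    relay.pair(lambda msg: print(f"paired device got: {msg}"))
    relay.pair(lambda msg: print(f"third party also got: {msg}"))  # the court-ordered listener
    relay.push("on my way")
    ```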