I want to see old AI papers compared to today's work: we've basically torn the guts out of everything. I don't even think most of Minsky is applicable anymore (perceptrons in particular have given way to learned vector embeddings like word2vec).
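For what it's worth, here's a minimal sketch (Python/NumPy, toy sizes and randomly initialized vectors, so purely illustrative) of the two ideas being contrasted: a classic perceptron unit making a hard threshold decision versus a word2vec-style embedding table, where "meaning" lives in the geometry of learned vectors.

```python
import numpy as np

# Classic perceptron (Rosenblatt/Minsky era): weighted sum + hard threshold.
def perceptron(x, w, b):
    return 1 if np.dot(w, x) + b > 0 else 0

# word2vec-style representation: each word maps to a dense vector; words are
# compared by similarity in that vector space rather than a yes/no output.
rng = np.random.default_rng(0)
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embeddings = rng.normal(size=(len(vocab), 50))  # toy 50-dim table, untrained

def embed(word):
    return embeddings[vocab[word]]

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Perceptron: hard decision on a feature vector.
print(perceptron(np.array([1.0, -2.0]), np.array([0.5, 0.5]), 0.0))

# Embeddings: similarity score; with *trained* word2vec vectors this famous
# analogy (king - man + woman ~ queen) scores high, here it's just random.
print(cosine(embed("king") - embed("man") + embed("woman"), embed("queen")))
```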
Statistical modeling and machine learning theory go back several decades. I'm not sure LLMs even use new algorithms; they may just apply various techniques that improve the performance and accuracy of pre-existing ones.
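To illustrate the "old algorithms, new scale" point: the output layer of a modern LLM is essentially multinomial logistic regression (softmax plus cross-entropy, trained by gradient descent), all of which predates transformers by decades. A toy sketch with made-up dimensions, not any particular model's code:

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_dim, vocab_size = 8, 100  # toy sizes; real models use far larger ones

W = rng.normal(scale=0.02, size=(vocab_size, hidden_dim))  # "unembedding" matrix

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def sgd_step(h, target, lr=0.1):
    """One gradient-descent step on next-token cross-entropy, given a hidden
    state h and the index of the correct next token."""
    global W
    probs = softmax(W @ h)            # predicted distribution over the vocab
    grad = np.outer(probs, h)         # d(loss)/dW for softmax + cross-entropy
    grad[target] -= h                 # subtract the one-hot target's contribution
    W -= lr * grad
    return -np.log(probs[target])     # cross-entropy loss for this example

h = rng.normal(size=hidden_dim)       # stand-in for a transformer's hidden state
print(sgd_step(h, target=3))
```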