The problem with saying LLMs are AI - let alone a step towards AGI - is that they cannot create.
I’m not sure there’s an intrinsic difference between humans and LLMs here. What we do, children included, is largely re-hashing and recombining what we’ve seen or heard. I think it would be very difficult to prove that people come up with completely brand-new ideas without any external inspiration (i.e., training input).
The examples don’t really support your point. The GPT output is pretty good given the ask; I’m not sure my daughter would fare better.