How can we ask “are we closer” if we don’t know what the destination is?
LLMs might still end up being an interesting special-purpose system, perhaps with fairly broad applications, but in a direction that’s different from where true AGI ends up coming from.
That seems beside the point when the question is whether we're getting closer to it or not.