Yes, LLMs do not have the human-level capacity to produce and utilize mental models. And it seems likely that if AGI is to be developed, it will have to include such a capacity for mental modelling.
How might such a capacity be installed in AI? A recent paper of mine, published in the journal Biosystems, outlines how mental modelling evolved and develops in humans. This understanding can offer insights into how the capacity could be incorporated into AI. The paper, titled "The Evolution and Development of Consciousness: The Subject-Object Emergence Hypothesis", is freely available here: https://www.sciencedirect.com/science/article/pii/S0303264722000752
Thank you for sharing; the evolutionary path is quite interesting to me.