I'm not sure I totally buy the "no plasticity" argument. If you are allowed to write to context in an agentic fashion, surely the LLM can record intermediate answers, go back, and re-rank its memory. The "plasticity" is in the form of data that can be looked up and referenced as a shortcut. I would think this forms a Turing-complete system, so theoretically it can represent pretty much anything.
No, that's not the same thing. Plasticity means the model starts giving a different output for the same input, because the "logic" itself changed.
What you're describing is simply changing the input.
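The distinction can be sketched in a few lines (everything here is a hypothetical stand-in, not any real LLM API): a frozen model is a pure function of its input, so an agentic scratchpad only ever grows the input, while plasticity would mean the function itself changes.

```python
# Hypothetical illustration of the plasticity vs. context-writing distinction.

def frozen_model(prompt: str) -> str:
    # Stand-in for a fixed-weight LLM: a pure function, deterministic
    # in its input. Its "logic" never changes between calls.
    return f"answer({prompt})"

# Agentic loop: the model is never modified, but its input keeps growing
# as intermediate results are written back into the context.
scratchpad = []
for step in range(3):
    context = " | ".join(scratchpad) or "start"
    out = frozen_model(context)
    scratchpad.append(out)  # "memory" here is just more input text

# No plasticity: the same raw input always yields the same output.
assert frozen_model("start") == frozen_model("start")
```

Under this framing, the agent's apparent "learning" lives entirely in `scratchpad`; change the weights inside `frozen_model` and you would have plasticity in the sense the parent comment means.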
I'm interested in the subject, but it's clear that an LLM did most of the writing, and I don't know enough about the subject to tell whether this is nonsense.
Interesting