Most people assume it must work like human brains and human consciousness. Can it not just be its own thing, with the qualities it has and the ones it doesn't?
LLMs clearly don't have a stateful, human-like consciousness, but they do have some semantic understanding and build a world model when they are large enough. Image models have some grasp of 3D space.
They are neither sentient nor a stochastic parrot.