> Goodspeed wrote: ↑21 Apr 2023, 07:57
> > I took a look at this video. What cracked me up was seeing how superficially he declared that ChatGPT has a theory of mind because it could interpret, from textual statements, what each protagonist thought. But that's just parsing what each interlocutor said at some point, what they witnessed, and how that can be assigned to their mind-variable.
>
> You have this tendency to simplify these processes when an AI does it, but to consider it special when a human does it. What you've just described is exactly how we ourselves parse this information and come up with an answer.

Yeah, because theory of mind is more than just keeping records of what's been said.
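To make the point concrete: the "mind-variable" bookkeeping described above really is just a few lines of code. Here is a toy sketch (entirely hypothetical, using the classic Sally-Anne false-belief story) showing that a program can pass such a text-based test by recording who witnessed what, with no modelling of minds at all:

```python
# Toy sketch: the Sally-Anne false-belief test "passed" by record-keeping.
# Each agent's "mind-variable" is simply the last location they witnessed.
# All names and the event format are invented for illustration.

def run_story(events):
    """events: list of (agents_present, actor, object_location)."""
    beliefs = {}  # agent -> believed location of the marble
    for agents_present, actor, location in events:
        for agent in agents_present:
            beliefs[agent] = location  # only witnesses update their record
    return beliefs

# Sally puts the marble in the basket (both present), then leaves;
# Anne moves it to the box while Sally is away.
events = [
    ({"Sally", "Anne"}, "Sally", "basket"),
    ({"Anne"}, "Anne", "box"),
]
beliefs = run_story(events)
print(beliefs["Sally"])  # basket  <- where Sally will look for it
print(beliefs["Anne"])   # box
```

The program "knows" Sally will look in the basket, yet there is plainly no theory of mind here, only a dictionary of witnessed facts.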
Two mates are walking on the sidewalk; they see a guy with a big mass of frizzy hair and a confused look, they glance at each other and start laughing hard. That's theory of mind too, and something an AI couldn't understand unless it were specifically and artificially programmed to read video data, detect facial expressions and behaviour, and run software that assigns emotional scores and weights to each facial expression and each sequence of gestures and behaviours. Then, based on how the situation unfolds, it could construct some kind of emotional logic for the scene, from which it would artificially "read" the emotional context. But it would have no chance if those guys started laughing at an in-joke, something only they knew and used to make fun of, some joke about that kind of hair or something.
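The "emotional scores and weights" pipeline imagined above might look like this toy sketch. Every cue name and weight here is invented for illustration; the point is that the program can only score cues it was built to detect, so the in-joke case leaves it with nothing to work on:

```python
# Hypothetical sketch of an artificial "emotional context" reader:
# detected facial expressions and gestures are mapped to hand-assigned
# emotion weights, which are summed into a crude context label.
# Cue names and weights are made up for illustration.

EMOTION_WEIGHTS = {
    "smile":        {"amusement": 0.6},
    "raised_brows": {"surprise": 0.5, "amusement": 0.1},
    "mutual_gaze":  {"amusement": 0.3},  # the two friends look at each other
    "loud_laugh":   {"amusement": 1.0},
}

def read_emotional_context(detected_cues):
    """Sum per-emotion weights over detected cues; return the top emotion."""
    scores = {}
    for cue in detected_cues:
        for emotion, weight in EMOTION_WEIGHTS.get(cue, {}).items():
            scores[emotion] = scores.get(emotion, 0.0) + weight
    return max(scores, key=scores.get) if scores else "unknown"

print(read_emotional_context(["mutual_gaze", "smile", "loud_laugh"]))
# amusement -- but an in-joke produces no detectable cue to score
```

Note the failure mode this makes visible: the laughter caused by a private in-joke looks identical on the outside, so the scoring table can label *that* there is amusement but never *why*, which is exactly the gap being argued here.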
Theory of mind also entails reading someone else's state of mind: realising that they, too, find funny or sus what you find funny or sus. An AI, again, could never live this; it could only read it autistically, through a program made to parse gestures and facial expressions as digital DATA. We do this without even thinking; the brain's amygdala handles it automatically, superfast, within a few tens of milliseconds.