But do we know we are not Markov chains? Aren't subplots and character arcs just short time horizons? Don't we parrot things and produce derivative work? Whatever the differences may be, GPT can explain an idea to me better than most people on the street. GPT can compose a sonnet better than most people on the street. It composes faster than I do and synthesizes more subject matter than I can. Soon, GPT will be followed by an AI that can explain an idea and write poetry better than anyone on the street. It won't matter how; it will just matter that it does. I doubt that computers play chess the way we humans do, but they win at chess every time now. We can argue that they aren't really playing chess, but how strong is the argument that we are? We may have created a new manner of thinking. It isn't the manner of thinking that biological processes produced in us, so it is most likely different, but it also doesn't share our limitations. I expect that any characteristic humans can point to can be grokked and optimized for by AI.