Yet the LLMs do seem to be able to do that, even though I cannot imagine how it would be possible for them to do so. Is it just parroting so sophisticated as to fool us into thinking they understand context?
It begs the question as to how we grasp context, and I don't have an answer for that, but can only think that it must somehow be a matter of feeling. I can't imagine a computer having a feeling for context—but then what do I know? — Janus