Will A.I. have the capacity for introspection to "know" the meaning of folklore and stories? Yes, I am going beyond 2021 to perhaps 2051, to ask a two-part question, the first part posed by Alan Turing in 1950: Can machines think? Turing set up a test, now known as the Turing test. Suppose you were placed behind a screen, unable to see whether the questions you posed were being answered by a machine [the term "computer" was not widely used back then] or by a human, and they received your questions and you received their answers in typewritten form only (that is, neither you nor they depended on a human voice). Could you correctly judge which respondent was human? For philosophers, this is rich new territory. Would Descartes's "I think, therefore I am" ever apply to a computerized "brain"?
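For readers who like to see the thought experiment made concrete, here is a minimal sketch of the imitation game as a judging loop. Everything in it is illustrative and assumed rather than drawn from Turing's paper: the canned respondents, the random "judge," and the question list are all hypothetical stand-ins, meant only to show the shape of the setup in which text alone must reveal who is human.

    # A minimal thought-experiment sketch of the imitation game.
    # The respondents, judge, and questions below are hypothetical placeholders.
    import random

    def machine_respondent(question: str) -> str:
        # Stand-in for the hidden machine: a canned, evasive answer.
        return "I would rather not say, but it is an interesting question."

    def human_respondent(question: str) -> str:
        # Stand-in for the hidden human: an equally terse, typewritten reply.
        return "Hard to answer briefly, but I will try: it depends."

    def imitation_game(questions, judge):
        """Hide the respondents behind labels A and B; the judge guesses
        which label is the human. Returns True if the guess is correct."""
        respondents = {"A": machine_respondent, "B": human_respondent}
        # Shuffle which label hides which respondent, as the screen would.
        if random.random() < 0.5:
            respondents = {"A": human_respondent, "B": machine_respondent}
        transcript = {label: [fn(q) for q in questions]
                      for label, fn in respondents.items()}
        guess = judge(questions, transcript)  # the judge sees text only
        return respondents[guess] is human_respondent

    def naive_judge(questions, transcript):
        # A toy judge: guesses at random, since these answers give little away.
        return random.choice(["A", "B"])

    if __name__ == "__main__":
        qs = ["Can you write a sonnet?", "What does a story mean to you?"]
        trials = 1000
        correct = sum(imitation_game(qs, naive_judge) for _ in range(trials))
        print(f"Judge identified the human in {correct}/{trials} trials")

The point of the sketch is only this: the judge has nothing but text, so the whole question of "thinking" is pushed into what can be inferred from words on a page.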
The second question, related to the first and likely dependent on it, is: Can machines be created to feel? That is, let's suppose the computerized brain is self-aware as a thinking device. Can it also be programmed to "feel" a response to its own existence? Perhaps more significantly for the human race, can it experience "feelings" about the human race? If these kinds of questions strike you as too improbable to be taken seriously, then treat them as thought experiments about how human brains work. What if A.I. were programmed to self-learn at such a rapid rate that it moved from the goal of merely being to the goal of dominating the intelligence pyramid? [Which is exactly what the human race currently does with the animate and inanimate environment.]
A lot of smart people are freaking out about the potential for A.I. to outthink the human race on multiple dimensions over multiple future decision points. Basically, the angst goes like this: What if A.I. networks become "self-learning"? That is, what if they begin to program themselves based on algorithms we've given them, but over which they take control? Since they could outthink us far faster than we could counter-think them, if they decided to take a path in their self-interest that is inimical to ours, would we become expendable to them, or, if not destroyed, enslaved? They could very well blackmail the human race with threats of restricting the food supply, shutting down the financial system, or poisoning the water supply. Is this only science fiction? People once thought the same of video technology and space travel.
These are not new questions, and any one of us can research the current state of affairs. My questions go to something slightly different: Stories. Humans organize reality through folklore, to convey how the world came to be and how we are best to live in that world. What would be the folklore A.I. would create for itself in telling how it liberated itself from its maker, even as Adam freed himself from God by landing butt first outside the Garden of Eden?