Initially, a 'new' AGI will surely base what it labels its current 'knowledge' maximums, or what it is most confident that it knows for sure (for want of a better way to explain myself here), on its previously stored knowledge, and its stored knowledge will include a description of what a human consciousness is.
I imagine that what I call "AGI —> ASI" will never anthropomorphize itself, no matter how perfectly it will mimic humans, to the degree it engineers its own 'synthetic phenomenology', in effect, dumbing itself down with a metacognitive blindfold (i.e. keyhole). — 180 Proof
If we do find that is the reality of an exchange between us, then sure, we should pause, regroup, and see if we can find a better common ground which offers some value to both of us. If not, then we should 'pause' again and find a more fruitful exchange somewhere down the line.
and we'll just have to go on talking – speculating on incommensurable data sets – past one another. — 180 Proof
Confirms for me that you do agree that the concept of self does not, in your view (in line with Mr Metzinger), manifest as a REAL existent.
I would not have recommended Metzinger's work several times if I did not find it compelling as corroborating my own speculations — 180 Proof
So, you want me to stop doing something that I did not do? In what way do you conflate:
Don't put the sign "=" between you and ↪180 Proof when it comes to me. — Eugen
with
I see some crossover between our discussion here, and your discussion with Eugen on the Consciousness - Fundamental or Emergent Model thread. — universeness
:roll:
Don't put the sign "=" between you and ↪180 Proof when it comes to me. — Eugen
I'm not sure I'm a novice to 180Proof, and I do understand your answers. So when you tried to compare your "novice" with me (whether in regard to you or him), I think you're wrong.
My experience as a teacher, however, puts the burden of patience on me. I only get really frustrated with a novice if they are asking me questions but constantly demonstrate an inability to understand my answers, or do understand my answers but refuse to accept the academia behind them, without good reason. — universeness
- My bad, so don't worry!
You have simply misunderstood my reference to you and your recent thread. Let me clarify.
My use of the word 'novice' in my response to 180proof contained no stealth intent to relate IN ANY WAY to you. — universeness
As for summarizing ... that's all I've been doing in our exchanges on this topic over dozens of posts. We're here to inform, maybe inspire & intrigue, not spoon-feed each other. — 180 Proof
"Who am I?" A persona (mask) – a dynamic, virual assemblage of perdurant bodily, cognitive & demographic data aka "self") – I believe I am: "the name" to which I've learned to involuntarily answer. Who else could be? — 180 Proof
He may have the knowledge; I'm skeptical about his skills, though. But I'm still waiting... — Eugen
I believe I am: "the name" to which I've learned to involuntarily answer. — 180 Proof
Exactly. :up:
This seems akin to world lines, do you agree? — universeness
No. That's just you talking human talk. What does "in time" mean to you? Explain that first. Then try to analyze, for example, the retrieval of information by a computer. The human mind cannot retrieve all words simultaneously from a written text and not get a jumbled mess of information.
A computer does what it does IN time. Anything mathematical is an event that happens in time. — universeness
In a manner of speaking, we perceive time as past or present. We also perceive time in terms of duration – how long or how short.
I've no idea what you mean by "perceive time" or "temporal mind". — 180 Proof
If ever an AGI is created, it still would not be sentient as humans are sentient, or, in our usual term, conscious. The measure of consciousness also involves our fundamental propensity to inaccuracy or error, owing to the fact that our perceptual qualities have developed naturally, and over time, through actual experiences with objects. It is a lived experience, not something created in a laboratory or simulation.
This, however, would not be an intrinsic, or fundamental, feature or property of AGI itself, and therefore, it wouldn't (need to) be sentient – certainly not as we conceive of sentience today. — 180 Proof