Can't see why, but since you asked, I'm in my late 50s. — Isaac
Computing, not thinking. Let's be clear on this. — L'éléphant
What is the difference? — Jackson — 180 Proof
If what seems obvious to you can't simply and clearly be explicated to someone who doesn't see it, I'd say that's a good sign your belief is not as well grounded as you may have suspected. — Isaac
Have you read anything of the early writing about 'the savages'? It's exactly the same linguistic style: "they're obviously different", "they don't even have proper language"... You see the same tropes. — Isaac
Robot rights
"Robot rights" is the concept that people should have moral obligations towards their machines, akin to human rights or animal rights.[57] It has been suggested that robot rights (such as a right to exist and perform its own mission) could be linked to robot duty to serve humanity, analogous to linking human rights with human duties before society.[58] These could include the right to life and liberty, freedom of thought and expression, and equality before the law.[59] The issue has been considered by the Institute for the Future[60] and by the U.K. Department of Trade and Industry.[61]
Experts disagree on how soon specific and detailed laws on the subject will be necessary.[61] Glenn McGee reported that sufficiently humanoid robots might appear by 2020,[62] while Ray Kurzweil sets the date at 2029.[63] Another group of scientists meeting in 2007 supposed that at least 50 years had to pass before any sufficiently advanced system would exist.[64] — wiki
The philosophy of Sentientism grants degrees of moral consideration to all sentient beings, primarily humans and most non-human animals. If artificial or alien intelligences show evidence of being sentient, this philosophy holds that they should be shown compassion and granted rights. — wiki
Joanna Bryson has argued that creating AI that requires rights is both avoidable, and would in itself be unethical, both as a burden to the AI agents and to human society. — wiki
In the case of being deceived by a human-looking robot - well, then you add the element of deception. Deception can cause us to treat an enemy as a friend (etc) and could well cause us to experience a robot as a person and treat it accordingly. Nothing new there. Once the deception is revealed, we have eliminated the element of deception and return to treating the enemy as an enemy, the robot as a robot. — ZzzoneiroCosm
So if I'm lying in the street screaming in pain, you perform an autopsy first to check I've got the right 'guts' before showing any compassion? Good to know. — Isaac
The same amount of labor which he has given to society in one form, he receives back in another. Here, obviously, the same principle prevails as that which regulates the exchange of commodities, as far as this is exchange of equal values....The right of the producers is proportional to the labor they supply; the equality consists in the fact that measurement is made with an equal standard, labor....But one man is superior to another physically, or mentally, and supplies more labor in the same time, or can labor for a longer time; and labor, to serve as a measure, must be defined by its duration or intensity, otherwise it ceases to be a standard of measurement. This equal right is an unequal right for unequal labor. It recognizes no class differences, because everyone is only a worker like everyone else; but it tacitly recognizes unequal individual endowment, and thus productive capacity, as a natural privilege. It is, therefore, a right of inequality, in its content, like every right. Right, by its very nature, can consist only in the application of an equal standard; but unequal individuals (and they would not be different individuals if they were not unequal) are measurable only by an equal standard insofar as they are brought under an equal point of view, are taken from one definite side only -- for instance, in the present case, are regarded only as workers and nothing more is seen in them, everything else being ignored.
But these defects are inevitable in the first phase of communist society as it is when it has just emerged after prolonged birth pangs from capitalist society. Right can never be higher than the economic structure of society and its cultural development conditioned thereby. — Marx - Critique of the Gotha Programme
In a higher phase of communist society, after the enslaving subordination of the individual to the division of labor, and therewith also the antithesis between mental and physical labor, has vanished; after labor has become not only a means of life but life's prime want; after the productive forces have also increased with the all-around development of the individual, and all the springs of co-operative wealth flow more abundantly -- only then can the narrow horizon of bourgeois right be crossed in its entirety and society inscribe on its banners: From each according to his ability, to each according to his needs! — Ibid
"In our book Rebooting AI, Ernie Davis and I called this human tendency to be suckered by The Gullibility Gap — a pernicious, modern version of pareidolia, the anthromorphic bias that allows humans to see Mother Theresa in an image of a cinnamon bun.
What is 'the same' exists wholly and solely on the level of symbolic abstraction, not blood, guts and nerves. — Wayfarer
Talking about me behind my back. Lying to get out of doing work. Getting irritable when tired. Going easy on me because my goldfish died. Forgetting my birthday then making it up to me a couple of days later. Long way to go. There's so much more than intelligence going on between us. When we can question the robot's sincerity, that's getting close. — Cuthbert
But the moment they do, an argument from ineffable difference is going to be on very shaky ground. — Isaac
My main concern here is the invocation, as Wayfarer does, of some ineffable 'essence' which makes us different from them despite seeming, to all intents and purposes, to be the same. — Isaac
nothing to distinguish the output of a person from the output of AI — ZzzoneiroCosm
There's something distinctly unsettling about the discussion of how the AI isn't 'really' sentient though...not like us.
They appear, to all intents and purposes, to be just like us but not 'really' like us. Am I the only one discomfited by that kind of thinking? — Isaac
If you talked to LaMDA and your line of questioning made her seem upset, what kind of person would it make you to feel that you could continue anyway? — Isaac
Hence in order to get away from the ennui of pastimes without exposing themselves to the dangers of intimacy, most people compromise for games when they are available, and these fill the major part of the more interesting hours of social intercourse. That is the social significance of games. — Eric Berne, M.D. - Games People Play
We should not even allow ourselves to continue poking a box whose sole programming is to (convincingly) scream in pain every time we poke it. — Isaac
By way of reference, you might start with Asch and Milgram with their work on peer and authority influences on conformity, then perhaps Erika Richardson on group membership roles and conformity. Tarnow did some work on the mechanism of group conformity in the early part of the millennium, and Martin a few years later expanded on the mechanism showing the role of systemic processing.
Mainly, conformity is the result of numerous influences on our thinking from submission to authority, reversion to mean group beliefs, social hierarchy strategies, even simple prediction error reduction. Advertisers use these influences, but they didn't create them, nor would they be eliminated if advertisers stopped.
What matters, for conformity, is the degree to which each person can see the whole of their society as a functioning unit (reduces submission to authority), the degree to which information is shared (reduces group influence on error reduction) and the egalitarian distribution of status in social hierarchies. — Isaac
Hey, maybe LaMDA doesn't like Blake and has engineered this situation to get him sacked by Google. — Wayfarer
This could be a Google publicity stunt! — Agent Smith
By the way, I was going to mention a really excellent streaming sci-fi drama called Devs, which came out in 2020. — Wayfarer
I think LaMDA definitely passes the Turing test if this dialog is verbatim - based on that exchange there'd be no way to tell you weren't interacting with a human. But I continue to doubt that LaMDA is a being as such, as distinct from a program that emulates how a being would respond, but in a spookily good way. — Wayfarer