Unfortunately, no: mods on forums keep an eye out for offensive posts of the highly emotional type; the more emotionally charged a post is, the greater the likelihood of being banned by the panel of mods. This, if nothing else, demonstrates that mods on many forums have it backwards. Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like.
What gives? — TheMadFool
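Read literally, the moderation policy described above is a one-rule classifier: score a post's emotional charge and ban above a threshold. Here is a toy Python sketch of that "Reverse Turing Test"; the marker list, scoring, and threshold are invented purely for illustration, not how any real forum works.

    # Toy sketch of the "Reverse Turing Test" moderation rule described above:
    # ban whatever looks most human (emotionally charged), retain whatever
    # looks machine-like. Markers and threshold are made up for illustration.
    EMOTIONAL_MARKERS = {"!!", "idiot", "outraged", "hate"}
    BAN_THRESHOLD = 2  # hypothetical: two or more markers reads as "too human"

    def emotional_charge(post: str) -> int:
        """Count crude markers of emotional charge in a post."""
        return sum(post.count(marker) for marker in EMOTIONAL_MARKERS)

    def reverse_turing_test(post: str) -> str:
        """Ban the emotional (human-like); retain the flat (machine-like)."""
        return "ban" if emotional_charge(post) >= BAN_THRESHOLD else "retain"

    print(reverse_turing_test("You idiot!! I hate this thread!!"))  # ban
    print(reverse_turing_test("I calculate an 83% probability."))   # retain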
Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like. — TheMadFool
Probably a dearth of bots and an embarrassment of riches when it comes to the other. Eliminating bots is not the mods' only problem. — Kenosha Kid
Yeah, death by sterilization.
First, let's make sure that the mods themselves are not bots. Now that we've gotten that out of the way, we can think of life/civility balance.
I guess it lies in the rules they laid down. Then, an unwanted consequence: civil, but all bots themselves. Messy and emotionally charged, but real humans. — Caldwell
Rules that favor less heart and more brain. — TheMadFool
Eliminating bots is not the mods' only problem. — Kenosha Kid
Does literally anything else need to be said about this? That it needed to be said at all is embarrassing. — StreetlightX
but eventually our connection would be shallow, and often lonely. — Caldwell
Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like.
What gives? — TheMadFool
That's a false dichotomy. Throwing tantrums may be unique to humans, but it's hardly what makes one a good human. — baker
That's a false dichotomy. Throwing tantrums may be unique to humans, but it's hardly what makes one a good human — baker
Based on your pupil dilation, skin temperature and motor functions... I calculate an 83% probability that you will not pull the trigger. — Terminator
I would be interested in the topic — Jack Cummins
Instead of doing a Turing test and weeding out chat-bots, they're actually conducting a Reverse Turing Test and expelling real people from internet forums and retaining members that are unfeeling and machine-like. — TheMadFool
Since you put this in the philosophy forum as opposed to the lounge, I'm going to point out that this is a faulty generalization. Just because 'bots cannot simulate feelings does not imply that those who are not 'bots are necessarily like 'bots in respect of not having feelings. There is a whole spectrum between being too passionate, to the point where emotion compromises reason, and having no feelings at all. — Pantagruel
I was banned from a subreddit for commenting that a particular child molester's throat should be cut and his body thrown in a ditch.
The whole site was clamping down on incitements to violence at the time (during the Floyd riots).
It was ok with me tho. It's their subreddit. If they don't want my violent comments, I understand — frank
A generalization that plays a role in my thesis: No chatbots can simulate emotions. Where's the "faulty" generalization? — TheMadFool
Well, in the reverse, that there could be a 'reverse Turing test'. The Turing Test targets chatbots, but the reverse Turing Test doesn't target all 'real human beings', only the set whose exaggerated emotions rise to the level of unreasonable display. So you aren't leaving behind only a "machine-like" residue. It's a faulty generalization. — Pantagruel
In fact, that people can pass the reverse Turing test is why we're all still members of this forum, having outwitted the moderators into thinking we're not human or that we're state-of-the-art chatbots capable of a decent conversation with another human being and not ruffling anyone's feathers along the way. — TheMadFool
What if an AI saved your life? Last I checked, the deep bond that occasionally :chin: forms between a savior and the saved is based wholly on the act, the act of saving and not on the mental/emotional abilities of the savior. Just asking. — TheMadFool
I think putting it this way is meandering away from the point of this thread.
A man paid $100k for a sports BMW equipped with safety features meant to save the driver's life if it flips over multiple times in an accident. Then the accident happened: the car, traveling 100 mph, flipped several times; he got out and walked away from an accident that would normally kill.
To say he formed a deep bond with this machine is sentimentality. One would be very thankful. Amazed. But to call it a deep bond is projecting.
So, going back to the task at hand: can a bot have gut feelings? Do not be fooled by the word "feeling" here. A gut feeling actually operates as intelligence used in decision-making. — Caldwell
It almost seems like we humans secretly aspire to become [more] machine-like and it shows in how forum moderators, not just the ones on this forum, are quick to ban those who go off the deep end. — TheMadFool
Dude, lay off the drama.
[Fully aware that classy stops being classy once one has to explain it ...]
There are four kinds of entities that aren't into drama:
1. chatbots,
2. people who try to be like chatbots,
3. people who just don't like drama,
4. ideally, philosophers.
This is a philosophy forum, and philosophy is supposed to be love of wisdom, not love of drama. Philosophers should exemplify this with their conduct. One of the hallmarks of such conduct is moderation in one's emotional expression. — baker
In some world, chatbots, people who try to be chatbots, and philosophers are part of the same coherent category. — TheMadFool
Not on my planet.
The irony is that philosophers are in the process of becoming more like existing chatbots, emotionally sterile — TheMadFool
I'm sure some are like that. — baker
Not on my planet. — baker
The two look the same at first glance — baker
We all know that between emotions and reason, what AI (artificial intelligence) can't emulate is emotions. — TheMadFool
It's very easy to emulate emotions on a forum. Any time someone makes any assertion, it replies back with phrases like "You're an idiot", "racist", "bigot", etc.
It's actually much more difficult to produce a logical response than an emotional response because it requires more work and energy. — Harry Hindu
Indeed, one is the inability to emote and the other is about control, but what I'm driving at is that the wish to control emotions reveals a secret obsession to be emotionally dead, like existing robots and AI. — TheMadFool
It's very easy to emulate emotions on a forum. Any time someone makes any assertion, it replies back with phrases like "You're an idiot", "racist", "bigot", etc.
It's actually much more difficult to produce a logical response than an emotional response because it requires more work and energy. — Harry Hindu
Well, I suppose some people want to control emotions for such a reason.
But some people follow the path of the samurai.
You're just not allowing for enough detail in this. — baker
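Harry Hindu's "emotional bot" is simple enough to make concrete: it never reads the assertion and fires back a canned insult, while even a minimal "logical" reply must process what was said. A toy Python sketch follows; the canned phrases come from Harry Hindu's comment, and everything else is illustrative rather than any real system.

    import random

    # The "emotional" bot: constant effort, zero analysis of the assertion.
    INSULTS = ["You're an idiot.", "Racist!", "Bigot!"]

    def emotional_reply(assertion: str) -> str:
        return random.choice(INSULTS)  # the assertion is never even read

    def logical_reply(assertion: str) -> str:
        # Engaging with content costs more work than a canned retort; this
        # stand-in at least parses the claim and asks for its premises.
        claim = assertion.strip().rstrip(".")
        return f"What premises support the claim that {claim[0].lower() + claim[1:]}?"

    print(emotional_reply("Moderators are running a reverse Turing test."))
    print(logical_reply("Moderators are running a reverse Turing test."))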