There is much talk afoot of science being to blame for today's woes. — Bret Bernhoft
Therefore to resist the march of science [...] — Bret Bernhoft
The problem is not science, it is the abuse of scientific knowledge, among other things. But scientism probably exacerbates anti-science sentiment because it blurs the distinction between science and non-science. — EnPassant
But really, the question is what do you consider to count as science? — Tom Storm
Scientism is the view that science is the best or only objective means by which society should determine normative and epistemological values. — Raymond
Which is another reason why I'm a techno-optimist — Bret Bernhoft
But I was unable to review the critique, as the article is behind a paywall and I do not have a NYT subscription. — Bret Bernhoft
[...] In pursuit of this accelerated post-Singularity future, any harm they’ve done to the planet or to other people is necessary collateral damage. It’s the delusion of people who’ve been able to buy their way out of everything uncomfortable, inconvenient or painful, and don’t accept the fact that they cannot buy their way out of death. — Joshs
My essential point is that advances in technology are inherently good.
We can, in seconds, accomplish what would otherwise have taken countless hours, such as analyzing 86,000+ lines of text about Norse Paganism with a simple Python script that I wrote.
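The thread does not include the script itself, so what follows is only a minimal sketch of that kind of analysis: a word-frequency count over a large plain-text corpus. The filename norse_paganism.txt and the tokenizing regex are hypothetical stand-ins, not details from the original post.

```python
# Minimal sketch: rank the most frequent words in a large plain-text corpus.
# "norse_paganism.txt" is a hypothetical stand-in for the actual file.
from collections import Counter
import re

counts = Counter()
with open("norse_paganism.txt", encoding="utf-8") as corpus:
    for line in corpus:
        # Lowercase and keep runs of letters/apostrophes as tokens.
        counts.update(re.findall(r"[a-z']+", line.lower()))

# Report the twenty most common words and their counts.
for word, n in counts.most_common(20):
    print(f"{word}: {n}")
```

Even on a corpus of 86,000+ lines, a pass like this completes in moments, which is the point being made about scale.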
Most recently, there was the example of the AI system designed to evaluate human emotion from faces, which identified an inordinate number of black people as angry. — Pantagruel
I understand this is simply an illustrative example for your sensible point, but on this particular point programmer bias is likely not the reason. If the AI detects them as angry, there must be a reason why; surely the programmer did not hard-code "if race == black {emotion = angry}". It could be that the black people in the sample indeed have angrier faces than other races for whatever reason, but it could also be that the data fed to the AI showed angry people as mostly black — though a Google query for "angry person" shows almost only whites. — Lionino
In the example I gave, each image in the original training dataset had to be labeled as representing "joy", "surprise", "anger", etc. And the categorizations of the images of black people were found to reflect the selection bias of the developers (who did the categorizing). — Pantagruel
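Neither the dataset nor the audit method is given in the thread, but a first check for the kind of labeling bias Pantagruel describes is to compare how often each label was assigned across demographic groups in the training set. The sketch below assumes a hypothetical CSV of annotations, training_labels.csv, with "group" and "label" columns.

```python
# Minimal sketch: tally how each emotion label is distributed per group.
# The CSV layout (columns "group" and "label") is a hypothetical assumption.
import csv
from collections import Counter, defaultdict

label_counts: dict[str, Counter] = defaultdict(Counter)
with open("training_labels.csv", encoding="utf-8", newline="") as f:
    for row in csv.DictReader(f):
        label_counts[row["group"]][row["label"]] += 1

# Print each label's share within each group; a label heavily
# over-represented for one group points at biased annotation or sampling.
for group, counts in sorted(label_counts.items()):
    total = sum(counts.values())
    for label, n in sorted(counts.items()):
        print(f"{group:10s} {label:10s} {n / total:6.1%}")
```

A label that is sharply over-represented for one group, such as "anger" in this example, flags the bias in the annotations themselves, before any model is ever trained on them.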