I say this because, regarding whether the thesis is true and whether it is being discussed (which you raised on The Philosophers' Cocoon): in sense (1), the thesis is true (or plausible, at any rate) and it is being discussed; in sense (2), the thesis is not being discussed, and it is not true. — Doug
As for (2), if you don't intend to limit your point to technology, or applied knowledge, then I don't think your claim is being discussed (though I might be wrong). — Doug
Consider whether you think that learning more about mathematics is dangerous. Or learning more about the oceans. Or how non-human animals experience the world. Or the universe. Or physics. There are any number of fields where discovery and the progression of knowledge are valuable that seem to me to undermine your concern. — Doug
There are any number of fields where discovery and the progression of knowledge are valuable that seem to me to undermine your concern. — Doug
It is the vast scale of the powers emerging from the knowledge explosion that makes the historic [progress => mistakes => more progress] process that we are used to obsolete. — Jake
I'm pleased to engage with you on this. I didn't intend my initial post to be an objection to your view (and certainly not to the presentation of it here, which I think is written quite nicely). Rather, I was sharing one academic philosopher's take on whether the issue is being discussed in philosophy scholarship as well as what I took to be the plausibility of the view. — Doug
Knowledge is a broader category than technology, which seems to be a species of knowledge. It seems to me that your view is strongest when applied to technology. But there are other species of knowledge that don't seem so obviously problematic in the way you suggest. So, it would be interesting to see if you could extend your view to other species of knowledge. For instance, mathematical knowledge. — Doug
Jake, your ideas are interesting, but before I buy into them, it would be useful to see some empirical data. — Nathan
I agree with the objections raised by other posters. The quoted part is a generalisation that you don't really provide an argument for. — ChatteringMonkey
It is the vast scale of the powers emerging from the knowledge explosion that makes the historic [progress => mistakes => more progress] process that we are used to obsolete. — Jake
Research is funded by countries, and countries are vying for control and economic gain. — ChatteringMonkey
The point here is that it's not their attitude towards knowledge that is driving their research policies. — ChatteringMonkey
I'll list my objections to the argument in a more organised manner: — ChatteringMonkey
With knowledge, however, our relationship to it doesn't really matter, because we, as personal actors, don't produce the knowledge that gives rise to the kind of risks we are talking about. It's only state-funded research that does that. — ChatteringMonkey
2. I don't think you have justified the generalisation from one or a few examples to all of knowledge. I don't disagree with the examples you gave, but as of yet I don't see reasons to conclude that this is necessarily the case for all knowledge or even most knowledge. This argument needs to be made, unless you are settling for the less general claim that only some knowledge holds dangers that we should take into account. — ChatteringMonkey
If you are settling for the less general claim, then I don't think this is that controversial. Most funding agencies and research organisations already have advisory and ethical boards that are supposed to look into these issues. — ChatteringMonkey
death of the entire human species is not a definite outcome of nuclear war — Marcus de Brun
Ok, but in democracies at least, we are the state; it's our money and our votes which drive the system. Each of us individually has little impact personally, but as a group we decide these questions. If any major changes were to be deployed by governments they would require buy-in from the public. Even in dictatorships, the government's room for maneuver is still limited to some degree by what they can get the population to accept. — Jake
Yes, agreed, but... Can you cite cases where the science community as a whole has agreed to not learn something? There may be some, we could talk about that. — Jake
Generally, I would say, people don't really have well-thought-out ideas about most issues, including what our policies about research should be. — ChatteringMonkey
I mean, it's a bit of a complex topic to do justice to here in this thread, but I think it's more the other way around: politicians and policymakers decide and then convince the public to adopt their views. — ChatteringMonkey
If you want to prevent certain research... — ChatteringMonkey
In my country there is a policy that prevents funding research with direct military applications. — ChatteringMonkey
If you want to prevent certain research... — ChatteringMonkey
Do you want to prevent certain research? A question to one and all... — Jake
Generally no, I don't think so, mostly for practical reasons. But I would accept exceptions if there are really good arguments to do so. The principal reason is that I don't think the knowledge itself is inherently dangerous, it's the technological applications that can be. — ChatteringMonkey
But maybe the biggest problem I have with trying to prevent research is that I don't think it will work. — ChatteringMonkey
Ok, but if the knowledge exists and offers some ability to manipulate our environment, isn't somebody going to turn that knowledge into a technological application? Is there really a dividing line between knowledge and technology in the real world? — Jake
Will not preventing some research work? Don't we have to ask this too?
Let's recall the Peter Principle, which suggests that people will tend to be promoted up the chain until they finally reach a job they can't do. Isn't civilization in about that position? If we can't or won't limit knowledge, doesn't that mean that we will keep receiving more and more power until we finally get a power that we can't manage?
Hasn't that already happened?
If I walked around with a loaded gun in my mouth all day, every day, would you say that I am successfully managing my firearm just because it hasn't gone off yet? Aren't nuclear weapons a loaded gun in the mouth of modern civilization?
I propose that my concerns are not futuristic speculation, but a pretty accurate description of the current reality. — Jake
These are certainly reasonable questions, and I agree that there are some serious issues, but I don't think we have all that much control over the direction we are heading. The only way is forward, it seems to me. Technologies will possibly bring new risks, but possibly also new solutions and ways to manage those risks. — ChatteringMonkey
The only way is forward, it seems to me. — ChatteringMonkey
Technologies will possibly bring new risks, but possibly also new solutions and ways to manage those risks. — ChatteringMonkey
Jake, I'm basically suggesting that there is a third possibility, namely that your thesis might be right AND that, even so, not a whole lot will be done about it in the short term. — ChatteringMonkey
Now the thesis in your opening post, while it may have its merits, deals only with possibilities, not certainties. — ChatteringMonkey
"This project aims to fill this gap by creating a transdisciplinary and multi-level theory of technological change and resistance in social systems, which will analyze the factors and societal forces that work against technology adoption, the consequences of this resistance, and the best mechanisms to overcome it."
"Resistance to technological innovation and new business models is, however, not new. It has indeed a long history in the West: attacks on Gutenberg’s printing press in the late 15th century or the protests of horse carriage drivers against motorized cars at the beginning of the 20th century precede the current growing discontent with technological change."
We human beings are simply not emotionally and cognitively configured to manage the consequences of having powerful knowledge over the long run. — Bitter Crank
Daniel Ellsberg [The Doomsday Machine: Confessions of a Nuclear War Planner] documents how these planners had calculated damage on the basis of megatons of explosive power, but had not taken into account the resulting firestorms that the hot blasts would cause. — Bitter Crank