Hi Doug, many thanks for your thoughtful reply. I'm delighted that the post on the Cocoon site paid off.
First, yes, of course the piece above can be improved. I see it as a kind of first draft that I'm submitting to the group mind for feedback.
However, to argue against what I just said, it seems to me the issue itself is more important than the form of presentation. For example, if I noticed your house was on fire, the important thing would be that I said something, much less how exactly I said it.
I say this because in regards to whether the thesis is true and whether it is being discussed (which you raised on The Philosophers' Cocoon), in the sense (1) the thesis is true (or plausible, at any rate), it is being discussed, and in the sense that (2) the thesis is not being discussed, the thesis is not true. — Doug
Apologies, I don't quite understand you here. If your time permits, could you try again?
Yes, I agree the concern I'm expressing has been addressed with regard to particular technologies, for example, genetic engineering.
What I'm not seeing (perhaps because I don't know where to look) is a broader discussion of our relationship with knowledge itself. It seems to me the underlying problem is that we're failing to adapt our "more is better" relationship with knowledge to meet the new environment created by the success of that paradigm. As I see it, we're assuming without much questioning that what has always worked well in the past will continue to work for us in the future, and I don't believe that to be true.
The entire system is only as strong as the weakest link, and human maturity is a very sketchy business indeed. As I noted above, the vast scale of the powers being developed would seem to require greatly enhanced judgment and maturity from us, and it doesn't seem that we can evolve as fast as knowledge and technology can.
As for (2) if you don't intend to limit your point to technology, or applied knowledge, then I don't think your claim is being discussed (though I might be wrong). — Doug
I do have another line of discussion regarding our relationship with knowledge that is probably best saved for a religion-flavored conversation, and I'm not introducing it here so as not to muddy the waters. The first post opens a big enough can of worms for one thread.
Consider whether you think that learning more about mathematics is dangerous. Or, learning more about the oceans. Or how non-human animals experience the world. Or the universe. Or physics. There are any number of fields where discovery and the progression of knowledge is valuable that seem to me to undermine your concern. — Doug
I would argue the following. It's indisputable that the knowledge explosion is delivering more benefits than I could begin to list; I agree with that entirely. However, none of it matters if we crash civilization, because then all those benefits will be swept away.
And there is a very real possibility that such a crash will happen, given that the machinery for that unhappy day is already in place, ready to go at the push of a button, or the next time somebody screws up. In my opinion, the appropriate context for this discussion is the state of mind we would bring to it if someone had a gun to our head, because that is literally the case.
Finally, and I apologize for this, but I've just spent every day for months on a prominent group philosophy blog that publishes daily, where after two years nuclear weapons have been mentioned only once, only briefly, and only after much hounding from me. It's on that experience, and many similar ones, that I base my question of whether this subject is being adequately addressed by intellectual elites.
Enough from here. Thanks again for engaging and your further comments are most appreciated should your time permit.