• Jake
    1.4k
    This article will argue that the "more is better" relationship with knowledge, which is the foundation of science and our modern civilization, is simplistic, outdated, and increasingly dangerous.

    Let's start with a quick analogy which can provide a glimpse of where we're headed.


    Our Evolving Relationship With Food

    For most of our history humans have lived near the edge of starvation much of the time. In this context of scarcity, a "more is better" relationship with food was entirely reasonable. In our time food is plentiful and readily available in much of the world, and where that's true more people die of obesity-related diseases than of starvation.

    The point here is that a "more is better" relationship with food which was entirely rational for a very long time in an era of food scarcity became outdated and dangerous when transported to a different era characterized by a food explosion. We lucky moderns are required to replace the simplistic "more is better" food paradigm from the earlier era with a more intelligent and sophisticated relationship which can involve somewhat complicated cost/benefit calculations.

    And if we decline to adapt, he said while glancing down into his lap at those twenty pounds that didn't use to be there, well, the price tag may very well be an unwelcome trip to the emergency room or morgue.


    Our Evolving Relationship With Knowledge

    This is where we are in our relationship with knowledge as well. The simplistic "more is better" relationship with knowledge which served us so well for so long must now adapt to meet the challenge of the new environment which its success has created. To understand why, let's remind ourselves of some basic facts about the knowledge explosion currently underway.

    First, the modern knowledge explosion obviously brings many benefits, way more than can be listed here, more than our ancestors could have even dreamed of. And although mistakes, missteps and even epic calamities such as technology-powered global wars do occur, so far we've always managed to clean up the mess, fix the error, learn the lessons, and continue with progress. So what's the problem?

    To understand the threat posed by operating from an outdated relationship with knowledge we need to examine the issue of scale. It is the vast scale of the powers emerging from the knowledge explosion that makes the historic [progress => mistakes => more progress] process that we are used to obsolete.


    Here's What's Great About Nuclear Weapons

    Luckily, for the purposes of this article at least, nuclear weapons provide a very easily understood example of how powers of vast scale change the threat landscape by erasing the room for error. As you know, the nuclear stockpiles of the great powers will have to be managed successfully every single day, forever, for as long as those weapons exist.

    The key thing to note here is that as far as the future of humanity goes, successfully managing such vast power most of the time is no longer sufficient. Doing a pretty good job no longer works. Making a mistake and then fixing it is no longer an option.

    In the nuclear era the room for error we've always counted on in the past is erased, and one bad day is all it takes to end the possibility for further progress. This is what defines the revolutionary new situation we now find ourselves in, a situation which demands perfection from us.
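    To make "one bad day" concrete, here is a back-of-the-envelope sketch in Python. The daily failure probability is purely illustrative, a number invented for the sake of the arithmetic; the point is only that any fixed, nonzero chance of catastrophe, compounded day after day, drifts toward certainty.

        # Illustrative only: an invented daily failure probability, not an estimate.
        # If catastrophe must be avoided every single day, survival odds compound down.
        p_daily = 1e-5  # hypothetical chance of one catastrophic "bad day"

        for years in (10, 100, 1000):
            days = years * 365
            survival = (1 - p_daily) ** days
            print(f"{years:>5} years: P(no catastrophe so far) = {survival:.3f}")

        # Roughly 0.96 after 10 years, 0.69 after 100, and about 0.03 after 1000.

    Whatever the true daily number is, if it isn't exactly zero, the arithmetic runs in only one direction.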


    And Now The Bad News

    It would be a mistake to read this article as an act of nuclear weapons activism. I've referenced nuclear weapons here only because they are an easily accessed illustration of the far larger problem which is their source: our outdated relationship with knowledge.

    If nuclear weapons were to all be abducted by aliens, the underlying "more is better" knowledge development process which created the nuclear threat would continue to create more vast powers with the potential for crashing civilization.

    Each emerging power of such vast scale will have to be successfully managed every single day forever because a single mistake with a single such power a single time is sufficient to crash the system and prevent the opportunity for renewal.


    And Now The Really Bad News

    A key fact of the knowledge explosion is that it feeds back upon itself, creating an ever-accelerating unfolding of new knowledge, and thus new powers. So not only will emerging powers be larger than what we could produce in the past, and not only will there be more such vast powers than there are currently, but they will arrive on the scene at an ever-faster pace.

    Ever more, ever-larger powers, delivered at an ever-faster pace.

    Each of these accelerating factors (scale, number, and speed) needs to be graphed against the glacial pace of human maturity development.



    And They Said It Couldn't Get Worse

    Yep, you guessed it, the most challenging factor is us.

    There is actually nothing in thousands of years of human history which suggests that we are capable of the consistently perfect management which powers of vast scale require.

    We've been able to survive repeated episodes of murderous insanity and other such mistakes in the past only because the powers available to us were limited. For example, we threw conventional explosives at each other with wild abandon in WWII, and were saved from total destruction only because conventional explosives simply aren't powerful enough to crash civilization.

    A simplistic "more is better" relationship with knowledge contains within itself the assumption that human beings will be able to successfully manage any amount of power which emerges from that process.

    Simple common sense reveals this assumption to be a wishful-thinking fantasy. To those who have children this should be obvious. We sensibly limit the powers available to kids out of the realistic understanding that their ability to manage power is limited.

    But then we assume that when children become adults they somehow magically acquire the ability to successfully manage any amount of power that the knowledge explosion may deliver. The irrationality of this assumption is proven beyond doubt by the thousands of hair-trigger hydrogen bombs we adults have aimed down our own throats, a stark reality we rarely find interesting enough to comment upon.



    A Problem Of Respect

    A great irony of our time is that while we typically compete with each other to see who is the greatest supporter of science, we don't actually respect the awesome power of knowledge. What we respect instead is ourselves, our ability to develop knowledge.

    What we seem not to grasp in our self-flattering immaturity is that knowledge is not our personal property but rather a force of nature which must be respected, managed, controlled, limited, just like water and electricity, and perhaps even more so.

    This necessity spells the end of the simplistic "more is better" relationship with knowledge which has defined our past. The extraordinary success of the knowledge explosion has created a revolutionary new environment which we must adapt to. And as is true for all species in all environments, a failure to adapt typically results in a predictable outcome.
  • Doug
    3
    Here are a few thoughts from an academic philosopher.

    I think this is a good, provocative read. It might benefit from some refining, though. I say this because, in regards to whether the thesis is true and whether it is being discussed (which you raised on The Philosophers' Cocoon): in the sense in which (1) the thesis is true (or plausible, at any rate), it is being discussed; and in the sense in which (2) the thesis is not being discussed, the thesis is not true.

    As far as (1), your thesis is about knowledge but your discussion seems to be about technology in particular. Limited to the progress of technology, it seems plausible to say that there is a legitimate concern about whether technology is advancing at too fast a pace for humans to survive. I think this is a well-worn issue in philosophy and outside of it. I'm no expert, but in contemporary discussions one likely finds this sort of concern being debated in applied/medical ethics and in STS fields. The earliest example I can think of being raised in the history of philosophy is Plato's critique of writing (a technology) in 'Phaedrus'.

    As for (2), if you don't intend to limit your point to technology, or applied knowledge, then I don't think your claim is being discussed (though I might be wrong). But it doesn't seem plausible to me. Consider whether you think that learning more about mathematics is dangerous. Or learning more about the oceans. Or how non-human animals experience the world. Or the universe. Or physics. There are any number of fields where discovery and the progression of knowledge are valuable that seem to me to undermine your concern.

    If you don't intend to limit your point to technology, you might want to refine it along the following lines: any knowledge that can have practical applications is dangerous, and to think that more is better is wrong.
  • Jake
    1.4k
    Hi Doug, many thanks for your thoughtful reply. I'm delighted that the post on the Cocoon site paid off.

    First, yes, of course the piece above can be improved. I see it as a kind of first draft which I am submitting to the group mind for feedback.

    However, to argue against what I just said, it seems to me the issue itself is more important than the form of presentation. For example, if I noticed your house was on fire, the important thing would be that I said something; exactly how I said it would matter much less.

    I say this because, in regards to whether the thesis is true and whether it is being discussed (which you raised on The Philosophers' Cocoon): in the sense in which (1) the thesis is true (or plausible, at any rate), it is being discussed; and in the sense in which (2) the thesis is not being discussed, the thesis is not true.Doug

    Apologies, I don't quite understand you here. If your time permits could you try again?

    Yes, I agree the concern I'm expressing has been addressed in regards to particular technologies, for example, genetic engineering.

    What I'm not seeing (perhaps because I don't know where to look) is a broader discussion of our relationship with knowledge itself. It seems to me the underlying problem is that we're failing to adapt our "more is better" relationship with knowledge to meet the new environment created by the success of that paradigm. As I see it, we're assuming without much questioning that what has always worked well in the past will continue to work for us in the future, and I don't believe that to be true.

    The entire system is only as strong as the weakest link, and human maturity is a very sketchy business indeed. As I noted above, the vast scale of the powers being developed would seem to require greatly enhanced judgment and maturity from us, and it doesn't seem that we can evolve as fast as knowledge and technology can.

    As for (2), if you don't intend to limit your point to technology, or applied knowledge, then I don't think your claim is being discussed (though I might be wrong).Doug

    I do have another line of discussion regarding our relationship with knowledge that is probably best discussed in a religion-flavored conversation, and I'm not introducing that here so as not to muddy the waters. The first post opens a big enough can of worms for one thread.

    Consider whether you think that learning more about mathematics is dangerous. Or learning more about the oceans. Or how non-human animals experience the world. Or the universe. Or physics. There are any number of fields where discovery and the progression of knowledge are valuable that seem to me to undermine your concern.Doug

    I would argue the following. It's indisputable that the knowledge explosion is delivering too many benefits to begin to list. I agree with this entirely. However, none of that matters if we crash civilization, because then all those benefits will be swept away.

    And there is a very real possibility that such a crash will happen, given that the machinery for that unhappy day is already in place, ready to go at the push of a button. Or the next time somebody screws up. In my opinion, the appropriate context for this discussion would be that state of mind we would bring if someone had a gun to our head, because that is literally true.

    Finally, and I apologize for this, but I've just come from spending every day for months on a prominent group philosophy blog that publishes daily, where in 2 years nuclear weapons have been mentioned only once, only briefly, and only after much hounding from me. It's upon that experience and many other similar ones that I'm questioning whether this subject is being adequately addressed by intellectual elites.

    Enough from here. Thanks again for engaging and your further comments are most appreciated should your time permit.
  • Jake
    1.4k
    There are any number of fields where discovery and the progression of knowledge are valuable that seem to me to undermine your concern.Doug

    Doug's sentence above seems a good summary of the objections many or most people would have to the opening post, so let's focus on this a bit.

    Everyone understands that the knowledge explosion has brought many benefits, and some problems too. We view the history, decide that the benefits outweigh the problems, and thus conclude the knowledge explosion should continue as it has over the last 500 years. This is a culture-wide assumption that is generally taken to be an obvious given, thus the assumption typically doesn't receive much attention.

    This cost/benefit analysis was entirely reasonable and rational in the past, prior to the emergence of vast powers with the ability to crash the system. In the new reality we've been living in since, say, the 1950s, the old cost/benefit analysis falls apart; it is made obsolete. In today's world it doesn't matter if the benefits outweigh the costs 1000 to 1 if the cost is the crashing of modern civilization, because such a crash would erase all the benefits.

    In the past, the cost of the knowledge explosion was that various problems would arise that would then have to be understood, fixed and cleaned up. And then progress would continue. This was a reasonable formula because progress always did continue in spite of various problems which arose.

    Today, the potential cost of the knowledge explosion includes the end of modern civilization, the end of the knowledge explosion, the end of progress, and the end of an ability to fix our mistakes.

    The argument being presented here is that we are attempting to operate from ancient "more is better" assumptions that were long entirely rational, in a new environment that is radically different. Our philosophy is not keeping up with our technology.

    The argument is that this failure of our philosophy to adapt to new conditions is the central issue facing modern culture, and that generally speaking intellectual elites are failing to give this situation adequate focus, seeing it instead as one of a thousand issues that might be examined.
  • Doug
    3
    Hi Jake,

    I'm pleased to engage with you on this. I didn't intend my initial post to be an objection to your view (and certainly not to the presentation of it here, which I think is written quite nicely). Rather, I was sharing one academic philosopher's take on whether the issue is being discussed in philosophy scholarship as well as what I took to be the plausibility of the view.

    To clarify my point: I think we can distinguish between knowledge and technology. (Within technology, I also think we can distinguish between technological knowledge and the application of technological knowledge. For instance, we may discover how to make a nuclear bomb, but to apply this knowledge--and actually make one--is something different. Yet, I would not deny that if we have the technological knowledge, someone will apply it, or try to. I think this is at least part of your insight. And I think we see with, for instance, AI, this is the case.) Knowledge is a broader category than technology, which seems to be a species of knowledge. It seems to me that your view is strongest when applied to technology, but that there are other species of knowledge that don't seem so obviously problematic in the way you suggest. So, it would be interesting to see if you could extend your view to other species of knowledge. For instance, mathematical knowledge. It doesn't seem to me that learning the next digit of pi is problematic in the way that nuclear technology is. But without proving that all species of knowledge endanger us or without limiting the type(s) of knowledge you have in mind, your argument is not as convincing as it could be.

    Moreover, given the dangers we face in the world today, it seems that knowledge is our best way out of some of them. For instance, while nuclear technology is troubling, perhaps the more serious problem humanity faces is climate change. In many respects, it seems the best and only way to save ourselves is with knowledge. People who deny the human contribution to climate change need to come to know that we are playing a big role in the problem before they will be ready to change. Alternatively, presumably those who do deny the human role should at least think it is important to figure out what is causing it. Moreover, we need to learn how we can stop it--what are the most effective ways to slow it? Some people also think that technology will help us by, e.g., coming up with a technique to remove excess carbon.

    If this is right, then even if some kinds of knowledge are dangerous, others might really help us out on the global scale. So, we need to determine the likelihood of nuclear war and climate catastrophe. (But doing this requires knowledge.)

    Another form of knowledge that would help would be increased moral knowledge. Along with this, if we had more and better knowledge about how to educate people morally, then we'd be in better shape. Again, this might be the only thing that can obviate the dangers certain technologies pose. One might deny that moral knowledge is possible or that we can learn better ways to make people morally better, but these are arguments that need to be made.

    At any rate, I think your view is worthwhile and would be a welcome addition to philosophical discussions. In case you don't know about it https://philpapers.org/ is a great place to see if there are any other thinkers who are raising similar concerns.
  • Nathan
    1
    Jake, your ideas are interesting, but before I buy into them, it would be useful to see some empirical data. You say "But then we assume that when children become adults they somehow magically acquire the ability to successfully manage any amount of power that the knowledge explosion may deliver. The irrationality of this assumption is proven beyond doubt..."

    I'm sure some people would doubt this, and to really prove to them you are right, it would be helpful to cite some studies that demonstrate how adults continue to have problems managing the amount of power they have. You may consider reading Dan Ariely's book Predictably Irrational. It looks at how human psychology often encourages irrational decisions in ways we don't often recognize. Some of his insights could be relevant to the argument you are making.

    One final point: you use arguments by analogy several times in this post. This isn't bad, but it can be dangerous as there is a fallacy called "false analogy" that is easy to fall into if you're not careful. You might consider reading this post to learn about the fallacy and this article to learn how to avoid it.
  • ChatteringMonkey
    1.3k
    It is the vast scale of the powers emerging from the knowledge explosion that makes the historic [progress => mistakes => more progress] process that we are used to obsolete.Jake

    I agree with the objections raised by other posters. The quoted part is a generalisation that you don't really provide an argument for. That some knowledge enables powerful technologies which hold risks that we may not be able to manage doesn't mean that all or even most do. Therefore I also don't see how it follows that we would need to change our overall attitude to knowledge.

    It would seem to be enough that we try to identify which knowledge has the potential for this kind of 'powers', and change our attitude to those. Of course you could then argue that it might not be possible to identify them in advance etc... but still this case has to be made I think for your overall argument to work. Or maybe you'd just say that changing our attitude to some knowledge is already changing our overall simplistic attitude... ok fine, I think I would agree with that.

    I do think, however, that there is an even more fundamental problem here than the mere realisation that more knowledge is not always better. Research is funded by countries, and countries are vying for control and economic gain. I think at least some of the people involved know there are risks, but choose to ignore them because they can't count on other countries not going ahead with it.

    Take for instance AI. China, the USA, and also the EU, although lagging behind, all invest enormous amounts of money in AI research. They know there are potential risks, but they also know that the ones leading the race will have an enormous economic advantage over the rest. The point here being that it's not their attitude towards knowledge that is driving their research policies.
  • Jake
    1.4k
    Wow, this is great. Excellent discussion guys! Thank you for that.

    I'm pleased to engage with you on this. I didn't intend my initial post to be an objection to your view (and certainly not to the presentation of it here, which I think is written quite nicely). Rather, I was sharing one academic philosopher's take on whether the issue is being discussed in philosophy scholarship as well as what I took to be the plausibility of the view.Doug

    Doug, I hope you will feel free to comment in any direction your reasoning takes you. I welcome objections and challenges of all kinds. I'm very enthusiastic, but not delicate. I'm posting to receive assistance in uncovering weaknesses in these ideas. And I would very much welcome an introduction to any intellectual elites or others who are addressing these topics.

    Knowledge is a broader category than technology, which seems to be a species of knowledge. It seems to me that your view is strongest when applied to technology, but that there are other species of knowledge that don't seem so obviously problematic in the way you suggest. So, it would be interesting to see if you could extend your view to other species of knowledge. For instance, mathematical knowledge.Doug

    Tell me if this helps.

    I'm not against knowledge, but am instead arguing for the development of a more mature understanding of our relationship with knowledge and power. I'm really arguing for more knowledge in a particular direction.

    The food analogy I referred to in my opening post might help. Obviously food is not bad, but essential. But a "more is better" relationship with food no longer works in an era of food abundance. And so we have to make more sophisticated decisions about what to eat, how much to eat, when to eat etc. Note that this inevitably involves saying no to some food we might like to consume.

    If we apply the food analogy to knowledge, we might define the challenge as:

    1) how do we understand what knowledge to say no to, and...
    2) how do we create a consensus on that decision?

    This is a difficult business indeed, because while we divide knowledge into tidy categories within our minds, in the real world everything is connected to everything else. For example, while mathematical knowledge seems harmless when considered by itself, it is in part mathematical knowledge which makes nuclear weapons possible.

    Thanks for the link to philpapers. I'm just in the process of discovering it and it does seem the kind of resource I'm asking for. Thanks again for your contributions here.
  • Marcus de Brun
    440

    Is the premise here of a 'knowledge explosion' valid? I think not. The equation of knowledge and food is also unsound, as food is physical and has physical consequence whereas knowledge is non-physical.

    To assert there has been an increase in knowledge with time would presume that old knowledge is preserved whilst new knowledge is added. This does not make sense. Technology reduces the need for knowledge, and arguably knowledge decreases with technology. 200 years ago the average man had the knowledge to construct his own dwelling, had to travel without maps, had to cure his ills with herbs, grow his own food, and quite often make his own music and entertainment... Etc.

    Today the average man has most of these tasks accomplished for him without his having to think about them.

    Technology and the loss of cultural knowledge through globalisation, might easily argue for an intellectual contraction rather than a knowledge explosion.
  • Jake
    1.4k
    Hi Nathan, thanks for joining us.

    Jake, your ideas are interesting, but before I buy into them, it would be useful to see some empirical data.Nathan

    First, don't buy into the ideas, kick the tires as hard as you can. Members would actually be doing me a favor if they could liberate me from this line of thought, as it's become an all-consuming obsession, which isn't such a great plan if the ideas are incurably flawed.

    Next, I hear you about empirical data and studies, but as you know, I'm not the most qualified person to produce that. This issue is way too big for any one person, so I'm hoping that by engaging philosophers, scientists and other intellectual elites we can bring much greater firepower to bear on the issue than any of us could ever provide on our own. So given that you are an academic yourself, I would bounce this ball back into your court and hope that you might use your connections to engage some of your highly educated peers on this issue, either here, on your blog, or wherever they are willing to engage.

    I'll read the links you shared regarding analogy problems, thanks. I do struggle trying to find the best way to express these ideas.
  • Jake
    1.4k
    Hi ChatteringMonkey, thanks for engaging.

    I agree with the objections raised by other posters. The quoted part is a generalisation that you don't really provide an argument for.ChatteringMonkey

    To keep things tidy, you were referring to this...

    It is the vast scale of the powers emerging from the knowledge explosion that makes the historic [progress => mistakes => more progress] process that we are used to obsolete.Jake

    I could likely use help in making the argument clearer. Here's another try.

    Let's consider the relationship between conventional explosives and nuclear bombs.

    NORMAL SCALE: With conventional explosives we can make big mistakes (i.e., WWII) and then clean up the mess, learn from the mistake, and continue with progress. [progress => mistakes => more progress]

    VAST SCALE: With nuclear weapons the "clean up the mess", "learn from the mistake" and "continue with progress" parts of the process are removed, at least in the case of war between the major powers. With powers of vast scale the formula is: [perfect management OR death].

    Research is funded by countries, and countries are vying for control and economic gain.ChatteringMonkey

    Yes, this is a huge problem, agreed. As humanity is currently organized, in competitive groupings so often led by power-hungry psychopaths, what I'm describing is beyond challenging.

    I do doubt that this status quo can be substantially edited through the processes of reason. However, there is another possibility: pain. For example, consider Europe. Even though Europe is the home of Western rationality, Europeans still conducted ceaseless insane war upon each other for centuries. But then the pain of warfare became too great in WWII, and now they are united in peace like never before.

    It seems inevitable to me that sooner or later somebody is going to set off a nuke in a big city somewhere in the world. That will be a revolutionary historic event which will bring with it the possibility of revolutionary historic change.

    The point here being that it's not their attitude towards knowledge that is driving their research policies.ChatteringMonkey

    Another good point. Yes, it's their relationship with power, which is what drives our relationship with knowledge. We usually don't pursue knowledge just for itself, but for the power it contains. I like this way of looking at it, as you're helping us dig deeper into the phenomenon. It might be useful to rephrase the question as our "more is better" relationship with power.
  • Marcus de Brun
    440
    With powers of vast scale the formula is: [perfect management OR death].Jake

    There are no absolutes in nature. Outside of nature ITSELF there are no perfections, and as we humans exist within nature, imperfect management of all managed systems is the norm. Failures in the management of nuclear weapons occur constantly, thankfully so far without vast consequence. The most recent failure in nuclear weapons management was the election of Donald Trump and his tweets/conversations about the size of his nuclear 'button'. The election of a moron to a position of authority over the US nuclear arsenal is an example of imperfect management.

    Management systems within nature require such imperfections if they are to evolve with nature in her totality.

    Your notion of death is equally problematic; death of the entire human species is not a definite outcome of nuclear war. Human existence would not have been possible without the extinction of the dinosaurs.

    Human 'knowledge' does not become more of a threat because it is expanding; that is an oxymoronic suggestion. If human knowledge were truly expanding, then by definition the threat to human existence (caused by human beings) would be decreasing rather than increasing.

    The increasing threats posed to humanity, by humanity itself, should be evidence enough to prove that knowledge is not currently expanding; it is contracting. Knowledge is changing, but it is not increasing or expanding; it is instead diminishing with our dependence upon technology. This dependence removes us from nature and makes us less knowledgeable of nature, and less aware of our effect upon the natural system(s) that sustains us. Our contracting knowledge in respect of nature renders ecological collapse an inevitability.

    Technology and capitalism shield mankind from knowledge of the ecological and humanitarian effect of the transaction and/or the consumptive act. They accomplish this by destroying or engineering the contraction of knowledge through an encouraged and engineered dependence upon technology.

    Capitalism is entirely dependent upon a possible 'knowledge expansion' of the few (the capitalist owners of technology), and the contraction of knowledge within the majority of humans that make up 'the market'. However, once again, the knowledge expansion of capitalist technocrats comes at the price of their willingness to sacrifice philosophical knowledge (the knowledge of consequence, for example).

    If the human subject is to be wooed into the consumptive act, particularly if he/she is to be manipulated into buying product that is unnecessary, he must be rendered LESS knowledgeable. Either he must become unable to make the product himself, or he must lose the knowledge that allowed him to survive without the product, and he/she must lose the moral knowledge of the consequence(s) of the consumptive act. Capitalism and the market are entirely dependent upon knowledge contraction.

    M
  • ChatteringMonkey
    1.3k


    I'll list my objections to the argument in a more organised manner:

    1. I don't think the analogy with food works. With food our relationship to it is relevant, because when we eat too much it affects our health. With knowledge, however, our relationship to it doesn't really matter, because we, as personal actors, don't produce the knowledge that gives rise to the kind of risks we are talking about. It's only state-funded research that does that. So what matters is the way in which research policies are determined, and I would argue that our relationship to knowledge only marginally influences that at best.

    2. I don't think you have justified the generalisation from one or a few examples to all of knowledge. I don't disagree with the examples you gave, but as of yet I don't see reasons to conclude that this is necessarily the case for all knowledge or even most knowledge. This argument needs to be made, unless you are settling for the less general claim that only some knowledge holds dangers that we should take into account.

    3. If you are settling for the less general claim, then I don't think this is that controversial. Most funding agencies and research organisations already have advisory and ethical boards that are supposed to look into these issues. So the idea that science also entails risks is already incorporated into the current process of funding and doing research. What might still be a problem, though, is that ultimately these considerations are maybe not given enough weight by those deciding the research policies, because they deem other things more important, i.e. power and economics. But then the problem is not one of an outdated view on knowledge, but rather a problem of valuation (i.e., they value power and the economy so much that they are willing to ignore the risks).
  • Jake
    1.4k
    I'll list my objections to the argument in a more organised manner:ChatteringMonkey

    Thank you ChatteringMonkey, good plan. I've been meaning to break the thesis up into a series of concise assertions so that readers can more easily tell us where they feel problems exist.

    With knowledge, however, our relationship to it doesn't really matter, because we, as personal actors, don't produce the knowledge that gives rise to the kind of risks we are talking about. It's only state-funded research that does that.ChatteringMonkey

    Ok, but in democracies at least, we are the state; it's our money and our votes which drive the system. Each of us individually has little impact personally, but as a group we decide these questions. If any major changes were to be deployed by governments they would require buy-in from the public. Even in dictatorships, the government's room for maneuver is still limited to some degree by what they can get the population to accept.

    2. I don't think you have justified the generalisation from one or a few examples to all of knowledge. I don't disagree with the examples you gave, but as of yet I don't see reasons to conclude that this is necessarily the case for all knowledge or even most knowledge. This argument needs to be made, unless you are settling for the less general claim that only some knowledge holds dangers that we should take into account.ChatteringMonkey

    I would surely agree that some forms of new knowledge are more dangerous than others. I'm not arguing we stop learning or go back to the 8th century, so the discriminating calculations you seem to be suggesting are appropriate.

    I am arguing against the notion that we should push forward on all fronts as fast as we can, i.e., the "more is better" relationship with knowledge.

    The situation is admittedly very complicated because of the way one seemingly harmless technology can empower other more dangerous tools. Computers might be an example of this?

    If you are settling for the less general claim, then I don't think this is that controversial. Most funding agencies and research organisations already have advisory and ethical boards that are supposed to look into these issues.ChatteringMonkey

    Yes, agreed, but... Can you cite cases where the science community as a whole has agreed to not learn something? There may be some, we could talk about that.

    For example, genetic engineering is rapidly becoming cheaper and easier. What process is going to stop your next door neighbor from someday creating new life forms in his garage? Sure, they will pass laws, but when have laws worked to the degree necessary in cases like this? If legit responsible scientists learn how to do XYZ it's only a matter of time until bad actors, or just stupid careless people, acquire that same knowledge.

    You are raising good challenges, thanks for that, keep them coming if you can.
  • Jake
    1.4k
    death of the entire human species is not a definite outcome of nuclear warMarcus de Brun

    Yes, agreed, not a likely outcome either, imho. I'm referring to the collapse of modern civilization in my concerns. Some would surely survive a nuclear war, for example, but would probably wish they hadn't.
  • ChatteringMonkey
    1.3k
    Ok, but in democracies at least, we are the state; it's our money and our votes which drive the system. Each of us individually has little impact personally, but as a group we decide these questions. If any major changes were to be deployed by governments they would require buy-in from the public. Even in dictatorships, the government's room for maneuver is still limited to some degree by what they can get the population to accept.Jake

    I don't really agree with this, at least in part. In theory democracy is supposed to work this way, but in practice that doesn't seem to be the way it plays out. Generally, I would say, people don't really have well-thought-out ideas about most issues, including what our policies about research should be. I mean, it's a bit of a complex topic to do justice to here in this thread, but I think it's more the other way around: politicians and policymakers decide and then convince the public to adopt their views.

    Yes, agreed, but... Can you cite cases where the science community as a whole has agreed to not learn something? There may be some, we could talk about that.Jake

    In my country there is a policy that prevents funding research with direct military applications. When a more right-wing government wanted to abolish this policy, the science community collectively (or at least a large part of it) opposed that proposal. But I don't think it's very likely they will ever agree to voluntarily not learn something (unless maybe if it's something obviously evil). There is always a struggle to get funds in the science community, and they will pretty much take what they can get.

    If you want to prevent certain research, I think you have to implement restrictions at the level of funding. And one of the more general restrictions typically is that applications to fund a certain research project also have to pass an ethical board.
  • Jake
    1.4k
    Generally, I would say, people don't really have well-thought-out ideas about most issues, including what our policies about research should be.ChatteringMonkey

    Yes, indeed, and that includes me as well. Thus, this thread. The goal here is to generate as much discussion on the topic in as many places as possible from all points of view. I don't propose that this will solve the problem, but it's better than nothing.

    I mean, it's a bit of a complex topic to do justice to here in this thread, but I think it's more the other way around: politicians and policymakers decide and then convince the public to adopt their views.ChatteringMonkey

    That's surely a reasonable perspective, though I do have personal experience of being part of a small group of about 50 average everyday citizens who changed a major state law, so that can happen. Another example, the current resident of the U.S. White House. The overwhelming majority of political elites on all sides didn't want him to get the job, but the little people decided otherwise (by a minority vote).

    All that said, your basic point seems sound. Most of the time most of us don't decide things via reason, but by reference to authority. So for example, even if every word I've typed in this thread were to be proven exactly 100% true :-) that wouldn't accomplish much as I have no authority, nor any talent for acquiring it. Point being, pretty close to nobody is listening to me.

    That's a key reason why I hope this thread can attract intellectual elites of various types. Not only does this subject need their intelligence, advanced education, and talent for communicating, it needs the authority they have accumulated.

    So my plea to all readers would be, if you have any connections among intellectual elites of any flavor, please consider inviting them into this thread, or into similar conversations on your own websites. Any place you can find or generate such conversations is good.
  • Jake
    1.4k
    If you want to prevent certain research...ChatteringMonkey

    Do you want to prevent certain research? A question to one and all...

    In my country there is a policy that prevents funding research with direct military applications.ChatteringMonkey

    That's interesting. If you feel you can share the name of your country, please do.
  • ChatteringMonkey
    1.3k
    If you want to prevent certain research... — ChatteringMonkey
    Do you want to prevent certain research? A question to one and all...Jake

    Generally no, I don't think so, mostly for practical reasons. But I would accept exceptions if there are really good arguments to do so.

    The principal reason is that I don't think the knowledge itself is inherently dangerous; it's the technological applications that can be. And in practice there is usually a serious gap between knowledge being acquired and technology being developed. It's difficult to say that a certain knowledge will eventually be used to develop (dangerous) technologies. Given that level of uncertainty, it seems difficult to justify the wide restrictions to knowledge that would be needed. Note that I'm only talking about theoretical knowledge here; I have fewer problems with restrictions on developing technologies with obvious enormous risks.

    But maybe the biggest problem I have with trying to prevent research is that I don't think it will work. There is no world government. Even if only one country acquires the knowledge, the cat is already out of the bag, and the likelihood of all countries reaching an agreement to prevent certain research seems very, very small. It's a kind of prisoner's dilemma: the best thing maybe would be for all to refrain from a certain research (nobody loses), but since you can't count on that, it's better to also do the research (otherwise you lose twice, by not having the benefit of the research, while the dangers of the research are there anyway).
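    To make the payoff structure concrete, here is a toy payoff table. The numbers are invented and only their ordering matters; the first number in each pair is our payoff, the second is theirs:

                            They refrain      They research
        We refrain          (2, 2)            (0, 3)
        We research         (3, 0)            (1, 1)

    Whatever the other side does, researching scores higher for us (3 beats 2, and 1 beats 0), so both sides end up researching at (1, 1) instead of the mutually better (2, 2).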
  • Jake
    1.4k
    Generally no, I don't think so, mostly for practical reasons. But I would accept exceptions if there are really good arguments to do so. The principal reason is that I don't think the knowledge itself is inherently dangerous; it's the technological applications that can be.ChatteringMonkey

    Ok, but if the knowledge exists and offers some ability to manipulate our environment, isn't somebody going to turn that knowledge into a technological application? Is there really a dividing line between knowledge and technology in the real world?

    Consider atomic research. If I understand correctly, that field began simply as curiosity about how nature works. As understanding of the atom developed, somebody down the line realized that if the atom could be split that would release enormous energy. And then at some later point the Manhattan Project figured out how to do that, and nuclear weapons and nuclear energy were born.

    I just finished watching a documentary on Netflix called The Nuclear Option. It discusses new reactor designs which may be much safer.

    https://www.netflix.com/title/80991265

    Let's imagine that this works, and that nuclear energy becomes affordable and safe. Assuming this we could then ask...

    Was learning how to split the atom worth it?

    POSITIVE: Clean safe energy, a major contribution to fighting climate change.

    NEGATIVE: Global economy prospers, making more humans, consuming more finite resources, destroying more habitat, accelerating species extinction etc.

    NEGATIVE: All benefits of science can be erased at the push of a button.

    Complicated, eh?

    But maybe the biggest problem I have with trying to prevent research is that I don't think it will work.ChatteringMonkey

    Will not preventing some research work? Don't we have to ask this too?

    Let's recall the Peter Principle, which suggests that people will tend to be promoted up the chain until they finally reach a job they can't do. Isn't civilization in about that position? If we can't or won't limit knowledge, doesn't that mean that we will keep receiving more and more power until we finally get a power that we can't manage?

    Hasn't that already happened?

    If I walked around with a loaded gun in my mouth all day every day, would you say that I am successfully managing my firearm just because it hasn't gone off yet? Aren't nuclear weapons a loaded gun in the mouth of modern civilization?

    I propose that my concerns are not futuristic speculation, but a pretty accurate description of the current reality.

    This is a great discussion, thank you, and please imagine me bowing deeply in your direction.
  • ChatteringMonkey
    1.3k
    Ok, but if the knowledge exists and offers some ability to manipulate our environment, isn't somebody going to turn that knowledge into a technological application? Is there really a dividing line between knowledge and technology in the real world?Jake

    I do think there is a gap, certainly at the moment of acquisition of the theoretical knowledge. Governments are constantly trying to find ways to close that gap (to justify the money spent on research), because theoretical knowledge doesn't directly translate into economic value. For that you need some technology that can be marketed.

    It costs lots of money to develop technologies, and that cost generally only increases the more advanced the technology is. The initial development of the A-bomb, for example, cost billions of dollars.

    Of course, once that initial development is done, the cost of reproducing the already developed technology can be reduced, but I'd think it would still be quite the barrier for the average Joe. To build an A-bomb, for instance, I'd guess you need infrastructure that almost nobody can finance on his own.

    Would atomic research have been worth it if A-bombs destroyed the world? Obviously no, but that is hindsight, with perfect information. At the moment of the atomic research we didn't have that information.

    Will not preventing some research work? Don't we have to ask this too?

    Let's recall the Peter Principle, which suggests that people will tend to be promoted up the chain until they finally reach a job they can't do. Isn't civilization in about that position? If we can't or won't limit knowledge, doesn't that mean that we will keep receiving more and more power until we finally get a power that we can't manage?

    Hasn't that already happened?

    If I walked around with a loaded gun in my mouth all day every day, would you say that I am successfully managing my firearm just because it hasn't gone off yet? Aren't nuclear weapons a loaded gun in the mouth of modern civilization?

    I propose that my concerns are not futuristic speculation, but a pretty accurate description of the current reality.
    — Jake

    These are certainly reasonable questions, and I agree that there are some serious issues, but I don't think we have all that much control over the direction we are heading. The only way is forward, it seems to me. Technologies will possibly bring new risks, but possibly also new solutions and ways to manage those risks.

    And I mean, I certainly don't pretend to have all the answers here, so I agree that more attention for this would be a good thing.

    I also enjoyed the discussion, thank you sir :-).
  • Jake
    1.4k
    These are certainly reasonable questions, and I agree that there are some serious issues, but I don't think we have all that much control over the direction we are heading. The only way is forward, it seems to me. Technologies will possibly bring new risks, but possibly also new solutions and ways to manage those risks.ChatteringMonkey

    This seems a pretty good summary of the group consensus, generally speaking. It's this group consensus that I'm attempting to inspect and challenge.

    Do we have control over the direction we are heading?

    A reasonable argument can be made that knowledge is a force of nature that will take us where ever it will. That's a very real possibility that I recognize, but it doesn't seem that it is in human nature to be defeatist and just wait around for the end to come. We've very confidently tackled and successfully managed many forces of nature, so why not this too?

    I would agree that it does seem quite unlikely that we will calmly reason our way to a solution, but we need to also factor in our response to calamity and pain. The group consensus you've articulated exists today because we still think we can get away with giving ourselves more and more and more power without limit. Historic events may challenge that assumption in a profound manner.

    The only way is forward, it seems to me.ChatteringMonkey

    Well ok, but um, blindly repeating outdated assumptions is not really forward movement, but rather a clinging to the past.

    Technologies will possibly bring new risks, but possibly also new solutions and ways to manage those risks.ChatteringMonkey

    And this will work most of the time, but with powers of vast scale, that's no longer good enough.

    Again, this isn't futuristic speculation, it's fully true right now. As we speak, we have to successfully manage nuclear weapons every single day, and just one bad day is all it takes to bring the whole system crashing down. As we build ever more technologies of ever larger scale at an ever faster pace, this reality will become ever more true. That's what I'm asking readers to face, the path we are currently on is unsustainable, it's a formula for disaster.

    Like you, I don't pretend to have all the answers. The only "answer" I can suggest is that we face these inconvenient questions without blinking, and raise their profile to the degree possible.

    IF WRONG: If the thesis of this thread can be defeated, it should be, because we don't want to go around alarming people for no reason.

    IF RIGHT: If the thesis of this thread cannot be defeated, if it is generally found to be true, it really deserves our full attention. It's not going to accomplish anything to build many new amazing tools if they're all just going to be swept away in a coming crash...

    Which could literally happen at any moment.
  • ChatteringMonkey
    1.3k
    Jake, I'm basically suggesting that there is a third possibility, namely that your thesis might be right AND that still not a whole lot will be done about it in the short term.

    Maybe that's defeatist, but I do think there are good reasons for believing this. Look at what has happened with the climate change issue. We have known for a long time now that there is a serious problem of man-made climate change; a large majority of scientists agree with this. And still we don't manage to reach agreements for policies that sufficiently address the issue.

    Now the thesis in your opening post, while it may have its merits, deals only with possibilities, not certainties. How should one expect governments to react to this, considering their reaction to climate change?

    On a more positive note, I do think there's a good chance that this will get more attention and will be addressed eventually. I just don't think the time is now, given the urgency of some of the other issues that need to be dealt with.

    Here's a link to a philosopher who deals exclusively with existential risk; it might be of interest to you and inform the discussion some more:

    https://nickbostrom.com/existential/risks.html
  • Jake
    1.4k
    Jake, I'm basically suggesting that there is a third possibility, namely that your thesis might be right AND that still not a whole lot will be done about it in the short term.ChatteringMonkey

    I can agree with this. My best guess is that little to nothing will be done about it until some epic calamity forces us to face the issue. Or maybe we will never face it, and just race blindly over the cliff.

    That said, there is something you and I can do about it, and we are doing it together right now. We aren't in a position to be personally decisive on such a huge historic issue, but we are in a position to expand conversations like this. What is the point of philosophy if it helps us see such threats, and then we do nothing about them?

    I'd suggest there could be two purposes for this thread.

    1) Help people decide whether they think the thesis is generally correct or not.

    2) For those who vote yes, help organize a constructive response. This could be something as simple as inviting more folks into this thread, for example.

    Now the thesis in your opening post, while it may have its merits, deals only with possibilities, not certainties.ChatteringMonkey

    To bat the ball back over the net, I would argue that thousands of hydrogen bombs poised to erase everything accomplished over the last 500 years is not a possibility, but a well-documented, real-world fact beyond dispute. Again, what I'm describing is not futuristic speculation, but a current reality.

    I've had this conversation many times, and it always arrives at about the same place. Intelligent people such as yourself will examine and test the thesis, and often conclude it has some merit. But then they are faced with the fact that intellectual elites and other experts are typically talking about almost everything else in the world except this. This collision between one's own reasoning and the world around us can be disconcerting, disorienting.

    I'm basically asking readers to face that we are in a bus careening down a steep mountain road, and there is no bus driver. This is not a vision folks are eager to accept, for very understandable human reasons.

    To me, even if we can't do anything about this, it's a fascinating philosophical experience. Who should you believe? Your own reason, or a culture-wide group consensus and the experts?

    Thank you for the link to Bostrom. I did try to engage with him some time ago but didn't succeed. Perhaps this would be a good time to try again, good idea, thanks.
  • Jake
    1.4k
    Hmm.... How about the Amish?

    I'm not claiming I know a lot about the Amish, or that the Amish are all the same, or that we should all become Amish, but...

    They do seem an example of a group of folks who have thought about such things in their own way and bowed out of the knowledge explosion to some degree. They've chosen to keep some of what the knowledge explosion has produced, while declining other aspects. That is, they aren't prisoners of the simplistic "more is better" relationship with knowledge, but have crafted a more nuanced relationship.

    At the least the Amish seem a real-world example that such choices are available, and don't automatically lead to catastrophe.

  • Jake
    1.4k
    Thanks Doug, your link is quite relevant.

    The description of that project concludes....

    "This project aims to fill this gap by creating a transdisciplinary and multi-level theory of technological change and resistance in social systems, which will analyze the factors and societal forces that work against technology adoption, the consequences of this resistance, and the best mechanisms to overcome it."

    If I understand correctly (I may not) the project assumes that resistance to uncontrolled technological progress is automatically invalid, and thus scholars should seek "the best mechanisms to overcome it". Do you see that assumption too? Or am I experiencing my own bias here?

    Here's an irony I find endlessly interesting.

    The "more is better" relationship with knowledge is the status quo and has been for at least 500 years, right?

    Those defending the existing status quo typically label themselves as progressive advocates for change, while those arguing the status quo should be replaced with something new are typically labeled as "clinging to the past".

    As an example, see this line in the CFP...

    "Resistance to technological innovation and new business models is, however, not new. It has indeed a long history in the West: attacks on Gutenberg’s printing press in the late 15th century or the protests of horse carriage drivers against motorized cars at the beginning of the 20th century precede the current growing discontent with technological change."

    So if you don't wish to blindly follow science as it marches confidently towards civilization collapse, you are like the ignorant mobs who attacked the first printing presses.

    It seems to me that those arguing for the status quo are clinging to the past, and those arguing that the simplistic "more is better" status quo should be replaced with a more intelligent and sophisticated relationship with knowledge and power are the advocates for change.
  • BC
    13.5k
    Welcome to The Philosophy Forum. Great topic.

    We human beings are simply not emotionally and cognitively configured to manage the consequences of having powerful knowledge over the long run. We can recognize the need for long-term management (running into scores of years, into centuries, and then many thousands of years), but we are very poor at even conceiving how to put very long-term plans into effect. Religious organizations have managed to look after saints' bones and sacred texts for a little over 2000 years. That's the best we have done.

    Let me take a different example: Alexander Fleming discovered penicillin in 1928, but 15 years would pass before a strain and a method were found to manufacture it in large quantities. In his Nobel Prize speech, Fleming explained that bacterial resistance to penicillin was unavoidable, but that it could be delayed by using enough penicillin for long enough.

    The discovery of penicillin, streptomycin, aureomycin, and a few dozen more antibiotics was a tremendous thing--extremely valuable knowledge. Seventy-five years later we are finding that some pathogenic bacteria (causing TB, gonorrhea, and all sorts of other infections) are emerging which are pretty much resistant to all of the antibiotics.

    Perhaps this would have happened anyway--sooner or later--but certain people ignored Fleming's warning. Among them: companies producing antibiotics to feed to beef cattle, pigs, and chickens to get them to grow faster; doctors who prescribed antibiotics for viral diseases (like colds and influenza) for which they were irrelevant; patients who received the proper dose and instructions for bacterial infections but stopped taking the Rx as soon as symptoms went away; and countries which sell antibiotics over the counter, leading to widespread under-dosing or inappropriate use.

    Our management of antibiotic knowledge has gone the way of most other powerful knowledge.

    Consequently, we have wasted the long-term value of antibiotics. We are stuck with thousands of nuclear weapons and nuclear waste dumps. Virtually indestructible plastics are messing up life in the oceans; we are immersed in a soup of hormone-like chemicals; we flush tons of pharmaceuticals into rivers every day (in urine, in feces, and in pills flushed to get rid of them), and on and on and on.
  • Jake
    1.4k
    Hello Professor Cranky! Thanks for contributing.

    Your antibiotic example would seem to illustrate the issue pretty clearly. It's one of a million cases of what we might call a "self-correction loop", which goes something like this:

    1) Invent something.
    2) Things go great for a while.
    3) Abuse the something.
    4) Enter a calamity.
    5) Learn the lessons.
    6) Try again, often with better results.

    This pattern has been repeated countless times, both in our personal lives and at the larger social level. So when we look to the future, the group consensus says, "Sure, we'll have problems as we learn, but we'll fix the problems like we always have." What makes this assumption compelling is that it has long been true, and continues to be true for most situations today.

    What's not being sufficiently taken into account is that the self-correction loop only works with powers of limited scale. What nuclear weapons teach us is that with powers of vast scale, a single failure may crash the system, thus preventing the opportunity for learning and correction.

    We human beings are simply not emotionally and cognitively configured to manage the consequences of having powerful knowledge over the long run.Bitter Crank

    I agree of course. What complicates this though, and makes it harder for folks to grasp, is that we will succeed in managing many large powers. So, for instance, people see that we haven't had a big nuclear war, and they take that as evidence that we can manage vast powers.

    What they're not taking into account is that as 1) time passes, 2) the number of vast powers increases, and 3) the scale of those powers grows, the odds turn increasingly against us.

    As an example, it's one thing to manage nuclear weapons for 70 years, and another to manage them successfully every single day forever. It's one thing to manage one vast power, and another to manage 23 vast powers.
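    To make that arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. The numbers are invented purely for illustration (a hypothetical 99.99% chance of getting through any single day without a catastrophic failure), not estimates of any real-world risk:

    # Toy model: a "vast power" must be managed successfully every single
    # day. The odds of an unbroken record are the daily success probability
    # raised to the total number of power-days.
    # The 0.9999 figure is purely hypothetical.
    def survival_odds(daily_success=0.9999, powers=1, years=70):
        days = 365 * years
        return daily_success ** (days * powers)

    print(survival_odds(powers=1, years=70))   # roughly 0.08
    print(survival_odds(powers=23, years=70))  # roughly 3e-26

    Even a success rate that looks near-perfect on any given day compounds toward zero as the days, and the number of powers, pile up.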
  • BC
    13.5k
    Just some additional details, Jake:

    Military planners underestimated the damage that nuclear weapons would do. Daniel Ellsberg [The Doomsday Machine: Confessions of a Nuclear War Planner] documents how these planners had calculated damage on the basis of megatons of explosive power, but had not taken into account the resulting firestorms that the hot blasts would cause. The damage becomes far greater and, in addition to radioactive fallout, there would be enough dust and soot blasted into the upper atmosphere to trigger--not global warming, but global cooling. It isn't that we would enter an ice age, but 7-10 degrees of cooling would be intensely disruptive and the consequences would last for several decades. Crop failures would be very severe.

    The problem of excreted medical compounds was probably unforeseen and unavoidable. But the chemical industry--like most industries--has externalized the costs and harms of production. The by-products of manufacturing chlorofluorocarbons--which made possible things like more efficient air conditioning, Teflon, and fire retardants--were, for the most part, dumped into the environment. These various chemicals deplete ozone and, in humans, have several negative effects such as immune system suppression.

    Capitalism (in any form) pretty much requires continual expansion. This has led to maximum expansion of extractive and manufacturing industries across the board, along with continual expansion of consumption -- and then consequent waste products (sewage, garbage, spoiled environments, etc.).

    Yes, the odds are against us. "Paying the piper" is way overdue, and the natural bill collectors are out and active.
  • Jake
    1.4k
    Hi again Crank,

    Daniel Ellsberg [The Doomsday Machine: Confessions of a Nuclear War Planner] documents how these planners had calculated damage on the basis of megatons of explosive power, but had not taken into account the resulting firestorms that the hot blasts would cause.Bitter Crank

    Here's the trailer to a video called Countdown To Zero, which documents all kinds of screw-ups and threats involving nuclear weapons. (The 90-minute full documentary is available on YouTube for $3.)

    https://www.youtube.com/watch?v=vWJN9cZcT64

    What I'm trying to point to is that as the knowledge explosion proceeds there will be more existential threat situations of this nature. It won't matter if we successfully manage most of these threats most of the time, because a single failure, a single time, with technologies of vast scale is sufficient to crash the system. That's what we should be learning from nuclear weapons.

    So if we stick with the "more is better" relationship with knowledge, we are headed for a crash, which makes much or most scientific research today largely pointless.

    What's fascinating is that intellectual elites don't get this, and yet the concept is no more complicated than the way we restrict the powers available to children, out of the recognition that their ability to manage power is limited.

    1) When it comes to children everyone immediately gets that their ability is limited and thus the power they have should be as well.

    2) When it comes to adults we completely ignore our own limitations and instead insist, "we need as much power as possible, more is better!"

    The smartest, best-educated people in our culture don't get this. I've spent at least a decade now trying to find scientists or philosophers who do get it. There are none. There are plenty who will claim they do, but when you actually look at their work, you find they haven't written about the ideas being shared in this thread.

    I've just come from spending months on a prominent group blog of many academic philosophers with PhDs. None of them are interested in any of this, and scientists are even worse.