• BenMcLean
    64
    Prior to the 1980s, computing was done on big mainframes which only the largest institutions could own and control. Any remote computing was done through terminals -- essentially thin clients. Since no individual or even small business could own and control a whole system, experimentation was limited and institutionally gatekept -- and it had to be, due to scarcity.

    The microcomputer revolution of the 1980s fundamentally changed all that. We got somewhat less powerful individual systems in terms of raw computing resources, but we got a lot of them, we got them fast and we got them everywhere. The result was an explosion of innovative development in both software and hardware to meet the demands of this new market. Because there was now no permission structure, no gatekeepers and no credentials, anyone could build anything, and they did. This was the greatest time for a creative programmer to be alive, as new possibilities were opening up every day.

    What drove this innovation, in my opinion, was not Moore's Law. The death of Moore's Law shouldn't end or even threaten this, because what really drove it was (1) local compute being unencumbered by institutional permission structures and (2) computer hardware supply chains being abundant enough to make that possible. A continuous increase in compute capability wasn't, strictly speaking, necessary to fuel this so long as we could at least get our hands on hardware that was good enough.

    What really scares me about the "AI" boom isn't the fear inherent to the Information Age of computer systems replacing humans. What scares me is that "AI" being based on a subscription model accelerates a trend which was happening long before it -- cloud computing not just supplementing but totally replacing local compute. Rather than carrying us into a utopian future, cloud computing seems to be a technological regression that undoes the microcomputer revolution of the 1980s by returning us to a centralized model of computing resources, where the only things we can actually get our hands on will be thin client terminals: not powerful enough to do anything innovative with, and quite possibly so locked down with proprietary secrets and DRM that we might not be able to do anything local with them at all.

    Every innovative experiment run on the cloud will require permission, justification and subscription fees, and will take place very far away from the bare metal. Rather than just trying it, like you could do from the 1980s to the 2010s, it is looking like from now on you're going to have to justify your computing resource use to a committee somewhere. Containerization doesn't just offer us a guarantee that our software will continue to work forever -- which is a good thing -- but also permanently freezes our idea of what computer software in the future is fundamentally going to be. The feedback loop of AI training for programming seems very prone to doing this as well: Stack Overflow and Reddit from 2016-2019 will forever define, for generations to come, what programming is and means, since nothing else will be allowed, either by the infrastructure to run the program or by our means of gathering the training data necessary to solve problems in it with LLMs.

    To me personally, nothing symbolizes the incredible freedom and innovation enabled by the microcomputer revolution of the 1980s more clearly than id Software. Sure, they made games, but each engine was powered by revolutionary technology which fundamentally wasn't supposed to be possible on the hardware they were shipping to, but was possible because the platforms were open and the basic concept of what a program could and couldn't do wasn't frozen. That openness meant people got to work in a space where it was possible to do unanticipated things.

    Cloud compute is fundamentally designed to see this type of innovation as a bug, not as a feature. Information and permission flow in predefined directions, not only because hardware is fully abstracted but because testing the limits of the system is really testing the limits of your wallet, not your hardware. In my view, that kills innovation.

    But this didn't seem to be that serious a threat so long as cloud compute was positioned as supplementary to local compute. You could run your experiments on your own hardware, but cloud compute was then available to you as an option when it came time for your business to scale. That's great -- more power for innovators.

    That hasn't lasted. What we've seen happen recently isn't just the death of Moore's Law but a clear technological regression -- the baseline RAM requirement for the PC gaming market has actually dropped for the first time in history, from 16 GB back down to 8 GB. This is totally unprecedented and the implications are really disturbing.

    What makes this so disturbing to me isn't just that we'll have to stay on 8 GB for a while longer, because that isn't all that's happening. Crucial pulled out of the consumer RAM market altogether, and the indication from the industry has been that their plans for 2026 are to actually reduce supply to the consumer computer hardware market, not to increase it to deal with the massive price spike. Cloud supply chains now come first -- and it is becoming increasingly uncertain whether ordinary people are going to continue to be able to even access 8 GB RAM systems in the future at all.

    What seems to be happening with this "AI" boom is a realignment of incentives across the computer industry to force everyone onto the cloud for everything, to the point where local compute won't even be an option for the vast majority of people, and even the most zealous enthusiasts' ability to get their hands on local computer equipment may be in serious question in the future. Will we even be able to buy computers in the future, or will we be forever reduced not just to cloud-native but to an absolute cloud-only computing model, with no "buy" option, only rent?

    I really don't like anything about this trend. It isn't just bad technology: it's bad politics.

    Seeing this is actually one of the things that has made me decide I have to explicitly reject libertarianism. If libertarianism was true, then the free market would naturally correct this by bringing more suppliers into the consumer computer hardware market to meet the high demand indicated by this massive price spike. But that isn't happening. What's happening is a move to force users back into a regressive model of computing that they rejected half a century ago as soon as it became possible for them to do so -- an innovation-killing model that nobody in their right mind wants. This is clearly very, very bad, but nothing in libertarianism can explain why it's bad or prescribe any remedy for it, because as long as "it's a private company", nothing can be done. This makes libertarianism a bad philosophy that has to be rejected. Government policy will need to address this, and that means a political theory is needed that allows wielding political power precisely where libertarianism says you mustn't.

    What outrages me even more about this is that I strongly suspect that the subsidies for America's semiconductor industry are going to be funnelled into increasing the supply of specialized hardware exclusively for cloud providers -- and thus I am being taxed to accelerate a trend I see as evil, whereas I'd see it as fair to be taxed to subsidize bringing interoperable computer hardware supply back up for everyone, not just for the cloud.

    And this isn't about open source either -- this is about open platforms and individual private property ownership vs enclosure and rent-seeking. This should concern everyone, not just open source advocates. It doesn't matter whether you can examine the source code of the program you're running if you can't own the kind of machine that can run the program at all.

    Anyone who says "RISC-V will save us!" is deluding themselves. It fundamentally can't matter whether the underlying architecture is x64, ARM or RISC-V if you can't buy and own a computer at all -- and if the only access you get is metered through a hardware-abstracted proprietary API. Innovation requires economic room to take the risk of running failed experiments without significant penalty, which cloud computing fundamentally does not offer.

    I think the locus of control over computers in society shifting towards the cloud and thus towards centralization and away from local compute is the real problem, of which concerns about social media censorship and AI are just symptoms.
  • ssu
    9.7k
    Thanks for a great OP! :up:

    I don't know so much about computers, but I've always had a distaste for everything being in the cloud. A few comments:

    Seeing this is actually one of the things that has made me decide I have to explicitly reject libertarianism. If libertarianism was true, then the free market would naturally correct this by bringing more suppliers into the consumer computer hardware market to meet the high demand indicated by this massive price spike.BenMcLean
    Libertarianism is a political philosophy, while obviously the global economy we have now isn't at all libertarian. The global economy is basically dominated by oligopolistic competition (in every field there are a few large corporations which dominate the market and thus create an oligopoly). Now the oligarchs might publicly champion libertarian values and talk that kind of bullshit, but in truth what they value is the oligarchy that they are part of.

    And this isn't about open source either -- this is about open platforms and individual private property ownership vs enclosure and rent-seeking. This should concern everyone, not just open source advocates.BenMcLean
    This ought to be important.

    But I think this is something that has happened, or is being pushed forward, in areas other than just computers.

    Think about cars or tractors.

    I had an economic history professor who only bought cars older than one specific year in the 1970s (which I've forgotten). His reasoning was that in any car from before that year he could repair anything himself, and thus he only needed to buy the spare parts. But after that year came electronics, which he couldn't fix. And now look at our modern cars. WTF can an ordinary car owner do? Well, if it isn't an electric car, then just add fuel and washer fluid for the windscreen wipers. Something else? Go to your dealership or face penalties.

    This is even worse with modern tractors, which are extremely expensive and are also computers on wheels, with heavy limitations on just what the farmer can do. It's no wonder that many farmers use age-old tractors.

    I think this is something very similar to what you described about computers and local computing. And your story goes on steroids when we take into account that, for the vast majority of people, the real computer they use daily is their smartphone. It seems there's a desire to make our local computers as dependent on the manufacturers/service providers as our smartphones are now.
  • BenMcLean
    64
    I think that in different ways, both the Left and the Right are equally to blame for this.

    The Right are to blame for this because of their blind, unthinking Cold War dogmatism about economic policy -- which I used to support and now feel guilty for having been wrong about. And Trump is the closest they've ever come in my lifetime to making even a partial break with this.

    The Left are to blame for this because they prioritized corporate-controlled identitarian politics, to make everybody fake and gay, over their older anti-corporate economic policy. All genuinely anti-corporate thought has been pushed out of the American Left ever since it was discovered how easily identitarian politics could transform dangerous left-wing movements into financially non-threatening ones. The working model was how they derailed the economically driven ethos of the Occupy movement with the woke bullshit. A classic divide-and-conquer move by Wall Street. Anti-liberal wokeness isn't just inherently wrong in itself -- although it totally is -- but is also a distraction from what having a left wing should be good for: being suspicious of capitalism. Keeping megacorporate power in check. The Left should have listened to Bernie Sanders.

    At the time of the Occupy movement, I did not recognize the wokists and the Bernie bros as separate left-wing factions. Or, to be more accurate, I thought the Marxist/socialist types were the ones steering the ship and that the critical theorist types were the useful idiots -- not a faction in their own right. But wow, the Occupy saga showed that I was wrong. The critical theorists steer the ship and the Marxist/socialist types are the useful idiots!
  • jgill
    4k
    I once used VB6 to design and run programs on my computer, but one day it was gone, taken away by Microsoft. In its place was a cloud-based language that seemed incomprehensible. I found and bought Liberty Basic -- no subscription.

    I also used Mathtype, purchased and installed. Nowadays when I open it up it tries to get me to subscribe to the latest version.
  • BenMcLean
    64
    I ran some of these ideas through an LLM and its response was that my ideas are essentially in the category of "postliberalism" and that I need to read up on Distributism because apparently a "Digital Distributism, updated for the 21st century" is the economic model my existing thoughts are already gravitating towards. This may have been a blind spot for me for years, probably in part because my friend "The Distributist" on YouTube never really made it a project of his to adequately explain to non-Catholics like myself exactly what Distributism is and how exactly it's not just Catholic socialism.
  • ssu
    9.7k
    Anti-liberal wokeness isn't just inherently wrong in itself -- although it totally is -- but is also a distraction from what having a left wing should be good for: being suspicious of capitalism. Keeping megacorporate power in check. The Left should have listened to Bernie Sanders.BenMcLean
    Basically European social democracy attempts to run exactly like that: these "socialists" understand that market capitalism does work, but the excesses have to be cut. Then the question simply becomes just what is "excess" and when capitalism has gone "too far". Issues on which people can have differences.
  • BenMcLean
    64
    Basically European social democracy attempts to run exactly like that: these "socialists" understand that market capitalism does work, but the excesses have to be cut. Then the question simply becomes just what is "excess" and when capitalism has gone "too far". Issues on which people can have differences.ssu
    My instinct for most of my life has been to categorically dismiss any contemporary economic idea from Europe, not only out of a doctrinaire devotion to free market ideals which I've now (recently) grown out of, but also because Europe fundamentally does not pay for its own military defense. It isn't completely devoid of military spending and is improving in this area, but Europe is still heavily dependent on the United States for security that its taxes do not pay for but ours do. As long as that is the case, all of Europe's economic ideas appear to be luxury beliefs that our economic system is footing the bill to make possible.

    I probably shouldn't be so dismissive, because this is a space I'm only just learning to navigate and I'm in the process of re-examining my old assumptions right now, but that particular one doesn't seem to depend on Reaganite free market dogmatism: it's realpolitik.

    What are your thoughts on that? Do you think that a socialist or quasi-socialist system could actually pay for itself without turning into Soviet-style tyranny the way the libertarians assume?
  • Joshs
    6.6k
    Europe fundamentally does not pay for its own military defense. It isn't completely devoid of military spending and is improving in this area, but Europe is still heavily dependent on the United States for security that its taxes do not pay for but ours do.BenMcLean

    You can thank the U.S. for coming up with the idea of that arrangement. After World War II, the United States did not reluctantly assume responsibility for European security because Europeans refused to pay for it. The arrangement emerged because Washington actively wanted to control the terms of European rearmament and, initially, to prevent it altogether. Demilitarization, especially of Germany, was a central American objective.

    Furthermore, the claim that European welfare states would have been unaffordable or impossible without U.S. military spending is not supported by historical evidence and collapses once you look at cases like Britain, France, or Sweden. Europe built welfare because it prioritized social insurance, labor protection, and decommodification in ways the U.S. did not, not because it was freed from defense obligations.