• Mr Bee
    656
    Why is it difficult to believe? It's far more rooted in current understandings in neuroscience than any spiritual or mystifying narrative of the uniqueness of the "human soul" or whatever nonsense people attribute human creativity to stem from.Christoffer

    Because a mere intention to want to create a painting of a park doesn't get to the interesting parts about how our brains generate that image in our heads from what we know. Of course I don't know much about creativity, neuroscience, or AI like I said before, so I'm gonna avoid deeper conversations and skip over the following paragraphs you've written for the sake of time.

    But artists who trace will still come out unscathed compared to how people react to AI generated images.Christoffer

    They certainly deal with a lot of criticism themselves, if you're implying they don't. Tracing isn't exactly a widely accepted practice.

    That's not enough of a foundation to conclude that machines do not replicate the physical process that goes on in our brain. You're just attributing some kind of "spiritual creative soul" to the mind, that it's just this "mysterious thing within us" and therefore can't be replicated.Christoffer

    I'm willing to reject dualism as well, though I'm not sure why you're attributing this and indeterminism to people who just believe that human creativity and whatever is going on in diffusion models aren't equivalent. I'm not saying that the human brain isn't a machine, I'm just saying that there are differences between the architecture of human brains and AI diffusion models, something that may reveal itself with a further understanding of neuroscience and AI.

    And when we dig into it, we see how hard it is to distinguish what actually constitutes human creativity from machine creativity.Christoffer

    Once again I don't know about that based on the lack of knowledge we seem to have about neuroscience and AI.

    But are we saying that we shouldn't progress technology and tools because of this?Christoffer

    Given how disruptive AI can be to all walks of society, I think that is reason for pause; otherwise we risk creating a very large societal backlash.

    When photoshop arrived with all its tools, all the concept artists who used pencils and paint behaved like luddites, trying to work against concept art being made with these new digital tools. When digital musical instruments started becoming really good, the luddites within the world of composing started saying that people who can't write notes shouldn't be hired or considered "real" composers.Christoffer

    Those were more like new mediums, new ways of making art and music, than something that could completely replace what artists do. I'd look more at the invention of the camera and its relation to portrait artists as a better example.

    Therefore, a company who fires an artist in favor of someone who's not an artist to start working with AI generation, will soon discover that the art direction becomes sloppy and uninspiring, not because the AI model is bad, but because there's no "guiding principles" and expert eye guiding any of it towards a final state.Christoffer

    The issue is whether or not they'd care. Honestly, we should all be concerned about that, because if they're fine with it, then artists are out of a job and we as consumers will have to deal with sloppier art. Already I see advertisements where people have six fingers on one hand.

    The "good enough" companies, before these AI models, have never been good for artists anyway. Why would artists ever even care for their work towards these companies if they themselves won't care for the artists?Christoffer

    Because they're the ones giving most artists a job at this point and they need those jobs. Unfortunately that's the society we live in.

    Then you agree that the ongoing lawsuits targeting the training process, rather than the outputs, the uses of those outputs, and the users misusing these models, are in the wrong.Christoffer

    They're both related. If the outputs aren't considered transformative enough, and the input contains copyrighted material, then it's illegal.

    Sorry, read this too late :sweat: But still, the topic requires some complexity in my opinion, as the biggest problem is how the current societal debate about AI is often too simplified and consolidated down into shallow interpretations and analyses.Christoffer

    I can see that you're interested in a long conversation, and I genuinely appreciate the time spent responding, but I just don't have the time on my end to devote to reading and responding to it. Like I said I don't think I'm qualified enough to engage in a deeper conversation on the nature of creativity which is why I've tried to avoid it. I hope you understand.
  • frank
    16k
    So, it's essentially exactly the same as how our brain structure works when it uses our memory that is essentially a neural network formed by raw input data; and through emotional biases and functions synthesize those memories into new forms of ideas and hallucinations. Only through intention do we direct this into forming an intentional creative output, essentially forming something outside of us that we call art.Christoffer

    I agree with your point, but might disagree with this detail. I don't think intention is a requirement of artistic output. An artist may not have anything to say about what a particular work means. It just flows out the same way dreams do. Art comes alive in the viewer. The viewer provides meaning by the way they're uniquely touched.

    AI doesn't have a dreamworld or collective unconscious from which things flow, which is why AI generators have to be censored. The potential for new art is clearly enormous. The artists who are offended by it are just upset that their skills have become superfluous.
  • Christoffer
    2.1k
    Because a mere intention to want to create a painting of a park doesn't get to the interesting parts about how our brains generate that image in our heads from what we know. Of course I don't know much about creativity, neuroscience, or AI like I said before, so I'm gonna avoid deeper conversations and skip over the following paragraphs you've written for the sake of time.Mr Bee

    Intention is more than just will, intention drives creation in a fluid constant manner, not just a will to paint a park, but every detail of that park and the interpretation of it into reworks and changes.

    But it's important to know the depths of all of this, because that's what's part of defining the foundation for laws and regulations.

    They certainly deal with a lot of criticism themselves, if you're implying they don't. Tracing isn't exactly a widely accepted practice.Mr Bee

    From every encounter I've had with other artists, almost all play very fast and loose with the concept of copyright during their process. It's very undefined where the line is drawn, and more often than not it's people who aren't artists themselves who moralize the most about where lines are drawn. Tracing and photobashing are just obvious extreme examples of it, but there are lots of double standards behind closed doors in the workflow and process of creating works of art.

    Among artists there's a lot of moralizing towards others while not a lot of introspection into their own workflows.

    I'm willing to reject dualism as well, though I'm not sure why you're attributing this and indeterminism to people who just believe that human creativity and whatever is going on in diffusion models aren't equivalent.Mr Bee

    Because it's a question of having a foundation for laws and regulations. Defending human creativity by saying it's magic isn't enough. Something closer to facts is required, and the closest to facts we have is what we've found in neuroscience about how similar these processes are between neural networks/machine learning and the human brain. What people like to believe in their own private spiritual way is no foundation for societal laws and regulations.

    I'm not saying that the human brain isn't a machine, I'm just saying that there are differences between the architecture of human brains and AI diffusion models, something that may reveal itself with a further understanding of neuroscience and AI.Mr Bee

    The architecture of the neural network is mimicked, while the totality of the brain is not. What's the "same" is the behavior of memory formation and path formation. But if the process is similar, then the argument can be made that a person with photographic memory who reads a book and then writes something inspired by it is utilizing the same kind of process as an AI model that's trained on the same book writing out something based on it.

    If the trained AI were a "normal computer", that would be like saying a human has a specific part of the brain that stores a direct copy of a file of that book, which is not how we actually remember what we've read, or how we remember it within the context of other texts. The AI model is the same: it doesn't have a file of the book, it "remembers" the book in relation to other texts.

    If we are to create laws and regulations for that, then it's not as simple as saying that the engineers "programmed this book into the AI", because that's like saying that someone took a book and forced it into my skull as a file rather than gave it to me to memorize. The difference between a normal computer and a trained AI is that there's no copy of the original file; there's nothing of the original file, just as there's no original copy of the book in my brain. And most copyright law is based on physical or digital files being spread around.
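    To make that point concrete, here's a toy sketch in Python (my own hypothetical illustration using scikit-learn, nothing from an actual diffusion model): after training, the fitted model holds only numeric weights, and because the vectorizer hashes words rather than storing a vocabulary, not even the training words survive as text inside it.

```python
# Toy illustration: a model fitted on text keeps only numeric weights,
# not a copy of the documents it was trained on.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import LogisticRegression

docs = ["the bridge at sunset", "a park full of trees"]
labels = [0, 1]

# HashingVectorizer is stateless: it stores no vocabulary at all.
X = HashingVectorizer(n_features=32).fit_transform(docs)
model = LogisticRegression().fit(X, labels)

print(model.coef_.shape)             # (1, 32): just 32 floats
print("sunset" in str(model.coef_))  # False: no trace of the text
```

    Copyright law obviously doesn't turn on this alone, but it shows why "they copied the file into the model" is the wrong mental picture.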

    Given how disruptive AI can be to all walks of society, I think that is reason for pause; otherwise we risk creating a very large societal backlash.Mr Bee

    Would you want to go back to before the industrial revolution because society supposedly paused it? With all the improvements in society that eventually came because of it?

    I find it a bit ironic that people don't want massive change while at the same time complaining that nothing is done about the problems that actually exist in society. These AI models aren't just pumping out images, video and text; they are used in other applications. Just the other day, a paper was released on protein folding that uses the new AlphaFold 3, and within it, a diffusion model is used as part of the system.

    If I have to be blunt, the benefits of these AI systems are potentially so massive that I couldn't care less about a minority of bitter artists who lost a job at a corporation that didn't even appreciate their contribution enough to value them staying. These different AI models work together and help each other with tasks that reach beyond mere generative images or text, and the development of such beneficial processes will be delayed if they're forced to retrain on less data due to some court ruling.

    Destroying the model's capabilities by restricting the training is not the solution, the solution is to restrict the use and put guardrails on the outputs. But society and artists are frankly so uneducated about all this that they don't understand the consequences of what they're screaming for. They just hate AI because they're afraid and become luddites who smash machines that could benefit society in ways they can't even fathom.

    I despise such stupidity. Imagine a cancer drug that gets delayed because artists got a court to restrict and take down a model while it was serving as a key component in a larger structure of AI systems for analysis and protein folding. People don't know what they're talking about; people don't seem to have a clue about anything related to AI models beyond clickbait journalism and Twitter brawls. Yet, at the same time, they want new cancer treatments and improvements to society in which, right now, AI is making massive progress using the same models that rely on the same training data they want to restrict and remove from these companies.

    The equation in all of this is so skewed and misrepresented.

    Those were more like new mediums, new ways of making art and music, than something that could completely replace what artists do. I'd look more at the invention of the camera and its relation to portrait artists as a better example.Mr Bee

    AI can't replace artists. As I've mentioned, an artist is skilled in having the eye, ear and mind to evaluate what is being created. An AI can't do this. You need an artist to use AI effectively, you can't have some hack generate stuff if the output is going to be used professionally.

    And those new mediums weren't new ways of working for those skilled in the older methods of concept art, people who spent decades painting with real pencils and paint. They viewed the change just as artists today view AI, because they didn't know how to retrain themselves, so they became obsolete. Yet people treat that transition as "whatever", just as we treat all previous transitions in society while cherishing how good life is with all our modern standards of living and working. Why would today be any different?

    But sure, the invention of the camera and painting portraits might be more similar. Yet that also invented the job of the portrait photographer. Just as a concept artist today will be able to utilize their knowledge of composition and design and use an AI model to speed up the process towards a specific goal. Why wouldn't they?

    I view the hacks who think they're artists because they went from no artistic knowledge to writing prompts and getting outputs the same way I view the people who bought a DSLR with video capabilities in 2008 thinking they'd be the next Oscar-nominated cinematographer just because the DSLR had a 35mm sensor and produced a photographic look almost exactly like what cinema cameras produced on real film productions... only to end up making ugly low-budget shit that sure had an optically and technically similar look, but nothing of everything else in terms of composition and filmmaking language. Because they weren't actual cinematographers who could evaluate what they were making. They couldn't see their mistakes, and they didn't know why something should look a certain way.

    AI will not replace artists; that's a misunderstanding of what an artist is. AI generation will replace the low-paid jobs at companies who didn't care for the artists to begin with and who never really had the money or the care either.

    Companies who actually care about art but use AI will still need artists; they need their eyes to handle AI-generated outputs and change them according to plan and vision.

    Honestly, we should all be concerned about that, because if they're fine with it, then artists are out of a job and we as consumers will have to deal with sloppier art. Already I see advertisements where people have six fingers on one hand.Mr Bee

    Is that a bad thing? Does anyone actually care about ads in terms of aesthetic appreciation? Or is everyone trying their best to get their adblockers to work better and remove them altogether? The working conditions for artists were already awful at these kinds of companies; maybe it's better that this part of the industry collapses and gets replaced by outputs of exactly the quality these uncaring CEOs deserve. Why are we defending shit jobs like these? They're the ones that will be replaced first.

    What it will probably do is focus consumers' attention more towards real artists' work. The difference will feel clearer. Just like how audiences have gotten fed up with the Marvel Cinematic Universe repeating itself over and over and have instead started to embrace original blockbusters once more. People don't actually want repetitive factory outputs; they want artistic perspectives, and this might rip the band-aid off the industry instead of turning everything into a low-wage factory.

    As said, artists won't disappear, because they will be the ones working on stuff that matters. Yes, some might not return to working in the field, because they were never able to rise above these low-paid, shitty jobs for uncaring CEOs in the first place. So maybe that's not a bad thing? Just like the luddites who smashed the sewing machines got better health after those slave-like conditions ended and working with sewing machines became the norm instead.

    Because they're the ones giving most artists a job at this point and they need those jobs. Unfortunately that's the society we live in.Mr Bee

    Then start learning AI and be the artist whose expertise gives them an edge over the non-artists who are put to work with these AIs and who don't know what the word "composition" even means. It's an easier working condition, it's less stressful, and it's similarly "just a job". Why is anyone even crying over these kinds of jobs getting replaced by an easier form? It makes no sense, really. Why work like a slave because a CEO demanded 5 new concept art pieces over the weekend without caring how much work that would demand of a concept artist?

    A company hiring a non-artist to work with AI won't work, and they will soon realize this. You need the eyes, the ears and the poetic mind to evaluate and shape what the AI is generating; that's the skill the artist brings.

    They're both related. If the outputs aren't considered transformative enough, and the input contains copyrighted material, then it's illegal.Mr Bee

    No, if the output isn't transformative enough, that's not a problem of the training data; that's a problem of alignment, needed to mitigate accidental plagiarism. What you're saying is like saying that if an artist makes something that's not transformative enough, we should rule that the artist can never look at magazines or photographs again for inspiration and references. They would never be able to make anything again because they went to an art museum, saw other paintings, and accidentally plagiarized. It would mean their workflow infringes on copyright, not their output, which makes no sense compared to common artistic practices and workflows.

    The training process and the output are not the same thing in terms of copyright and a user producing plagiarism with an AI model does not mean the training process is infringing on copyright.

    It's this misunderstanding of how the technology works that makes anti-AI arguments illogical and problematic.

    I don't think intention is a requirement of artistic output. An artist may not have anything to say about what a particular work means. It just flows out the same way dreams do. Art comes alive in the viewer. The viewer provides meaning by the way they're uniquely touched.frank

    Intention here is not really meant as "knowing" or the "truth" of expression; it's merely the driving force or guiding principle from which creation is directed. In essence, it means there's an intentional choice to draw a bridge against a sunset, an intention that does not appear in an AI on its own because it has no emotional component, no experience and no subjectivity. The identity of a person, their subjective sense of reality, forms an intention to create something. In us it forms as an interplay between the physical creative process in our brain (the one similar to these systems) and our emotional life informing and guiding us towards an intended path for what that process creates.

    And this is also why I say that artists won't disappear. Because even an AI that is a superintelligence and has the capacity to create art on its own, because it's essentially sentient, would still just constitute a single subjective perspective, becoming a single "artist" among others. Maybe more able to produce art quicker and in greater quantities, but still: people might like its art, but they will want to see other perspectives from other artists, and that requires a quantity of individual artists, AIs included.
  • frank
    16k
    Intention is more than just will, intention drives creation in a fluid constant manner, not just a will to paint a park, but every detail of that park and the interpretation of it into reworks and changes.

    But it's important to know the depths of all of this, because that's what's part of defining the foundation for laws and regulations.
    Christoffer

    I was thinking of intention as in a desire to create something meaningful. An artist might not have any particular meaning in mind, or even if they do, it's somewhat irrelevant to the meaning the audience finds. So it's obvious that AI can kick ass creatively. In this case, all the meaning is produced by the audience, right?

    And this is also why I say that artists won't disappear. Because even an AI that is a superintelligence and has the capacity to create art on its own, because it's essentially sentient, would still just constitute a single subjective perspective, becoming a single "artist" among others. Maybe more able to produce art quicker and in greater quantities, but still: people might like its art, but they will want to see other perspectives from other artists, and that requires a quantity of individual artists, AIs included.Christoffer

    Yea, an AI artist could create a century's worth of art in one day. I don't really know what to make of that.
  • BC
    13.6k
    It's easy to fall into the debate trap of always summarizing anything against companies as having pure capitalist interests, but trying to solve AI tools is kinda the worst attempt if all intentions are just profit.Christoffer

    This is obviously a topic dearer to your heart than mine. Not criticizing your lengthy reply, just observing the depth of your response.

    No, it isn't the case that corporations are merely greedy thieves who think of nothing but making money. However, corporations do have a fiduciary requirement to generate profit for their shareholders. So, making money is the name of the game. The people who make up corporations, especially when engaged in new, cutting edge projects, clearly have lots of intellectual interests aside from the fiduciary requirement. I'm sure developing AI is an engrossing career for many people, requiring all the ingenuity and creativity they can muster.

    The Manhattan Project was also an engrossing, intellectually stimulating, technologically high-end project which some of the participants greatly regretted being involved in once their product was used on Hiroshima and Nagasaki.

    I'm not claiming that ChatGPT, for example, is the same thing as a big city-destroying bomb, but the final uses to which this technology will be applied are not at all clear, and I don't trust corporations -- or their engineers -- to just do the right thing.

    The motto of 3M, one of our local corporate heroes, is "Innovative technology for a changing world". Sounds nice. They have made a lot of fine products more complex and important than Post-it Notes™. Many of their products contain PFAS. At one time PFAS was thought to be biologically inactive. Turns out it is not biologically inactive, and everything from polar bears to penguins now carry a load of the stuff. Plus, 3M dumped a lot of waste products from its production into landfills from which it promptly leaked out. They are paying out billions of dollars in fines and very expensive clean-up costs.

    Just an example. True enough, it's not quite the same as AI. But corporations regularly do things that turn out to be harmful--sometimes knowing damn well that it was harmful--and later try to avoid responsibility.
  • Mr Bee
    656
    To defend human creativity with that it's magic isn't enough.Christoffer

    Yeah, but I've never said it was magic. I just said I don't know enough about the processes in detail, but a first-glance impression gives off the feeling that they're different. I don't think ChatGPT is as intelligent as a human; it doesn't behave the way a human intelligence would. Can I explain the basis for that? No. Does that mean I think it's magic? Not at all.

    I find it a bit ironic that people don't want massive change while at the same time complain that nothing is done about the problems that actually exist in society.Christoffer

    The thing is that for a lot of people, generative AI seems like a solution looking for a problem. I think this video sums it up pretty nicely:


    If I have to be blunt, the benefits of these AI systems are potentially so massive that I couldn't care less about a minority of bitter artists who lost a job at a corporation that didn't even appreciate their contribution enough to value them staying.Christoffer

    According to some reports, AI could replace hundreds of millions of jobs. If it doesn't replace them with anything else, then brushing off the economic disruption to people's lives without considering policies like UBI is the sort of thinking that sets off revolutions.

    Is that a bad thing? Does anyone actually care about ads in terms of aesthetic appreciation? Or is everyone trying their best to get their adblockers to work better and remove them altogether? The working conditions for artists were already awful at these kinds of companies; maybe it's better that this part of the industry collapses and gets replaced by outputs of exactly the quality these uncaring CEOs deserve. Why are we defending shit jobs like these? They're the ones that will be replaced first.Christoffer

    I mean, the shoddy standards are gonna affect more than just ads. Expect people to nitpick every piece of media for bad instances of AI gone wrong. When you're too cheap to hire someone to create something, you're probably also too lazy to fix the inevitable problems that come with generated content.

    Companies who actually care about art but use AI will still need artists, they need their eyes to handle AI generative outputs and change according to plan and vision.Christoffer

    I can imagine the top art companies, like say a Pixar or a Studio Ghibli, focusing solely on human art, in particular because they can afford it. I don't see them relying on AI. Like a high-end restaurant that makes its food from scratch without any manufactured ingredients, they'll probably continue to exist.

    There will also be companies that use AI to a certain extent as well, and then companies that rely on it too much to the point of being low-end.

    Then start learning AI and be the artist whose expertise gives them an edge over the non-artists who are put to work with these AIs and who don't know what the word "composition" even means. It's an easier working condition, it's less stressful, and it's similarly "just a job". Why is anyone even crying over these kinds of jobs getting replaced by an easier form? It makes no sense, really. Why work like a slave because a CEO demanded 5 new concept art pieces over the weekend without caring how much work that would demand of a concept artist?

    A company hiring a non-artist to work with AI won't work, and they will soon realize this. You need the eyes, the ears and the poetic mind to evaluate and shape what the AI is generating; that's the skill the artist brings.
    Christoffer

    None of this really addresses their concern about financial stability. I fear that this new technology just gives more leverage over a group of people who have been historically underpaid. I hope it ends up well, but I'm not gonna brush off these concerns as luddite behavior.

    What you're saying is like saying that if an artist makes something that's not transformative enough, we should rule that the artist can never look at magazines or photographs again for inspiration and references.Christoffer

    Not at all. They're just not allowed to use their non-transformative output based on those references and expect to get paid for it. Like I said before, if you want to look at a bunch of copyrighted material and download it on your computer, that's fine since all you're doing is consuming.
  • jkop
    923
    Why are artists allowed to do whatever they want in their private workflows, but not these companies?Christoffer

    No one is allowed to do whatever they want. Is private use suddenly immune to the law? I don't think so.

    Whether a particular use violates the law is obviously not for the user to decide. It's a legal matter.
  • Christoffer
    2.1k
    However, corporations do have a fiduciary requirement to generate profit for their shareholders. So, making money is the name of the game. The people who make up corporations, especially when engaged in new, cutting edge projects, clearly have lots of intellectual interests aside from the fiduciary requirement. I'm sure developing AI is an engrossing career for many people, requiring all the ingenuity and creativity they can muster.BC

    Actually, OpenAI was purely open source before they realized they needed investments for the amount of computing power required to train their AI models. And the usage of copyrighted material wasn't decided on to make money, but to solve and improve the AI models they were attempting to make. There was no inflow of money until society realized what these models could do and their potential. So all the greed came after the first models were already trained, and so did all the criticism, basically inventing reasons to fight the technology by flipping the narrative into "us, the good-guy workers" versus "them, the greedy, soulless corporations". But I think everyone needs to chill, realize that reality isn't so binary, and focus criticism where it's valid and vital, not on fundamentally emotional outbursts and herd mentality.

    but the final uses to which this technology will be applied are not at all clear, and I don't trust corporations -- or their engineers -- to just do the right thing,BC

    Which is why we need regulations and laws, and I'm all for demanding a global agency with veto control over every AI company in the world and full insight into what's going on behind closed doors. A kind of UN for AI development. Because AI also has enormous benefits for us; in medical science we're talking about a possible revolution.

    This is why I'm opposed to the argument that AI training on copyrighted material is copyright infringement. Technically it's not, and ruling that it is would stop the development and progress of AI research and the tools we get from it. We need to keep developing these models; we just need better oversight on the other end, on the uses of them. And people need to understand the difference between what these two sides mean.

    In the end, if artists could stop acting like radicalized luddites and be part of figuring out the actual way forward, we're more likely to end up aligned on how these models should be used. They will become tools used by artists rather than replacements for them.

    Instead of destroying the machine, figure out the path into the future, because people are more inclined to listen to all the doom and gloom than to look closely at the actual benefits this technology can produce.

    Just an example. True enough, it's not quite the same as AI. But corporations regularly do things that turn out to be harmful--sometimes knowing damn well that it was harmful--and later try to avoid responsibility.BC

    Yes, and that's why I'm all for oversight and control of the "output" side of AI, for societal regulations and demands on these companies so that the usage benefits people rather than harms them. The problem is that these companies get attacked at the wrong end, at the point that could destroy the machines, out of pure emotional fear, radicalized by social media hashtag conflicts spiraling out of control.

    I was thinking of intention as in a desire to create something meaningful. An artist might not have any particular meaning in mind, or even if they do, it's somewhat irrelevant to the meaning the audience finds. So it's obvious that AI can kick ass creatively. In this case, all the meaning is produced by the audience, right?frank

    I'm using the term "intention" here as more of a technical distinction of agency in actions. Humans can decide something based on meaning, needs and wants, which does not exist within the AI systems; such decisions have to be input in order to get an output. They will not produce anything on their own. Even if I just type "generate an image of a bridge" without any more descriptive detail, it will essentially randomize decisions about composition, lighting, color and so on, but it doesn't do so based on wants, needs and emotions. The more I "input", the more of my subjective intention I bring into the system as guiding principles for its generation.
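    To make that point concrete, here is a deliberately simplified toy sketch. It is not how a diffusion model actually works internally; the attribute names and choices are made up for illustration. It only shows the logic being described: whatever the user specifies is carried through as intention, and whatever is left unspecified gets randomized by the system.

```python
import random

# Toy illustration only (not a real generative model): attributes the
# user doesn't specify in the prompt are filled in at random.
CHOICES = {
    "composition": ["centered", "rule-of-thirds", "wide shot"],
    "lighting": ["golden hour", "overcast", "night"],
    "color": ["muted", "vivid", "monochrome"],
    "style": ["photoreal", "painterly", "sketch"],
}

def generate(subject, **specified):
    """Return a 'generation plan': user intention where given, randomness elsewhere."""
    plan = {"subject": subject}
    for attr, options in CHOICES.items():
        # Specified attributes carry the user's intention into the output;
        # everything else is randomized without wants, needs or emotions.
        plan[attr] = specified.get(attr, random.choice(options))
    return plan

# Bare prompt: everything except the subject is left to chance.
minimal = generate("a bridge")

# Detailed prompt: more of the user's subjective intention survives.
detailed = generate("a bridge", lighting="golden hour", style="photoreal")
```

    The more keyword arguments passed in, the less the toy system decides on its own, which is the sense of "intention" used above.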

    The "creative process" within these AI systems is rather the technical and physical process, one similar to the neurological processes in our brain when we "generate" an image (picture it in your mind). My argument is that copyright cannot be applied to simulating that brain process as a neural network any more than to a camera replicating the act of seeing. A camera can be used to plagiarize, but blaming the camera manufacturer and its R&D for a user's plagiarism is not right.

    Yea, an AI artist could create a century's worth of art in one day. I don't really know what to make of that.frank

    Not really. A true artist takes time to evaluate and figure out the meaning of what they want to create and of what they've created. Someone just asking an AI to produce large quantities of something merely generates content.

    Content and art aren't equal. One is made purely for profit and filler, the other is a pursuit of meaning, and you can't pursue meaning in anything by focusing on quantity. Some people might find meaning in AI-generated content, and some people don't care about art enough to make a distinction, but how is that different from the gap between a long-running soap opera and 2001: A Space Odyssey? I for one, as an artist and thinker, don't care if trash content and quantity production in media gets replaced by AI. It'll give us more time to focus on actually creating art rather than being forced to churn out content like a factory.

    I don't know enough about the processes in detail, but just a first glance impression gives off the feeling they're different.Mr Bee

    This here is the problem with the overall debate in the world. People replace insight and knowledge with "what they feel is the truth". A devastating practice in pursuit of truth and a common ground for all people to exist on. Producing "truths" based on what feels right is what leads to conflict and war.

    Chat-GPT I don't think is as intelligent as a human is. It doesn't behave the way a human intelligence would. Can I explain what is the basis for that? No. Does that mean I think it's magic then? Not at all.Mr Bee

    No one is claiming that either. Why is this a binary perspective for you? Saying that ChatGPT simulates certain systems in the brain does not mean it is as intelligent as a human. But it's as if it either has to be a dead cold machine or equal to human intelligence, with no middle ground? ChatGPT is a middle ground; it mimics certain aspects of the human brain in terms of how ideas, thoughts and images are generated out of neural memory, but it does not reach the totality of what the brain does.

    The things is that for alot of people generative AI seems like a solution looking for a problem.Mr Bee

    Maybe because they can't think past shallow internet conflicts on Twitter and look deeper into the topic.

    I think this video sums it up pretty nicely:Mr Bee

    It summarizes one thing only: the shallow, surface-level media coverage of AI. Of course these tech companies hyperbolize things, when much of the investment money pouring into the industry comes from gullible people who fall for these promises.

    That doesn't mean, however, that there isn't tremendous potential in AI technologies. We're already seeing it within the medical sciences with systems like AlphaFold 3.

    The inability of people to see past tech-company hyperbole and shallow media coverage of this technology is what drives this hashtag, Twitter-level conflict over AI. All of it is radicalizing people into one camp or the other, into either side of the conflict. It is utterly stupid and filled with bullshit that stands in the way of an actual productive conversation about how to develop and use this new computational tech into the future.

    Such conversations exist, but they get drowned in all the bullshit online.

    According to some reports, AI could replace hundreds of millions of jobs. If it doesn't replace it with anything else, then to brush off the economic disruption to people's lives without considering policies like UBI is the sort of thinking that sets off revolutions.Mr Bee

    Of course, no one truly engaged in the real productive discourse about AI ignores this issue.

    However, while people fear the economic disruption, the world is also worried about how work is stressing us to death, how people have lost a sense of meaning through meaningless jobs, and how we're existentially draining ourselves into a soulless existence. So why would such disruption be bad?

    If anything, it would effectively drive society towards systems like UBI, and it would force society to question what a "good job" really is.

    It would push us towards breaking a Hegelian master/slave situation (loosely interpreted in this context), in which the masters replace workers with robots and AI, driving production costs down to extremely low numbers and increasing their revenue. But who's gonna buy anything when no one has the income to buy their products?

    It's ironic that people complain about these AI companies through the perspective of capitalist greed when the end result of massive societal disruption through AI would effectively be the end of capitalism as we know it, since it removes the master/slave dynamic.

    Any nation that spots these disruptions will begin to incorporate societal changes to make sure living conditions continue to be good for the citizens. And instead of working ourselves to death with meaningless jobs, people may find time to actually figure out what is meaningful for them to do and focus on that.

    Because people don't just want factory-manufactured things. There are millions of companies in the world that don't expand into factory production and low-wage exploitation, partly on moral grounds, but also because their product is sold specifically as "handmade".

    It's as if people in the AI discourse aren't actually thinking through the nuances of what a post-disruption society would look like. They end their thinking at the disruption; they can only think through the perspective of free-market capitalism and are unable to picture society past this system. Maybe their lives are so ingrained in capitalism that they're unable to think in other perspectives, but the truth is that we may very well see a totally new kind of society that hasn't been realized yet, even theoretically, because it relies on a foundation of absolute automation of all sectors, a condition that has never been on the table before.

    This is why most science fiction depictions of the future are rather simplistic about the implications of disruptive technology, ignoring the butterfly effects of new technologies emerging.

    Just think about the industrial revolution; all of this with AI is history repeating. We see a threat to workers of a previous kind, but also an opening for new types of jobs, and living conditions rose through the industrial and post-industrial revolutions. And the discussion has ALWAYS been the same, with people believing that the "new technology" spells the end for humanity, because they cannot fathom the actual consequences and instead always view things through the lens of destruction and oblivion.

    We already have a capitalist machinery that is destroying people, both in low-wage nations and in more westernized societies. Disrupting this is not a bad thing. But we have to make sure to focus on aligning society towards better conditions for people, and that's impossible if people just hold shouting matches about who's worst and complain about every new technology that gets invented.

    I'm rather optimistic about a future that has, through automation, disrupted capitalism as we know it. I think people make the shallow assumption that the rich would just continue to accumulate wealth indefinitely, which is technically impossible, especially in a world in which the gears of capitalism have essentially broken down. And people forget that progress so far has given ordinary people living conditions that were available only to royalty as recently as 100 years ago. What the tech gurus are hyping in the media is totally irrelevant; they're speaking to the gullible masses and to investors, not taking part in the real discussions about the future. A future that, I think, people load with the same kind of doom and gloom that people in every period of innovation and progress have.

    I, however, see a technology that can revolutionize aspects of our lives in ways that will make them better, even if that means everyone's out of a job. Because then we can focus on living life together instead of wasting our lives being slaves for masters. Essentially, the question becomes: do you want to continue with the world as it is today? Or would the world be better off if soul-crushing work gets demolished by automation, leaving everyone to mostly benefit from such an automated society?

    When you're too cheap to hire someone to create something then you're probably also too lazy to fix the inevitable problems that comes with generated content.Mr Bee

    Still doesn't change the fact that these lazy CEOs and companies were treating artists badly before firing them due to AI. I don't think this is a loss for artists. Trash commercials aren't something I value highly, and I actually think it's better for the soul of the artist not to work at such places. I've seen many artists simply stop making art altogether after a few years of such bullshit. That's killing artists more than any AI does.

    I can imagine the top art companies, say a Pixar or a Studio Ghibli, focusing solely on human art, in particular because they can afford it. I don't see them relying on AI. Like a high-end restaurant that makes its food from scratch without any manufactured ingredients, they'll probably continue to exist.

    There will also be companies that use AI to a certain extent as well, and then companies that rely on it too much to the point of being low-end.
    Mr Bee

    But AI will also lower costs for those who focus on human-created art. The reason the official debate about AI is so misleading and bad is that people believe AI tools are only generative in terms of images or video. But there are other AI tools that speed up the process of making art while leaving the artist in full control, like rotoscoping, or isolating elements in sound. Or take an artist at a game company who right now is required not only to create the design for some asset, like a "building" on a game map, but to create maybe 20 versions of the same design, something that could very well take upwards of a few months. Now there are technologies that let them focus on just one building, really getting into depth and detail with that design without stressing through it, and then have an AI iterate different versions of it without the artist losing control.

    Such tools enhance artists' control and give them more time to really polish their creations, so even human-created art will benefit. And we're not criticizing art created today for being "cheated" with the current tools of the trade. No one is shedding tears because rotoscoping has become trivial; it's a time-consuming, soul-crushing process that wastes artists' time and investors' money. In animation work, like at Pixar, they already use a lot of AI-assisted tools that make animation more fluid and physically accurate, because such parts of the process have always been a chore.

    But many artists seem not to understand this and believe it's all about generating images and video. Those aspects, in their simplest form, are just consumer-grade toys, or content creation similar to Shutterstock. The true value of these tools lies in working in tandem with other tools, enhancing the work of actual artists rather than replacing them.

    None of this really addresses their concern about financial stability. I fear that this new technology just gives more leverage over a group of people who have been historically underpaid. I hope it ends up well, but I'm not gonna brush off these concerns as luddite behavior.Mr Bee

    But what you describe is exactly what the luddite situation was during the industrial revolution, and their lives became better. Why are we criticizing this disruption by defending an industry that was underpaying artists and using them to the brink of giving up art altogether?

    It's oddly ironic that people argue against such disruption by wanting to maintain previously bad conditions, effectively criticizing the greed of these capitalists while framing the criticism within the context of making money, even when the conditions are horrible.

    Would you argue against automation in low-wage sweatshops that use kids as their workforce? People's arguments amount to saying that we shouldn't improve conditions there with automation, because then the workers would get no pay at all, as if some pay while working themselves to death were better than none. Really?

    That's an awful defense of keeping the status quo. Bad working conditions disappearing due to automation is a good thing.

    Not at all. They're just not allowed to use their non-transformative output based on those references and expect to get paid for it. Like I said before, if you want to look at a bunch of copyrighted material and download it on your computer, that's fine since all you're doing is consuming.Mr Bee

    Yes, consuming like the training process does with these models. It's a training process, like training your brain on references and inspirations, filling your brain with memories of "data" that will be used within your brain to transform into new ideas, images or music. This is what I'm saying over and over: don't confuse the output of these AI models with the training process. Don't confuse the intention of the user with the training process. You seem able to make that distinction for a human, but not for the AI models. You seem to forget the "user" component of generative AI and the "user decision" of how to use the generated image. Neither has anything to do with the training process.

    No one is allowed to do whatever they want. Is private use suddenly immune to the law? I don't think so.

    Whether a particular use violates the law is obviously not for the user to decide. It's a legal matter.
    jkop

    If the private use is within law and identical to what these companies do, then it is allowed, and that also means that the companies do not break copyright law with their training process. If you confuse the output, and the use of the output, with the private workflow of the artist or the training process of an AI, then you are applying copyright law wrongly, effectively applying infringement to the workflow and not the result.

    I don't understand why this is so confusing for you. You just appeal to it being a "legal matter", but legal matters operate on an established consensus of where specific laws apply. Claiming that something is illegal when it technically isn't does not make it illegal, regardless of people's feelings about it or their personal wish that it should be. That's not how laws work.
  • frank
    16k
    The more I "input", the more of my subjective intention I bring into the system as guiding principles for its generation.Christoffer

    This can be true, but not necessarily. Look at this image:

    hSlhbQd.jpeg

    All sorts of meaning could be projected onto it, but my intention probably wouldn't show up anywhere because of the way I made it. The words I entered had nothing to do with this content. It's the result of using one image as a template and putting obscure, meaningless phrases in as prompts. What you get is a never-ending play on the colors and shapes in the template image, most of which surprise me. My only role is picking what gets saved and what doesn't. This is a technique I used with Photoshop, but the possibilities just explode with AI. And you can put the AI images into Photoshop and then back into the AI. It goes on forever.

    I think the real reason art loses value with AI is the raw magnitude of output it's capable of.
  • jkop
    923
    If the private use is within law and identical to what these companies do, then it is allowed, and that also means that the companies do not break copyright law with their training process.Christoffer

    You claim it's identical merely by appeal to a perceived similarity to private legal use. But being similar is neither sufficient nor necessary for anything to be legal. Murder in one jurisdiction is similar to legal euthanasia in another. That's no reason to legalize murder.

    Corporate engineers training an AI system in order to increase its market value is obviously not identical to private fair use such as visiting a public library.

    We pay taxes, authors or their publishers get paid by well established conventions and agreements. Laws help courts decide whether a controversial use is authorized, fair or illegal. That's not for the user to decide, nor for their corporate spin doctors.
  • chiknsld
    314
    If you do not want the AI to know particular information, then simply do not put that information on the internet.

    I'll give you an example: You are a scientist working on a groundbreaking theory that will win you the Nobel Prize (prestige) but the AI is now trained on that data and then helps another scientist who is competing with you from another country. This would be an absolutely horrible scenario, but is this the AI's fault, or is it the fault of the scientist who inputted valuable information?

    In this scenario, both scientists are working with AI.

    We all have a universal responsibility to create knowledge (some more than others) but you also do not want to be a fool and foil your own plans by giving away that knowledge too quickly. It's akin to the idea of "casting pearls". :nerd:
  • Christoffer
    2.1k
    All sorts of meaning could be projected onto it, but my intention probably wouldn't show up anywhere because of the way I made it. The words I entered had nothing to do with this content. It's the result of using one image as a template and putting obscure, meaningless phrases in as prompts. What you get is a never-ending play on the colors and shapes in the template image, most of which surprise me. My only role is picking what gets saved and what doesn't. This is a technique I used with Photoshop, but the possibilities just explode with AI. And you can put the AI images into Photoshop and then back into the AI. It goes on forever.frank

    Intention is still just my technical term for your guiding of the model. Left to generate without any input, it will just randomize in meaningless ways. You could end up with something you can project meaning onto, but that is not the same as meaning in the creation.

    Even random words in the prompt are still intention: your choice of words, your choice to rely only on intuition. It's still a guiding principle that the AI model can't decide on its own.

    Such exploration through random choices is still guidance. And we've seen that in traditional art as well, especially abstract art.

    I think the real reason art loses value with AI is the raw magnitude of output it's capable of.frank

    You can say the same about photography. Why is street photography considered art? Its output can't be replicated: instant snapshots, thousands upon thousands of photos, and choosing one.

    Art is the interplay between the artist and the audience, the invisible communication through creation, about anything from abstract emotions to specific messages. The purpose behind a created artwork is the narrative that defines its value. If all you do is generate images through randomization, then what is the purpose you're trying to convey?

    Aesthetic appreciation can be found in whatever you find beauty in, but that doesn't make it art. A spectacular tree in a spectacular place in spectacular light can make someone cry in wonder at its beauty, but it isn't art. The "why" behind doing something is just as important with randomized processes as with intentional ones. Your choice to rely on intuition is part of the communicative narrative.

    But if you don't have that and just generate quantity upon quantity of images until you find something you like, then that's content production. You're not conveying anything; you're exploring a forest in search of that beautiful tree in that spectacular location. Content that is beautiful, but not really art, because the purpose behind its creation isn't there.

    You claim it's identical merely by appeal to a perceived similarity to private legal use. But being similar is neither sufficient nor necessary for anything to be legal. Murder in one jurisdiction is similar to legal euthanasia in another. That's no reason to legalize murder.jkop

    This is simply a false-analogy fallacy: an absurd comparison of two vastly different legal situations.

    The similarity lies precisely in the private use. You can't argue that my use of copyrighted material in private is any less valid than a tech company working with copyrighted material in private. The only point at which you can call copyright law into question is by examining the output, what is produced out of the work done in private.

    If you say that the use of copyrighted material by these companies in their private workflow is illegal, then why is it legal when I use copyrighted material in my private workflow? What is the difference? That it's a tech company? If I run an art company and use copyrighted material in my workflow, but it doesn't show up in the creations I provide to clients, shouldn't that be considered illegal as well?

    You're just saying there's a line, without a clear definition of what that line is or where it's drawn. In order to rule something illegal, you absolutely have to be able to draw a line based on facts and cold hard truth to the extent that's possible. And where it isn't possible, rulings will favor the accused, since the crime cannot be proven beyond doubt.

    Corporate engineers training an AI system in order to increase its market value is obviously not identical to private fair use such as visiting a public library.jkop

    That's just a straw-man misrepresentation of the argument. You're inventing a specific purpose (your personal opinion of what they do) and then comparing it to an analogy I made for the purpose of showing the similarities between the physical process in the brain and the technology.

    On top of that, it's a misrepresentation of why such data was used. It has nothing to do with increasing market value; it's a method developed over years of machine-learning research. Increasing the amount of training data required using copyrighted material in order to reach those magnitudes of quantity, and this practice was established long before there was any notion of a gold rush in this industry. So you're still just making arguments from the preconceived idea and ideology that these companies are evil, not from any actual grounds for conclusions in the context of legal matters.

    The arguments you make here don't honestly engage with the discussion; you're simply ignoring the arguments that have been made, or you don't care because you just fundamentally disagree. Neither of which is valid support for your opinion, and it ends up being an emotional response, which isn't enough in matters of what is legal or not.

    We pay taxes, authors or their publishers get paid by well established conventions and agreements. Laws help courts decide whether a controversial use is authorized, fair or illegal. That's not for the user to decide, nor for their corporate spin doctors.jkop

    What's your point? You still need to prove that the training process is illegal. Instead you're basically just saying "corporations BAD, we GOOD". Describing how laws and courts help decide something doesn't change anything about what I've been talking about. Quite the opposite: I've been saying over and over that proving "theft" in court is close to impossible in the context of training these models, because the process is too similar to what artists already do in their workflow. Because of this, saying that "courts will decide" is meaningless as a counter-argument. Yes, their rulings will decide, but as I've shown with examples, there have been court rulings that seemed like clear cases of infringement and still ended with the accused being freed, and those were cases in which the accused was clearly established to have used others' materials directly in the finished work.

    If any court anywhere is to prove "theft" in the process of AI training, it has to prove that these models have the files they were trained on within them, which they don't. And if a court decides that a neural memory is the same as storing files on a hard drive, then what does that mean for a person with photographic memory, given that our brain uses a similar method of neural pathways for storage? The problems with proving "theft" pile up to the point that it would most likely not hold in court beyond doubt.
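    The claim that a trained model retains parameters rather than the training examples themselves can be sketched with a deliberately simplified, hypothetical illustration: ordinary least-squares line fitting, not a neural network, so the analogy is loose, but the structural point carries over.

```python
# Deliberately simplified illustration (least-squares line fit, not a
# neural network): after "training", the model consists of two fitted
# numbers; the training examples are not stored anywhere in it.
def fit_line(points):
    """Fit y = a*x + b by ordinary least squares; return only (a, b)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# "Training data": 1,000 points lying on the line y = 2x + 1.
data = [(x, 2 * x + 1) for x in range(1000)]

model = fit_line(data)  # the whole model is just two numbers
a, b = model

# The parameters generalize beyond the data, but the 1,000 original
# points cannot be recovered from them.
prediction = a * 5000 + b  # extrapolates past anything "seen" in training
```

    A real neural network has millions or billions of parameters instead of two, but the same asymmetry holds: the weights are a lossy distillation of the data, which is the core of the argument about what the model actually "contains".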

    If you do not want the AI to know a particular information, then simply do not put that information onto the internet.chiknsld

    This is true even outside anything about AI. People seem to forget that websites showing something publicly, outside of private conversations, like these posts and this thread, are public information. The act of writing here means I accept that my text is part of the public record that anyone can access, read or save. I still own the copyright to my text without it needing any stamp or registration, but if someone downloads this text to use while writing an essay on copyright law and AI, I can't stop them. If they quote me without citation, however, that is plagiarism.

    That's why AI alignment in outputs needs to evolve in order to align with the same obligations humans exist under.

    I'll give you an example: You are a scientist working on a groundbreaking theory that will win you the Nobel Prize (prestige) but the AI is now trained on that data and then helps another scientist who is competing with you from another country. This would be an absolutely horrible scenario, but is this the AI's fault, or is it the fault of the scientist who inputted valuable information?

    In this scenario, both scientists are working with AI.

    We all have a universal responsibility to create knowledge (some more than others) but you also do not want to be a fool and foil your own plans by giving away that knowledge too quickly. It's akin to the idea of "casting pearls"
    chiknsld

    If you use an AI in that manner, you have accepted that the AI is using your information. There are AI models tuned to keep data private; otherwise they couldn't be used in fields like science.

    That's more of a question of being irresponsible to yourself.

    In terms of science, however, one area that's seen huge improvement with AI is models trained on publications. The speed at which someone can do research and find good sources for citations has increased so much that it's already speeding up research worldwide.
  • chiknsld
    314
    In terms of science, however, one area that's seen huge improvement with AI is models trained on publications. The speed at which someone can do research and find good sources for citations has increased so much that it's already speeding up research worldwide.Christoffer

    Indeed, that is splendid! Hopefully real progress is on the way (cures for cancer, the creation of AGI, breaking down unknown barriers of physical reality, etc.) :cheer:
  • Mr Bee
    656
    This here is the problem with the overall debate in the world. People replace insight and knowledge with "what they feel is the truth". A devastating practice in pursuit of truth and a common ground for all people to exist on. Producing "truths" based on what feels right is what leads to conflict and war.Christoffer

    I said I was withholding my judgement. I have never claimed that the case is definitively settled, for the reasons I've mentioned before regarding the state of our knowledge of both neuroscience and computer science. You clearly seem to disagree, but I suppose we can just agree to disagree on that.

    No one is claiming that either. Why is this a binary perspective for you? Saying that ChatGPT simulates certain systems in the brain does not mean it is as intelligent as a human. But it's as if it either has to be a dead cold machine or equal to human intelligence, with no middle ground? ChatGPT is a middle ground; it mimics certain aspects of the human brain in terms of how ideas, thoughts and images are generated out of neural memory, but it does not reach the totality of what the brain does.Christoffer

    I have never said it was binary. I just said that whatever the difference is between humans and current AI models, it doesn't need to be something magical. You were the one who brought up magic and dualism in portraying the people who argue that there is something missing in AI.

    In any case, you seem to agree that there is a difference between the two as well. The question now is what that means with regard to AI's "creativity" and whether it is "creative" like we are. Again, I think this is a matter where we'll have to agree to disagree.

    However, while people fear the economic disruption, the world is also worried about how work is stressing us to death, how people have lost a sense of meaning through meaningless jobs and how we're existentially draining ourselves into a soulless existence.Christoffer

    Work in and of itself isn't the problem. One may argue that for some, work is what gives their life meaning; it's unhealthy working conditions that are the bigger problem.

    It's ironic that people complain about these AI companies through the perspective of capitalist greed when the end result of massive societal disruption through AI would effectively be the end of capitalism as we know it, since it removes the master/slave dynamic.Christoffer

    It also removes the leverage that workers usually have over their employers. In a society that is already heavily biased towards the latter, what will that mean? That's a concern that I have had about automation, even before the advent of AI. It honestly feels like all the rich folk are gonna use automation to leave everyone else in the dust while they fly to Mars.

    Any nation who spots these disruptions will begin to incorporate societal changes to make sure living conditions continue to be good for the citizens.Christoffer

    History shows that that is rarely how things go. If that were the case then wealth inequality wouldn't have been so rampant, and we would've solved world hunger and climate change by now. People are very bad at being proactive, after all. It's likely that any necessary changes will only happen once people reach a breaking point and start protesting.

    And instead of working ourselves to death with meaningless jobs, people may find time to actually figure out what is meaningful for them to do and focus on that.Christoffer

    The irony is that it seems like AI is going after the jobs that people found more meaningful, like creative and white collar jobs. It's the more monotonous blue collar jobs that are more secure for now, at least until we see more progress in the realm of robotics. Once automation takes over both, I don't know where that will leave us.

    I'm rather optimistic about a future that has, through automation, disrupted capitalism as we know it.Christoffer

    We'll see. Maybe I've just been a pessimist recently (nothing about what's currently going on in the world is giving me much to be hopeful about), but I can just as easily see this going in a dystopian direction. Maybe it's because I've watched one too many sci-fi movies. Right now, assuming people get UBI, Wall-E is what's on my mind.

    Still doesn't change the fact that these lazy CEOs and companies were treating artists badly before firing them due to AI. I don't think this is a loss for artists.Christoffer

    Certainly a lot of them don't value their workers on a personal level a lot of the time, but I'd distinguish that from abuse. Of course, that isn't really the main concern here.

    Yes, consuming like the training process does with these models. It's a training process, like training your brain on references and inspirations, filling your brain with memories of "data" that will be used within your brain to transform into new ideas, images or music. This is what I'm saying over and over. Don't confuse the output of these AI models with the training process. Don't confuse the intention of the user with the training process. Because you seem to be able to make that distinction for a human, but not for the AI models. You seem to forget the "user" component of the generative AI and the "user decision" of how to use the generated image. Neither has anything to do with the training process.Christoffer

    I mean, if you want to just train a model and keep it in your room for the rest of its life, then there's nothing wrong with that, but like I said, that's not important. None of what you said seems to undermine the point you're responding to, unless I am misreading you here.
  • Christoffer
    2.1k
    I said I was withholding my judgement. I have never claimed that the case is definitively settled for the reasons I've mentioned before regarding the state of our knowledge about both neuro and computer science. You clearly seem to disagree but I suppose we can just agree to disagree on that.Mr Bee

    The problem is that it influences your conclusions overall. It becomes part of the premises for that conclusion, even though it should be excluded.

    I have never said it was binary. I just said that whatever the difference is between humans and current AI models, it doesn't need to be something magical.Mr Bee

    There are more similarities than there are differences within the context of neural memory and pathway remembering and generation. Pointing out the differences is like excluding a white birch from the category of trees because its bark is white and not brown like the others'. Just because there are differences doesn't mean they're categorically different. Storage and distribution of copyrighted material rely heavily on proof that such files are stored with the intention of spreading them, or on functions that focus on spreading; neither of which is going on within these models.

    In the end, it doesn't even have to be that complex. Using copyrighted material in private, regardless of whether a company or a private person does it, is not breaking copyright. Ruling that it is would lead to absurd consequences.

    In any case, you seem to agree that there is a difference between the two as well. Now the question is what that means with regards to its "creativity" and whether the AI is "creative" like we are. Again, I think this is a matter we'll have to agree to disagree on.Mr Bee

    "Creative like we are" has two dimensions, depending on which totality of the process you speak of. "Creative like our brain's physical process" is what I'm talking about, not "creative as a subjective entity with intent".

    One may argue that for some, work is what gives their life meaning. It's unhealthy working conditions that are the bigger problem.Mr Bee

    Stress levels are at all-time highs; working conditions aren't stretched from good to bad, but from bad to worse. A society that focuses on finding meaningful jobs needs a foundation for them, and right now there's no such foundation. Instead, the foundation is free-market capitalism, an entity that grows by making everything as efficient as possible and reducing costs as much as possible. It is impossible to build long-term, sustainable low-stress environments within it.

    It also removes the leverage that workers usually have over their employers. In a society that is already heavily biased towards the latter, what will that mean? That's a concern that I have had about automation, even before the advent of AI. It honestly feels like all the rich folk are gonna use automation to leave everyone else in the dust while they fly to Mars.Mr Bee

    As I said, wide automation breaks the Hegelian master/slave concept of capitalism.

    The scenario you describe cannot happen, as the distinction of "rich" only has meaning within a capitalist system. Meaning, someone needs to buy the products that make the rich rich. They might accumulate wealth, they might even fly to Mars, but so what? You think the world stops spinning because of that? The engineers and workers still exist and can work on something else; new leaders and new ideologies emerge among the people who are left, new strategies and developments occur.

    This is why I feel all debates about automation, the rich and similar topics stay at a shallow level: people shouting hashtags online instead of digging deeper into proper research about the consequences of automation. It's mostly just "the rich will just do whatever they want and everyone else is fucked". If people decide they're just fucked and don't care anymore, then they're part of the problem they complain about. People can work together and do something instead of just hashtagging bullshit online.

    History shows that that is rarely how things go. If that were the case then wealth inequality wouldn't have been so rampant, and we would've solved world hunger and climate change by now. People are very bad at being proactive, after all. It's likely that any necessary changes will only happen once people reach a breaking point and start protesting.Mr Bee

    I'm talking about nations, not the world. A single nation that experiences a collapse in the free market and the usual workforce will have to change its society in order to make sure its citizens don't overthrow the government. The inability of nations to solve world problems has nothing to do with national pressures to change their own society; such changes can happen overnight, and do so when society is under pressure or in collapse.

    And the problems with world hunger and climate change mostly have to do with dictators and corrupt people at the top of the governments in nations that are either struck by poverty or are the worst offenders on climate change.

    We can also argue that automation creates a situation in which the means of production get so cheap that expanding food production to poor nations isn't a loss for richer nations. Again, the consequence analysis of automation seems to stop at a shallow level.

    The irony is that it seems like AI is going after the jobs that people found more meaningful, like creative and white collar jobs. It's the more monotonous blue collar jobs that are more secure for now, at least until we see more progress in the realm of robotics. Once automation takes over both, I don't know where that will leave us.Mr Bee

    As I've said numerous times in this thread, creating art and being an artist won't go away because the meaning of creating art is not the same as creating content. Content creation is meaningless, it's the business of creating design for the purpose of profit. Automation of content creation won't render creation of art meaningless. And art and AI can be fused in terms of better tools for artists.

    Full wide automation of most jobs in society leads to what I'm describing. A collapse of capitalism as we know it. And it's up to each nation to decide on how to deal with that.

    Future meaningful jobs or rather tasks within a fully automated society are those that will focus on the value of individual thought and ideas for knowledge and art, as well as production that gains a value of meaning through being handmade.

    It's just that people are so indoctrinated into the capitalist free market mindset that they mistake capitalist mechanics for being an integral part of human existence. It's not.

    We'll see. Maybe I've just been a pessimist recently (nothing about what's currently going on in the world is giving me much to be hopeful about), but I can just as easily see this going in a dystopian direction. Maybe it's because I've watched one too many sci-fi movies. Right now, assuming people get UBI, Wall-E is what's on my mind.Mr Bee

    Wall-E is an apt dystopian vision of the future. I don't think any of this will lead to some post-apocalyptic reality or war torn destroyed world. We will more likely enter a bright future in which we are physically healthy in a clean environment that's ecologically sound. But instead so intellectually unstimulated that we're just mindlessly consuming endless trash content...

    ...but how is that any different from today? In which the majority just mindlessly consume trash, eat trash and do nothing of actual meaning, either for themselves or others. While some try to find meaning, try to focus on exploration of meaning in their lives through art, science and knowledge.

    The only difference would be that such a future society would make the consumers fully focused on consumption and those interested in meaning, fully focused on meaning. Instead of everyone being part of the machine that creates all the trash.

    Certainly a lot of them don't value their workers on a personal level a lot of the time, but I'd distinguish that from abuse. Of course, that isn't really the main concern here.Mr Bee

    I'm not talking about abuse, I'm talking about how a lack of interest in the art that's created leads to an experience among artists of making meaningless stuff. The same lack of interest in what they do that makes these companies turn to AI is what drains artists of a sense of meaning. So who cares if they replace their artists with AI? Most of those artists may need that to rattle them out of such a destructive co-dependent relationship with their current job, to get going and find meaning elsewhere, or to join a company that values their input. And even if AI takes over entirely, you still need artists' eyes, ears and minds to evaluate what's created and guide it... they will always have a job.

    The same can't be said of white collar managers, and soon also blue collar workers, as the robots get better every day. But the situation is the same for them as well. A blue collar worker works themselves into medical costs that ruin them, for companies that build and make stuff but will never credit the workers. Let them get to build something with their hands for companies that appreciate that level of meaning instead.

    I mean, if you want to just train a model and keep it in your room for the rest of its life, then there's nothing wrong with that, but like I said, that's not important. None of what you said seems to undermine the point you're responding to, unless I am misreading you here.Mr Bee

    You are still confusing the training process with the output. You can absolutely use a model officially, as long as it's aligned with our copyright laws. If a user of that model then misuses it, that's not the model's or the engineers' fault.
  • frank
    16k
    Such exploration through random choices is still guidance. And we've seen that in traditional art as well, especially abstract art.Christoffer

    That's true. So AI becomes another tool, not a competing intellect.

    Art is the interplay between the artist and the audience, the invisible communication through creation, about anything from abstract emotions to specific messages. The purpose of a created artwork is the narrative defining the value of it. If all you do is generating images through randomization, then what is the purpose you're trying to convey?Christoffer

    My view is based on people seeing my work and reading complex messages into it. And this is painting, not AI art. I'm interested in their interpretations. My own intentions are fairly nebulous. Does that mean I'm not doing art? I guess that's possible. Or what if communication isn't really the point of art? What if it's about a certain kind of life? A work comes alive in the mind of the viewer. In a way, Hamlet is alive because you're alive and he's part of you. I'm saying: what if the artist isn't creating an inanimate message, but rather a living being?
  • Christoffer
    2.1k
    So AI becomes another tool, not a competing intellect.frank

    For anyone using ChatGPT for anything other than just playing around, it's clear that it's a tool that requires a bit of training to get something out of. I find it quite funny whenever I see journalists and anti-AI people just using it in the wrong way in order to underscore how bad it is or how stupid it is.

    For instance, a lot of the demonstrations of plagiarism have been elaborate attempts to push a system until it breaks and plagiarizes in return. It's good to do this in order to fine-tune the system so it doesn't accidentally end up in such a position, but as a demonstration of plagiarism it's quite dishonest.

    It would be like playing elaborate tricks on a commissioned painter: lying to them about the intent of the artwork being created, saying that it's just a copy of another painter's work for the sake of playing around, for private use, etc., in order to trick the painter into plagiarism so that you can then publicly blame the painter for being a plagiarist.

    My view is based on people seeing my work and reading complex messages into it. And this is painting, not AI art. I'm interested in their interpretations. My own intentions are fairly nebulous. Does that mean I'm not doing art? I guess that's possible. Or what if communication isn't really the point of art? What if it's about a certain kind of life? A work comes alive in the mind of the viewer. In a way, Hamlet is alive because you're alive and he's part of you. I'm saying: what if the artist isn't creating an inanimate message, but rather a living being?frank

    The "communication" does not have to be known. It's more basic and fundamental than something like a "message". The "communication" and "intent" have more to do with your choices, and those "choices" during your creative work become the "intentions" and "communication" that will later be interpreted by the receiver and audience.

    For instance, a painter who creates abstract art might not know at all why she makes something in a certain way, but the choices are made in letting that instinct go wild and letting the sum of her subjective identity as a human evolve through that work.

    Similar to if someone has an outburst in an argument which forms from that person's subjective identity and experience, influencing the receiver in that argument to react or interpret. Art is a form of human communication that is beyond mere words or description; rather it is a form of transforming the interior of the subjective into observable and interpretable form for others.

    This is why a world without art leads to cold interactions, conflicts and war: in such a world there are no such levels of communication between subjective internal perspectives to form a collective consciousness. It's also why dictators fear art so much.