Fair, but my point was: why do we need to understand how consciousness works to see that the way consciousness creates art is not very comparable to a machine recognizing patterns?
The commenter above compared inspiration to the way AI companies are using the labor of millions of artists for free. In that context, I assume this is what they were hinting at when they responded to “AI is not being inspired” with “we don’t know how consciousness works.”
Because one is a black box that may very well just be a much more advanced version of what current AI does. We don’t know yet. It’s possible that, given the trillions upon trillions of moments of experience a person accumulates as training, an AI could be comparable.
I mean, the likelihood is basically zero, but it’s impossible to prove the negative. At the end of the day, our brains are just messy Turing machines with a lot of built-in shortcuts. The only things that set us apart are how much more complicated they are and how much more training data they receive. Unless we can crack consciousness, it’s very possible that some day we will build an incredibly rudimentary AGI without even realizing it works the same way we do, scaled down. But without truly knowing how our own brain works, how can we begin to claim it does or doesn’t work like something else?
Are you seriously suggesting that human creativity works by learning to reduce the amount of random noise it outputs by mapping words to patterns?
Is that what they suggested? Or are you just wanting to be mad about something?