I don’t get this. AI bros talk about how “in the near future” no one will “need” to be a writer, a filmmaker or a musician anymore, as you’ll be able to generate your own media with your own parameters and preferences on the fly. This, to me, feels like such an insane opinion. How can someone not value the ingenuity and creativity behind a work of art? Do these people not see or feel the human behind it all? And are these really opinions that you’ve encountered outside of the internet?
My daughter (15f) is an artist and I work at an AI company as a software engineer. We’ve had a lot of interesting debates. Most recently, she defined Art this way:
“Art is protest against automation.”
We came up with some examples together.
I defined Economics this way:
“Economics is the automation of what nature does not provide.”
An example:
Jobs are created in one of two ways: either by destroying the ability to automatically create things (smashing looms, say), or by making people want new things (e.g. the jobs created around farming Eve Online Interstellar Kredits). Whenever an artist creates something new that has value, an investor will want to automate its creation.
Where Art and Economics fight is over automation: Art wants to find territory that cannot be automated. Economics wants to discover ways to efficiently automate anything desirable. As long as humans live in groups, I suppose this cycle does not have an end.
Art is subjective, and AI is a buzzword; even plain if statements get called AI, especially in the gaming world.
And in the current state of LLMs, the smartest and brightest in the industry have only managed to produce utter trash, while sacrificing the planet and its inhabitants. I like your daughter more. She will create more value, and at the same time she won't be a total corporate tool ruining the planet for generations to come. Mad respect.
(not calling you a tool, but people who work with LLMs)
I do work with LLMs, and I respect your opinion. I suspect if we could meet and chat for an hour, we’d understand each other better.
But despite the bad, I also see a great deal of good that can come from LLMs, and AI in general. I appreciated what Sal Khan (Khan Academy) had to say about the big-picture view:
https://www.ted.com/talks/sal_khan_how_ai_could_save_not_destroy_education?subtitle=en