In the latest episode of my podcast, "Music and Meaning" (Tuesday, December 19th), I tackle the ubiquitous headline topic of Artificial Intelligence (AI) and its impact on music.
Since everyone is talking about it everywhere, why listen to my pontifications? Because I've been here before, during the last major technological disruption in the arts in the late 1970s and early '80s. Remember big hair and big electronic snares?
"The first time I heard a soprano sax sample, without questioning it, my friend Brent and I immediately put it on a record. 'It sounds just like a soprano sax!' That's what I said. 9 months later, after the record was in the stores, it sounded like a synthesized sax with a tiny bit of real saxophone DNA. I'd been duped."
"Chatbots and Sexy Synth Saxophones" kicks off with Elon Musk's AI warnings and the tech industry's temporary pause on advancements like ChatGPT-4, then draws a parallel between AI's unchecked growth and the biblical Tower of Babel, posing sharp ethical questions. I reminisce about my early days of tech adoption as a young music producer, from Moog synthesizers to digital samplers and drum machines, emphasizing how I learned a tough lesson: These tools came with an ideology of their own! I got played before I played them. And the whole AI thing is no different. We can learn from the past how to handle the present and future. Listen now!
This is my first take on AI and music, with more to come. In a future episode, I plan to demonstrate the new AI music production tools, arriving fresh daily by the dozens.
So please join me for "Chatbots and Sexy Synth Saxophones," a CT Media original podcast (available Tuesday, December 19th, everywhere).
And BTW, thank you for listening to the debut episode, "What's Christian About Chemistry?" It's the story of my signing the legendary band Switchfoot. Because of your curiosity and faithfulness, the show debuted in the Top 20 on Spotify and Apple Podcasts. Thanks for such a meaningful launch. Practice neighbor love and tell a friend. Uh . . . make that five friends.
A note of importance: If you're reading or listening in from Tennessee, the day and place of the book launch for Why Everything That Doesn't Matter, Matters So Much are locked in: Parnassus Books on March 12, 2024. The time is not set yet, but since the store holds only about 140 people, it will be a ticketed event, and your ticket price will include an autographed book, which we will personalize that evening if you'd like. More on that to come, but for now, I will leave you with this link to save. Watch that March 12th spot for details.
I’m still trying to figure out what to say about AI. As a technologist and serial entrepreneur, I get asked about it more often than some. AI has been evolving so rapidly and steadily over my lifetime that most people do not even perceive how ubiquitous it already is in their daily lives. In the early days, OCR (optical character recognition) seemed miraculous, leading companies like Google (then in its infancy) to proclaim that every written work would soon be available to the world for free, and that libraries would soon be obsolete. Now OCR is, well, boring: available to anyone with a phone and capable of real-time translation into almost any language. Then voice recognition became ubiquitous. If you’ve ever used Siri or Alexa, you’ve used what most now refer to as AI. It, too, has become boring. As we continue to expect, and even demand, more, the evolution will continue.
Most of the questions I get about AI are really asking for a “brain bucket.” Over the years I have found that many people just want to compartmentalize things as they search to make the complex simple. If they can reduce technology, or music genres, or politics to morality “brain buckets,” proclaiming entire categories evil or good, they can choose to ignore or promote, criticize or aggrandize, based on what they think are “clear” moral choices. I have found, though, that brain buckets are too simplistic, and in some cases dehumanizing and even immoral in their own right.
Clearly there are immoral choices one can make when using technology, music, or political power, or when working in any realm of creating or cultivating. But in my opinion, that’s not where the conversation should be centered. Instead, I’m asking myself and my friends whether AI can become an instrument that, when well played, can help us love better and live better.
This requires a much deeper and broader understanding of the technology and its applications than a simplistic brain bucket answer can offer. AI is not amoral, but neither is it intrinsically moral or immoral. The answers exist in the tension, at the borders of technology where few dare to stalk. They require patient and careful thought, a stance of curiosity, and incisive wisdom. In short, they require humans to do what only we can do; that means refusing to become binary in our choices, which, in the worst imitation of a computer, is often the most inhumane decision we can make.
Charlie, I have thought about that Art House newsletter so many times over the years. You really have no idea.
I hope to unearth it from a box at my parents’ house one of these days. Oh, the treasures I might find in their garage or attic from years gone by...