Episode 7: James Barrat | The Dangers of Superintelligence | Click to Listen
In this episode, author and filmmaker James Barrat and I discuss the potential for superintelligence research to go horribly, dangerously wrong.
Also, the wonderful blog Boing Boing is co-promoting this episode (full description below). Just click the play button above to listen. My co-host Tom Merritt and I discuss pages 380-464 of the novel toward the end.
You can subscribe to the podcast within any podcast app. Simply use your app's search function (type in "After On") to find & subscribe.
Or, to subscribe from your computer, click here, then click the blue “View on iTunes” button (left side of the page, under the After On image), then click “Subscribe” (similar location) in the iTunes window. Or simply follow the feed http://afteron.libsyn.com/rss
After On Podcast #7: James Barrat on the Dangers of Superintelligence (from Boing Boing)
To hear a wide-ranging interview about the real-world risks we humans could face from a rogue superintelligence, hit play, below. My guest is author and documentary filmmaker James Barrat. Barrat’s 2014 book Our Final Invention was the gateway drug that ushered me into the narcotic realm of contemplating super AI risk. So it’s on first-hand authority that I urge you to jump in – the water’s great!
This is the seventh episode of my podcast series (co-hosted by Tom Merritt), which launched here on Boing Boing last month. The series goes deep into the science, tech, and sociological issues explored in my novel After On – but no familiarity with the novel is necessary to listen to it.
The danger of artificial consciousness has a noble pedigree in science fiction. In most minds, its wellspring is 2001: A Space Odyssey, which features HAL 9000 – an onboard computer that decides to kill off its crew before they can disconnect it (spoiler: HAL’s rookie season ends – rather abruptly – with a 1-1 record).
James’s interest in this subject was piqued when he interviewed 2001’s author, Arthur C. Clarke, back in the pertinent year of 2001. Clarke’s concerns about superintelligence went beyond the confines of fiction. And he expressed them cogently enough to freak James out to this day.
Among James’s worries is that Hollywood has inoculated many of us against taking super AIs seriously by depicting them so preposterously. “Imagine if the Centers for Disease Control issued a serious warning about vampires,” he notes. “It’d take time for the guffawing to stop, and the wooden stakes to come out. Maybe we’re in that period right now with AI, and only an accident or a near-death experience will jar us awake.”
James and I discuss the “vampire problem” and many other issues in our interview. If you’re looking to cut back on the long, unproductive hours you currently waste on sleep, you should definitely give it a listen.