Inescapable Existential Dangers (IEDs) – Creators and Destroyers of Worlds


Details
Technological advances create multiple Inescapable Existential Dangers (IEDs): innovations that yield both unprecedented benefits and catastrophic risks, lying like landmines along the paths to progress. This is an ancient bind that has ensnared nearly every society and epoch. Even while celebrating the destruction of Carthage, Scipio Africanus the Younger brooded over the inevitable doom of Rome, since a might-makes-right world order eventually topples every nation. Modern technology has preserved this dynamic while ballooning the consequences to existential proportions that would have been ungraspable to Scipio.
In this meetup, we’ll explore the OG IED, nuclear weapons, and the new IED, AI, whose stakes promise to dwarf the former as Hiroshima dwarfs Carthage.
OG IED: Nuclear weapons debuted by ending a global war that had claimed over 50 million lives and was poised to kill tens of millions more. The obliteration of Hiroshima and Nagasaki eliminated the need to invade Japan and reenact the Battle of Berlin, one of the deadliest urban battles in history. The 80-year nuclear peace, built on deterrence, enabled unprecedented international trade and stability that expanded the gross world product past $25 trillion and the world population to 8 billion. Yet this peace rests on intricate human and technological networks that have repeatedly suffered harrowing near-misses. As MIT’s Max Tegmark has noted, even if fail-safes hold the yearly probability of nuclear war to 1% or less, that risk compounds to near-certainty within a few centuries.
The prosperity and advances spawned by the long nuclear peace have birthed an astronomically larger IED: AI, which is projected to produce knowledge that may cure afflictions from cancer to old age and yield solutions to yet more IEDs like climate change. AI also promises to surpass human performance in all fields, leading to universal unemployment and idleness. Beyond human obsolescence looms the Alignment Problem: the intractable and deadly question of how to keep a leash on artificial minds whose ever-ballooning intellects vastly surpass human thought, rendering them unknowable. How can we control what we can’t understand? Yet again, a breakthrough that heralds undreamt-of prosperity and happiness for billions also augurs doom.
Can we manage the ever-proliferating flock of IEDs so we can enjoy their benefits while avoiding their dangers? Is progress a huge mortgage that always comes due?
You do not have to review the resources prior to the meetup in order to participate.
Thinking Through the Unthinkable:
- The two world wars killed nearly 100 million people. Nuclear weapons ended this recurring catastrophe. That’s at least 100 million lives saved. Some argue that, in a nuclear-free world, we’d be at war with Putin’s Russia or China right now. Are these lives saved now worth a bill that may come due later? Would you be willing to sacrifice loved ones in a war today if doing so ended the threat of nuclear war forever?
- The evolutionary biologist Geoffrey Miller has denounced efforts to build superintelligent AI (SGI) as “evil,” citing worries about the fates of his kids, who may live to see a world where everything they can do is done better by machines. But what if his children were stricken with a terminal cancer whose cure required AI? If you are a parent, would you endorse SGI to save your kids’ lives now, even though it may strip their lives of purpose and even threaten their lives later?
One possible AI future: https://youtu.be/O-2tpwW0kmU?si=T-olPfPYq2_V_1Yq
Geoffrey Miller on Evil AI: https://youtu.be/MtMxdIV3sYM?si=jX4vJ0UCWxhZI5qx
AI as Cancer and Disease Cure: https://youtu.be/JWrZgiKTxms?si=uvEa3CvAq0OsiZBL
- Though AI may be the ultimate IED, it’s far from the only new one on the horizon. Podcaster Sam Harris discusses yet another: the threat of engineered pandemics enabled by proliferating biotechnological know-how: https://www.samharris.org/podcasts/making-sense-episodes/special-episode-engineering-apocalypse
