The Singularity and Existential Risk

Details

Existential risk (Google it if the term is new to you) is a major threat to all of humanity, and it comes in many forms. It is literally a matter of life and death, not just for some humans but for ALL humans. The Singularity may reduce the chance of some of these risks and increase the chance of others.

Which risk do you consider most serious, and which do you have some idea how we might mitigate? What aspects of the Singularity carry inherent risk, and what consequences of the Singularity could exacerbate specific existential risks?

Perhaps it is grey goo, Terminator-esque robot extinction, a subtle or not-so-subtle erosion of what it means to be human, or some other Singularity-caused event or series of events that we need to guard against. How, exactly?

Perhaps most of the current existential risks, such as global warming, asteroid collisions, and global pandemics, will lessen once the Singularity occurs. But how? Waving the Singularity magic wand around only goes so far.

Perhaps even super-intelligence will not be able to help us avoid some calamities, like geomagnetic reversal or super-volcanoes (although these might kill all life on Earth while sparing any humans living off-Earth). Are some things unavoidable no matter how smart we get?

As people arrive, we will have one-on-one discussions, but once we have a quorum, be prepared to get up in front of the rest of the attendees and give your views, with specific answers to some or all of the questions above. Each person will get 5 to 10 minutes, with concise Q&A from the other attendees immediately afterwards. Be prepared to defend your arguments, or to change your mind if others raise serious doubts about your assertions.

This will be, in some ways, like an unconference: rather than famous speakers, everyone gets to speak.

And specific action items can come out at the end: perhaps political action, fund-raising, awareness-raising, or particular projects targeted at specific goals. Talking is great, but if we can reach a consensus we might be able to start doing as well, and direct the course of the future, even if only in small ways. Although when it comes to existential risk, nothing is really small.

Looking forward to some thought-provoking ideas.

Futurism NYC
Central Park Delacorte Theater
81st St. & Central Park West · New York, NY