PauseAI Community Meetup
Details
AI is advancing rapidly, and experts warn that we're headed towards disaster at full speed. This is not a fringe opinion. In a survey of over 2,000 AI engineers, people working at the cutting edge of AI estimated, on average, a 14% chance of AI causing the extinction of humanity. You wouldn't get on a plane with a 14% chance of falling out of the sky. You wouldn't put your family on that plane. But we're all already on that plane.
Today's AI systems probably aren't particularly dangerous yet, but all the major AI companies are racing to build more general, more intelligent, more capable systems, without understanding how they work and without being able to predict how they will behave. The financial incentives are misaligned with responsible behavior, so it's up to us to communicate the need for a global moratorium on dangerous AI research to those in power, and get them to enforce a pause until safety protocols can catch up with capabilities. The mission of PauseAI is to bring that global moratorium into existence in a meaningful way as fast as possible.
One way to do that is with numbers. If we can show politicians that real people strongly care about this issue, not just internet warriors but people who will actually show up to events in person, they are much more likely to act. Another way is through direct appeals to elected officials, especially those who represent you. Writing and calling in makes a difference.
If you haven't been to an event yet, please attend! At this event, in addition to getting to know other concerned humans, we'll be discussing what we can do to make a Pause happen sooner.
If you know anyone who is at all interested, please bring them along. There will also be educational resources for newcomers.
