AI: Moral Responsibility and Social Impacts (Philosopher's Bazaar)


Details
Where a collision is unavoidable, an autonomous vehicle must make decisions about damage, harm and the lives of passengers, pedestrians, riders and the occupants of other vehicles, driverless or not. Should the responsibility for accidents rest primarily with the manufacturers, or with the owners or users of these vehicles?
Autonomous weapons are designed to destroy and kill, so their deployment and use must address the issues of proper engagement and proportional use of force. If they do not, the manufacturers and users of these weapons could face prosecution for war crimes.
Automated decision-making can block people from commercial, social or health services based on inaccurate and unaccountable decisions. Should the use of 'social scoring' and rating algorithms that draw on personal data aggregated from online sources be allowed without restrictions or safeguards?
Facial recognition is being imposed on unwitting consumers, both in retail and on our streets, with little notice or thought given to the moral implications of surveillance and the creation of invisible profiles while we go about our day-to-day business. Why is this happening, how is it affecting us, and why does it pose moral and social problems if left unchecked?
Please join us to explore Artificial Intelligence and some of its current and likely future impacts on moral responsibility and society.
https://www.meetup.com/the-philosophers-bazaar/events/286683561/