How to Hack an AI - SecTalks SYD0x4F (79th)
Details
# Presentation
Adversarial machine learning (AML) is a field growing in prominence that represents the ability to ‘hack’ Artificial Intelligence (AI) and Machine Learning (ML) algorithms: by imperceptibly poisoning data sets before training, by evading classification, by leaking confidential information, or by hijacking the model’s function to make it do something it wasn’t intended to do. The rapid uptake of AI/ML systems by organisations means the attack surface is growing significantly. I believe AI/ML security may soon join cyber security as one of the greatest technological and geostrategic threats. However, there is still time to learn from the lessons of cyber security.
This talk is intended to inform cyber security professionals about the increasing relevance of their field to AML and AI/ML security. It will describe how ML models work, why vulnerabilities exist and how they can be exploited. I will demonstrate the cutting edge of AML: glasses that deceive facial recognition detectors, stickers that can disguise objects in the physical world from image classification engines, and carefully crafted noise that causes speech-to-text systems to hear messages that humans can’t. I will also describe some of my own research. The audience should come away with an appreciation for the field of AML, why AI/ML security is a growing concern, and how, in their roles as cyber security professionals, they can contribute to the dialogue.
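To make the evasion attacks mentioned above concrete, here is a minimal sketch (my illustration, not material from the talk) of the Fast Gradient Sign Method, a classic AML technique: a tiny perturbation of the input, aligned with the sign of the loss gradient, is often enough to flip a model's decision. The toy logistic-regression "model" and all numbers here are assumptions for demonstration only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = rng.normal(size=100)            # fixed "trained" weights of a toy classifier
x = w / np.linalg.norm(w) * 0.1     # an input the model confidently labels as class 1
y = 1.0                             # true label

# Gradient of the binary cross-entropy loss with respect to the input x
p = sigmoid(w @ x)
grad_x = (p - y) * w

# FGSM step: nudge every feature by +/- epsilon along the gradient's sign.
# The change per feature is at most epsilon, yet the prediction shifts sharply.
epsilon = 0.02
x_adv = x + epsilon * np.sign(grad_x)

print("original prediction:   ", sigmoid(w @ x))      # confidently class 1
print("adversarial prediction:", sigmoid(w @ x_adv))  # pushed toward class 0
```

Real attacks apply the same idea to deep networks over images or audio, which is how adversarial glasses and stickers are crafted.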
by Harriet Farlow (@HarrietHacks)
Harriet Farlow runs an AI Security company called Mileva Security Labs and is a PhD Candidate in AI Security at the University of New South Wales in Canberra, Australia. With a passion for bridging technical and non-technical disciplines, Harriet’s professional experience spans consulting, academia, Government and a technology start-up. She holds a Bachelor of Science in Physics and Bio-anthropology, and a Master of Cyber Security, Strategy and Diplomacy.
# Speed hiring (experimental)
There is an opportunity for potential employers to give an impromptu 10-second description of their open role. If you are interested, speak with one of the organisers before the start of the session.
Please note the best way to support SecTalks and tell others about your open roles is through sponsorship. We encourage companies, small or large, especially local ones, to come forward and support their local community.
# Sponsors
- Google (https://careers.google.com)
- SecDim (https://www.secdim.com)
- TikTok (https://www.tiktok.com/@tiktok_australia)
# Notes
- For sponsoring SecTalks Sydney, contact [sydney@sectalks.org](mailto:sydney@sectalks.org)
- To speak at SecTalks, fill out https://j.mp/sectalkscfp
