What we're about
The Association for Computing Machinery (http://www.acm.org/about-acm/about-the-acm-organization) is the world's largest computing society, organizing computer science conferences and publications. The San Francisco Bay Area ACM is a local professional chapter, a 501(c)(3) non-profit founded in 1957. We hold two meetups a month: (1) General Computing, on the 3rd Wednesday of the month, and (2) the Data Science SIG, covering data mining, deep learning, and big data, on the 4th Monday of the month. Among these meetups, we have recently emphasized Security & Social Modeling discussions, and we generally have [masked] people attending our talks.
See also our YouTube channel (https://www.youtube.com/user/sfbayacm) with over 140 past talks. Our Security & Social Modeling talks are collected in a YouTube playlist: https://www.youtube.com/playlist?list=PL87GtQd0bfJyVsBgkL-TyNzZhsNOYK2v_ .
We are always seeking speakers to book in advance. Talks should be similar to what you would see at a computing conference: an educational subject for experienced computing professionals. It is fine to err on the side of the technical, algorithmic, or mathematical.
If you would like to submit a talk proposal, please provide the following:
- 3 available dates (Data Science SIG on the 4th Monday of the month; General Computing on the 3rd Wednesday of the month). We skip December for talks.
- speaker name, phone, email, LinkedIn (or picture)
- talk title
- talk description (include any desired links, related reading)
- speaker bio (include any desired links)
CALL FOR PRESENTATIONS IN SECURITY & SOCIAL MODELING
- The Efficiency Gap function, its insufficiency and complements. Reference: https://www.theatlantic.com/science/archive/2018/01/efficiency-gap-gerrymandering/551492/
- Code of Ethics in Machine Learning
- Ethics in financial product design
- Ethics in social data collection
- Ethics in patent design
- Deep learning from Chatbots
- Dimension reduction in social science domains
Available dates for speakers in 2022:
General Computing talks on the 3rd Wednesday: 9/21, 10/19, 11/16/2022;
Data Science SIG talks on the 4th Monday: 8/22, 9/26, 10/24, 12/5/2022.
On the left side of the Meetup page, in the "Organizers:" box, there is a "Contact" button you can use for submissions. Begin your message with "general computing", "S&S", or "DS SIG" to indicate which track you are proposing a talk for.
You can also contact me (Greg Makowski) about sponsorship opportunities for our non-profit organization. We are run by unpaid volunteers. If you provide financial sponsorship, or sponsor food or the video recording for a night or a talk series, we can offer any of the following:
a) a "thank you for the donation" letter with our 501(c)(3) non-profit tax ID, for your tax deduction
b) "thank the sponsor" time to address the event audience during the "upcoming events" period of one of our events (7:00 - 7:10)
c) opt-in registration information of the attendees
d) "thank the sponsor" branding on the video, posted on our YouTube video channel of our talks
e) a banner in our monthly email newsletter to 6,000 opt-in Bay Area computing professionals, or a section of our members-only print newsletter
f) make a suggestion, and we will see what we can do within the limits of our volunteer effort and non-profit status.
Liana Ye, Program Chair, and Greg Makowski, Business Development Lead and Data Science SIG Chair
Pre-register on Zoom:
6:45 Connection to Zoom chat with speaker and organizers
7:00 SFbayACM intro, upcoming events, introduce the speaker
7:10 presentation starts (~90 min with Q&A)
50% discount for Ray Summit 2022 (in-person in San Francisco, 8/22/2022 noon to 8/24/2022 3:15pm) for viewing the intro-to-Ray lecture on video: https://youtu.be/Fc-xuUBs5MY
Existing production machine learning systems often suffer from problems that make them hard to use. For example, data scientists and ML practitioners often spend most of their time fighting YAMLs and refactoring code to push models to production.
To address this, the Ray community has built Ray AI Runtime (AIR), an open-source toolkit for building large-scale end-to-end ML applications. By leveraging Ray's distributed compute layer and library ecosystem, AIR brings scalability and programmability to ML platforms.
Ray AI Runtime focuses on providing the compute layer for Python-based ML workloads and is designed to interoperate with other systems for storage and metadata needs.
In this session, we’ll explore and discuss the following:
- How AIR is different from existing ML platform tools like TFX, Sagemaker, and Kubeflow
- How AIR allows you to program and scale your machine learning workloads easily
- Interoperability and easy integration points with other systems for storage and metadata needs
- AIR’s cutting-edge features for accelerating the machine learning lifecycle such as data preprocessing, last-mile data ingestion, tuning and training, and serving at scale
Key takeaways for attendees are:
- Understand how Ray AI Runtime can be used to implement scalable, programmable machine learning workflows.
- Learn how to pass and share data across distributed trainers and Ray native libraries: Tune, Serve, Train, RLlib, etc.
- How to scale Python-based workloads across supported public clouds
Richard Liaw is an engineering manager and technical lead at Anyscale, where he leads a team in building open source libraries on top of Ray. He is on leave from the PhD program at UC Berkeley, where he worked at the RISELab advised by Ion Stoica, Joseph Gonzalez, and Ken Goldberg.
In his time in the PhD program, he was part of the Ray team, building scalable ML libraries on top of Ray.
Xiaowei Jiang is a software engineer working on the Ray OSS project, mostly focusing on machine learning libraries. Before Ray, she worked on Google Assistant for four years and on Uber's growth team for two years. Xiaowei is passionate about applying Ray to complex machine learning systems and enabling Ray users by addressing their use cases and pain points.