What we're about

This meetup focuses on Data Science on AWS as well as open source AI/ML technologies.

Upcoming events (5)

Data Science on AWS Monthly Webinar: Advanced Analytics and AI/ML

Online Workshop - See Details Below

RSVP Webinar: https://www.eventbrite.com/e/1-hr-free-workshop-pipelineai-gpu-tpu-spark-ml-tensorflow-ai-kubernetes-kafka-scikit-tickets-45852865154

Zoom Link: https://us02web.zoom.us/j/82308186562

Talk #1: Learning Quantum Machines

Abstract: The state of quantum computing today resembles in many ways the early days of machine learning. Near-term quantum algorithms are based on the idea of differentiable programming and “trainable” quantum algorithms. Much like in machine learning, our theoretical understanding of the computational power of these algorithms is limited; however, researchers in academia and industry are starting to experiment with early-stage quantum computers to explore a variety of use cases, from optimization to computational chemistry. In this talk you will learn how Amazon Braket allows customers to explore the potential of quantum computing, and how AWS is bringing mature ML tooling to quantum computing through the open source library PennyLane, helping the two fields learn from each other and accelerate innovation.
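
To make the “differentiable programming” idea concrete, here is a minimal sketch of a trainable quantum circuit in PennyLane. It runs on PennyLane's built-in simulator; the commented-out device line shows where a Braket device from the amazon-braket-pennylane-plugin could be swapped in (the device name is an assumption about that plugin, and running on Braket would require AWS credentials).

```python
# Minimal sketch: a trainable ("differentiable") quantum circuit in PennyLane.
# Uses PennyLane's built-in simulator; the Braket device line is an assumption
# about the amazon-braket-pennylane-plugin and is left commented out.
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)
# dev = qml.device("braket.local.qubit", wires=2)  # assumed plugin device name

@qml.qnode(dev)
def circuit(params):
    # A tiny parameterized ansatz: single-qubit rotations plus entanglement.
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(1))

# "Train" the circuit much like an ML model: gradient descent on a cost function.
params = np.array([0.1, 0.2], requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.3)
for step in range(50):
    params = opt.step(circuit, params)

print("optimized params:", params, "expectation:", circuit(params))
```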

Speaker: Eric Kessler, PhD
(https://www.linkedin.com/in/eric-kessler-aws/)

Talk #2: Saving cost by pruning state-of-the-art deep learning models with Amazon SageMaker Debugger (https://www.amazon.science/blog/the-science-behind-sagemakers-cost-saving-debugger)

Abstract: State-of-the-art deep learning models consist of millions of parameters, and training such models from scratch is computationally intensive, taking hours, days, or even weeks. Especially in the case of transfer learning, where one fine-tunes a pretrained model on a smaller dataset, the full parameter count may not be needed - a smaller model may perform just as well. Model pruning can significantly reduce model size without sacrificing accuracy. The idea is simple: identify the redundant parameters in the model that contribute little to the training process and remove them. This talk will show how you can use Amazon SageMaker Debugger to identify the least important weights during training and then prune the model.
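
As a rough sketch of the retrieval side of this workflow: assuming the training job was launched with a Debugger hook that saves weight tensors, the smdebug library can read them back and rank parameters by a simple importance proxy. The S3 path below is a placeholder, and magnitude-based ranking is an illustrative simplification rather than the exact method presented in the talk.

```python
# Illustrative sketch: read weight tensors saved by SageMaker Debugger and rank
# them by a simple importance proxy (mean absolute magnitude). The S3 path is a
# placeholder; the talk's approach is more sophisticated than this ranking.
import numpy as np
from smdebug.trials import create_trial

trial = create_trial("s3://my-bucket/debugger-output/")  # placeholder path

# Collect the weight tensors captured at the last available training step.
weight_names = trial.tensor_names(regex=".*weight.*")
last_step = trial.steps()[-1]

importance = {}
for name in weight_names:
    w = trial.tensor(name).value(last_step)
    importance[name] = float(np.mean(np.abs(w)))

# The lowest-ranked parameters are candidates for pruning before re-training.
for name, score in sorted(importance.items(), key=lambda kv: kv[1])[:10]:
    print(f"prune candidate: {name} (mean |w| = {score:.5f})")
```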

Speakers: Nathalie Rauschmayr, Applied Scientist @ AWS and Connor Goggins, ML Engineer @ AWS

Talk #3: TBD

Related Links
=============
O'Reilly Book: https://www.amazon.com/dp/1492079391/
Website: https://datascienceonaws.com
Meetup: https://meetup.datascienceonaws.com
GitHub Repo: https://github.com/data-science-on-aws/
YouTube: https://youtube.datascienceonaws.com
Slideshare: https://slideshare.datascienceonaws.com
Support: https://support.pipeline.ai

Workshop: Build an AI/ML pipeline with BERT, TensorFlow and Amazon SageMaker

Online Workshop - See Details Below

RSVP: https://www.eventbrite.com/e/full-day-workshop-kubeflow-bert-gpu-tensorflow-keras-sagemaker-tickets-63362929227

**Description**

In this hands-on workshop, we will build an end-to-end AI/ML pipeline for natural language processing with Amazon SageMaker.

You will learn how to:

• Ingest data into S3 using Amazon Athena and the Parquet data format
• Visualize data with pandas and matplotlib in Jupyter notebooks
• Run data bias analysis with SageMaker Clarify
• Perform feature engineering on a raw dataset using Scikit-Learn and SageMaker Processing Jobs
• Store and share features using SageMaker Feature Store
• Train and evaluate a custom BERT model using TensorFlow, Keras, and SageMaker Training Jobs
• Evaluate the model using SageMaker Processing Jobs
• Track model artifacts using Amazon SageMaker ML Lineage Tracking
• Run model bias and explainability analysis with SageMaker Clarify
• Register and version models using SageMaker Model Registry
• Deploy a model to a REST Inference Endpoint using SageMaker Endpoints
• Automate ML workflow steps by building end-to-end model pipelines using SageMaker Pipelines
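
To show how the last item ties the earlier steps together, here is a minimal sketch of a SageMaker Pipeline that wires a processing step into a TensorFlow training step. Role, S3 paths, script names, instance types, and framework versions are placeholders; the workshop's actual pipeline includes additional steps such as Clarify checks, evaluation, and model registration.

```python
# Minimal sketch of a SageMaker Pipeline: one processing step feeding one
# TensorFlow training step. Role, S3 paths, scripts, and instance types are
# placeholders; the full workshop pipeline adds evaluation, bias checks,
# model registration, and deployment.
import sagemaker
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.processing import ProcessingInput, ProcessingOutput
from sagemaker.tensorflow import TensorFlow
from sagemaker.inputs import TrainingInput
from sagemaker.workflow.steps import ProcessingStep, TrainingStep
from sagemaker.workflow.pipeline import Pipeline

role = sagemaker.get_execution_role()  # assumes a SageMaker notebook/Studio environment

processor = SKLearnProcessor(
    framework_version="1.0-1", role=role,
    instance_type="ml.m5.xlarge", instance_count=1,
)
process_step = ProcessingStep(
    name="FeatureEngineering",
    processor=processor,
    code="preprocess.py",  # placeholder feature-engineering script
    inputs=[ProcessingInput(source="s3://my-bucket/raw/",
                            destination="/opt/ml/processing/input")],
    outputs=[ProcessingOutput(output_name="train",
                              source="/opt/ml/processing/train")],
)

estimator = TensorFlow(
    entry_point="train.py",  # placeholder BERT fine-tuning script
    role=role, instance_type="ml.p3.2xlarge", instance_count=1,
    framework_version="2.11", py_version="py39",
)
train_step = TrainingStep(
    name="TrainBERT",
    estimator=estimator,
    inputs={"train": TrainingInput(
        s3_data=process_step.properties.ProcessingOutputConfig
                            .Outputs["train"].S3Output.S3Uri)},
)

pipeline = Pipeline(name="bert-pipeline", steps=[process_step, train_step])
# pipeline.upsert(role_arn=role); pipeline.start()  # create and run in your account
```

Each step consumes the previous step's output through its properties object, which is how SageMaker resolves the execution order of the pipeline automatically.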

**Pre-requisites**
• Modern browser - and that's it!
• Every attendee will receive a cloud instance
• Nothing will be installed on your local laptop
• Everything can be downloaded at the end of the workshop

**Location**
Online

Related Links
=============
O'Reilly Book: https://www.amazon.com/dp/1492079391/
Website: https://datascienceonaws.com
Meetup: https://meetup.datascienceonaws.com
GitHub Repo: https://github.com/data-science-on-aws/
YouTube: https://youtube.datascienceonaws.com
Slideshare: https://slideshare.datascienceonaws.com
Support: https://support.pipeline.ai

Past events (299)

Workshop: Build an AI/ML pipeline with BERT, TensorFlow and Amazon SageMaker

Online Workshop - See Details Below
