• Deep learning in industry: CrowdAI, Dropbox

    When this meetup started in 2012, it was hard to find working data scientists who were bullish on deep neural networks. Fast forward to today and it seems like deep learning is everywhere. Justin Timberlake is making music videos that literally take place at deep learning conferences: https://www.youtube.com/watch?v=gA-NDZb29I4 . YSL is enlisting AI researchers to help sell fragrances: https://www.youtube.com/watch?v=nU09j2gGHYg . Nvidia's stock price looks like the price of bitcoin without the dips. Surely these are signs of irrational exuberance? But amidst the pits of madness and never-ending hype parades, there should be at least one place where grounded, practical applications of deep learning in industry can be discussed. And that place is the SF Neural Network Aficionados.

    So for our next event, happening February 20th in San Francisco, we will have 3 awesome speakers discussing grounded, practical applications of deep learning in industry. Each talk will run 20-30 minutes, and the event starts at 7:00.

    ============================================================
    Title: Applying Deep Learning to Satellite Images

    Jigar Doshi is a Deep Learning Lead at CrowdAI, where he works on deep learning, computer vision, and machine learning applied to satellite images. He also works on machine learning models that improve the annotation process. Prior to joining CrowdAI, Jigar worked at IBM Watson Research doing computer vision research on videos for tasks like activity recognition, classification, and re-identification. Before that, he worked on his Ph.D. at Georgia Tech, building machine learning and computer vision models for humanoid robots.

    ============================================================
    Title: The ML behind the Dropbox doc scanner

    Tom Berg is on the Machine Learning team at Dropbox, where he's worked on image recognition, object detection, OCR, active learning, and making machine learning accessible to non-specialists. He came to Dropbox after completing his Ph.D. at Columbia, where he worked on fine-grained image recognition and built birdsnap.com. In his talk, Tom will describe the machine learning components behind the Dropbox doc scanner, which lets users generate PDFs in Dropbox from their physical paper documents.

    ============================================================
    Title: Understanding the error surface of neural networks

    Dave Sullivan will review the concept of neural network error surfaces and discuss the recent paper "Visualizing the Loss Landscape of Neural Nets" (https://arxiv.org/abs/1712.09913). The paper investigates why resnets are so effective and suggests avenues for further research. Dave is a freelance data scientist focused on helping companies make the best use of state-of-the-art algorithms and techniques. He is the meetup organizer and has been a proponent of deep learning and the larger concept of "differentiable programming" since 2011.

    ============================================================
    LOCATION: Dropbox, the file storage company we all know and love, has been nice enough to provide space for the event. One caveat: everyone entering the building needs to sign Dropbox's on-site M-NDA. Here is a link: https://s3.us-east-2.amazonaws.com/dropboxnda/DropboxMNDAv416.pdf . Please read it ahead of time and be prepared to sign at the event.

    For those of you who have not been to one of these events, the group is approximately 33% data scientists / 33% engineers / 33% non-technical-but-interested-in-machine-learning. Food and beverages will be provided. There's usually 30-40 minutes to socialize before and after the scheduled speakers. Looking forward to seeing everyone there!
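    For anyone who wants to poke at the idea ahead of the error-surface talk: the core trick in the loss-landscape paper can be sketched in a few lines -- evaluate the loss along a line through parameter space. Below is a minimal Python sketch, assuming a toy quadratic loss in place of a real network's loss (the paper also applies filter-wise normalization, which is omitted here):

```python
import numpy as np

def loss_along_direction(loss_fn, theta, direction, alphas):
    """Evaluate loss_fn at theta + alpha * direction for each alpha:
    a 1-D slice of the loss surface, in the spirit of
    "Visualizing the Loss Landscape of Neural Nets"."""
    d = direction / np.linalg.norm(direction)  # unit-length probe direction
    return np.array([loss_fn(theta + a * d) for a in alphas])

# Toy stand-in loss; a real use would evaluate the network's training loss.
toy_loss = lambda w: float(np.sum(w ** 2))

theta = np.array([1.0, -2.0])       # pretend these are trained parameters
direction = np.random.randn(2)      # random probe direction
alphas = np.linspace(-1.0, 1.0, 5)
print(loss_along_direction(toy_loss, theta, direction, alphas))
```

    Plotting that curve for a ResNet with and without skip connections is essentially what the paper does, just in one or two carefully normalized directions.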

  • Keras, chat bots, and deep learning

    Clover

    Hi everyone, We've got an exciting meetup coming up on October 22nd.

    First, we'll have a talk from Francois Chollet, creator of the Keras deep learning framework. He'll provide some context about the purpose of Keras, a short technical intro to how it's designed and how to use it, and a quick overview of what's coming in the near future. If you haven't checked out Keras yet, you should; it's really cool! http://keras.io/

    Second, we'll have a talk from Dave Sullivan (me), co-founder and CEO of Ersatz Labs. He'll discuss recent advances in the area of chat bots and question/answer systems. The talk will cover the concepts involved--such as word vectors, recurrent neural networks, and attention mechanisms--and include a demo of a (very much prototype) neural-network-based chat bot.

    The event starts at 7:00, and food and drinks will be provided. We are at a new location in Sunnyvale, so for all of you peninsula dwellers who can never make it up to SF for these, this is the one to come to. There will be about an hour of networking prior to the talks, and each talk will run 30-45 minutes. This is a picture of the space: picture of space (https://instagram.com/p/6fhLNkLVibqONW-GtOJstKLgELvLhAoiZHdZw0/)

    I'm looking forward to seeing you there! Dave Sullivan, Organizer
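    If you want a feel for one of the chat bot concepts ahead of time: dot-product attention boils down to scoring each stored key against a query, softmaxing the scores, and taking a weighted average of the values. A minimal pure-Python sketch (the function names and toy vectors are mine, purely for illustration):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Dot-product attention: weight each value by how well
    its key matches the query, then average the values."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

# The query matches the first key most strongly, so the output
# leans toward the first value.
q = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
print(attention(q, keys, values))
```

    Stacking this on top of recurrent encoders is roughly how attention-based question/answer systems pick which part of the input to "look at".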

  • What's the big deal about "deep learning" anyway?

    Location visible to members

    Hi everyone, 2015 has been an exciting year so far for deep learning.

    Researchers are now able to take images and use neural network-based language models to generate textual descriptions. I.e., a computer looks at a picture of two children playing on a beach and says "two children playing on a beach". It's still far from perfect at this particular application, but the fact that it can do it at all is pretty amazing. Several papers have come out about this, but here's one: paper (http://www.cs.toronto.edu/~zemel/documents/captionAttn.pdf)

    Meanwhile, two new types of architecture are emerging from Facebook and Google: memory networks and neural Turing machines. These architectures differ in the details, but both attempt to use neural networks to encode "more specific"/less distributed information while still being fully differentiable via backpropagation. memory networks (http://arxiv.org/pdf/1410.3916.pdf) , neural turing machines (http://arxiv.org/pdf/1410.5401v2.pdf)

    There have also been major advances in applying deep learning to robotics. Much of this builds off of the work DeepMind was doing with teaching neural networks to play Atari games. End-to-end training of deep visuomotor policies paper here (http://arxiv.org/pdf/1504.00702v1.pdf)

    So let's get together and discuss these and other exciting developments in the world of deep learning and machine learning.

    Schedule
    7:00-7:30: Drinks and networking
    7:30-8:00: Deep learning history from Ronjon Nag
    8:00-9:00: What changed, why "this time it's different", and new application areas from Dave Sullivan
    9:00-9:30: Q+A

    Deep Learning Workshops
    Are you interested in boosting your skills in the area of deep learning and neural networks? Ersatz Labs is hosting a hands-on workshop about using deep learning with video data on July 1st. It will give attendees practical experience with deep learning algorithms. More information at event page (http://www.ersatzlabs.com/events/)

  • Free Workshop: Deep learning on Kaggle with Ersatz

    Location visible to members

    Hello everyone, It's been over 6 months since our last meetup, and a lot has changed in the world of deep learning. The state of the art continues to improve, and the amount of research coming out of the field is growing at a seemingly exponential rate. Deep learning has become so popular that many are sure it's about to fall off a peak of inflated expectations into a trough of disillusionment so deep... it will collapse the world economy. Then again, others are worried neural networks will become sentient and take over the world with only Bruce Willis and/or Elon Musk fighting for the forces of good.

    The truth is more pedestrian than that. We can now build applications that process images, audio, text, and various types of sensor data better than ever before. That's the boring truth about deep learning: it's just computers learning to remember and process the things we tell them better.

    Be that as it may, one of the most common questions I get is "How can I actually use deep learning?" This same question comes from two groups: businesses that read about deep learning somewhere and want to know how (or if) to use it, and people already familiar with machine learning (sometimes very familiar) who want to know how they can use deep neural networks for a given project. Both groups just want to use deep learning.

    So this time for Neural Network Aficionados, I'm going to put on a free deep learning workshop where I show everyone how to do a Kaggle contest (this competition (https://www.kaggle.com/c/datasciencebowl)) using Ersatz, the software my company makes, from start to finish. Most of what we'll talk about will be broadly applicable--i.e., you don't need Ersatz to do it. For instance, you'll see what all the different model parameters do and how to set them (things like learning rate, dropout, layer types), how to think through your problem and apply data transformations, and the basic workflow of training and using a deep neural network. Because we're using Ersatz, it will be light on programming and full of awesome visuals.

    We'll be hosting at a new location. The nice folks at Orange, the French telecom, have been kind enough to offer some space for us to host the event. Their office is located in San Francisco near the Ferry Building. There will be time to network in the beginning, after which I'll host the workshop. You can follow along with the process if you bring a laptop with a web browser.

    Looking forward to seeing you there! Dave Sullivan, Ersatz Labs
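    As a taste of one of those model parameters: dropout randomly zeroes a fraction of a layer's units during training so the network can't lean too hard on any one of them. A minimal numpy sketch of "inverted" dropout (the function name and toy input are mine for illustration, not part of Ersatz's interface):

```python
import numpy as np

def dropout(activations, rate, rng):
    """Inverted dropout: zero out a fraction `rate` of units at
    training time and rescale the survivors by 1/(1-rate), so
    nothing needs to change at test time."""
    keep = 1.0 - rate
    mask = rng.random(activations.shape) < keep  # True = unit survives
    return activations * mask / keep

rng = np.random.default_rng(0)
x = np.ones((4, 8))            # a toy batch of layer activations
y = dropout(x, rate=0.5, rng=rng)
print(y)  # surviving units are scaled up to 2.0, the rest are 0
```

    The `rate` here is exactly the kind of knob the workshop will cover: too low and you barely regularize, too high and the network struggles to learn at all.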

  • Neural networks in Java with Deeplearning4j

    NextSpace

    Hi everybody, we're announcing a new meetup with a guest speaker in mid-June: Adam Gibson will present his open-source, distributed deep learning framework, Deeplearning4j. This will include demos of its sentiment analysis and facial recognition tools.

    DL4J is a commercial-grade platform written in Java and compatible with Hadoop. Its neural nets work for image recognition, text analysis, and speech-to-text. DL4J has implementations of algorithms such as binary and continuous restricted Boltzmann machines, deep-belief nets, denoising autoencoders, convolutional nets, and recursive neural tensor networks. Users with a working knowledge of Java will be able to build anomaly/fraud detection, recommendation engines, and social-media ranking systems, among many other applications.

    Adam studied computer science and business administration at Michigan Tech, and now serves as a machine-learning instructor at the data science academy Zipfian. He lives in San Francisco.

  • How to build a game with neural networks

    NextSpace

    Hi everyone, It's been a while since we've met, so let's meet up and talk neural networks.

    In the spirit of practical utilization of deep learning research, I thought I might do a workshop/tutorial on how to add "neural network intelligence" to a relatively simple JavaScript game. This game is a little different because there's actually no player input -- various agents must instead learn strategies to deal effectively with their environment. "Emergent behavior" on a micro scale.

    This is similar to the type of work DeepMind was doing when they demoed neural networks playing Atari games. In their experiments, they generated features from raw pixel data using convolutional neural networks, then fed them to a Q-learning algorithm, a standard reinforcement learning method. Our method is a bit different in that we are using a recurrent neural network for the whole process.

    We'll be using the new API from Ersatz (a neural network API I happen to be co-founder of...) to do the neural network training and predicting. This is for 2 reasons: 1) self-promotion and 2) to sidestep the theory of neural networks a little bit and focus on their practical use and how to write it into your code. The presentation will be easier to follow if you are reasonably familiar with JavaScript. Neural network experience/knowledge is not required to benefit.

    Beer and chairs to be provided, so no need to bring your own. Ah, and by popular request: soda too! Thanks, I hope to see you all there.
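    For the curious, the Q-learning update that DeepMind's Atari work builds on is tiny. A rough Python sketch on a made-up 3-state toy environment (not the JavaScript game from the talk, and tabular rather than neural -- the deep version just replaces the table with a network):

```python
import random

def q_update(Q, s, a, reward, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step:
    Q(s, a) += alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))."""
    best_next = max(Q[s_next])
    Q[s][a] += alpha * (reward + gamma * best_next - Q[s][a])

# Toy 3-state corridor: action 1 steps right, action 0 stays put;
# reaching the last state pays reward 1 and ends the episode.
n_states, n_actions = 3, 2
Q = [[0.0] * n_actions for _ in range(n_states)]
random.seed(0)
for _ in range(500):
    s = 0
    while s < n_states - 1:
        a = random.randrange(n_actions)      # explore at random
        s_next = min(s + a, n_states - 1)
        reward = 1.0 if s_next == n_states - 1 else 0.0
        q_update(Q, s, a, reward, s_next)
        s = s_next

print(Q)  # "step right" should outscore "stay" in states 0 and 1
```

    After enough episodes, the learned values tell each agent which action pays off -- the same idea the game agents will use, just with a recurrent network producing the values instead of a lookup table.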

  • NIPS 2013 Recap + big Ersatz announcement

    NextSpace

    Hello neural network aficionados, NIPS (Neural Information Processing Systems) happened in December and it was quite a show. DeepMind demoed reinforcement learning using neural networks to win at Atari games. Amazon busted out some delivery drones and tried to convince us it totally wasn't a PR stunt. Mark Zuckerberg even stopped by to let us all know that he thought we were awesome.

    Some really cool research was presented too. One speaker provided interesting evidence suggesting that convolutional nets are already better at feedforward image recognition than monkeys. Who knew? There were several papers that looked into dropout in more detail, uncovering more about what's actually happening under the hood. There was some interesting research re: deep recurrent neural networks. Lots of stuff about convolutional nets. Some people are building FPGAs for neural networks now. A bunch of really cool stuff was covered.

    So anyway, let's meet up and talk about all of that. For the first part of the meetup, I'd like to open it up to anyone who wants to give a lightning talk (~5 minutes about a deep learning topic), and if no one wants to go, I'll just talk about the research out of NIPS that I found particularly interesting. For the final part of the meetup, I'm going to show off my company's neural network product Ersatz doing some cool neural network stuff. I'll have a couple of big announcements re: all of that too, so stick around...

    It's at the same place it's been the last few meetings. Beer and chairs will be provided. See you there!

  • Deep learning demos + roundtable discussion

    NextSpace

    Hi neural netters, Last time we met was back in June. Since then, there have been a handful of developments in deep learning--this includes research into different types of nonlinearities (maxout, "p-norm"), using autoencoders as generative models, and some very nifty research related to NLP and how learned word vectors can be linearly combined for interesting results--e.g. vector('queen') + vector('chair') ~= vector('throne'). There's a lot of work going into *applications* of deep learning now.

    Meanwhile, my company has been heads down working on Ersatz, our deep learning PaaS. It's got a long way to go, but I'm comfortable saying it's currently the most powerful and easy-to-use set of deep learning tools available. Then again, it's not yet a crowded market (wait till this time next year...)

    So let's have a meetup. Last time, I did a demo for the first half and hosted a free-wheeling roundtable discussion during the second half. I think the roundtable worked particularly well, so let's do that again. I'll make sure to get it on tape this time :-)

    So, agenda: Since Dreamforce (the big salesforce.com convention) is in town, I'll be demoing a real-life use case for neural networks (in this case, the networks implemented in Ersatz): automatically planning a day's calls for a busy salesperson--i.e., answering the question "Statistically, what are the most beneficial actions I can take today in order to improve my odds of hitting my sales target?" I'll also show a much more "fun" demo where I take a bunch of learned word vectors, map them into 3 dimensions using t-SNE, and render something like 300,000 unique words in 3d using three.js with WebGL. Then we'll discuss whatever you want re: machine learning, deep learning, neural networks, moving towards AI, how it all ties together, etc. Booze will be provided as per usual. Because if there's anything that goes with hardcore optimization, it's booze...

    Re: the group in general: Lately (like, the last 6 months), I haven't had enough time to host these as often as I'd like. So if anyone would like to step up and help out with hosting the one after this (which will probably be NIPS 2013 themed), I'd be ecstatic! Let me know if you'd be up for planning one of these. Thanks everyone, hope to see you there!
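    The vector('queen') + vector('chair') ~= vector('throne') trick is easy to sketch: add the two vectors and look for the nearest remaining word by cosine similarity. A toy Python version with made-up 3-d embeddings (real ones come out of word2vec-style training, typically with hundreds of dimensions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Tiny made-up embeddings; the axes loosely mean royalty/food/furniture.
vec = {
    "queen":  [0.9, 0.1, 0.0],
    "chair":  [0.0, 0.1, 0.9],
    "throne": [0.8, 0.2, 0.7],   # royalty + furniture
    "banana": [0.1, 0.9, 0.1],
}

combined = [q + c for q, c in zip(vec["queen"], vec["chair"])]
ranked = sorted(
    (w for w in vec if w not in ("queen", "chair")),
    key=lambda w: cosine(combined, vec[w]),
    reverse=True,
)
print(ranked[0])  # 'throne' -- the nearest word to queen + chair
```

    The t-SNE demo is the same idea pushed further: instead of ranking neighbors, squash all 300,000 vectors down to 3 coordinates so those neighborhoods become visible.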

  • Neural network applications w/ Ersatz + beer

    Location visible to members

    After a bit of a hiatus, SF Neural Network Aficionados is back!

    This time, I'm going to show everyone how to use neural nets in actual applications. I will demonstrate this using the Ersatz API (Ersatz is the neural network PaaS product my company has been working on for a while). These methods will be broadly applicable to any NN architecture/product/package/whatever you might want to use, so feel free to replace Ersatz with your NN solution of choice. The main things I'll focus on are the practical aspects of what kinds of problems you can solve using neural networks, along with how to set up your data as "inputs" and what to do with the "outputs" you get back from your NN. I'll also show you how to diagnose and remedy training issues, such as exploding gradients or overfitting.

    I'm hosting at a new location this time--about a block away at NextSpace instead of WeWork. Also, if anyone would like to speak at a future meetup, or if you have anything in particular you'd like *me* to try to cover, please let me know.

    Looking forward to seeing you all again and hearing about the latest work you've been doing with NNs.
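    One concrete remedy for exploding gradients, likely among those we'll discuss, is global-norm gradient clipping: if the combined norm of all gradients exceeds a threshold, rescale them down before the weight update. A minimal numpy sketch (the gradient values are made up for illustration):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their combined L2 norm
    is at most max_norm -- a standard fix for exploding gradients.
    Returns the (possibly rescaled) gradients and the original norm."""
    total = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if total <= max_norm:
        return grads, total
    scale = max_norm / total
    return [g * scale for g in grads], total

# Two toy gradient arrays with a combined norm of 13 (sqrt(9+16+144)).
grads = [np.array([3.0, 4.0]), np.array([0.0, 12.0])]
clipped, norm = clip_by_global_norm(grads, max_norm=5.0)
print(norm)       # 13.0 before clipping
print(clipped)    # same directions, combined norm rescaled to 5.0
```

    Watching that pre-clip norm over training is also a cheap diagnostic: a sudden spike usually means the learning rate is too high or the inputs aren't normalized.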

  • Training Neural Networks With Simulated Data

    Needs a location

    I would like to present a short talk on training neural networks using generated or transformed data for supervised learning. I research human biomechanics and am applying physics simulation data to the training of neural networks.
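    The basic simulate-then-train loop might look like this sketch: generate labeled examples from a simple physics model, add noise, and fit a model to them. The projectile "simulator" and least-squares fit below are stand-ins I chose for illustration, not the speaker's actual biomechanics setup; a neural network would consume the same generated dataset:

```python
import numpy as np

# Stand-in "simulator": projectile range as a function of launch speed
# at a 45-degree angle, range = v^2 * sin(2*theta) / g.
rng = np.random.default_rng(42)
g = 9.81
speeds = rng.uniform(1.0, 20.0, size=200)
ranges = speeds ** 2 * np.sin(2 * np.radians(45)) / g
ranges += rng.normal(0.0, 0.05, size=200)   # sensor-style label noise

# Fit a simple model to the simulated data (least squares on v^2 here;
# a neural network would slot into this step the same way).
X = np.stack([speeds ** 2, np.ones_like(speeds)], axis=1)
coef, *_ = np.linalg.lstsq(X, ranges, rcond=None)
print(coef[0])  # should recover roughly 1/g, i.e. about 0.102
```

    The appeal for supervised learning is that the simulator gives you unlimited, perfectly labeled training data; the open question, which the talk presumably addresses, is how well models trained this way transfer to real measurements.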