• Papers We Love #7: CNNs & Francisco Varela

    innoQ Deutschland GmbH

    Papers We Love is back with two talks! The first talk will be about Group Equivariant Convolutional and Capsule Networks by Tolga Birdal. The second talk will be about the Mathematical Work of Francisco Varela by Johannes Drever.
    ** Agenda
    19:00 doors open, food & drinks provided by INNOQ
    19:30 Tolga's talk + questions
    20:05 break, more food & drinks
    20:20 Johannes' talk + questions
    21:00 everybody out
    ** Group Equivariant Convolutional and Capsule Networks
    Convolutional neural networks (CNNs) have demonstrated tremendous capabilities, paving the way for the AI revolution. Part of this is due to their translational invariance or, in more general terminology, equivariance. While a typical CNN handles shifts of the input image at no additional effort, dealing with rotations is a nuisance: CNNs, like much other state-of-the-art learning machinery, are by design not capable of handling transformations other than translations. Luckily, many of the transformations that need to be taken care of are elements of what is called a mathematical group. With this notion, Cohen and Welling revisited conventional CNNs and devised G-CNNs, the group equivariant convolutional networks. Given the proper definition and constructs of a group, such as the special orthogonal group of rotations, G-CNNs achieve features that are equivariant or invariant to the group actions, such as rotating an input image. This is achieved by what is called a G-conv, the group convolution (a small illustrative sketch follows below). Lenssen et al. then took this idea further and developed Group Equivariant Capsule Networks, an extension of the group-equivariance properties to the famous capsule networks. In this talk, we are primarily interested in the latter, drawing the necessary background from the former. Tolga's purpose is to introduce, motivate, discuss and brainstorm on these state-of-the-art advancements.
    http://proceedings.mlr.press/v48/cohenc16.pdf
    https://papers.nips.cc/paper/8100-group-equivariant-capsule-networks.pdf
    https://papers.nips.cc/paper/6975-dynamic-routing-between-capsules.pdf
    Tolga Birdal recently defended his PhD thesis at the Computer Vision Group, Chair for Computer Aided Medical Procedures, Technical University of Munich, and was a doctoral candidate at Siemens AG. He completed his Bachelor's degree in Electronics Engineering at Sabanci University in 2008 and subsequently studied Computational Science and Engineering at the Technical University of Munich. Continuing from his Master's thesis on "3D Deformable Surface Recovery Using RGBD Cameras", he now focuses his research and development on large object detection, pose estimation and reconstruction using point clouds. He was recently awarded both the Ernst von Siemens Scholarship and the EMVA Young Professional Award 2016 for his PhD work. For further information, visit tbirdal.me and http://campar.in.tum.de/Main/TolgaBirdal.
    ** Mathematical Work of Francisco Varela
    Epistemology is the study of the nature of knowledge, justification, and the rationality of belief. Second-order cybernetics is a specific epistemology which takes a unified view of biological systems and machines. Francisco Varela is one of the most influential thinkers in this area and contributed to the development of the concept of autopoiesis. This talk will give an introduction to the ideas of second-order cybernetics.
    https://constructivist.info/articles/13/1/011.kauffman.pdf
    Johannes Drever studied computer science at TUM with a focus on machine learning and did a PhD at the Department of Neurology at LMU. He works as a software developer at Linova on applications in the aviation, health and torsional vibration industries. He is a Haskell enthusiast and has a growing interest in applied category theory.
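    To make the idea of a group convolution concrete, here is a minimal Python sketch (assuming NumPy and SciPy are available) that uses the small cyclic group C4 of 90-degree rotations instead of the full constructions from the papers. It checks the defining equivariance property: rotating the input does not scramble the features, it just rotates each feature map and cyclically permutes the rotation channels.

    import numpy as np
    from scipy.signal import correlate2d

    def c4_lifting_conv(image, kernel):
        """First layer of a C4 group convolution: correlate the image with all
        four rotated copies of the kernel, giving one feature map per rotation."""
        return np.stack([
            correlate2d(image, np.rot90(kernel, k), mode="same")
            for k in range(4)
        ])

    rng = np.random.default_rng(0)
    img = rng.normal(size=(8, 8))
    ker = rng.normal(size=(3, 3))

    out = c4_lifting_conv(img, ker)
    out_rot = c4_lifting_conv(np.rot90(img), ker)

    # Equivariance: rotating the input by 90 degrees rotates each feature map
    # and shifts the rotation channels by one position.
    expected = np.stack([np.rot90(out[(k - 1) % 4]) for k in range(4)])
    print(np.allclose(out_rot, expected))  # True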

  • Coffee and Papers: Talking about trees

    Venue visible to members

    This is going to be a meetup in a cafe: grab some food, pick a favourite beverage and enjoy free, informal discussion about papers. Rules: pick one of the following papers (or several), leave a note saying that you'll be concentrating on it, and have a rough idea of what's going on in the paper. Subject: everything about trees and modern ways of writing (and reading) things to (and from) different types of disks.
    LSM Trees: http://db.cs.berkeley.edu/cs286/papers/lsm-acta1996.pdf
    Cache Oblivious B-Trees: http://erikdemaine.org/papers/FOCS2000b/paper.pdf
    If anyone has suggestions on where to eat, please share them. Otherwise we'll pick a cafe/restaurant in Maxvorstadt or Schwabing or nearby.
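    For a concrete mental model before reading, the core LSM-tree idea boils down to buffering writes in memory and flushing them as immutable sorted runs that are searched newest-first. The following is a deliberately tiny Python sketch of that idea only; the flush threshold is arbitrary and background compaction is left out, both simplifying assumptions.

    import bisect

    class TinyLSM:
        def __init__(self, memtable_limit=4):
            self.memtable = {}        # newest writes, mutable, in memory
            self.runs = []            # immutable sorted runs, newest first
            self.memtable_limit = memtable_limit

        def put(self, key, value):
            self.memtable[key] = value
            if len(self.memtable) >= self.memtable_limit:
                self._flush()

        def _flush(self):
            self.runs.insert(0, sorted(self.memtable.items()))
            self.memtable = {}

        def get(self, key):
            if key in self.memtable:
                return self.memtable[key]
            for run in self.runs:     # newest to oldest, so later writes win
                i = bisect.bisect_left(run, (key,))
                if i < len(run) and run[i][0] == key:
                    return run[i][1]
            return None

    db = TinyLSM()
    for i in range(10):
        db.put(f"k{i}", i)
    db.put("k3", 33)                  # newer write shadows the flushed one
    print(db.get("k3"), db.get("k9"), db.get("missing"))  # 33 9 None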

  • An Introduction to Statistical Relational Learning (SRL)

    JetBrains Event Space

    Artificial Intelligence is booming, and how! The current trend is to use Deep Learning tools across a multitude of domains. However, there are alternative Machine Learning approaches which can perform robust and accurate reasoning and learning on complex relational data. In this talk, I will speak about one such approach: Markov Logic Networks (MLNs) in Statistical Relational Learning (SRL). An MLN is a powerful framework combining statistical reasoning (Markov Random Fields) and logical reasoning (First Order Logic). I will highlight an example of how it can be used on text to do Natural Language Understanding (NLU) and extract affordances in the robotics domain. This use case shows how large-scale inference on text can be leveraged to give knowledge to artificial assistants. This introductory talk will cover the basics and acquaint you with the state-of-the-art tools. Watch this space for more! JetBrains will be hosting this meetup. The location is the Event Space, which you reach from the right-side entrance of the building.
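    To make the MLN formulation concrete: an MLN assigns each possible world a probability proportional to exp(sum_i w_i * n_i), where n_i counts the true groundings of the i-th weighted first-order formula. Below is a minimal brute-force Python sketch over the classic smokers toy domain; the formulas, weights and domain are illustrative assumptions rather than material from the talk, and real MLN systems use lifted or approximate inference instead of enumerating worlds.

    import itertools
    import math

    people = ["Anna", "Bob"]

    # w = 1.5 : Smokes(x) => Cancer(x)
    def n_smoking_causes_cancer(world):
        return sum((not world["Smokes", x]) or world["Cancer", x] for x in people)

    # w = 1.1 : Friends(x, y) & Smokes(x) => Smokes(y)
    def n_friends_smoke_alike(world):
        return sum(
            (not (world["Friends", x, y] and world["Smokes", x])) or world["Smokes", y]
            for x, y in itertools.product(people, repeat=2)
        )

    weighted_formulas = [(1.5, n_smoking_causes_cancer), (1.1, n_friends_smoke_alike)]

    def log_weight(world):
        return sum(w * n(world) for w, n in weighted_formulas)

    # Enumerate every truth assignment to the ground atoms (only feasible for
    # tiny domains) and compute a marginal from the resulting distribution.
    atoms = ([("Smokes", x) for x in people] + [("Cancer", x) for x in people]
             + [("Friends", x, y) for x, y in itertools.product(people, repeat=2)])
    worlds = [dict(zip(atoms, vals))
              for vals in itertools.product([False, True], repeat=len(atoms))]
    Z = sum(math.exp(log_weight(w)) for w in worlds)
    p_bob_smokes = sum(math.exp(log_weight(w)) for w in worlds if w["Smokes", "Bob"]) / Z
    print(round(p_bob_smokes, 3))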

  • Probabilistic Programming

    Codecentric Office Munich

    Interested in Probabilistic Programming? Come over & join our next meetup! We're going to talk about Probabilistic Programming and Bayesian Methods. If you'd like to prepare and learn more, you can read the "Bayesian Methods for Hackers" book: https://github.com/CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers
    We'll host two talks. The first is by Alex Petrov, who is going to talk about Bayesian inference and how to make machines biased: humans have many biases (let's call them priors), and they have their areas of expertise (let's call them hypothesis spaces). We often draw our conclusions swiftly and without much effort, and call it intuition. How do we make machines reason intuitively, and where might that lead? Examples will be given in Anglican, a probabilistic programming framework and successor of Goodman & Tenenbaum's Church language.
    The second talk will be given by Johannes Drever. He will present some examples from the book "Probabilistic Models of Cognition" (https://probmods.org/) in the Church programming language. The paper "Simulation as an engine of physical scene understanding" (http://www.pnas.org/content/110/45/18327.short) serves as an in-depth example of how Monte Carlo simulations may be used as a model of human cognition.
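    To give a flavour of the programming model ahead of the talks: a probabilistic program is essentially a generative model plus conditioning, and the simplest possible inference scheme is rejection sampling. Here is a minimal Python sketch in that spirit; the coin-bias model is a toy assumption, and languages like Church and Anglican provide this kind of conditioning and inference as language primitives.

    import random

    def model():
        # Prior: unknown coin bias, uniform over a small grid of values.
        bias = random.choice([0.1, 0.3, 0.5, 0.7, 0.9])
        # Likelihood: simulate 5 flips of a coin with that bias.
        heads = sum(random.random() < bias for _ in range(5))
        return bias, heads

    def rejection_sample(observed_heads, n_samples=5000):
        """Keep only prior samples whose simulated data matches the observation;
        the surviving bias values approximate the posterior distribution."""
        accepted = []
        while len(accepted) < n_samples:
            bias, heads = model()
            if heads == observed_heads:   # condition on the observed data
                accepted.append(bias)
        return accepted

    posterior = rejection_sample(observed_heads=4)
    for b in [0.1, 0.3, 0.5, 0.7, 0.9]:
        print(b, round(posterior.count(b) / len(posterior), 3))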

  • Primer on Bayesian Statistics

    Needs a venue

  • Neural networks and machine learning

    STYLIGHT GmbH

    The next meetup will be about neural networks and machine learning. Neural networks have had a successful revival under the label "deep learning" in recent years, and we'll try to get an overview of recent developments in this meetup. So far we have a presentation of a classical paper which highlights how the analysis of the visual cortex leads to new learning algorithms ("sparse coding" [1]). It would be great if we could find further presentations on deep learning, computer vision, NLP, recurrent neural networks and large-scale machine learning. For a comprehensive reading list see [2].
    Presentations:
    * Sparse Coding with an Overcomplete Basis Set: A Strategy Employed by V1? (Johannes Drever) [1]
    * Introduction to RBMs (Markus Liedl)
    * Introduction to deep learning theory: what has changed in recent years and why it is getting so much attention right now (Sergii Khomenko)
    [1] http://redwood.berkeley.edu/bruno/papers/VR.pdf
    [2] http://deeplearning.net/reading-list/
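    For a concrete feel of the sparse coding objective in [1]: a signal y is explained as a sparse combination of an overcomplete dictionary D by minimising 0.5*||y - D x||^2 + lam*||x||_1. The sketch below (NumPy assumed) solves that inference problem with plain iterative soft-thresholding on a random dictionary; it covers only the coding step, not the dictionary learning that produces the V1-like basis in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n_pixels, n_atoms = 16, 64           # overcomplete: more atoms than pixels
    D = rng.normal(size=(n_pixels, n_atoms))
    D /= np.linalg.norm(D, axis=0)       # unit-norm dictionary atoms

    # Build a signal from a known 3-sparse code so recovery can be inspected.
    true_code = np.zeros(n_atoms)
    true_code[rng.choice(n_atoms, size=3, replace=False)] = 3.0
    y = D @ true_code

    def ista(y, D, lam=0.1, n_iter=500):
        """Iterative soft-thresholding for min_x 0.5*||y - Dx||^2 + lam*||x||_1."""
        L = np.linalg.norm(D, 2) ** 2    # Lipschitz constant of the gradient
        x = np.zeros(D.shape[1])
        for _ in range(n_iter):
            x = x - D.T @ (D @ x - y) / L                           # gradient step
            x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # shrinkage
        return x

    code = ista(y, D)
    print("non-zero coefficients:", int(np.sum(np.abs(code) > 1e-3)))
    print("reconstruction error:", float(np.linalg.norm(D @ code - y)))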

  • Probabilistic Data Structures

    Codecentric Office Munich

    The next meetup will be about probabilistic data structures: what they are and what they are used for. It is intended more as an overview of the different data structures than of sophisticated use cases. The following article provides a good overview of the topic: [1]. Wikipedia is also a good starting point [2]. Some (definitely not all) interesting papers are:
    • Bloom filter: Space/Time Trade-offs in Hash Coding with Allowable Errors [3]
    • Skip list: Skip Lists: A Probabilistic Alternative to Balanced Trees [4]
    • Count–min sketch: An Improved Data Stream Summary: The Count-Min Sketch and its Applications [5]
    • Hidden Markov Models: [6], [7], [8]
    Agenda:
    19:00 - 19:30 Socializing
    19:30 - 20:30 Talks: Bloom Filters - Stefan Seelmann; Count-Min Sketch - Alex Petrov; Hidden Markov Models - Juan Miguel Cejula
    20:30 - 20:45 Break
    20:45 - 21:30 Discussion
    For the rest of the evening, starting to create something would be nice (for those who haven't had enough). Some ideas:
    • Implementing one or multiple data structures in a language of choice
    • Creating browser-based animations of how some of the data structures work
    • Evaluating existing implementations
    • Outlining a (non-scientific) magazine article or conference talk
    [1] https://highlyscalable.wordpress.com/2012/05/01/probabilistic-structures-web-analytics-data-mining/
    [2] http://en.wikipedia.org/wiki/Category:Probabilistic_data_structures
    [3] http://astrometry.net/svn/trunk/documents/papers/dstn-review/papers/bloom1970.pdf
    [4] ftp://ftp.cs.umd.edu/pub/skipLists/skiplists.pdf
    [5] http://dimacs.rutgers.edu/~graham/pubs/papers/cm-full.pdf
    [6] http://www.cs.ubc.ca/~murphyk/Bayes/rabiner.pdf
    [7] http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2766791/
    [8] http://cit.mak.ac.ug/staff/pnabende/papers/cisse08_pnabende.pdf
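    As a warm-up for the implementation ideas above, here is a minimal Bloom filter sketch in Python: k hash functions set k bits per inserted key, so membership queries may return false positives but never false negatives. The bit-array size and number of hashes are arbitrary illustration values.

    import hashlib

    class BloomFilter:
        def __init__(self, n_bits=1024, n_hashes=4):
            self.n_bits = n_bits
            self.n_hashes = n_hashes
            self.bits = bytearray(n_bits)        # one byte per bit, for simplicity

        def _positions(self, key):
            # Derive k positions by salting a single hash function with an index.
            for i in range(self.n_hashes):
                digest = hashlib.sha256(f"{i}:{key}".encode()).hexdigest()
                yield int(digest, 16) % self.n_bits

        def add(self, key):
            for pos in self._positions(key):
                self.bits[pos] = 1

        def __contains__(self, key):
            return all(self.bits[pos] for pos in self._positions(key))

    bf = BloomFilter()
    for word in ["bloom", "skip-list", "count-min"]:
        bf.add(word)
    print("bloom" in bf)        # True
    print("hyperloglog" in bf)  # False, except with small false-positive probability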

  • Bootstrap Meetup: Consensus Algorithms: Paxos and Raft

    Hi everyone, let's meet and talk about consensus algorithm papers:
    In Search of an Understandable Consensus Algorithm: https://ramcloud.stanford.edu/raft.pdf
    The Part-Time Parliament: http://research.microsoft.com/en-us/um/people/lamport/pubs/lamport-paxos.pdf
    Paxos Made Simple: http://pdos.csail.mit.edu/6.824/papers/paxos-simple.pdf
    Who organises it? It's community-organised. If you'd like to co-organise, moderate or help in any other way, ping Alex and he'll add you to the admins.
    How does it work? We all read the papers offline, then we meet, start a discussion, hack, try things out, bring up examples from industry and learn more about what we've read. We've got four lightning talks to heat up the discussion:
    - Sorting algorithms based on the 0-1 lemma (Stefan Schwetschke)
    - The RANSAC (random sample consensus) algorithm with a few examples (Kai Wolf)
    - Software systems that use Paxos or Raft (Daniel Mitterdorfer)
    - An overview of how etcd uses the Raft protocol (Christine Koppelt)
    Why should I participate? Because papers are fun! It's actually quite easy to read papers, and we'd like to have more people who know about modern Computer Science and Math trends. That would significantly improve your knowledge and marketability. There are many more papers to read and discuss; you can check some of them on GitHub (https://github.com/papers-we-love/papers-we-love).
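    As a tiny appetiser for the discussion: the safety of both Paxos and Raft ultimately rests on the fact that any two majority quorums of a cluster intersect, so a value accepted by one majority cannot be contradicted later without at least one node noticing. A few lines of Python make that property concrete (the cluster size is an arbitrary illustration value):

    from itertools import combinations

    nodes = {"n1", "n2", "n3", "n4", "n5"}
    majority = len(nodes) // 2 + 1

    quorums = [set(q) for q in combinations(sorted(nodes), majority)]
    # Every pair of majority quorums shares at least one node.
    assert all(q1 & q2 for q1 in quorums for q2 in quorums)
    print(f"{len(quorums)} majority quorums of size {majority}; every pair overlaps")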
