A Neural Algorithm of Artistic Style
Presented by Andrew Carroll
How would Georges Seurat have painted the view at Sunset Cliffs? How would Hokusai have painted the breaks off La Jolla? What would a hurricane over San Diego look like? Though we can't change history or the climate to visualize these possibilities, a clever algorithm proposed by Gatys, Ecker, and Bethge, built atop convolutional neural networks, can get us close. "Neural Style Transfer", as it has become known, is a family of techniques that can reimagine a target image in a completely different style, provided by a second image or painting. Recent classics from this collection of techniques include skylines reimagined as though they were the subject of Starry Night, Benedict Cumberbatch seamlessly taking the place of the Mona Lisa, and more recent work on stylizations that maintain structure (like a San Francisco skyline with Parisian night lighting).
In this talk, I'll explain the basic technique, which revolves around a mathematical quantification of two distinct aspects of an image: its style and its content. Both of these are measured from feature maps in a convolutional neural network, so I'll give a light overview of these beasts, grounded in very basic image processing via filters. Are these just cute algorithms, or do they "[offer] a path forward to an algorithmic understanding of how humans create and perceive artistic imagery"? I don't know, but I'm glad the Cumberbatch picture exists (credit: Luan et al.)
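For the curious, the two quantities above boil down to a pair of simple losses over CNN feature maps. Here is a minimal NumPy sketch in the spirit of the Gatys, Ecker, and Bethge paper (variable names and the toy shapes are mine, not theirs): content is compared feature-by-feature, while style is compared through Gram matrices, which capture correlations between feature channels.

```python
import numpy as np

def content_loss(F, P):
    # Content: squared difference between the feature maps of the
    # generated image (F) and the content image (P) at one CNN layer.
    return 0.5 * np.sum((F - P) ** 2)

def gram_matrix(F):
    # Style lives in correlations between feature channels.
    # F has shape (channels, height * width).
    return F @ F.T

def style_loss(F, A):
    # Squared difference between Gram matrices of the generated
    # image (F) and the style image (A) at one layer, with the
    # normalization used in the paper.
    C, M = F.shape  # number of channels, number of spatial positions
    G, S = gram_matrix(F), gram_matrix(A)
    return np.sum((G - S) ** 2) / (4 * C**2 * M**2)
```

Style transfer then optimizes the pixels of a new image so that a weighted sum of these two losses is small: match the content image's features, match the style image's Gram matrices.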
Street parking on 6th, 7th, and 8th Avenues north of B Street is usually easy at that hour. Meters nearby are free after 6. Read the signage before you park on A Street.
If you're interested in presenting a paper please fill out this form (https://docs.google.com/forms/d/e/1FAIpQLScaI-fWdys27-ByT_HdtsJ73V4AxZr0hf1GSqLsQ1IwAaPdIQ/viewform) or talk to us in person at the meetup.