GOst Meetup

Details

Hi folks!✌

Want to try something new with your favorite programming language?🤔 Or do you just want to optimize your development workflow for large Go applications? Then join us for talks from Google Developer Expert Ellen Körbes, who works in developer relations at Garden. Together we’ll go through two talks: one on speeding up your workflow⚡ and one on building a neural network from scratch in Go🤖.

Agenda

18:45 - 19:00 - Registration
19:00 - 19:15 - Opening
19:15 - 20:00 - Go & Kubernetes Sitting In A Tree
20:00 - 20:15 - Break
20:15 - 21:00 - Learn Neural Networks With Go—Not Math!
21:00 - 21:30 - Networking

Location: Sigma Software Office (Naukova street 7D).
Since the number of attendees is limited, please RSVP “Yes” only if Go development is relevant to you. 🙏

We are waiting for you 😊

Talk 1: Go & Kubernetes Sitting In A Tree
Developing Go applications for Kubernetes commonly involves a whole lot of manual recompiling, waiting, pushing, and pulling.
In this talk, Ellen Körbes will show you how Garden tackled this issue to achieve the snappiest Go development experience possible, including issues and time sinks such as:
- Juggling dependencies, vendoring, and goproxy
- Dealing with huge Docker images
- Does removing debugging information from binaries help?
- When to re-build the container vs. just update files in it?
- Should we recompile locally or in a running container?
...and more. If you're writing Go applications for Kubernetes and looking for a snappier workflow that stays out of your face, this talk's for you.

Talk 2: Learn Neural Networks With Go—Not Math!
Even the most amazing programmers may not have the first clue about math. That makes learning neural networks particularly inaccessible, as an integral part of explaining them relies on mathematical formulas. Ah, the formulas… with all their lines and curves and ancient symbols, they’re just as unintelligible as they are beautiful.
What’s a better way to learn instead? With a language we all speak: code.
Ellen Körbes dives into every component required to write a neural network from scratch, like network structure, activation functions, forward propagation, gradient descent, and backpropagation. But you’ll look at them as a programmer: defining what you’re trying to achieve, then writing an implementation for it. And you’ll do it using only Go code—no specialized libraries like TensorFlow or PyTorch required. So if you ever wanted to really understand how a neural network works but thought it to be out of your reach because of the math, this is for you. Code, not math! Algorithms, not logarithms!