
Intro to Ergodicity Economics & Clustering and Other Tasks Using the Fermat Distance

Hosted By
Federico C.

Details

The meetup will take place at the LambdaClass office (https://lambdaclass.com), a few blocks from the Obelisco, at Tucumán 840, 5th Floor, Apartment C.

Intro to Ergodicity Economics
By Marcos Feole

The paper we are going to discuss is written by Ole Peters and can be downloaded from here: https://ergodicityeconomics.com/lecture-notes/

Ergodicity in finance and economics leads to a reinterpretation of some fundamental concepts of standard economic theory, by introducing the time average (instead of the classical expectation value, i.e. the ensemble average) into the analysis of the return of an investment portfolio over time. The talk will introduce the topic through simple but illuminating examples of the underlying problem, examples that also show the approach's enormous predictive and explanatory power beyond the classical results. The talk includes reviews of the following bibliography:
(1) Ergodicity Economics lecture notes. O. Peters and A. Adamou. 2018. Chapters 1 and 5.
(2) Evaluating gambles using dynamics. M. Gell-Mann and O. Peters. 2016.
(3) Leverage efficiency. O. Peters and A. Adamou. arXiv:1101.4548v2, 2017.
(4) Optimal leverage from non-ergodicity. O. Peters. 2011.
(5) More resources: https://ergodicityeconomics.com
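The central point, that the ensemble average and the time average of a multiplicative gamble can disagree, can be sketched numerically. The +50%/-40% coin flip below is the standard example discussed in reference (2); the function names and simulation parameters are illustrative choices of mine, not taken from the lecture notes.

```python
import math
import random

UP, DOWN = 1.5, 0.6  # wealth multipliers for heads (+50%) and tails (-40%)

def ensemble_growth_factor():
    # Expectation value of the per-round wealth multiplier (ensemble average).
    return 0.5 * UP + 0.5 * DOWN  # 1.05 > 1: looks like a winning gamble

def time_average_growth_rate():
    # Long-run exponential growth rate of a single trajectory (time average).
    return 0.5 * (math.log(UP) + math.log(DOWN))  # about -0.053 < 0: wealth decays

def simulated_growth_rate(rounds=10_000, seed=0):
    # Empirical per-round log-growth of one simulated wealth trajectory.
    rng = random.Random(seed)
    log_wealth = 0.0
    for _ in range(rounds):
        log_wealth += math.log(UP if rng.random() < 0.5 else DOWN)
    return log_wealth / rounds
```

For this gamble the expectation grows by 5% per round, yet almost every individual trajectory shrinks at roughly 5% per round in the long run; this divergence is the non-ergodicity the talk builds on.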

---

Clustering and other learning tasks with Fermat distance
By Pablo Groisman. The paper: https://arxiv.org/abs/1810.09398

Several learning tasks, like clustering, require a notion of distance between data points. The easiest choice is Euclidean distance, but this is not suitable in general. A typical example is k-means, the seminal clustering algorithm that is known to fail when clusters have a nonlinear geometry. The list of tasks also includes classification, density estimation, regression and manifold learning.

We will introduce the (empirical and population) Fermat distance, a notion of distance that recovers the geometry of the manifold and the density that generates the data as the sample size goes to infinity.
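As a rough sketch of the empirical Fermat distance: for a sample of points and a parameter alpha >= 1, the distance between two sample points is the cost of the cheapest path through the sample, where a hop of Euclidean length d costs d**alpha. Larger alpha penalizes long hops, so optimal paths follow dense regions of the data. The Dijkstra-based implementation below is my own minimal version under that reading of the definition, not the authors' code; see the paper for the precise formulation.

```python
import heapq
import math

def fermat_distance(points, src, dst, alpha=2.0):
    """Empirical Fermat distance between points[src] and points[dst]:
    shortest path through the sample where a hop of Euclidean length d
    costs d ** alpha (alpha >= 1)."""
    n = len(points)
    dist = [math.inf] * n
    dist[src] = 0.0
    pq = [(0.0, src)]  # (cost so far, point index)
    while pq:
        d, i = heapq.heappop(pq)
        if i == dst:
            return d
        if d > dist[i]:
            continue  # stale queue entry
        for j in range(n):
            if j == i:
                continue
            nd = d + math.dist(points[i], points[j]) ** alpha
            if nd < dist[j]:
                dist[j] = nd
                heapq.heappush(pq, (nd, j))
    return dist[dst]
```

For example, with three collinear points at x = 0, 1, 2 and alpha = 2, the direct hop from end to end costs 4 while the path through the middle point costs 1 + 1 = 2, so the Fermat distance prefers the route through the sample.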

We will show that replacing the Euclidean distance with the Fermat distance improves the performance of several learning algorithms (e.g. k-means, spectral clustering, Naive Bayes classifiers, topological data analysis).

Depending on time and audience interest, we will sketch some proofs.

Papers We Love Buenos Aires