Our goal is to discuss a different approach to Machine Learning based on semantic embeddings into algebraic structures. If you like Mathematics and Machine Learning, you may be interested in this group.
Algebraic and Statistical Learning both aim to derive predictive models of the world from data. Unlike Statistical Learning, however, Algebraic Learning is not affected by the frequency of the input data. This makes Algebraic Learning algorithms robust to the statistical properties of the data: they do not forget what they already know when you teach them something new.
Statistical Learning algorithms use probabilistic and statistical methods (such as entropy or error minimization) to fit the input data; Neural Networks belong to this class. Algebraic Learning, on the other hand, does not minimize a function; instead, it targets model size and algebraic freedom. As a result, Algebraic Learning algorithms do not overfit the data: they are natural generalizers that need no regularization.
Algebraic Machine Learning has a few more properties of interest. It is entirely parameter-free: there are no parameters or hyper-parameters to tune. The algebra grows and shrinks as new data arrives, following strict and universal (problem-independent) rules. Supervised and unsupervised learning are possible with the same algorithm.
Algebraic Machine Learning also allows known facts about the data to be combined with learning from raw data. Most importantly, Algebraic Machine Learning agents can transfer to each other what they have learned, independently of how often they communicate. Agents can cooperate on the same or related problems, which opens the door to large-scale parallelization and collective learning.