
Category Theory and Machine Learning

CreateAMind
Published: 2023-11-30

Category Theory ∩ Machine Learning

Category theory has been finding increasing applications in machine learning. This repository aims to list all of the relevant papers, grouped by field.

For an introduction to the ideas behind category theory, check out this link.

The following is from https://www.brunogavranovic.com/research_philosophy.html:

Bruno Gavranović


Research Philosophy

My research focuses on understanding the fundamental principles of intelligence through the lens of category theory. Here the method and the goal are intertwined: to understand the abstract nature of intelligence, that is, how we structure thoughts in our minds, it is first necessary to find a better way to structure those thoughts. Category theory is the tool for doing that. This is a strong claim, and one that calls for an explanation; I hope to eventually find the time to write one.

The notion of intelligence overlaps to a large extent with notions such as life, autopoiesis, philosophy of mind, consciousness, and cybernetics, all of which have always carried a mythical component. But with the science and mathematics of the 21st century, we are finally starting to have enough background to ask the right questions. My work so far is mostly theoretical (see the papers), but the plan is to eventually start writing actual code. More importantly, the very problem of understanding intelligence may turn out to be meaningless or ill-posed, for reasons as yet unknown. I consider delimiting the problem and asking the right questions to be progress in itself.

"Understanding the problem is almost equivalent to knowing the answer." - Tom Leinster, Basic Category Theory

In essence, by investing my resources here I am betting that the ingredient needed to make progress on these abstract questions of intelligence and consciousness is the study of abstraction itself: category theory.[1] I will eventually write down more of my thoughts on all of this: how I understand the role of category theory in modern science, and how it can be used to understand machine learning.

I am still learning how best to articulate this long-term research philosophy. These remain rough notes that I keep coming back to in order to reflect on and improve them. Please challenge anything that seems unclear or unconvincing. I am also trying to scale up these efforts, so if your goals overlap with mine, feel free to get in touch!

While that is the long-term goal, the work I am doing now, during my PhD, is more concrete. I am still using category theory, but now in order to formalize and distill the essence of information flow in artificial neural networks, with a detour through game theory. This lets me make tangible, measurable progress on the long-term research goal.

I have written a number of papers decomposing the building blocks of neural networks into composable parts: automatic differentiation in terms of reverse derivative categories and lenses, neural network weights in terms of the Para construction, and various neural network parameterizations as compositions of lenses. I am currently working on formalizing specific neural network architectures, in particular generative adversarial networks, with the aim of connecting their game-theoretic properties to existing work in compositional game theory.
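
To make the lens-based picture of backpropagation above concrete, here is a minimal sketch in Haskell. It is an illustration under simplifying assumptions, not a definition taken from any of the papers listed below; the names ParaLens, compose, and linear are hypothetical. A parametric lens pairs a forward pass with a backward pass that turns an output gradient into a parameter update and an input gradient, and composing two of them reproduces the chain rule of reverse-mode differentiation.

    -- A minimal sketch of a parametric lens (all names are illustrative):
    -- a forward pass, plus a backward pass mapping an output gradient to a
    -- parameter update and an input gradient.
    data ParaLens p a b = ParaLens
      { forward  :: p -> a -> b
      , backward :: p -> a -> b -> (p, a)
      }

    -- Sequential composition: forward passes chain left to right, backward
    -- passes chain right to left, mirroring the chain rule of reverse-mode
    -- automatic differentiation.
    compose :: ParaLens p a b -> ParaLens q b c -> ParaLens (p, q) a c
    compose f g = ParaLens fw bw
      where
        fw (p, q) a = forward g q (forward f p a)
        bw (p, q) a c =
          let b        = forward f p a
              (q', b') = backward g q b c
              (p', a') = backward f p a b'
          in  ((p', q'), a')

    -- A one-dimensional "linear layer" y = p * x: the parameter gradient
    -- is dy * x and the input gradient is dy * p.
    linear :: ParaLens Double Double Double
    linear = ParaLens (\p x -> p * x) (\p x dy -> (dy * x, dy * p))

    main :: IO ()
    main = print (backward (compose linear linear) (2, 3) 1 1)

Running main backpropagates through two composed linear layers and prints ((3.0,2.0),6.0): an update for each layer's weight, plus the gradient flowing out of the composite's input. The papers above make precise the sense in which this composition law is functorial.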

  1. Category theory is used here as an umbrella term that includes type theory, homotopy theory, and their various (higher) combinations.

There might be papers missing, and some papers are in multiple fields. Feel free to contribute to this list - preferably by creating a pull request.

Survey papers

  • Category Theory in Machine Learning

Deep Learning

  • Categorical Foundations of Gradient-Based Learning
  • Backprop as Functor
  • Lenses and Learners
  • Reverse Derivative Ascent
  • Dioptics
  • Learning Functors using Gradient Descent (longer version here)
  • Compositionality for Recursive Neural Networks
  • Deep neural networks as nested dynamical systems
  • Neural network layers as parametric spans
  • Categories of Differentiable Polynomial Circuits for Machine Learning

Graph Neural Networks

  • Graph Neural Networks are Dynamic Programmers
  • Natural Graph Networks
  • Local Permutation Equivariance For Graph Neural Networks
  • Sheaf Neural Networks
  • Sheaf Neural Networks with Connection Laplacians
  • Neural Sheaf Diffusion: A Topological Perspective on Heterophily and Oversmoothing in GNNs
  • Graph Convolutional Neural Networks as Parametric CoKleisli morphisms
  • Learnable Commutative Monoids for Graph Neural Networks
  • Sheaf Neural Networks for Graph-based Recommender Systems
  • Asynchronous Algorithmic Alignment with Cocycles
  • Topologically Attributed Graphs for Shape Discrimination

Differentiable programming / automatic differentiation

  • Functorial String Diagrams for Reverse-Mode Automatic Differentiation
  • Differentiable Causal Computations via Delayed Trace
  • Simple Essence of Automatic Differentiation
  • Reverse Derivative Categories
  • Towards formalizing and extending differential programming using tangent categories
  • Correctness of Automatic Differentiation via Diffeologies and Categorical Gluing
  • Denotationally Correct, Purely Functional, Efficient Reverse-mode Automatic Differentiation
  • Higher Order Automatic Differentiation of Higher Order Functions
  • Space-time tradeoffs of lenses and optics via higher category theory
  • Using Rewrite Strategies for Efficient Functional Automatic Differentiation (https://arxiv.org/abs/2307.02447)

Probability theory

  • Markov categories
  • Markov Categories and Entropy
  • Infinite products and zero-one laws in categorical probability
  • A Convenient Category for Higher-Order Probability Theory
  • Bimonoidal Structure of Probability Monads
  • Representable Markov Categories and Comparison of Statistical Experiments in Categorical Probability
  • De Finetti's construction as a categorical limit
  • A Probability Monad as the Colimit of Spaces of Finite Samples
  • A Probabilistic Dependent Type System based on Non-Deterministic Beta Reduction
  • Probability, valuations, hyperspace: Three monads on Top and the support as a morphism
  • Categorical Probability Theory
  • Information structures and their cohomology
  • Computable Stochastic Processes
  • Compositional Semantics for Probabilistic Programs with Exact Conditioning

Bayesian/Causal inference

  • The Compositional Structure of Bayesian Inference
  • Dependent Bayesian Lenses: Categories of Bidirectional Markov Kernels with Canonical Bayesian Inversion
  • A category theory framework for Bayesian Learning
  • Causal Theories: A Categorical Perspective on Bayesian Networks
  • Bayesian machine learning via category theory
  • A Categorical Foundation for Bayesian probability
  • Bayesian Open Games
  • Causal Inference by String Diagram Surgery
  • Disintegration and Bayesian Inversion via String Diagrams
  • Categorical Stochastic Processes and Likelihood
  • Bayesian Updates Compose Optically
  • Automatic Backward Filtering Forward Guiding for Markov processes and graphical models
  • Compositionality in algorithms for smoothing
  • A Channel-Based Perspective on Conjugate Priors
  • A Type Theory for Probabilistic and Bayesian Reasoning
  • Denotational validation of higher-order Bayesian inference
  • The Geometry of Bayesian Programming
  • Relational Reasoning for Markov Chains in a Probabilistic Guarded Lambda Calculus

Topological Data Analysis

  • On Characterizing the Capacity of Neural Networks using Algebraic Topology
  • Persistent-Homology-based Machine Learning and its Applications - A Survey
  • Topological Expressiveness of Neural Networks

Metric space magnitude

  • Approximating the convex hull via metric space magnitude
  • Practical applications of metric space magnitude and weighting vectors
  • Weighting vectors for machine learning: numerical harmonic analysis applied to boundary detection
  • The magnitude vector of images
  • Magnitude of arithmetic scalar and matrix categories

Blog posts

  • Neural Networks, Types, and Functional Programming
  • Towards Categorical Foundations of Learning
  • Graph Convolutional Neural Networks as Parametric CoKleisli morphisms
  • Optics vs Lenses, Operationally
  • Meta-learning and Monads

Automata Learning

  • Automata Learning: A Categorical Perspective
  • A Categorical Framework for Learning Generalised Tree Automata

Misc

  • Generalized Convolution and Efficient Language Recognition
  • General supervised learning as change propagation with delta lenses
  • From Open Learners to Open Games
  • Learners Languages
  • A Constructive, Type-Theoretic Approach to Regression via Global Optimisation
  • Characterizing the invariances of learning algorithms using category theory
  • Functorial Manifold Learning
  • Diegetic representation of feedback in open games
  • Assessing the Unitary RNN as an End-to-End Compositional Model of Syntax
  • Classifying Clustering Schemes
  • Category Theory for Quantum Natural Language Processing
  • Categorical Hopfield Networks
  • Categorification of Group Equivariant Neural Networks
  • Equivariant Single View Pose Prediction Via Induced and Restricted Representations
  • A Category-theoretical Meta-analysis of Definitions of Disentanglement
  • Isomorphism, Normalizing Flows, and Density Estimation: Preserving Relationships Between Data

An attempt at defining life using mathematical categories
