GitHub Project Recommendation | Brancher: A User-Centered Differentiable Probabilistic Inference Package Based on PyTorch

A user-centered Python package for differentiable probabilistic inference.


Site: https://brancher.org/

GitHub repository:

https://github.com/AI-DI/Brancher

Brancher allows the design and training of differentiable Bayesian models using stochastic variational inference. Brancher is based on the deep learning framework PyTorch.
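As a rough illustration of what stochastic variational inference does under the hood (this is plain Python, not Brancher's API), the sketch below fits the mean of a Gaussian variational distribution to a conjugate toy posterior using the reparameterization trick; the toy model, the observation `x_obs = 2.0`, and all names are invented for this example.

```python
import random

random.seed(0)

# Toy model (not Brancher code): z ~ N(0, 1), x | z ~ N(z, 1), observed x = 2.
# The exact posterior is N(1, 1/2), so the fitted mean should approach 1.0.
x_obs = 2.0

mu = 0.0         # mean of the variational distribution q(z) = N(mu, sigma^2)
sigma = 0.7071   # held fixed at the known posterior std to keep the sketch short
lr = 0.05

for step in range(2000):
    grads = []
    for _ in range(10):
        # Reparameterization trick: z is a differentiable function of mu.
        z = mu + sigma * random.gauss(0.0, 1.0)
        # d/dz log p(x, z) = -z + (x - z); q's entropy does not depend on mu.
        grads.append(-z + (x_obs - z))
    mu += lr * sum(grads) / len(grads)

print(abs(mu - 1.0) < 0.3)  # the fitted mean lands near the exact value 1.0
```

Brancher automates exactly this loop for arbitrary models, using PyTorch autograd instead of hand-written gradients.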

Installation

`pip install brancher`

Building a probabilistic model

```
# Imports used throughout the examples below (module paths per the Brancher repo)
from brancher.variables import ProbabilisticModel
from brancher.standard_variables import (NormalVariable, LogitNormalVariable,
                                         DeterministicVariable)
import brancher.functions as BF
from brancher import inference

a = NormalVariable(loc=0., scale=1., name='a')
b = NormalVariable(loc=0., scale=1., name='b')
```

```
c = NormalVariable(loc=a**2 + BF.sin(b),
                   scale=BF.exp(b),
                   name='c')
```

Example: autoregressive modeling

Probabilistic model

```
T = 20
driving_noise = 1.
measure_noise = 0.3
x0 = NormalVariable(0., driving_noise, 'x0')
y0 = NormalVariable(x0, measure_noise, 'y0')
b = LogitNormalVariable(0.5, 1., 'b')

x = [x0]
y = [y0]
x_names = ["x0"]
y_names = ["y0"]
for t in range(1, T):
    x_names.append("x{}".format(t))
    y_names.append("y{}".format(t))
    x.append(NormalVariable(b * x[t-1], driving_noise, x_names[t]))
    y.append(NormalVariable(x[t], measure_noise, y_names[t]))
AR_model = ProbabilisticModel(x + y)
```
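To make the generative process concrete, here is a plain-Python simulation of the same AR(1) dynamics (not Brancher code; `b` is fixed to an arbitrary 0.8 here, whereas in the model above it is a latent variable to be inferred):

```python
import random

random.seed(1)

T = 20
driving_noise = 1.0
measure_noise = 0.3
b = 0.8  # illustrative fixed value for the latent coefficient

# x_t follows the hidden AR(1) chain; y_t is its noisy measurement.
xs = [random.gauss(0.0, driving_noise)]
ys = [random.gauss(xs[0], measure_noise)]
for t in range(1, T):
    xs.append(random.gauss(b * xs[t - 1], driving_noise))
    ys.append(random.gauss(xs[t], measure_noise))

print(len(xs), len(ys))  # → 20 20
```

Only the `y` series is observed; inference recovers `b` and the hidden `x` series from it.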

Observing data

`[yt.observe(data[yt][:, 0, :]) for yt in y]`

Autoregressive variational distribution

```
Qb = LogitNormalVariable(0.5, 0.5, "b", learnable=True)
logit_b_post = DeterministicVariable(0., 'logit_b_post', learnable=True)
Qx = [NormalVariable(0., 1., 'x0', learnable=True)]
Qx_mean = [DeterministicVariable(0., 'x0_mean', learnable=True)]
for t in range(1, T):
    Qx_mean.append(DeterministicVariable(0., x_names[t] + "_mean", learnable=True))
    Qx.append(NormalVariable(BF.sigmoid(logit_b_post) * Qx[t-1] + Qx_mean[t], 1.,
                             x_names[t], learnable=True))
variational_posterior = ProbabilisticModel([Qb] + Qx)
AR_model.set_posterior_model(variational_posterior)
```

Inference

```
inference.perform_inference(AR_model,
                            number_iterations=500,
                            number_samples=300,
                            optimizer="SGD",
                            lr=0.001)
```
