
Bayes by backprop implementation using tfp

I am trying to implement a simple Bayes-by-backprop regression following this tutorial:

[http://krasserm.github.io/2019/03/14/bayesian-neural-networks/](http://krasserm.github.io/2019/03/14/bayesian-neural-networks/)

But my regressor is nowhere near learning. Below is my model; can anyone suggest a simple implementation of the tutorial?

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions


def prior(kernel_size, bias_size, dtype=None):
    # Fixed standard-normal prior over the layer's n weights and biases.
    n = kernel_size + bias_size
    prior_model = tf.keras.Sequential([
        tfp.layers.DistributionLambda(
            lambda t: tfd.MultivariateNormalDiag(loc=tf.zeros(n),
                                                 scale_diag=tf.ones(n))
        )
    ])
    return prior_model


def posterior(kernel_size, bias_size, dtype=None):
    # Learnable multivariate-normal posterior with full covariance
    # (lower-triangular scale) over the layer's weights.
    n = kernel_size + bias_size
    posterior_model = tf.keras.Sequential([
        tfp.layers.VariableLayer(
            tfp.layers.MultivariateNormalTriL.params_size(n), dtype=dtype),
        tfp.layers.MultivariateNormalTriL(n)
    ])
    return posterior_model


model = tf.keras.Sequential([
    tfp.layers.DenseVariational(units=20, input_shape=(1,),
                                make_prior_fn=prior, make_posterior_fn=posterior,
                                kl_weight=1 / x_train.shape[0]),
    tf.keras.layers.ReLU(),
    tfp.layers.DenseVariational(units=20,
                                make_prior_fn=prior, make_posterior_fn=posterior,
                                kl_weight=1 / x_train.shape[0]),
    tfp.layers.DenseVariational(units=1,
                                make_prior_fn=prior, make_posterior_fn=posterior,
                                kl_weight=1 / x_train.shape[0])
])

model.compile(loss=tf.keras.losses.MeanSquaredError(),
              optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              metrics=['mae'])
model.summary()
history = model.fit(x_train, y_train, batch_size=batch_size, epochs=500, verbose=2)
```
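One difference from the linked tutorial worth noting: it trains against a negative log likelihood rather than mean squared error, treating the network output as the mean of a Gaussian over the targets with a fixed observation noise. Below is a minimal sketch of that loss, assuming a fixed noise scale `sigma` (a stand-in for the tutorial's `noise` constant, not a value from this post):

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Assumed fixed observation-noise scale; tune to match your data.
sigma = 1.0

def neg_log_likelihood(y_true, y_pred):
    # Score the observed targets under a Gaussian centred on the
    # network's prediction, and minimise the negative log probability.
    return -tf.reduce_mean(tfd.Normal(loc=y_pred, scale=sigma).log_prob(y_true))

# Drop-in replacement for the MSE loss above:
# model.compile(loss=neg_log_likelihood,
#               optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
#               metrics=['mae'])
```

With a KL-weighted variational model, the data-fit term is usually a log likelihood so the two terms together form the ELBO; plain MSE ignores the noise scale, which can leave the KL term dominating and the regressor apparently not learning.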

submitted by /u/_tfp_beginner
