Hi. I tried to train a slightly more complex network than the one presented in the Bayesian Regression example. It is supposed to learn a simple parabolic dependency. The deterministic version of this model learns it successfully, whereas the Bayesian one only learns something close to a straight line.

It seems that the bug lies in loss = svi.step(torch.tensor(x[:, np.newaxis]), torch.tensor(y[:, np.newaxis])): you are doing inference using only 1 (not 500) sample of y. You should use loss = svi.step(torch.tensor(x[:, np.newaxis]), torch.tensor(y)) so that the shape matches your pyro.sample("obs", ...). A tip for detecting bugs like this: use pyro.enable_validation().
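To make the shape issue concrete, here is a small NumPy-only sketch (the data here is hypothetical, just mimicking a 500-point regression setup like the one in the thread). The point is that y[:, np.newaxis] has shape (500, 1) while the obs site expects shape (500,), and that mismatch can silently broadcast away the 500 independent observations:

```python
import numpy as np

# Hypothetical data standing in for the parabolic regression example.
x = np.linspace(-1.0, 1.0, 500)
y = x ** 2

# The inputs need a trailing feature dimension for the linear layer: (500, 1).
x_in = x[:, np.newaxis]
print(x_in.shape)  # (500, 1)

# But the observations should stay one-dimensional to line up with the
# obs sample site: (500,), not (500, 1).
print(y.shape)             # (500,)
print(y[:, np.newaxis].shape)  # (500, 1) -- the shape that caused the bug
```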

The torch.tensor(y) fix worked. But when I enable validation, training crashes with the following exception:

ValueError: at site "bayesian_model$$$0.weight", invalid log_prob shape
Expected [], actual [8]
Try one of the following fixes:
- enclose the batched tensor in a with iarange(...): context
- .independent(...) the distribution being sampled
- .permute() data dimension

I tried to .independent(1) the normal distribution being sampled in the pyro model, but it didn't work. Could you explain what's going on?

The reason we need this is that Pyro requires us to account for the independent batch dimensions (i.e. the dimensions of log_prob) at every sample site, either by designating them as independent via pyro.iarange or by moving them into event_shape to be summed out via .independent. SVI exploits this independence information. For more details, I would highly recommend going through the shapes tutorial (you can also run the corresponding notebook in tutorials to play around with it).
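To illustrate what .independent(1) does to log_prob shapes, here is a stdlib-only sketch (no Pyro, just plain Python, with an assumed standard-Normal density helper). A distribution over a tensor of shape [8], like the 8-element weight site in the error message, has a log_prob of shape [8] by default; moving that dimension into event_shape sums the per-element log-probs into a single scalar, which is the shape [] that SVI expected:

```python
import math

def normal_log_prob(value, loc=0.0, scale=1.0):
    # Elementwise log density of a univariate Normal distribution.
    return (-0.5 * ((value - loc) / scale) ** 2
            - math.log(scale * math.sqrt(2 * math.pi)))

# A sample of shape [8], like the bayesian_model$$$0.weight site above.
weights = [0.1 * i for i in range(8)]

# Without declaring independence, log_prob has one entry per element: shape [8].
per_element = [normal_log_prob(w) for w in weights]

# .independent(1) moves that batch dimension into event_shape, so the
# per-element log-probs are summed out, giving a scalar log_prob: shape [].
joint = sum(per_element)
print(len(per_element), joint)
```

Wrapping the sample statement in a with iarange(...): context achieves the complementary effect: the dimension stays in batch_shape but is explicitly marked as independent, so SVI knows it may treat those terms separately.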