pymc-learn is a library for practical probabilistic
machine learning in Python.

It provides a variety of state-of-the-art probabilistic models for supervised
and unsupervised machine learning. Inspired by scikit-learn, it focuses on bringing
probabilistic machine learning to non-specialists and uses a syntax that mimics
scikit-learn. Emphasis is put on ease of use, productivity, flexibility, performance,
documentation, and an API consistent with scikit-learn. It depends on scikit-learn
and PyMC3 and is distributed under the new BSD 3-clause license,
encouraging its use in both academia and industry.

Users can now have calibrated quantities of uncertainty in their models
using powerful inference algorithms, such as MCMC or variational inference,
provided by PyMC3.
See Why pymc-learn? for a more detailed description of why pymc-learn was
created.
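
As a rough sketch of this shared syntax, assuming pymc-learn exposes a
LinearRegression estimator in pmlearn.linear_model that mirrors scikit-learn's
(only the Gaussian process module appears in the example further below), a model
is built, fit, and queried in the same way in both libraries:

# scikit-learn: ordinary least-squares point estimates
import numpy as np
from sklearn.linear_model import LinearRegression as SkLinearRegression

X = np.random.randn(200, 3)
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * np.random.randn(200)

sk_model = SkLinearRegression()
sk_model.fit(X, y)
sk_predictions = sk_model.predict(X)

# pymc-learn: same interface, but model parameters are estimated with Bayesian
# inference (pmlearn.linear_model.LinearRegression is assumed here)
from pmlearn.linear_model import LinearRegression
pm_model = LinearRegression()
pm_model.fit(X, y)
pm_predictions = pm_model.predict(X)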

The difference between the two models is that pymc-learn estimates model
parameters using Bayesian inference algorithms such as MCMC or variational
inference. This produces calibrated quantities of uncertainty for model
parameters and predictions.

# For regression using Bayesian Nonparametrics
>>> from sklearn.datasets import make_friedman2
>>> from pmlearn.gaussian_process import GaussianProcessRegressor
>>> from pmlearn.gaussian_process.kernels import DotProduct, WhiteKernel
>>> X, y = make_friedman2(n_samples=500, noise=0, random_state=0)
>>> kernel = DotProduct() + WhiteKernel()
>>> gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)
>>> gpr.score(X, y)
0.3680...
>>> gpr.predict(X[:2,:], return_std=True)
(array([653.0..., 592.1...]), array([316.6..., 316.6...]))
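
Here, passing return_std=True to predict returns the standard deviation of each
prediction alongside its mean, so the second array quantifies the model's
predictive uncertainty.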

Recent research has led to the development of variational inference algorithms
that are fast and almost as flexible as MCMC. For instance, Automatic
Differentiation Variational Inference (ADVI) is illustrated in the code below.
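
The snippet below is a minimal sketch of that usage. It assumes that a
pymc-learn estimator's fit method accepts an inference_type argument, with
'advi' selecting variational inference, and it reuses the assumed
LinearRegression estimator from the earlier example:

import numpy as np
from pmlearn.linear_model import LinearRegression  # assumed estimator, as above

X = np.random.randn(1000, 2)
y = X @ np.array([2.0, -1.0]) + 0.5 * np.random.randn(1000)

model = LinearRegression()
# inference_type='advi' is assumed to switch the fit from MCMC sampling to
# Automatic Differentiation Variational Inference
model.fit(X, y, inference_type='advi')

y_pred = model.predict(X[:5])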