About:
pycobra is a Python library for ensemble learning, serving as a toolkit for regression, classification, and visualisation. It is scikit-learn compatible and fits into the existing scikit-learn ecosystem.
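
The COBRA aggregation scheme behind pycobra combines several trained "machines" by averaging hold-out responses at points where every machine roughly agrees with its prediction at the query point. Below is a minimal pure-Python sketch of that idea; the function and variable names are our own simplification, not pycobra's API.

```python
# Minimal sketch of COBRA-style combined regression (illustrative only;
# pycobra's actual API differs -- see its documentation).

def cobra_predict(machines, X_hold, y_hold, x, epsilon):
    """Average y over hold-out points whose machine predictions all lie
    within `epsilon` of the machines' predictions at the query point x."""
    at_query = [m(x) for m in machines]
    selected = [
        y for xi, y in zip(X_hold, y_hold)
        if all(abs(m(xi) - q) <= epsilon for m, q in zip(machines, at_query))
    ]
    if not selected:   # no hold-out point agrees: fall back to the mean machine output
        return sum(at_query) / len(at_query)
    return sum(selected) / len(selected)

# Two toy "machines" standing in for predictors fitted elsewhere.
machines = [lambda x: 2.0 * x, lambda x: 2.0 * x + 0.1]

# Hold-out sample drawn near y = 2x.
X_hold = [0.0, 0.5, 1.0, 1.5]
y_hold = [0.05, 1.0, 2.1, 3.0]

print(cobra_predict(machines, X_hold, y_hold, x=1.0, epsilon=0.3))  # → 2.1
```

With epsilon = 0.3 only the hold-out point at x = 1.0 agrees with both machines, so its response 2.1 is returned.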

Changes:

The project is now fully scikit-learn compatible, implements two new predictor aggregation algorithms, and adds more Jupyter notebooks and examples as well as continuous integration for tests.

Theano 0.9.0 (20th of March, 2017)

Highlights (since 0.8.0):

* Better Python 3.5 support
* Better numpy 1.12 support
* Conda packages for Mac, Linux and Windows
* Support newer Mac and Windows versions
* More Windows integration:
  * Theano scripts (``theano-cache`` and ``theano-nose``) now work on Windows
  * Better support for Windows line endings in C code
  * Support for spaces in paths on Windows
* Scan improvements:
  * More scan optimizations, with faster compilation and gradient computation
  * Support for checkpoints in scan (a trade-off between speed and memory usage, useful for long sequences)
  * Fixed broadcast checking in scan
* Graph improvements:
  * More numerical stability by default for some graphs
  * Better handling of corner cases for Theano functions and graph optimizations
  * More graph optimizations, with faster compilation and execution
  * Smaller and more readable graphs
* New GPU back-end:
  * Removed warp-synchronous programming to get good results with newer CUDA drivers
  * More pooling support on GPU when cuDNN isn't available
  * Full support of the ignore_border option for pooling
  * In-place storage for shared variables
  * float16 storage
  * Use the PCI bus ID of graphics cards for a better mapping between Theano device numbers and nvidia-smi numbers
  * Fixed an offset error in ``GpuIncSubtensor``
* Less C code compilation
* Added support for bool dtype
* Updated and more complete documentation
* Bug fixes related to merge optimizer and shape inference
* Many other bug fixes, crash fixes, and warning improvements
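
The scan checkpoint item above trades recomputation for memory: only every N-th state of a long recurrence is stored, and intermediate states are recomputed from the nearest stored one when needed. The pure-Python sketch below conveys that trade-off conceptually; it is not Theano's scan API.

```python
# Conceptual sketch of the checkpointing trade-off behind scan checkpoints:
# keep only every k-th state of a recurrence, recompute the rest on demand.

def step(s):                      # toy recurrence: s_{t+1} = 2*s_t + 1
    return 2 * s + 1

def run_with_checkpoints(s0, n_steps, every):
    """Run the recurrence, keeping only every `every`-th state."""
    checkpoints = {0: s0}
    s = s0
    for t in range(1, n_steps + 1):
        s = step(s)
        if t % every == 0:
            checkpoints[t] = s
    return checkpoints, s

def state_at(checkpoints, every, t):
    """Recover state t by recomputing forward from the nearest checkpoint."""
    base = (t // every) * every
    s = checkpoints[base]
    for _ in range(t - base):
        s = step(s)
    return s

checkpoints, final = run_with_checkpoints(s0=1, n_steps=12, every=4)
print(len(checkpoints))               # 4 stored states instead of 13
print(state_at(checkpoints, 4, 7))    # state 7 recomputed from checkpoint 4 → 255
```

Storage drops from O(n_steps) to O(n_steps / every), at the cost of up to `every - 1` extra recurrence steps per lookup, which is why this helps most for long sequences.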

About:
BayesOpt is an efficient C++ implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design, and stochastic bandits. In the literature it is also called Sequential Kriging Optimization (SKO) or Efficient Global Optimization (EGO). Interfaces for C, Matlab/Octave, and Python are also provided.
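
At its core, the methodology is a loop: fit a surrogate model to past evaluations, pick the next point by optimizing an acquisition function, evaluate the expensive objective there, and repeat. The toy Python sketch below shows that loop with a crude nearest-neighbour surrogate standing in for the Gaussian-process machinery a real implementation like BayesOpt uses; all names here are our own.

```python
# Toy sketch of the Bayesian optimization loop: surrogate + acquisition.
# A distance-based surrogate replaces the Gaussian process (our simplification).

def objective(x):                           # pretend this is expensive
    return (x - 0.3) ** 2

def lcb(x, obs, kappa=1.0):
    """Lower confidence bound under a nearest-neighbour surrogate:
    mean = value of the closest observation, sigma = distance to it."""
    xi, yi = min(obs, key=lambda p: abs(p[0] - x))
    return yi - kappa * abs(x - xi)

obs = [(0.0, objective(0.0)), (1.0, objective(1.0))]   # initial design
grid = [i / 200 for i in range(201)]                   # candidate points

for _ in range(15):
    x_next = min(grid, key=lambda x: lcb(x, obs))      # optimize acquisition
    obs.append((x_next, objective(x_next)))            # evaluate and record

best_x, best_y = min(obs, key=lambda p: p[1])
print(best_x, best_y)
```

The acquisition function balances exploitation (low predicted value) against exploration (large distance from known points), which is what lets the loop find the minimum near 0.3 in few objective evaluations.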

About:
Hype is a proof-of-concept deep learning library in which you can perform optimization on compositional machine learning systems of many components, even when those components themselves internally perform optimization.
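
The kind of nesting this enables can be illustrated with a toy example: an outer optimizer tunes the learning rate of an inner gradient-descent run based on the loss that run achieves. Hype does this with automatic differentiation; the Python finite-difference sketch below, with names of our own, only conveys the idea.

```python
# Conceptual sketch of optimizing a component that itself optimizes:
# the outer loop tunes the inner learning rate by finite differences on
# the loss the inner gradient-descent run achieves (our simplification).

def inner_loss(w):
    return (w - 3.0) ** 2

def inner_optimize(lr, steps=20, w0=0.0):
    """Plain gradient descent on inner_loss; returns the final loss."""
    w = w0
    for _ in range(steps):
        w -= lr * 2.0 * (w - 3.0)          # gradient of (w - 3)^2
    return inner_loss(w)

def outer_optimize(lr0=0.01, outer_steps=50, meta_lr=1e-4, eps=1e-4):
    """Tune lr itself by a finite-difference gradient of the inner result."""
    lr = lr0
    for _ in range(outer_steps):
        g = (inner_optimize(lr + eps) - inner_optimize(lr - eps)) / (2 * eps)
        lr -= meta_lr * g
    return lr

lr = outer_optimize()
print(lr, inner_optimize(lr))
```

The outer loop treats the entire inner optimization run as a differentiable black box, which is the pattern that compositional systems of optimizing components generalize.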

About:
Optunity is a library containing various optimizers for hyperparameter tuning. Hyperparameter tuning is a recurrent problem in many machine learning tasks, both supervised and unsupervised. This package provides several distinct approaches to solving such problems, along with helpful facilities such as cross-validation and a plethora of score functions.
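
As a rough illustration of what such a tuner does, one can combine k-fold cross-validation with a search over a box of hyperparameters. The sketch below uses plain random search and names of our own; it is not Optunity's API or any of its actual solvers.

```python
# Generic sketch of hyperparameter tuning: random search over a box,
# scoring each candidate with k-fold cross-validation (names are ours).
import random

def k_folds(n, k):
    """Yield (train_indices, test_indices) for k roughly equal folds."""
    idx = list(range(n))
    size = n // k
    for i in range(k):
        test = idx[i * size:(i + 1) * size]
        train = idx[:i * size] + idx[(i + 1) * size:]
        yield train, test

def cv_score(data, fit_predict, k=5):
    """Mean score of fit_predict over k folds (higher is better)."""
    scores = []
    for train, test in k_folds(len(data), k):
        scores.append(fit_predict([data[i] for i in train],
                                  [data[i] for i in test]))
    return sum(scores) / len(scores)

def random_search(score_fn, bounds, num_evals=50, seed=0):
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(num_evals):
        cand = {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}
        s = score_fn(cand)
        if s > best_score:
            best, best_score = cand, s
    return best, best_score

# Toy problem: the "model" is a constant c; the score is negative squared
# error on the held-out fold, so the best c lies near the data mean (3.0).
data = [1.0, 2.0, 3.0, 4.0, 5.0]

def score_fn(params):
    c = params["c"]
    return cv_score(data, lambda train, test:
                    -sum((y - c) ** 2 for y in test) / len(test))

best, best_score = random_search(score_fn, {"c": (0.0, 10.0)})
print(best["c"], best_score)
```

Wrapping the score in cross-validation is what keeps the tuner from simply overfitting the hyperparameters to a single train/test split.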

Changes:

This minor release has the same feature set as Optunity 1.1.0, but incorporates several bug fixes, mostly related to the specification of structured search spaces.