A novel multi-scale regression and classification method

Neural network algorithms typically use either global (e.g. sigmoidal) or local (e.g. Gaussian) activation functions. Here we propose a new architecture and training method that integrates activation functions from multiple scales while balancing how the input signal is absorbed at local and global scales.
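The contrast between global and local activations can be illustrated with a minimal sketch. The code below is a hypothetical example, not the paper's method: it combines a sigmoid (global response, nonzero far from the origin) and a Gaussian (local response, decaying to zero) via a mixing weight `alpha`, which in a trained model would be a learnable parameter; the function names and the `width` parameter are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    # Global activation: saturates but stays nonzero far from the origin
    return 1.0 / (1.0 + np.exp(-x))

def gaussian(x, width=1.0):
    # Local activation: response decays to zero away from its center
    return np.exp(-(x / width) ** 2)

def multi_scale_activation(x, alpha=0.5, width=1.0):
    # Convex combination balancing local and global responses;
    # alpha would be trained jointly with the network weights
    return alpha * sigmoid(x) + (1.0 - alpha) * gaussian(x, width)

x = np.linspace(-5.0, 5.0, 11)
y = multi_scale_activation(x)
```

At the origin the mixture returns `0.5 * sigmoid(0) + 0.5 * gaussian(0) = 0.75`, while far from the origin the Gaussian term vanishes and only the scaled sigmoid remains.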