Because a conceptor, which achieves direction-selective damping of high-dimensional network signals, usually takes the form of a projection matrix and is derived analytically, conceptor-based neural networks have been considered untrainable end to end with backpropagation and gradient-descent algorithms. This limitation restricts the application of conceptors. To address this issue, an algorithm is proposed to train conceptor-based neural networks end to end with gradient-descent algorithms. To the best of the authors' knowledge, this is the first such end-to-end training algorithm. To develop it, a softmax-like loss function involving conceptors is constructed empirically. Based on this loss function, the corresponding gradients are derived using the backpropagation method, making it possible to train conceptor-based neural networks end to end with a gradient-descent algorithm. Several experiments are conducted to demonstrate the feasibility and effectiveness of the proposed training algorithm.
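As background for the terms above, the sketch below illustrates the standard analytic conceptor computation, C = R(R + α⁻²I)⁻¹ with R the correlation matrix of network states and α the aperture, together with a *hypothetical* softmax-like loss over conceptor evidences zᵀCⱼz. The loss function, the aperture value, and the toy data are illustrative assumptions, not the paper's actual construction:

```python
import numpy as np

def conceptor(X, aperture=10.0):
    """Analytic conceptor for a batch of state vectors.

    X: (dim, n_samples) matrix of network states.
    Returns C = R (R + aperture**-2 * I)^{-1}, where R = X X^T / n
    is the state correlation matrix (the standard analytic form).
    """
    dim, n = X.shape
    R = X @ X.T / n                      # state correlation matrix
    return R @ np.linalg.inv(R + aperture ** -2 * np.eye(dim))

def softmax_conceptor_loss(z, conceptors, label):
    """Hypothetical softmax-like loss over conceptor evidences.

    Evidence for class j is z^T C_j z; a softmax over the evidences
    gives class probabilities, and the loss is the negative
    log-probability of the true class. Illustrative stand-in for the
    empirically constructed loss described in the abstract.
    """
    ev = np.array([z @ C @ z for C in conceptors])
    ev = ev - ev.max()                   # numerical stability
    p = np.exp(ev) / np.exp(ev).sum()
    return -np.log(p[label])

# Toy usage: two classes of states with different dominant directions.
rng = np.random.default_rng(0)
X0 = rng.normal(size=(4, 200)) * np.array([3.0, 1.0, 0.2, 0.2])[:, None]
X1 = rng.normal(size=(4, 200)) * np.array([0.2, 0.2, 1.0, 3.0])[:, None]
Cs = [conceptor(X0), conceptor(X1)]
z = np.array([3.0, 1.0, 0.2, 0.2])       # a state aligned with class 0
loss0 = softmax_conceptor_loss(z, Cs, label=0)
loss1 = softmax_conceptor_loss(z, Cs, label=1)
# loss0 < loss1: the true class receives the higher probability.
```

Because the loss is built from matrix products of the conceptors and states, it is differentiable in the network parameters, which is what permits the end-to-end gradient-descent training claimed above.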