Various researchers have used one-hidden-layer neural networks (weighted sums of sigmoids) to find the unknown solutions of partial differential equations. The network is embedded within an energy function that encodes the differential equation and its boundary conditions; the energy function is then minimised to obtain the solution. This approach is simpler to implement than established methods such as the Finite Element Method: instead of constructing a complex mesh to fit a particular boundary, the optimisation approach simply evaluates the energy function at points sampled from the domain. Researchers have observed that the sigmoid basis functions of neural networks interpolate smoothly between points with relatively few network nodes. However, the optimisation is at risk of failing or converging inefficiently.
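The energy-minimisation approach described above can be sketched as follows. This is a minimal illustration, not the method of the original work: the toy problem (u'(x) = -u(x) with u(0) = 1, exact solution e^(-x)), the network size, and all names are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

H = 8                             # hidden nodes (illustrative choice)
xs = np.linspace(0.0, 1.0, 20)    # collocation points on the domain

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(p):
    # Flat parameter vector -> input weights, biases, output weights.
    return p[:H], p[H:2*H], p[2*H:]

def u(p, x):
    # One-hidden-layer network: a weighted sum of sigmoids.
    w, b, v = unpack(p)
    return sigmoid(np.outer(x, w) + b) @ v

def du(p, x):
    # Analytic derivative of the network with respect to x.
    w, b, v = unpack(p)
    s = sigmoid(np.outer(x, w) + b)
    return (s * (1.0 - s) * w) @ v

def energy(p):
    # Penalise the ODE residual u' + u = 0 inside the domain,
    # plus the boundary condition u(0) = 1 as a separate term.
    residual = du(p, xs) + u(p, xs)
    bc = u(p, np.array([0.0]))[0] - 1.0
    return np.mean(residual**2) + bc**2

rng = np.random.default_rng(0)
p0 = rng.normal(size=3 * H)
res = minimize(energy, p0, method="BFGS")
err = np.max(np.abs(u(res.x, xs) - np.exp(-xs)))
print(err)   # discrepancy from the exact solution
```

No mesh is constructed; the energy is simply evaluated at the sampled points, which is the implementational simplicity the paragraph above refers to.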

This research examines the choice of solution representations involving artificial neural networks, with the aim of improving the optimisation. It considers two independent factors: first, the type of basis function, in particular sigmoids versus sinusoids; second, whether boundary value information is incorporated algebraically into the solution (so that boundary conditions are satisfied automatically) or specified as a separate constraint term in the energy function. Software has been developed to conduct experiments and gather statistical results on a test bed of simple smooth functions.
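The second factor, the handling of boundary values, can be illustrated with a small sketch. The algebraic construction shown here (a trial solution of the form 1 + x·N(x) for the condition u(0) = 1) is one common device assumed for illustration; the weights and network shape are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
H = 8
w, b, v = rng.normal(size=H), rng.normal(size=H), rng.normal(size=H)

def N(x):
    """Raw one-hidden-layer sigmoid network with arbitrary weights."""
    return (1.0 / (1.0 + np.exp(-(np.outer(x, w) + b)))) @ v

def u_embedded(x):
    """Boundary value built in algebraically: u(0) = 1 for ANY weights,
    so the optimiser never has to enforce the condition itself."""
    return 1.0 + x * N(x)

def boundary_penalty(lam=1.0):
    """Alternative: keep the raw network N and add a separate constraint
    term lam * (N(0) - 1)^2 to the energy function."""
    return lam * (N(np.array([0.0]))[0] - 1.0) ** 2

print(u_embedded(np.array([0.0]))[0])   # exactly 1, by construction
```

With the embedded form the boundary condition holds identically throughout optimisation; with the penalty form it is only satisfied approximately, to the extent the optimiser drives that term to zero.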

The research uses two methods to obtain solutions. First, it employs the optimisation approach to solve interpolation problems and differential equations by minimising an energy function. Second, it introduces a fast method of interpolating known functions using a matrix method. It is postulated that interpolation alone could be a sufficient test for assessing solution representations.
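One plausible reading of a "matrix method" for fast interpolation is sketched below, under the assumption that the hidden weights are held fixed: the network output is then linear in its output weights, which can be found in a single least-squares solve rather than by iterative optimisation. The target function, sizes, and names are illustrative, not taken from the original text.

```python
import numpy as np

rng = np.random.default_rng(2)
H = 30                                  # hidden nodes (illustrative)
xs = np.linspace(0.0, 1.0, 50)
target = np.sin(2 * np.pi * xs)         # known smooth function to interpolate

w = rng.normal(scale=4.0, size=H)       # fixed hidden weights
b = rng.uniform(-4.0, 4.0, size=H)      # fixed hidden biases

# Design matrix: each column is one sigmoid basis function evaluated
# at every sample point.
Phi = 1.0 / (1.0 + np.exp(-(np.outer(xs, w) + b)))

# Output weights from one linear least-squares solve.
v, *_ = np.linalg.lstsq(Phi, target, rcond=None)

err = np.max(np.abs(Phi @ v - target))
print(err)   # worst-case interpolation error at the sample points
```

Because the expensive nonlinear optimisation is bypassed, such a solve gives a quick way to compare how well different basis functions represent a known target, which is consistent with the postulate that interpolation alone may suffice to assess solution representations.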