The RBF model fits the data better and agrees better with background knowledge of the variables. I thought of calculating the min/max slope of each fitted line in order to prune out neural-network fits with steep cliffs in favor of gentler curves. Identifying crossing lines would be another way to prune, but that's a different question.

Thanks for any suggestions.

Note: I used ggplot2 here and tagged the question accordingly, but that doesn't mean it couldn't be accomplished with some other function. I just wanted to visually illustrate why I'm trying to do this. I suppose a loop could do this with (y1 - y0)/(x1 - x0), but perhaps there's a better way?
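The (y1 - y0)/(x1 - x0) loop mentioned above can be vectorized with `diff()`. A minimal sketch, assuming the fitted curve is available as a data frame of predicted points ordered by x (the data frame and its column names here are made up for illustration):

```r
# Hypothetical fitted curve: one row per predicted point, ordered by x
fit <- data.frame(x = seq(0, 10, by = 0.1))
fit$y <- sin(fit$x)

# Slope between each pair of consecutive points: (y1 - y0) / (x1 - x0)
slopes <- diff(fit$y) / diff(fit$x)

# Min and max slope along the curve, e.g. for pruning steep fits
range(slopes)
```

Running this per curve and thresholding on `range(slopes)` would flag the "steep cliff" fits without any explicit loop.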

@CarlWitthoft Do I need to pass more options? I did what you suggested with my_loess <- loess(x~y, data=rbf), then numericDeriv(my_loess$y), and I get an error: Error in length(theta) : 'theta' is missing. Sorry for making you walk me through everything!
– Hendy Aug 30 '12 at 22:27
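For reference, `numericDeriv()` differentiates an expression with respect to named parameters (its `theta` argument), not a fitted model object, which is why the 'theta is missing' error appears. A simpler route is to predict the loess fit on a fine grid and take finite differences; a sketch assuming a data frame `rbf` with columns `x` and `y` (the data here is simulated for illustration):

```r
set.seed(1)
# Simulated stand-in for the question's 'rbf' data frame
rbf <- data.frame(x = 1:50)
rbf$y <- sin(rbf$x / 8) + rnorm(50, sd = 0.05)

# Response goes on the left of the formula: y ~ x, not x ~ y
my_loess <- loess(y ~ x, data = rbf)

# Predict on a fine grid, then approximate the derivative numerically
grid <- seq(min(rbf$x), max(rbf$x), length.out = 200)
pred <- predict(my_loess, newdata = data.frame(x = grid))
slopes <- diff(pred) / diff(grid)
range(slopes)
```

This gives a usable min/max slope per curve, though as the answer below notes, finite differences of a loess fit are not smooth derivatives.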

I am writing a neural network myself for ultrafast trading. In the beginning I was using loess or lowess to fit time series, but what I wanted was smooth derivatives, which loess doesn't supply. Even if you implement loess yourself and use the per-point orthogonal polynomials to compute a derivative, you get odd results; there is a reason for this.

A solution to your problem can be found in a paper by Graciela Boente: Robust estimators of high order derivatives of regression functions. The formula is on page 3. The paper is freely available on the internet. Once you have acquired both values and derivatives, you can use them to uniquely define cubic splines, which will give continuous derivatives.
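To illustrate the last step: given values and derivatives at a set of knots, a cubic Hermite spline matches both, so the first derivative is continuous across knots. Base R's `stats::splinefunH()` builds exactly this. A sketch using known true derivatives in place of the Boente-style estimates, which is where those estimates would be plugged in:

```r
# Knots with values y and derivatives d; here the true derivatives of
# sin() stand in for estimated ones
x <- seq(0, 2 * pi, length.out = 10)
y <- sin(x)
d <- cos(x)

# Cubic Hermite spline interpolating both values and derivatives
f <- splinefunH(x, y, d)

f(pi / 2)             # interpolated value, close to sin(pi/2) = 1
f(pi / 2, deriv = 1)  # interpolated derivative, close to cos(pi/2) = 0
```

The resulting function `f` can then be differentiated anywhere via `deriv = 1`, giving the smooth slopes that loess alone does not.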