Comments - Use PRESS, not R squared to judge predictive power of regression - AnalyticBridge
<p><a href="https://www.analyticbridge.datasciencecentral.com/profile/SeanFlanigan">Sean Flanigan</a> (2013-05-14):</p>
<p>This is an amazing post. Thanks so much. R-Squared discussions tend to launch many bar fights.</p>
<p><a href="https://www.analyticbridge.datasciencecentral.com/profile/MirkoKrivanek">Mirko Krivanek</a> (2013-05-13):</p>
<p>The ability to predict future performance, rather than to measure goodness of fit on existing data, is a great advantage. This can be achieved with cross-validation, which your method performs in a sense through the leave-one-out procedure. It would be nice to see a metric that simultaneously addresses:</p>
<ul>
<li>robustness (R-squared and PRESS fail)</li>
<li>insensitivity to the number of observations (R-squared fails; not sure about PRESS)</li>
<li>predictive power (R-squared fails; PRESS wins)</li>
</ul>
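<p>For concreteness, here is a minimal sketch of the PRESS statistic the post is about, computed for ordinary least squares. It relies on the standard hat-matrix identity (the leave-one-out residual equals the ordinary residual divided by one minus the leverage), so no refitting is needed; the function name and the synthetic data are purely illustrative.</p>

```python
import numpy as np

def press_statistic(X, y):
    """Leave-one-out PRESS for ordinary least squares.

    Uses the hat-matrix shortcut: LOO residual = residual / (1 - leverage),
    so the model is fit only once. X should include an intercept column.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    # Leverages: diagonal of the hat matrix H = X (X'X)^{-1} X'
    XtX_inv = np.linalg.inv(X.T @ X)
    leverages = np.einsum("ij,jk,ik->i", X, XtX_inv, X)
    loo_residuals = residuals / (1.0 - leverages)
    return float(np.sum(loo_residuals ** 2))

# Illustrative comparison on synthetic data: a "predicted R^2" built from
# PRESS plays the role that R^2 plays for in-sample fit.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 * x + rng.normal(size=50)
X = np.column_stack([np.ones_like(x), x])
press = press_statistic(X, y)
ss_tot = np.sum((y - y.mean()) ** 2)
predicted_r2 = 1.0 - press / ss_tot
```

<p>Unlike R-squared, this quantity can go negative when the model predicts held-out points worse than the mean does, which is exactly the out-of-sample failure R-squared hides.</p>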
<p><a href="https://www.analyticbridge.datasciencecentral.com/profile/VincentGranville">Vincent Granville</a> (2013-05-13):</p>
<p>One of our readers wrote:</p>
<p><em>Vincent, to normalize R-squared, use the Fisher transform and then apply a significance test to the results. It takes care of the data variability and the data size. Outliers are a problem, but they will mess up the quality of the least-squares model anyway, regardless of the criteria by which you judge the quality of your model. If you don't want to worry about them, use quantile regression.</em></p>
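<p>A minimal sketch of the Fisher-transform idea the reader describes, for the simple-regression case where r = sqrt(R^2). This hypothetical helper is not from the post, and it uses the usual normal approximation for the Fisher z statistic (standard error 1/sqrt(n - 3)) rather than a t-test:</p>

```python
import math

def fisher_z_test(r_squared, n, r2_null=0.0):
    """Compare an observed R^2 against a reference value on the
    Fisher z scale, which accounts for sample size.

    Assumes simple regression (one predictor), so r = sqrt(R^2);
    atanh(r) is approximately normal with std. error 1/sqrt(n - 3).
    """
    z = (math.atanh(math.sqrt(r_squared))
         - math.atanh(math.sqrt(r2_null))) * math.sqrt(n - 3)
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p

# The same R^2 = 0.65 carries very different evidence
# at n = 20 than at n = 10,000:
print(fisher_z_test(0.65, 20))
print(fisher_z_test(0.65, 10000))
```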
<p><a href="https://www.analyticbridge.datasciencecentral.com/profile/VincentGranville">Vincent Granville</a> (2013-05-12):</p>
<p><span>Great reading for statisticians and data scientists. R^2 has many flaws: it is sensitive to outliers and to sample size. An R^2 of 0.65 does not mean the same thing for a dataset with 20 observations as for a dataset with 10,000 observations. How do you normalize this?</span></p>
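<p>The size-sensitivity point is easy to demonstrate by simulation. Under purely illustrative settings (independent noise, so the true R^2 is zero), small samples still produce sizeable R^2 values by chance, while large samples do not:</p>

```python
import numpy as np

rng = np.random.default_rng(42)

def null_r2(n, trials=2000):
    """Mean R^2 of simple OLS fit to independent noise, sample size n."""
    vals = []
    for _ in range(trials):
        x = rng.normal(size=n)
        y = rng.normal(size=n)      # y is unrelated to x
        r = np.corrcoef(x, y)[0, 1]
        vals.append(r ** 2)         # simple regression: R^2 = r^2
    return float(np.mean(vals))

print(null_r2(20))     # sizeable by chance, roughly 1/(n-1)
print(null_r2(10000))  # near zero
```

<p>This is one way to see why a raw R^2 is not comparable across sample sizes: under the null it concentrates around 1/(n-1), so the same observed value is far stronger evidence at n = 10,000 than at n = 20.</p>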