Regression analysis provides an equation that describes the relationship between the predictor variables and the response variable. For a linear regression analysis, the following are some of the ways inferences can be drawn from the p-values and coefficients in the output.

When interpreting p-values in linear regression, the p-value for each term tests the null hypothesis that the term's coefficient is zero. A low p-value (less than .05) allows you to reject that null hypothesis. In other words, a predictor with a low p-value is likely a meaningful addition to the model, because changes in its value are associated with changes in the response variable.
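As a minimal sketch of this test, the following stdlib-only Python fits a simple regression by ordinary least squares on made-up illustrative data and tests the null hypothesis that the slope coefficient is zero (for brevity it uses a normal approximation in place of the exact t-distribution, which is reasonable only as a rough large-sample illustration):

```python
import math

# Hypothetical data: predictor x and response y (made-up values for illustration).
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [2.1, 4.3, 5.9, 8.2, 9.8, 12.1, 14.2, 15.9]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

slope = sxy / sxx                    # estimated coefficient b1
intercept = my - slope * mx
residuals = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
s2 = sum(r ** 2 for r in residuals) / (n - 2)   # residual variance
se_slope = math.sqrt(s2 / sxx)                  # standard error of b1

# The t-statistic tests H0: b1 = 0. A normal approximation to the
# t-distribution gives a rough two-sided p-value.
t = slope / se_slope
p_approx = math.erfc(abs(t) / math.sqrt(2))

print(f"slope={slope:.3f}, t={t:.2f}, p~={p_approx:.2g}")
print("reject H0 (b1 = 0)" if p_approx < 0.05 else "fail to reject H0")
```

With data this close to a straight line the t-statistic is large and the p-value is far below .05, so the null hypothesis of a zero slope is rejected; in practice a statistics package (e.g. SPSS) reports these quantities directly in its coefficients table.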

The significance of regression coefficients for curvilinear relationships and interaction terms must also be interpreted carefully to draw sound inferences in regression analysis with SPSS Statistics.

In the sample model above, height has a linear effect: its slope is constant. If your model requires polynomial or interaction terms, however, interpretation is no longer as intuitive. In general, polynomial terms model curvature, while interaction terms capture how the effect of one predictor depends on another.
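This difference can be made concrete with a short sketch. Assuming hypothetical fitted coefficients (illustrative values, not estimates from real data), the marginal effect of a predictor in a quadratic model is the derivative of the fitted equation, and in an interaction model it shifts with the other predictor:

```python
# Hypothetical fitted quadratic model: y = b0 + b1*x + b2*x**2
# (illustrative coefficient values, not from real data).
b0, b1, b2 = 1.0, 3.0, -0.5

def marginal_effect(x):
    # Derivative dy/dx = b1 + 2*b2*x: the "slope" is not constant;
    # it depends on where you evaluate it.
    return b1 + 2 * b2 * x

print(marginal_effect(1.0))  # 2.0
print(marginal_effect(4.0))  # -1.0  (the effect has reversed sign)

# Hypothetical interaction model: y = c0 + c1*x1 + c2*x2 + c3*x1*x2.
# The effect of x1 is c1 + c3*x2, i.e. it shifts with the value of x2.
c1, c3 = 2.0, -0.8

def effect_of_x1(x2):
    return c1 + c3 * x2

print(effect_of_x1(0.0))  # 2.0
print(effect_of_x1(5.0))  # -2.0
```

This is why a single coefficient cannot be read in isolation once polynomial or interaction terms are present: the effect of a predictor must be evaluated at specific values of the predictors involved.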

A significant polynomial term makes interpretation less intuitive because the effect of a change in the predictor depends on the value of that predictor. Likewise, a significant interaction term indicates that the effect of one predictor changes with the value of another predictor. When interpreting such a model, the main effect of the linear term is not enough on its own. Fitted line plots help you see the relationship that the coefficients and p-values describe; for multiple regression, they should be combined with residual plots and a deeper understanding of regression analysis.
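One reason residual plots matter can be shown numerically. In this sketch (using deliberately curved toy data, y = x squared), a straight-line fit leaves a systematic U-shaped pattern in the residuals, which is exactly the kind of structure a residual plot would reveal and a polynomial term would fix:

```python
# Hypothetical curved data: y = x**2, fitted with a straight line by OLS.
x = [float(i) for i in range(1, 9)]
y = [xi ** 2 for xi in x]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
intercept = my - slope * mx
resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]

# A straight-line fit to curved data leaves a systematic U-shape:
# residuals are positive at the ends and negative in the middle.
print([round(r, 1) for r in resid])
# -> [7.0, 1.0, -3.0, -5.0, -5.0, -3.0, 1.0, 7.0]
```

The residuals sum to zero, as OLS guarantees, yet they are clearly not random noise; that pattern is invisible in the coefficient table but obvious in a residual plot.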