I don't even know if this question makes sense, but what is the difference between multiple regression and partial correlation (apart from the obvious differences between correlation and regression, ...
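One concrete link between the two: the partial correlation of $y$ and $x_1$ controlling for $x_2$ is just the ordinary correlation of the residuals from regressing each on $x_2$, and by the Frisch-Waugh-Lovell theorem the multiple-regression slope on $x_1$ is the regression of those same residuals on each other. A numpy sketch with simulated data (the variable names and the data-generating model are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)          # x1 correlated with x2
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

def resid(a, b):
    """Residuals from OLS of a on b (with intercept)."""
    B = np.column_stack([np.ones(len(b)), b])
    coef, *_ = np.linalg.lstsq(B, a, rcond=None)
    return a - B @ coef

ry, rx = resid(y, x2), resid(x1, x2)

# Partial correlation of y and x1, controlling for x2:
r_partial = np.corrcoef(ry, rx)[0, 1]

# Slope on x1 in the multiple regression of y on (x1, x2):
X = np.column_stack([np.ones(n), x1, x2])
slope = np.linalg.lstsq(X, y, rcond=None)[0][1]

# Frisch-Waugh-Lovell: that slope equals the regression of ry on rx,
# so the two quantities share a numerator and differ only in scaling.
slope_fwl = ry @ rx / (rx @ rx)
```

So the partial correlation and the multiple-regression coefficient always agree in sign; they answer "how strongly" versus "by how much".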

I've been looking into the boot package in R and while I have found a number of good primers on how to use it, I have yet to find anything that describes exactly what is happening "behind the scenes". ...
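For the ordinary nonparametric case, the core of what `boot()` does behind the scenes is simple: draw $n$ row indices with replacement, recompute the statistic, repeat $R$ times, then summarize the replicates. A minimal Python sketch of that loop (the function and data here are my own illustration, not boot's actual internals):

```python
import numpy as np

def bootstrap(data, statistic, R=2000, seed=42):
    """Core of a nonparametric bootstrap: resample rows with
    replacement and recompute the statistic, R times."""
    rng = np.random.default_rng(seed)
    n = len(data)
    t0 = statistic(data)                                  # original-sample value
    reps = np.array([statistic(data[rng.integers(0, n, size=n)])
                     for _ in range(R)])
    return t0, reps

data = np.random.default_rng(1).exponential(size=100)
t0, reps = bootstrap(data, np.mean)
se = reps.std(ddof=1)                       # bootstrap standard error
ci = np.percentile(reps, [2.5, 97.5])       # simple percentile interval
```

Everything else boot offers (bias estimates, BCa intervals, stratified or parametric resampling) is layered on top of this resample-and-recompute loop.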

When we do multiple regression and say we are looking at the average change in the $y$ variable for a change in an $x$ variable, holding all other variables constant, what values are we holding the ...
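Because the fitted surface is linear, the answer does not depend on which values the other variables are pinned at: a one-unit step in $x_1$ shifts the prediction by its coefficient at every setting of $x_2$. A quick numpy check on simulated data (the model here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1, x2 = rng.normal(size=(2, n))
y = 3.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]

def predict(v1, v2):
    return b[0] + b[1] * v1 + b[2] * v2

# Raise x1 by one unit while x2 is pinned at ANY value: the predicted
# change is b[1] either way, so no particular value is being "held".
d_at_0 = predict(1.0, 0.0) - predict(0.0, 0.0)
d_at_5 = predict(1.0, 5.0) - predict(0.0, 5.0)
```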

I am wondering what the exact relationship between partial $R^2$ and coefficients in a linear model is and whether I should use only one or both to illustrate the importance and influence of factors.
...
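One common definition worth having in front of you: the partial $R^2$ for $x_1$ is the proportional drop in the residual sum of squares when $x_1$ is added to a model that already contains the other predictors, so it measures incremental explanatory share on a 0-to-1 scale, while the coefficient measures effect size in the units of the data. A numpy illustration with simulated data:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
x1, x2 = rng.normal(size=(2, n))
y = 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

def sse(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ b
    return r @ r

full = np.column_stack([np.ones(n), x1, x2])
reduced = np.column_stack([np.ones(n), x2])          # model without x1

# Partial R^2 for x1: the share of the reduced model's residual
# variation that adding x1 explains.
partial_r2 = (sse(reduced, y) - sse(full, y)) / sse(reduced, y)
```

The two answer different questions (importance for fit vs. magnitude of effect), which is why reporting both is often reasonable.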

Not sure if normalize is the correct word to use here, but I will try my best to illustrate what I am trying to ask. The estimator used here is least squares.
Suppose you have $y=\beta_0+\beta_1x_1$, ...
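If "normalize" here means z-scoring the variables before fitting, the slope transforms in a fully predictable way: the standardized slope equals the raw slope times $s_x/s_y$, and in simple regression it is exactly the correlation coefficient. A quick numpy check on simulated data:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(loc=10, scale=3, size=400)
y = 1.0 + 0.8 * x + rng.normal(size=400)

def slope(u, v):
    U = np.column_stack([np.ones(len(u)), u])
    return np.linalg.lstsq(U, v, rcond=None)[0][1]

z = lambda v: (v - v.mean()) / v.std(ddof=1)

b1 = slope(x, y)            # slope on the original scale
b1_std = slope(z(x), z(y))  # slope after z-scoring both variables
# b1_std == b1 * sd(x)/sd(y), and equals corr(x, y) in simple regression
```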

I understand the concept that $\hat\beta_0$ is the mean for when the categorical variable is equal to 0 (or is the reference group), giving the end interpretation that the regression coefficient is ...
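The interpretation can be verified directly: with a single 0/1 dummy, the OLS intercept reproduces the reference-group mean exactly and the slope reproduces the difference in group means. A numpy sketch with simulated groups:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
g = rng.integers(0, 2, size=n)                   # 0 = reference group
y = np.where(g == 0, 5.0, 8.0) + rng.normal(size=n)

X = np.column_stack([np.ones(n), g])             # dummy coding
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

# b0 is exactly the reference-group mean; b1 is exactly the
# difference between the two group means.
m0, m1 = y[g == 0].mean(), y[g == 1].mean()
```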

Suppose I wish to regress $Y$ against a normalized $X$, but I would like a sparse solution. After regression, why is discarding the coefficients with smallest magnitude not allowed?
For the record, ...
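The short reason discarding small coefficients fails: with correlated predictors, the remaining coefficients are no longer the least-squares solution once a variable is dropped; the refit slope absorbs the dropped variable's share (the in-sample omitted-variable identity). A numpy demonstration on simulated data where two predictors nearly duplicate each other:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 300
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)      # x2 nearly duplicates x1
y = x1 + x2 + rng.normal(size=n)

def fit(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

b_full = fit(np.column_stack([x1, x2]), y)    # [b0, b1, b2]
b_short = fit(x1, y)                          # drop x2 and refit

# Zeroing b2 while keeping b1 is NOT the least-squares fit on x1 alone:
# the refit slope roughly doubles here, absorbing x2's contribution.
delta = fit(x1, x2)[1]                        # in-sample slope of x2 on x1
# Omitted-variable identity: b_short[1] == b_full[1] + b_full[2] * delta
```

Methods like the lasso avoid this by making the sparsity constraint part of the optimization rather than a post-hoc truncation.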

When running ridge regression, how do you interpret coefficients that end up larger than their corresponding coefficients under least squares (for certain values of $\lambda$)? Isn't ridge regression ...
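Ridge shrinks the coefficient vector as a whole (in Euclidean norm), but individual coefficients can grow: with correlated predictors, the penalty redistributes weight from one predictor to its near-duplicate. A small worked example at the level of the Gram matrix (the numbers are made up to make the effect exact):

```python
import numpy as np

# Two standardized predictors with correlation 0.9, and an OLS
# solution that happens to put all the weight on x1.
G = np.array([[1.0, 0.9],
              [0.9, 1.0]])          # X'X / n
b_ols = np.array([2.0, 0.0])        # hypothetical OLS coefficients
c = G @ b_ols                       # implied X'y / n

lam = 0.1
b_ridge = np.linalg.solve(G + lam * np.eye(2), c)
# b_ridge = [1.45, 0.45]: the x2 coefficient GREW from 0 to 0.45,
# even though the coefficient vector shrank in overall norm.
```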

So, I want to fit a random-effects negative binomial model. For such a model Stata can produce exponentiated coefficients. According to the help file, such coefficients can be interpreted as incidence-...
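Without fitting the Stata model itself, the arithmetic behind the incidence-rate-ratio reading can be illustrated directly: under a log link, $e^{b_1}$ is the multiplicative change in the expected count for a one-unit increase in $x$, whatever the baseline (the coefficient values below are made up):

```python
import numpy as np

b0, b1 = 0.5, 0.3                       # made-up coefficients on the log scale
rate = lambda x: np.exp(b0 + b1 * x)    # expected count under a log link

# exp(b1) is the incidence-rate ratio: the same multiplicative change
# per unit of x at any starting point.
irr = np.exp(b1)
ratio_low = rate(1.0) / rate(0.0)
ratio_high = rate(10.0) / rate(9.0)
```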

There's a regression model where $Y = a + bX$ with $a = 1.6$ and $b = 0.4$, which has a correlation coefficient of $r = 0.60302$.
If $X$ and $Y$ are then switched around and the equation becomes $X = c + ...
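The key identity: the two slopes are $b = r\,s_y/s_x$ and $c = r\,s_x/s_y$, so $bc = r^2$ exactly; the reversed slope is not $1/b$ unless $|r| = 1$. A numpy check on simulated data (the simulated values are mine, chosen near the question's numbers):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=250)
y = 1.6 + 0.4 * x + rng.normal(scale=0.5, size=250)

def slope(u, v):
    """OLS slope of v regressed on u (with intercept)."""
    U = np.column_stack([np.ones(len(u)), u])
    return np.linalg.lstsq(U, v, rcond=None)[0][1]

b = slope(x, y)                 # slope in Y = a + bX
c = slope(y, x)                 # slope after switching the variables
r = np.corrcoef(x, y)[0, 1]
# b * c == r^2 exactly, in any sample
```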

My goal is to use the coefficients derived by previous research on the subject to predict actual outcomes given a set of independent variables. However, the research paper lists the Beta coefficients ...
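If the published betas are standardized, they can be converted back to the raw scale only with the means and SDs of the authors' variables: $b_1 = \beta\, s_y/s_x$ and $b_0 = \bar y - b_1 \bar x$. A numpy sketch of the round trip on simulated data (I fit once myself just to manufacture a "published" beta):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 300
x = rng.normal(5, 2, size=n)
y = 3.0 + 1.2 * x + rng.normal(size=n)

z = lambda v: (v - v.mean()) / v.std(ddof=1)

# Pretend the paper reported only this standardized beta:
Z = np.column_stack([np.ones(n), z(x)])
beta_std = np.linalg.lstsq(Z, z(y), rcond=None)[0][1]

# With the sample means and SDs in hand, recover the raw coefficients:
b1 = beta_std * y.std(ddof=1) / x.std(ddof=1)
b0 = y.mean() - b1 * x.mean()

# Sanity check against fitting the raw-scale model directly:
b_direct = np.linalg.lstsq(np.column_stack([np.ones(n), x]), y, rcond=None)[0]
```

So prediction from standardized betas is possible only if the paper also reports (or you can reconstruct) those means and SDs.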

I'm currently working on building a predictive model for a binary outcome on a dataset with ~300 variables and 800 observations. I've read much on this site about the problems associated with stepwise ...

I have a linear regression model where the dependent variable is logged and an independent variable is linear. The slope coefficient for a key independent variable is negative: $-.0564$. Not sure how ...
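For a log-linear model ($\log y$ on linear $x$), a one-unit increase in $x$ multiplies $y$ by $e^{b_1}$, so the percent change is $100(e^{b_1}-1)$; for the slope above that is about $-5.48\%$. A quick numpy check (the intercept below is an arbitrary stand-in):

```python
import numpy as np

b1 = -0.0564                           # slope from the log-linear model
pct = 100 * (np.exp(b1) - 1)           # exact percent change in y per unit of x

# Check against the model itself: with log(y) = b0 + b1*x, a one-unit
# step in x multiplies y by exp(b1) (b0 = 2.0 is arbitrary here).
y_at = lambda x: np.exp(2.0 + b1 * x)
ratio = y_at(4.0) / y_at(3.0)
```

For small slopes the shortcut "$100 b_1$ percent" ($-5.64\%$ here) is close but not exact.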

I'm evaluating two (2) refrigerants (gases) that were used in the same refrigeration system. I have saturated suction temperature ($S$), condensing temperature ($D$), and amperage ($Y$) data for the ...

After gathering valuable feedback from previous questions and discussions, I have come up with the following question: Suppose that the goal is to detect effect differences across two groups, male vs. ...
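The standard device for detecting effect differences across groups is a pooled model with an interaction term: its coefficient is exactly the between-group slope difference, and the pooled fit with a full set of interactions reproduces separate per-group fits. A numpy sketch on simulated data (group labels and effect sizes are made up):

```python
import numpy as np

rng = np.random.default_rng(10)
n = 400
female = rng.integers(0, 2, size=n)       # hypothetical group indicator
x = rng.normal(size=n)
# true slope on x: 1.0 in one group, 1.6 in the other
y = 0.5 + 1.0 * x + 0.6 * female * x + 0.3 * female + rng.normal(size=n)

# Pooled model with an interaction term; b[3] estimates the slope
# difference between the groups (truth here: 0.6).
X = np.column_stack([np.ones(n), x, female, x * female])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Equivalent check: fit each group separately and difference the slopes.
def grp_slope(mask):
    U = np.column_stack([np.ones(mask.sum()), x[mask]])
    return np.linalg.lstsq(U, y[mask], rcond=None)[0][1]

diff = grp_slope(female == 1) - grp_slope(female == 0)
```

Testing the interaction coefficient against zero is then the test of "same effect in both groups".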

In this paper (Bayesian Inference for Variance Components Using Only Error Contrasts, Harville, 1974), the author claims
$$(y-X\beta)'H^{-1}(y-X\beta)=(y-X\hat\beta)'H^{-1}(y-X\hat\beta)+(\beta-\hat\beta)'X'H^{-1}X(\beta-\hat\beta),$$
where $\hat\beta=(X'H^{-1}X)^{-1}X'H^{-1}y$.
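The claim is the usual complete-the-square decomposition around the GLS estimator $\hat\beta=(X'H^{-1}X)^{-1}X'H^{-1}y$: expanding the left side, the cross term vanishes because $X'H^{-1}(y-X\hat\beta)=0$ (the generalized normal equations). A quick numeric check with arbitrary simulated matrices:

```python
import numpy as np

rng = np.random.default_rng(9)
n, p = 30, 3
X = rng.normal(size=(n, p))
A = rng.normal(size=(n, n))
H = A @ A.T + n * np.eye(n)          # an arbitrary positive-definite H
Hinv = np.linalg.inv(H)
y = rng.normal(size=n)
beta = rng.normal(size=p)            # any beta at all

# GLS estimator: beta_hat = (X'H^-1 X)^-1 X'H^-1 y
beta_hat = np.linalg.solve(X.T @ Hinv @ X, X.T @ Hinv @ y)

def q(v, M):
    return v @ M @ v                 # quadratic form v'Mv

lhs = q(y - X @ beta, Hinv)
rhs = q(y - X @ beta_hat, Hinv) + q(beta - beta_hat, X.T @ Hinv @ X)
# lhs == rhs: the cross term vanishes at the GLS solution
```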