1. The problem statement, all variables and given/known data
Consider the simple linear regression model Y = A + Bx + e, where A is the intercept (a known constant), B is the slope parameter (unknown), and e is a random error term satisfying the normality assumption.
If (x1,Y1), ..., (xn,Yn) are the n observed data points, find the least squares estimator of B.
Further, if the random error terms e satisfy the normality assumption with nuisance parameter σ, identify the distribution of the least squares estimator of B.

2. Relevant equations
If we wish to fit the line that minimizes the sum of squared vertical distances from our n data points, we fit Y = Â + B̂x, where B̂ = (Ʃ(xi − x̄)Yi)/(Ʃ(xi − x̄)^2). This is the same as the maximum likelihood estimator of B under the normality assumption.
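To make the formula concrete, here is a minimal sketch (not from the problem itself) that computes B̂ = Ʃ(xi − x̄)Yi / Ʃ(xi − x̄)^2 directly; the function name `slope_estimate` and the sample data are my own illustration:

```python
# Sketch: least-squares slope estimate for Y = A + B*x + e,
# computed as sum((x_i - x_bar) * Y_i) / sum((x_i - x_bar)**2).

def slope_estimate(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    sxx = sum((x - x_bar) ** 2 for x in xs)          # denominator
    sxy = sum((x - x_bar) * y for x, y in zip(xs, ys))  # numerator
    return sxy / sxx

# Sanity check on noise-free data from the line y = 1 + 2x:
# the estimator should recover the slope exactly.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.0 + 2.0 * x for x in xs]
print(slope_estimate(xs, ys))  # → 2.0
```

Note that subtracting x̄ only in the x-factor is enough, since Ʃ(xi − x̄) = 0 makes the Ȳ term drop out.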

Further, under the normality assumption, B̂ is normally distributed with mean B and variance σ^2/(Ʃ(xi − x̄)^2).
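That sampling distribution can be checked by simulation. A minimal sketch (my own, with made-up values A = 1, B = 2, σ = 0.5 and design points x = 1..10): repeatedly generate data from the model and compare the empirical mean and variance of B̂ against B and σ²/Ʃ(xi − x̄)².

```python
import random

# Simulation check: under Y_i = A + B*x_i + e_i, e_i ~ N(0, sigma^2),
# B_hat should be N(B, sigma^2 / sum((x_i - x_bar)^2)).

def slope_estimate(xs, ys):
    x_bar = sum(xs) / len(xs)
    sxx = sum((x - x_bar) ** 2 for x in xs)
    return sum((x - x_bar) * y for x, y in zip(xs, ys)) / sxx

random.seed(0)
A, B, sigma = 1.0, 2.0, 0.5            # illustrative true parameters
xs = [float(i) for i in range(1, 11)]  # fixed design points
x_bar = sum(xs) / len(xs)
sxx = sum((x - x_bar) ** 2 for x in xs)

estimates = []
for _ in range(20000):
    ys = [A + B * x + random.gauss(0.0, sigma) for x in xs]
    estimates.append(slope_estimate(xs, ys))

mean_hat = sum(estimates) / len(estimates)
var_hat = sum((b - mean_hat) ** 2 for b in estimates) / len(estimates)
print(mean_hat)              # should be close to B = 2.0
print(var_hat, sigma**2 / sxx)  # empirical vs. theoretical variance
```

With these x-values, Ʃ(xi − x̄)² = 82.5, so the theoretical variance is 0.25/82.5 ≈ 0.00303, and the simulated values land close to that.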

3. The attempt at a solution
I fear that I am confused by this problem! From what I can tell, the relevant equations posted above provide the appropriate answers, but I already proved those equations in a previous homework - so either the problem is just trivial or I'm missing something obvious. If it's the latter, I would greatly appreciate someone letting me know (and perhaps pointing me in the right direction).