For 1., I'm not sure where to start. I know it is an inhomogeneous system, with the left part of the matrix being "A", the unknowns x, y, z being x, and the right part of the matrix being b.
Then if Rank(A) = Rank(A|b) and this is less than the number of unknowns, there are infinitely many solutions; if it equals the number of unknowns, there is exactly one solution. And if Rank(A) ≠ Rank(A|b), there are no solutions.
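That rank criterion (the Rouché–Capelli theorem) can be spot-checked with a short sketch. The function names here (rank, classify) are made up for illustration, and exact Fraction arithmetic avoids float round-off:

```python
from fractions import Fraction

def rank(rows):
    """Rank via exact Gaussian elimination."""
    m = [[Fraction(v) for v in row] for row in rows]
    r = 0  # next pivot row
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def classify(A, b):
    """Solution count of Ax = b from Rank(A) vs Rank(A|b)."""
    rA = rank(A)
    rAb = rank([row + [bi] for row, bi in zip(A, b)])
    if rA != rAb:
        return "no solutions"
    if rA == len(A[0]):  # rank equals the number of unknowns
        return "one solution"
    return "infinitely many solutions"
```

For example, classify([[1, 1], [2, 2]], [1, 3]) reports "no solutions", since Rank(A) = 1 but Rank(A|b) = 2.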

But the problem for me is how to find the unknowns to get to that stage?

For 2a), I tried to go:

Let S be the set of all vectors (x, y, z) with x + y + z = 0, and let u and v be vectors in S.

Since you already have a 1 in the "first column, first row", and a 0 in the "first column, second row", you need to get a 0 in the "first column, third row" and you can do that by adding the first row to the last row.

Continue row-reducing as far as you can. Of course, at some point you may need to divide by something involving a; you can do that, and get a single unique solution, as long as that "something" is not 0. For what value of a is that "something" 0? In that case, you wind up with a third row consisting of all 0s except possibly in the fourth column. What does that mean? What happens if the fourth-column entry is also 0, and for what values of a and b does that happen?
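To make the three cases concrete: the actual matrix isn't shown in the thread, so suppose (purely hypothetically) that elimination leaves a third row of (0, 0, a - 2 | b - 5). A minimal sketch of the case analysis:

```python
def classify_last_row(a, b):
    """Hypothetical third row after elimination: [0, 0, a - 2 | b - 5].
    (These expressions are made up; substitute whatever your own
    row reduction actually produces.)"""
    pivot = a - 2   # the "something involving a" you would divide by
    rhs = b - 5     # the fourth-column (augmented) entry
    if pivot != 0:
        return "one solution"               # safe to divide by the pivot
    if rhs == 0:
        return "infinitely many solutions"  # row reads 0 = 0: z is free
    return "no solutions"                   # row reads 0 = nonzero
```

With these made-up numbers, a = 2 and b = 5 gives infinitely many solutions, a = 2 with b ≠ 5 gives none, and any other a gives exactly one.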

For question 2b, just do what you normally would had there not been any unknowns! After you finish the row operations, set the last row to be a zero row to determine the values of [tex]\lambda[/tex] that give linear dependence. Do you know why this is done?

Once again, just saying that the components are in R is not sufficient. You must show that kx1 + ky1 + kz1 = 0.

Therefore: S is a subspace of R^3

Yes, provided you clean up (i) and (ii).

Working on 2b). For 1, I row reduced, but then I get weird forms of a in the 3rd column.

You have u = (2x, -1, -1), v = (-1, 2x, -1) and w = (-1, -1, 2x) and are asked for what values of x they are linearly dependent. Row reducing will work, and can be simplified by putting (-1, -1, 2x) as the first row, (-1, 2x, -1) as the second row, and (2x, -1, -1) as the third row. Doing that, I get -4x^2 + 2x + 2 as the remaining number in the third row, and these will be linearly dependent if that is 0.
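You can confirm those values with a determinant check instead of row reduction: u, v, w are linearly dependent exactly when the 3x3 determinant of their components vanishes. A quick sketch (the helper names det3 and dep are just for illustration):

```python
from fractions import Fraction

def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def dep(x):
    """Determinant of the matrix with rows u, v, w from the thread."""
    return det3([(2 * x, -1, -1), (-1, 2 * x, -1), (-1, -1, 2 * x)])

# dep(x) expands to 8x^3 - 6x - 2 = 2(x - 1)(2x + 1)^2, so the vectors
# are dependent exactly at x = 1 and x = -1/2.
```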

Btw, where can I read a tutorial on how to use the mathematical text that you guys use?

It doesn't follow that the sum is in S just because the individual components are in R. You need to show that the sum vector also satisfies the definition of S: that (x1 + x2) + (y1 + y2) + (z1 + z2) = 0.
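Both closure conditions can be spot-checked numerically. A minimal sketch, assuming (as the equation above implies) that S = {(x, y, z) in R^3 : x + y + z = 0}; a numeric check is no substitute for the algebraic proof, but it shows exactly which conditions are being verified:

```python
from fractions import Fraction
from random import randint

def in_S(v):
    """Membership test, assuming S = {(x, y, z) : x + y + z = 0}."""
    return sum(v) == 0

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scale(k, v):
    return tuple(k * a for a in v)

# Pick members of S by choosing x, y freely and setting z = -x - y.
for _ in range(100):
    u = (x1 := randint(-9, 9), y1 := randint(-9, 9), -x1 - y1)
    v = (x2 := randint(-9, 9), y2 := randint(-9, 9), -x2 - y2)
    k = Fraction(randint(-9, 9), randint(1, 9))
    assert in_S(add(u, v))    # (x1+x2) + (y1+y2) + (z1+z2) = 0
    assert in_S(scale(k, u))  # kx1 + ky1 + kz1 = 0
```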



Ok, thanks!

HallsofIvy said:

You have u = (2x, -1, -1), v = (-1, 2x, -1) and w = (-1, -1, 2x) and are asked for what values of x they are linearly dependent. Row reducing will work, and can be simplified by putting (-1, -1, 2x) as the first row, (-1, 2x, -1) as the second row, and (2x, -1, -1) as the third row. Doing that, I get -4x^2 + 2x + 2 as the remaining number in the third row, and these will be linearly dependent if that is 0.

Yeah, I worked that out myself, and was just about to post an update. I apologize for not posting my remark clearly; it should have been:

"I'm working on 2b now.
However, for 1., I row reduced, but then I get weird forms of a in the 3rd column."

Anyway, I got [tex]\lambda[/tex] = 1 or -1/2. Furthermore, when [tex]\lambda[/tex] = -1/2, the second row is a zero row as well, from my final matrix (with x replaced by [tex]\lambda[/tex]):