Finding the values of k for which a matrix has no solutions

Problem:
Find the values of k for which the system of equations
has no solutions.

I instinctively want to solve a question like this by adding and subtracting rows to put the system into row echelon form, but the terms involving k make that awfully difficult. I've tried changing the order of the equations, but that hasn't helped either. It seems like such an easy question, but I'm still stumped.

That will only help me determine whether there is a unique solution, though, won't it? If the determinant is 0, there can be either infinitely many solutions or no solutions, and I can't distinguish between these two cases.
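The two cases can in fact be distinguished with the rank criterion (Rouché–Capelli): the system Ax = b has no solution exactly when rank(A) < rank([A | b]), and infinitely many when the ranks agree but are below the number of unknowns. A minimal sketch with numpy; since the actual system isn't reproduced above, the matrix below is a made-up placeholder:

```python
import numpy as np

def classify(A, b):
    """Classify A x = b as 'unique', 'infinite', or 'none' via the rank test."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    rank_A = np.linalg.matrix_rank(A)
    rank_aug = np.linalg.matrix_rank(np.hstack([A, b]))
    if rank_A < rank_aug:
        return "none"      # b lies outside the column space: inconsistent
    if rank_A == A.shape[1]:
        return "unique"    # full column rank: exactly one solution
    return "infinite"      # consistent but rank-deficient

# Placeholder matrix (NOT the one from the question): row 2 = 2 * row 1,
# so rank(A) = 2, while b breaks that dependency in the augmented matrix.
A = [[1, 2, 3],
     [2, 4, 6],
     [0, 1, 1]]
b = [1, 3, 1]
print(classify(A, b))   # -> none
```

Changing b to [1, 2, 1] preserves the row dependency in the augmented matrix, and the same call returns "infinite" instead, which is exactly the distinction being asked about.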

Note that there is no value of k that will make the columns of your matrix linearly dependent. To see this, first solve the homogeneous system (right-hand side equal to the zero vector). The first equation you have implies that:

But the second equation forces

Using the third equation,

which has no solution except x = z = 0, which then forces y = 0 in the homogeneous equation. Hence, your columns are linearly independent independent of your choice of k (bad pun there). You can safely take the determinant and solve.


Take the determinant. You will get a quadratic equation in k, which factors over the integers, giving two rational values of k. Substitute each of these into your matrix and row-reduce. You will find that both values of k lead to an inconsistent system: (1,1,1) is not in the column space of either matrix.
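The procedure in this answer can be sketched end to end: recover the quadratic det(A(k)) by evaluating the determinant at three sample points, solve it, then run the rank test on the augmented matrix at each root. The 3×3 matrix below is a hypothetical stand-in (the original system is not reproduced in the post), chosen only so that its determinant is quadratic in k:

```python
import numpy as np

def det3(k, build):
    """Evaluate det(A(k)) numerically for the 3x3 matrix returned by build(k)."""
    return float(np.linalg.det(np.array(build(k), dtype=float)))

def quadratic_coeffs(f):
    """Recover a, b, c of f(k) = a*k**2 + b*k + c from f(-1), f(0), f(1)."""
    c = f(0.0)
    a = (f(1.0) + f(-1.0)) / 2 - c
    b = (f(1.0) - f(-1.0)) / 2
    return a, b, c

def singular_ks(build):
    """Roots of det(A(k)) = 0, assuming the determinant is quadratic in k
    with integer coefficients (rounding removes floating-point noise)."""
    a, b, c = (round(v) for v in quadratic_coeffs(lambda k: det3(k, build)))
    disc = b * b - 4 * a * c
    return sorted((b - s * disc ** 0.5) / (-2 * a) for s in (1.0, -1.0))

def consistent(build, k, rhs):
    """Rank test: A x = rhs is solvable iff rank(A) == rank([A | rhs])."""
    M = np.array(build(k), dtype=float)
    aug = np.hstack([M, np.array(rhs, dtype=float).reshape(-1, 1)])
    return bool(np.linalg.matrix_rank(M) == np.linalg.matrix_rank(aug))

# Placeholder for the system in the question; entries are invented.
def A(k):
    return [[1.0, 1.0, k],
            [1.0, k, 1.0],
            [2.0, 1.0, 1.0]]

for k in singular_ks(A):            # det(A(k)) = -2k^2 + 2k, so k = 0, 1
    print(k, consistent(A, k, [1, 1, 1]))
```

For this placeholder matrix, k = 0 gives an inconsistent system while k = 1 gives infinitely many solutions; in the question's actual matrix both roots reportedly come out inconsistent, which is why row-reducing (or rank-testing) at each root is the deciding step rather than the determinant alone.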