A system of equations with more unknowns than equations

It is known that a system of linear equations is considered overdetermined if there are more equations than unknowns...

My question, simply stated: is there a term to describe a system of linear equations with more unknowns than equations?
Google seems to struggle when my search query is "a system of equations with more unknowns than equations"!

I've recently encountered such a system and was curious to research efficient algorithms for solving them, since my current method of solving them is a bit tedious.

With x + y + z = 2 and x + y - z = 4, I can add the equations to eliminate z and get 2x + 2y = 6, so that x + y = 3. I can choose any value for, say, x, then solve for y, then z. For example, if x = 1, then y = 3 - 1 = 2, so the first equation becomes 1 + 2 + z = 3 + z = 2 and z = -1. So x = 1, y = 2, z = -1 is a solution. But I could just as well take x = 2 and get y = 3 - 2 = 1; then 3 - z = 4 gives z = -1 again. In fact, (x, 3 - x, -1) is a solution for any real number x.
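That one-parameter family is easy to sanity-check by machine. A minimal sketch in Python (the helper name `check_solution` is just made up for illustration):

```python
def check_solution(x, y, z):
    """Return True if (x, y, z) satisfies both original equations."""
    return (x + y + z == 2) and (x + y - z == 4)

# The one-parameter family derived above: pick any x, then y = 3 - x, z = -1.
for x in [1, 2, -5, 0.5]:
    assert check_solution(x, 3 - x, -1)
```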

With x + y + z = 2 and x + y + z = 1, subtracting the second equation from the first gives 0 = 1, which is impossible. In fact the two left sides are identical: if x + y + z = 2, then it can't possibly equal 1, whatever x, y, and z are!

Thank you very much for the terminology clarification; that's what I thought the terms meant. It seems different sources describe the same thing in different ways at times.

I appreciate the help. I would like to post my method here once I establish something that works for these 'underdetermined' systems (not a proof, mind you!) just to make sure I'm not reinventing the wheel or taking unnecessary steps.

thanks again for the help with all my questions

What you're attempting has probably already been worked out for m×n matrices where m < n.

@hovette
Thanks! I had not heard that term before, although least squares has come up and I have looked into that algorithm before. I haven't read through it yet, but I am guessing they must be similar.

One method I was shown for n × n matrices is to take the coefficient matrix and the constant matrix, then find the determinant of the coefficient matrix and call it 'k'.

Next, to get the value of x, replace the 1st column of the coefficient matrix with the constant matrix and find the determinant; call it 'c'. The value of x is then 'c/k', and so on and so forth for y and z.

My question then is: is this method applicable to an n × m matrix? I have not taken linear algebra yet, but I do know that inverses do not exist for rectangular matrices (is this why pseudoinverses are used?). But do determinants?

Thanks for the input...this is highly engaging and I enjoy coming across new tools

The method you describe, which is known as Cramer's rule, follows from the adjugate formula for the inverse of a matrix; that is, A^(-1) = adj(A)/det(A). (You can prove Cramer's rule from this by expanding the top determinant along the column you replaced.) However, adjugates and determinants only exist for square matrices.
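For what it's worth, Cramer's rule is short to code in the 3×3 case. A rough sketch in Python (the helper names `det3` and `cramer3` are made up; to make the earlier 2-equation example square, I've assumed an extra third equation x - y = 0):

```python
def det3(m):
    """Determinant of a 3x3 matrix (list of rows) by cofactor expansion."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cramer3(A, b):
    """Solve a 3x3 system Ax = b by Cramer's rule; needs det(A) != 0."""
    k = det3(A)
    if k == 0:
        raise ValueError("Cramer's rule needs a nonzero determinant")
    sol = []
    for col in range(3):
        # Replace column `col` of A with b, take the determinant, divide by k.
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]
        sol.append(det3(Ac) / k)
    return sol

# x + y + z = 2, x + y - z = 4, plus the assumed extra equation x - y = 0:
print(cramer3([[1, 1, 1], [1, 1, -1], [1, -1, 0]], [2, 4, 0]))
# -> [1.5, 1.5, -1.0]
```

Note the answer lands in the solution family (x, 3 - x, -1) found earlier, as it must.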

I don't know of any similar method for non-square matrices that is equivalent to using the pseudoinverse. In particular, the pseudoinverse depends on the inner product (since, after all, for an equation Ax = b the pseudoinverse gives the vector x = A+b which minimizes the norm of x or Ax - b, depending on whether the system is underdetermined or overdetermined, but x depends on the norm), so any such method must use the inner product somewhere.
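To make the underdetermined case concrete: when A has full row rank, the pseudoinverse solution is x = A^T (A A^T)^(-1) b, the minimum-norm member of the solution family. A hand-rolled sketch in Python for the 2×3 example from earlier in the thread (the helper name `min_norm_solve_2x3` is made up, and the 2×2 inverse uses the explicit adjugate formula):

```python
def min_norm_solve_2x3(A, b):
    """Return the minimum-norm x with Ax = b, for a 2x3 full-row-rank A."""
    # Gram matrix G = A A^T (2x2).
    G = [[sum(A[i][k] * A[j][k] for k in range(3)) for j in range(2)]
         for i in range(2)]
    det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
    # Solve G y = b via the explicit 2x2 inverse.
    y = [(G[1][1] * b[0] - G[0][1] * b[1]) / det,
         (G[0][0] * b[1] - G[1][0] * b[0]) / det]
    # x = A^T y.
    return [A[0][k] * y[0] + A[1][k] * y[1] for k in range(3)]

A = [[1, 1, 1], [1, 1, -1]]   # x + y + z = 2, x + y - z = 4
b = [2, 4]
print(min_norm_solve_2x3(A, b))   # -> [1.5, 1.5, -1.0]
```

The result sits in the family (x, 3 - x, -1) at exactly the x that minimizes the norm, which is what the pseudoinverse picks out.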

Here are presented two equations with 3 unknowns (x, y, z). Is this system solvable using a matrix method? I ask because none of the equations has all 3 variables in it, only two at a time. Is it a necessary condition that all three variables appear in each linear equation for the system to be solvable with a matrix algorithm?
In other words, does this constitute, by definition, a linear system of equations that is solvable using a known matrix method, as we have been discussing above?
All examples I've come across seem to have every variable present in each linear equation of the system.

Here are presented two equations with 3 unknowns (x, y, z). Is this system solvable using a matrix method?

In this particular case, if we use the Ax = b method, where we invert A (say, to A^(-1)) and premultiply b by it, then A must have an inverse. But A here is not a square matrix, and only square matrices have inverses, so that matrix approach will not prove fruitful.
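Row reduction, on the other hand, works fine on rectangular matrices: a variable missing from an equation is just a zero coefficient in that row. A rough sketch in Python (the helper name `rref` is made up, and the two equations x + y = 2 and y - z = 1 are stand-ins, since the actual system wasn't posted):

```python
def rref(M):
    """Reduce an augmented matrix (list of rows) to reduced row-echelon form."""
    rows, cols = len(M), len(M[0])
    M = [row[:] for row in M]
    pivot_row = 0
    for col in range(cols - 1):          # last column holds the constants
        # Find a row at or below pivot_row with a nonzero entry here.
        pr = next((r for r in range(pivot_row, rows) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        p = M[pivot_row][col]
        M[pivot_row] = [v / p for v in M[pivot_row]]   # scale pivot to 1
        for r in range(rows):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

# x + y = 2 and y - z = 1, written with explicit zero coefficients:
aug = [[1, 1, 0, 2],
       [0, 1, -1, 1]]
print(rref(aug))   # -> [[1.0, 0.0, 1.0, 1.0], [0.0, 1.0, -1.0, 1.0]]
```

The reduced form reads x = 1 - z and y = 1 + z: a one-parameter family of solutions, just as in the earlier example.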

On the other hand, for all my spoofery, I am trying to prove that the only solution to α1 - 2α2 + 3α3 = 0 and 2α2 - α3 = 0 is α1 = α2 = α3 = 0. Can you offer any help?