Dual space of a vector space.

Hi all,

This is perhaps something that you don't hear very often (although it would be some comfort to me if you have felt like this before). I've been studying mathematics for around 4 years now. At first, I enjoyed it - the concepts were explained, or at least easy enough to work out what they were getting at - there was enough intuition and motivation for me to see the point in why we are studying what we are. But now I feel completely different. I mainly study functional analysis now, and I find that most textbooks I come across are written "definition, lemma, proof, theorem, proof, end of chapter" - with little or no motivation. I am getting bored; I see no reason to pick up a book any more, other than the times that I have to. I just feel stupid when I tell someone I am interested in analysis and yet can't even tell you why Banach started looking at complete normed spaces (which were afterwards named Banach spaces after him), because I seem to have to learn everything in some sort of backwards fashion.

One of the biggest motivators for the development of functional analysis was the field of quantum mechanics. And I'm probably not telling you anything you don't already know. If you're looking for real-world applications to motivate the study of functional analysis, look no further. Is that what you're asking?

I'm not really looking for real-world applications - although it is great to know they are there. And in fact I didn't know that - I really know very little about where everything started off and comes from, which is pretty much what I am talking about.

Consider the dual space of a vector space. In every textbook I have read, it is given as a definition with no motivation whatsoever; a few results are proved, the Hahn-Banach theorem is stated, and the reader is supposed to believe that this is a remarkable theorem and an amazing result. Why? I have no idea - I have only been given an abstract definition with no motivation for why we need to define this. Obviously no one created this definition out of thin air and then proved some results about it - someone found out something before that, used it, realised these types of spaces were useful, and then named them dual spaces - but how did this happen? No book can tell me, and I find this happens a lot in the analysis textbooks that I read.

What I used to see as being a beautiful subject now seems to be more like a big mess on my page, that I might be able to mathematically follow, but that I have no intuition of, and that I can't understand the reasoning behind.

For the dual space, look no further than the Dirac notation, in physics, for inner products. Vectors in the regular vector space look like this: $|v\rangle$. They're called "kets". Vectors in the dual space look like this: $\langle v|$. They're called "bras", and consist of the complex conjugate transpose of vectors in the regular vector space (at least, for finite-dimensional spaces, this is true. It's a bit more complicated for infinite-dimensional spaces, but the idea generalizes). Now you can form a "bra-ket", or "bracket", which looks like this: $\langle u|v\rangle$.

It's an inner product! So that's what vectors in the dual space are for: regular multiplication with vectors in the original space produces an inner product.

However, you can also do it in reverse: a "ket-bra". The result is an operator (matrix, in finite-dimensional spaces), which looks like this:

$|v\rangle\langle u|$

and is called an "outer product".

So that's the motivation for the dual space: it's where the vectors live that, when multiplied with vectors from the original space, produce either an inner product, or an outer product.
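To make the bra-ket bookkeeping concrete, here is a minimal pure-Python sketch for finite-dimensional complex vectors. The function names `bra`, `braket`, and `ketbra` are my own, and the conjugation sits in the first slot, following the physicists' convention.

```python
# A minimal sketch of bras, kets, and their products for a
# finite-dimensional complex vector space, using plain Python lists.
# The names bra, braket, and ketbra are illustrative, not any standard API.

def bra(ket):
    """The bra corresponding to a ket: the complex conjugate
    (of the transpose, if you picture the ket as a column vector)."""
    return [z.conjugate() for z in ket]

def braket(u, v):
    """<u|v>: the inner product, conjugate-linear in the first slot."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def ketbra(u, v):
    """|u><v|: the outer product, an n x n matrix (list of rows)."""
    return [[a * b.conjugate() for b in v] for a in u]

u = [1 + 1j, 0]
v = [2, 3j]

print(braket(u, v))        # the bra-ket <u|v>
print(braket(u, u).real)   # <u|u> = ||u||^2, always real and >= 0
print(ketbra(u, v))        # the rank-one operator |u><v|
```

Note that `braket(u, u)` always comes out real and non-negative, which is exactly the property the conjugation buys you.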

With this idea, you can now build up things like the Riesz Representation Theorem, various spectral theorems, etc. It's a very powerful notation, and one that the mathematicians might, with advantage, adopt more widely.

I've always been introduced to the dual space as the set of linear maps from a vector space V to a field F. Is this the same thing? I can't see the connection here with the idea of complex conjugate transpose (i.e. what does this mean if you have functions?). If I have a vector space V, such as the real numbers, and the dual space V* as the set of linear maps from R to R - how do I make the inner product that you mention?

I've always been introduced to the Dual space as the set of linear maps from a vector space V to a field F. Is this the same thing?

Yes. The result of the inner product is a number in the field. So, multiplication by the vector in the dual maps the vector in V to a number in F, the same as what you just said.

Originally Posted by measureman

I can't see the connection here with the idea of complex conjugate transpose

Ok. Let's start with the real case. The basic idea here is to write the inner product in terms of matrix multiplication. If I have a vector $x = (x_1, \dots, x_n)$ and a vector $y = (y_1, \dots, y_n)$ and I want to form their inner product, I do this:

$\langle x, y\rangle = \sum_{i=1}^{n} x_i y_i,$

right?

Now, if $x$ and $y$ are both column vectors, I could write this using matrix multiplication as follows:

$\langle x, y\rangle = x^{\mathsf{T}} y.$

In this expression, the usual matrix multiplication does for me what the sum does in the previous expression. With me so far?

If we extend this idea to the complex case (that is, the components of the vectors can have imaginary parts), you have to do a complex conjugate of one of the vectors, or the usual formula for magnitude doesn't come out right. That is, suppose $z$ is a complex-valued vector. I want my usual formula

$\|z\|^2 = \langle z, z\rangle$

to work. But it won't if the dot product doesn't have any conjugation going on in there. Example: take the one-dimensional complex vector $z = i$.

Then I know from my geometry of the complex plane that $|z| = 1$. But, if I just blindly use the real case's dot product, I get

$z \cdot z = i \cdot i = -1.$

So you see, with no complex conjugation, my formula doesn't work. But, if I do the complex conjugation, it does:

$\bar{z}\, z = (-i)(i) = 1 = |z|^2,$

as required. So now, if I want to write the complex-case inner product using matrix multiplication, I'm going to have to write it this way:

$\langle u, v\rangle = u^{\dagger} v,$

where the symbol $\dagger$ indicates the complex conjugate transpose (Hilbert adjoint). So that's the motivation/connection for the complex conjugate transpose.
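A quick numerical check of this point, using Python's built-in complex numbers (the variable names are mine): for $z = i$, the unconjugated product gives $-1$, while the conjugated product recovers $|z|^2 = 1$.

```python
# Why the conjugate is needed: check ||z||^2 for the one-dimensional
# complex vector z = i, using Python's built-in complex type.
z = 1j

abs_squared = abs(z) ** 2        # geometric magnitude squared: |i|^2 = 1
naive = z * z                    # real-style dot product: i * i = -1
conjugated = z.conjugate() * z   # conjugated product: (-i) * i = 1

print(abs_squared, naive, conjugated)
```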

Originally Posted by measureman

(i.e. what does this mean if you have functions?).

For functions, the inner product is defined (assuming all regularity conditions are satisfied) entirely analogously to the finite-dimensional case:

$\langle f, g\rangle = \int \overline{f(x)}\, g(x)\, dx.$
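For a concrete (if crude) picture, here is a sketch that approximates that integral with a midpoint Riemann sum. The name `l2_inner` and the grid size are my own choices, and this ignores all the convergence and measurability issues a real treatment would handle.

```python
import cmath

def l2_inner(f, g, a=0.0, b=1.0, n=10_000):
    """Approximate <f, g> = integral of conj(f(x)) * g(x) over [a, b]
    with a midpoint Riemann sum (conjugation on the first argument)."""
    h = (b - a) / n
    total = 0.0
    for k in range(n):
        x = a + (k + 0.5) * h
        total += complex(f(x)).conjugate() * g(x) * h
    return total

# <x, x> = integral of x^2 over [0, 1] = 1/3
print(l2_inner(lambda x: x, lambda x: x))

# For f(x) = e^{2 pi i x}, conj(f(x)) * f(x) = 1, so <f, f> = 1
f = lambda x: cmath.exp(2j * cmath.pi * x)
print(l2_inner(f, f))
```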

Incidentally, I much prefer the inner product being linear in the second argument (the way the physicists do it), rather than linear in the first argument. The physicists just have it way over the mathematicians here.

Originally Posted by measureman

If I have a vector space V, such as the real numbers, and the dual space V* as the set of linear maps from R to R - how do I make the inner product that you mention?

If your vector space is just the real numbers, then the dual space is also just the real numbers, and your inner product is normal multiplication of real numbers.

Functional analysis has its origins in the work of Hilbert and Fredholm, who were interested in techniques for solving differential and integral equations. They had the amazing insight that ideas from the geometry of finite-dimensional spaces, such as length and angle (represented by inner products) and duality (which occurs in the context of tangent spaces), could be applied to the study of infinite-dimensional spaces of functions. What Banach did in his great book Théorie des opérations linéaires was to show how the topological concept of completeness is the key property for proving deep analytic results in infinite-dimensional spaces. The techniques that he developed are crucial in almost every branch of differential equations theory.

Dual space

Hi,
Your discussion was interesting for me, since I am studying physics. I have many problems with functional analysis because, as a physicist, I can't read something and pass through it without forming a picture of it, and one such topic is the dual space.
As I understand it, the dual is the space of all linear functions on a vector space, but a function means something that acts on other things, for example multiplication. So why do we say, for example, that if our vector space is the real numbers, then its dual is the real numbers? By the definition of a linear function, our function here is multiplication, not another real line.
Thanks in advance

Reply to ebi120: I will do the best I can in replying.

...the dual is the space of all linear functions on a vector space...

It's better to say that the dual of a vector space is the space of all linear functionals on the vector space. This might help to answer your next question as well. If your vector space is the real numbers, then the dual is the real numbers again, because multiplication of a real by a real gets you a real, and hence multiplication can be considered the "inner product" on the reals.

Dual Space

Thanks for your answer; I think I more or less understand it. As I understand it, you said that because real numbers multiply real numbers, our dual space is the real numbers again. But what about the situation where my space is a space of matrices? As I understand it, the trace of these matrices gives my dual space, but my confusion is: how can the act of taking the trace of matrices be a space?


I hope you get my meaning.

The dual of the space of all $n\times n$ matrices is again the space of $n\times n$ matrices. (In fact, the dual of any finite-dimensional space can be identified with the space itself.) The way that the trace comes into it is like this: if you have an $n\times n$ matrix $A$, then it induces a linear functional $f_A$ on the space of $n\times n$ matrices through the formula

$f_A(X) = \operatorname{tr}(AX).$
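Here is a small pure-Python sketch of that construction (the helper names `matmul`, `trace`, and `functional` are mine), including a check that the induced map really is linear:

```python
# A matrix A induces the linear functional f_A(X) = tr(A X) on the
# space of n x n matrices. Matrices here are plain lists of rows.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

def functional(A):
    """The linear functional on matrices induced by the matrix A."""
    return lambda X: trace(matmul(A, X))

A = [[1, 2], [3, 4]]
f_A = functional(A)

X = [[0, 1], [1, 0]]
Y = [[5, 0], [0, 7]]

# Linearity check: f_A(2X + Y) should equal 2 f_A(X) + f_A(Y).
lin = [[2 * X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]
print(f_A(X), f_A(Y), f_A(lin))
```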

Thanks. As I understand it, you mean the dual space of a finite-dimensional space is the space itself. So is it true that, for a finite-dimensional space, the dual is not a new space, but the same as the old space?
And if yes, what is the reason?
And another question: when we say a continuous linear function, what does it mean? How can we prove continuity? And is it possible to give two examples: one of a continuous linear functional and another of a non-continuous one?

Thanks. As I understand it, you mean the dual space of a finite-dimensional space is the space itself. So is it true that, for a finite-dimensional space, the dual is not a new space, but the same as the old space?

What Opalg actually wrote was, "...the dual of any finite-dimensional space can be identified with the space itself." When you ask the question, "Are they the same?", I have to say, "It depends." If you mean does the dual have the same basic algebraic properties as the vector space, then yes. You can provide an isomorphism from the vector space to its dual. But are they identical? I'd say no, because, for example, the dual space of a vector space consisting of column vectors is a vector space of the same dimension consisting of row vectors.
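As a concrete sketch of that identification (pure Python, names mine): each column vector $v$ corresponds to the functional "multiply on the left by the row vector $v^{\mathsf T}$":

```python
# Identify a vector v with the linear functional x -> v . x
# (i.e. the row vector v^T acting on the column vector x).

def functional_from_vector(v):
    return lambda x: sum(vi * xi for vi, xi in zip(v, x))

v = [1, 2, 3]
phi = functional_from_vector(v)

print(phi([4, 5, 6]))   # 1*4 + 2*5 + 3*6 = 32
```

In finite dimensions this correspondence $v \mapsto \varphi_v$ is a bijection, which is the precise sense in which the dual "can be identified with" the space itself.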

And if yes, what is the reason?
And another question: when we say a continuous linear function, what does it mean?

Linear function? Or linear functional?

How can we prove continuity? And is it possible to give two examples: one of a continuous linear functional and another of a non-continuous one?

Well, you can show that the inner product is continuous. That is, you can prove that in an inner product space, if you have $x_n \to x$ and $y_n \to y$, then

$\langle x_n, y_n\rangle \to \langle x, y\rangle.$

And, by the Riesz Representation Theorem, every continuous linear functional on a Hilbert space can be represented as an inner product. In a finite-dimensional space, every linear functional is automatically continuous. If you want a discontinuous linear functional, you have to go infinite-dimensional: for example, differentiation at a point, $f \mapsto f'(0)$, on the continuously differentiable functions with the sup norm, is linear but unbounded, hence discontinuous.
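As a numerical aside on this continuity business (my own example): differentiation at a point, $f \mapsto f'(0)$, is linear, yet on $f_n(x) = \sin(nx)$, which has sup norm $1$, it returns $n$, so it is unbounded with respect to the sup norm and therefore cannot be continuous for that norm.

```python
import math

def derivative_at_zero(f, h=1e-6):
    """Central-difference estimate of f'(0)."""
    return (f(h) - f(-h)) / (2 * h)

# f_n(x) = sin(n*x) has sup norm 1, but f_n'(0) = n: the values of the
# linear functional f -> f'(0) blow up on a sup-norm-bounded set.
for n in (1, 10, 100):
    f_n = lambda x, n=n: math.sin(n * x)
    print(n, derivative_at_zero(f_n))
```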

Thanks. By continuity, I meant continuity for functionals. And what about self-dual spaces? If we have a self-dual space, can we say in that case that the dual space and the original space are identical?
And one other thing: because I am a new user, I don't know where I can ask my questions about sequences in vector spaces.
Thanks