So you can see the standard format is not even close to the whole picture. If you want to integrate terms like $\int (e^{\mathrm{d}x} - 1)$ please do! There is absolutely nothing to stop you from figuring out from scratch how to solve this sort of integral and making the theory rigorous.
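One way such a from-scratch account might go (a heuristic sketch, not a standard definition: interpret the integral over $[a,b]$ as a limit of sums over $n$ equal increments $\Delta x = (b-a)/n$, so that only the first-order part of each summand survives):
$$\int_a^b \left(e^{\mathrm{d}x}-1\right) \;:=\; \lim_{n\to\infty} \sum_{i=1}^{n} \left(e^{\Delta x}-1\right) \;=\; \lim_{n\to\infty} \sum_{i=1}^{n} \left(\Delta x + \frac{(\Delta x)^2}{2} + \cdots\right) \;=\; b-a,$$
since the $n$ second-order terms are each $O(1/n^2)$ and vanish in the limit; on this reading the expression behaves exactly like $\int \mathrm{d}x$.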

Thanks for the answer. I think it's the closest to answering my original question. Also, special thanks for the link to the intro to non-standard analysis. I've been interested in learning about hyperreal numbers for a while now, but every article I've looked at so far has been kind of terse.
–
Sami Aug 13 '10 at 3:00

You write «If you want to integrate terms like $\int(e^{dx}-1)$ please do!» but you don't mean integrate: you mean find a meaning for the notation. You say «how to solve this sort of integral» but that integral does not make sense according to standard definitions of the integral, so it simply does not make sense to 'solve' it. You can of course come up with a theory that does give sense to such notations, but coming up with such a thing is not computing integrals...
–
Mariano Suárez-Alvarez♦ Oct 21 '10 at 14:20

@Mariano Suárez-Alvarez, that is splitting hairs: everything you said is formally correct, but there's not much to be learned from it.
–
anon Oct 21 '10 at 16:34

@muad: one cannot compute things which do not make any sense under the definitions one uses. That is not splitting hairs!
–
Mariano Suárez-Alvarez♦ Oct 21 '10 at 16:58

@muad: I have seen way too many students 'compute' things which do not make any sense with the definitions they have... I would say that recognizing when something is defined or not is one of the most important things one has to learn in order to do any kind of meaningful math.
–
Mariano Suárez-Alvarez♦ Oct 22 '10 at 16:26

I think your question here shows that, while you have been using these symbols, you haven't really been given a proper motivation for where they came from.
Let's go back and consider how we came up with the idea of an integral. In a typical class, you will see a lot of pictures like this: a curve with the region beneath it sliced into many thin rectangles.
We find the area under the curve by summing up the areas of all these little rectangles. If we wanted to write an expression for the area, it would look like
$$\sum_{i=1}^{n} f(x_i)\,\Delta x.$$
The $\sum$ means that we are computing a sum. We are adding the areas of the rectangles, which we have numbered $1$ through $n$, to get the complete area under the curve. The area of each rectangle is given by multiplying the height by the width. The height is given by $f(x_i)$ because the base of the rectangle is at $0$, and the top of the rectangle is where it meets the function $f$. The $\Delta x$ represents the width of each rectangle.
When we find the integral, we are taking the limit of this sum as the number of rectangles goes to infinity, and each individual rectangle becomes infinitesimally tiny. You can think of the $dx$ as the equivalent of $\Delta x$: it represents the infinitesimally small width of each rectangle that we added up to get the area.
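In symbols, the picture above amounts to the limit (a sketch, assuming the interval is $[a,b]$ and the sample points $x_i$ and widths $\Delta x$ are as described):
$$\int_a^b f(x)\,dx \;=\; \lim_{n\to\infty} \sum_{i=1}^{n} f(x_i)\,\Delta x.$$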

Once you realize this, you can see why integrals only make sense when written $\int f(x)\,dx$: we are adding up the areas of rectangles that have height $f(x)$ and width $dx$. If you try to interpret the expressions you wrote in this way, you will see that they do not really make sense as integrals: you are not summing up rectangles, so you are not finding an area under a curve.

You could, of course, define your own notation in which those expressions behave the way you expect them to, but all mathematical notation is driven by what people find useful, and what people can agree on and easily understand. Reusing the integral sign and the $dx$ that people are used to seeing in a particular context will probably result in few people adopting your definition.

@Sami: There is a limit definition for the integral. However, even for the standard Riemann integral, the rigorous definition can appear quite arcane. A less general definition may look something like the sketch below. Like I said in my answer, you can adjust the expression as you have in your comment, but the result is not useful in the same way the standard integral is; it only shares a similarity in form.
–
Larry Wang Aug 3 '10 at 5:22
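(A sketch of the kind of less general definition alluded to above, assuming a uniform partition of $[a,b]$ sampled at right endpoints:)
$$\int_a^b f(x)\,dx \;=\; \lim_{n\to\infty} \sum_{i=1}^{n} f\!\left(a + i\,\frac{b-a}{n}\right)\frac{b-a}{n}.$$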

@Justin L, indeed: in non-standard analysis it is a beautiful definition!
–
anon Aug 12 '10 at 20:21

We should just write $\int f$ or $\int \lambda x. f(x)$ or something, if the $\mathrm{d}x$ is meaningless and indivisible.
–
anon Aug 12 '10 at 19:50

1

@muad, why should we? Should we also drop the $df/dx$ notation? There is absolutely no reason why notations which are 'split' in this way should not be used.
–
Mariano Suárez-Alvarez♦ Oct 21 '10 at 14:17

No, it's not valid. The $dx$ in the integral is a representation of the fact that the integral is obtained as an area: we multiply the value of the function at each point by an infinitesimally small interval of width $dx$.

As the manner in which we calculate the area does not change, the notation does not change.

There are different notations that are used when the integral is over a curve, or over more than one variable (leading, for example, to volumes).

The $d(\text{variable})$ notation is also used as a reminder that the integral is taken with respect to a specific variable and not another, e.g. that $\int \frac{x}{y}\,dx$ differs from $\int \frac{x}{y}\,dy$.
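For instance, treating the other variable as a constant in each case gives two quite different antiderivatives:
$$\int \frac{x}{y}\,dx = \frac{x^{2}}{2y} + C, \qquad \int \frac{x}{y}\,dy = x \ln\lvert y\rvert + C.$$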

In the context of calculus, $dx$ simply means 'integrate with respect to $x$'. Some books even omit $dx$ entirely because 'of course we're integrating with respect to $x$'. The $dx$ does not get a proper meaning of its own until you reach differential forms.
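A hint of that deeper meaning already appears in the substitution rule, where $dx$ transforms exactly as if it were a genuine differential: if $x = g(u)$, then
$$\int f(x)\,dx = \int f(g(u))\,g'(u)\,du, \qquad \text{i.e.} \quad dx = g'(u)\,du.$$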