Introduction to Integration
In this section, we examine the other half of calculus, integral calculus. To begin, we introduce some new notation. First, suppose we have the function $y = f(x)$. The differential of $y$ is denoted $dy$, where:

$$dy = f'(x)\,dx$$

Because $y$ is equal to $f(x)$, we often read $dy$ as the differential of $f$. Notice that this notation looks similar to the notation for a derivative:

$$\frac{dy}{dx} = f'(x)$$

Indeed, the notations are closely related:

$$\frac{dy}{dx} = f'(x) \qquad\Longleftrightarrow\qquad dy = f'(x)\,dx$$
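For a concrete instance, suppose $y = x^2$. Then $f'(x) = 2x$, and the two notations express the same relationship:

$$\frac{dy}{dx} = 2x \qquad\Longleftrightarrow\qquad dy = 2x\,dx$$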
This close relationship is no accident. It stems directly from the Leibnizian (after the mathematician Gottfried Wilhelm Leibniz) interpretation of the derivative: derivatives are ratios of differentials. Hence the notation:

$$\frac{dy}{dx}$$

So $dy$ and $dx$ are differentials. But what is a differential? A differential is a kind of infinitesimal: an infinitely small quantity. Thus, whenever we see the notation:

$$dx$$

or anything that looks like it (e.g., $dy$, $dt$, $du$, etc.), we want to think: "A very, very tiny bit of $x$" (or, a very, very tiny bit of $y$, a very, very tiny bit of $t$, etc.)
Leibniz is credited with perfecting techniques for handling infinitesimals, in part because of how effective his notation was; it's far, far more effective than Newton's.1 So much so, in fact, that some historians argue that the reliance on Newton's notation set British mathematics behind continental Europe's by over a century.
A good way to see how the notation works is through linear approximation. Recall how we used $\Delta y$ and $\Delta x$ to denote the change in $y$ and the change in $x$, respectively:

$$\Delta y \approx f'(x)\,\Delta x$$

This is Newton's notation. With Leibniz notation, we replace $\Delta y$ with $dy$ and $\Delta x$ with $dx$:

$$dy = f'(x)\,dx$$
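To see what the replacement buys us, take $y = x^2$ again and compare the exact increment with the differential:

$$\Delta y = (x + \Delta x)^2 - x^2 = 2x\,\Delta x + (\Delta x)^2, \qquad dy = 2x\,dx$$

The differential keeps only the part of the change that is linear in $dx$; the leftover $(\Delta x)^2$ becomes negligible as $\Delta x$ shrinks, which is why we may write $\Delta y \approx dy$ for very small changes.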
For example, consider the problem:
problem What is $\sqrt[3]{64.1}$ approximately equal to?

We start by writing:

$$y = x^{1/3}$$

With Leibniz notation, we take the differential of $y$:

$$dy = \frac{1}{3}x^{-2/3}\,dx$$

Following the same ideas we saw with linear approximations, we pick a starting point, say $x = 64$.

Returning to Leibniz notation, we get:

$$dy = \frac{1}{3}(64)^{-2/3}\,dx = \frac{1}{3}\cdot\frac{1}{16}\,dx = \frac{1}{48}\,dx$$

Given $64.1 = 64 + \frac{1}{10}$, our equation is:

$$dx = \frac{1}{10}$$

This means that:

$$dy = \frac{1}{48}\cdot\frac{1}{10} = \frac{1}{480}$$

Hence, we know that $dy = \frac{1}{480}$. This is the increment, or infinitesimal change, we're interested in. Carrying out the approximation, we know that $64^{1/3} = 4$, so:

$$\sqrt[3]{64.1} \approx 4 + \frac{1}{480} \approx 4.002$$
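How good is this estimate? Cubing it shows we land almost exactly on $64.1$:

$$\left(4 + \tfrac{1}{480}\right)^3 = 4^3 + 3\cdot 4^2\cdot\tfrac{1}{480} + 3\cdot 4\cdot\tfrac{1}{480^2} + \tfrac{1}{480^3} = 64.1 + (\text{terms smaller than } 0.0001)$$

The leftover terms are the higher powers of $dx$ that the differential discards; they affect only the fourth decimal place onward.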
Antiderivatives
Another piece of notation we'll introduce is the following:

$$F(x) = \int f(x)\,dx$$

In the notation above, $F(x)$ is called the antiderivative or indefinite integral of $f(x)$. The $\int$ is the integral symbol. To understand what these terms mean, let's start with some examples. When we see the expression:

$$\int \sin x\,dx$$

we read it as "the integral of $\sin x$." The integral of $\sin x$ is a function whose derivative is $\sin x$. From our discussion of derivatives, we know that the derivative of $-\cos x$ is $\sin x$. Accordingly, we say that $-\cos x$ is the antiderivative, or indefinite integral, of $\sin x$:

$$\int \sin x\,dx = -\cos x$$
But why is it called "indefinite"? Because the derivative of a constant is zero, so we can add any constant $c$ to $-\cos x$ and still get the derivative $\sin x$:

$$\int \sin x\,dx = -\cos x + c$$
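The constant genuinely makes no difference under differentiation. Picking an arbitrary constant, say $c = 7$:

$$\frac{d}{dx}\left(-\cos x + 7\right) = \sin x + 0 = \sin x$$

Any other choice of $c$ behaves the same way, which is why the notation cannot pin the constant down.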
Whenever we take the antiderivative of a function, it's ambiguous up to some constant. Let's consider another example. What does this expression evaluate to:

$$\int x^a\,dx$$

This example is essentially asking us, "What function, when differentiated, yields $x^a$?" We know from the power rule that:

$$\frac{d}{dx}\,x^{a+1} = (a+1)\,x^a$$

Thus, to get $x^a$, we need to cancel out the $(a+1)$:

$$\int x^a\,dx = \frac{x^{a+1}}{a+1}$$

Checking our work using differential notation:

$$d\left(\frac{x^{a+1}}{a+1}\right) = \frac{(a+1)\,x^a}{a+1}\,dx = x^a\,dx$$

And because we can add any constant, we must write:

$$\int x^a\,dx = \frac{x^{a+1}}{a+1} + c$$
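To make the general rule concrete, take $a = 3$:

$$\int x^3\,dx = \frac{x^4}{4} + c, \qquad \frac{d}{dx}\left(\frac{x^4}{4} + c\right) = x^3$$

Differentiating recovers the integrand exactly, constant and all.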
Furthermore, examining the antiderivative, we see a restriction. The proposition:

$$\int x^a\,dx = \frac{x^{a+1}}{a+1} + c$$

is true if, and only if, $a \neq -1$ (otherwise we would be dividing by zero). On the other hand, the proposition:

$$\frac{d}{dx}\,x^{a+1} = (a+1)\,x^a$$

contains no such restriction. Consider another problem. Evaluate the expression below:

$$\int \frac{dx}{x}$$
The first step is to rewrite the expression into a more familiar form:

$$\int \frac{1}{x}\,dx$$

Looking at this, we immediately see that the function we're looking for is $\ln x$ (don't forget the constant):

$$\int \frac{1}{x}\,dx = \ln x + c$$

If we think carefully about the derivative of $\ln x$, we'd realize that we can also get the derivative $\frac{1}{x}$ when $x$ is negative:

$$\frac{d}{dx}\,\ln(-x) = \frac{-1}{-x} = \frac{1}{x} \qquad (x < 0)$$

Accordingly, the antiderivative is more properly written as:

$$\int \frac{dx}{x} = \ln\lvert x\rvert + c \qquad (x \neq 0)$$
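To verify, write $\ln\lvert x\rvert$ piecewise and differentiate each branch:

$$\ln\lvert x\rvert = \begin{cases}\ln x & x > 0\\[4pt] \ln(-x) & x < 0\end{cases} \qquad\Longrightarrow\qquad \frac{d}{dx}\,\ln\lvert x\rvert = \begin{cases}\dfrac{1}{x} & x > 0\\[6pt] \dfrac{-1}{-x} = \dfrac{1}{x} & x < 0\end{cases}$$

Either way the derivative is $\frac{1}{x}$, so $\ln\lvert x\rvert + c$ is an antiderivative on each side of zero.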
Another example:

$$\int \sec^2 x\,dx$$

Here we get:

$$\int \sec^2 x\,dx = \tan x + c$$
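This one is easy to miss unless we remember the derivative of $\tan x$. Differentiating confirms the answer:

$$\frac{d}{dx}\,\tan x = \frac{d}{dx}\,\frac{\sin x}{\cos x} = \frac{\cos^2 x + \sin^2 x}{\cos^2 x} = \frac{1}{\cos^2 x} = \sec^2 x$$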
The addition of a constant is the only ambiguous thing about antiderivatives. If we examine the antiderivative closely, we can draw a few inferences. Say we had some function $F(x)$ and another function $G(x)$. If $F'(x) = G'(x)$, then it follows that:

$$F(x) = G(x) + c$$

If $F'(x) = G'(x)$, then it must be the case that:

$$(F - G)'(x) = F'(x) - G'(x) = 0$$

A function whose derivative is zero everywhere must be constant, so $F(x) - G(x) = c$. Rearranging:

$$F(x) = G(x) + c$$

In other words, $F(x) - G(x)$ results in a constant.2
What this tells us is that the addition of a constant is the only ambiguous thing about antiderivatives. Beyond that constant, the antiderivative is unique: if $F'(x) = f(x)$, then $F(x) + c$ describes every antiderivative of $f(x)$.

antiderivative uniqueness theorem If $F'(x) = G'(x)$, then $F(x) = G(x) + c$ for some constant $c$.
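A concrete instance: the functions $F(x) = \sin^2 x$ and $G(x) = -\cos^2 x$ have the same derivative,

$$F'(x) = 2\sin x\cos x = G'(x),$$

so the theorem says they must differ by a constant. Indeed, $\sin^2 x - (-\cos^2 x) = \sin^2 x + \cos^2 x = 1$.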
Footnotes
-
Constructing a notation system is an exercise in abstraction. A good notation system can significantly impact how easy or difficult a problem is — it allows its user to rapidly draw inferences, which is precisely how mathematics is done. It doesn't take much imagination to see why this is the case. If we had to write computer programs in binary — the language understood by computers — it's unlikely we'd see the myriad of technologies we see today. ↩
-
Notice that this was one of the lemmas we drew from the mean value theorem. ↩