Order of functions, asymptotes

In the previous section we talked about comparing functions around infinity. In fact, this was just a special case: we can compare functions at any point of their domains. However, it is most useful at infinity, since at other points it is less intuitive. We will start with a brief survey of order from a theoretical point of view, then we will introduce asymptotes.

Order of functions

Like the intuitive evaluation of limits, comparing functions was inspired by practical problems. In applications (above all in physics) it often helps if we can replace a complicated expression by a simple one without making a large error.

Definition.
Let a be a real number, ∞, or −∞. Let f,g be functions defined on some reduced neighborhood of a.

We say that f = O(g) at a if there is a reduced neighborhood U of a and a constant A > 0 so that for all x from U we have

|f(x)| ≤ A⋅|g(x)|.

We say that f ≍ g at a, also denoted f = Θ(g), if there is a reduced neighborhood U of a and constants A1, A2 > 0 so that for all x from U we have

A1⋅g(x) ≤ f(x) ≤ A2⋅g(x).

We say that f = o(g) at a, also denoted f << g, if

lim(x→a) f(x)/g(x) = 0.

We say that f ∼ g at a if

lim(x→a) f(x)/g(x) = 1.

Sometimes we use the notation f = O(g), x→a, and f = o(g), x→a.

The double inequality in the definition of ≍ can be expressed using just one constant: the condition is equivalent to finding one positive A satisfying (1/A)⋅g(x) ≤ f(x) ≤ A⋅g(x) (take, for instance, A = max(A2, 1/A1)). This is in turn equivalent to (1/A)⋅f(x) ≤ g(x) ≤ A⋅f(x), so ≍ is obviously a symmetric relation.

The limit condition in the definition of o(g) can be also written like this: for every ε > 0 there is a reduced neighborhood U of a such that for all x from U we have |f(x)| ≤ ε⋅|g(x)|.

The first two definitions describe rough comparison. Either f is smaller than g up to a multiplicative constant (around a), or it is about the same as g up to a multiplicative constant (f ≍ g means that close to a the graph of f lies in a strip around the graph of g whose width is determined by the two constants). The next two definitions express a similar idea using limits, but they require more. When f = o(g) at a, it means that close to a the function f is negligible compared to g. We will return to this later. The condition f ∼ g means that close to a these two functions are pretty much equal.

Expressed precisely, the latter two notions are stronger. If f = o(g) at a, then also f = O(g) at a. If f ∼ g at a, then also f ≍ g at a.
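
These one-way implications are easy to test on concrete pairs via ratio limits. The following quick check is not part of the original text; it is a minimal sketch using Python with the sympy library (assumed installed), chosen here just to illustrate the relations.

```python
# Sketch: probing the relations via ratio limits (sympy assumed installed).
from sympy import symbols, limit, oo, sin

x = symbols('x', positive=True)

# f = 2x, g = x at infinity: the ratio tends to 2, so f = O(g) and f ≍ g,
# but not f = o(g) (the ratio does not tend to 0) and not f ~ g (not to 1).
print(limit(2*x/x, x, oo))      # 2

# f = x, g = x^2 at infinity: the ratio tends to 0, so f = o(g), hence f = O(g).
print(limit(x/x**2, x, oo))     # 0

# f = sin(x), g = x at 0: the ratio tends to 1, so f ~ g, hence f ≍ g near 0.
print(limit(sin(x)/x, x, 0))    # 1
```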

Terminology is a bit vague at this point. The two symbols O and o are called "big-O" and "little-o"; for instance, we would say that "f is a little-o of g at a". These two symbols are also sometimes referred to as "Landau symbols". These two comparisons are orderings, that is, they have properties similar to those of the usual inequality. For instance, they are transitive: if at a we have f = o(g) and g = o(h), then f = o(h); the same is true for "big-O".

The two relations ≍ and ∼ are equivalences, so they have properties similar to those of the usual equality. The relation ≍ and the big-O are tied together similarly as inequality and equality. Namely, if at a we have f = O(g) and g = O(f), then necessarily f ≍ g. This is not true for little-o and the ∼ relation.

Both pairs are connected in another way: If f = O(g) and g ≍ h at a, then f = O(h) there. Likewise, if f = O(g) and f ≍ h at a, then h = O(g) there.
Similarly, if f = o(g) and g ∼ h, then f = o(h). Likewise, if f = o(g) and f ∼ h, then h = o(g).

When f ≍ g or f ∼ g at a (or when f = O(g) and g = O(f) there), we often say that f and g are of the same order (of magnitude) at a. The equivalence ≍ is hard to prove directly from the definition; instead we usually use the following.

Fact.
If there is a real number A > 0 such that

lim(x→a) f(x)/g(x) = A,
then f ≍ g at a.
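
For instance, this is how one would confirm that 13x² + x ≍ x² at infinity (an example we return to below). The following check is a sketch using sympy (assumed installed), not part of the original text.

```python
# Sketch: applying the Fact through a ratio limit (sympy assumed installed).
from sympy import symbols, limit, oo

x = symbols('x')

# The ratio tends to 13 > 0 at infinity, so 13x^2 + x ≍ x^2 there.
print(limit((13*x**2 + x)/x**2, x, oo))   # 13
```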


We already mentioned the usefulness of the relation ∼ when guessing limits. It also comes in very handy in physics. In particular, one often compares functions to powers. If we have a more complicated function and we want to guess its behavior near some point, we can try to compare different parts of this function to powers; then we know which parts can be ignored.

Example: Assume that we have a function f + g and we know that f ∼ x^A and g ∼ x^B at infinity for some A,B > 0. If A > B, then we can ignore the part g around infinity; mathematically, (f + g) ∼ f at infinity. Note that from A > B we also obtain that g = o(f) at infinity.
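
As a concrete illustration of this example, here is a sketch with sympy (assumed installed); the particular choices A = 3 and B = 2 are ours.

```python
# Sketch: at infinity the higher power dominates (sympy assumed installed).
from sympy import symbols, limit, oo

x = symbols('x', positive=True)
f = x**3   # plays the role of f ~ x^A with A = 3
g = x**2   # plays the role of g ~ x^B with B = 2, so A > B

print(limit(g/f, x, oo))         # 0: g = o(f) at infinity
print(limit((f + g)/f, x, oo))   # 1: (f + g) ~ f at infinity
```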

In general we have this:

Fact.
If g = O(f) at a, then (f + g) ≍ f at a.
If g = o(f) at a, then (f + g) ∼ f at a.

In fact, we have been using these considerations in the previous section on intuitive evaluation at infinity. In physics and numerical mathematics one also often compares at zero. There it is a bit tricky, since the scale of powers works differently. In particular, larger powers are actually beaten by smaller powers at 0.

Example: Assume that we have a function f + g and we know that f ∼ x^A and g ∼ x^B at 0 for some A, B > 0. If A < B, then we can ignore the part g around 0; mathematically, (f + g) ∼ f at 0.

Example: The function f(x) = x² − x is of order x² at infinity, that is, f ∼ x² at infinity. On the other hand, f(x) = x² − x is of order x at 0 (check!); more precisely, f ∼ −x at 0, since the relation ∼ takes the sign into account.
The little-o comparisons are also reversed. On the one hand, at infinity we have x << x², that is, x = o(x²). On the other hand, at zero we have x² << x, that is, x² = o(x).
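
The reversal is easy to verify with limits; here is a sketch using sympy (assumed installed) for the function from the example above.

```python
# Sketch: the power scale reverses at 0 (sympy assumed installed).
from sympy import symbols, limit, oo

x = symbols('x')
f = x**2 - x

print(limit(f/x**2, x, oo))   # 1: f ~ x^2 at infinity
print(limit(f/(-x), x, 0))    # 1: f ~ -x at 0
print(limit(x**2/x, x, 0))    # 0: x^2 = o(x) at 0
print(limit(x/x**2, x, oo))   # 0: x = o(x^2) at infinity
```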

Physicists, engineers and numerical mathematicians use such stuff quite often. They would determine the order of the whole expression and then ignore all its terms that are o of this order.

One can also make comparisons at a from one side only. Definitions and properties are analogous.

Example: The function f(x) = x² − ln(x) is of order x² at infinity, that is, f ∼ x² at infinity. In other words, we have ln(x) << x² at infinity, or ln(x) = o(x²) at infinity.

On the other hand, f(x) = x² − ln(x) is of order ln(x) at 0 from the right (check!); more precisely, f ∼ −ln(x) at 0 from the right, since f is positive there while ln(x) is negative. In other words, we have x² << ln(x) at 0 from the right, or x² = o(ln(x)) at 0 from the right. This is unpleasant: we cannot assign a power-order to f at 0 from the right, but that's life. Indeed, check that there is no A for which we would have f ∼ x^A at 0 from the right.
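
One-sided limits settle all these claims; here is a sketch with sympy (assumed installed), where the power x^(1/2) is just one sample exponent A of our choosing.

```python
# Sketch: one-sided comparison at 0 from the right (sympy assumed installed).
from sympy import symbols, limit, log, sqrt

x = symbols('x', positive=True)
f = x**2 - log(x)

print(limit(x**2/log(x), x, 0, '+'))   # 0: x^2 = o(ln(x)) at 0 from the right
print(limit(f/(-log(x)), x, 0, '+'))   # 1: f ~ -ln(x) at 0 from the right
print(limit(f/sqrt(x), x, 0, '+'))     # oo: f/x^A blows up (tested for A = 1/2),
                                       # so f ~ x^A fails for this power
```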

Before we get to asymptotes, we will tie the concepts of this section to those of the previous section. There we had two notions. Order allowed us to guess limits and is the same as the order ∼ here. It is rather precise; for instance, 13x² + x is of order 13x² at infinity (the 13 must be there). We also used the notion of type, which was very useful when deciding which terms are not important. For instance, 13x² + x is of type x² at infinity. This notion is similar to the similarity ≍, but it is somewhat more loose, since it does not consider the sign. For instance, −x² is of type x², but we do not have −x² ≍ x². We used the type for guessing limits since it is more convenient than the relation ≍. But that's enough nitpicking; for a casual user of calculus this is way beyond the horizon.

Asymptotes

We had several notions comparing functions, but none of them helps us draw graphs. For that we need a different notion. Note that even the strongest notion we have above is still not enough to draw the graph properly. Indeed, for instance x² − x ∼ x² at infinity, but the graphs of the two functions differ by quite a bit at infinity. Although both are parabolas, so the basic shape is the same, they differ by x and therefore the graphs spread apart as we go to infinity.

We will therefore ask a different question: What is the difference between two functions near a certain point a?

Definition.
Let a be a real number, ∞, or −∞. Let f,g be functions defined on some reduced neighborhood of a.

We say that the graph of f is asymptotic to the graph of g at a if

lim(x→a) [f(x) − g(x)] = 0.

Note that this relation is symmetric, that is, if f is asymptotic to g at a (we say it like this for short), then also g is asymptotic to f at a. In fact, this relation is an equivalence.
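
For instance, sqrt(x² + 1) is asymptotic to x at infinity, since their difference tends to 0 there. A sketch verifying this with sympy (assumed installed; the example itself is ours, not from the original text):

```python
# Sketch: testing asymptoticity via the difference limit (sympy assumed installed).
from sympy import symbols, limit, oo, sqrt

x = symbols('x', positive=True)

# The difference tends to 0, so sqrt(x^2 + 1) is asymptotic to x at infinity.
print(limit(sqrt(x**2 + 1) - x, x, oo))   # 0
```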

What is the relationship between this notion and the orders we covered above? There is one special case: if f and g are not separated from 0 as we approach a (see below), then the two notions are independent. Otherwise asymptoticity is stronger.

Fact.
Assume that there is a reduced neighborhood U of a point a and a constant k > 0 such that |f| > k and |g| > k on U. If these two functions are mutually asymptotic at a, then also f ∼ g at a.
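
A sketch of the reason (our summary of the standard argument, not spelled out in the original): since |g| > k on U, we have for x from U

```latex
\[
\left| \frac{f(x)}{g(x)} - 1 \right|
  = \frac{|f(x) - g(x)|}{|g(x)|}
  \le \frac{|f(x) - g(x)|}{k}
  \to 0 \quad (x \to a),
\]
```

so f/g → 1 at a, which is exactly f ∼ g.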

However, sometimes these notions are the same.

Fact.
Assume that the functions f and g have non-zero real limits A and B, respectively, at a. Then the following are equivalent:
 (i)   A = B;
 (ii)   f and g are mutually asymptotic at a;
 (iii)   f ∼ g at a.
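
A quick instance of this equivalence (the particular functions are our choice; sympy assumed installed): both functions below have the limit 2 at infinity, and indeed both (ii) and (iii) hold.

```python
# Sketch: equal non-zero limits give both asymptoticity and ~ (sympy assumed installed).
from sympy import symbols, limit, oo

x = symbols('x', positive=True)
f = 2 + 1/x       # limit A = 2 at infinity
g = 2 + 1/x**2    # limit B = 2 at infinity, so A = B

print(limit(f - g, x, oo))   # 0: f and g are mutually asymptotic at infinity
print(limit(f/g, x, oo))     # 1: f ~ g at infinity
```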

Otherwise asymptoticity usually brings more information. The most typical case is when the functions go to (minus) infinity at (minus) infinity; then asymptoticity is strictly stronger than the notion of order. For instance, x ∼ (x + 7) at infinity, but these two functions are not asymptotic there.
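
A sketch confirming this instance with sympy (assumed installed):

```python
# Sketch: ~ at infinity does not imply asymptoticity (sympy assumed installed).
from sympy import symbols, limit, oo

x = symbols('x')

print(limit((x + 7)/x, x, oo))     # 1: x + 7 ~ x at infinity
print(limit((x + 7) - x, x, oo))   # 7: the difference does not tend to 0,
                                   # so the two functions are not asymptotic there
```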

One reason why we almost exclusively use asymptoticity with a equal to infinity or minus infinity is that it does not help us at all when drawing graphs around proper points. Consider some particular a, for instance a = 0. The functions x² and x⋅sin(1/x) both have limit 0 at this point, so their difference also has limit 0; therefore they are mutually asymptotic at 0. However, when you compare their graphs, you will see that their shapes are totally different around the origin.
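
A sketch of this computation with sympy (assumed installed):

```python
# Sketch: asymptotic at a proper point, yet very different shapes (sympy assumed installed).
from sympy import symbols, limit, sin

x = symbols('x')

# Both functions tend to 0 at 0, so their difference tends to 0 as well:
print(limit(x**2 - x*sin(1/x), x, 0))   # 0: mutually asymptotic at 0
# Note: the ratio (x*sin(1/x))/x**2 = sin(1/x)/x has no limit at 0,
# so these two functions are not even comparable via ~ there.
```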

The situation is therefore usually as follows. We are given a function and we know that at infinity it goes to infinity (or negative infinity etc.). We want to know something more about the manner in which it goes to that infinity. We would like to compare it to some other, nicer function that we already know. Thus we would look for a candidate for an asymptote. Since asymptoticity is very strong, we seldom find such a candidate. But if we do, we can make a very good guess concerning the graph of the given function around infinity (resp. negative infinity).

Most often we look for straight lines as asymptotes. Once we decide that we are interested only in straight lines as asymptotes, we may extend the notion a bit further and we also get a nice algorithm for determining asymptotes (if any exist). Thus the focus shifts away from the notion of order and we prefer to leave the topic of straight asymptotes to a different section, asymptotes in Derivative - Theory - Graphing functions.


Back to Theory - Limits