In the previous section we talked about comparing functions around infinity. In fact, that was just a special case: we can compare functions at any point of their domains. However, the comparison is most useful at infinity, since at other points it is less intuitive. We will start with a brief survey of order from a theoretical point of view, and then we will introduce asymptotes.
Comparing functions was inspired by practical problems, not unlike the intuitive evaluation of limits. In applications (above all in physics) it often helps if we can replace a complicated expression by a simple one without making a large error.
Definition.
Let a be a real number, ∞, or −∞. Let f, g be functions defined on some reduced neighborhood of a. We say that
f = O(g) at a if there is a reduced neighborhood U of a and a constant A > 0 so that for all x from U we have
|f(x)| ≤ A⋅g(x). We say that
f ≍ g at a, also denoted f = Θ(g), if there is a reduced neighborhood U of a and constants A1, A2 > 0 so that for all x from U we have
A1⋅g(x) ≤ f(x) ≤ A2⋅g(x). We say that
f = o(g) at a, also denoted f << g, if
lim(x→a)( f(x)/g(x) ) = 0. We say that
f ∼ g at a if
lim(x→a)( f(x)/g(x) ) = 1.
Sometimes we use the notation f = g + o(h) at a, meaning that f − g = o(h) at a.
The second condition (the one defining ≍) can be expressed using just one constant: the pair of inequalities A1⋅g(x) ≤ f(x) ≤ A2⋅g(x) is equivalent to finding one positive A satisfying (1/A)⋅g(x) ≤ f(x) ≤ A⋅g(x); it suffices to take A = max(A2, 1/A1).
The limit condition in the definitions of f = o(g) and f ∼ g implicitly requires that g is not zero on some reduced neighborhood of a, so that the ratio f(x)/g(x) makes sense there.
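The limit definitions above are easy to explore numerically. The following sketch uses our own illustrative functions (they are not from the text): f(x) = x² + 3x, g(x) = x², and h(x) = x, compared at a = ∞.

```python
# Our own illustrative functions, compared at a = infinity.
f = lambda x: x**2 + 3*x
g = lambda x: x**2
h = lambda x: x

# f/g -> 1 for large x, suggesting f ~ g at infinity
for x in (1e2, 1e4, 1e6):
    print(f(x) / g(x))

# h/g -> 0 for large x, suggesting h = o(g) at infinity
for x in (1e2, 1e4, 1e6):
    print(h(x) / g(x))
```

Of course, plugging in large numbers only suggests the limit; it does not prove it.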
The first two definitions give a rough comparison: either f is smaller than g up to a multiplicative constant (around a), or it is about the same as g up to a multiplicative constant. The latter two notions, expressed using limits, are more precise and in fact stronger. If f = o(g) at a, then also f = O(g) at a; similarly, if f ∼ g at a, then also f ≍ g at a.
Terminology is a bit vague at this point. The two o-symbols are called "big-O" and "little-o"; for instance, we would say that "f is a little-o of g at a". These two symbols are also sometimes referred to as "Landau symbols". These two comparisons are orderings, that is, they have similar properties as the usual inequality. For instance, they are transitive: if at a we have f = O(g) and g = O(h), then also f = O(h) at a, and similarly for little-o.
The two relations ≍ and ∼ are equivalences, so they have similar properties as the usual equality.
The relation ≍ and the big-O are tied together similarly as equality and inequality. Namely, if at a we have f = O(g) and g = O(f), then f ≍ g at a; conversely, f ≍ g at a implies both f = O(g) and g = O(f) at a.
Both pairs are connected in another way: If f ≍ g at a and g = O(h) at a, then also f = O(h) at a. Similarly, if f ∼ g at a and g = o(h) at a, then also f = o(h) at a.
In the case of "big O" and both equivalences we often say that f and g are of the same order (of magnitude) at a. The equivalence ≍ is hard to prove directly; instead we usually use the following.
Fact.
If there is a real number A > 0 such that lim(x→a)( f(x)/g(x) ) = A, then
f ≍ g at a.
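This Fact is easy to see in action numerically. The pair below is our own example (not from the text): f(x) = 2x² + 5x and g(x) = x² at a = ∞; the ratio tends to A = 2 > 0, so by the Fact f ≍ g at infinity.

```python
# Our own example pair; the ratio f/g tends to A = 2 > 0 at infinity.
f = lambda x: 2*x**2 + 5*x
g = lambda x: x**2

for x in (1e2, 1e4, 1e6):
    print(f(x) / g(x))   # approaches 2
```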
We already mentioned the usefulness of the relation ∼ when guessing limits. It also comes in very handy in physics. In particular, one often compares functions to powers: if we have a more complicated function and we want to guess its behavior near some point, we can try to compare different parts of this function to powers, and then we know which parts can be ignored.
Example: Assume that we have a function
In general we have this:
Fact.
If g = O(f) at a, then (f + g) ≍ f at a.
If g = o(f) at a, then (f + g) ∼ f at a.
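Both parts of this Fact can be checked numerically. The functions below are our own choices at a = ∞: f(x) = x³, a small perturbation small(x) = x (so small = o(f)), and a comparable perturbation same(x) = 3x³ (so same = O(f)).

```python
# Our own choices at infinity, illustrating both parts of the Fact.
f = lambda x: x**3
small = lambda x: x        # small = o(f) at infinity
same = lambda x: 3*x**3    # same = O(f) at infinity

for x in (1e1, 1e2, 1e3):
    print((f(x) + small(x)) / f(x))  # -> 1, so (f + small) ~ f
    print((f(x) + same(x)) / f(x))   # constant 4, so (f + same) ≍ f but not ~ f
```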
In fact, we have been using these considerations in the previous section on intuitive evaluation at infinity. In physics and numerical mathematics one also often makes comparisons at zero. There it is a bit tricky, since the scale of powers works differently: at 0, larger powers are actually beaten by smaller powers.
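The reversal of the scale of powers at 0 shows up clearly in numbers. The sketch below compares x³ against x² near 0 and at infinity (the specific powers are our own choice for illustration).

```python
# Near 0, higher powers lose: x^3/x^2 -> 0, i.e. x^3 = o(x^2) at 0.
for x in (0.1, 0.01, 0.001):
    print(x**3 / x**2)   # shrinks: x^3 is negligible next to x^2 near 0

# At infinity it is the other way around: x^2/x^3 -> 0.
for x in (10.0, 100.0, 1000.0):
    print(x**2 / x**3)   # shrinks: x^2 is negligible next to x^3 at infinity
```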
Example: Assume that we have a function
Example: The function
The little-o comparisons are also reversed. For instance, at infinity we have x = o(x²), whereas at 0 we have x² = o(x).
Physicists, engineers and numerical mathematicians use such comparisons quite often: they determine the order of the whole expression and then ignore all its terms that are o of this order.
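A small numerical sketch of this habit, using our own example f(x) = x³ + 5x² − 7: its order at infinity is x³, and the relative error of replacing f by x³ shrinks as x grows.

```python
# Our own example: dropping the terms that are o(x^3) at infinity.
def f(x):
    return x**3 + 5*x**2 - 7

for x in (10.0, 100.0, 1000.0):
    print(abs(f(x) - x**3) / abs(f(x)))   # relative error shrinks toward 0
```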
One can also make comparisons at a from one side only. Definitions and properties are analogous.
Example: The function
On the other hand,
Before we get to asymptotes, we will tie the concepts from this section to the concepts of the previous section. There we had two notions. Order allowed us to guess limits and is the same as the relation ∼ here. It is rather precise; for instance, at infinity we have x³ + 13x ∼ x³.
We had several notions comparing functions, but none of them helps us draw graphs. For that we need a different notion. Note that even the strongest notion we have above is still not enough to draw the graph properly. Indeed, for instance x² + x ∼ x² at infinity, yet the vertical distance between the two graphs, namely x, goes to infinity.
We will therefore ask a different question: What is the difference between two functions near a certain point a?
Definition.
Let a be a real number, ∞, or −∞. Let f, g be functions defined on some reduced neighborhood of a. We say that the graph of f is asymptotic for the graph of g at a if
lim(x→a)( f(x) − g(x) ) = 0.
Note that this relation is symmetric, that is, if f is asymptotic for g at a (we say it like this for short), then also g is asymptotic for f at a. In fact, this relation is an equivalence.
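A numerical look at this definition, with a pair of our own choosing: f(x) = x + 1/x and g(x) = x at a = ∞. Their difference 1/x tends to 0, so the graphs are mutually asymptotic at infinity.

```python
# Our own example of mutually asymptotic graphs at infinity.
f = lambda x: x + 1.0/x
g = lambda x: x

for x in (10.0, 100.0, 1000.0):
    print(f(x) - g(x))   # = 1/x, shrinks to 0
```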
What is the relationship between this notion and the orders we covered above? There is one special case: if f and g are not separated from 0 as we approach a (see below), then these two notions are independent. Otherwise asymptoticity is stronger.
Fact.
Assume that there is a reduced neighborhood U of a point a and a constant k > 0 such that |f| > k and |g| > k on U. If these two functions are mutually asymptotic at a, then also f ∼ g at a.
However, sometimes these notions are the same.
Fact.
Assume that the functions f, resp. g, have non-zero limits A, resp. B, at a. Then the following are equivalent:
(i) A = B;
(ii) f and g are mutually asymptotic at a;
(iii) f ∼ g at a.
Otherwise asymptoticity usually brings more information. The most typical case is when the functions go to (minus) infinity at (minus) infinity; then asymptoticity is strictly stronger than the notion of order. For instance, x² + x ∼ x² at infinity, but these two functions are not mutually asymptotic there, since their difference x goes to infinity.
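The gap between ∼ and asymptoticity is easy to see numerically. The pair below is our own illustration: f(x) = x² + x and g(x) = x² at infinity; the ratio tends to 1, yet the difference blows up.

```python
# Our own pair: f ~ g at infinity, but the graphs are not asymptotic.
f = lambda x: x**2 + x
g = lambda x: x**2

for x in (1e2, 1e4, 1e6):
    print(f(x) / g(x))   # -> 1, so f ~ g
    print(f(x) - g(x))   # = x -> infinity, so not mutually asymptotic
```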
One reason why we almost exclusively use asymptoticity with a equal to infinity or minus infinity is that it does not help us at all when drawing graphs around proper points. Consider some particular a, for instance a = 0: any two functions that have the same finite limit at 0 are automatically mutually asymptotic there, so asymptoticity tells us nothing new about the shape of the graph near such a point.
The situation is therefore usually as follows. We are given a function and we know that at infinity it goes to infinity (or negative infinity etc.). We want to know something more about the manner in which it goes to that infinity. We would like to compare it to some other, nicer function that we already know, so we look for a candidate for an asymptote. Since asymptoticity is very strong, we seldom find such a candidate. But if we do, we can make a very good guess concerning the graph of the given function around infinity (resp. negative infinity).
Most often we look for straight lines as asymptotes. Once we decide that we are interested only in straight lines as asymptotes, we may extend the notion a bit further and we also get a nice algorithm for determining asymptotes (if any exist). Thus the focus shifts away from the notion of order and we prefer to leave the topic of straight asymptotes to a different section, asymptotes in Derivative - Theory - Graphing functions.
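As a small preview of that algorithm, here is a numerical sketch of the standard recipe for a straight asymptote y = kx + q at +∞: k = lim f(x)/x and q = lim (f(x) − kx). The function is our own example, f(x) = (x² + 2x + 3)/x = x + 2 + 3/x, whose asymptote should be y = x + 2.

```python
# Our own example function; its asymptote at +infinity is y = x + 2.
def f(x):
    return (x**2 + 2*x + 3) / x

# k = lim f(x)/x: the ratio stabilizes near 1
for x in (1e2, 1e4, 1e6):
    print(f(x) / x)

k = 1  # read off from the stabilized ratio above

# q = lim (f(x) - k*x): stabilizes near 2
for x in (1e2, 1e4, 1e6):
    print(f(x) - k*x)
```

Plugging in large numbers only suggests the two limits; the rigorous computation belongs to the section on asymptotes mentioned above.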