
In this chapter you've learned a set of rules for evaluating derivatives: derivatives of products, quotients, functions inside other functions, etc. Because these rules exist, it's always
possible to find a formula for a function's derivative, given the formula for the original function. Not only that, but there is no real creativity required, so a computer can be programmed to
do all the drudgery. For example, you can download a free, open-source program called Yacas and install it on a Windows or Linux machine. There is even a version you can run in a web browser without installing any special software.
A typical session with Yacas looks like this:
Example
D(x) x^2
2*x
D(x) Exp(x^2)
2*x*Exp(x^2)
D(x) Sin(Cos(Sin(x)))
-Cos(x)*Sin(Sin(x))*Cos(Cos(Sin(x)))
Upright type represents your input, and italicized type is the program's output.
First I asked it to differentiate x^2 with respect to x, and it told me the result was 2x. Then I did the derivative of e^(x^2), which I also could have done fairly easily by hand. (If you're trying this out on a computer as you read along, make sure to capitalize functions like Exp, Sin, and Cos.) Finally I tried an example where I didn't know the answer off the top of my head, and that would have been a little tedious to calculate by hand.
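The product and quotient rules mentioned at the beginning of the chapter are handled just as automatically. The lines below are only a sketch of the kind of input you could try: they mimic the syntax above, the extra parentheses are just to make sure the whole expression gets differentiated, and the comments give the answers the familiar rules predict rather than Yacas's exact output formatting.

D(x) (x^2*Sin(x))    /* product rule: 2*x*Sin(x) + x^2*Cos(x) */
D(x) (Sin(x)/x)      /* quotient rule: Cos(x)/x - Sin(x)/x^2 */

Either answer is easy to confirm by hand, which is a good habit when you're first learning to trust a program like this.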
Unfortunately things are a little less rosy in the world of integrals. There are a few rules that can help you do integrals, e.g., that the integral of a sum equals the sum of the integrals, but the rules don't cover all the possible cases. Using Yacas to evaluate the integrals of the same functions, here's what happens.
Example
Integrate(x) x^2
x^3/3
Integrate(x) Exp(x^2)
Integrate(x)Exp(x^2)
Integrate(x) Sin(Cos(Sin(x)))
Integrate(x) Sin(Cos(Sin(x)))
The first one works fine, and I can easily verify that the answer is correct, by taking the derivative of x^3/3, which is x^2. (The answer could have been x^3/3 + 7, or x^3/3 + c, where c was any constant, but Yacas doesn't bother to tell us that.) The second and third ones don't work, however; Yacas just spits back the input at us without making any progress on it. And it may not be because Yacas isn't smart enough to figure out these integrals. The function e^(x^2) can't be integrated at all in terms of a formula containing ordinary operations and functions such as addition, multiplication, exponentiation, trig functions, exponentials, and so on.
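The sum rule mentioned above is one of the rules Yacas does know. As a rough sketch (the exact form Yacas picks for its output may differ), asking for the integral of a sum might look like this:

Integrate(x) (x^2 + Sin(x))    /* expect x^3/3 - Cos(x), the two separate integrals added together */

Rules like this only go so far, though; splitting an integrand into pieces is no help when one of the pieces, like Exp(x^2), has no antiderivative expressible in terms of ordinary functions at all.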
That's not to say that a program like this is useless. For example, here's an integral that I wouldn't have known how to do, but that Yacas handles easily:
Example
Integrate(x) Sin(Ln(x))
(x*Sin(Ln(x)))/2-(x*Cos(Ln(x)))/2
This one is easy to check by differentiating, but I could have been marooned on a desert island for a decade before I could have figured it out in the first place. There are various rules,
then, for integration, but they don't cover all possible cases as the rules for differentiation do, and sometimes it isn't obvious which rule to apply. Yacas's ability to integrate sin(ln x) shows that it had a rule in its bag of tricks that I don't know, or didn't
remember, or didn't realize applied to this integral.
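That check by differentiation can itself be handed to Yacas. Here is a sketch; I'm assuming the derivative comes back with its terms uncollected, and that a function such as Simplify (not used anywhere else in this section) can be applied to tidy it up:

D(x) ( (x*Sin(Ln(x)))/2 - (x*Cos(Ln(x)))/2 )
    /* the pieces Sin(Ln(x))/2 + Cos(Ln(x))/2 and Cos(Ln(x))/2 - Sin(Ln(x))/2
       should collapse back to Sin(Ln(x)) */

Differentiating term by term by hand confirms that the pieces really do cancel down to sin(ln x).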
Back in the 17th century, when Newton and Leibniz invented calculus, there were no computers, so it was a big deal to be able to find a simple formula for your result. Nowadays, however, it may
not be such a big deal. Suppose I want to find the derivative of sin(cos(sin x)), evaluated at x = 1. I can do something like this on a calculator:
Example
sin cos sin 1 = 0.61813407
sin cos sin 1.0001 = 0.61810240
(0.61810240-0.61813407)/.0001 = -0.3167
I have the right answer, with plenty of precision for most realistic applications, although I might never have guessed that the mysterious number 0.3167 was actually (cos 1)(sin(sin 1))(cos(cos(sin 1))). This could get a little tedious if I wanted to graph the function, for instance, but then I could just use a computer spreadsheet, or write a little computer program.

In this chapter, I'm going to show you how to do derivatives and integrals with simple computer programs, using Yacas. The following little Yacas program does the same thing as the set of calculator operations shown above:
Example
1 f(x):=Sin(Cos(Sin(x)))
2 x:=1
3 dx:=.0001
4 N( (f(x+dx)-f(x))/dx )
-0.3166671628
(I've omitted all of Yacas's output except for the final result.) Line 1 defines the function we want to differentiate. Lines 2 and 3 give values to the variables x and dx. Line 4 computes the derivative; the N( ) surrounding the whole thing is our way of telling Yacas that we want an approximate numerical result, rather than an exact symbolic one.
An interesting thing to try now is to make dx smaller and smaller, and see if we get better and better accuracy in our approximation to the derivative.
Example
5 g(x,dx):= N( (f(x+dx)-f(x))/dx )
6 g(x,.1)
-0.3022356406
7 g(x,.0001)
-0.3166671628
8 g(x,.0000001)
-0.3160458019
9 g(x,.00000000000000001)
0
Line 5 defines the derivative function. It needs to know both x and dx. Line 6 computes the derivative using dx = 0.1, which we expect to be a lousy approximation, since dx is really supposed to be infinitesimal, and 0.1 isn't even that small. Line 7 does it with the same value of dx we used earlier. The two results agree exactly in the first decimal place, and approximately in the second, so we can be pretty sure that the derivative is -0.32 to two figures of precision. Line 8 ups the ante, and produces a result that looks accurate to at least 3 decimal places. Line 9 attempts to produce fantastic precision by using an extremely small value of dx. Oops - the result isn't better, it's worse! What's happened here is that Yacas computed f(x+dx) and f(x), but they were the same to within the precision it was using, so f(x+dx)-f(x) rounded off to zero.
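Presumably the cure is not to keep shrinking dx but to make Yacas carry more digits in its arithmetic. If I'm remembering its conventions correctly, N( ) accepts an optional second argument giving the number of digits to use, so a sketch of the fix (the name g2 is just mine) would be:

g2(x,dx) := N( (f(x+dx)-f(x))/dx, 30 )    /* same difference quotient, roughly 30 working digits */
g2(x, .00000000000000001)                 /* f(x+dx)-f(x) should no longer round off to zero */

I haven't shown any output here, since the point is only that the precision, not the formula, was the problem.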
The example above demonstrates the concept of how a derivative can be defined in terms of a limit:

\frac{dy}{dx} = \lim_{dx \to 0} \frac{f(x+dx)-f(x)}{dx}

The idea of the limit is that we can theoretically make (f(x+dx)-f(x))/dx approach as close as we like to dy/dx, provided we make dx sufficiently small. In reality, of course, we eventually run into the limits of our ability to do the computation, as in the bogus result generated on line 9 of the example.
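Yacas can also take limits symbolically, which ties the program back to this definition. The sketch below assumes a Limit facility with the syntax Limit(variable, value) expression, which isn't demonstrated anywhere else in this section; I've used the fresh names u and h so as not to collide with the values already assigned to x and dx above:

Limit(h,0) ( ((u+h)^2 - u^2)/h )    /* the difference quotient for u^2; expect 2*u exactly */

If something like this works, it is the exact counterpart of the numerical experiment: instead of plugging in smaller and smaller values of dx, we ask for the value the quotient is approaching.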