Differentiation on a computer

In this chapter you've learned a set of rules for evaluating derivatives: derivatives of products, of quotients, of functions inside other functions, etc. Because these rules exist, it's always possible to find a formula for a function's derivative, given the formula for the original function. Not only that, but no real creativity is required, so a computer can be programmed to do all the drudgery. For example, you can download a free, open-source program called Yacas and install it on a Windows or Linux machine. There is even a version you can run in a web browser without installing any special software.

A typical session with Yacas looks like this:


D(x) x^2
2*x
D(x) Exp(x^2)
2*x*Exp(x^2)
D(x) Sin(Cos(Sin(x)))
-Cos(x)*Sin(Sin(x))*Cos(Cos(Sin(x)))

Upright type represents your input, and italicized type is the program's output.
First I asked it to differentiate x^2 with respect to x, and it told me the result was 2x. Then I took the derivative of e^{x^2}, which I also could have done fairly easily by hand. (If you're trying this out on a computer as you read along, make sure to capitalize functions like Exp, Sin, and Cos.) Finally I tried an example where I didn't know the answer off the top of my head, and that would have been a little tedious to calculate by hand.
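As a cross-check (my own addition, not part of the original Yacas session), these symbolic results can be verified numerically. Here is a sketch in Python using a central difference; the function name `deriv` and the sample points are my own choices:

```python
import math

def deriv(f, x, dx=1e-6):
    """Approximate f'(x) with a symmetric (central) difference."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

# d/dx x^2 = 2x, so at x = 3 we expect something very close to 6
print(deriv(lambda x: x**2, 3.0))

# d/dx e^(x^2) = 2x e^(x^2), so at x = 1 we expect 2e, about 5.4366
print(deriv(lambda x: math.exp(x**2), 1.0))

# d/dx sin(cos(sin x)) = -cos(x) sin(sin x) cos(cos(sin x))
x = 1.0
expected = -math.cos(x) * math.sin(math.sin(x)) * math.cos(math.cos(math.sin(x)))
print(deriv(lambda t: math.sin(math.cos(math.sin(t))), x), expected)
```

The central difference is accurate to about dx^2, so the printed approximations agree with the exact derivatives to many decimal places.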

Unfortunately things are a little less rosy in the world of integrals. There are a few rules that can help you do integrals, e.g., that the integral of a sum equals the sum of the integrals, but the rules don't cover all the possible cases. Using Yacas to evaluate the integrals of the same functions, here's what happens.


Integrate(x) x^2
x^3/3
Integrate(x) Exp(x^2)
Integrate(x) Exp(x^2)
Integrate(x) Sin(Cos(Sin(x)))
Integrate(x) Sin(Cos(Sin(x)))

The first one works fine, and I can easily verify that the answer is correct, by taking the derivative of x^3/3, which is x^2. (The answer could have been x^3/3+7, or x^3/3+c, where c is any constant, but Yacas doesn't bother to tell us that.) The second and third ones don't work, however; Yacas just spits the input back at us without making any progress on it. And it's not that Yacas isn't smart enough to figure out these integrals. The function e^{x^2} can't be integrated at all in terms of a formula containing ordinary operations and functions such as addition, multiplication, exponentiation, trig functions, exponentials, and so on.
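Even though e^{x^2} has no elementary antiderivative, a definite integral of it is still a perfectly good number that can be computed numerically. Here is a minimal sketch in Python using Simpson's rule (the function name `simpson` and the interval 0 to 1 are my own choices, not from the text):

```python
import math

def simpson(f, a, b, n=100):
    """Approximate the integral of f from a to b with Simpson's rule.

    n is the number of subintervals and must be even.
    """
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += f(a + i * h) * (4 if i % 2 else 2)
    return total * h / 3

# No formula exists for the antiderivative of e^(x^2), but the definite
# integral from 0 to 1 comes out to about 1.4627.
print(simpson(lambda x: math.exp(x**2), 0.0, 1.0))
```

This is the usual escape hatch when symbolic integration fails: give up on a formula and settle for a number.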

That's not to say that a program like this is useless. For example, here's an integral that I wouldn't have known how to do, but that Yacas handles easily:


Integrate(x) Sin(Ln(x))
(x*Sin(Ln(x))-x*Cos(Ln(x)))/2

This one is easy to check by differentiating, but I could have been marooned on a desert island for a decade before I figured it out in the first place. There are various rules for integration, then, but they don't cover all possible cases the way the rules for differentiation do, and sometimes it isn't obvious which rule to apply. Yacas's ability to integrate sin(ln x) shows that it had a rule in its bag of tricks that I don't know, or didn't remember, or didn't realize applied to this integral.
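The check-by-differentiating step is itself easy to automate. Here is a quick numerical version in Python (my own sketch, not from the text), using the antiderivative x(sin ln x - cos ln x)/2 and comparing its central-difference derivative against sin(ln x):

```python
import math

def F(x):
    """Candidate antiderivative of sin(ln x): x*(sin(ln x) - cos(ln x))/2."""
    return x * (math.sin(math.log(x)) - math.cos(math.log(x))) / 2

def f(x):
    """The integrand, sin(ln x)."""
    return math.sin(math.log(x))

# F'(x) should equal f(x); compare a central difference to f at a few points.
dx = 1e-6
for x in (0.5, 1.0, 2.0, 5.0):
    approx = (F(x + dx) - F(x - dx)) / (2 * dx)
    print(x, approx, f(x))
```

Differentiating F by hand confirms the same thing: the product rule gives (sin ln x - cos ln x)/2 + (cos ln x + sin ln x)/2, which collapses to sin(ln x).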

Back in the 17th century, when Newton and Leibniz invented calculus, there were no computers, so it was a big deal to be able to find a simple formula for your result. Nowadays, however, it may not be such a big deal. Suppose I want to find the derivative of sin(cos(sin x)), evaluated at x=1. I can do something like this on a calculator:



sin cos sin 1 =
sin cos sin 1.0001 =
/.0001 =

I have the right answer, with plenty of precision for most realistic applications, although I might never have guessed that the mysterious number -0.3167 was actually -(cos 1)(sin sin 1)(cos cos sin 1). This could get a little tedious if I wanted to graph the function, for instance, but then I could just use a computer spreadsheet, or write a little computer program. In this chapter, I'm going to show you how to do derivatives and integrals using simple computer programs written in Yacas. The following little Yacas program does the same thing as the set of calculator operations shown above:
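The same keystroke sequence is a few lines in any programming language. Here is a sketch in Python (my own translation, not from the text); the forward difference and the step size 0.0001 mirror the calculator example exactly:

```python
import math

def f(x):
    return math.sin(math.cos(math.sin(x)))

dx = 0.0001
# (f(1.0001) - f(1)) / 0.0001, exactly as on the calculator;
# the result is close to -0.3167
print((f(1 + dx) - f(1)) / dx)
```

A spreadsheet works the same way: one column of x values, one column of f(x), and a third column of differences divided by dx.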


1 f(x):=Sin(Cos(Sin(x)))
2 x:=1
3 dx:=.0001
4 N( (f(x+dx)-f(x))/dx )
-0.3167

(I've omitted all of Yacas's output except for the final result.) Line 1 defines the function we want to differentiate. Lines 2 and 3 give values to the variables x and dx. Line 4 computes the derivative; the N( ) surrounding the whole thing is our way of telling Yacas that we want an approximate numerical result, rather than an exact symbolic one.

An interesting thing to try now is to make dx smaller and smaller, and see if we get better and better accuracy in our approximation to the derivative.


5 g(x,dx):=
N( (f(x+dx)-f(x))/dx )
6 g(x,.1)
7 g(x,.0001)
8 g(x,.0000001)
9 g(x,.00000000000000001)

Line 5 defines the derivative function. It needs to know both x and dx. Line 6 computes the derivative using dx = 0.1, which we expect to be a lousy approximation, since dx is really supposed to be infinitesimal, and 0.1 isn't even that small. Line 7 does it with the same value of dx we used earlier. The two results agree exactly in the first decimal place, and approximately in the second, so we can be pretty sure that the derivative is -0.32 to two figures of precision. Line 8 ups the ante, and produces a result that looks accurate to at least 3 decimal places. Line 9 attempts to produce fantastic precision by using an extremely small value of dx. Oops - the result isn't better, it's worse! What's happened here is that Yacas computed f(x) and f(x+dx), but they were the same to within the precision it was using, so f(x+dx)-f(x) rounded off to zero.
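The same experiment is easy to rerun in Python (my own sketch; with ordinary 64-bit floats the breakdown happens for the same reason, since 1 + 10^-17 rounds to exactly 1):

```python
import math

def f(x):
    return math.sin(math.cos(math.sin(x)))

def g(x, dx):
    """Forward-difference approximation to f'(x), like the Yacas g(x,dx)."""
    return (f(x + dx) - f(x)) / dx

for dx in (0.1, 1e-4, 1e-7, 1e-17):
    print(dx, g(1.0, dx))
# At dx = 1e-17, x + dx rounds to exactly x, so f(x+dx) - f(x) is zero
# and the "derivative" comes out 0 -- the same failure as Yacas line 9.
```

Shrinking dx first improves the answer, then destroys it: below about 10^-16 the two function values become indistinguishable at machine precision.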

The example demonstrates the concept of how a derivative can be defined in terms of a limit:

\frac{dy}{dx}=\lim_{\Delta x \rightarrow 0}\frac{\Delta y}{\Delta x}

The idea of the limit is that we can theoretically make \Delta y/\Delta x approach as close as we like to dy/dx, provided we make \Delta x sufficiently small. In reality, of course, we eventually run into the limits of our ability to do the computation, as in the bogus result generated on line 9 of the example.