It is interesting to consider the sequence of signals that we obtain as we incorporate more terms into the Fourier series approximation of the half-wave rectified sine wave (Example 4.2).
Define *s*_{K} (*t*) to be the signal containing *K*+1 Fourier terms.

Figure 4.5 shows how this sequence of signals portrays the signal more accurately as more terms are added.
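
To make this concrete, here is a minimal numerical sketch (not from the text) that builds the approximation *s*_{K} (*t*) for the half-wave rectified sine wave by computing the trigonometric Fourier coefficients from their defining integrals. The period T = 1, unit amplitude, and the sample count N are illustrative assumptions.

```python
import numpy as np

T = 1.0                                   # assumed period for this sketch
N = 4096
t = np.arange(N) * T / N                  # one period, uniformly sampled
s = np.where(t < T / 2, np.sin(2 * np.pi * t / T), 0.0)  # half-wave rectified sine

def partial_sum(K):
    """s_K(t): the approximation containing the DC term plus K harmonics."""
    dt = T / N
    approx = np.full_like(t, s.sum() * dt / T)           # a_0: the average value
    for k in range(1, K + 1):
        c = np.cos(2 * np.pi * k * t / T)
        sn = np.sin(2 * np.pi * k * t / T)
        a_k = (2 / T) * np.sum(s * c) * dt               # trigonometric coefficients
        b_k = (2 / T) * np.sum(s * sn) * dt              # from the defining integrals
        approx += a_k * c + b_k * sn
    return approx

def rms(x):
    """Root-mean-square value over the sampled period."""
    return np.sqrt(np.mean(x ** 2))
```

Plotting `partial_sum(K)` against `s` for increasing K reproduces the behavior shown in Figure 4.5: the approximation hugs the signal more closely as terms are added.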

We need to assess quantitatively the accuracy of the Fourier series approximation so that we can judge how rapidly the series approaches the signal. When we use a *K*+1-term series, the error (the difference between the signal and the *K*+1-term series) corresponds to the unused terms of the series:

ε_{K} (*t*) = ∑_{k=K+1}^{∞} (*a*_{k} cos(2π*kt*/*T*) + *b*_{k} sin(2π*kt*/*T*))

To find the rms error, we must square this expression and integrate it over a period. Again, the integral of most cross-terms is zero, leaving

rms(ε_{K}) = √((1/2) ∑_{k=K+1}^{∞} (*a*_{k}² + *b*_{k}²))

Figure 4.6 shows how the error in the Fourier series for the half-wave rectified sinusoid decreases as more terms are incorporated. In particular, the use of four terms, as shown in the bottom plot of Figure 4.5, has a rms error (relative to the rms value of the signal) of about 3%. The Fourier series in this case converges quickly to the signal.
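
The rms-error formula can be checked directly. The sketch below (an illustration, not from the text) uses the closed-form coefficients of a unit-amplitude half-wave rectified sine, a_0 = 1/π, b_1 = 1/2, and a_k = −2/(π(k² − 1)) for even k, with all other coefficients zero, and sums the unused terms; the cutoff `kmax` is an assumption.

```python
import numpy as np

def relative_rms_error(K, kmax=100000):
    """Relative rms error of the K-harmonic approximation via the unused terms.

    Uses the closed-form coefficients of the unit-amplitude half-wave
    rectified sine: only even-k cosine terms remain beyond k = 1.
    """
    err2 = 0.0
    for k in range(K + 1, kmax + 1):
        if k % 2 == 0:
            a_k = -2.0 / (np.pi * (k ** 2 - 1))
            err2 += 0.5 * a_k ** 2        # each unused term contributes a_k^2 / 2
    return np.sqrt(err2) / 0.5            # the signal's rms value is 1/2

print(relative_rms_error(4))              # roughly 0.03, the ~3% quoted above
```

The error falls quickly with K because the coefficients decay as 1/k², so their squares decay as 1/k⁴.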

Figure 4.7 shows the power spectrum and the rms approximation error for the square wave when *K* in the Fourier series equals 99.

Because the Fourier coefficients decay more slowly here than for the half-wave rectified sinusoid, the rms error does not decrease as quickly. Said another way, the square wave's spectrum contains more power at higher frequencies than does the half-wave rectified sinusoid's. This difference between the two Fourier series arises because the half-wave rectified sinusoid's Fourier coefficients are proportional to 1/*k*² while those of the square wave are proportional to 1/*k*. In fact, after 99 terms of the square wave's approximation, the error is larger than after 10 terms of the approximation for the half-wave rectified sinusoid. Mathematicians have shown that no signal has an rms approximation error that decays more slowly than it does for the square wave.
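
The 1/k versus 1/k² decay rates can be compared numerically. This sketch (illustrative; the truncation point `KMAX` is an assumption) sums the unused coefficients for each signal, using b_k = 4/(πk) for odd k for a unit-amplitude square wave and a_k = −2/(π(k² − 1)) for even k for the half-wave rectified sinusoid.

```python
import numpy as np

KMAX = 200000                                  # truncation point for the tails

def halfwave_rel_error(K):
    """Relative rms error for the half-wave rectified sine (coefficients ~ 1/k^2)."""
    k = np.arange(K + 1, KMAX)
    a = np.where(k % 2 == 0, 2.0 / (np.pi * (k ** 2 - 1.0)), 0.0)
    return np.sqrt(0.5 * np.sum(a ** 2)) / 0.5   # the signal's rms value is 1/2

def square_rel_error(K):
    """Relative rms error for the unit-amplitude square wave (coefficients ~ 1/k)."""
    k = np.arange(K + 1, KMAX)
    b = np.where(k % 2 == 1, 4.0 / (np.pi * k), 0.0)
    return np.sqrt(0.5 * np.sum(b ** 2)) / 1.0   # the signal's rms value is 1

print(square_rel_error(99), halfwave_rel_error(10))
```

Even with 99 terms, the square wave's relative error exceeds what the half-wave rectified sinusoid achieves with only 10, matching the comparison in the text.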

**Exercise 4.5.1**

Calculate the harmonic distortion for the square wave.

More than just decaying slowly, the Fourier series approximation shown in Figure 4.8 (Fourier series approximation of a square wave) exhibits interesting behavior.

Although the square wave's Fourier series requires more terms for a given representation accuracy, when comparing plots it is not clear that the two are equal. Does the Fourier series really equal the square wave at **all** values of *t*? In particular, at each step change in the square wave, the Fourier series exhibits a peak followed by rapid oscillations. As more terms are added to the series, the oscillations become more rapid and smaller, but the peaks do not decrease. For the Fourier series approximation of the half-wave rectified sinusoid (Figure 4.5), no such behavior occurs. What is happening?

Consider this mathematical question intuitively: Can a discontinuous function, like the square wave, be expressed as a sum, even an infinite one, of continuous signals? One should at least be suspicious, and in fact, it can't be so expressed. This issue brought Fourier much criticism from the French Academy of Science (Laplace, Lagrange, Monge, and Lacroix comprised the review committee) for several years after its presentation in 1807. It was not resolved for almost a century, and its resolution is interesting and important to understand from a practical viewpoint.

The extraneous peaks in the square wave's Fourier series **never** disappear; they are termed the **Gibbs phenomenon** after the American physicist Josiah Willard Gibbs. They occur whenever the signal is discontinuous and will always be present whenever the signal has jumps.
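
The persistence of these peaks can be demonstrated numerically. This sketch (illustrative; the harmonic counts and grid density are assumptions) measures the peak overshoot of the square wave's partial sums near the jump at t = 0 and expresses it as a fraction of the jump size, which stays near 9% no matter how many terms are used.

```python
import numpy as np

def square_partial(t, K):
    """Partial Fourier sum (through harmonic K) of a unit-amplitude square wave."""
    out = np.zeros_like(t)
    for k in range(1, K + 1, 2):                 # only odd harmonics are nonzero
        out += (4.0 / (np.pi * k)) * np.sin(2 * np.pi * k * t)
    return out

jump = 2.0                                       # the wave steps from -1 to +1
t = np.linspace(0.0, 0.25, 200001)               # fine grid near the jump at t = 0
for K in (19, 99, 499):
    overshoot = square_partial(t, K).max() - 1.0
    print(K, overshoot / jump)                   # stays near 0.09; it never shrinks
```

As K grows, the overshooting peak narrows and moves toward the discontinuity, but its height does not decay toward zero.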

Let's return to the question of equality; how can the equal sign in the definition of the Fourier series be justified? The partial answer is that **pointwise** equality, at each and every value of *t*, is **not** guaranteed. However, mathematicians later in the nineteenth century showed that the rms error of the Fourier series was always zero:

lim_{K→∞} rms(ε_{K}) = 0

What this means is that the error between a signal and its Fourier series approximation may not be zero pointwise, but its rms value will be zero! It is through the eyes of the rms value that we redefine equality. The usual definition of equality is called **pointwise equality**: Two signals *s*_{1} (*t*), *s*_{2} (*t*) are said to be equal pointwise if *s*_{1} (*t*) = *s*_{2} (*t*) for all values of *t*. A new definition of equality is **mean-square equality**: Two signals are said to be equal in the mean square if **rms** (*s*_{1} − *s*_{2}) = **0**. For Fourier series, Gibbs phenomenon peaks have finite height and zero width. The error differs from zero only at isolated points, whenever the periodic signal contains discontinuities, and equals about 9% of the size of the discontinuity. The value of a function at a finite set of points does not affect its integral. This effect underlies the reason why defining the value of a discontinuous function at its discontinuity, as we refrained from doing in defining the step function (Section 2.2.4: Unit Step), is meaningless. Whatever you pick for a value has no practical relevance for either the signal's spectrum or for how a system responds to the signal. The Fourier series value "at" the discontinuity is the average of the values on either side of the jump.
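
The last claim is easy to verify. In this sketch (illustrative; a unit-amplitude square wave jumping from −1 to +1 at t = 0 is assumed), every sine term of the partial sum vanishes at the jump, so the series evaluates to exactly 0 there, the average of the left-hand limit −1 and the right-hand limit +1.

```python
import numpy as np

def square_partial(t, K):
    """Partial Fourier sum of a unit-amplitude square wave jumping -1 to +1 at t = 0."""
    return sum((4.0 / (np.pi * k)) * np.sin(2 * np.pi * k * t)
               for k in range(1, K + 1, 2))      # only odd harmonics are nonzero

# Every sin(2*pi*k*t) vanishes at t = 0, so the series gives exactly 0 there:
# the average of the values on either side of the discontinuity.
print(square_partial(0.0, 99))  # → 0.0
```

This holds for any number of terms: the partial sums always pass through the midpoint of the jump.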
