Power Series

You may already have noticed that power functions are the second most lovely functions ever; they only lose out to the exponential functions. This is because differentiating and integrating power functions is incredibly straightforward. So what if you could write any function as a sum of power functions? You could use the differentiation and integration rules for power functions to differentiate and integrate almost any function. But first, you need to understand how power series work.


Formula for Power Series

You may have guessed that all the terms in a power series are basically powers of a variable. If you thought this, then you are entirely correct.

A series of the form

\[ \begin{align} \sum _{n=0} ^{\infty} c_{n} (x-a) ^{n} = c _{0} + c _{1} (x-a) + c _{2} (x-a) ^{2} + ... \end{align} \]

is called a power series centered at \( x=a \).

Looking more closely at the power series centered at \( x=a \), a special case naturally emerges when \( a=0 \):

A series of the form

\[ \begin{align} \sum _{n=0} ^{\infty} c_{n} x ^{n} = c _{0} + c _{1} x + c _{2} x ^{2} + ... \end{align} \]

is called a power series centered at \( x=0 \).

Notice that both definitions assume that \( (x-a)^0=1 \) when \( x=a \), and in particular that \( x^0=1 \) when \( x=0 \).
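Before diving into examples, it can help to see a power series numerically. Here is a minimal Python sketch, not part of the article itself: the helper name `partial_sum` and the sample coefficients are purely illustrative. It evaluates a truncated power series centered at \( x=a \) from a list of coefficients.

```python
# A minimal sketch (illustrative only): evaluating truncated sums of a
# power series sum c_n (x - a)^n from a list of coefficients.

def partial_sum(coeffs, a, x):
    """Evaluate the sum of coeffs[n] * (x - a)**n over the given coefficients."""
    return sum(c * (x - a) ** n for n, c in enumerate(coeffs))

# The power series centered at a = 0 with c_n = 1 for every n, cut off after
# 20 terms and evaluated at x = 0.5:
print(partial_sum([1] * 20, a=0, x=0.5))  # roughly 2, i.e. 1/(1 - 0.5)
```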

The power series centered at \( x=0 \) looks very familiar. In fact, if you look back at the Geometric Series article, you will notice a huge similarity between the power series centered at \( x=0 \) and the geometric series; let's see an example:

Check if the following series converges:

\[ \begin{align} \sum _{n=0} ^{\infty} x ^{n} =1+x+x^2+x^3+... \end{align} \]

Answer:

Looking at the definition, you can see that this series is an example of a power series centered at \( x=0 \), where

\[ c_n=1, \text{ for all } n\ge 0\]

From another perspective, you can see that this is also a geometric series, recalling that a geometric series has the following form:

\[ \sum _{n=0} ^{\infty} ar ^{n}=a+ar+ar^2+\dots \]

In this example, you have

\[ a=1 \text{ and } r=x. \]

A geometric series converges if and only if \( |r|<1 \); therefore, the series only converges if

\[ |x|<1. \]

Or, in other words, if

\[ -1<x<1, \]

then the series

\[ \begin{align} \sum _{n=0} ^{\infty} x ^{n} \end{align} \]

converges. Since this is a convergent geometric series, it converges to

\[ \sum _{n=0} ^{\infty} x ^{n} = \frac{1}{1-x}. \]

Notice that this is only possible if you state that \( |x|<1 \); otherwise, you can not apply the above formula.

Check our Geometric Series article for more information about geometric series and its sum formula.
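If you want to convince yourself numerically, here is a short Python sketch (purely illustrative; the sample values and term count are not from the article) comparing the partial sums of \( \sum x^n \) with \( \frac{1}{1-x} \) inside and outside \( |x|<1 \):

```python
# Illustrative check: for |x| < 1 the partial sums approach 1/(1 - x),
# while for |x| >= 1 they do not settle down at all.

def geometric_partial_sum(x, n_terms):
    return sum(x ** n for n in range(n_terms))

for x in (0.5, -0.9, 1.5):
    print(x, geometric_partial_sum(x, 50), 1 / (1 - x))
# x = 0.5  -> partial sum close to 2,     formula gives 2
# x = -0.9 -> partial sum close to 0.53,  formula gives about 0.53
# x = 1.5  -> partial sum is enormous; the formula no longer applies
```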

The previous example is a classic one, where you can associate your geometric series knowledge with a new topic. However, not all power series are that easy to analyze. Let's look at a non-geometric power series example.

The series

\[ \sum _{n=0} ^{\infty} \frac{x ^{n}}{n!} =1+x+\frac{x^2}{2}+\frac{x^3}{6}+\dots \]

and

\[ \sum _{n=0} ^{\infty} \frac{(x-1)^{n}}{n+1} =1+\frac{x-1}{2} +\frac{(x-1) ^2}{3}+\frac{(x-1) ^3}{4}+\dots \]

are both power series. The first one is centered at \(x=0\) with

\[ c_n = \frac{1}{n!}, \]

while the second is centered at \(x=1\) with

\[ c_n = \frac{1}{n+1}. \]

Radius of Convergence of a Power Series

Analyzing the definition of a power series, you can notice that the series depends on the value of \( x \). This suggests that a power series can converge for certain values of \( x \) and diverge for others. How far from the center it converges is called the radius of convergence. Let's look at a formal definition:

The radius of convergence of a power series centered at \( x=a \) is a real value \( R \) where

  • The series converges for all \( x \) such that \( |x-a|<R \)
  • The series diverges for all \( x \) such that \( |x-a|>R \)

If the series only converges for \( x=a \), then \( R=0 \). If the series converges for all values of \( x \), then \( R=\infty \).

Notice that the definition does not mention the case where \( |x-a|=R \). This is because the convergence of the series at these values does not change the radius of convergence. However, you still need to check if the series converges at those points to write the interval of convergence, so let's define what this interval is:

The interval containing all the values of \( x \) at which the power series converges is called the Interval of Convergence.

The form of the interval of convergence depends on whether or not the series converges at the endpoints \(a-R\) and \(a+R\).

A question that arises from these definitions is: how do you calculate the radius of convergence? You can use the convergence tests, more specifically the Ratio Test. Sometimes you might also need the Root Test, depending on the series you are analyzing. Let's take a look at some examples.

What is the radius and interval of convergence for the following series?

\[ \sum _{n=0} ^{\infty} \frac{x ^{n}}{n!} =1+x+\frac{x^2}{2}+\frac{x^3}{6}+\dots \]

Answer:

The Ratio Test says that a series converges if

\[ \lim\limits_{n \to \infty} \left| \frac{a_{n+1}}{a_n} \right| <1. \]

  • In this series, you have

\[ a_n = \frac{x ^{n}}{n!}. \]

  • Applying the ratio test and simplifying the expression

\[ \begin{align}L &= \lim\limits_{n \to \infty} \left| \frac{x ^{n+1}}{(n+1)!} \cdot \frac{n!}{x ^{n}} \right| \\&= \lim\limits_{n \to \infty} \left| \frac{x ^{n+1}}{(n+1)\cdot n!} \cdot \frac{n!}{x ^{n}} \right| \\&= \lim\limits_{n \to \infty} \left| \frac{x}{n+1} \right|\\&= |x| \lim\limits_{n \to \infty} \frac{1}{n+1} \\ &= |x| \cdot 0 \\ &= 0 \\ &< 1.\end{align} \]

As the limit is zero and does not depend on the value of \( x \), you have

  • Interval of Convergence

\[ (-\infty, +\infty) \]

  • Radius of Convergence

\[ R=\infty \]
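As a quick sanity check, the simplified Ratio Test term \( \frac{|x|}{n+1} \) really does shrink to zero no matter how large \( x \) is. The Python snippet below is only an illustration; the value \( x=100 \) is an arbitrary choice, not from the example.

```python
# Illustrative check: the Ratio Test term |x|/(n + 1) goes to 0 for any fixed x.
x = 100.0  # an arbitrarily large value of x
for n in (1, 10, 100, 1000, 10**6):
    print(n, abs(x) / (n + 1))
# 50.0, 9.09..., 0.990..., 0.0999..., 0.0000999... -> the limit is 0 for every x
```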

The example above is classic, and you will see it in other areas. Let's now see another example where the series does not converge for all values of \( x \).

Find the radius and interval of convergence for the following series:

\[ \sum _{n=0} ^{\infty} \frac{(x-1)^{n}}{n+1}. \]

Answer:

Let's apply the ratio test again.

  • In this series, you have

\[ a_{n}=\frac{(x-1)^{n}}{n+1}. \]

  • Applying the ratio test and simplifying the expression

\[ \begin{align}L &= \lim\limits_{n \to \infty} \left| \frac{(x-1)^{n+1}}{n+2} \cdot \frac{n+1}{(x-1)^{n}} \right| \\&= \lim\limits_{n \to \infty} \left| \frac{(x-1)(n+1)}{n+2}\right| \\ &= |x-1| \cdot \lim\limits_{n \to \infty} \frac{n+1}{n+2} \\ &= |x-1| \cdot 1 \\ &= |x-1|. \end{align} \]

The result of the limit depends on the value of \( x \), and the Ratio Test says that this limit needs to be smaller than one for the series to converge. Checking it:

  • From the ratio test, you have

\[ |x-1|<1.\]

  • Solving this inequality,

\[ \begin{align} -1 &< x-1 < 1 \\ -1 +1 &< x < 1+1 \\ 0 &< x < 2. \end{align} \]

This way, you find that the series converges on \( (0,2) \). You still aren't done, though! You need to check whether the series converges at the endpoints. Checking the endpoints:

  • For \( x=0 \), you have the following series

\[ \sum _{n=0} ^{\infty} \frac{(-1)^{n}}{n+1}. \]

  • As this is an alternating series, you may apply Leibniz's Theorem. Check the Alternating Series article for more information. If you use Leibniz's Theorem, then

\[ a_n=\frac{1}{n+1}. \]

  • Applying the theorem
  1. You need that \( a_n >0 \), which is valid for this series;
  2. You need that \( a_n \ge a_{n+1}\), which is also valid since \[ n+1 < n+2 \] implies that \[\frac{1}{n+1} > \frac{1}{n+2}.\]
  3. You need that \( \lim\limits_{n \to \infty} a_n = 0 \), which is also true since \[ \lim\limits_{n \to \infty} \frac{1}{n+1}=0.\]

Therefore the series converges for \( x=0 \). Let's look at the other endpoint.

  • For \( x=2 \), you have the following series

\[ \sum _{n=0} ^{\infty} \frac{1}{n+1}. \]

  • Substituting \( m=n+1 \) to rewrite the series in a more familiar form, you have

\[ \sum _{m=1} ^{\infty} \frac{1}{m} . \]

This is the harmonic series, which is divergent.

Putting all of this together, you can say:

  • Interval of Convergence

\[ [0, 2) \]

  • Radius of Convergence

\[ R=1 \]
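To see the endpoint behaviour numerically, here is a small Python sketch (the helper `partial_sum` and the term counts are illustrative assumptions): at \( x=0 \) the partial sums settle down, while at \( x=2 \) they keep growing like the harmonic series.

```python
# Illustrative endpoint check for sum (x - 1)^n / (n + 1).

def partial_sum(x, n_terms):
    return sum((x - 1) ** n / (n + 1) for n in range(n_terms))

for n_terms in (10, 100, 1000, 10000):
    print(n_terms, partial_sum(0, n_terms), partial_sum(2, n_terms))
# the x = 0 column approaches a fixed value (about 0.69),
# the x = 2 column keeps increasing without bound
```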

Power Series Expansion

One of the coolest parts of studying power series is being able to write a function as a power series expansion. This can seem strange at first, but the more complex the functions you are studying, the more difficult they are to integrate or differentiate. In fact, some functions cannot even be integrated in terms of elementary functions. But if you can write them as a power series, then you will only be integrating and differentiating power functions.

Given a function \( f(x) \), the Power Series Expansion of \( f\) is a power series such that

\[ f(x)=\sum _{n=0} ^{\infty} c_n x^n\]

for a given radius of convergence.

First let's take a look at an old friend, the geometric series:

Show that the function

\[ f(x)=\frac{1}{1-x} \]

can be written as a power series expansion.

Answer:

Going back to the geometric series, the formula is

\[ \sum _{n=0} ^{\infty} a r^n = \frac{a}{1-r}, |r|<1. \]

Looking at \( f(x) \) and comparing it with the geometric series, if you take \( a=1 \) and \( r=x \) you can substitute them back into the series, getting

\[ \sum _{n=0} ^{\infty} x^n = \frac{1}{1-x}, |x|<1. \]

Therefore, if \( -1<x<1\) then the power series expansion of \( f(x) \) is

\[ f(x) = \frac{1}{1-x} = 1+x+x^2+x^3+\dots \]

Let's take a look at another example.

Show that the function

\[ f(x)=\frac{1}{4+x^2} \]

can be written as a power series expansion.

Answer:

Let's write the function in a more helpful way by doing some algebra:

\[ \begin{align}f(x) &= \frac{1}{4+x^2} \\ &= \frac{1}{4(1+\frac{x^2}{4})} \\ &= \frac{\frac{1}{4}}{1-\left(-\frac{x^2}{4}\right)}. \end{align}\]

Looking at the new form of \( f(x) \) and comparing it to the geometric series, you can set

\[ \begin{align} a=\frac{1}{4}, \text{ and } r=-\frac{x^2}{4}. \end{align} \]

Then substituting it back into the series,

\[ \begin{align} \frac{1}{4+x^2} &= \sum _{n=0} ^{\infty} \frac{1}{4}\left( -\frac{x^2}{4}\right)^n \\ &= \sum _{n=0} ^{\infty} \frac{(-1)^n}{4} \left(\frac{x^2}{4}\right)^n .\end{align}\]

This is only valid if \( |r|<1 \)! Therefore

\[ \begin{align} \left| -\frac{x^2}{4}\right| &<1 \\ \frac{x^2}{4} &< 1 \\ x^2 &<4 \\ -2 < &x < 2 \end{align}\]

Hence, if \(-2<x<2\) then the power series expansion of \( f(x) \) is

\[ f(x) = \frac{1}{4+x^2} = \frac{1}{4}-\frac{x^2}{16}+\frac{x^4}{64}-\frac{x^6}{256}+\dots \]
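You can double-check this expansion numerically with a short Python sketch; the sample points and the 200-term cutoff below are illustrative choices, not part of the example.

```python
# Illustrative check: partial sums of sum (-1)^n / 4 * (x^2/4)^n
# against the function 1/(4 + x^2) for a few x inside (-2, 2).

def partial_sum(x, n_terms):
    return sum((-1) ** n / 4 * (x ** 2 / 4) ** n for n in range(n_terms))

for x in (0.0, 1.0, 1.9):
    print(x, partial_sum(x, 200), 1 / (4 + x ** 2))
# the two columns agree; closer to the endpoints more terms are needed
```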

Derivative of Power Series

Now that you can write some functions as a power series expansion, let's see what happens when you need to calculate the derivative of this function; as this is a sum of power functions, it should be straightforward. First, recall some derivatives properties:

  • Power Rule

\[ (x^n)' = nx^{n-1} \]

  • The derivative of a constant multiple

\[ (cf)'(x)=cf'(x) \]

  • The derivative of a sum

\[ (f+g)'(x)=f'(x)+g'(x) \]

Using these three properties, you can take the derivative of any power series.

Consider \( f(x) \) as the following power series

\[ f(x) = \sum _{n=0} ^{\infty} c_n x^n. \]

Then \( f'(x) \) is given by

\[ f'(x) = \sum _{n=1} ^{\infty} c_n n x^{n-1} \]

Let's expand the previous series:

\[ f(x) = c_0+c_1 x+c_2x^2+c_3x^3+\dots \]

Take the derivative of each term using the derivative properties to get

\[ \begin{align} f'(x) &= [c_0+c_1 x+c_2x^2+c_3x^3+...]' \\ &= 0+c_1+2c_2x+3c_3x^2+... \\ &= c_1+2c_2x+3c_3x^2+\dots \end{align} \]

Notice that \( f'(x)\) now starts with the term \( c_1 \), so it does not make sense anymore for the series to start at \( n=0 \). Therefore the sigma notation of \( f'(x) \) is

\[ \begin{align} f'(x) = \sum _{n=1} ^{\infty} n c_n x^{n-1}. \end{align} \]

Let's check out some examples.

Calculate the derivative of the following function:

\[ f(x) = \sum _{n=0} ^{\infty} \frac{x ^{n}}{n!}. \]

Answer:

First, consider the expanded form of this series

\[ f(x) = 1+x+\frac{x^2}{2}+\frac{x^3}{6}+\frac{x^4}{24}+\frac{x^5}{120}+\dots \]

Taking the derivative gives you

\[ \begin{align} f'(x) &= \left[ 1+x+\frac{x^2}{2}+\frac{x^3}{6}+\frac{x^4}{24}+\frac{x^5}{120}+\dots \right]' \\ &= 0+1+x+\frac{x^2}{2}+\frac{x^3}{6}+\frac{x^4}{24}+\dots \\ &= 1+x+\frac{x^2}{2}+\frac{x^3}{6}+\frac{x^4}{24}+\dots \end{align} \]

Therefore, you have the helpful fact that

\[ \begin{align} f(x) &= 1+x+\frac{x^2}{2}+\frac{x^3}{6}+\frac{x^4}{24}+\dots\\ f'(x) &= 1+x+\frac{x^2}{2}+\frac{x^3}{6}+\frac{x^4}{24}+\dots \\ f'(x) &=f(x). \end{align} \]

Now let's calculate \(f'(x)\) using the sigma notation:

\[ f'(x) = \left[\sum _{n=0} ^{\infty} \frac{x ^{n}}{n!}\right]' .\]

Using the derivative properties,

\[ \begin{align} f'(x) &= \sum _{n=0} ^{\infty} \left[ \frac{x ^{n}}{n!}\right]' \\ &= \sum _{n=1} ^{\infty} \frac{nx ^{n-1}}{n!}. \end{align} \]

You would like the series to start at zero, so substitute in \( m=n-1 \) to get

\[ \begin{align} f'(x) &= \sum _{m=0} ^{\infty} \frac{(m+1)x ^{m}}{(m+1)!} \\ &= \sum _{m=0} ^{\infty} \frac{(m+1)x ^{m}}{(m+1)\cdot m!} \\ &= \sum _{m=0} ^{\infty} \frac{x ^{m}}{m!}.\end{align} \]

Notice that the power series for \( f'(x) \) is the same as the series for \( f(x) \)! What other function can you think of where the derivative of the function gives you the function back? Yep, it is your old friend the exponential function. It takes a little more work to show that \(f(x)\) is actually the same as the exponential function, and that is something you will see in a later class.
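You can also verify the index shift above with a few lines of Python. This is only a sketch, under the assumption that checking the first handful of coefficients is enough to illustrate the point: if \( c_n = \frac{1}{n!} \), then the derivative's coefficients \( (m+1)c_{m+1} \) are again \( \frac{1}{m!} \).

```python
import math

# Illustrative check: the coefficients of f and of its term-by-term derivative
# coincide when c_n = 1/n!.
c = [1 / math.factorial(n) for n in range(8)]      # coefficients of f
d = [(m + 1) * c[m + 1] for m in range(7)]         # coefficients of f'
print(all(abs(c[m] - d[m]) < 1e-12 for m in range(7)))  # True
```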

Let's look at a case where taking the derivative can help you.

Consider the following power series expansion

\[ \begin{align} f(x) = \frac{1}{1-x} = \sum _{n=0} ^{\infty} x^n,\quad |x|<1. \end{align} \]

Find a power series expansion for the function

\[ g(x) = \frac{1}{(1-x)^2}.\]

Answer:

To start solving this problem, first notice that \( g(x) \) is the derivative of \( f(x)\):

\[ \begin{align} f'(x) &= \left( \frac{1}{1-x} \right)' \\ &= \frac{0\cdot(1-x) - 1\cdot(-1)}{(1-x)^2} \\ &= \frac{1}{(1-x)^2} \\ &= g(x). \end{align} \]

Now that you know that \( g(x)=f'(x) \), you can take the derivative of the power series, and this will become the power series expansion of \( g(x) \). Doing this gives you

\[ \begin{align} f'(x) &= \left[ \sum _{n=0} ^{\infty} x^n\right]' \\ &= \sum _{n=0} ^{\infty} \left[ x^n\right]' \\ &= \sum _{n=1} ^{\infty} n x^{n-1}. \end{align} \]

You can make the series start at zero if you substitute \( m = n-1 \). Therefore

\[ g(x) = \sum _{m=0} ^{\infty} (m+1) x^{m}.\]

Remember that this is only possible if \(|x|<1\)! You can't get rid of the constraints on \(x\).
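A quick numerical check in Python (the sample values of \( x \) and the 500-term cutoff are arbitrary illustrative choices) shows the new series really does match \( \frac{1}{(1-x)^2} \) inside \( |x|<1 \):

```python
# Illustrative check: sum (m + 1) x^m against 1/(1 - x)^2 for |x| < 1.

def partial_sum(x, n_terms):
    return sum((m + 1) * x ** m for m in range(n_terms))

for x in (0.3, -0.5, 0.9):
    print(x, partial_sum(x, 500), 1 / (1 - x) ** 2)
# the columns agree; outside |x| < 1 the partial sums would blow up instead
```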

Common Power Series

While learning about power series, you will come across many different series. However, you will notice that some common ones appear over and over, starting with those that can be written as a geometric series. Let's take a look.

Write the following function as a power series

\[ f(x)=\dfrac{1}{1-x}+\dfrac{1}{1+x}.\]

Answer:

First, use some algebra to write this function in a more compact way:

Sum up the fractions

\[ \begin{align} f(x) &=\dfrac{1+x+1-x}{(1-x)\cdot(1+x)} \\ &=\dfrac{2}{(1-x^2)} . \end{align}\]

Then, compare it to geometric series

\[ \begin{align} \sum _{n=0} ^{\infty} a r^n &= \frac{a}{1-r}, |r|<1 \\ f(x) &=\dfrac{2}{(1-x^2)}. \end{align}\]

Now set \(a = 2\) and \(r= x^2\). Remember that for a geometric series to converge you need \( |r|<1\), so you need to consider

\[ \begin{align} \left| x^2\right| &<1 \\ x^2 &<1 \\ -1< x &<1. \end{align}\]

Now you can set up the series with the convergence interval:

\[ \sum _{n=0} ^{\infty} 2 (x^2)^n = \frac{2}{1-x^2}, \quad |x|<1, \]

or simplifying

\[ \sum _{n=0} ^{\infty} 2 x^{2n} = \frac{2}{1-x^2}, \quad |x|<1.\]

Therefore if \( -1<x<1 \) you have the following series expansion for \(f(x)\):

\[ \begin{align} f(x) & = \sum _{n=0} ^{\infty} 2 x^{2n} \\ &= 2 + 2x^2+2x^4+2x^6+\dots \end{align}\]
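As before, you can compare the series with the original function numerically. The Python sketch below is only illustrative; the sample points and the 200-term cutoff are assumptions.

```python
# Illustrative check: partial sums of sum 2 x^(2n) against both
# 2/(1 - x^2) and the original 1/(1 - x) + 1/(1 + x), for |x| < 1.

def partial_sum(x, n_terms):
    return sum(2 * x ** (2 * n) for n in range(n_terms))

for x in (0.2, -0.7):
    print(x, partial_sum(x, 200), 2 / (1 - x ** 2), 1 / (1 - x) + 1 / (1 + x))
```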

Other common power series are those related to the \( \cos(x) \) and \( \sin(x) \) functions. They are very similar to each other. As you go deeper into the power series realm, you will see that:

\[ \begin{align} \sin(x) &= \sum _{n=0} ^{\infty} (-1)^n \dfrac{x^{2n+1}}{(2n+1)!} = x - \dfrac{x^3}{3!} + \dfrac{x^5}{5!} - \dfrac{x^7}{7!}+\dots \\ \cos(x) &= \sum _{n=0} ^{\infty} (-1)^n \dfrac{x^{2n}}{(2n)!} = 1 - \dfrac{x^2}{2!} + \dfrac{x^4}{4!} - \dfrac{x^6}{6!}+\dots \end{align}\]

Let's check the convergence radius and interval for \( \sin(x) \).

To calculate the convergence radius and interval for the power series expansion of \(\sin(x)\), you need to apply the Ratio Test.

First set up \( a_n \) as

\[ a_n = (-1)^n \dfrac{x^{2n+1}}{(2n+1)!}.\]

Apply the Ratio Test and simplify the expression to get

\[ \begin{align}L &= \lim\limits_{n \to \infty} \left| \frac{(-1)^{n+1} \cdot x ^{2(n+1)+1}}{(2(n+1)+1)!} \cdot \frac{(2n+1)!}{(-1)^n \cdot x ^{2n+1}} \right| \\ &= \lim\limits_{n \to \infty} \left| \frac{(-1) \cdot x ^2 \cdot (2n+1)!}{(2n+3)!} \right| \\ &= |x^2| \cdot \lim\limits_{n \to \infty} \left| \frac{(2n+1)!}{(2n+3)(2n+2)(2n+1)!} \right| \\ &= |x^2| \cdot \lim\limits_{n \to \infty} \left| \frac{1}{(2n+3)(2n+2)} \right| \\ &= |x^2| \cdot 0 \\ &= 0. \end{align} \]

As the limit is zero and does not depend on the value of \( x \), you have

  • Interval of Convergence

\[ (-\infty, +\infty) \]

  • Radius of Convergence

\[ R=\infty \]

Similarly, you will find that the power series expansion for \( \cos(x) \) has \( (-\infty, +\infty) \) as its interval of convergence and \( R=\infty \) as its radius of convergence.
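Finally, here is a short Python sketch (the 20-term cutoff and sample points are illustrative assumptions) showing that truncations of these two series already reproduce `math.sin` and `math.cos` very closely, in line with the infinite radius of convergence:

```python
import math

# Illustrative check: truncated sine and cosine series against math.sin/math.cos.

def sin_series(x, n_terms=20):
    return sum((-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(n_terms))

def cos_series(x, n_terms=20):
    return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n)
               for n in range(n_terms))

for x in (0.5, 3.0, -7.0):
    print(x, sin_series(x) - math.sin(x), cos_series(x) - math.cos(x))
# the differences are at the level of floating-point round-off
```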

Power Series - Key takeaways

  • A power series centered at \( x=a \) has the form

    \[ \sum _{n=0} ^{\infty} c_{n} (x-a) ^{n} = c _{0} + c _{1} (x-a) + c _{2} (x-a) ^{2} + \dots \]

  • A power series centered at \( x=0 \) has the form

    \[ \sum _{n=0} ^{\infty} c_{n} x ^{n} = c _{0} + c _{1} x + c _{2} x ^{2} +\dots \]

  • The radius of convergence is a real value \( R \) where

    • The series converges for all \( x \) such that \( |x-a|<R \)

    • The series diverges for all \( x \) such that \( |x-a|>R \)

    • If the series only converges for \( x=a \) then \( R=0 \).

    • If the series converges for all values of \( x \) then \( R=\infty \).

  • The Power Series Expansion of \( f\) is a power series such that

    \[ f(x)=\sum _{n=0} ^{\infty} c_n x^n\]

    for a given radius of convergence.

  • Derivative of a Power Series Expansion: if

    \[ f(x) = \sum _{n=0} ^{\infty} c_n x^n, \]

    then \( f'(x) \) is given by

    \[ f'(x) = \sum _{n=1} ^{\infty} c_n n x^{n-1} \]

Frequently Asked Questions about Power Series

How do you find the power series expansion of a function?

Finding a power series expansion depends on the function being analyzed; you can use the geometric series sum for some of them, but in general the Taylor Series is the best way to do it.

What is a power series?

It is a series of terms added successively, where every term is a power of \(x\) multiplied by a constant. A simple example is

\[ S= 1+x^2+x^3+x^4+x^5+x^6+x^7+\dots+x^{n-2}+x^{n-1}+x^n. \]

The order is from small powers of \(x\) to higher ones.

What are some examples of power series?

There are many examples of power series; two common ones are:

  • The Maclaurin series.
  • A geometric series.

Specifically, a Maclaurin series, which is an example of a power series, can be written as

\[ a_0+a_1x+a_2x^2+a_3x^3+a_4x^4+\dots+a_nx^n. \]

What are power series used for?

Power series are used to express functions as a sum of terms involving powers of \(x\). They are useful for solving problems that have no closed-form solution, since the solution can instead be approximated using a series. Many areas of science and engineering use power series in this way.

