# Solving Nonlinear Equations Numerically

Finding the roots of a nonlinear equation answers the question of which values of $x$ make $f(x)$ equal to zero. There are many approaches to finding the roots of nonlinear equations. Some of the well-known ones include the bisection method, the method of false position (linear interpolation), Newton's method, and Bernoulli's method and its variations. In this post I demonstrate an implementation of the two basic methods, bisection and Newton's, and compare their performance.

The bisection method for finding a root of a function $f(x)$ starts with two values $a$ and $b$, where $a < b$ and the function is defined on $[a,b]$. The solution is approximated by iteratively halving the interval and replacing one of its endpoints with the midpoint. Which endpoint to replace depends on both the sign of the slope of $f(x)$ on $[a,b]$ and the sign of $f(x_1)$, where $x_1$ is the current approximation (the midpoint). The process terminates when one of the following occurs: (a) we find the solution to some acceptable accuracy; (b) the width of the current interval falls below some error tolerance without convergence; or (c) we exceed the maximum number of allowed iterations. No solution will be found if the starting points are chosen poorly and there is no root on $[a,b]$. For example, Figure 1 plots $f(x)=x^2-2$ with one possible interval $[a,b]$ marked. If we instead picked $[a,b]$ to be $[2,3]$, the method would fail, since nowhere on $[2,3]$ is $f(x)=0$.
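To make the iteration concrete, here are the first few steps for $f(x)=x^2-2$ on $[a,b]=[-2,2]$. The slope estimate $f(b)-f(a)$ is zero here, and my implementation treats a non-positive slope the same way as a negative one, so the midpoint replaces $b$ whenever $f$ at the midpoint is negative:

$$
\begin{aligned}
x_1 &= 0, & f(x_1) &= -2 < 0 &&\Rightarrow [a,b] = [-2,\,0],\\
x_2 &= -1, & f(x_2) &= -1 < 0 &&\Rightarrow [a,b] = [-2,\,-1],\\
x_3 &= -1.5, & f(x_3) &= 0.25 > 0 &&\Rightarrow [a,b] = [-1.5,\,-1],
\end{aligned}
$$

and so on, halving towards the negative root $-\sqrt{2}\approx-1.41421$.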

Which of the two roots do you think this method finds if we take $a$ and $b$ to be $-2.0$ and $2.0$? The answer depends on how we approximate the slope of the function and on what we do when that slope estimate is zero. In that case it is equally correct to move to the right or to the left, so the choice is up to us.

Newton's method is a little more involved. Essentially, it is based on the idea that, starting with some guess value $x_0$, we can improve upon the guess by replacing it with the value of $x$ at which the tangent line to $f$ at $x_0$ crosses zero. We repeat until convergence to a solution is achieved. Mathematically, this can be expressed as:

$m(x)=f'(x_0)(x-x_0)+f(x_0)$
$x_1=x_0-\frac{f(x_0)}{f'(x_0)}$

where $m(x)$ is the tangent line to $f$ at $x_0$, and the next approximation $x_1$ is obtained by solving $m(x_1)=0$.

As you can see, Newton's method requires us to approximate the first derivative of $f(x)$, which is why I think it is a bit more complex. There are also several conditions under which this method fails to find a solution. One of them is that $f'$ is undefined at the root. Another is that $f'(x)=0$ at some iterate. More conditions can be found on Wikipedia.
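As a sanity check on the update rule, here are the iterates for $f(x)=x^2-2$ with the exact derivative $f'(x)=2x$, starting from $x_0=0.5$ (the implementation below uses a numerical derivative, so its iterates differ very slightly):

$$
\begin{aligned}
x_1 &= 0.5 - \tfrac{-1.75}{1} = 2.25,\\
x_2 &= 2.25 - \tfrac{3.0625}{4.5} \approx 1.56944,\\
x_3 &\approx 1.56944 - \tfrac{0.46314}{3.13889} \approx 1.42189,\\
x_4 &\approx 1.42189 - \tfrac{0.02177}{2.84378} \approx 1.41423,
\end{aligned}
$$

at which point $|f(x_4)| \approx 0.00006$ is already below a tolerance of $0.001$, so four iterations suffice.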

Below I provide one possible way to implement these two methods. Please note the following points about my implementation:

1. One can never provide too many checks for potential run-time failures; I chose to stop at just a few. For any candidate solution $x$, $f(x)$ should be defined, and we should be able to estimate the first derivative of $f(x)$. While testing the implementation on some functions I noticed that Newton's method tends to diverge when the first derivative is close to zero. That is why I round the derivative to three decimal places in the zero check in `NewtonMethod.Solve`.
2. My approximation to the first derivative of $f(x)$ uses a four-point central-difference formula.
3. I have left a few `Debug.Assert` calls in the code to help with debugging ill-behaved functions.
4. The function whose roots we need to find is passed as a delegate.
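For reference, the four-point central-difference formula used in the implementation is

$$f'(x) \approx \frac{f(x-2h) - 8f(x-h) + 8f(x+h) - f(x+2h)}{12h},$$

which has a truncation error of order $h^4$; the code uses $h=0.01$.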

Below is my implementation, along with a run example:

```csharp
using System;

namespace SolvingNonLinearEquations
{
    public abstract class NonlinearEquationSolutionMethod
    {
        public delegate double NonLinearFunction(double x);

        public NonlinearEquationSolutionMethod(NonLinearFunction func)
        {
            this.func = func;
        }

        protected NonLinearFunction func;
        protected const int maxiter = 100;
    }

    public sealed class BisectionMethod : NonlinearEquationSolutionMethod
    {
        private BisectionMethod(double from, double to, double err, NonLinearFunction func)
            : base(func)
        {
            this.from = from;
            this.to = to;
            this.err = err;
        }

        // ensure we correctly initialise the from and to points, where from < to
        public static BisectionMethod SetBisectionMethod(double from, double to, double err, NonLinearFunction func)
        {
            if (from >= to)
                return null;
            else
                return new BisectionMethod(from, to, err, func);
        }

        public double Solve()
        {
            double solution = (from + to) * 0.5;
            int iter = 1;

            // determine whether the function has a positive or negative slope on the given interval
            double slope = func(to) - func(from);
            double temp = 0.0;

            while ((to - from) > err && iter < maxiter)
            {
                temp = func(solution);
                System.Diagnostics.Debug.Assert(!Double.IsNaN(temp), "Division by zero or another illegal operation has occurred.");

                if (temp < 0.0)
                {
                    if (slope > 0)
                        from = solution;
                    else
                        to = solution;
                }
                else
                {
                    if (slope > 0)
                        to = solution;
                    else
                        from = solution;
                }
                solution = (from + to) * 0.5;
                ++iter;
            }
            if (iter == maxiter || Math.Abs(func(solution)) > Math.Abs(err))
            {
                Console.WriteLine("Failed to converge.");
                solution = default(double);
            }
            return solution;
        }

        private double from;
        private double to;
        private double err;
    }

    public sealed class NewtonMethod : NonlinearEquationSolutionMethod
    {
        public NewtonMethod(double guess, double err, NonLinearFunction func)
            : base(func)
        {
            this.guess = guess;
            this.err = err;
        }

        public double Solve()
        {
            // Newton's method requires computing the function's first derivative
            int iter = 0;
            double temp = func(guess);
            double tempdev = numFirstDerivative(func, guess);

            if (temp == 0)
                return guess;

            // check if the derivative exists at the initial guess;
            // if not, we cannot use Newton's method with this guess
            if (Double.IsNaN(tempdev) || Math.Round(tempdev, 3) == 0.0)
            {
                Console.WriteLine("Derivative of this function is undefined or zero at this point.");
                return default(double);
            }

            while (iter <= maxiter && Math.Abs(func(guess)) > err && numFirstDerivative(func, guess) != 0)
            {
                System.Diagnostics.Debug.Assert(!Double.IsNaN(temp), "Division by zero or another illegal operation has occurred.");
                guess = guess - (temp / numFirstDerivative(func, guess));
                ++iter;
                temp = func(guess);
            }

            if (Double.IsNaN(func(guess)) || Double.IsNaN(numFirstDerivative(func, guess)))
                guess = default(double);

            return guess;
        }

        // approximates the first derivative of a single-variable function
        // using a four-point central difference
        private double numFirstDerivative(NonLinearFunction func, double x)
        {
            double delta = 0.01;
            return (func(x - 2 * delta) - 8 * func(x - delta) + 8 * func(x + delta) - func(x + 2 * delta)) / (12 * delta);
        }

        private double guess;
        private double err;
    }

    class EntryPoint
    {
        // the function whose roots we are looking for: f(x) = x^2 - 2
        static double funcToSolve(double x)
        {
            return x * x - 2.0;
        }

        static void Main(string[] args)
        {
            BisectionMethod bm = BisectionMethod.SetBisectionMethod(-2.0, 2.0, 0.001, funcToSolve);
            NewtonMethod nm = new NewtonMethod(0.5, 0.001, funcToSolve);

            Console.WriteLine(bm.Solve());
            Console.WriteLine(nm.Solve());
        }
    }
}
```

For the function $f(x)=x^2-2$, the two roots are $\pm\sqrt{2}$. In my implementation, the bisection method finds the negative root, while Newton's method finds the positive root, since the initial guess is closer to it. The bisection method takes 13 iterations to approximate the solution to the given accuracy, while Newton's method takes only 4.