# Computational complexity of Fibonacci Sequence

### Question

I understand Big-O notation, but I don't know how to calculate it for many functions. In particular, I've been trying to figure out the computational complexity of the naive version of the Fibonacci sequence:

```
int Fibonacci(int n)
{
    if (n <= 1)
        return n;
    else
        return Fibonacci(n - 1) + Fibonacci(n - 2);
}
```

What is the computational complexity of the Fibonacci sequence and how is it calculated?

### Accepted Answer

You model the time to calculate `Fib(n)` as the sum of the time to calculate `Fib(n-1)`, plus the time to calculate `Fib(n-2)`, plus the time to add them together (`O(1)`). This assumes that repeated evaluations of the same `Fib(n)` take the same time - i.e. no memoization is used.
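For contrast, here is a minimal memoized sketch (the names `fib_memo` and `fib` are my own, not from the answer): each subproblem is computed once and reused, which drops the cost from exponential to O(n).

```c
#include <string.h>

// Memoized variant: each Fib(k) is computed at most once, so the total
// work is O(n) instead of exponential. A sketch assuming n <= 92 so the
// result fits in a long long.
long long fib_memo(int n, long long *cache) {
    if (n <= 1) return n;
    if (cache[n] != -1) return cache[n];   // reuse a stored result
    cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache);
    return cache[n];
}

long long fib(int n) {
    long long cache[96];
    memset(cache, -1, sizeof cache);       // -1 marks "not computed yet"
    return fib_memo(n, cache);
}
```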

`T(n<=1) = O(1)`

`T(n) = T(n-1) + T(n-2) + O(1)`

You solve this recurrence relation (using generating functions, for instance) and you'll end up with the answer.

Alternatively, you can draw the recursion tree, which will have depth `n`, and intuitively figure out that this function is asymptotically `O(2^n)`. You can then prove your conjecture by induction.

Base: `n = 1` is obvious.

Assume `T(n-1) = O(2^(n-1))`, *therefore*

`T(n) = T(n-1) + T(n-2) + O(1)`

*which is equal to*

`T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)`

However, as noted in a comment, this is not the tight bound. An interesting fact about this function is that `T(n)` is asymptotically **the same** as the value of `Fib(n)`, since both are defined as

`f(n) = f(n-1) + f(n-2)`.

The leaves of the recursion tree will always return 1. The value of `Fib(n)` is the sum of all values returned by the leaves in the recursion tree, which is equal to the count of leaves. Since each leaf takes `O(1)` to compute, `T(n)` is equal to `Fib(n) x O(1)`. Consequently, the tight bound for this function is the Fibonacci sequence itself (~`Θ(1.6^n)`). You can find this tight bound by using generating functions, as mentioned above.
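This relationship can be checked empirically by instrumenting the naive function with a call counter (a sketch; the names `calls` and `fib_naive` are my own). The call count satisfies `C(n) = C(n-1) + C(n-2) + 1` with `C(0) = C(1) = 1`, which works out to `2*Fib(n+1) - 1` - i.e. the running time really does track `Fib(n)`.

```c
// Instrumented naive Fibonacci: count how many calls are made in total.
// Empirically, calls(n) = 2*Fib(n+1) - 1, confirming that the running
// time grows like Fib(n) itself rather than like 2^n.
long long calls = 0;

int fib_naive(int n) {
    calls++;
    if (n <= 1) return n;
    return fib_naive(n - 1) + fib_naive(n - 2);
}
```

For example, computing `fib_naive(10)` makes `2*Fib(11) - 1 = 2*89 - 1 = 177` calls.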


Just ask yourself how many statements need to execute for `F(n)` to complete.

For `F(1)`, the answer is `1` (the first part of the conditional).

For `F(n)`, the answer is `F(n-1) + F(n-2)`.

So what function satisfies these rules? Try a^{n} (a > 1):

a^{n} == a^{(n-1)} + a^{(n-2)}

Divide through by a^{(n-2)}:

a^{2} == a + 1

Solve for `a`

and you get `(1+sqrt(5))/2 = 1.6180339887`

, otherwise known as the golden ratio.

So it takes exponential time.
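The derivation above can be checked numerically (a sketch; the helper name `golden_ratio` is my own): the positive root of `a^2 = a + 1` is the golden ratio, and plugging it back into the characteristic equation gives zero up to floating-point error.

```c
#include <math.h>

// The positive root of the characteristic equation a^2 = a + 1
// is the golden ratio phi = (1 + sqrt(5)) / 2 ≈ 1.6180339887.
double golden_ratio(void) {
    return (1.0 + sqrt(5.0)) / 2.0;
}
```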

I agree with pgaur and rickerbh: recursive Fibonacci's complexity is O(2^n).

I came to the same conclusion by a rather simplistic but I believe still valid reasoning.

First, it's all about figuring out how many times the recursive Fibonacci function (`F()` from now on) gets called when calculating the Nth Fibonacci number. If it gets called once per number in the sequence 0 to n, then we have O(n); if it gets called n times for each number, then we get O(n*n), i.e. O(n^2); and so on.

So, when F() is called for a number n, the number of times F() is called for a given number between 0 and n-1 grows as we approach 0.

As a first impression, it seems to me that if we put it in a visual way, drawing a unit per time F() is called for a given number, we get a sort of pyramid shape (that is, if we center the units horizontally). Something like this:

```
n *
n-1 **
n-2 ****
...
2 ***********
1 ******************
0 ***************************
```

Now, the question is, how fast is the base of this pyramid enlarging as n grows?

Let's take a real case, for instance F(6)

```
F(6) * <-- only once
F(5) * <-- only once too
F(4) **
F(3) ****
F(2) ********
F(1) **************** <-- 16
F(0) ******************************** <-- 32
```

In this doubling picture, F(0) gets called 32 times, which is 2^5, which for this sample case is 2^(n-1). (Strictly speaking, the doubling overestimates the true call counts, which grow like the Fibonacci numbers themselves, so this argument yields an upper bound rather than a tight one.)

Now, we want to know how many times F(x) gets called at all, and we can see the number of times F(0) is called is only a part of that.

If we mentally move all the *'s from F(6) to F(2) lines into F(1) line, we see that F(1) and F(0) lines are now equal in length. Which means, total times F() gets called when n=6 is 2x32=64=2^6.

Now, in terms of complexity:

```
O( F(6) ) = O(2^6)
O( F(n) ) = O(2^n)
```

There's a very nice discussion of this specific problem over at MIT. On page 5, they make the point that, if you assume that an addition takes one computational unit, the time required to compute Fib(N) is very closely related to the result of Fib(N).

As a result, you can skip directly to the very close approximation of the Fibonacci series:

```
Fib(N) = (1/sqrt(5)) * 1.618^(N+1) (approximately)
```

and say, therefore, that the worst case performance of the naive algorithm is

```
O((1/sqrt(5)) * 1.618^(N+1)) = O(1.618^(N+1))
```

PS: There is a discussion of the closed form expression of the Nth Fibonacci number over at Wikipedia if you'd like more information.

You can expand it and get a visualization:

```
T(n) = T(n-1) + T(n-2)
     < T(n-1) + T(n-1)
     = 2*T(n-1)
     = 2*2*T(n-2)
     = 2*2*2*T(n-3)
     ...
     = 2^i*T(n-i)
     ...
==> O(2^n)
```

It is bounded below by `2^(n/2)` (replace both terms with the smaller one, `T(n-2)`, to get `T(n) > 2*T(n-2)`, which unrolls to `2^(n/2)`) and above by `2^n` (as noted in other comments). And an interesting fact about that recursive implementation is that it has a tight asymptotic bound of Fib(n) itself. These facts can be summarized:

```
T(n) = Ω(2^(n/2)) (lower bound)
T(n) = O(2^n) (upper bound)
T(n) = Θ(Fib(n)) (tight bound)
```

The tight bound can be reduced further using its closed form if you like.