Algorithms review 3 Amortized Analysis

These are my personal notes from algorithms courses.

Definition#

In a sequence of operations, the worst case does not occur in every operation: some operations may be cheap (even $O(1)$), while others may be expensive. A traditional worst-case per-operation analysis can therefore give an overly pessimistic bound. When the same operation takes different amounts of time, amortized analysis captures the runtime complexity more accurately: it gives the average performance of each operation in the worst case.

Note that amortized analysis is not average-case analysis. In average-case analysis we compute the expected cost of each operation over a distribution of inputs. As a more concrete example, consider an algorithm that performs 100 operations, where 99 of them take $O(1)$ and one takes $O(n^2)$. A worst-case per-operation bound would charge every operation $O(n^2)$. However, the sequence does not actually need that much time, since the cost of the one expensive operation can be spread (amortized) over the other 99 operations. Amortized analysis therefore gives a tighter upper bound on the cost of the whole sequence. There are generally three methods for performing amortized analysis (we only focus on the first two):

  1. The aggregate method computes an upper bound $T(n)$ on the total cost of $n$ operations. The amortized cost per operation is then $T(n)/n$. In this method every operation gets the same amortized cost, even if the sequence contains several types of operations.

  2. The accounting method (or the banker’s method) computes the individual cost of each operation. We assign different charges to the operations; some operations may be charged more or less than they actually cost. The amount we charge an operation is called its amortized cost.

  3. The potential method (or the physicist’s method).
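As a quick sketch of the accounting method, consider a dynamic array that doubles its capacity when full (the same scheme analyzed in the next section). The charge of 3 per insert is an assumption chosen to match the aggregate bound derived below:

$$\hat{c} = \underbrace{1}_{\text{write the new element}} + \underbrace{1}_{\text{copy itself later}} + \underbrace{1}_{\text{copy one old element}}$$

When an array of size $2^k$ doubles, the $2^{k-1}$ elements inserted since the last doubling each carry $2$ saved credits, which is exactly enough to pay for copying all $2^k$ elements. The bank therefore never goes negative, and the true total cost of $n$ inserts is bounded by $3n$.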

Unbounded Array#

The aggregate method#

We implement the idea of java.util.ArrayList in Java. We maintain an array of a fixed length limit and an internal index size. When we insert a value and the current index reaches array.length, we double the capacity and copy the existing elements into the new array. The table below records the insert number, the current size of the array, its new size after a resize, and the number of copies performed.

| Insert | Old size | New size | Copies |
| ------ | -------- | -------- | ------ |
| 1      | 1        | -        | -      |
| 2      | 1        | 2        | 1      |
| 3      | 2        | 4        | 2      |
| 4      | 4        | -        | -      |
| 5      | 4        | 8        | 4      |
| 6      | 8        | -        | -      |
| 7      | 8        | -        | -      |
| 8      | 8        | -        | -      |
| 9      | 8        | 16       | 8      |

This table shows that 9 inserts require $1 + 2 + 4 + 8 = 15$ copy operations. Therefore, the amortized cost of a single insert is the total cost $(9 + 15 = 24)$ divided by the $9$ inserts, which is about $2.67$.
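The doubling strategy described above can be sketched in Java as follows. This is a minimal illustration, not the actual java.util.ArrayList implementation; the class and field names are made up, and a `copies` counter is added so we can check the numbers in the table.

```java
// Minimal sketch of an unbounded array with doubling, for the analysis above.
public class UnboundedArray {
    private int[] data = new int[1]; // start with capacity 1
    private int size = 0;            // next free index
    public long copies = 0;          // counts element copies, for the analysis

    public void insert(int value) {
        if (size == data.length) {           // array is full: double the capacity
            int[] bigger = new int[2 * data.length];
            for (int i = 0; i < size; i++) { // copy every existing element over
                bigger[i] = data[i];
                copies++;
            }
            data = bigger;
        }
        data[size++] = value;                // the insert itself
    }

    public static void main(String[] args) {
        UnboundedArray a = new UnboundedArray();
        for (int i = 1; i <= 9; i++) a.insert(i);
        // 9 inserts trigger 1 + 2 + 4 + 8 = 15 copies, matching the table
        System.out.println(a.copies); // prints 15
    }
}
```

Running the `main` method reproduces the table's total of 15 copies for 9 inserts.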

Generalizing this pattern, assume we start with an array of size $1$ and perform $2^n + 1$ inserts. These inserts require $\sum\limits_{k=0}^{n}{2^k} = 2^{n+1} - 1$ copy operations, so the total work is $T(n) = (2^n + 1) + (2^{n+1} - 1) = 3\cdot 2^n$. We then compute the average cost per insert as a limit as the input size tends to infinity:
$$\lim_{n\to\infty}\frac{3\cdot2^n}{1+2^n} = 3$$
So the amortized cost of an insertion is a constant, $3 = O(1)$. This approach, which seeks an upper bound on the total running time of a sequence of operations, is called the aggregate method.
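We can check the formula $T(n) = 3\cdot 2^n$ empirically with a small simulation (a sketch under the same assumptions as above: capacity starts at 1, each write or copy costs one unit of work):

```java
// Empirical check of T(n) = 3 * 2^n: perform 2^n + 1 inserts into a
// size-1 doubling array and count writes plus copies.
public class AggregateCheck {
    public static void main(String[] args) {
        int n = 10;
        int capacity = 1, size = 0;
        long work = 0;                 // one unit per write or per copy
        long inserts = (1L << n) + 1;  // 2^n + 1 inserts
        for (long i = 0; i < inserts; i++) {
            if (size == capacity) {    // doubling copies `size` elements
                work += size;
                capacity *= 2;
            }
            size++;                    // the insert itself costs 1
            work++;
        }
        System.out.println(work);                    // prints 3072 = 3 * 2^10
        System.out.println((double) work / inserts); // close to 3
    }
}
```

For $n = 10$, the total work is $1025 + 2047 = 3072 = 3\cdot 2^{10}$, and the per-insert average $3072/1025 \approx 3$, as the limit predicts.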

Reference#

[1] Algorithms in Action, by V. Savvich, First Edition, 2019.