AVL Trees

Recall that balanced BSTs maintain $h = O(\log n)$, so all operations run in $O(\log n)$ time.

Self-balancing BSTs are data structures that automatically restructure themselves to maintain the balance property as elements are inserted and removed.

One of the most important operations on most self-balancing binary trees is the rotation, which locally restructures the tree while preserving the BST ordering.
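As a minimal sketch (the `Node` class and the function names `rotate_left`/`rotate_right` are mine, not from the text; height bookkeeping is omitted here for clarity), the two rotations look like this:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def rotate_left(x):
    """Rotate the subtree rooted at x to the left; returns the new subtree root."""
    y = x.right          # y moves up to become the new root
    x.right = y.left     # y's left subtree is handed over to x
    y.left = x
    return y

def rotate_right(x):
    """Mirror image of rotate_left."""
    y = x.left
    x.left = y.right
    y.right = x
    return y
```

Note that an inorder traversal of the subtree is the same before and after either rotation, which is exactly why rotations are safe to use for rebalancing.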

Some of the popular data structures that implement self-balancing BSTs include:

2-3 trees

AVL trees

Red-black trees

In this section, we’ll discuss AVL trees.

AVL: Adel’son-Vel’skii & Landis

AVL trees, invented in 1962, are the earliest example of a self-balancing binary search tree.

AVL trees satisfy the height-balance property: for any node $n$, the heights of $n$’s left and right subtrees can differ by at most 1.

To make the math easier, we define each null node to have height $-1$, so a leaf has height 0. This convention simplifies the balancing rules.

As part of data structure augmentation, each node stores its height. Alternatively, each node can store just the difference between its children’s heights (its balance factor).

We will call a node unbalanced if its left and right subtrees differ in height by more than 1.
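The height convention and the balance check above can be sketched as follows (the helper names `height`, `update_height`, and `is_unbalanced` are mine):

```python
def height(node):
    # Null nodes have height -1, so a leaf ends up with height 0.
    return node.height if node is not None else -1

class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right
        self.height = 1 + max(height(left), height(right))

def update_height(node):
    """Recompute a node's stored height from its children."""
    node.height = 1 + max(height(node.left), height(node.right))

def is_unbalanced(node):
    """True if the node violates the height-balance property."""
    return abs(height(node.left) - height(node.right)) > 1
```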
AVL Find

The find operation for an AVL tree is exactly the same as for a binary search tree. Since an AVL tree is guaranteed to be balanced, find always takes $\Theta(\log n)$ time in the worst case, where $n$ is the number of elements in the AVL tree.
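A sketch of the search (assuming a `Node` class with `key`, `left`, and `right` fields; the identical code works for a plain BST):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def find(node, key):
    """Iterative BST search; the balance invariant caps the loop at O(log n) steps."""
    while node is not None:
        if key < node.key:
            node = node.left
        elif key > node.key:
            node = node.right
        else:
            return node
    return None
```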
AVL Insert

To insert into an AVL tree,

Insert the new key as into a plain BST.

Work your way up the tree toward the root, updating heights and restoring the AVL property (with rotations) as you go.


Take a look at the slides for an example of inserting a sequence of numbers.
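The two steps above can be sketched as a recursive insert (a hypothetical implementation; the names and the `rebalance` helper are mine, and null nodes count as height $-1$ per the convention above):

```python
def height(n):
    return n.height if n is not None else -1

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.height = 0  # a fresh leaf

def update(n):
    n.height = 1 + max(height(n.left), height(n.right))

def rotate_left(x):
    y = x.right
    x.right, y.left = y.left, x
    update(x); update(y)
    return y

def rotate_right(x):
    y = x.left
    x.left, y.right = y.right, x
    update(x); update(y)
    return y

def rebalance(n):
    """Fix a single unbalanced node with one or two rotations."""
    if height(n.left) - height(n.right) > 1:          # left-heavy
        if height(n.left.left) < height(n.left.right):
            n.left = rotate_left(n.left)              # left-right case
        return rotate_right(n)
    if height(n.right) - height(n.left) > 1:          # right-heavy
        if height(n.right.right) < height(n.right.left):
            n.right = rotate_right(n.right)           # right-left case
        return rotate_left(n)
    return n

def insert(root, key):
    """Step 1: plain BST insert. Step 2: update heights and rebalance on the way up."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    update(root)
    return rebalance(root)
```

Inserting the keys 1 through 7 in order, for example, repeatedly triggers right-right rotations and leaves a perfectly balanced tree rooted at 4.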
AVL Remove

Removing a node from an AVL tree falls into one of three cases:

The node is a leaf. This is easy. Simply delete the node.

The node has just one child. This is just as easy. Simply delete the node and connect the parent of the deleted node to the child of the deleted node.

The node has two children. This is a bit more complicated.
We define the inorder predecessor of a node as the previous node in an inorder traversal of the tree. To find the inorder predecessor, go to the left subtree and take the rightmost node.
We define the inorder successor of a node as the next node in an inorder traversal of the tree. To find the inorder successor, go to the right subtree and take the leftmost node.
To delete a node that has two children, replace its value with that of either its inorder predecessor or its inorder successor, then delete that predecessor or successor node; since that node has at most one child, its removal falls into one of the first two cases.
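The two lookups used by the third case can be sketched as follows (helper names are mine; each assumes the relevant subtree exists, which is guaranteed when the node has two children):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def inorder_predecessor(node):
    """Rightmost node of node's left subtree."""
    node = node.left
    while node.right is not None:
        node = node.right
    return node

def inorder_successor(node):
    """Leftmost node of node's right subtree."""
    node = node.right
    while node.left is not None:
        node = node.left
    return node
```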


After deleting from an AVL tree, you might need to rebalance it; a single removal can require rotations at several ancestors on the way back up to the root.
AVL Tree Performance

For an AVL tree with $N$ nodes, searching, insertion and deletion each take $O(\log N)$ time.

The length of the path from the root to the deepest leaf in an AVL tree (i.e., its height) is at most $\sim 1.44 \log_2(N + 2)$.
Disadvantages of AVL Trees

As you can see, AVL trees are difficult to implement.

In addition, AVL trees have high constant factors for some operations. For example, restructuring is an expensive operation, and an AVL tree may have to perform up to $\log_2 n$ restructurings in the worst case when removing a node.

Most STL implementations of the ordered associative containers (sets, multisets, maps, and multimaps) use red-black trees instead of AVL trees. Unlike AVL trees, red-black trees require only a constant number of restructurings for a removal or an insertion.