AVL Tree


An AVL Tree (named after its inventors Adelson-Velsky and Landis) is a self-balancing Binary Search Tree (BST). The key feature of an AVL tree is that it ensures that the heights of the two child subtrees of any node differ by no more than one. If the tree becomes unbalanced after an insertion or deletion, it performs rotations to restore balance.


What is an AVL Tree?

An AVL Tree is a binary search tree in which the difference between the heights of the left and right subtrees of any node (called the balance factor) is restricted to the values -1, 0, or 1. This balance constraint ensures that the tree remains approximately balanced, providing logarithmic time complexity for search, insertion, and deletion operations.

Key Properties of an AVL Tree:

  1. Balance Factor: The balance factor of any node is calculated as the height of the left subtree minus the height of the right subtree:

    Balance Factor = Height of Left Subtree − Height of Right Subtree

    The balance factor can only be -1, 0, or 1 for the tree to be balanced. For example, a node whose left subtree has height 2 and right subtree has height 1 has a balance factor of 1, which is allowed; a factor of 2 or -2 means the node must be rebalanced with a rotation.

  2. Height of Tree: The height of an AVL tree is kept logarithmic relative to the number of nodes (at most about 1.44 log2 n for n nodes), ensuring that the time complexity of operations remains optimal.

  3. Rotations: AVL trees perform rotations (single or double) to maintain balance after insertions and deletions.


AVL Tree Rotations

When the balance factor of a node becomes greater than 1 or less than -1 after an operation (insertion or deletion), that node is unbalanced. To restore balance, AVL trees use rotations. There are four rotation cases (a short sketch after the list below shows how they are distinguished):

  1. Left Rotation (Right-Right case):

    • Applied when a node is right-heavy because its right child's right subtree is too tall; a single left rotation restores balance.
  2. Right Rotation (Left-Left case):

    • Applied when a node is left-heavy because its left child's left subtree is too tall; a single right rotation restores balance.
  3. Left-Right Rotation (LR Rotation):

    • Applied when a node is left-heavy because of its left child's right subtree; a left rotation on the left child followed by a right rotation on the node restores balance.
  4. Right-Left Rotation (RL Rotation):

    • Applied when a node is right-heavy because of its right child's left subtree; a right rotation on the right child followed by a left rotation on the node restores balance.
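
As a minimal sketch (separate from the full implementation later in this article), the hypothetical helper below shows one common way to decide which of the four cases applies, using the balance factor of the unbalanced node together with the balance factor of its taller child:

def classify_rotation(balance, child_balance):
    # balance:       height(left) - height(right) of the unbalanced node
    # child_balance: balance factor of the taller child
    if balance > 1:                 # node is left-heavy
        if child_balance >= 0:
            return "Left-Left case: single right rotation"
        return "Left-Right case: left rotation on left child, then right rotation"
    if balance < -1:                # node is right-heavy
        if child_balance <= 0:
            return "Right-Right case: single left rotation"
        return "Right-Left case: right rotation on right child, then left rotation"
    return "balanced: no rotation needed"

print(classify_rotation(2, 1))     # Left-Left case
print(classify_rotation(-2, 1))    # Right-Left case

The full implementation later in this article reaches the same four cases during insertion by comparing the newly inserted key with the child's key rather than inspecting the child's balance factor.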

Operations in an AVL Tree

1. Insertion in an AVL Tree

Insertion into an AVL tree follows the same steps as insertion into a regular binary search tree, but afterwards the tree must be rebalanced if the balance factor of any node on the path from the new node back to the root falls outside -1, 0, or 1.

Steps for Insertion:

  1. Insert the node as you would in a standard BST.
  2. After insertion, update the height of each node.
  3. Calculate the balance factor of each node, starting from the inserted node up to the root.
  4. If any node’s balance factor is out of range (-1, 0, or 1), perform the appropriate rotation.

2. Deletion in an AVL Tree

In AVL trees, deleting a node is similar to deletion in a standard BST, but after deletion the tree may need to be rebalanced; a sketch of such a delete method follows the steps below.

Steps for Deletion:

  1. Remove the node as you would in a standard BST.
  2. Update the height of each node after deletion.
  3. Calculate the balance factor and perform rotations if needed to restore balance.
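
The code example later in this article implements insertion and traversal only, so the following is a minimal sketch of a delete method that could be added to that AVLTree class. It reuses the class's getHeight, getBalance, leftRotate, and rightRotate methods; getMinValueNode is a hypothetical helper introduced here to locate the in-order successor.

    # Sketch: methods that could be added to the AVLTree class shown later.
    def getMinValueNode(self, root):
        # Leftmost node of a subtree (position of the in-order successor).
        current = root
        while current.left:
            current = current.left
        return current

    def delete(self, root, key):
        # Step 1: standard BST deletion.
        if not root:
            return root
        if key < root.value:
            root.left = self.delete(root.left, key)
        elif key > root.value:
            root.right = self.delete(root.right, key)
        else:
            if not root.left:          # zero or one child: splice the node out
                return root.right
            if not root.right:
                return root.left
            # Two children: copy the in-order successor's value, then delete it.
            successor = self.getMinValueNode(root.right)
            root.value = successor.value
            root.right = self.delete(root.right, successor.value)

        # Step 2: update the height of the current node.
        root.height = 1 + max(self.getHeight(root.left), self.getHeight(root.right))

        # Step 3: rebalance using the balance factors of the node and its children.
        balance = self.getBalance(root)
        if balance > 1 and self.getBalance(root.left) >= 0:    # Left-Left
            return self.rightRotate(root)
        if balance > 1 and self.getBalance(root.left) < 0:     # Left-Right
            root.left = self.leftRotate(root.left)
            return self.rightRotate(root)
        if balance < -1 and self.getBalance(root.right) <= 0:  # Right-Right
            return self.leftRotate(root)
        if balance < -1 and self.getBalance(root.right) > 0:   # Right-Left
            root.right = self.rightRotate(root.right)
            return self.leftRotate(root)
        return root

Unlike insertion, a single deletion can require rotations at more than one ancestor, which is why the rebalancing step runs at every node on the path back to the root.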

3. Search in an AVL Tree

Searching in an AVL tree is the same as in a regular binary search tree, where you compare the value with the current node and recursively traverse either the left or right subtree.
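
Search is not included in the code example later in this article; as a small sketch, it could be added to the same AVLTree class like this:

    # Sketch: a search method that could be added to the AVLTree class shown later.
    def search(self, root, key):
        # Standard BST lookup; the AVL balance guarantees O(log n) depth.
        if root is None or root.value == key:
            return root
        if key < root.value:
            return self.search(root.left, key)
        return self.search(root.right, key)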

4. Traversal in an AVL Tree

AVL trees support standard binary tree traversals (a small level-order sketch follows this list), including:

  • In-order Traversal: Traverse the left subtree, visit the node, then traverse the right subtree.
  • Pre-order Traversal: Visit the node, traverse the left subtree, then the right subtree.
  • Post-order Traversal: Traverse the left subtree, traverse the right subtree, then visit the node.
  • Level-order Traversal: Visit the nodes level by level from top to bottom.
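
In-order traversal is implemented in the code example below; the other depth-first traversals follow the same recursive pattern. As a small sketch, a level-order traversal over the same kind of node (assuming nodes with left, right, and value attributes, as in the Node class below) could use a queue:

from collections import deque

def level_order_traversal(root):
    # Visit nodes level by level, left to right, using a FIFO queue.
    if not root:
        return
    queue = deque([root])
    while queue:
        node = queue.popleft()
        print(node.value, end=" ")
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)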

Code Example: Implementing an AVL Tree in Python

Here's a Python implementation of an AVL Tree that supports insertion and in-order traversal (the deletion, search, and level-order sketches above can be added to the same class):

class Node:
    def __init__(self, key):
        self.left = None
        self.right = None
        self.value = key
        self.height = 1

class AVLTree:
    # Insertion
    def insert(self, root, key):
        if not root:
            return Node(key)

        if key < root.value:
            root.left = self.insert(root.left, key)
        elif key > root.value:
            root.right = self.insert(root.right, key)
        else:
            # Duplicate keys are not inserted; return the unchanged subtree.
            return root

        root.height = 1 + max(self.getHeight(root.left), self.getHeight(root.right))

        balance = self.getBalance(root)

        # Left heavy
        if balance > 1 and key < root.left.value:
            return self.rightRotate(root)

        # Right heavy
        if balance < -1 and key > root.right.value:
            return self.leftRotate(root)

        # Left-right case
        if balance > 1 and key > root.left.value:
            root.left = self.leftRotate(root.left)
            return self.rightRotate(root)

        # Right-left case
        if balance < -1 and key < root.right.value:
            root.right = self.rightRotate(root.right)
            return self.leftRotate(root)

        return root

    # Left Rotate
    def leftRotate(self, z):
        y = z.right
        T2 = y.left

        y.left = z
        z.right = T2

        z.height = 1 + max(self.getHeight(z.left), self.getHeight(z.right))
        y.height = 1 + max(self.getHeight(y.left), self.getHeight(y.right))

        return y

    # Right Rotate
    def rightRotate(self, z):
        y = z.left
        T3 = y.right

        y.right = z
        z.left = T3

        z.height = 1 + max(self.getHeight(z.left), self.getHeight(z.right))
        y.height = 1 + max(self.getHeight(y.left), self.getHeight(y.right))

        return y

    # Get height of a node
    def getHeight(self, root):
        if not root:
            return 0
        return root.height

    # Get balance factor of a node
    def getBalance(self, root):
        if not root:
            return 0
        return self.getHeight(root.left) - self.getHeight(root.right)

    # In-order Traversal
    def inorderTraversal(self, root):
        if root:
            self.inorderTraversal(root.left)
            print(root.value, end=" ")
            self.inorderTraversal(root.right)

# Example Usage
avl = AVLTree()
root = None

# Inserting nodes
nodes = [10, 20, 30, 40, 50, 25]
for node in nodes:
    root = avl.insert(root, node)

# In-order Traversal
print("In-order Traversal of AVL Tree:")
avl.inorderTraversal(root)

Output:

In-order Traversal of AVL Tree:
10 20 25 30 40 50

Applications of AVL Trees

  1. Efficient Searching: Since AVL trees are always balanced, they guarantee O(log n) time complexity for search operations, making them ideal for applications that require fast lookups.

  2. Database Indexing: AVL trees are used in database indexing to maintain sorted data with quick access, insertion, and deletion.

  3. Memory Management: AVL trees are used in memory management algorithms where quick allocation and deallocation of memory blocks are necessary.

  4. Priority Queues: AVL trees can be used to implement priority queues, ensuring that elements are processed in order of priority.

  5. Frequently Updated Data: because AVL trees are self-balancing, they remain efficient even after many insertions and deletions, making them well suited to data sets that change frequently.