AVL Tree in Python

An AVL Tree is a self-balancing Binary Search Tree (BST) where the difference between the heights of left and right subtrees of any node cannot be more than one. While performing operations, if at any time they differ by more than one, rebalancing is performed to restore this property.

For example, consider a node with the value 50 whose left subtree has a height of 2 and whose right subtree has a height of 1. The difference between the two heights is 1, so the node is balanced.

If this difference (the balance factor) is at most 1 in absolute value for every node of the tree, the tree satisfies the fundamental property of an AVL Tree.

Following are some properties of an AVL Tree that make it more advantageous than a binary search tree:

  • The height of an AVL Tree always stays small: if N is the number of nodes in the tree, the height never grows beyond O(logN), as the sketch after this list demonstrates.
  • An AVL Tree therefore gives a better guaranteed search time than a plain Binary Search Tree, which can degenerate to a height of N when keys arrive in sorted order.
  • Because AVL trees are self-balancing, operations such as insertion, deletion, and searching remain efficient.
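
The logarithmic-height claim is easy to check empirically. The sketch below assumes the AVLNode class, the insertNode function, and the getHeight helper developed later in this article, and inserts 1000 keys in sorted order – the worst case for a plain BST.

#A quick check of the logarithmic height property
#(assumes AVLNode, insertNode and getHeight as defined later in this article)
root = None
for key in range(1, 1001):
    root = insertNode(root, key)

#A plain BST built from sorted keys would reach height 1000;
#the AVL tree stays close to log2(1000), i.e. around 10
print(getHeight(root))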

Creation of AVL Trees

To initialize an AVL tree, we create an AVLNode class having four components – a data value, a reference to the left child node, a reference to the right child node, and a height variable. The height variable is used for checking and maintaining the self-balancing property of the AVL tree.

While implementing the various operations of an AVL tree, we will use a queue data structure built using a linked list. We have a dedicated article on queues built using linked lists; for a detailed explanation of the Queue class below, refer to that article.

#Creating the Queue class using linked list
class Node:
    def __init__(self, value=None):
        self.value = value
        self.next = None

    def __str__(self):
        return str(self.value)


class LinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def __iter__(self):
        node = self.head
        while node:
            yield node
            node = node.next

class Queue:
    def __init__(self):
        self.linkedList = LinkedList()

    def __str__(self):
        values = [str(x) for x in self.linkedList]
        return ' '.join(values)

    def enqueue(self, value):
        new_node = Node(value)
        if self.linkedList.head == None:
            self.linkedList.head = new_node
            self.linkedList.tail = new_node
        else:
            self.linkedList.tail.next = new_node
            self.linkedList.tail = new_node

    def isEmpty(self):
        if self.linkedList.head == None:
            return True
        else:
            return False

    def dequeue(self):
        if self.isEmpty():
            return "The Queue is empty."
        else:
            tempNode = self.linkedList.head
            if self.linkedList.head == self.linkedList.tail:
                self.linkedList.head = None
                self.linkedList.tail = None
            else:
                self.linkedList.head = self.linkedList.head.next
            return tempNode

    def peek(self):
        if self.isEmpty():
            return "The Queue is empty."
        else:
            return self.linkedList.head

    def delete(self):
        self.linkedList.head = None
        self.linkedList.tail = None
#Creating the AVL Tree Class
#The Queue class above is assumed to be saved in a file named QueueLinkedList.py
import QueueLinkedList as queue

class AVLNode:
    def __init__(self, data):
        self.data = data
        self.leftChild = None
        self.rightChild = None
        self.height = 1

newTree = AVLNode(None)

Time and Space Complexity

The time complexity for the creation of an AVL tree is O(1) because it takes constant time for the initialization of data variables. The space complexity is O(1) as well since no additional memory is required.

Operations on AVL Tree

Insertion in AVL Tree

Since an AVL Tree is a type of Binary Tree, each new element inserted in an AVL Tree needs to be placed in a specific location in the tree. However, this might disturb the balance of the AVL Tree. To maintain its self-balancing property, insertion in an AVL Tree results in one of two cases:

  • Rotation is not required.
  • Rotation is required.

There are four imbalance conditions that require rotation in AVL Trees; the sketch after the list summarizes how each case is detected:

  • LL (Left-Left) condition
  • LR (Left-Right) condition
  • RR (Right-Right) condition
  • RL (Right-Left) condition
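
The following sketch summarizes how these four cases are usually detected. It assumes the getBalance helper defined later in this article, and classifyImbalance is a hypothetical name used purely for illustration – insertNode later in this article performs the same checks inline.

#Sketch: classifying which rotation an unbalanced node needs after inserting node_value
def classifyImbalance(root_node, node_value):
    balance = getBalance(root_node)
    if balance > 1 and node_value < root_node.leftChild.data:
        return "LL: perform a right rotation"
    if balance > 1 and node_value > root_node.leftChild.data:
        return "LR: left-rotate the left child, then right-rotate the node"
    if balance < -1 and node_value > root_node.rightChild.data:
        return "RR: perform a left rotation"
    if balance < -1 and node_value < root_node.rightChild.data:
        return "RL: right-rotate the right child, then left-rotate the node"
    return "balanced: no rotation needed"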

LL – Left Left Condition while Insertion

An AVL tree may become unbalanced if a node is inserted in the left subtree of the left subtree. When the Left Left Condition is encountered, we can maintain the balance of the tree by performing a right rotation operation.

Algorithm of Right Rotation

  • Create a variable newRoot. Assign it the left child of the disbalanced node.
  • Set the left child of the disbalanced node to the right child of newRoot.
  • Set the right child of newRoot as the disbalanced node.
  • Update the heights of the disbalanced node and the newRoot node.
#Implementing Right Rotation
def rightRotate(disbalanceNode):
    newRoot = disbalanceNode.leftChild
    disbalanceNode.leftChild = disbalanceNode.leftChild.rightChild
    newRoot.rightChild = disbalanceNode
    disbalanceNode.height = 1 + max(getHeight(disbalanceNode.leftChild), getHeight(disbalanceNode.rightChild))
    newRoot.height = 1 + max(getHeight(newRoot.leftChild), getHeight(newRoot.rightChild))
    return newRoot
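
As a small usage sketch (assuming the getHeight helper defined in the insertion section below), the following builds the left-left chain 30 -> 20 -> 10 by hand and repairs it with rightRotate; the node names are introduced here purely for illustration.

#Sketch: repairing a manually built left-left chain with rightRotate
#(assumes getHeight as defined in the insertion section below)
node30 = AVLNode(30)
node30.leftChild = AVLNode(20)
node30.leftChild.leftChild = AVLNode(10)

newRoot = rightRotate(node30)
print(newRoot.data)             #20
print(newRoot.leftChild.data)   #10
print(newRoot.rightChild.data)  #30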

Time and Space Complexity

The time complexity to tackle the LL condition is O(1) as it takes constant time to initialize data to a variable. The space complexity is O(1) as well because no additional memory is required.

RR – Right Right Condition while Insertion

An AVL tree may become unbalanced if a node is inserted in the right subtree of the right subtree. When the Right Right Condition is encountered, we can maintain the balance of the tree by performing a left rotation operation.

Algorithm of Left Rotation

  • Create a variable newRoot. Assign it the right child of the disbalanced node.
  • Set the right child of the disbalanced node to the left child of newRoot.
  • Set the left child of newRoot as the disbalanced node.
  • Update the heights of the disbalanced node and the newRoot node.
#Implementing Left Rotation
def leftRotate(disbalanceNode):
    newRoot = disbalanceNode.rightChild
    disbalanceNode.rightChild = disbalanceNode.rightChild.leftChild
    newRoot.leftChild = disbalanceNode
    disbalanceNode.height = 1 + max(getHeight(disbalanceNode.leftChild), getHeight(disbalanceNode.rightChild))
    newRoot.height = 1 + max(getHeight(newRoot.leftChild), getHeight(newRoot.rightChild))
    return newRoot

LR – Left Right Condition while Insertion

A left-right rotation is a combination of left rotation followed by a right rotation. An AVL tree may become unbalanced if a node is inserted in the right subtree of the left subtree. We can maintain its self-balancing property by performing a left-right rotation.

Algorithm of Left-Right (LR) Rotation

  • Perform a left rotation on the left child of the disbalanced node.
  • Perform a right rotation on the disbalanced node; the sketch below combines the two steps.
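
Using the leftRotate and rightRotate functions defined above, the two steps can be combined into a small helper. This is only a sketch; handleLeftRight is a hypothetical name introduced here for illustration, and the same two lines appear inline inside insertNode later in this article.

#Sketch: handling the LR condition with the rotations defined above
def handleLeftRight(disbalanceNode):
    disbalanceNode.leftChild = leftRotate(disbalanceNode.leftChild)
    return rightRotate(disbalanceNode)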

Time and Space Complexity

The time complexity to tackle the LR condition is O(1) as it takes constant time to initialize data to a variable. The space complexity is O(1) as well because no additional memory is required.

RL – Right Left Condition while Insertion

A right-left rotation is a combination of a right rotation followed by a left rotation. An AVL tree may become unbalanced if a node is inserted in the left subtree of the right subtree. We can maintain its self-balancing property by performing a right-left rotation.

Algorithm of Right-Left (RL) Rotation

  • Perform a right rotation on the right child of the disbalanced node.
  • Perform a left rotation on the disbalanced node.

Time and Space Complexity

The time complexity to tackle the RL condition is O(1) as it takes constant time to initialize data to a variable. The space complexity is O(1) as well because no additional memory is required.

Insertion in AVL Tree (Combining all conditions)

While inserting new elements into an AVL Tree, we have to make sure that no node remains unbalanced. After inserting each element in its rightful position, we check whether the balance of any node along the insertion path has been disturbed.

If it has, we determine which of the four imbalance conditions applies and restore the fundamental property of the AVL tree with the corresponding rotation. This is how the insertion implementation below maintains the self-balancing property.

For the insertion function to be fully operational, we’ll need two additional functions – getHeight, which returns the height of the current node, and getBalance, which returns the difference between the heights of the left and right subtrees of a given node.

#Implementing Insertion in an AVL Tree
def getHeight(root_node):
    if not root_node:
        return 0
    return root_node.height

def getBalance(root_node):
    if not root_node:
        return 0
    return getHeight(root_node.leftChild) - getHeight(root_node.rightChild)

def insertNode(root_node, node_value):
    if not root_node:
        return AVLNode(node_value)
    elif node_value < root_node.data:
        root_node.leftChild = insertNode(root_node.leftChild, node_value)
    else:
        root_node.rightChild = insertNode(root_node.rightChild, node_value)
    
    root_node.height = 1 + max(getHeight(root_node.leftChild), getHeight(root_node.rightChild))
    balance = getBalance(root_node)
    #Left Left condition: a right rotation rebalances the node
    if balance > 1 and node_value < root_node.leftChild.data:
        return rightRotate(root_node)
    #Left Right condition: left-rotate the left child, then right-rotate the node
    if balance > 1 and node_value > root_node.leftChild.data:
        root_node.leftChild = leftRotate(root_node.leftChild)
        return rightRotate(root_node)
    #Right Right condition: a left rotation rebalances the node
    if balance < -1 and node_value > root_node.rightChild.data:
        return leftRotate(root_node)
    #Right Left condition: right-rotate the right child, then left-rotate the node
    if balance < -1 and node_value < root_node.rightChild.data:
        root_node.rightChild = rightRotate(root_node.rightChild)
        return leftRotate(root_node)
    return root_node

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

#levelOrderTraversal is defined in the Level Order Traversal section below
levelOrderTraversal(newTree)

Output

70
30
90
20
50
80
100
40

Time and Space Complexity

The recursive function searches for the correct position by dividing the subtrees at each level and moving on to the next level depending on the value of the node. In this way, the number of function calls takes the form of logN. Hence, the time complexity for insertion is O(logN).

The space complexity is O(logN) as well because every time the function is called, it allocates space in the stack in memory to determine the next function call.

Traversal in AVL Tree

Traversal is the process of visiting every node in the tree, exactly once, to print values, search nodes, etc. For an AVL tree, traversal can be implemented in any of the following ways:

  • Depth First Traversals:
    • Inorder (Left, Root, Right)
    • Preorder (Root, Left, Right)
    • Postorder (Left, Right, Root) 
  • Breadth-First Traversal:
    • Level Order Traversal

PreOrder Traversal in AVL Tree

Pre-order traversal visits the current node before its child nodes. In a pre-order traversal, the root is always the first node visited.

Example:

PreOrder Traversal – 70 -> 50 -> 30 -> 20 -> 40 -> 60 -> 90 -> 80 -> 100

Implementation of PreOrder Traversal in AVL Tree

For implementing pre-order traversal in an AVL tree, the below code makes use of a recursive function preOrderTraversal. Initially, the root node is passed as an argument to this function.

If the root node is not empty, its content is displayed followed by calling the same recursive function with the left and right of the current node. The function preOrderTraversal terminates a recursion when the encountered node is found to be empty.

#Function to implement PreOrder Traversal
def preOrderTraversal(root_node):
    if not root_node:
        return
    print(root_node.data)
    preOrderTraversal(root_node.leftChild)
    preOrderTraversal(root_node.rightChild)

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

preOrderTraversal(newTree)

Output

70
30
20
50
40
90
80
100

Time and Space Complexity

The time complexity for PreOrder traversal is O(N) because the function recursively visits all nodes of the tree. When we use a recursive function, it holds up memory in the stack so as to recognize its next call. Thus, the space complexity for PreOrder traversal is O(N).

InOrder Traversal in AVL Tree

In-order traversal means to visit the left branch, then the current node, and finally, the right branch.

Example:

InOrder Traversal – 20 -> 30 -> 40 -> 50 -> 60 -> 70 -> 80 -> 90 -> 100

Implementation of InOrder Traversal in AVL Tree

For implementing in-order traversal in an AVL tree, the below code makes use of a recursive function inOrderTraversal. Initially, the root node is passed as an argument to this function.

If the root node is not empty, a recursive call to the left child is made, followed by displaying the content of the current node, and then a recursive call to the right child is made. The function inOrderTraversal terminates a recursion when the encountered node is found to be empty.

#Function to implement InOrder Traversal
def inOrderTraversal(root_node):
    if not root_node:
        return
    inOrderTraversal(root_node.leftChild)
    print(root_node.data)
    inOrderTraversal(root_node.rightChild)

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

inOrderTraversal(newTree)

Output

20
30
40
50
70
80
90
100

Time and Space Complexity

The time complexity for InOrder traversal is O(N) because the function recursively visits all nodes of the tree. The space complexity for InOrder traversal is O(N) because the stack holds memory continuously while using a recursive function.

PostOrder Traversal in AVL Tree

Post-order traversal means visiting the left branch, then the right branch, and finally, the current node. In a post-order traversal, the root is always the last node visited.

Example:

PostOrder Traversal – 20 -> 40 -> 30 -> 60 -> 50 -> 80 -> 100 -> 90 -> 70

Implementation of PostOrder Traversal in AVL Tree

For implementing post-order traversal in an AVL tree, the below code makes use of a recursive function postOrderTraversal. Initially, the root node is passed as an argument to this function.

If the root node is not empty, a recursive call to the left child is made, followed by a recursive call to the right child, and then the content of the current node is displayed. The function postOrderTraversal terminates the recursion when the encountered node is found to be empty.

#Function to implement PostOrder Traversal
def postOrderTraversal(root_node):
    if not root_node:
        return
    postOrderTraversal(root_node.leftChild)
    postOrderTraversal(root_node.rightChild)
    print(root_node.data)

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

postOrderTraversal(newTree)

Output

20
40
50
30
80
100
90
70

Time and Space Complexity

The time complexity for PostOrder traversal is O(N) because the function recursively visits all nodes of the tree. The space complexity for PostOrder traversal is O(N) because the stack holds memory continuously while using a recursive function.

Level Order Traversal in AVL Tree

Trees can also be traversed in a level order manner. Level order traversal is a breadth-first traversal in which nodes are visited level by level, from left to right within each level.

Example:

Level Order Traversal – 70 -> 50 -> 90 -> 30 -> 60 -> 80 -> 100 -> 20 -> 40

Implementation of Level Order Traversal in AVL Tree

For implementing level-order traversal in an AVL tree, the below code makes use of a queue to implement the same. The function levelOrderTraversal is initially called with the root node as its parameter.

If the root node is not empty, it is enqueued, and the queue is then processed as follows:

  • The first element is dequeued and its content is displayed.
  • The children of this node are enqueued into the same queue.
  • The same procedure is implemented for all nodes that enter the queue.
  • The function terminates when every element is dequeued.
#Function to implement LevelOrder Traversal
def levelOrderTraversal(root_node):
    if not root_node:
        return
    else:
        customQueue = queue.Queue()
        customQueue.enqueue(root_node)
        while not(customQueue.isEmpty()):
            root = customQueue.dequeue()
            print(root.value.data)
            if root.value.leftChild is not None:
                customQueue.enqueue(root.value.leftChild)
            if root.value.rightChild is not None:
                customQueue.enqueue(root.value.rightChild)

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

levelOrderTraversal(newTree)

Output

70
30
90
20
50
80
100
40

Time and space Complexity

The time complexity for Level Order traversal is O(N) because the function visits all nodes of the tree. The space complexity for Level Order traversal is O(N) because we store the data of the tree in a temporary queue.

Searching for a node in AVL Tree

In an AVL Tree, the left child of a node has a value lesser than that node and the right child has a greater value. Thus, while searching, we compare the value to be searched with that of the node we visit.

If the value is less than that of the node, we traverse the left subtree. Otherwise, we traverse the right subtree. This optimizes the time utilization as we divide the tree at each level during our search.

Implementation of Searching in an AVL Tree

To implement searching a node in an AVL tree, we create a function searchNode that takes two arguments – the root node of the tree and the value to be searched.

The recursive function compares the value of the current node with the value to be searched, and then calls itself on the left or the right child depending on whether the target value is smaller or greater than the current node. If an empty subtree is reached, the value is not present in the tree.

#Searching a node in an AVL Tree
def searchNode(root_node, node_value):
    if root_node is None:
        print("The element could not be found.")
    elif root_node.data == node_value:
        print("The element has been found.")
    elif node_value < root_node.data:
        searchNode(root_node.leftChild, node_value)
    else:
        searchNode(root_node.rightChild, node_value)

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

searchNode(newTree, 90)

#Output
The element has been found.

Time and space Complexity

The recursive function searches for the correct node by dividing the subtrees at each level and moving on to the next level depending on the value of the node. In this way, the number of function calls takes the form of logN. Hence, the time complexity for search is O(logN).

The space complexity is O(logN) as well because every time the function is called, it allocates space in the stack in memory to determine the next function call.

Deletion of a node in AVL Tree

The deletion of an element from an AVL Tree is a sequential process. We must take into account whether the node to be deleted is a leaf node or an internal node; the node is then removed and its children, if any, are reattached to the rest of the tree.

However, this might disturb the balance of the AVL Tree. To maintain its self-balancing property, we perform the four rotations on the unbalanced nodes as the situation requires. During deletion, any of the following three cases can be encountered:

  • The AVL Tree doesn’t exist
  • Rotation is required.
  • Rotation is not required.

In addition to the rotation functions, we need a helper function getMinValueNode, which returns the node with the smallest value in a given subtree. It is used to find the in-order successor of a node that has two children.

#Deleting a node from an AVL Tree
def getMinValueNode(root_node):
    if root_node is None or root_node.leftChild is None:
        return root_node
    return getMinValueNode(root_node.leftChild)

def deleteNode(root_node, node_value):
    if not root_node:
        return root_node
    elif node_value < root_node.data:
        root_node.leftChild = deleteNode(root_node.leftChild, node_value)
    elif node_value > root_node.data:
        root_node.rightChild = deleteNode(root_node.rightChild, node_value)
    else:
        if root_node.leftChild is None:
            temp = root_node.rightChild
            root_node = None
            return temp
        elif root_node.rightChild is None:
            temp = root_node.leftChild
            root_node = None
            return temp
        temp = getMinValueNode(root_node.rightChild)
        root_node.data = temp.data
        root_node.rightChild = deleteNode(root_node.rightChild, temp.data)
    root_node.height = 1 + max(getHeight(root_node.leftChild), getHeight(root_node.rightChild))
    balance = getBalance(root_node)
    #Left Left condition
    if balance > 1 and getBalance(root_node.leftChild) >= 0:
        return rightRotate(root_node)
    #Right Right condition
    if balance < -1 and getBalance(root_node.rightChild) <= 0:
        return leftRotate(root_node)
    #Left Right condition
    if balance > 1 and getBalance(root_node.leftChild) < 0:
        root_node.leftChild = leftRotate(root_node.leftChild)
        return rightRotate(root_node)
    #Right Left condition
    if balance < -1 and getBalance(root_node.rightChild) > 0:
        root_node.rightChild = rightRotate(root_node.rightChild)
        return leftRotate(root_node)
    
    return root_node

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

newTree = deleteNode(newTree, 30)

levelOrderTraversal(newTree)

Output

70
40
90
20
50
80
100

Time and space Complexity

The recursive function searches for the node to be deleted by dividing the subtrees at each level and moving on to the next level depending on the value of the node. In this way, the number of function calls takes the form of logN. Hence, the time complexity for deletion is O(logN).

The space complexity is O(logN) as well because every time the function is called, it allocates space in the stack in memory to determine the next function call.

Clearing of Entire AVL Tree

To clear an entire AVL tree, we simply set the root node’s data and its left and right child references to None. Once the subtrees are no longer referenced, Python’s garbage collector reclaims their memory, freeing the allocated space.

#Function to Delete Entire AVL Tree
def deleteAVL(root_node):
    root_node.data = None
    root_node.leftChild = None
    root_node.rightChild = None
    return "The AVL Tree has been successfully deleted."

#Initializing the AVL Tree
newTree = AVLNode(70)
newTree = insertNode(newTree, 50)
newTree = insertNode(newTree, 90)
newTree = insertNode(newTree, 30)
newTree = insertNode(newTree, 80)
newTree = insertNode(newTree, 100)
newTree = insertNode(newTree, 20)
newTree = insertNode(newTree, 40)

deleteAVL(newTree)

#Output
The AVL Tree has been successfully deleted.

Time and Space Complexity

The time complexity for deletion of the entire AVL tree is O(1) since it takes constant time for setting the references to None. The space complexity is O(1) as well since no additional memory is required for deletion.

Time and Space Complexity of AVL Tree

Operation                        Time Complexity    Space Complexity
Create AVL Tree                  O(1)               O(1)
Insert a node in AVL Tree        O(logN)            O(logN)
Traverse AVL Tree                O(N)               O(N)
Search for a node in AVL Tree    O(logN)            O(logN)
Delete a node from AVL Tree      O(logN)            O(logN)
Delete entire AVL Tree           O(1)               O(1)