Sorting Algorithms Implemented in Python: Merge Sort, Bubble Sort, Quick Sort

Merge sort is a classic divide-and-conquer algorithm that efficiently sorts a list or array by dividing it into smaller sublists, sorting those sublists, and then merging them back together. Here’s a step-by-step explanation of how merge sort works, along with an example:

How Merge Sort Works

  1. Divide: Split the list into two halves.
  2. Conquer: Recursively sort each half.
  3. Combine: Merge the two sorted halves back together.

Detailed Steps

  1. Divide:
    • If the list is empty or has one element, it is already sorted. Return it as is.
    • Otherwise, split the list into two halves.
  2. Conquer:
    • Recursively apply the merge sort algorithm to each half.
  3. Combine:
    • Merge the two sorted halves into a single sorted list.

Example

Let’s sort the list [38, 27, 43, 3, 9, 82, 10] using merge sort.

  1. Divide:
    • Split into two halves: [38, 27, 43, 3] and [9, 82, 10]
  2. Conquer:
    • Recursively sort each half.
  3. Sort the first half [38, 27, 43, 3]:
    • Split into [38, 27] and [43, 3]
    • Sort [38, 27]:
      • Split into [38] and [27]
      • Both [38] and [27] are already sorted.
      • Merge [38] and [27] to get [27, 38]
    • Sort [43, 3]:
      • Split into [43] and [3]
      • Both [43] and [3] are already sorted.
      • Merge [43] and [3] to get [3, 43]
    • Merge [27, 38] and [3, 43] to get [3, 27, 38, 43]
  4. Sort the second half [9, 82, 10]:
    • Split into [9] and [82, 10]
    • Sort [82, 10]:
      • Split into [82] and [10]
      • Both [82] and [10] are already sorted.
      • Merge [82] and [10] to get [10, 82]
    • Merge [9] and [10, 82] to get [9, 10, 82]
  5. Combine:
    • Merge the sorted halves [3, 27, 38, 43] and [9, 10, 82]:
      • Compare the first elements of each list: 3 and 9.
      • 3 is smaller, so it goes first.
      • Next, compare 27 and 9.
      • 9 is smaller, so it goes next.
      • Continue comparing and merging the remaining elements.
    • The final sorted list is [3, 9, 10, 27, 38, 43, 82].

Merge Sort in Python

Here’s a Python implementation of merge sort:

def merge_sort(arr):
    if len(arr) <= 1:
        return arr

    # Divide the array into two halves
    mid = len(arr) // 2
    left_half = arr[:mid]
    right_half = arr[mid:]

    # Recursively sort each half
    left_sorted = merge_sort(left_half)
    right_sorted = merge_sort(right_half)

    # Merge the sorted halves
    return merge(left_sorted, right_sorted)

def merge(left, right):
    result = []
    i = 0
    j = 0

    # Merge the two halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= takes from the left on ties, which keeps merge sort stable
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1

    # Append any remaining elements
    result.extend(left[i:])
    result.extend(right[j:])

    return result

# Example usage
arr = [38, 27, 43, 3, 9, 82, 10]
sorted_arr = merge_sort(arr)
print(sorted_arr)

This code demonstrates the merge sort algorithm, dividing the list into smaller parts, sorting them, and merging them back together to get a fully sorted list. Note that the code splits at mid = len(arr) // 2, so the seven-element example is first split into [38, 27, 43] and [3, 9, 82, 10] rather than the 4/3 split used in the hand-worked walkthrough above; either split is valid and produces the same sorted result.

All Three Sorting Algorithms Compared

1. Bubble Sort

Bubble Sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements and swaps them if they are in the wrong order. The pass through the list is repeated until the list is sorted.

Time Complexity: O(n^2)
Space Complexity: O(1)
Stability: Yes

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

# Test Bubble Sort
arr = [64, 34, 25, 12, 22, 11, 90]
print("Bubble Sort Result:", bubble_sort(arr))

2. Quick Sort

Quick Sort is a highly efficient sorting algorithm based on partitioning: the array is split around a chosen value (the pivot) into one part holding values smaller than the pivot and another holding values greater than it, and each part is then sorted recursively.

Time Complexity:

  • Best: O(n log n)
  • Average: O(n log n)
  • Worst: O(n^2)

Space Complexity: O(log n) average for the recursion stack of an in-place version; the simple implementation below builds new sub-lists, so it uses O(n) extra space
Stability: No

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

# Test Quick Sort
arr = [64, 34, 25, 12, 22, 11, 90]
print("Quick Sort Result:", quick_sort(arr))

3. Merge Sort

Merge Sort is a divide-and-conquer algorithm that divides the unsorted list into n sublists, each containing one element, and then repeatedly merges sublists to produce new sorted sublists until there is only one sublist remaining.

Time Complexity: O(n log n)
Space Complexity: O(n)
Stability: Yes

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in their original order (stability)
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

# Test Merge Sort
arr = [64, 34, 25, 12, 22, 11, 90]
print("Merge Sort Result:", merge_sort(arr))

Comparisons

Let’s summarize the comparisons:

Algorithm      Time Complexity        Space Complexity   Stability   Practical Efficiency
Bubble Sort    O(n^2)                 O(1)               Yes         Not efficient
Quick Sort     O(n log n) / O(n^2)    O(log n)           No          Highly efficient
Merge Sort     O(n log n)             O(n)               Yes         Efficient

Practical Example and Comparison Results

Let’s test these algorithms on a larger dataset and compare their execution time.

import time
import random

# Generate a large array of random integers
large_arr = [random.randint(0, 10000) for _ in range(1000)]

# Measure execution time for Bubble Sort
start_time = time.time()
bubble_sort(large_arr.copy())
print("Bubble Sort Time:", time.time() - start_time)

# Measure execution time for Quick Sort
start_time = time.time()
quick_sort(large_arr.copy())
print("Quick Sort Time:", time.time() - start_time)

# Measure execution time for Merge Sort
start_time = time.time()
merge_sort(large_arr.copy())
print("Merge Sort Time:", time.time() - start_time)

When you run the above code, you will likely see that Bubble Sort takes significantly longer than Quick Sort and Merge Sort on large arrays, demonstrating the inefficiency of Bubble Sort for larger datasets. Quick Sort and Merge Sort should perform much faster, with Quick Sort typically being the fastest in practice due to its lower constant factors, despite its worst-case time complexity being O(n^2).

Note: The exact timing results can vary based on the hardware and specific implementation details.
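
For more repeatable numbers, you can use time.perf_counter (a higher-resolution clock than time.time) and average over a few runs. A minimal sketch, assuming the bubble_sort, quick_sort, and merge_sort functions defined above; the run count of 3 is an arbitrary illustration choice:

import time
import random

def time_sort(sort_function, data, runs=3):
    # Average wall-clock time over a few runs to smooth out noise
    total = 0.0
    for _ in range(runs):
        arr = data.copy()  # each run sorts a fresh, unsorted copy
        start = time.perf_counter()
        sort_function(arr)
        total += time.perf_counter() - start
    return total / runs

data = [random.randint(0, 10000) for _ in range(1000)]
for name, fn in [("Bubble Sort", bubble_sort), ("Quick Sort", quick_sort), ("Merge Sort", merge_sort)]:
    print(f"{name}: {time_sort(fn, data):.6f} s")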

Calculating Complexities in Recursive Algorithms

When calculating the complexities of recursive algorithms, the two main components to consider are:

  1. Recurrence Relation: This represents the time complexity of the recursive call itself.
  2. Base Case: This is the condition under which the recursion stops.

The overall time complexity can often be derived by solving the recurrence relation, which describes how the runtime of the function depends on the size of its input.

Example: Merge Sort

For example, let’s consider Merge Sort:

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

  1. Recurrence Relation: The merge sort function splits the array into two halves and recursively sorts each half. This gives us a recurrence relation of T(n) = 2T(n/2) + O(n), where T(n) is the time complexity of sorting an array of size n.
  2. Base Case: The base case is when the array has one or zero elements, which takes constant time, O(1).

Using the Master Theorem for divide-and-conquer recurrences of the form T(n) = aT(n/b) + f(n):

  • a = 2 (number of subproblems)
  • b = 2 (factor by which the problem size is divided)
  • f(n) = O(n) (cost outside the recursive calls, i.e., merging the two halves)

Here f(n) grows at the same rate as n^(log_b a) = n^1 = n, so case 2 of the Master Theorem applies and the recurrence solves to T(n) = O(n log n).
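
As a sanity check on this result, you can count the comparisons merge sort actually performs and compare the count with n log2 n. The merge_sort_count helper below is a throwaway instrumented copy written only for this experiment, not part of the implementation above:

import math
import random

def merge_sort_count(arr):
    # Returns (sorted list, number of comparisons performed)
    if len(arr) <= 1:
        return arr, 0
    mid = len(arr) // 2
    left, c1 = merge_sort_count(arr[:mid])
    right, c2 = merge_sort_count(arr[mid:])
    merged, c3, i, j = [], 0, 0, 0
    while i < len(left) and j < len(right):
        c3 += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, c1 + c2 + c3

for n in (1_000, 10_000, 100_000):
    data = [random.random() for _ in range(n)]
    _, comparisons = merge_sort_count(data)
    print(n, comparisons, round(n * math.log2(n)))

The comparison count stays within a small constant factor of n log2 n as n grows, which is consistent with the T(n) = O(n log n) solution.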

Example: Quick Sort

For Quick Sort, the recurrence relation can vary depending on the pivot choice:

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

  1. Recurrence Relation: T(n) = T(k) + T(n - k - 1) + O(n), where k is the number of elements smaller than the pivot.
  2. Base Case: When the array has one or zero elements, taking O(1) time.
  • Best and Average Case: When the pivot divides the array into two roughly equal halves, T(n) = 2T(n/2) + O(n), which solves to T(n) = O(n log n).
  • Worst Case: When the pivot is always the smallest or largest element, T(n) = T(0) + T(n-1) + O(n), which solves to T(n) = O(n^2); the sketch after this list shows how that happens.
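
The worst case is easy to provoke with a variant that always takes the first element as pivot. The function below is a hypothetical variant written only for this illustration (the implementation used elsewhere in this post picks the middle element, so it does not degrade on already-sorted input); it returns the recursion depth alongside the sorted list:

def quick_sort_first_pivot(arr, depth=0):
    # Always uses the first element as pivot (deliberately bad on sorted input)
    if len(arr) <= 1:
        return arr, depth
    pivot = arr[0]
    left = [x for x in arr[1:] if x < pivot]
    right = [x for x in arr[1:] if x >= pivot]
    left_sorted, left_depth = quick_sort_first_pivot(left, depth + 1)
    right_sorted, right_depth = quick_sort_first_pivot(right, depth + 1)
    return left_sorted + [pivot] + right_sorted, max(left_depth, right_depth)

already_sorted = list(range(500))
_, depth = quick_sort_first_pivot(already_sorted)
print("Recursion depth on sorted input:", depth)  # about n levels, O(n) work each -> O(n^2)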

Stability in Algorithms

A sorting algorithm is said to be stable if it preserves the relative order of records with equal keys. In other words, two equal elements will appear in the same order in the sorted output as they appear in the input.

Why Some Algorithms Are Not Stable

  • Quick Sort: Quick Sort is not stable because it might change the relative order of equal elements. The in-place partitioning it uses does not preserve the original order of equal elements.
  • Heap Sort: Heap Sort is also not stable because the process of creating the heap can change the relative order of equal elements.

Example of Stability

Consider an array of tuples where the first element is the key and the second element is an identifier:

arr = [(4, 'a'), (3, 'b'), (3, 'c'), (2, 'd'), (4, 'e')]

  • Stable Sort (Merge Sort):

sorted_arr = merge_sort(arr)
# Output: [(2, 'd'), (3, 'b'), (3, 'c'), (4, 'a'), (4, 'e')]

  • Unstable Sort (Quick Sort):

sorted_arr = quick_sort(arr)
# Output might be: [(2, 'd'), (3, 'c'), (3, 'b'), (4, 'a'), (4, 'e')]

Note that the reordering of (3, 'b') and (3, 'c') is what a typical in-place, key-only partitioning quick sort may produce; the list-comprehension implementation shown earlier compares whole tuples, so the unstable output above is illustrative rather than what that exact code would print.
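
Python's built-in sorted (Timsort) is guaranteed to be stable, so it makes a convenient reference point. A minimal check that sorts only by the key, i.e. the first tuple element, and therefore must preserve the identifiers' original order on ties:

arr = [(4, 'a'), (3, 'b'), (3, 'c'), (2, 'd'), (4, 'e')]

# sorted() is stable: (3, 'b') stays before (3, 'c'), and (4, 'a') before (4, 'e')
stable = sorted(arr, key=lambda item: item[0])
print(stable)  # [(2, 'd'), (3, 'b'), (3, 'c'), (4, 'a'), (4, 'e')]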

Complexity Calculation Example: Fibonacci Sequence

Consider the recursive algorithm for calculating the n-th Fibonacci number:

def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)

  1. Recurrence Relation: T(n) = T(n-1) + T(n-2) + O(1)
  2. Base Case: T(0) = T(1) = O(1)

Solving this recurrence gives exponential growth: T(n) = O(2^n) (the tight bound is roughly O(1.618^n), but O(2^n) is the figure usually quoted). This exponential time complexity comes from overlapping subproblems being recomputed again and again.
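
To see the blow-up directly, here is a small sketch that counts how many calls the naive recursion makes; fibonacci_calls is a throwaway instrumented copy introduced only for this experiment:

def fibonacci_calls(n, counter):
    # Same recursion as fibonacci(), but tallies every call in counter[0]
    counter[0] += 1
    if n <= 1:
        return n
    return fibonacci_calls(n - 1, counter) + fibonacci_calls(n - 2, counter)

for n in (10, 20, 30):
    counter = [0]
    fibonacci_calls(n, counter)
    print(f"fibonacci({n}) makes {counter[0]} calls")

The call count grows by a constant factor (about 1.6x) every time n increases by one, exactly the exponential behaviour the recurrence predicts.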

Optimizing with Memoization

Using memoization to store previously calculated Fibonacci numbers reduces the complexity to O(n):

def fibonacci_memo(n, memo={}):
    # The mutable default argument acts as a cache that persists across calls
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fibonacci_memo(n-1, memo) + fibonacci_memo(n-2, memo)
    return memo[n]
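
An equivalent approach, and one that avoids relying on the shared default dictionary noted above, is the standard library's functools.lru_cache decorator, which memoizes the function automatically:

from functools import lru_cache

@lru_cache(maxsize=None)  # cache every computed value
def fibonacci_cached(n):
    if n <= 1:
        return n
    return fibonacci_cached(n - 1) + fibonacci_cached(n - 2)

print(fibonacci_cached(50))  # 12586269025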

Understanding the complexities and stability of algorithms is crucial in selecting the right algorithm for a given problem. Recursive algorithms often provide elegant solutions, but they can also lead to inefficiencies without proper optimization techniques like memoization or choosing appropriate algorithms.

Calculating Complexities and Comparing Various Methods for an Algorithm

When comparing different algorithms, it’s crucial to understand their time and space complexities, stability, and practical efficiency. Here, we’ll go through the steps of calculating complexities and comparing various methods for a given algorithm.

Steps to Calculate Time and Space Complexity

  1. Identify the Basic Operation: Determine the operation that contributes most to the total running time (e.g., comparisons in sorting, additions in summing).
  2. Count the Number of Basic Operations: Express this count as a function of the input size (n); a tiny worked example follows this list.
  3. Establish Recurrence Relations: For recursive algorithms, establish a recurrence relation that describes the running time in terms of smaller inputs.
  4. Solve Recurrence Relations: Use techniques like the Master Theorem, iteration, or recursion trees to solve recurrence relations.
  5. Space Complexity: Analyze the memory usage of the algorithm. This includes the input size, additional memory used by the algorithm, and the memory used by recursive calls.
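
As a small worked example of steps 1 and 2, the toy function below counts its basic operation, the pairwise comparison, so the O(n^2) count becomes explicit; it is written only for this illustration:

def count_pair_comparisons(arr):
    # Basic operation: comparing a pair (i, j) with i < j
    comparisons = 0
    n = len(arr)
    for i in range(n):
        for j in range(i + 1, n):
            comparisons += 1  # executed n*(n-1)/2 times -> O(n^2)
    return comparisons

print(count_pair_comparisons(list(range(10))))  # 45 == 10*9/2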

Comparing Various Methods for an Algorithm

Let’s compare three common sorting algorithms: Bubble Sort, Quick Sort, and Merge Sort.

Bubble Sort

Bubble Sort is a simple sorting algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order.

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]
    return arr

  • Time Complexity:
    • Best Case: O(n) when the array is already sorted and an early-exit check is used (see the sketch after this list); the plain version above always performs O(n^2) comparisons
    • Average Case: O(n^2)
    • Worst Case: O(n^2)
  • Space Complexity: O(1)
  • Stability: Stable
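
A sketch of the early-exit variant referred to above: the swapped flag ends the outer loop as soon as one full pass makes no swaps, which is what gives the O(n) best case on already-sorted input.

def bubble_sort_optimized(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps in this pass: the array is already sorted
            break
    return arr

print(bubble_sort_optimized([1, 2, 3, 4, 5]))  # finishes after a single pass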

Quick Sort

Quick Sort is a divide-and-conquer algorithm that selects a pivot element and partitions the array into two halves.

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

  • Time Complexity:
    • Best Case: O(n log n)
    • Average Case: O(n log n)
    • Worst Case: O(n^2) (when the pivot is always the smallest or largest element)
  • Space Complexity: O(log n) average for the recursion stack of an in-place version (sketched below); the list-comprehension version above allocates new sub-lists, so it uses O(n) extra space
  • Stability: Not stable
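
A sketch of an in-place version using Lomuto partitioning, which is the kind of implementation the O(log n) stack-space figure refers to; this is a common textbook formulation rather than the list-comprehension version used elsewhere in this post:

def quick_sort_inplace(arr, low=0, high=None):
    # Sorts arr[low..high] in place and returns arr for convenience
    if high is None:
        high = len(arr) - 1
    if low < high:
        # Lomuto partition: elements <= pivot end up left of index p
        pivot = arr[high]
        p = low
        for i in range(low, high):
            if arr[i] <= pivot:
                arr[i], arr[p] = arr[p], arr[i]
                p += 1
        arr[p], arr[high] = arr[high], arr[p]
        quick_sort_inplace(arr, low, p - 1)
        quick_sort_inplace(arr, p + 1, high)
    return arr

print(quick_sort_inplace([64, 34, 25, 12, 22, 11, 90]))  # [11, 12, 22, 25, 34, 64, 90]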

Merge Sort

Merge Sort is a stable divide-and-conquer algorithm that divides the array into halves, sorts each half, and then merges them.

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    # Merge by index rather than with list.pop(0), which would add an O(n) cost per step
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= takes from the left on ties, keeping the sort stable
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

  • Time Complexity: O(n log n) for all cases
  • Space Complexity: O(n) due to the temporary arrays used in merging
  • Stability: Stable

Practical Comparison Example

Let’s compare these sorting algorithms with a practical example:

import time
import random

def measure_time(sort_function, arr):
    start_time = time.time()
    sort_function(arr.copy())
    return time.time() - start_time

arr = [random.randint(0, 1000) for _ in range(1000)]

bubble_time = measure_time(bubble_sort, arr)
quick_time = measure_time(quick_sort, arr)
merge_time = measure_time(merge_sort, arr)

print(f"Bubble Sort Time: {bubble_time:.6f} seconds")
print(f"Quick Sort Time: {quick_time:.6f} seconds")
print(f"Merge Sort Time: {merge_time:.6f} seconds")

When comparing various methods for an algorithm, consider the following:

  1. Time Complexity: How the running time grows with the input size.
  2. Space Complexity: How the memory usage grows with the input size.
  3. Stability: Whether the algorithm preserves the relative order of equal elements.
  4. Practical Efficiency: How the algorithm performs in real-world scenarios, which can be influenced by factors like input size and specific characteristics of the input data.

Understanding these factors helps in choosing the right algorithm for a given problem.

Arrays in Python

Arrays in Python can be implemented with several data structures, most commonly lists, the array module, and NumPy arrays. Lists are the usual choice because of their versatility and ease of use.

Lists in Python

Lists are mutable, ordered collections of items. They can hold elements of any data type, including other lists.

Creating a List

# Creating an empty list
empty_list = []

# Creating a list with initial values
numbers = [1, 2, 3, 4, 5]

# Creating a list with mixed data types
mixed_list = [1, "two", 3.0, [4, 5]]

Accessing Elements

# Accessing elements by index
first_element = numbers[0]  # 1
last_element = numbers[-1]  # 5

# Slicing a list
sub_list = numbers[1:4]  # [2, 3, 4]

Modifying Lists

# Changing an element
numbers[0] = 10

# Adding elements
numbers.append(6)          # [10, 2, 3, 4, 5, 6]
numbers.insert(1, 20)      # [10, 20, 2, 3, 4, 5, 6]

# Removing elements
numbers.pop()              # Removes and returns the last element (6)
numbers.remove(20)         # Removes the first occurrence of 20

List Comprehensions

List comprehensions provide a concise way to create lists.

# Creating a list of squares
squares = [x**2 for x in range(1, 6)]  # [1, 4, 9, 16, 25]

# Creating a list of even numbers
evens = [x for x in range(10) if x % 2 == 0]  # [0, 2, 4, 6, 8]

Arrays using the array Module

The array module provides an array data structure that is more efficient for numerical operations than lists.

import array

# Creating an array of integers
arr = array.array('i', [1, 2, 3, 4, 5])

# Accessing elements
print(arr[0])  # 1

# Modifying elements
arr[0] = 10

# Adding elements
arr.append(6)

# Removing elements
arr.pop()

Arrays using NumPy

NumPy is a powerful library for numerical computing in Python. It provides the ndarray object, which is used for large, multi-dimensional arrays and matrices.

import numpy as np

# Creating a NumPy array
arr = np.array([1, 2, 3, 4, 5])

# Accessing elements
print(arr[0])  # 1

# Modifying elements
arr[0] = 10

# Array operations
arr2 = arr * 2  # [20, 4, 6, 8, 10]

Summary

  • Lists: General-purpose, can hold mixed data types, and support a wide range of operations.
  • array Module: More memory-efficient for homogeneous numeric data, but less flexible than lists (see the memory sketch after this list).
  • NumPy Arrays: Highly efficient for large-scale numerical computations and multi-dimensional arrays.
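
The efficiency difference shows up clearly in memory footprint. A rough sketch comparing a list of ints with an array.array holding the same values (exact byte counts vary by Python version and platform):

import array
import sys

values = list(range(10_000))
int_list = list(values)
int_array = array.array('i', values)

# A list stores pointers to separate int objects; array.array packs raw C ints
list_bytes = sys.getsizeof(int_list) + sum(sys.getsizeof(x) for x in int_list)
array_bytes = sys.getsizeof(int_array)

print(f"list : ~{list_bytes} bytes")
print(f"array: ~{array_bytes} bytes (itemsize={int_array.itemsize})")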

Example Use Case

Here’s an example demonstrating various operations with lists:

# Creating a list of student names
students = ["Alice", "Bob", "Charlie", "David", "Eve"]

# Adding a new student
students.append("Frank")

# Removing a student
students.remove("Charlie")

# Finding a student's position
position = students.index("David")

# Sorting the list
students.sort()

# Printing the sorted list
print("Sorted Students:", students)

This example shows the flexibility and power of lists in Python, making them suitable for a wide range of applications.

Arrays from the array Module

Arrays created using the array module in Python are mutable. This means you can change, add, and remove elements after the array is created. Below, I’ll explain the properties and methods of the array module, how to create arrays, and perform various operations on them.

Properties of Array Objects

  1. Typecode: Each array has a typecode, which is a single character that determines the type of elements it can hold. For example, 'i' is for signed integers, 'f' is for floating-point numbers, etc.
  2. Itemsize: The item size is the size in bytes of each element in the array.
  3. Buffer Info: The buffer_info() method returns a tuple containing the memory address of the array and the length of the array.
  4. Mutability: Arrays are mutable, meaning you can change their content.

Creating Arrays

import array

# Creating an array of integers
arr = array.array('i', [1, 2, 3, 4, 5])

# Creating an array of floats
float_arr = array.array('f', [1.0, 2.0, 3.0, 4.0, 5.0])

Accessing Elements

# Accessing elements by index
print(arr[0])  # Output: 1
print(float_arr[1])  # Output: 2.0

# Slicing arrays
print(arr[1:3])  # Output: array('i', [2, 3])

Modifying Arrays

# Changing an element
arr[0] = 10
print(arr)  # Output: array('i', [10, 2, 3, 4, 5])

# Adding elements
arr.append(6)
print(arr)  # Output: array('i', [10, 2, 3, 4, 5, 6])

# Inserting elements
arr.insert(1, 20)
print(arr)  # Output: array('i', [10, 20, 2, 3, 4, 5, 6])

# Removing elements
arr.pop()
print(arr)  # Output: array('i', [10, 20, 2, 3, 4, 5])

arr.remove(20)
print(arr)  # Output: array('i', [10, 2, 3, 4, 5])

Methods and Attributes of Array Objects

  • append(x): Adds an item to the end of the array.
  • buffer_info(): Returns a tuple (address, length) giving the current memory address and the length in elements of the buffer used to hold array’s contents.
  • byteswap(): Swaps the byte order of all items in the array (useful when exchanging data between machines with different endianness).
  • count(x): Returns the number of occurrences of x in the array.
  • extend(iterable): Appends items from the iterable.
  • fromfile(f, n): Reads n items from the file object f and appends them to the array.
  • fromlist(list): Appends items from the list.
  • fromstring(s): Appends items from the string, interpreting the string as an array of machine values (deprecated since Python 3.2 and removed in Python 3.9; use frombytes() instead).
  • frombytes(b): Appends items from the bytes object.
  • index(x): Returns the index of the first occurrence of x in the array.
  • insert(i, x): Inserts a new item with value x in the array before position i.
  • pop([i]): Removes and returns the item with index i (or the last item if i is omitted).
  • remove(x): Removes the first occurrence of x in the array.
  • reverse(): Reverses the order of the items in the array.
  • tofile(f): Writes all items to the file object f.
  • tolist(): Converts the array to an ordinary list with the same items.
  • tobytes(): Converts the array to bytes.
  • typecode: An attribute (not a method) giving the typecode character used to create the array.
  • itemsize: An attribute giving the length in bytes of one array item.

Example: Using array Module

Here’s a complete example demonstrating the creation and manipulation of arrays using the array module:

import array

# Creating an array of signed integers
arr = array.array('i', [1, 2, 3, 4, 5])

# Printing initial array
print("Initial array:", arr)

# Accessing elements
print("First element:", arr[0])
print("Last element:", arr[-1])

# Modifying elements
arr[0] = 10
print("Modified array:", arr)

# Adding elements
arr.append(6)
print("Array after append:", arr)

arr.insert(1, 20)
print("Array after insert:", arr)

# Removing elements
arr.pop()
print("Array after pop:", arr)

arr.remove(20)
print("Array after remove:", arr)

# Array methods
print("Array buffer info:", arr.buffer_info())
print("Array item size:", arr.itemsize)
print("Array type code:", arr.typecode)
print("Count of 3 in array:", arr.count(3))

# Convert array to list
arr_list = arr.tolist()
print("Array converted to list:", arr_list)

# Reversing the array
arr.reverse()
print("Reversed array:", arr)

This example demonstrates the essential operations you can perform on arrays using the array module, showing their mutability and various properties and methods.

