Lesson 6: Algorithms and Problem Solving#
Master fundamental algorithms and develop systematic problem-solving skills.
What You’ll Learn#
Understanding algorithms and their importance
Searching algorithms (linear and binary)
Sorting algorithms (bubble, selection, insertion, merge, quick)
Time and space complexity analysis (Big O notation)
Common algorithm patterns and techniques
Problem-solving strategies and approaches
Data structures for algorithms
Real-world applications
Prerequisites#
Understanding of Python functions
Familiarity with lists and basic data structures
Basic understanding of loops and recursion
Why Algorithms Matter#
Algorithms are the foundation of computer science and software engineering:
Efficiency: Choose the right algorithm to save time and resources
Problem Solving: Systematic approaches to complex problems
Interviews: Algorithms are central to technical interviews
Optimization: Improve performance of real-world applications
Scalability: Design systems that handle growth
1. What is an Algorithm?#
An algorithm is a step-by-step procedure for solving a problem or performing a computation.
Characteristics of Good Algorithms#
Correctness: Produces the right output for all valid inputs
Efficiency: Uses minimal time and resources
Clarity: Easy to understand and implement
Generality: Handles a wide range of inputs
Example: Making a Sandwich#
Get two slices of bread
Add filling (cheese, meat, vegetables)
Put slices together
Cut in half (optional)
Done!
In programming, algorithms solve computational problems like searching, sorting, and optimization.
# Simple algorithm: Find the maximum value in a list
def find_maximum(numbers):
    """
    Algorithm to find maximum value.
    Steps:
    1. Start with first element as max
    2. Compare with each remaining element
    3. Update max if larger element found
    4. Return max
    """
    if not numbers:
        return None

    max_value = numbers[0]
    comparisons = 0

    for num in numbers[1:]:
        comparisons += 1
        if num > max_value:
            max_value = num

    print(f"Found maximum {max_value} after {comparisons} comparisons")
    return max_value

# Test
data = [64, 34, 25, 12, 22, 11, 90, 88]
result = find_maximum(data)
print(f"Maximum value: {result}")
2. Searching Algorithms#
Searching is one of the most fundamental operations in computer science.
2.1 Linear Search#
Check each element sequentially until the target is found.
Time Complexity: O(n) - worst case checks all elements
Space Complexity: O(1) - uses constant extra space
When to use: Unsorted data or small datasets
def linear_search(arr, target):
    """
    Search for target in array by checking each element.
    Args:
        arr: List of elements
        target: Element to find
    Returns:
        Index of target if found, -1 otherwise
    """
    comparisons = 0
    for i in range(len(arr)):
        comparisons += 1
        print(f"Checking index {i}: {arr[i]}")
        if arr[i] == target:
            print(f"✓ Found {target} at index {i} after {comparisons} comparisons")
            return i
    print(f"✗ {target} not found after {comparisons} comparisons")
    return -1

# Test
numbers = [64, 34, 25, 12, 22, 11, 90]
print("Searching for 22:")
linear_search(numbers, 22)

print("\nSearching for 100:")
linear_search(numbers, 100)
2.2 Binary Search#
Much faster for sorted arrays - divides the search space in half each iteration.
Time Complexity: O(log n) - very efficient
Space Complexity: O(1) for iterative, O(log n) for recursive
Requirement: Array must be sorted
def binary_search(arr, target):
    """
    Search in sorted array by repeatedly dividing in half.
    Args:
        arr: Sorted list of elements
        target: Element to find
    Returns:
        Index of target if found, -1 otherwise
    """
    left = 0
    right = len(arr) - 1
    comparisons = 0

    while left <= right:
        comparisons += 1
        mid = (left + right) // 2
        print(f"Checking middle (index {mid}): {arr[mid]} (range: {left}-{right})")

        if arr[mid] == target:
            print(f"✓ Found {target} at index {mid} after {comparisons} comparisons")
            return mid
        elif arr[mid] < target:
            print(f" {target} > {arr[mid]}, searching right half")
            left = mid + 1
        else:
            print(f" {target} < {arr[mid]}, searching left half")
            right = mid - 1

    print(f"✗ {target} not found after {comparisons} comparisons")
    return -1

# Test (array must be sorted!)
sorted_numbers = [11, 12, 22, 25, 34, 64, 90]
print("Array:", sorted_numbers)
print("\nSearching for 25:")
binary_search(sorted_numbers, 25)
# Recursive implementation of binary search
def binary_search_recursive(arr, target, left=0, right=None):
    """
    Recursive version of binary search.
    """
    if right is None:
        right = len(arr) - 1

    # Base case: search space is empty
    if left > right:
        return -1

    mid = (left + right) // 2

    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        return binary_search_recursive(arr, target, mid + 1, right)
    else:
        return binary_search_recursive(arr, target, left, mid - 1)

# Test
result = binary_search_recursive(sorted_numbers, 64)
print(f"Recursive search found 64 at index: {result}")
2.3 Performance Comparison#
import time

def compare_search_algorithms():
    """Compare linear vs binary search performance."""
    sizes = [1000, 10000, 100000]

    print("Performance Comparison: Linear vs Binary Search")
    print("=" * 60)

    for size in sizes:
        # Create sorted array
        arr = list(range(size))
        target = size - 1  # Worst case for linear search

        # Linear search
        start = time.time()
        linear_result = linear_search(arr, target)
        linear_time = time.time() - start

        # Binary search
        start = time.time()
        binary_result = binary_search(arr, target)
        binary_time = time.time() - start

        speedup = linear_time / binary_time if binary_time > 0 else float('inf')

        print(f"\nArray size: {size:,}")
        print(f" Linear Search: {linear_time:.6f} seconds")
        print(f" Binary Search: {binary_time:.6f} seconds")
        print(f" Binary is {speedup:.1f}x faster!")

# Note: This will produce a lot of output
# compare_search_algorithms()
3. Sorting Algorithms#
Sorting is arranging elements in a specific order (ascending or descending).
3.1 Bubble Sort#
Repeatedly swaps adjacent elements if they’re in the wrong order.
Time Complexity: O(n²) - slow for large datasets
Space Complexity: O(1)
Stable: Yes (maintains relative order of equal elements)
def bubble_sort(arr, verbose=True):
    """
    Sort array using bubble sort algorithm.
    Each pass "bubbles" the largest unsorted element to its correct position.
    """
    arr = arr.copy()  # Don't modify original
    n = len(arr)
    swaps = 0

    for i in range(n):
        made_swap = False

        # Last i elements are already sorted
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                # Swap adjacent elements
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swaps += 1
                made_swap = True

        if verbose:
            print(f"Pass {i + 1}: {arr}")

        # Optimization: if no swaps, array is sorted
        if not made_swap:
            if verbose:
                print(f"Array sorted early at pass {i + 1}")
            break

    if verbose:
        print(f"\nTotal swaps: {swaps}")
    return arr

# Test
unsorted = [64, 34, 25, 12, 22, 11, 90]
print("Original:", unsorted)
print("\nBubble Sort Process:")
sorted_arr = bubble_sort(unsorted)
print("\nFinal:", sorted_arr)
3.2 Selection Sort#
Repeatedly finds the minimum element in the unsorted portion and places it at the front of that portion.
Time Complexity: O(n²)
Space Complexity: O(1)
Stable: No (can change relative order)
def selection_sort(arr, verbose=True):
    """
    Sort array using selection sort algorithm.
    Repeatedly selects the minimum element from unsorted portion.
    """
    arr = arr.copy()
    n = len(arr)

    for i in range(n):
        # Find minimum in remaining unsorted array
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j

        # Swap minimum with first unsorted element
        arr[i], arr[min_idx] = arr[min_idx], arr[i]

        if verbose:
            print(f"Step {i + 1}: {arr} (placed {arr[i]} at index {i})")

    return arr

# Test
unsorted = [64, 25, 12, 22, 11]
print("Original:", unsorted)
print("\nSelection Sort Process:")
selection_sort(unsorted)
3.3 Insertion Sort#
Builds the sorted array one element at a time by inserting each new element into its correct position.
Time Complexity: O(n²) worst case, O(n) best case (already sorted)
Space Complexity: O(1)
Stable: Yes
Good for: Small datasets or nearly sorted data
def insertion_sort(arr, verbose=True):
    """
    Sort array using insertion sort algorithm.
    Like sorting playing cards in your hand.
    """
    arr = arr.copy()
    n = len(arr)

    for i in range(1, n):
        key = arr[i]
        j = i - 1

        # Move elements greater than key one position ahead
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1

        # Insert key at correct position
        arr[j + 1] = key

        if verbose:
            print(f"Step {i}: {arr} (inserted {key})")

    return arr

# Test
unsorted = [12, 11, 13, 5, 6]
print("Original:", unsorted)
print("\nInsertion Sort Process:")
insertion_sort(unsorted)
3.4 Merge Sort#
Divide-and-conquer algorithm that recursively splits the array in half and merges the sorted halves.
Time Complexity: O(n log n) - efficient for large datasets
Space Complexity: O(n) - needs extra space
Stable: Yes
def merge_sort(arr):
    """
    Sort array using merge sort algorithm.
    Divides array into halves, sorts them, and merges.
    """
    if len(arr) <= 1:
        return arr

    # Divide
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    # Conquer (merge)
    return merge(left, right)

def merge(left, right):
    """
    Merge two sorted arrays into one sorted array.
    """
    result = []
    i = j = 0

    # Merge elements in sorted order
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1

    # Add remaining elements
    result.extend(left[i:])
    result.extend(right[j:])

    return result

# Test
unsorted = [38, 27, 43, 3, 9, 82, 10]
print("Original:", unsorted)
sorted_arr = merge_sort(unsorted)
print("Sorted:", sorted_arr)
3.5 Quick Sort#
Divide-and-conquer algorithm that picks a pivot and partitions the array around it.
Time Complexity: O(n log n) average, O(n²) worst case
Space Complexity: O(log n) for the in-place version (the implementation below builds new lists, so it uses O(n) extra space)
Stable: No (can change relative order)
def quick_sort(arr):
    """
    Sort array using quick sort algorithm.
    Picks a pivot and partitions array around it.
    """
    if len(arr) <= 1:
        return arr

    # Choose pivot (middle element)
    pivot = arr[len(arr) // 2]

    # Partition into three parts
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]

    # Recursively sort and combine
    return quick_sort(left) + middle + quick_sort(right)

# Test
unsorted = [10, 7, 8, 9, 1, 5]
print("Original:", unsorted)
sorted_arr = quick_sort(unsorted)
print("Sorted:", sorted_arr)
3.6 Sorting Algorithm Comparison#
import random
import time

def compare_sorting_algorithms():
    """Compare performance of different sorting algorithms."""
    sizes = [100, 500, 1000]
    algorithms = [
        ('Bubble Sort', lambda x: bubble_sort(x, verbose=False)),
        ('Selection Sort', lambda x: selection_sort(x, verbose=False)),
        ('Insertion Sort', lambda x: insertion_sort(x, verbose=False)),
        ('Merge Sort', merge_sort),
        ('Quick Sort', quick_sort),
        ('Python Built-in', lambda x: sorted(x))
    ]

    print("Sorting Algorithm Performance Comparison")
    print("=" * 70)

    for size in sizes:
        print(f"\nArray size: {size}")
        print("-" * 70)

        # Generate random array
        arr = [random.randint(1, 1000) for _ in range(size)]

        for name, algorithm in algorithms:
            test_arr = arr.copy()
            start = time.time()
            algorithm(test_arr)
            elapsed = time.time() - start
            print(f" {name:20s}: {elapsed:.6f} seconds")

# Uncomment to run comparison
# compare_sorting_algorithms()
4. Time and Space Complexity#
Understanding how algorithms scale with input size is crucial for writing efficient code.
4.1 Big O Notation#
Big O describes the upper bound of time/space growth.
Common Time Complexities (from fastest to slowest):#
| Notation | Name | Example | Description |
|---|---|---|---|
| O(1) | Constant | Array access | Always takes same time |
| O(log n) | Logarithmic | Binary search | Halves problem each step |
| O(n) | Linear | Linear search | Proportional to input size |
| O(n log n) | Linearithmic | Merge sort, quick sort | Efficient sorting |
| O(n²) | Quadratic | Bubble sort, nested loops | Slow for large inputs |
| O(2ⁿ) | Exponential | Recursive Fibonacci | Extremely slow |
| O(n!) | Factorial | Permutations | Impractical for n > 10 |
# Examples of different time complexities

# O(1) - Constant time
def get_first_element(arr):
    """Always takes same time regardless of array size."""
    return arr[0] if arr else None

# O(log n) - Logarithmic time
def binary_search_simple(arr, target):
    """Halves search space each iteration."""
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

# O(n) - Linear time
def find_sum(arr):
    """Must check every element once."""
    total = 0
    for num in arr:
        total += num
    return total

# O(n²) - Quadratic time
def find_pairs(arr):
    """Nested loop checks all pairs."""
    pairs = []
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            pairs.append((arr[i], arr[j]))
    return pairs

# O(2ⁿ) - Exponential time
def fibonacci_recursive(n):
    """Each call branches into two calls."""
    if n <= 1:
        return n
    return fibonacci_recursive(n - 1) + fibonacci_recursive(n - 2)

# Test
print("O(1):", get_first_element([1, 2, 3, 4, 5]))
print("O(n):", find_sum([1, 2, 3, 4, 5]))
print("O(n²):", find_pairs([1, 2, 3]))
print("O(2ⁿ):", fibonacci_recursive(10))
4.2 Space Complexity#
Space complexity measures memory usage.
# O(1) space - constant extra memory
def reverse_in_place(arr):
    """Reverses array without creating new array."""
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]
        left += 1
        right -= 1
    return arr

# O(n) space - proportional extra memory
def reverse_with_copy(arr):
    """Creates new array with reversed elements."""
    return arr[::-1]

# O(n) space - recursion uses call stack
def factorial_recursive(n):
    """Each recursive call adds to call stack."""
    if n <= 1:
        return 1
    return n * factorial_recursive(n - 1)

# Test
arr = [1, 2, 3, 4, 5]
print("Original:", arr)
print("Reversed (O(1) space):", reverse_in_place(arr.copy()))
print("Reversed (O(n) space):", reverse_with_copy(arr))
5. Common Algorithm Patterns#
Recognizing patterns helps solve problems more efficiently.
5.1 Two Pointer Technique#
Use two pointers to traverse the array from different positions, often from both ends or at different speeds.
def is_palindrome(s):
    """Check if string is palindrome using two pointers."""
    # Remove spaces and convert to lowercase
    s = ''.join(s.split()).lower()

    left = 0
    right = len(s) - 1

    while left < right:
        if s[left] != s[right]:
            return False
        left += 1
        right -= 1

    return True

def two_sum_sorted(arr, target):
    """
    Find two numbers that sum to target in sorted array.
    Time: O(n), Space: O(1)
    """
    left = 0
    right = len(arr) - 1

    while left < right:
        current_sum = arr[left] + arr[right]
        if current_sum == target:
            return [left, right]
        elif current_sum < target:
            left += 1
        else:
            right -= 1

    return None

# Test
print("Is 'racecar' a palindrome?", is_palindrome("racecar"))
print("Is 'hello' a palindrome?", is_palindrome("hello"))
print("Is 'A man a plan a canal Panama' a palindrome?",
      is_palindrome("A man a plan a canal Panama"))

sorted_arr = [1, 2, 3, 4, 5, 6, 7]
target = 9
result = two_sum_sorted(sorted_arr, target)
print(f"\nTwo numbers that sum to {target}: indices {result}")
5.2 Sliding Window#
Maintain a window of elements and slide it through the array.
def max_sum_subarray(arr, k):
    """
    Find maximum sum of k consecutive elements.
    Time: O(n), Space: O(1)
    """
    if len(arr) < k:
        return None

    # Calculate sum of first window
    window_sum = sum(arr[:k])
    max_sum = window_sum

    # Slide window: subtract left, add right
    for i in range(k, len(arr)):
        window_sum = window_sum - arr[i - k] + arr[i]
        max_sum = max(max_sum, window_sum)

    return max_sum

def longest_substring_k_unique(s, k):
    """
    Find length of longest substring with k unique characters.
    """
    if k == 0 or len(s) == 0:
        return 0

    char_count = {}
    left = 0
    max_length = 0

    for right in range(len(s)):
        # Add character to window
        char = s[right]
        char_count[char] = char_count.get(char, 0) + 1

        # Shrink window if too many unique characters
        while len(char_count) > k:
            left_char = s[left]
            char_count[left_char] -= 1
            if char_count[left_char] == 0:
                del char_count[left_char]
            left += 1

        # Update maximum length
        max_length = max(max_length, right - left + 1)

    return max_length

# Test
numbers = [1, 4, 2, 10, 23, 3, 1, 0, 20]
k = 4
result = max_sum_subarray(numbers, k)
print(f"Maximum sum of {k} consecutive elements: {result}")

text = "aabacbebebe"
k = 3
length = longest_substring_k_unique(text, k)
print(f"Longest substring with {k} unique chars: {length}")
5.3 Recursion and Divide-and-Conquer#
Break a problem into smaller subproblems of the same type, solve them, and combine the results.
def power(base, exponent):
    """
    Calculate base^exponent using divide-and-conquer.
    Time: O(log n)
    """
    if exponent == 0:
        return 1
    if exponent == 1:
        return base

    # Divide: calculate half power
    half = power(base, exponent // 2)

    # Conquer: combine results
    if exponent % 2 == 0:
        return half * half
    else:
        return half * half * base

def find_max_recursive(arr, left, right):
    """
    Find maximum element using divide-and-conquer.
    """
    # Base case: single element
    if left == right:
        return arr[left]

    # Divide
    mid = (left + right) // 2

    # Conquer: find max in each half
    left_max = find_max_recursive(arr, left, mid)
    right_max = find_max_recursive(arr, mid + 1, right)

    # Combine
    return max(left_max, right_max)

# Test
print(f"2^10 = {power(2, 10)}")
print(f"3^5 = {power(3, 5)}")

arr = [13, 47, 89, 12, 45, 102, 63]
max_val = find_max_recursive(arr, 0, len(arr) - 1)
print(f"Maximum in {arr}: {max_val}")
5.4 Greedy Algorithms#
Make the locally optimal choice at each step, hoping it leads to a globally optimal solution.
def coin_change_greedy(amount, coins):
    """
    Make change using greedy approach.
    Note: Doesn't always give optimal solution!
    """
    coins_sorted = sorted(coins, reverse=True)
    result = []

    for coin in coins_sorted:
        while amount >= coin:
            result.append(coin)
            amount -= coin

    if amount > 0:
        return None  # Cannot make exact change
    return result

def activity_selection(activities):
    """
    Select maximum number of non-overlapping activities.
    Activities: list of (start, end) tuples
    """
    # Sort by end time
    activities_sorted = sorted(activities, key=lambda x: x[1])

    selected = [activities_sorted[0]]
    last_end = activities_sorted[0][1]

    for start, end in activities_sorted[1:]:
        if start >= last_end:
            selected.append((start, end))
            last_end = end

    return selected

# Test
amount = 63
coins = [25, 10, 5, 1]
change = coin_change_greedy(amount, coins)
print(f"Change for {amount}: {change} ({len(change)} coins)")

activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11)]
selected = activity_selection(activities)
print(f"\nSelected activities: {selected}")
print(f"Maximum activities: {len(selected)}")
6. Data Structures for Algorithms#
Choosing the right data structure is crucial for efficient algorithms.
6.1 Stack (LIFO - Last In, First Out)#
class Stack:
    """Stack implementation using list."""

    def __init__(self):
        self.items = []

    def push(self, item):
        """Add item to top. O(1)"""
        self.items.append(item)

    def pop(self):
        """Remove and return top item. O(1)"""
        if not self.is_empty():
            return self.items.pop()
        return None

    def peek(self):
        """Return top item without removing. O(1)"""
        if not self.is_empty():
            return self.items[-1]
        return None

    def is_empty(self):
        return len(self.items) == 0

    def size(self):
        return len(self.items)

# Application: Balanced parentheses checker
def is_balanced(expression):
    """Check if parentheses are balanced using stack."""
    stack = Stack()
    matching = {'(': ')', '[': ']', '{': '}'}

    for char in expression:
        if char in matching:  # Opening bracket
            stack.push(char)
        elif char in matching.values():  # Closing bracket
            if stack.is_empty():
                return False
            opening = stack.pop()
            if matching[opening] != char:
                return False

    return stack.is_empty()

# Test
expressions = [
    "(a + b) * (c - d)",
    "[(x + y) * z]",
    "{[a + (b * c)]}",
    "((a + b)",
    "a + b)"
]

for expr in expressions:
    result = "balanced" if is_balanced(expr) else "NOT balanced"
    print(f"{expr:25s} -> {result}")
6.2 Queue (FIFO - First In, First Out)#
from collections import deque

class Queue:
    """Queue implementation using deque for efficiency."""

    def __init__(self):
        self.items = deque()

    def enqueue(self, item):
        """Add item to rear. O(1)"""
        self.items.append(item)

    def dequeue(self):
        """Remove and return front item. O(1)"""
        if not self.is_empty():
            return self.items.popleft()
        return None

    def front(self):
        """Return front item without removing. O(1)"""
        if not self.is_empty():
            return self.items[0]
        return None

    def is_empty(self):
        return len(self.items) == 0

    def size(self):
        return len(self.items)

# Application: Task scheduler simulation
def process_tasks(tasks):
    """Simulate task processing using queue."""
    queue = Queue()

    # Add tasks to queue
    for task in tasks:
        queue.enqueue(task)
        print(f"Queued: {task}")

    print(f"\nProcessing {queue.size()} tasks...\n")

    # Process tasks in order
    while not queue.is_empty():
        task = queue.dequeue()
        print(f"Processing: {task}")

# Test
tasks = ["Download file", "Process data", "Generate report", "Send email"]
process_tasks(tasks)
6.3 Hash Tables (Dictionaries)#
Provide O(1) average-case lookup, insert, and delete.
def two_sum(arr, target):
    """
    Find two numbers that sum to target using hash table.
    Time: O(n), Space: O(n)
    """
    seen = {}  # value -> index

    for i, num in enumerate(arr):
        complement = target - num
        if complement in seen:
            return [seen[complement], i]
        seen[num] = i

    return None

def first_non_repeating_char(s):
    """
    Find first non-repeating character using hash table.
    Time: O(n), Space: O(1) (limited character set)
    """
    char_count = {}

    # Count occurrences
    for char in s:
        char_count[char] = char_count.get(char, 0) + 1

    # Find first with count 1
    for char in s:
        if char_count[char] == 1:
            return char

    return None

# Test
nums = [2, 7, 11, 15]
target = 9
result = two_sum(nums, target)
print(f"Two sum indices for {nums} with target {target}: {result}")

text = "leetcode"
char = first_non_repeating_char(text)
print(f"First non-repeating character in '{text}': {char}")
7. Problem-Solving Framework#
A systematic approach to tackling algorithmic problems.
Step 1: Understand the Problem#
Read carefully and identify:
Input: What data are you given?
Output: What should you return?
Constraints: Size limits, time limits, special conditions
Edge cases: Empty input, single element, duplicates
Step 2: Examples and Test Cases#
Create examples:
Normal case
Edge cases
Invalid input
Step 3: Break It Down#
Identify subproblems
Can you reduce it to a known problem?
What data structure would help?
Step 4: Solve Simpler Version First#
Start with brute force
Verify correctness
Then optimize
Step 5: Analyze and Optimize#
What’s the time complexity?
Can you do better?
Trade-off: time vs space
Step 6: Test Thoroughly#
Test with your examples
Test edge cases
Verify time/space complexity
# Example: Find longest consecutive sequence
# Problem: Given unsorted array, find length of longest consecutive elements sequence

def longest_consecutive_brute(nums):
    """
    Brute force: Sort array first.
    Time: O(n log n), Space: O(n) for the sorted copy
    """
    if not nums:
        return 0

    nums_sorted = sorted(set(nums))  # Remove duplicates
    max_length = 1
    current_length = 1

    for i in range(1, len(nums_sorted)):
        if nums_sorted[i] == nums_sorted[i-1] + 1:
            current_length += 1
            max_length = max(max_length, current_length)
        else:
            current_length = 1

    return max_length

def longest_consecutive_optimal(nums):
    """
    Optimized: Use hash set.
    Time: O(n), Space: O(n)
    """
    if not nums:
        return 0

    num_set = set(nums)
    max_length = 0

    for num in num_set:
        # Only start counting if it's the beginning of a sequence
        if num - 1 not in num_set:
            current_num = num
            current_length = 1

            # Count consecutive numbers
            while current_num + 1 in num_set:
                current_num += 1
                current_length += 1

            max_length = max(max_length, current_length)

    return max_length

# Test both approaches
test_cases = [
    [100, 4, 200, 1, 3, 2],                   # Expected: 4 (1,2,3,4)
    [0, 3, 7, 2, 5, 8, 4, 6, 0, 1],           # Expected: 9 (0-8)
    [9, 1, -3, 2, 4, 8, 3, -1, 6, -2, -4, 7]  # Expected: 4 (-4,-3,-2,-1)
]

for nums in test_cases:
    result_brute = longest_consecutive_brute(nums)
    result_optimal = longest_consecutive_optimal(nums)
    print(f"Array: {nums}")
    print(f" Brute force result: {result_brute}")
    print(f" Optimal result: {result_optimal}")
    print()
8. Real-World Applications#
Algorithms power everyday technology.
Search Engines#
Binary search for finding web pages
Ranking algorithms (PageRank)
String matching for autocomplete
Social Media#
Graph algorithms for friend suggestions
Timeline sorting algorithms
Content recommendation systems
GPS and Maps#
Shortest path algorithms (Dijkstra’s)
Route optimization
Traffic prediction
E-commerce#
Product search and filtering
Recommendation algorithms
Inventory sorting and management
Databases#
B-tree for indexing
Query optimization
Sorting and joining tables
# Example: Simple autocomplete system
class AutoComplete:
    """Simple autocomplete using prefix matching."""

    def __init__(self, words):
        self.words = sorted(words)  # Sort for binary search

    def suggestions(self, prefix):
        """
        Find all words starting with prefix.
        Uses binary search to find start position.
        """
        # Find first word with prefix
        left, right = 0, len(self.words)
        while left < right:
            mid = (left + right) // 2
            if self.words[mid] < prefix:
                left = mid + 1
            else:
                right = mid

        # Collect all words with prefix
        results = []
        for i in range(left, len(self.words)):
            if self.words[i].startswith(prefix):
                results.append(self.words[i])
            else:
                break

        return results

# Test
words = [
    "apple", "application", "apply", "ape", "apricot",
    "banana", "band", "bandana", "berry", "cherry"
]

autocomplete = AutoComplete(words)
test_prefixes = ["app", "ban", "ber", "ch"]

for prefix in test_prefixes:
    suggestions = autocomplete.suggestions(prefix)
    print(f"Prefix '{prefix}': {suggestions}")
9. Practice Exercises#
Apply your knowledge with these problems.
Exercise 1: Array Manipulation#
Implement the following functions:
Find all duplicate elements
Reverse array in-place
Rotate array k positions
Find missing number in 1 to n
def find_duplicates(arr):
    """
    Find all duplicate elements in array.
    Time: O(n), Space: O(n)
    """
    # Your code here
    pass

def reverse_array(arr):
    """
    Reverse array in-place.
    Time: O(n), Space: O(1)
    """
    # Your code here
    pass

def rotate_array(arr, k):
    """
    Rotate array k positions to the right.
    Example: [1,2,3,4,5], k=2 -> [4,5,1,2,3]
    """
    # Your code here
    pass

def find_missing_number(arr, n):
    """
    Array contains numbers 1 to n with one missing.
    Find the missing number.
    Hint: Use sum formula n*(n+1)/2
    """
    # Your code here
    pass

# Test your functions
# print(find_duplicates([1, 2, 3, 2, 4, 5, 3]))   # [2, 3]
# print(reverse_array([1, 2, 3, 4, 5]))           # [5, 4, 3, 2, 1]
# print(rotate_array([1, 2, 3, 4, 5], 2))         # [4, 5, 1, 2, 3]
# print(find_missing_number([1, 2, 4, 5, 6], 6))  # 3
Exercise 2: String Algorithms#
Implement these string manipulation functions:
def is_anagram(s1, s2):
    """
    Check if two strings are anagrams.
    Example: "listen" and "silent" are anagrams
    """
    # Your code here
    pass

def compress_string(s):
    """
    Compress string using counts.
    Example: "aaabbc" -> "a3b2c1"
    """
    # Your code here
    pass

def longest_palindrome_substring(s):
    """
    Find the longest palindromic substring.
    Example: "babad" -> "bab" or "aba"
    """
    # Your code here
    pass

# Test your functions
# print(is_anagram("listen", "silent"))         # True
# print(compress_string("aaabbc"))              # "a3b2c1"
# print(longest_palindrome_substring("babad"))  # "bab" or "aba"
Exercise 3: Advanced Algorithms#
Challenge yourself with these problems:
def merge_sorted_arrays(arr1, arr2):
    """
    Merge two sorted arrays into one sorted array.
    Time: O(n + m), Space: O(n + m)
    """
    # Your code here
    pass

def kth_largest_element(arr, k):
    """
    Find the kth largest element in array.
    Example: [3,2,1,5,6,4], k=2 -> 5
    """
    # Your code here
    pass

def has_cycle_detection(arr):
    """
    Detect if array has a cycle using Floyd's algorithm.
    Array represents linked list where arr[i] is next index.
    """
    # Your code here
    pass

# Test your functions
# print(merge_sorted_arrays([1, 3, 5], [2, 4, 6]))   # [1, 2, 3, 4, 5, 6]
# print(kth_largest_element([3, 2, 1, 5, 6, 4], 2))  # 5
10. Self-Check Quiz#
Test your understanding of algorithms and problem-solving.
Question 1#
What is the time complexity of binary search?
A) O(1)
B) O(log n)
C) O(n)
D) O(n log n)
Answer: B) O(log n)
Binary search divides the search space in half with each comparison, resulting in logarithmic time complexity.
Question 2#
Which sorting algorithm is best for nearly sorted data?
A) Bubble sort
B) Quick sort
C) Insertion sort
D) Selection sort
Answer: C) Insertion sort
Insertion sort has O(n) time complexity for nearly sorted data, making it very efficient in this case.
Question 3#
What is the space complexity of merge sort?
A) O(1)
B) O(log n)
C) O(n)
D) O(n²)
Answer: C) O(n)
Merge sort requires O(n) extra space to store the merged subarrays.
Question 4#
Which data structure is best for checking balanced parentheses?
A) Queue
B) Stack
C) Hash table
D) Array
Answer: B) Stack
Stack’s LIFO (Last In, First Out) property makes it perfect for matching opening and closing brackets.
Question 5#
What algorithm pattern uses two indices moving through an array?
A) Sliding window
B) Two pointer
C) Divide and conquer
D) Greedy
Answer: B) Two pointer
The two pointer technique uses two indices (pointers) to traverse an array, often from different ends or at different speeds.
Question 6#
Binary search requires the input array to be:
A) Empty
B) Sorted
C) Unsorted
D) Unique elements only
Answer: B) Sorted
Binary search only works on sorted arrays because it relies on comparing the middle element to determine which half to search.
Pro Tips#
Start Simple: Always implement a working solution first, then optimize
Draw Pictures: Visualize the algorithm on paper with examples
Know Your Data Structures:
Use hash tables for O(1) lookup
Use heaps for finding min/max efficiently (see the heapq sketch after these tips)
Use stacks/queues for ordered processing
Pattern Recognition: Learn to recognize:
Sliding window problems
Two pointer opportunities
When to use recursion vs iteration
Time-Space Tradeoff: Often you can trade space for time
Caching/memoization
Preprocessing data
Test Edge Cases:
Empty input
Single element
Duplicates
Maximum/minimum values
Big O Intuition:
One loop → O(n)
Nested loops → O(n²)
Halving problem size → O(log n)
Divide and conquer → O(n log n)
Practice Regularly: Sites like LeetCode, HackerRank help build muscle memory
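Heaps are mentioned in the tips above but not covered elsewhere in this lesson, so here is a minimal sketch using Python's standard-library heapq module:

import heapq

scores = [64, 34, 25, 12, 22, 11, 90, 88]

# A min-heap keeps the smallest element at index 0; push/pop cost O(log n)
heapq.heapify(scores)             # turn the list into a heap in O(n)
print(heapq.heappop(scores))      # 11 - smallest element

# Convenience helpers for "top k" style questions
print(heapq.nlargest(3, scores))  # three largest remaining values
print(heapq.nsmallest(3, scores)) # three smallest remaining values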
Common Mistakes to Avoid#
Off-by-One Errors: Carefully check loop boundaries and index calculations
Not Handling Edge Cases: Empty arrays, null values, single elements
Infinite Loops: Ensure loop variables update correctly
Modifying While Iterating: Be careful when modifying a collection you’re iterating over (see the sketch after this list)
Wrong Base Cases in Recursion: Always define proper termination conditions
Integer Overflow: Be aware of maximum integer values in calculations
Not Considering Duplicates: Many algorithms behave differently with duplicate values
Premature Optimization: Get it working correctly first, then optimize
Forgetting to Sort: Some algorithms (like binary search) require sorted input
Not Analyzing Complexity: Always know your algorithm’s time and space complexity
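As an illustration of the modifying-while-iterating mistake (toy data, invented for this example):

numbers = [1, 2, 4, 5, 6]
for n in numbers:          # BUG: removing while iterating skips elements
    if n % 2 == 0:
        numbers.remove(n)
print(numbers)             # [1, 4, 5] - the 4 was skipped and never removed

numbers = [1, 2, 4, 5, 6]
numbers = [n for n in numbers if n % 2 != 0]  # safe: build a new list instead
print(numbers)             # [1, 5]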
Summary#
In this lesson, you learned:
Algorithms are step-by-step procedures for solving problems
Searching: Linear search (O(n)), binary search (O(log n))
Sorting: Bubble, selection, insertion (O(n²)), merge, quick (O(n log n))
Complexity Analysis: Big O notation for time and space
Algorithm Patterns: Two pointer, sliding window, recursion, greedy
Data Structures: Stack, queue, hash tables for efficient algorithms
Problem Solving: Systematic framework for tackling problems
Real Applications: Algorithms power search engines, social media, GPS, and more
Key Takeaways#
Choose the right algorithm for your problem
Understand time and space tradeoffs
Practice recognizing common patterns
Always test edge cases
Start simple, then optimize
Next Steps#
Practice Problems:
LeetCode Easy problems
HackerRank algorithm challenges
Project Euler mathematical problems
Advanced Topics:
Dynamic programming
Graph algorithms (BFS, DFS, Dijkstra)
Advanced data structures (trees, heaps, tries)
Backtracking
Books:
“Introduction to Algorithms” (CLRS)
“Cracking the Coding Interview”
“Algorithm Design Manual”
Interview Prep:
Practice on whiteboard/paper
Explain your thought process
Study common interview patterns
Congratulations on completing the Medium level Python course! You now have a solid foundation in Python programming, from functions and data structures to algorithms and machine learning. Keep practicing and building projects to reinforce your skills!