1
$\begingroup$

I am teaching an algorithms class, and I'd like to show some problems just to 'motivate' big O notation and demonstrate why it matters. I cover the usual sorting algorithms and show the students a timed comparison.

I also show this problem https://leetcode.com/problems/maximum-subarray/description/ because I feel it needs little theory. I guess it could be considered Dynamic Programming, but it has a nice, simple solution that I can show without discussing DP at any length.

I run an O(n^3) solution (which passes around 200 tests), an O(n^2) solution (only 202), and an O(n) solution (which passes all tests with flying colors).
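For reference, the quadratic/linear contrast might look roughly like this (a minimal Python sketch; the function names are illustrative, not LeetCode's):

```python
def max_subarray_quadratic(nums):
    """O(n^2): try every starting index and extend the running sum to the right."""
    best = nums[0]
    for i in range(len(nums)):
        running = 0
        for j in range(i, len(nums)):
            running += nums[j]
            best = max(best, running)
    return best


def max_subarray_linear(nums):
    """O(n), Kadane-style: track the best sum of a subarray ending at each position."""
    best = current = nums[0]
    for x in nums[1:]:
        current = max(x, current + x)  # either extend the previous run or start fresh at x
        best = max(best, current)
    return best
```

Even a list of a few tens of thousands of random integers makes the gap obvious when the two versions are timed side by side.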

I plan to introduce stacks and graphs later, but now is not the time.

I was wondering if any of you have other problems like that. That is to say: problems with simple solutions, where there is no need to teach any data structure or complex idea, and all you need is a solution of the right complexity. Ideally they would be on LeetCode, HackerRank, or similar -- but if you suggest a classic enough problem, odds are it is implemented somewhere that I can point my students to.

$\endgroup$

2 Answers

5
$\begingroup$

I've always found a fun example to be the naive (recursive) implementation of Fibonacci, because you can write it in front of the students, run it for 3 and 5 so they can verify that it works, then run it for a value large enough (roughly 40 and up in an interpreted language) that it keeps running in the background for 10 minutes or so while you lecture.

Then you can use the dynamic programming approach, and suddenly you have a very motivated, even emotionally resonant, rationale for a discussion about runtime.
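As a concrete sketch (assuming Python; the bottom-up version below is one of several reasonable dynamic-programming formulations):

```python
def fib_naive(n):
    """Exponential time: the same subproblems are recomputed over and over."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)


def fib_dp(n):
    """Linear time: each Fibonacci number is computed exactly once, bottom-up."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```

The exact input at which fib_naive goes from instant to painfully slow depends on the language and machine, but the contrast with fib_dp, which handles n in the thousands without trouble, is always dramatic.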

What I've never managed to pull off, but would be very effective if I could find a topic, is to talk about something else while that program just keeps running. When it stops, I would use that as the trigger to turn back to the Fibonacci algorithm.

"Ah, I see it has finally finished! Why did it take so long?"

And this would inevitably lead to, "is there some way we can do better?"

And that would lead really nicely into big O.

$\endgroup$
2
$\begingroup$

When I talk about complexity, I work with three examples:

  1. Summation: I implement the summation from $1$ to $n$ using a for loop, obtaining an algorithm that is $O(n)$; then I algebraically derive the formula for the sum of the terms of an arithmetic progression with common difference 1, $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$, which gives an $O(1)$ algorithm;

  2. Two-Sum and Three-Sum: I first implement them with brute-force (sequential) search, obtaining $O(n^2)$ and $O(n^3)$ respectively, and then apply sorting followed by binary search, obtaining $O(n \log_2 n)$ and $O(n^2 \log_2 n)$ respectively (the first two examples are sketched in code after this list);

  3. Union-Find: I present the problem of dynamic connectivity and implement QuickFind, QuickUnion and WeightedQuickUnion (a minimal sketch appears at the end of this answer). I apply the implementations in a visual simulation of the percolation problem. A reference is https://algs4.cs.princeton.edu/15uf/
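
Sketches of the first two examples might look like this (a minimal Python illustration; the names are mine, and the Three-Sum variants follow the same pattern as Two-Sum with one more loop):

```python
from bisect import bisect_left


def sum_loop(n):
    """O(n): add the integers 1..n one at a time."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total


def sum_formula(n):
    """O(1): the closed form n(n+1)/2 for the same sum."""
    return n * (n + 1) // 2


def two_sum_brute(nums, target):
    """O(n^2): check every pair of indices."""
    n = len(nums)
    for i in range(n):
        for j in range(i + 1, n):
            if nums[i] + nums[j] == target:
                return i, j
    return None


def two_sum_binary_search(nums, target):
    """O(n log n): sort once, then binary-search for each complement."""
    order = sorted(range(len(nums)), key=nums.__getitem__)
    values = [nums[k] for k in order]
    for i, x in enumerate(values):
        j = bisect_left(values, target - x, i + 1)  # search only to the right of i
        if j < len(values) and values[j] == target - x:
            return order[i], order[j]
    return None
```

Timing the two Two-Sum versions on a few thousand random integers already shows a clear difference.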

(Table from the algs4 reference above: order of growth for N sites, worst case, for each union-find implementation.)

The percolation simulator can be found here: https://github.com/davidbuzatto/Percolacao


Please note that the purpose of the simulation is only to show how fast the water percolates, i.e., connects the top of the system to the bottom, under each of the data-structure implementations. To see the difference in performance, change the simulation parameters and vary the algorithms. The simulation itself is not entirely accurate, since the percolating water should not "rise" within the material, but this can be ignored: the point is really to show that QuickFind is very slow compared to WeightedQuickUnion. This "error" is visible in the lower-left quadrant, and you can correct it if you wish. Finally, the simulator is implemented in Java, and its comments and identifiers are in Brazilian Portuguese.
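
For completeness, here is a minimal WeightedQuickUnion sketch (written in Python as an illustration; the actual simulator is the Java code linked above):

```python
class WeightedQuickUnion:
    """Union-find with union by size: find and union take O(log n) time."""

    def __init__(self, n):
        self.parent = list(range(n))  # parent[i] is the parent of site i
        self.size = [1] * n           # number of sites in the tree rooted at i

    def find(self, p):
        # Walk up the tree until reaching a root (a node that is its own parent).
        while p != self.parent[p]:
            p = self.parent[p]
        return p

    def union(self, p, q):
        root_p, root_q = self.find(p), self.find(q)
        if root_p == root_q:
            return
        # Attach the smaller tree under the larger one to keep the trees shallow.
        if self.size[root_p] < self.size[root_q]:
            root_p, root_q = root_q, root_p
        self.parent[root_q] = root_p
        self.size[root_p] += self.size[root_q]

    def connected(self, p, q):
        return self.find(p) == self.find(q)
```

QuickFind, by contrast, stores a component id for every site and rewrites ids on each union, so a sequence of $n$ unions costs $O(n^2)$ in total, which is exactly the slowdown the percolation simulation makes visible.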

$\endgroup$
