Grokking Algorithms


  • Author: [[Aditya Y. Bhargava]]


In this book, when I talk about running time in Big O notation (explained a little later), log always means log2. — location: 512 ^ref-22347
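A quick sanity check of that convention, following the book's own counting (a 1,024-item list needs 10 binary search steps, since log2 1,024 = 10). This helper is not from the book; it is a minimal sketch:

```python
import math

def max_steps(n):
    """Roughly how many halvings binary search needs for n sorted items.

    Each step halves the search space, so the count is about log2(n);
    e.g. 1,024 items -> 10 steps, as the book notes.
    """
    return math.ceil(math.log2(n)) if n > 1 else 1

print(max_steps(1024))  # -> 10
```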

Big O notation tells you how fast an algorithm is. — location: 637 ^ref-23900

takes O(log n) time. Big O establishes a worst-case run time — location: 680 ^ref-54367
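For contrast, simple (linear) search is O(n): the worst case is checking every element, when the target is last or absent. A minimal sketch (not the book's exact listing):

```python
def simple_search(items, target):
    """Return the index of target, or None. O(n): the worst case
    (target missing or in the last slot) inspects every item."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return None
```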

You learn how to break a problem down into a base case and a recursive case. The divide-and-conquer strategy (chapter 4) uses this simple concept to solve hard problems. — location: 1070 ^ref-12094

Make a pile of boxes to look through. Grab a box, and look through it. If you find a box, add it to the pile to look through later. If you find a key, you’re done! Repeat. — location: 1094 ^ref-50215
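The box-hunting steps above can be sketched as a loop over a pile. This is an illustrative version, representing each box as a Python list (nested lists are boxes inside boxes):

```python
def look_for_key(main_box):
    """Search boxes-within-boxes for the string "key"."""
    pile = [main_box]                # the pile of boxes to look through
    while pile:
        box = pile.pop()             # grab a box and look through it
        for item in box:
            if item == "key":
                return True          # found the key: done!
            if isinstance(item, list):
                pile.append(item)    # a box inside a box: search it later
    return False
```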

That’s why every recursive function has two parts: the base case, and the recursive case. — location: 1139 ^ref-9846
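The book illustrates the two parts with a countdown function; here is a variant that returns the sequence instead of printing it, so both cases are easy to see:

```python
def countdown(i):
    """Count down from i to 0, returning the numbers in order."""
    if i < 0:
        return []                       # base case: nothing left to count
    return [i] + countdown(i - 1)       # recursive case: shrink toward the base
```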

Tip When you’re writing a recursive function involving an array, the base case is often an empty array or an array with one element. If you’re stuck, try that first. — location: 1409 ^ref-14546
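The tip in action, using the sum function the book builds in this chapter (a sketch, not the exact listing):

```python
def rec_sum(arr):
    """Sum a list recursively."""
    if not arr:
        return 0                         # base case: an empty array sums to 0
    return arr[0] + rec_sum(arr[1:])     # recursive case: first element + rest
```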

EXERCISES
4.1 Write out the code for the earlier sum function.
4.2 Write a recursive function to count the number of items in a list.
4.3 Find the maximum number in a list.
4.4 Remember binary search from chapter 1? It’s a divide-and-conquer algorithm, too. Can you come up with the base case and recursive case for binary search? — location: 1448 ^ref-18098
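One possible set of solutions for 4.2 and 4.3, following the same empty-or-single-element base case pattern (these are my sketches, not the book's answer key):

```python
def count(items):
    """4.2: count items in a list recursively."""
    if not items:
        return 0                         # base case: empty list has 0 items
    return 1 + count(items[1:])          # recursive case: one + the rest

def maximum(items):
    """4.3: find the maximum recursively (items must be non-empty)."""
    if len(items) == 1:
        return items[0]                  # base case: a lone element is the max
    rest_max = maximum(items[1:])        # recursive case: max of the rest
    return items[0] if items[0] > rest_max else rest_max
```

For 4.4, the base case of binary search is a sub-array of zero or one elements; the recursive case searches whichever half could still contain the target.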

If you’re implementing quicksort, choose a random element as the pivot. The average runtime of quicksort is O(n log n)! — location: 1683 ^ref-53532
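A quicksort sketch with a random pivot, in the style of the book's listing (the random pivot is what makes the O(n log n) average hold regardless of input order):

```python
import random

def quicksort(arr):
    if len(arr) < 2:
        return arr                                 # base case: 0 or 1 elements
    pivot_index = random.randrange(len(arr))       # random pivot choice
    pivot = arr[pivot_index]
    rest = arr[:pivot_index] + arr[pivot_index + 1:]
    less = [x for x in rest if x <= pivot]         # sub-array below the pivot
    greater = [x for x in rest if x > pivot]       # sub-array above the pivot
    return quicksort(less) + [pivot] + quicksort(greater)
```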

Put a hash function and an array together, and you get a data structure called a hash table. — location: 1766 ^ref-11280
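A toy illustration of that combination (my sketch; it deliberately ignores collisions, which the later highlight addresses):

```python
class SimpleHashTable:
    """A hash function plus an array: the function picks the slot."""
    def __init__(self, size=16):
        self.slots = [None] * size

    def _index(self, key):
        return hash(key) % len(self.slots)   # hash function -> array slot

    def put(self, key, value):
        self.slots[self._index(key)] = (key, value)

    def get(self, key):
        entry = self.slots[self._index(key)]
        if entry is not None and entry[0] == key:
            return entry[1]
        return None
```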

There are many different ways to deal with collisions. The simplest one is this: if multiple keys map to the same slot, start a linked list at that slot. — location: 1934 ^ref-6848
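A sketch of that simplest strategy (separate chaining); Python lists stand in for linked lists here, and the tiny default size is just to force collisions:

```python
class ChainedHashTable:
    """Collisions handled by chaining: each slot holds a list of (key, value) pairs."""
    def __init__(self, size=4):
        self.slots = [[] for _ in range(size)]

    def put(self, key, value):
        chain = self.slots[hash(key) % len(self.slots)]
        for i, (k, _) in enumerate(chain):
            if k == key:
                chain[i] = (key, value)      # existing key: overwrite in place
                return
        chain.append((key, value))           # new key: append to this slot's chain

    def get(self, key):
        chain = self.slots[hash(key) % len(self.slots)]
        for k, v in chain:                   # walk the chain for this slot
            if k == key:
                return v
        return None
```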