1. Memory allocation: Heap vs Stack: (Heap = dynamic allocation at any time) vs (Stack = LIFO-ordered, per thread)
    1. Stack: the memory set aside for a thread of execution. When a function is called, a block is reserved on top of the stack for its local variables. When that function returns, the block becomes unused -> can be reused by the next call (freeing stack memory = adjusting the stack pointer).
    2. Heap: memory for dynamic allocation; blocks can be allocated & deallocated at any time -> the heap can grow if more space is needed.
    3. OS allocates a stack for each thread when the thread is created. The stack is faster as i) allocation & deallocation are trivial (just move the stack pointer), ii) each byte is reused frequently -> stack memory tends to stay in the processor’s cache.
    4. Local variables of object types store only the reference on the stack; the object itself lives on the heap.
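The stack-reference / heap-object split above can be seen by aliasing: copying a variable copies the reference, not the object. A minimal sketch (class and method names are my own; JIT optimizations such as escape analysis are ignored):

```java
public class StackVsHeap {
    // Returns data[0] after mutating through an alias: both references
    // point at one shared heap array.
    static int aliasDemo() {
        int count = 42;                  // primitive: value lives in this frame on the stack
        int[] data = {1, 2, 3};          // reference 'data' on the stack, array object on the heap
        int[] alias = data;              // copies the reference only, not the array
        alias[0] = 99;                   // mutates the single shared heap array
        return data[0];                  // 99: 'data' sees the change made via 'alias'
    }   // frame popped on return; the array becomes garbage once unreachable

    public static void main(String[] args) {
        System.out.println(aliasDemo()); // prints 99
    }
}
```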
  2. Abstract Data Types: 
    1. List
    2. Tree
    3. Graph (BFS, DFS)
  3. Sorting: Sorting is the basic building block of algorithms and also the most thoroughly studied problem. Computers have historically spent more time sorting than anything else, and its ideas recur throughout algorithm design. Sorting has applications in:
    • Searching (most important application)
    • Closest pair (smallest difference)
    • Find duplicates
    • Frequency distribution (how many times each X occurs)
    • Selection: k-th largest
    • Convex hulls
  4. Different Sorting Algorithms:
    1. Internal vs External
    2. Bubble (worst practical sort) – repeatedly swap neighbours that are in the wrong order until no swap is needed.
    3. Selection (almost always better than bubble, but worse than insertion) – find the minimum, put it first, & continue with the rest.
    4. Insertion (O(n^2)) – take the next number, put it in the correct place in the already-sorted prefix – very quick on a partly ordered list.
    5. Shell (uses gap h) – insertion sort on the sublists of elements h apart, with h shrinking to 1 (the last pass is a plain insertion sort).
    6. Quick (average O(n log n)) – pick a pivot; everything less than the pivot is swapped before it, everything greater swapped after it. Then recursively sort the two sublists.
    7. Heap – build the heap, take out the largest & put it in the sorted region, restore the heap & continue. Only the initial heap construction is costly; each later step is O(log n).
      1. Heap: a parent is always greater (max-heap) or always less (min-heap) than its children
      2. Heapsort is not better than quicksort on average, but much better in the worst case
      3. heapify: building the heap by repeated insertion (bubble up any dissatisfied child past its parent) is O(n log n); the bottom-up sift-down build is only O(n)
    8. Merge – divide & conquer, attributed to John von Neumann. Divides the unsorted list into 1-element lists, then repeatedly merges sublists until one remains.
      1. Heapsort needs only O(1) extra space while merge sort needs O(n) auxiliary space, but merge sort is a stable sort!
      2. Merge is best at sorting linked lists: it needs no random access, merging is just pointer re-linking, and the two halves can be sorted in parallel.
      3. In Java, the legacy Arrays.sort() for objects switched to insertion sort below 7 elements and used merge sort otherwise; modern versions use TimSort for objects & dual-pivot quicksort for primitives.
    9. Radix – sort digit by digit (LSD or MSD) with a stable bucket pass per digit; O(k*n) for k-digit keys.
    10. Ext.Merge – external merge sort for data larger than memory: sort chunks that fit in RAM, write them out as runs, then k-way merge the runs.
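The insertion sort described above, as a minimal sketch (class name `InsertionSort` is my own):

```java
public class InsertionSort {
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];                 // next number to place
            int j = i - 1;
            while (j >= 0 && a[j] > key) {  // shift larger elements one slot right
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;                 // drop key into its place in the sorted prefix
        }
    }
}
```

On an already (or nearly) sorted array the inner loop barely runs, which is why it is so quick on partly ordered input.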
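Shell sort is the same idea run over h-apart sublists; a sketch using the simple gap sequence n/2, n/4, ..., 1 (one common choice among many — better sequences exist):

```java
public class ShellSort {
    public static void sort(int[] a) {
        for (int h = a.length / 2; h >= 1; h /= 2) {  // shrink the gap down to 1
            // insertion sort on elements h apart; h == 1 is a plain insertion sort
            for (int i = h; i < a.length; i++) {
                int key = a[i], j = i - h;
                while (j >= 0 && a[j] > key) {
                    a[j + h] = a[j];
                    j -= h;
                }
                a[j + h] = key;
            }
        }
    }
}
```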
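Quicksort as sketched above, using the Lomuto partition scheme with the last element as pivot (one of several partition/pivot choices):

```java
public class QuickSort {
    public static void sort(int[] a) { sort(a, 0, a.length - 1); }

    private static void sort(int[] a, int lo, int hi) {
        if (lo >= hi) return;               // 0 or 1 elements: already sorted
        int p = partition(a, lo, hi);
        sort(a, lo, p - 1);                 // recursively sort the two sublists
        sort(a, p + 1, hi);
    }

    // Lomuto partition: elements less than the pivot end up before it,
    // the rest after it; returns the pivot's final index.
    private static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi], i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;  // put pivot between the two parts
        return i;
    }
}
```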
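Heapsort as described above, using the O(n) bottom-up sift-down build rather than repeated insertion:

```java
public class HeapSort {
    public static void sort(int[] a) {
        int n = a.length;
        // Bottom-up build: sift down every internal node, deepest first. O(n).
        for (int i = n / 2 - 1; i >= 0; i--) siftDown(a, i, n);
        for (int end = n - 1; end > 0; end--) {
            int t = a[0]; a[0] = a[end]; a[end] = t;  // take out the largest -> sorted tail
            siftDown(a, 0, end);                      // restore the max-heap over the prefix
        }
    }

    // Push a[i] down until both children are smaller (max-heap property holds).
    private static void siftDown(int[] a, int i, int n) {
        while (true) {
            int l = 2 * i + 1, r = l + 1, big = i;
            if (l < n && a[l] > a[big]) big = l;
            if (r < n && a[r] > a[big]) big = r;
            if (big == i) return;
            int t = a[i]; a[i] = a[big]; a[big] = t;
            i = big;
        }
    }
}
```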
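Top-down merge sort on arrays, as a sketch; note the `<=` in the merge, which is what makes it stable (the linked-list version avoids these copies by re-linking nodes instead):

```java
public class MergeSort {
    public static int[] sort(int[] a) {
        if (a.length <= 1) return a;                      // 1-element list: base case
        int mid = a.length / 2;
        int[] left  = sort(java.util.Arrays.copyOfRange(a, 0, mid));
        int[] right = sort(java.util.Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    // Stable merge: on ties, take from the left half first,
    // preserving the input order of equal keys.
    private static int[] merge(int[] l, int[] r) {
        int[] out = new int[l.length + r.length];
        int i = 0, j = 0, k = 0;
        while (i < l.length && j < r.length)
            out[k++] = (l[i] <= r[j]) ? l[i++] : r[j++];
        while (i < l.length) out[k++] = l[i++];
        while (j < r.length) out[k++] = r[j++];
        return out;
    }
}
```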
  5.  Tree:
    1. Binary trees
    2. 2-3-4 tree
    3. Red-black tree
    4. B+ tree

Other algorithms

  1. Hash & Double Hash + String matching
  2. Compression
  3. Bit Manipulation
  4. Graph
  5. Race Condition & Threads
  6. Concurrent Programming 
    1. Semaphore / Mutex
    2. Monitor , read-write lock
    3. Recursive lock
    4. Distributed lock
    5. Spin lock
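The semaphore/mutex idea above, sketched with a binary `java.util.concurrent.Semaphore` guarding a shared counter (class name and the 100 000-iteration count are arbitrary choices for the demo):

```java
import java.util.concurrent.Semaphore;

public class MutexDemo {
    static int counter;
    static final Semaphore mutex = new Semaphore(1);  // 1 permit = binary semaphore = mutex

    static int run() {
        counter = 0;
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                mutex.acquireUninterruptibly();  // enter critical section
                counter++;                       // not atomic: load, add, store
                mutex.release();                 // leave critical section
            }
        };
        Thread t1 = new Thread(work), t2 = new Thread(work);
        t1.start(); t2.start();
        try {
            t1.join(); t2.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return counter;  // 200_000 with the lock; usually less if you remove it
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```

Removing the `acquire`/`release` pair turns this into the classic race condition: the two threads interleave the load-add-store of `counter++` and updates get lost.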
  7. OS Scheduler:
    1. FIFO
    2. SJF
    3. Multi-level Queue
    4. Fixed Priority
    5. Round robin
  8. System Design & Memory Limit
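Double hashing, from the hash entry above, is open addressing where a second hash function sets the probe step, so keys that collide on h1 still follow different probe sequences. A minimal sketch (assumes positive int keys with 0 as the empty-slot marker, and a prime table size so every step is coprime with it):

```java
public class DoubleHashTable {
    private final int[] table;
    private final int m;  // table size, assumed prime

    public DoubleHashTable(int m) { this.m = m; this.table = new int[m]; }

    private int h1(int key) { return key % m; }
    private int h2(int key) { return 1 + (key % (m - 1)); }  // never 0, so probing always advances

    public void insert(int key) {
        int step = h2(key);
        for (int i = 0, idx = h1(key); i < m; i++, idx = (idx + step) % m) {
            if (table[idx] == 0) { table[idx] = key; return; }
        }
        throw new IllegalStateException("table full");
    }

    public boolean contains(int key) {
        int step = h2(key);
        for (int i = 0, idx = h1(key); i < m; i++, idx = (idx + step) % m) {
            if (table[idx] == key) return true;
            if (table[idx] == 0) return false;  // hit an empty slot: key was never inserted
        }
        return false;
    }
}
```

With linear probing, all keys hashing to the same slot trace the identical probe path and pile up; here the per-key step from h2 breaks those clusters apart.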