Why does one solution pass all tests instantly while another times out? Why do interviewers care so much about “time complexity”? Consider these two functions that both find if an array contains duplicates:
// Approach A: Nested loops
function hasDuplicatesA(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true
    }
  }
  return false
}
// Approach B: Using a Set
function hasDuplicatesB(arr) {
return new Set(arr).size !== arr.length
}
Both work correctly. But with 100,000 elements, Approach A takes several seconds while Approach B finishes in milliseconds. The difference? Big O notation, which tells us how code performance scales with input size.
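You can measure the gap yourself with a rough timing sketch. This is not a rigorous benchmark: the array size is scaled down to 20,000 so the quadratic version finishes promptly, and the timer labels are arbitrary.

```javascript
function hasDuplicatesA(arr) { // Approach A: O(n²) - compares every pair
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true
    }
  }
  return false
}

function hasDuplicatesB(arr) { // Approach B: O(n) - a Set drops duplicates
  return new Set(arr).size !== arr.length
}

// All-distinct input is the worst case: nothing short-circuits early
const data = Array.from({ length: 20_000 }, (_, i) => i)

console.time('nested loops')
const resultA = hasDuplicatesA(data)
console.timeEnd('nested loops') // orders of magnitude slower...

console.time('Set')
const resultB = hasDuplicatesB(data)
console.timeEnd('Set') // ...than this
```

Try growing the array by 10×: the Set version slows roughly 10×, the nested-loop version roughly 100×.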
What you’ll learn in this guide:
- What Big O notation actually measures
- The common complexities: O(1), O(log n), O(n), O(n log n), O(n²)
- How to analyze your own code’s complexity
- JavaScript built-in methods and their complexity
- Implementing binary search and merge sort
- Common interview patterns: two pointers, sliding window, frequency counter
Prerequisite: This guide assumes you’re familiar with data structures like arrays, objects, Maps, and Sets. You should also be comfortable with recursion for the sorting algorithms section.
What is Big O Notation?
Big O notation describes how an algorithm’s runtime or space requirements grow as input size increases. First formalized by Paul Bachmann in 1894 and later popularized by Donald Knuth in The Art of Computer Programming, it provides an upper bound on growth rate and ignores constants, giving us a way to compare algorithms regardless of hardware.
The Package Delivery Analogy
Imagine you need to deliver packages to houses on a street:
- O(1): You have the exact address. Go directly there. Whether there are 10 or 10,000 houses, it takes the same time.
- O(n): You check each house until you find the right one. More houses = proportionally more time.
- O(n²): For each house, you compare it with every other house. 10 houses = 100 comparisons. 100 houses = 10,000 comparisons.
┌─────────────────────────────────────────────────────────────────────────┐
│ HOW ALGORITHMS SCALE │
├─────────────────────────────────────────────────────────────────────────┤
│ │
│ Operations │
│ ▲ │
│ │ O(n²) │
│ 1M │ •••• │
│ │ ••• │
│ │ ••• │
│ │ ••• │
│ 100K │ •••• O(n log n) │
│ │ ••• ════════════ │
│ │ •••• ═════ │
│ │ •••• ═════ │
│ 10K │ ••••• ═════ O(n) │
│ │ •••• ═════ ────────────── │
│ │ •••• ═════ ─────── │
│ │ •••• ═════ ─────── O(log n) │
│ 1K │•••• ═════ ────── ............ │
│ │═════ ─────── ........ │
│ │ ────── .......... O(1) │
│ 100 │──── .......... ══════════════ │
│ └──────────────────────────────────────────────────────────► │
│ 100 1K 10K 100K 1M Input (n) │
│ │
└─────────────────────────────────────────────────────────────────────────┘
The Big O Complexity Scale
Here are the most common complexities you’ll encounter, from fastest to slowest:
| Complexity | Name | Example | 1,000 items | 1,000,000 items |
|---|---|---|---|---|
| O(1) | Constant | Array access | 1 op | 1 op |
| O(log n) | Logarithmic | Binary search | ~10 ops | ~20 ops |
| O(n) | Linear | Simple loop | 1,000 ops | 1,000,000 ops |
| O(n log n) | Linearithmic | Merge sort | ~10,000 ops | ~20,000,000 ops |
| O(n²) | Quadratic | Nested loops | 1,000,000 ops | 1,000,000,000,000 ops |
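The table's operation counts fall straight out of the formulas. A quick sketch (the `ops` helper is my own, for illustration):

```javascript
// Approximate operation counts for each complexity class at input size n
const ops = (n) => ({
  constant: 1,
  logarithmic: Math.ceil(Math.log2(n)),
  linear: n,
  linearithmic: n * Math.ceil(Math.log2(n)),
  quadratic: n * n,
})

console.log(ops(1_000))
// { constant: 1, logarithmic: 10, linear: 1000, linearithmic: 10000, quadratic: 1000000 }

console.log(ops(1_000_000).quadratic) // 1000000000000 - why O(n²) fails at scale
```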
O(1) - Constant Time
The operation takes the same time regardless of input size.
// Array access by index
const arr = [1, 2, 3, 4, 5]
const element = arr[2] // O(1) - instant, no matter array size
// Object/Map lookup
const user = { name: 'Alice', age: 30 }
const name = user.name // O(1)
const map = new Map()
map.set('key', 'value')
map.get('key') // O(1)
// Array push and pop (end operations)
arr.push(6) // O(1)
arr.pop() // O(1)
O(log n) - Logarithmic Time
Time increases slowly as input grows. Each step eliminates half the remaining data. This is the “sweet spot” for searching sorted data.
// Binary search - covered in detail below
// With 1,000,000 elements, only ~20 comparisons needed!
// Think of it like a phone book:
// Open middle → wrong half eliminated → repeat
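To see the halving in action, count how many times n can be cut in half before it reaches 1. The step count is essentially log₂(n); the helper name `halvingSteps` is mine:

```javascript
// Count how many halvings it takes to shrink n down to 1
function halvingSteps(n) {
  let steps = 0
  while (n > 1) {
    n = Math.floor(n / 2) // each step discards half the remaining data
    steps++
  }
  return steps
}

console.log(halvingSteps(1_000)) // 9 steps - log₂(1000) ≈ 10
console.log(halvingSteps(1_000_000)) // 19 steps - a million elements, ~20 steps
```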
O(n) - Linear Time
Time grows proportionally with input. If you double the input, you double the time.
// Finding maximum value
function findMax(arr) {
  let max = arr[0]
  for (let i = 1; i < arr.length; i++) { // Visits each element once
    if (arr[i] > max) max = arr[i]
  }
  return max
}
// Most array methods are O(n)
arr.indexOf(5) // O(n) - may check every element
arr.includes(5) // O(n)
arr.find(x => x > 3) // O(n)
arr.filter(x => x > 3) // O(n)
arr.map(x => x * 2) // O(n)
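Note that chaining several O(n) methods is still O(n) overall: two passes over n elements is O(2n), and the constant is dropped. A small illustration:

```javascript
const nums = [1, 2, 3, 4, 5, 6]

// Pass 1: filter visits all n elements. Pass 2: map visits the survivors.
// O(n) + O(n) = O(2n), which simplifies to O(n)
const doubledEvens = nums.filter((x) => x % 2 === 0).map((x) => x * 2)

console.log(doubledEvens) // [4, 8, 12]
```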
O(n log n) - Linearithmic Time
Common for efficient sorting algorithms. Faster than O(n²), but slower than O(n).
// JavaScript's built-in sort is O(n log n)
const sorted = [...arr].sort((a, b) => a - b)
// Merge sort and quick sort (average case) are also O(n log n)
O(n²) - Quadratic Time
Time grows with the square of input. Nested loops over the same data are the typical culprit. Avoid for large datasets.
// Checking all pairs
function findPairs(arr) {
  const pairs = []
  for (let i = 0; i < arr.length; i++) {       // O(n)
    for (let j = i + 1; j < arr.length; j++) { // O(n) for each i
      pairs.push([arr[i], arr[j]])
    }
  }
  return pairs // Total: O(n) × O(n) = O(n²)
}
// Bubble sort - O(n²), mostly used for teaching
How to Analyze Your Code
Follow these steps to determine your code’s complexity:
Identify the input size
What variable represents n? Usually it’s array length or a number parameter.
Count the loops
- One loop over n elements = O(n)
- Nested loops = multiply: O(n) × O(n) = O(n²)
- Loop that halves each time = O(log n)
Drop constants and lower terms
- O(2n) → O(n)
- O(n² + n) → O(n²)
- O(500) → O(1)
// Example analysis
function example(arr) {
  console.log(arr[0]) // O(1)
  for (let i = 0; i < arr.length; i++) { // O(n)
    console.log(arr[i])
  }
  for (let i = 0; i < arr.length; i++) {   // O(n)
    for (let j = 0; j < arr.length; j++) { // × O(n)
      console.log(arr[i], arr[j])
    }
  }
}
// Total: O(1) + O(n) + O(n²) = O(n²)
// The n² dominates, so we say this function is O(n²)
JavaScript Built-in Methods Complexity
Knowing these helps you make better decisions:
Array Methods
| Method | Complexity | Why |
|---|---|---|
| push(), pop() | O(1) | End operations, no shifting |
| shift(), unshift() | O(n) | Must re-index all elements |
| splice() | O(n) | May shift elements |
| slice() | O(n) | Creates copy of portion |
| indexOf(), includes() | O(n) | Linear search |
| find(), findIndex() | O(n) | Linear search |
| map(), filter(), forEach() | O(n) | Visits each element |
| sort() | O(n log n) | V8 uses TimSort since 2018 |
Object, Map, and Set
| Operation | Object | Map | Set |
|---|---|---|---|
| Get/Set/Has | O(1) | O(1) | O(1) |
| Delete | O(1) | O(1) | O(1) |
| Keys/Values | O(n) | O(n) | O(n) |
Use Set.has() instead of Array.includes() when checking membership repeatedly. Set lookups are O(1) while array searches are O(n).
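A sketch of the trade-off (the sizes and lookup values here are arbitrary illustrative choices):

```javascript
const items = Array.from({ length: 100_000 }, (_, i) => i)
const lookups = [5, 50_000, 99_999, -1]

// Array.includes: an O(n) scan for every single lookup
const foundViaArray = lookups.map((v) => items.includes(v))

// Set.has: pay O(n) once to build the Set, then each lookup is O(1)
const itemSet = new Set(items)
const foundViaSet = lookups.map((v) => itemSet.has(v))

console.log(foundViaArray) // [true, true, true, false]
console.log(foundViaSet)   // same answers, far fewer comparisons
```

The Set only pays off when you look up more than once; for a single membership check, `includes()` is fine.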
Searching Algorithms
Linear Search - O(n)
Check each element one by one. Simple but slow for large arrays.
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i
  }
  return -1
}
linearSearch([3, 7, 1, 9, 4], 9) // Returns 3
Binary Search - O(log n)
Divide and conquer on a sorted array. Eliminates half the remaining elements each step. As noted in Introduction to Algorithms (Cormen et al.), binary search is one of the most efficient search algorithms with guaranteed O(log n) worst-case performance.
function binarySearch(arr, target) {
  let left = 0
  let right = arr.length - 1
  while (left <= right) {
    const mid = Math.floor((left + right) / 2)
    if (arr[mid] === target) return mid
    if (arr[mid] < target) left = mid + 1
    else right = mid - 1
  }
  return -1
}
binarySearch([1, 3, 5, 7, 9, 11, 13], 9) // Returns 4
Binary search requires a sorted array. If your data isn’t sorted, you’ll need to sort it first (an O(n log n) cost) or fall back to linear search.
Sorting Algorithms
Quick Reference
| Algorithm | Best | Average | Worst | Space | Use When |
|---|---|---|---|---|---|
| Bubble Sort | O(n)* | O(n²) | O(n²) | O(1) | Never in production |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Need guaranteed performance |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n)** | General purpose |
| JS sort() | O(n log n) | O(n log n) | O(n log n) | O(n) | Most cases |
*Bubble sort achieves O(n) best case only with early termination optimization (when no swaps needed).
**Quick sort space is O(log n) average case, O(n) worst case due to recursion stack depth.
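Quick sort appears in the table but isn’t implemented elsewhere in this guide. Here is a deliberately simplified sketch; unlike real in-place implementations, it allocates new arrays at every level, trading the table’s O(log n) space for clarity:

```javascript
// Quick sort: pick a pivot, partition into smaller/equal/larger, recurse
function quickSort(arr) {
  if (arr.length <= 1) return arr
  const pivot = arr[Math.floor(arr.length / 2)] // middle pivot avoids the
                                                // already-sorted worst case
  const smaller = arr.filter((x) => x < pivot)
  const equal = arr.filter((x) => x === pivot)
  const larger = arr.filter((x) => x > pivot)
  return [...quickSort(smaller), ...equal, ...quickSort(larger)]
}

console.log(quickSort([38, 27, 43, 3, 9, 82, 10])) // [3, 9, 10, 27, 38, 43, 82]
```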
Bubble Sort - O(n²)
Repeatedly swaps adjacent elements if they’re in wrong order. Educational, but too slow for real use.
function bubbleSort(arr) {
  const result = [...arr]
  const n = result.length
  for (let i = 0; i < n; i++) {
    let swapped = false
    for (let j = 0; j < n - i - 1; j++) {
      if (result[j] > result[j + 1]) {
        [result[j], result[j + 1]] = [result[j + 1], result[j]]
        swapped = true
      }
    }
    // If no swaps occurred, array is sorted
    if (!swapped) break
  }
  return result
}
Merge Sort - O(n log n)
Divide array in half, sort each half, merge them back. Consistent performance with guaranteed O(n log n).
function mergeSort(arr) {
  if (arr.length <= 1) return arr
  const mid = Math.floor(arr.length / 2)
  const left = mergeSort(arr.slice(0, mid))
  const right = mergeSort(arr.slice(mid))
  return merge(left, right)
}
function merge(left, right) {
  const result = []
  let i = 0
  let j = 0
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i])
      i++
    } else {
      result.push(right[j])
      j++
    }
  }
  return result.concat(left.slice(i)).concat(right.slice(j))
}
mergeSort([38, 27, 43, 3, 9, 82, 10]) // [3, 9, 10, 27, 38, 43, 82]
In practice, use JavaScript’s built-in sort(). Modern browsers typically use Timsort (V8) or similar O(n log n) algorithms optimized for real-world data. Implement your own sorts for learning or when you have specific requirements.
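One caveat when reaching for the built-in: by default, sort() converts elements to strings and compares them lexicographically, which silently mis-sorts numbers. Always pass a comparator for numeric data:

```javascript
// Default sort compares elements as strings - a classic numeric-sort bug
const lexicographic = [10, 2, 1].sort()
console.log(lexicographic) // [1, 10, 2] - because '10' < '2' as strings

// A numeric comparator gives the order you actually want
const numeric = [10, 2, 1].sort((a, b) => a - b)
console.log(numeric) // [1, 2, 10]
```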
Common Interview Patterns
These patterns solve many algorithm problems efficiently.
Two Pointers - O(n)
Use two pointers moving toward each other or in the same direction. Great for sorted arrays and finding pairs.
// Find pair that sums to target in sorted array
function twoSum(arr, target) {
  let left = 0
  let right = arr.length - 1
  while (left < right) {
    const sum = arr[left] + arr[right]
    if (sum === target) return [left, right]
    if (sum < target) left++
    else right--
  }
  return null
}
twoSum([1, 3, 5, 7, 9], 12) // [1, 4] (3 + 9 = 12)
Sliding Window - O(n)
Maintain a “window” that slides through the array. Perfect for subarray problems.
// Maximum sum of k consecutive elements
function maxSumSubarray(arr, k) {
  if (arr.length < k) return null
  // Calculate first window
  let windowSum = 0
  for (let i = 0; i < k; i++) {
    windowSum += arr[i]
  }
  let maxSum = windowSum
  // Slide the window: remove left element, add right element
  for (let i = k; i < arr.length; i++) {
    windowSum = windowSum - arr[i - k] + arr[i]
    maxSum = Math.max(maxSum, windowSum)
  }
  return maxSum
}
maxSumSubarray([2, 1, 5, 1, 3, 2], 3) // 9 (5 + 1 + 3)
Frequency Counter - O(n)
Use an object or Map to count occurrences. Avoids nested loops when comparing collections.
// Check if two strings are anagrams
function isAnagram(str1, str2) {
  if (str1.length !== str2.length) return false
  const freq = {}
  // Count characters in first string
  for (const char of str1) {
    freq[char] = (freq[char] || 0) + 1
  }
  // Subtract counts for second string
  for (const char of str2) {
    if (!freq[char]) return false
    freq[char]--
  }
  return true
}
isAnagram('listen', 'silent') // true
isAnagram('hello', 'world') // false
Key Takeaways
The key things to remember:
- Big O measures scalability, not absolute speed. It tells you how performance changes as input grows.
- O(1) and O(log n) are fast. O(n) is acceptable. O(n²) gets slow quickly. Avoid O(2^n) for any significant input.
- Nested loops multiply complexity. Two nested loops over n = O(n²). Three = O(n³).
- Drop constants and lower terms. O(2n + 100) simplifies to O(n).
- Array end operations are O(1), beginning operations are O(n). Prefer push/pop over shift/unshift.
- Use Set for O(1) lookups instead of Array.includes() for repeated membership checks.
- Binary search is O(log n) but requires sorted data. Worth sorting first if you’ll search multiple times.
- JavaScript’s sort() is O(n log n) in all modern browsers. Use it unless you have specific requirements.
- Learn the patterns: Two pointers, sliding window, and frequency counter solve most interview problems.
- Space complexity matters too. Creating a new array of size n uses O(n) space.
Test Your Knowledge
Question 1: What's the time complexity of this code?
function mystery(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < 10; j++) {
      console.log(arr[i])
    }
  }
}
Answer: O(n). The outer loop runs n times, but the inner loop always runs exactly 10 times (constant). So it’s O(n × 10) = O(10n) = O(n). The constant 10 is dropped.
Question 2: Why is binary search O(log n)?
Answer: Because each comparison eliminates half the remaining elements. With 1,000 elements: 1000 → 500 → 250 → 125 → 62 → 31 → 15 → 7 → 3 → 1. That’s about 10 steps, and log₂(1000) ≈ 10. With 1,000,000 elements, it only takes ~20 steps.
Question 3: Array has 1 million elements. Which is faster: indexOf() once, or converting to Set then using has()?
Answer: It depends on how many lookups you need.
- One lookup: indexOf() wins. It costs O(n), versus O(n) for Set creation plus O(1) for the lookup.
- Many lookups: Convert to Set first. O(n) creation + O(1) per lookup beats O(n) per lookup.
Rule of thumb: If you’ll search more than once, use a Set.
Question 4: What's wrong with using bubble sort?
Answer: It’s O(n²), making it impractical for large datasets. With 10,000 elements: ~100,000,000 operations, where JavaScript’s built-in sort() at O(n log n) would take ~130,000 operations for the same data. Bubble sort is useful for learning but should never be used in production code.
Question 5: How would you find if an array has duplicates in O(n) time?
Answer: Use a Set to track seen elements:
function hasDuplicates(arr) {
  const seen = new Set()
  for (const item of arr) {
    if (seen.has(item)) return true // O(1) lookup
    seen.add(item) // O(1) insert
  }
  return false
}
This is O(n) time and O(n) space. The naive nested loop approach would be O(n²) time but O(1) space.
Question 6: What pattern would you use to find the longest substring without repeating characters?
Answer: Sliding window with a Set or Map.
function longestUniqueSubstring(s) {
  const seen = new Set()
  let maxLen = 0
  let left = 0
  for (let right = 0; right < s.length; right++) {
    while (seen.has(s[right])) {
      seen.delete(s[left])
      left++
    }
    seen.add(s[right])
    maxLen = Math.max(maxLen, right - left + 1)
  }
  return maxLen
}
Time: O(n), Space: O(min(n, alphabet size))
Frequently Asked Questions
What is Big O notation in simple terms?
Big O notation describes how an algorithm’s performance scales as the input grows. O(1) means constant time regardless of input size, O(n) means time grows linearly, and O(n²) means time grows quadratically. It focuses on the worst-case growth rate and ignores constants, so O(2n) simplifies to O(n).
What is the fastest sorting algorithm in JavaScript?
JavaScript’s built-in Array.prototype.sort() uses TimSort in V8 (Chrome, Node.js), which runs in O(n log n) time. For most use cases, the built-in sort is optimal. Merge sort guarantees O(n log n) in all cases, while quicksort averages O(n log n) but can degrade to O(n²) with poor pivot selection.
How do I determine the time complexity of my code?
Identify what variable represents your input size (usually array length). Count nested loops: one loop is O(n), two nested loops is O(n²). A loop that halves the input each iteration is O(log n). Then drop constants and lower-order terms — O(2n + 5) becomes O(n), and O(n² + n) becomes O(n²).
When should I use binary search instead of linear search?
Use binary search when your data is already sorted and you need to find elements repeatedly. Binary search runs in O(log n) — for a million elements, that is roughly 20 comparisons versus up to 1,000,000 with linear search. If the data is unsorted, the O(n log n) cost of sorting first is only worthwhile if you plan multiple searches.
What is the difference between time complexity and space complexity?
Time complexity measures how many operations an algorithm performs as input grows. Space complexity measures how much additional memory it needs. For example, merge sort has O(n log n) time but O(n) space because it creates temporary arrays, while bubble sort has O(n²) time but O(1) space because it sorts in place.
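A small sketch of that distinction, using array reversal (the function names here are my own, for illustration):

```javascript
// O(1) extra space: reverse in place with two pointers, mutating the input
function reverseInPlace(arr) {
  let left = 0
  let right = arr.length - 1
  while (left < right) {
    [arr[left], arr[right]] = [arr[right], arr[left]] // swap ends, move inward
    left++
    right--
  }
  return arr
}

// O(n) extra space: build a reversed copy, leaving the input untouched
function reversedCopy(arr) {
  return [...arr].reverse()
}

const nums = [1, 2, 3, 4]
console.log(reversedCopy(nums)) // [4, 3, 2, 1] - nums unchanged
console.log(reverseInPlace(nums)) // [4, 3, 2, 1] - nums itself mutated
```

Both are O(n) time; they differ only in how much extra memory they allocate.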
Related Concepts
Data Structures
Understanding arrays, objects, Maps, and Sets is essential for choosing the right tool
Recursion
Many algorithms like merge sort and binary search can be implemented recursively
Higher-Order Functions
Map, filter, and reduce are built on these concepts
Map, Reduce, Filter
Understanding the complexity of these common operations
Reference
Array — MDN
Complete reference for array methods and their behavior
Map — MDN
Hash-based key-value storage with fast lookups
Set — MDN
O(1) operations for membership testing
Articles
Big O Cheat Sheet
Visual reference for time and space complexity of common algorithms and data structures. Bookmark this one.
JavaScript Algorithms and Data Structures
Comprehensive GitHub repo with 190k+ stars. Every algorithm implemented in JavaScript with explanations.
Time Complexity of JavaScript Array Methods
Detailed breakdown of every array method’s complexity with examples and explanations.
Algorithms in Plain English
FreeCodeCamp’s beginner-friendly guide to Big O with real-world analogies.
Videos
Big O Notation in 12 Minutes
Web Dev Simplified’s concise explanation. Perfect if you want the essentials without filler.
JavaScript Algorithms Playlist
Codevolution’s complete series covering sorting, searching, and common patterns step by step.
Data Structures and Algorithms in JavaScript
FreeCodeCamp’s comprehensive 8-hour course. Great for deep learning when you have the time.