How Arrays and Linked Lists Work — Contiguous vs Connected

2026-03-22

Every data structure is a way of arranging data in memory. The two most fundamental arrangements are arrays (elements stored next to each other) and linked lists (elements scattered across memory, connected by pointers). Every other structure — hash maps, trees, graphs, stacks, queues — is built from one or both of these.

How Does an Array Work?

An array stores elements in a contiguous block of memory. If element 0 starts at address 1000 and each element is 8 bytes, then element 1 is at 1008, element 2 is at 1016, and in general element n is at 1000 + (n × 8).

This arithmetic makes access instant: to read element n, compute the address and read it. One multiplication, one addition. O(1) regardless of array size.

index:     [0]   [1]   [2]   [3]   [4]   [5]   [6]
value:      10    23     5    42    17     8    31
address:  1000  1008  1016  1024  1032  1040  1048
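The address arithmetic can be sketched in a few lines of Python. The base address and element size here are hypothetical values chosen to match the example above, not anything a real allocator guarantees:

```python
# Sketch of the address arithmetic behind array indexing.
# base and elem_size are hypothetical values matching the example above.
base = 1000      # address of element 0
elem_size = 8    # bytes per element

def address_of(index):
    # One multiplication, one addition -- O(1) regardless of array size.
    return base + index * elem_size

print(address_of(0))  # 1000
print(address_of(3))  # 1024
print(address_of(6))  # 1048
```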

Strengths:

  • O(1) random access — read any element by index instantly.
  • Cache-friendly — elements are next to each other in memory, so the CPU cache loads multiple elements at once. Sequential array traversal is the fastest possible memory access pattern.
  • Simple — no overhead per element. Just data, packed tightly.

Weaknesses:

  • O(n) insertion/deletion in the middle — inserting at position 3 means shifting elements 3, 4, 5, ... one position right. Deleting means shifting left.
  • Fixed size — a raw array has a fixed capacity. Growing requires allocating a new, larger array and copying everything. Dynamic arrays (Vec in Rust, ArrayList in Java, list in Python) handle this automatically, but the copy is O(n) when it happens.
  • Contiguous allocation — you need a single block of memory large enough for the entire array. At extreme sizes, fragmented memory might not have a large enough contiguous block.
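The shifting cost is easy to see in code. This is a minimal sketch of inserting into a fixed-capacity array, using a Python list with a `None` slot to stand in for unused capacity; `insert_at` is a hypothetical helper, not a standard function:

```python
def insert_at(arr, length, index, value):
    # Insert into a fixed-capacity array by shifting elements right.
    # arr has spare capacity at the end; length is the number of used slots.
    # Moving elements index..length-1 one slot right is the O(n) cost.
    for i in range(length, index, -1):
        arr[i] = arr[i - 1]
    arr[index] = value
    return length + 1

arr = [10, 23, 5, 42, 17, None]   # capacity 6, 5 slots used
length = insert_at(arr, 5, 2, 99)
print(arr)  # [10, 23, 99, 5, 42, 17]
```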

How Does a Linked List Work?

A linked list stores each element in a node that contains the data and a pointer to the next node. Nodes can be anywhere in memory — they don't need to be contiguous.

head → 10 → 23 → 5 → 42 → null
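A minimal singly linked list sketch makes the trade-off concrete: inserting at the front is one pointer assignment, while reading by index means walking the chain. The class and method names here are illustrative, not a standard API:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next   # pointer to the next node, or None at the end

class LinkedList:
    def __init__(self):
        self.head = None

    def push_front(self, value):
        # O(1): allocate a node and repoint head -- no shifting.
        self.head = Node(value, self.head)

    def get(self, index):
        # O(n): no address arithmetic; must follow pointers from the head.
        node = self.head
        for _ in range(index):
            node = node.next
        return node.value

lst = LinkedList()
for v in [42, 5, 23, 10]:      # push in reverse to build 10 → 23 → 5 → 42
    lst.push_front(v)
print(lst.get(0), lst.get(3))  # 10 42
```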

Strengths:

  • O(1) insertion/deletion at a known position — change two pointers. No shifting.
  • Dynamic size — grows and shrinks without copying. Each node is allocated independently.
  • No contiguous memory needed — nodes can be scattered across the heap.

Weaknesses:

  • O(n) access — to reach element 5, you must follow pointers through elements 0, 1, 2, 3, 4. No random access.
  • Cache-unfriendly — nodes are scattered in memory. Following pointers causes cache misses. In practice, this makes linked lists much slower than arrays for sequential access, despite the same O(n) complexity.
  • Memory overhead — each node carries a pointer (8 bytes on 64-bit systems) in addition to its data.

When Do You Use Each?

Operation              Array             Linked List
Access by index        O(1)              O(n)
Search (unsorted)      O(n)              O(n)
Insert at beginning    O(n)              O(1)
Insert at end          O(1) amortized    O(1) with tail pointer
Insert in middle       O(n)              O(1) at known position
Delete                 O(n)              O(1) at known position
Memory locality        Excellent         Poor

Use arrays when: you need random access, iterate sequentially, or care about cache performance. This is most of the time. In modern systems, CPU cache effects often matter more than algorithmic complexity — an O(n) array scan can be faster than an O(1) linked list operation because of cache misses.

Use linked lists when: you need frequent insertion/deletion at the front or at known positions, the data size is unpredictable, or you're building higher-level structures (queues, LRU caches).

Dynamic Arrays — The Best of Both

In practice, most developers use dynamic arrays — arrays that resize automatically:

  • Rust's Vec<T>
  • Java's ArrayList
  • Python's list
  • C++'s std::vector
  • Go's slices

When the array is full and you push a new element, the dynamic array allocates a new array (typically 2x the size), copies the old data, and frees the old array. This copy is O(n) but happens rarely — on average, each push is O(1) amortized.
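The doubling strategy can be sketched as a toy dynamic array over a fixed backing list. Real implementations differ in details (CPython's `list`, for instance, grows by a factor closer to 1.125), so treat the 2x factor and class layout here as illustrative:

```python
class DynArray:
    # Minimal dynamic-array sketch: fixed backing store, doubling growth.

    def __init__(self):
        self.capacity = 2
        self.length = 0
        self.data = [None] * self.capacity

    def push(self, value):
        if self.length == self.capacity:
            self._grow()                # rare O(n) copy
        self.data[self.length] = value  # common O(1) write
        self.length += 1

    def _grow(self):
        # Allocate a new backing array at 2x capacity and copy everything.
        self.capacity *= 2
        new_data = [None] * self.capacity
        for i in range(self.length):
            new_data[i] = self.data[i]
        self.data = new_data

arr = DynArray()
for v in [10, 23, 5, 42, 17]:
    arr.push(v)
print(arr.length, arr.capacity)  # 5 8
```

Five pushes trigger two copies (at capacity 2 and 4), but the total work copied stays proportional to the number of pushes, which is where the O(1) amortized bound comes from.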

Dynamic arrays give you the cache performance and random access of arrays with the flexibility of dynamic sizing. They're the default choice in nearly every language.

Stacks and Queues

Two important structures built from arrays or linked lists:

Stack (LIFO — last in, first out) — push and pop from the same end. The function call stack is the most important stack in computing. Undo history, expression evaluation, and depth-first search all use stacks.

Queue (FIFO — first in, first out) — enqueue at one end, dequeue from the other. Job queues, message queues, breadth-first search, and request buffering all use queues.

Both are O(1) for their primary operations when implemented correctly.
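In Python, both are available without writing a structure by hand: a plain list works as a stack, and `collections.deque` (a doubly linked block list from the standard library) works as a queue with O(1) operations at both ends:

```python
from collections import deque

# Stack: push and pop from the same end of a list -- both O(1) amortized.
stack = []
stack.append(1); stack.append(2); stack.append(3)
print(stack.pop())      # 3 -- last in, first out

# Queue: enqueue at one end, dequeue from the other. deque gives O(1)
# at both ends; popping from the front of a plain list would be O(n),
# since every remaining element would shift left.
queue = deque()
queue.append(1); queue.append(2); queue.append(3)
print(queue.popleft())  # 1 -- first in, first out
```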

Next Steps