Most developers encounter generators, learn function* syntax, and conclude they’re a clever trick for iterating custom data structures. That’s roughly like learning what a callback is and concluding you now understand asynchronous JavaScript. Generators are a control flow mechanism — they enable lazy evaluation, infinite sequences, cooperative multitasking, and async iteration patterns that no other JavaScript feature can replicate.
⚡ TL;DR: Generators are pauseable functions.
yield suspends execution and returns a value. The caller resumes execution by calling .next(). This enables infinite sequences without infinite loops, lazy pipelines that only compute values when needed, and two-way communication between generator and caller.
The Core Mechanic: Execution Suspension
// A generator function returns a Generator object, not a value
function* counter(start = 0) {
  let n = start;
  while (true) {               // Infinite loop — but only runs when asked
    const reset = yield n++;   // Suspends here, returns n, waits for next()
    if (reset) n = start;      // Caller can send values back via next(value)
  }
}
const gen = counter(1);
console.log(gen.next()); // { value: 1, done: false }
console.log(gen.next()); // { value: 2, done: false }
console.log(gen.next()); // { value: 3, done: false }
console.log(gen.next(true)); // { value: 1, done: false } — reset!
console.log(gen.next()); // { value: 2, done: false }
// Key insight: the generator runs ONLY when .next() is called
// No CPU time wasted. Memory: O(1) regardless of sequence length.
// This is lazy evaluation — compute only what you need, when you need it.
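This same suspension mechanic is what makes cooperative multitasking possible: each task is a generator that yields whenever it is willing to hand control back. A minimal sketch of a round-robin scheduler (task and runRoundRobin are illustrative names, not a standard API):

```javascript
// Each task is a generator; every yield is a voluntary "pause point"
function* task(name, steps) {
  for (let i = 1; i <= steps; i++) {
    yield `${name}:${i}`; // suspend so other tasks can run
  }
}

// Round-robin scheduler: run each task to its next yield, re-queue if unfinished
function runRoundRobin(tasks) {
  const log = [];
  while (tasks.length > 0) {
    const gen = tasks.shift();          // take the next task
    const { value, done } = gen.next(); // resume it until its next yield
    if (!done) {
      log.push(value);
      tasks.push(gen);                  // re-queue the unfinished task
    }
  }
  return log;
}

const schedule = runRoundRobin([task('a', 2), task('b', 2)]);
// Tasks interleave: ['a:1', 'b:1', 'a:2', 'b:2']
```

No preemption happens here; a task that never yields would monopolize the scheduler, which is exactly the trade-off cooperative multitasking makes.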
Infinite Sequences Without Infinite Memory
// Fibonacci sequence — infinite, O(1) memory
function* fibonacci() {
  let [a, b] = [0, 1];
  while (true) {
    yield a;
    [a, b] = [b, a + b];
  }
}
// Take first N values — no array of N items ever created
function take(n, iterable) {
  const result = [];
  for (const value of iterable) {
    result.push(value);
    if (result.length === n) break;
  }
  return result;
}
console.log(take(10, fibonacci()));
// [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
// Range generator (Python's range() in JavaScript)
function* range(start, end, step = 1) {
  for (let i = start; i < end; i += step) yield i;
}
// Sum 1 to 1,000,000 — but beware: spreading materializes the full 1M-element array
const sum = [...range(1, 1_000_001)].reduce((a, b) => a + b, 0);
// Better — consume the generator directly, no intermediate array:
let total = 0;
for (const n of range(1, 1_000_001)) total += n;
// O(1) memory regardless of range size
Lazy Pipeline Processing — Transform Data Without Materializing It
// Process a 10GB log file without loading it into memory
// Node.js — for await inside the body requires an async generator
async function* readLines(filePath) {
  const { createReadStream } = require('fs');
  const { createInterface } = require('readline');
  const rl = createInterface({ input: createReadStream(filePath) });
  for await (const line of rl) yield line;
}
// Pipeline helpers as async generators — for await also accepts plain sync
// iterables, so these compose with both sync and async sources
async function* filter(predicate, iterable) {
  for await (const item of iterable) {
    if (predicate(item)) yield item;
  }
}
async function* map(transform, iterable) {
  for await (const item of iterable) {
    yield transform(item);
  }
}
async function* take(n, iterable) {
  let count = 0;
  for await (const item of iterable) {
    yield item;
    if (++count >= n) return;
  }
}
// Lazy pipeline — reads file line by line, never loads more than 1 line
// into memory at a time
async function processLogs() {
  const lines = readLines('server.log');                  // Lazy — reads on demand
  const errors = filter(l => l.includes('ERROR'), lines); // Lazy filter
  const messages = map(l => l.split('|')[2], errors);     // Lazy transform
  const first100 = take(100, messages);                   // Stop after 100
  for await (const msg of first100) {
    console.log(msg);
  }
  // Total memory used: O(1) — only current line ever in memory
}
Two-Way Communication — Sending Values INTO Generators
// yield is an expression — next(value) sends value back to the generator
function* accumulator() {
  let total = 0;
  while (true) {
    const value = yield total;        // yield current total, receive next value
    if (value === null) return total; // Sentinel value to end
    total += value;
  }
}
const acc = accumulator();
acc.next(); // Start the generator (returns {value: 0})
acc.next(10); // Add 10 → {value: 10}
acc.next(20); // Add 20 → {value: 30}
acc.next(5); // Add 5 → {value: 35}
acc.next(null); // End → {value: 35, done: true}
// Real use case: stateful parsers
function* csvParser() {
  const rows = [];
  let line;
  while ((line = yield rows) !== null) {
    rows.push(line.split(',').map(v => v.trim()));
  }
  return rows;
}
const parser = csvParser();
parser.next(); // Initialize
parser.next('name, age, city'); // Send header row
parser.next('Alice, 30, New York'); // Send data row
parser.next('Bob, 25, London'); // Send data row
const result = parser.next(null).value; // Finish and get all rows
Async Generators — The Missing Piece for Streaming APIs
// Async generator: combines await and yield — consumers iterate with for await...of
async function* paginatedAPI(endpoint) {
  let page = 1;
  let hasMore = true;
  while (hasMore) {
    const response = await fetch(`${endpoint}?page=${page}&limit=100`);
    const { data, total, per_page } = await response.json();
    yield* data; // yield* delegates to another iterable — yields each item
    hasMore = page * per_page < total;
    page++;
  }
}
// Stream all users from a paginated API without loading all pages upfront
async function processAllUsers() {
  for await (const user of paginatedAPI('https://api.example.com/users')) {
    await sendWelcomeEmail(user); // Process one user at a time
  }
}
// If API has 50,000 users across 500 pages:
// - Traditional: fetch all 500 pages, build 50K user array, then process
// - Generator: fetch page 1, process its users one by one, fetch page 2 only when needed
// Result: constant memory, streaming processing, handles infinite datasets
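The same for await consumption pattern can be tried without a live endpoint. A minimal sketch with a stubbed page source (the pages array and the collect helper are illustrative, not part of any real API):

```javascript
// Stubbed async source standing in for paginated fetch calls —
// pages are local arrays wrapped in Promises, no network needed
const pages = [[1, 2], [3, 4], [5]];

async function* paginated() {
  for (const page of pages) {
    const data = await Promise.resolve(page); // stands in for fetch + response.json()
    yield* data;                              // delegate: yield each item of the page
  }
}

// Consume the async generator one item at a time
async function collect() {
  const items = [];
  for await (const item of paginated()) items.push(item);
  return items;
}

collect().then(items => console.log(items)); // [1, 2, 3, 4, 5]
```

Swapping Promise.resolve(page) for a real fetch call turns this sketch into the paginatedAPI pattern above.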
Generator Functions Cheat Sheet
- ✅ Use generators for infinite/large sequences to avoid loading data into memory
- ✅ yield* delegates to another iterable — great for flattening or composing
- ✅ gen.next(value) sends value back as the result of the yield expression
- ✅ gen.return(value) terminates the generator, triggers finally blocks
- ✅ gen.throw(error) injects an error at the yield point
- ✅ Async generators (async function*) for streaming API responses
- ❌ Don’t use generators when a simple array or map/filter works — overkill
- ❌ Don’t forget generators hold state — a single generator instance is stateful
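The gen.return() and gen.throw() rows above are easy to see in action. A minimal sketch (the events array exists only to record that the finally block actually ran):

```javascript
const events = [];

function* resource() {
  try {
    yield 1;
    yield 2;
  } finally {
    events.push('cleanup'); // runs on normal exit, return(), or throw()
  }
}

const g1 = resource();
g1.next();                      // { value: 1, done: false }
const r = g1.return('stopped'); // finally runs first, then the generator closes
// r is { value: 'stopped', done: true }; events now contains 'cleanup'

const g2 = resource();
g2.next();
let caught = null;
try {
  g2.throw(new Error('boom')); // error injected at the paused yield
} catch (e) {
  caught = e.message;          // propagates out — resource() has no catch block
}
// finally ran again, so events holds two 'cleanup' entries
```

This is why generators are a good fit for resource management: whether the consumer finishes, bails early, or fails, the finally block gets a chance to clean up.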
Generators pair naturally with the Node.js streams and backpressure guide — async generators are conceptually a higher-level abstraction over readable streams, and understanding both gives you the full picture of lazy evaluation in JavaScript. The WeakRef and memory management guide completes the picture for memory-efficient JavaScript — generators avoid creating large arrays, WeakRef lets those objects be collected when done. External reference: MDN Generator Functions documentation.
Recommended resources
- You Don’t Know JS: Async & Performance — Kyle Simpson devotes several chapters to generators as an async primitive, including how they powered early async/await implementations. The best deep-dive on generator internals available.
- Node.js Design Patterns — The chapter on streams and iterators shows how async generators replace readable streams in modern Node.js — directly extending the lazy pipeline pattern from this post.
Disclosure: This post contains affiliate links. If you purchase through these links, CheatCoders earns a small commission at no extra cost to you. We only recommend tools and books we genuinely find valuable.