Part of series: MongoDB Roadmap
Bulk Writes: How to Insert 1 Million Docs Fast
Welcome to Day 2! 📦
A common mistake new developers make is running database operations in a loop.
Bad Code:
// 10,000 requests to the DB! 🐢
for (const user of users) {
  await db.collection('users').insertOne(user);
}
Even with Promise.all(), you are still issuing 10,000 separate commands, each with its own round trip to the server.
1. Enter bulkWrite()
MongoDB allows you to send one massive batch of operations in a single network request.
const bulkOps = users.map(user => ({
  insertOne: {
    document: user
  }
}));
// 1 request! 🚀
await db.collection('users').bulkWrite(bulkOps);
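One caveat worth knowing: the server caps a single batch (maxWriteBatchSize, 100,000 operations on recent versions), and the driver splits larger arrays for you behind the scenes. For truly huge imports you may still want to control chunk size yourself. A minimal sketch; the chunk helper is our own, not a driver API:

```javascript
// Split an array into chunks of `size` (our own helper, not part of the driver).
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) {
    out.push(arr.slice(i, i + size));
  }
  return out;
}

// Usage sketch, assuming a connected `db` and the `bulkOps` array from above:
// for (const batch of chunk(bulkOps, 10000)) {
//   await db.collection('users').bulkWrite(batch);
// }
```

Smaller batches also keep memory usage predictable when the source array is streamed in.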
2. It’s not just for Inserts
You can mix and match operations!
db.collection('products').bulkWrite([
  { insertOne: { document: { name: "New Product" } } },
  { updateOne: { filter: { id: 1 }, update: { $set: { price: 99 } } } },
  { deleteOne: { filter: { status: "defective" } } }
]);
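The update operations also accept upsert, so a single batch can "insert or update" in one pass. A sketch (the sku field and values are illustrative, not from the original example):

```javascript
// Sketch: updateOne with upsert: true creates the document if no match exists.
const upsertOp = {
  updateOne: {
    filter: { sku: "ABC-123" },          // illustrative field name
    update: { $set: { price: 49 } },
    upsert: true                         // insert if nothing matches the filter
  }
};

// Usage sketch, assuming a connected `db`:
// await db.collection('products').bulkWrite([upsertOp]);
```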
3. Ordered vs Unordered
By default, bulkWrite is Ordered (ordered: true).
- Operations execute in sequence (1, then 2, then 3).
- If Op #5 fails, it stops. Operations 6-10 are NOT executed.
If you don’t care about order (e.g., importing independent log entries), use Unordered (ordered: false).
- The server is free to execute the operations in any order, including in parallel.
- If Op #5 fails, it continues with the rest. Faster!
await db.collection('logs').bulkWrite(ops, { ordered: false });
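With ordered: false, a single bad document doesn't abort the import, but the call still throws if anything failed. In the official Node.js driver the thrown error carries a writeErrors array listing each failed operation; a sketch of inspecting it (the helper function is our own):

```javascript
// Our own helper: summarize per-operation failures from a bulk write error.
// Assumes the Node.js driver's error shape: writeErrors entries expose
// .index (position in the ops array) and .errmsg.
function summarizeWriteErrors(err) {
  return (err.writeErrors ?? []).map(e => `op #${e.index} failed: ${e.errmsg}`);
}

// Usage sketch, assuming a connected `db` and an `ops` array:
// try {
//   await db.collection('logs').bulkWrite(ops, { ordered: false });
// } catch (err) {
//   console.error(summarizeWriteErrors(err).join('\n'));
// }
```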
🧠 Daily Challenge
- Create a script that generates 10,000 dummy products.
- Measure the time it takes to insert them using a for...of loop with await.
- Measure the time it takes using bulkWrite.
- Be amazed. (Usually 10x-50x faster.)
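A starting point for the challenge; the makeProduct generator and field names are our own invention, and the timing lines assume a connected `db`:

```javascript
// Our own dummy-data generator for the challenge (not a library function).
function makeProduct(i) {
  return { name: `product-${i}`, price: i % 100 };
}

const products = Array.from({ length: 10000 }, (_, i) => makeProduct(i));

// Timing sketch, assuming a connected `db`:
// const t0 = Date.now();
// for (const p of products) await db.collection('products').insertOne(p);
// console.log(`loop:      ${Date.now() - t0} ms`);
//
// const t1 = Date.now();
// await db.collection('products').bulkWrite(
//   products.map(document => ({ insertOne: { document } }))
// );
// console.log(`bulkWrite: ${Date.now() - t1} ms`);
```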
See you on Day 3 for Change Streams! (Real-time magic) ✨