Table of contents
- MongoDB Architecture
- CRUD Operations
- Schema Validation
- Model vs DAO Pattern
- Indexing
- Array Operations
- Adding Elements to an Array
- Querying Arrays with $elemMatch
- Add Unique - $addToSet
- Add Multiple $push and $each
- Add Sorted - $push and $sort
- Add Limited - $push and $slice
- Removing Elements from an Array
- Remove the first or last element - $pop
- Remove Multiple - $pullAll
- Remove Multiple - $pull and $in
- Remove Condition - $pull and $gt
- Remove Not Equal - $pull and $ne
- Update Condition - $set and $
- Update All - $[]
- Update Filtered - $[<identifier>]
- Aggregation
- Transactions
- Replica Sets
- Sharding
- Mongoose
- Patterns
- Q&A
- What is .exec() in Mongoose?
- What is the difference between findOne() and find() in Mongoose?
- What is the difference between Model.create() and new Model().save() in Mongoose?
- What is the purpose of the lean() method in Mongoose queries, and when should it be used?
- How to implement soft deletes in Mongoose?
MongoDB Architecture
MongoDB is a popular NoSQL database designed for high performance, high availability, and easy scalability. It stores data in flexible, JSON-like documents, making it easy to work with structured, semi-structured, and unstructured data.
Database: A container for collections.
Collection: A group of MongoDB documents.
Document: A set of key-value pairs (similar to JSON objects).
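For example, in the mongosh shell (the database and collection names here are illustrative):
// Switch to (and implicitly create) the "shop" database
use shop
// Insert a document into the "products" collection of that database
db.products.insertOne({
  name: 'Laptop',
  price: 1200,
  tags: ['electronics', 'computers'] // documents are JSON-like key-value pairs
});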
CRUD Operations
CRUD stands for Create, Read, Update, and Delete. These are the basic operations for interacting with data in MongoDB.
Create
To insert a new document into a collection, you use the insertOne() or insertMany() methods.
insertOne
db.collection('users').insertOne({ name: 'Alice', age: 25 });
Auto-Generated _id Field
If you don't specify an _id field, MongoDB automatically generates a unique identifier for each document. To specify your own _id value, you can include it in the document.
db.collection('users').insertOne({ _id: 1, name: 'Alice', age: 25 });
Make sure to handle duplicate _id values to avoid conflicts.
insertMany
To insert multiple documents, you can use the insertMany() method.
db.collection('users').insertMany([
{ name: 'Alice', age: 25 },
{ name: 'Bob', age: 30 }
]);
If one of the documents fails to insert, the operation will be aborted unless you specify the ordered: false option.
For example, if you have schema validation like this (see Schema Validation):
db.createCollection('users', {
validator: {
$jsonSchema: {
bsonType: 'object',
required: ['name', 'age'],
properties: {
name: { bsonType: 'string' },
age: { bsonType: 'int' }
}
}
}
});
db.collection('users').insertMany([
{ name: 'Alice', age: 25 },
{ name: 'Bob', age: 30 },
{ name: 'Charlie' } // missing age field
], { ordered: false });
The third document will be rejected by the schema validator, while the first two will be inserted. The result would be:
{
acknowledged: true,
insertedIds: {
'0': ObjectId("..."), // ID for Alice
'1': ObjectId("...") // ID for Bob
}
}
Read
To read documents from a collection, you use the find() method. The find() method retrieves multiple documents and returns a cursor, which can be iterated to access the documents.
find
const cursor = db.collection('users').find({ age: { $gte: 18 } });
In Node.js, you can convert the cursor to an array using the toArray() method.
const docs = await cursor.toArray();
console.log(docs)
Result:
[
{ "_id": 1, "name": "Alice", "age": 25 },
{ "_id": 2, "name": "Bob", "age": 30 }
]
findOne
To retrieve a single document, you can use the findOne() method.
db.collection('users').findOne({ name: 'Alice' });
Result:
{ "_id": 1, "name": "Alice", "age": 25 }
Projection
You can specify which fields to include or exclude in the result using the projection parameter.
db.collection('users').find({}, { projection: { name: 1, age: 1 } });
Result:
[
{ "_id": 1, "name": "Alice", "age": 25 },
{ "_id": 2, "name": "Bob", "age": 30 }
]
Using projection is a critical practice, especially when working with large collections. It reduces the amount of data transferred over the network and improves query performance. It also prevents unintentional exposure of sensitive data.
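For example, you can exclude sensitive fields instead of listing the fields to include (the field names here are illustrative):
db.collection('users').find(
  { age: { $gte: 18 } },
  { projection: { passwordHash: 0, securityAnswers: 0 } } // exclude sensitive fields
);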
Update
To update existing documents, you use the updateOne() or updateMany() methods.
db.collection('users').updateOne({ name: 'Alice' }, { $set: { age: 26 } });
db.collection('users').updateMany({ city: 'New York' }, { $set: { city: 'San Francisco' } });
Result Object
{
"acknowledged": true,
"matchedCount": 5,
"modifiedCount": 5,
"upsertedId": null,
"upsertedCount": 0
}
Based on the result object, you can return a proper response.
const result = await db.collection('users').updateOne(
{ _id: userId },
{ $set: { age: 30 } }
);
if (result.matchedCount === 0) {
return { success: false, message: 'No matching document found' };
}
if (result.modifiedCount === 0) {
return { success: true, message: 'Document already up-to-date' };
}
return { success: true, message: 'Document updated successfully' };
Upsert
If you want to insert a new document when no matching document is found, you can use the upsert option.
db.collection('users').updateOne(
{ name: 'Alice' },
{ $set: { age: 26 }
},
{ upsert: true }
);
Result Object
{
  "acknowledged": true,
  "matchedCount": 0,
  "modifiedCount": 0,
  "upsertedId": ObjectId("..."),
  "upsertedCount": 1
}
When no document matches the filter, matchedCount and modifiedCount are 0, and the _id of the newly inserted document is returned in upsertedId.
findOneAndUpdate
This method combines findOne() and updateOne() into a single atomic operation. Atomic operations are performed as a single unit of work, which helps prevent race conditions and ensures data consistency.
// Update or create a user profile
db.users.findOneAndUpdate(
{ email: "alice@example.com" },
{ $set: { age: 26 } },
{ returnDocument: 'after' }
);
Result Object
The returnDocument option can be 'before' or 'after' to return the document as it was before or after the update.
{
"value": {
"_id": ObjectId("5f7d3b1c8e1f9a1c9c8e1f9a"),
"email": "alice@example.com",
"age": 26,
"name": "Alice"
},
"lastErrorObject": {
"updatedExisting": true,
"n": 1
},
"ok": 1
}
Update Operators
// $set operator to update fields
db.collection('users').updateOne({ name: 'Alice' }, { $set: { age: 26, city: 'New York' } });
// $set operator to update nested fields
db.collection('users').updateOne({ name: 'Alice' }, { $set: { 'address.city': 'New York' } });
// $inc operator to increment a field
db.collection('users').updateOne({ name: 'Alice' }, { $inc: { age: 1 } });
// $mul operator to multiply a field value
db.collection('users').updateOne({ name: 'Alice' }, { $mul: { age: 2 } });
// $unset operator to remove a field
db.collection('users').updateOne({ name: 'Alice' }, { $unset: { city: '' } });
// $rename operator to rename a field
db.collection('users').updateOne({ name: 'Alice' }, { $rename: { city: 'location' } });
Delete
To delete documents from a collection, you use the deleteOne() or deleteMany() methods.
db.collection('users').deleteOne({ name: 'Alice' }); // delete first matching document
db.collection('users').deleteMany({ city: 'New York' });
Result Object
{
"acknowledged": true,
"deletedCount": 1
}
Based on the result object, you can return a proper response.
const result = await db.collection('users').deleteOne({ _id: userId });
if (result.deletedCount === 0) {
return { success: false, message: 'No matching document found' };
}
return { success: true, message: 'Document deleted successfully' };
Backup your data: Before deleting documents, make sure to back up your data to prevent accidental data loss.
Soft Delete: Instead of permanently deleting documents, you can mark them as deleted by adding a deletedAt field. This allows you to retain the data for auditing or recovery purposes.
db.collection('users').updateOne(
{ _id: userId },
{ $set: { deletedAt: new Date() } }
);
Precise Filter: When deleting documents, use a precise filter to avoid unintentionally deleting more documents than intended. You can use find() to preview the documents that will be deleted before running the delete operation.
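A small sketch of that workflow (the filter here is illustrative):
const filter = { city: 'New York', status: 'inactive' };

// Preview which documents would be affected
const preview = await db.collection('users').find(filter).toArray();
console.log(`About to delete ${preview.length} documents`);

// Run the delete with exactly the same filter
const result = await db.collection('users').deleteMany(filter);
console.log(`Deleted ${result.deletedCount} documents`);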
Schema Validation
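MongoDB can enforce document structure at the collection level with a $jsonSchema validator, as used in the insertMany example above. A minimal sketch of defining a validator and later changing it on an existing collection (the field names are illustrative):
db.createCollection('users', {
  validator: {
    $jsonSchema: {
      bsonType: 'object',
      required: ['name', 'email'],
      properties: {
        name: { bsonType: 'string' },
        email: { bsonType: 'string', pattern: '^.+@.+$' },
        age: { bsonType: 'int', minimum: 0 }
      }
    }
  },
  validationLevel: 'strict', // validate inserts and all updates
  validationAction: 'error'  // reject invalid documents ('warn' only logs them)
});

// Add or replace validation rules on an existing collection
db.runCommand({
  collMod: 'users',
  validator: { $jsonSchema: { bsonType: 'object', required: ['name'] } }
});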
Model vs DAO Pattern
Sample Todo Model
Model Pattern
- Encapsulates both data and behavior, which may include validation, business logic, and database operations. The model can become "fat", but the implementation stays simpler.
This is a sample file structure for the Model Pattern:
src/
├── config/
│ └── database.js
│
├── models/
│ └── todo.model.js
│
├── controllers/
│ └── todo.controller.js
│
├── routes/
│ └── todo.routes.js
│
├── middleware/
│ └── auth.middleware.js
│
└── app.js
If the model becomes too complex, you can split it into multiple files or classes. For example, you can have separate files for validation, business logic, and database operations.
Database Configuration
// src/config/database.js
const { MongoClient } = require('mongodb');
const dbConfig = {
url: process.env.MONGODB_URI || 'mongodb://localhost:27017',
dbName: 'todoapp'
};
let db = null;
const connectDB = async () => {
try {
const client = await MongoClient.connect(dbConfig.url, {
useUnifiedTopology: true
});
db = client.db(dbConfig.dbName);
console.log('Connected to MongoDB successfully');
return db;
} catch (error) {
console.error('MongoDB connection error:', error);
process.exit(1);
}
};
const getDB = () => {
if (!db) {
throw new Error('Database not initialized');
}
return db;
};
module.exports = { connectDB, getDB };
Model
The model file contains the data structure, validation, and database operations for a todo item. This contains both data and behavior related to todos.
// src/models/todo.model.js
const { ObjectId } = require('mongodb');
const { getDB } = require('../config/database');
const COLLECTION_NAME = 'todos';
// Validation functions
const validateTodoData = (todoData) => {
const errors = [];
if (!todoData.title) {
errors.push('Title is required');
} else if (todoData.title.length < 3) {
errors.push('Title must be at least 3 characters long');
}
if (todoData.status && !['pending', 'in-progress', 'completed'].includes(todoData.status)) {
errors.push('Invalid status. Must be pending, in-progress, or completed');
}
if (todoData.dueDate && new Date(todoData.dueDate) < new Date()) {
errors.push('Due date cannot be in the past');
}
if (todoData.priority && !['low', 'medium', 'high'].includes(todoData.priority)) {
errors.push('Invalid priority. Must be low, medium, or high');
}
return errors;
};
const todoModel = {
async create(todoData) {
const errors = validateTodoData(todoData);
if (errors.length > 0) {
throw new Error(`Validation failed: ${errors.join(', ')}`);
}
const db = getDB();
const todo = {
...todoData,
title: todoData.title.trim(),
status: todoData.status || 'pending',
priority: todoData.priority || 'medium',
createdAt: new Date(),
updatedAt: new Date(),
completedAt: null
};
const result = await db.collection(COLLECTION_NAME).insertOne(todo);
return { ...todo, _id: result.insertedId };
},
async findById(id) {
if (!ObjectId.isValid(id)) {
throw new Error('Invalid todo ID');
}
const db = getDB();
const todo = await db.collection(COLLECTION_NAME).findOne({
_id: new ObjectId(id)
});
if (!todo) {
throw new Error('Todo not found');
}
return todo;
},
async find(query = {}, options = {}) {
const db = getDB();
const {
page = 1,
limit = 10,
sortBy = 'createdAt',
sortOrder = -1
} = options;
if (page < 1 || limit < 1) {
throw new Error('Invalid pagination parameters');
}
const skip = (page - 1) * limit;
const sortOptions = { [sortBy]: sortOrder };
// Apply filters
const filters = { ...query };
if (filters.priority) {
if (!['low', 'medium', 'high'].includes(filters.priority)) {
throw new Error('Invalid priority filter');
}
}
if (filters.status) {
if (!['pending', 'in-progress', 'completed'].includes(filters.status)) {
throw new Error('Invalid status filter');
}
}
const [todos, totalCount] = await Promise.all([
db.collection(COLLECTION_NAME)
.find(filters)
.sort(sortOptions)
.skip(skip)
.limit(limit)
.toArray(),
db.collection(COLLECTION_NAME)
.countDocuments(filters)
]);
return {
todos,
pagination: {
total: totalCount,
page,
limit,
pages: Math.ceil(totalCount / limit)
}
};
},
async update(id, updateData) {
if (!ObjectId.isValid(id)) {
throw new Error('Invalid todo ID');
}
const errors = validateTodoData(updateData);
if (errors.length > 0) {
throw new Error(`Validation failed: ${errors.join(', ')}`);
}
const db = getDB();
const existingTodo = await this.findById(id);
// Business logic for status changes
if (updateData.status === 'completed' && existingTodo.status !== 'completed') {
updateData.completedAt = new Date();
}
if (updateData.status && updateData.status !== 'completed') {
updateData.completedAt = null;
}
const result = await db.collection(COLLECTION_NAME).findOneAndUpdate(
{ _id: new ObjectId(id) },
{
$set: {
...updateData,
updatedAt: new Date()
}
},
{ returnDocument: 'after' }
);
if (!result.value) {
throw new Error('Todo not found');
}
return result.value;
},
async delete(id) {
if (!ObjectId.isValid(id)) {
throw new Error('Invalid todo ID');
}
const db = getDB();
const result = await db.collection(COLLECTION_NAME).deleteOne({
_id: new ObjectId(id)
});
if (result.deletedCount === 0) {
throw new Error('Todo not found');
}
return true;
},
// Additional business logic methods
async markAsComplete(id) {
return await this.update(id, {
status: 'completed'
});
},
async findOverdue() {
const db = getDB();
return await db.collection(COLLECTION_NAME).find({
dueDate: { $lt: new Date() },
status: { $ne: 'completed' }
}).toArray();
}
};
Sample Todo DAO
DAO Pattern
- Data access logic is separated from business logic.
- Service layer for business logic and DAO layer for data access.
- Better for testing by mocking the data access layer.
- Better support for dependency injection.
- More complex implementation and some code repetition.
src/
├── config/
│ └── database.js # Database connection configuration
│
├── models/
│ └── todo.entity.js # Data structure/schema definition
│
├── daos/
│ └── todo.dao.js # Data Access Object - handles database operations
│
├── services/
│ └── todo.service.js # Business logic layer
│
├── controllers/
│ └── todo.controller.js # Request handling & response formatting
│
├── routes/
│ └── todo.routes.js # Route definitions
│
└── app.js # Application entry point
Database Configuration
// src/config/database.js
const { MongoClient } = require('mongodb');
class Database {
constructor(config) {
this.config = {
url: config.url || process.env.MONGODB_URI || 'mongodb://localhost:27017',
dbName: config.dbName || process.env.DB_NAME || 'todoapp',
options: {
useUnifiedTopology: true,
...config.options
}
};
this.client = null;
this.db = null;
}
async connect() {
try {
this.client = await MongoClient.connect(this.config.url, this.config.options);
this.db = this.client.db(this.config.dbName);
console.log('Connected to MongoDB successfully');
return this.db;
} catch (error) {
console.error('MongoDB connection error:', error);
throw new DatabaseError('Failed to connect to database', error);
}
}
async disconnect() {
try {
if (this.client) {
await this.client.close();
this.client = null;
this.db = null;
console.log('Disconnected from MongoDB');
}
} catch (error) {
console.error('MongoDB disconnection error:', error);
throw new DatabaseError('Failed to disconnect from database', error);
}
}
getDB() {
if (!this.db) {
throw new DatabaseError('Database not initialized. Call connect() first.');
}
return this.db;
}
}
Entity
// src/models/todo.entity.js
class Todo {
static STATUS = {
PENDING: 'pending',
IN_PROGRESS: 'in-progress',
COMPLETED: 'completed'
};
static PRIORITY = {
LOW: 'low',
MEDIUM: 'medium',
HIGH: 'high'
};
constructor(data = {}) {
this._id = data._id || null;
this.title = data.title || '';
this.description = data.description || '';
this.status = data.status || Todo.STATUS.PENDING;
this.priority = data.priority || Todo.PRIORITY.MEDIUM;
this.dueDate = data.dueDate ? new Date(data.dueDate) : null;
this.createdAt = data.createdAt ? new Date(data.createdAt) : new Date();
this.updatedAt = data.updatedAt ? new Date(data.updatedAt) : new Date();
this.completedAt = data.completedAt ? new Date(data.completedAt) : null;
this.tags = Array.isArray(data.tags) ? [...data.tags] : [];
this.assignedTo = data.assignedTo || null;
}
validate() {
const errors = [];
if (!this.title?.trim()) {
errors.push('Title is required');
} else if (this.title.trim().length < 3) {
errors.push('Title must be at least 3 characters long');
}
if (this.status && !Object.values(Todo.STATUS).includes(this.status)) {
errors.push(`Invalid status. Must be one of: ${Object.values(Todo.STATUS).join(', ')}`);
}
if (this.dueDate) {
if (!(this.dueDate instanceof Date) || isNaN(this.dueDate.getTime())) {
errors.push('Invalid due date format');
} else if (this.dueDate < new Date()) {
errors.push('Due date cannot be in the past');
}
}
if (this.priority && !Object.values(Todo.PRIORITY).includes(this.priority)) {
errors.push(`Invalid priority. Must be one of: ${Object.values(Todo.PRIORITY).join(', ')}`);
}
if (this.tags && !Array.isArray(this.tags)) {
errors.push('Tags must be an array');
}
return errors;
}
isOverdue() {
return this.dueDate && this.dueDate < new Date() && this.status !== Todo.STATUS.COMPLETED;
}
toJSON() {
return {
_id: this._id,
title: this.title,
description: this.description,
status: this.status,
priority: this.priority,
dueDate: this.dueDate,
createdAt: this.createdAt,
updatedAt: this.updatedAt,
completedAt: this.completedAt,
tags: this.tags,
assignedTo: this.assignedTo
};
}
}
Validation Errors Class
// src/errors/index.js
class ValidationError extends Error {
constructor(message) {
super(message);
this.name = 'ValidationError';
this.status = 400;
}
}
class DatabaseError extends Error {
constructor(message, originalError = null) {
super(message);
this.name = 'DatabaseError';
this.status = 500;
this.originalError = originalError;
}
}
class NotFoundError extends Error {
constructor(message) {
super(message);
this.name = 'NotFoundError';
this.status = 404;
}
}
DAO
Class-Based with Constructor Injection
This style defines a BaseDAO class that provides a clean, reusable interface with methods like findOne, insertOne, and updateOne. Each DAO (like TodoDAO) extends BaseDAO, passing in a specific collection name.
Pros:
Clean and reusable: BaseDAO is a good foundation, especially if you have multiple DAOs that follow similar structures.
Constructor validation: Ensures db and collectionName are provided, helping catch issues early.
Readability and maintainability: The CRUD operations are well-encapsulated and can be reused easily across different DAOs by extending BaseDAO.
Cons:
More boilerplate code
Every DAO must extend BaseDAO, which might limit flexibility for cases where a DAO might need additional initialization or a unique structure.
// src/daos/base.dao.js
class BaseDAO {
constructor(db, collectionName) {
if (!db) {
throw new Error('Database connection is required');
}
if (!collectionName) {
throw new Error('Collection name is required');
}
this.db = db;
this.collection = this.db.collection(collectionName);
}
async findOne(filter) {
try {
return await this.collection.findOne(filter);
} catch (error) {
throw new DatabaseError('Database query failed', error);
}
}
async find(filter = {}, options = {}) {
try {
return await this.collection.find(filter, options).toArray();
} catch (error) {
throw new DatabaseError('Database query failed', error);
}
}
async insertOne(data) {
try {
return await this.collection.insertOne(data);
} catch (error) {
throw new DatabaseError('Database insert failed', error);
}
}
async updateOne(filter, update, options = {}) {
try {
return await this.collection.updateOne(filter, update, options);
} catch (error) {
throw new DatabaseError('Database update failed', error);
}
}
async deleteOne(filter) {
try {
return await this.collection.deleteOne(filter);
} catch (error) {
throw new DatabaseError('Database delete failed', error);
}
}
}
// src/daos/todo.dao.js
class TodoDAO extends BaseDAO {
constructor(db) {
super(db, 'todos');
}
async create(todoData) {
const todo = new Todo(todoData);
const errors = todo.validate();
if (errors.length > 0) {
throw new ValidationError(errors.join(', '));
}
const todoToInsert = {
...todo.toJSON(),
title: todo.title.trim(),
createdAt: new Date(),
updatedAt: new Date()
};
try {
const result = await this.insertOne(todoToInsert);
return new Todo({ ...todoToInsert, _id: result.insertedId });
} catch (error) {
throw new DatabaseError('Failed to create todo', error);
}
}
async findById(id) {
if (!ObjectId.isValid(id)) {
throw new ValidationError('Invalid todo ID');
}
const todo = await this.findOne({ _id: new ObjectId(id) });
if (!todo) {
throw new NotFoundError('Todo not found');
}
return new Todo(todo);
}
async find(query = {}, options = {}) {
const {
page = 1,
limit = 10,
sortBy = 'createdAt',
sortOrder = -1,
status,
priority,
searchTerm,
fromDate,
toDate,
tags
} = options;
if (page < 1 || limit < 1) {
throw new ValidationError('Invalid pagination parameters');
}
const filter = this._buildFilter({
...query,
status,
priority,
searchTerm,
fromDate,
toDate,
tags
});
const skip = (page - 1) * limit;
const sortOptions = { [sortBy]: sortOrder };
try {
const [todos, totalCount] = await Promise.all([
this.collection
.find(filter)
.sort(sortOptions)
.skip(skip)
.limit(limit)
.toArray(),
this.collection.countDocuments(filter)
]);
return {
todos: todos.map(todo => new Todo(todo)),
pagination: {
total: totalCount,
page,
limit,
pages: Math.ceil(totalCount / limit)
}
};
} catch (error) {
throw new DatabaseError('Failed to fetch todos', error);
}
}
async update(id, updateData) {
const existingTodo = await this.findById(id);
const updatedTodo = new Todo({
...existingTodo,
...updateData,
_id: existingTodo._id,
updatedAt: new Date()
});
const errors = updatedTodo.validate();
if (errors.length > 0) {
throw new ValidationError(errors.join(', '));
}
const result = await this.updateOne(
{ _id: new ObjectId(id) },
{ $set: updatedTodo.toJSON() },
{ returnDocument: 'after' }
);
if (result.matchedCount === 0) {
throw new NotFoundError('Todo not found');
}
return updatedTodo;
}
async delete(id) {
if (!ObjectId.isValid(id)) {
throw new ValidationError('Invalid todo ID');
}
const result = await this.deleteOne({ _id: new ObjectId(id) });
if (result.deletedCount === 0) {
throw new NotFoundError('Todo not found');
}
return true;
}
_buildFilter(options) {
const filter = {};
if (options.status) {
if (!Object.values(Todo.STATUS).includes(options.status)) {
throw new ValidationError('Invalid status filter');
}
filter.status = options.status;
}
if (options.priority) {
if (!Object.values(Todo.PRIORITY).includes(options.priority)) {
throw new ValidationError('Invalid priority filter');
}
filter.priority = options.priority;
}
if (options.searchTerm) {
filter.$or = [
{ title: { $regex: options.searchTerm, $options: 'i' } },
{ description: { $regex: options.searchTerm, $options: 'i' } }
];
}
if (options.fromDate || options.toDate) {
filter.dueDate = {};
if (options.fromDate) {
filter.dueDate.$gte = new Date(options.fromDate);
}
if (options.toDate) {
filter.dueDate.$lte = new Date(options.toDate);
}
}
if (options.tags && Array.isArray(options.tags)) {
filter.tags = { $all: options.tags };
}
return filter;
}
}
Static Method with Injected Connection
Simplified Usage: Static methods remove the need to instantiate the class, reducing memory overhead.
Testing Challenges: Static methods are less flexible with dependency injection, making mocks and tests trickier.
Limited Reusability: Static methods don’t support inheritance, so shared logic across DAOs is harder to centralize.
import { ObjectId } from 'mongodb';
let todos;
export default class TodoDAO {
static async injectDB(conn) {
if (todos) return;
try {
todos = await conn.db(process.env.DB_NAME).collection('todos');
} catch (error) {
console.error(`Unable to establish collection handles in TodoDAO: ${error}`);
}
}
static async create(todoData) {
// Initialize and validate the Todo, build the data object, and insert into the collection
}
static async findById(id) {
// Validate ID, query by `_id`, throw `NotFoundError` if not found
}
static async find(query = {}, options = {}) {
// Construct filter, pagination, and sort options, execute the find query with pagination
}
static async update(id, updateData) {
// Fetch the todo, validate and update with `updateOne`
}
static async delete(id) {
// Validate ID, delete by `_id`, and handle `NotFoundError`
}
static _buildFilter(options) {
// Generate the filter object for queries based on status, priority, search terms, etc.
}
}
Service
// src/services/todo.service.js
class TodoService {
constructor(todoDAO) {
if (!todoDAO) {
throw new Error('TodoDAO is required');
}
this.todoDAO = todoDAO;
}
async createTodo(todoData) {
return await this.todoDAO.create(todoData);
}
async getTodoById(id) {
return await this.todoDAO.findById(id);
}
async updateTodo(id, updateData) {
return await this.todoDAO.update(id, updateData);
}
async deleteTodo(id) {
return await this.todoDAO.delete(id);
}
async updateTodoStatus(id, status) {
const todo = await this.todoDAO.findById(id);
const updates = {
status,
updatedAt: new Date()
};
if (status === Todo.STATUS.COMPLETED && todo.status !== Todo.STATUS.COMPLETED) {
updates.completedAt = new Date();
} else if (status !== Todo.STATUS.COMPLETED && todo.status === Todo.STATUS.COMPLETED) {
updates.completedAt = null;
}
return await this.todoDAO.update(id, updates);
}
async findTodos(options = {}) {
return await this.todoDAO.find({}, options);
}
async findOverdueTodos() {
const now = new Date();
return await this.todoDAO.find({
dueDate: { $lt: now },
status: { $ne: Todo.STATUS.COMPLETED }
});
}
async findTodosByPriority(priority) {
return await this.todoDAO.find({ priority });
}
async assignTodo(id, userId) {
return await this.todoDAO.update(id, {
assignedTo: userId,
updatedAt: new Date()
});
}
async addTags(id, tags) {
const todo = await this.todoDAO.findById(id);
const uniqueTags = [...new Set([...todo.tags, ...tags])];
return await this.todoDAO.update(id, { tags: uniqueTags });
}
async removeTags(id, tags) {
const todo = await this.todoDAO.findById(id);
const updatedTags = todo.tags.filter(tag => !tags.includes(tag));
return await this.todoDAO.update(id, { tags: updatedTags });
}
}
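A rough sketch of how these layers could be wired together at startup using constructor injection, assuming each of the files above exports its class:
// src/app.js (sketch)
const Database = require('./config/database');
const TodoDAO = require('./daos/todo.dao');
const TodoService = require('./services/todo.service');

const start = async () => {
  const database = new Database({});
  const db = await database.connect();

  const todoDAO = new TodoDAO(db);              // data access layer
  const todoService = new TodoService(todoDAO); // business logic layer

  const todo = await todoService.createTodo({ title: 'Write docs', priority: 'high' });
  console.log('Created todo:', todo.toJSON());
};

start().catch(console.error);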
Indexing
Indexing in MongoDB improves query performance by creating efficient data structures for faster data retrieval.
Single Field Index: Index on a single field of a document.
Compound Index: Index on multiple fields.
Multikey Index: Index on array fields.
Text Index: Supports text search queries on string content.
Single Field Index
To create an index, you use the createIndex() method.
db.collection('users').createIndex({ name: 1 });
Compound Index
A compound index in MongoDB is an index on multiple fields in a document.
db.collection('users').createIndex({ name: 1, age: 1, city: 1, country: 1, hobbies: 1 });
Prefixes of a compound index can be used to satisfy queries that match the index fields from left to right.
db.collection('users').find({ name: 'Alice', age: 25 });
db.collection('users').find({ name: 'Alice', age: 25, city: 'New York' });
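By contrast, a query that skips the leading name field cannot use this index efficiently:
db.collection('users').find({ age: 25 }); // "age" alone is not a prefix of the index, so a collection scan is likely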
Multikey Index
A multikey index in MongoDB is an index on an array field.
db.collection('users').createIndex({ hobbies: 1 });
Limitations of Multikey Indexes
- You cannot create a compound index if more than one of the indexed fields is an array.
db.collection('users').createIndex({ "hobbies": 1, "tags": 1 }); // Not allowed if both hobbies and tags are arrays
Text Index
Text indexes in MongoDB provide a way to perform text search queries on string content. Each collection can have at most one text index.
db.products.createIndex({
name: "text",
description: "text",
tags: "text"
});
// Sample Data
db.products.insertMany([
{
name: "MacBook Pro 16-inch",
description: "Powerful laptop for developers",
tags: ["apple", "laptop", "computer"]
},
{
name: "iPhone 15 Pro",
description: "Latest smartphone with great camera",
tags: ["apple", "smartphone", "mobile"]
},
{
name: "Gaming Laptop ROG",
description: "High-performance gaming laptop",
tags: ["asus", "laptop", "gaming"]
}
]);
Exact word matches
db.products.find({ $text: { $search: "laptop" }}); // Will find MacBook and ROG
Multiple words (OR)
// By default, multiple words are treated as logical OR
// Will find documents containing "laptop" or "gaming"
db.products.find({ $text: { $search: "laptop gaming" }});
Multiple Words (Logical AND)
// $text has no "+" operator; to require both terms, wrap each one
// in escaped quotes so it is treated as a phrase
// Will only find documents containing both "laptop" and "gaming"
db.products.find({ $text: { $search: "\"laptop\" \"gaming\"" }});
Exact phrases
// Using double quotes to search for an exact phrase
// Will find documents containing the exact phrase "gaming laptop"
db.products.find({ $text: { $search: "\"gaming laptop\"" }});
Word exclusions
// The minus sign excludes documents containing a term
db.products.find({ $text: { $search: "laptop -gaming" }}); // Laptops but not gaming ones
Case-insensitive matches
// Text search is case-insensitive by default
// Will find all documents containing "laptop" regardless of case
db.products.find({ $text: { $search: "LAPTOP" }});
Diacritic-insensitive
// Text search ignores diacritic marks by default
// Will match "cafe" and "café"
db.products.find({ $text: { $search: "café" }});
Partial Word Matches
// Searching for "lap" won't find "laptop"
db.products.find({ $text: { $search: "lap" }});
Wildcard Searches
// Wildcards are not supported in text search
db.products.find({ $text: { $search: "lap*" }});
Fuzzy Matching (Typos)
// Regular expressions cannot be used inside $text search
db.products.find({ $text: { $search: "/lap.*/" }});
Complex Boolean Expressions
// Nested boolean logic is not supported in $text search
db.products.find({ $text: { $search: "(laptop AND gaming) OR (smartphone AND camera)" }});
Case-Sensitive Searches
// Case-sensitive matching requires the $caseSensitive option (MongoDB 3.2+)
db.products.find({ $text: { $search: "MacBook", $caseSensitive: true }});
Atlas Search
Atlas Search is a fully managed search service that allows you to build fast, relevant, full-text search capabilities on top of your MongoDB data. Compared to text indexes, Atlas Search provides more advanced features like faceted search, fuzzy matching, and relevance scoring.
// Basic Atlas Search Setup
db.products.createSearchIndex({
"mappings": {
"dynamic": true,
"fields": {
"name": {
"type": "string",
"analyzer": "lucene.standard"
},
"description": {
"type": "string",
"analyzer": "lucene.english"
}
}
}
});
Fuzzy Matching - Handles typos and misspellings
db.products.aggregate([
{
$search: {
text: {
query: "labtop", // Will match "laptop"
path: "name",
fuzzy: { maxEdits: 1 }
}
}
}
]);
Autocomplete - Real-time suggestions
db.products.aggregate([
{
$search: {
autocomplete: {
query: "mac", // Will suggest "macbook", "machine", etc.
path: "name"
}
}
}
]);
Custom Scoring - Boost relevance
You can boost the relevance of certain documents based on specific criteria.
db.products.aggregate([
{
$search: {
compound: {
must: [
{ text: { query: "laptop", path: "name" } }
],
should: [
{
text: {
query: "gaming",
path: "category",
score: { boost: { value: 2.0 } } // Boost gaming laptops
}
}
]
}
}
}
]);
Documents must contain "laptop" in the name field.
Documents containing "gaming" in the category field are boosted.
Multi-language Support
db.products.aggregate([
{
$search: {
text: {
query: "ordinateur portable", // French for "laptop"
path: "description",
analyzer: "lucene.french"
}
}
}
]);
Faceted Search - Filter results
Faceted search allows users to filter search results based on categories, price ranges, and other attributes.
db.products.aggregate([
{
$searchMeta: {
facet: {
operator: { text: { query: "laptop", path: "description" } },
facets: {
categories: { type: "string", path: "category" },
priceRanges: {
type: "number",
path: "price",
boundaries: [500, 1000, 2000]
}
}
}
}
}
]);
Facets:
- categories: Groups results by the category field.
- priceRanges: Groups results into the price ranges defined by the boundaries.
Trade-offs
Besides the benefits, Atlas Search also has some trade-offs compared to traditional text indexes.
Requires M10+ clusters
Additional cost
More complex setup
Higher resource usage
Array Operations
MongoDB supports a variety of array operations for working with arrays in documents.
Adding Elements to an Array
To add elements to an array in a document, you use the $push operator.
db.collection('users').updateOne({ name: 'Alice' }, { $push: { hobbies: 'Reading' } });
Querying Arrays with $elemMatch
Given a collection of students with scores in different subjects:
{
"_id": ObjectId("64d39a7a8b0e8c284a2c1234"),
"name": "Alice",
"scores": [
{ "subject": "Math", "score": 95 },
{ "subject": "English", "score": 88 }
]
},
{
"_id": ObjectId("64d39a808b0e8c284a2c1235"),
"name": "Bob",
"scores": [
{ "subject": "Math", "score": 78 },
{ "subject": "English", "score": 92 }
]
}
Querying with $elemMatch:
To find students who specifically scored above 90 in Math, we need $elemMatch:
db.students.find({
scores: {
$elemMatch: { subject: "Math", score: { $gt: 90 } }
}
})
Result: This will return only Alice, as she is the only student with a score above 90 in the "Math" subject. $elemMatch ensures that all of the conditions are met by the same array element.
Without $elemMatch, the same conditions would match both Alice and Bob, because each condition can be satisfied by a different array element rather than by the single "Math" element, as shown below.
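Here each condition may be satisfied by a different element of scores:
db.students.find({
  "scores.subject": "Math",
  "scores.score": { $gt: 90 }
})
// Bob matches too: "Math" satisfies the subject condition and
// English's 92 satisfies the score condition, in different elements.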
Add Unique - $addToSet
The $addToSet operator in MongoDB is used to add elements to an array only if they are not already present. This prevents duplicate entries in the array.
db.collection('users').updateOne({ name: 'Alice' }, { $addToSet: { hobbies: 'Reading' } });
Add Multiple $push and $each
The $push operator in MongoDB is used to add elements to an array. The $each modifier allows you to add multiple elements to the array.
db.collection('users').updateOne({ name: 'Alice' }, { $push: { hobbies: { $each: ['Reading', 'Swimming'] } } });
Add Sorted - $push and $sort
The $push operator in MongoDB is used to add elements to an array. The $sort modifier allows you to sort the array elements.
db.collection('users').updateOne({ name: 'Alice' }, { $push: { scores: { $each: [85, 90], $sort: -1 } } });
Push and Sort Array of Objects by a specific field
db.collection('users').updateOne(
{ name: 'Alice' },
{
$push: {
scores: {
$each: [
{ score: 85, date: "2023-03-01" },
{ score: 90, date: "2023-04-01" }
],
$sort: { date: -1 }
}
}
}
);
Add Limited - $push and $slice
The $push operator in MongoDB is used to add elements to an array. The $slice modifier allows you to limit the number of elements in the array.
db.collection('users').updateOne({ name: 'Alice' }, { $push: { scores: { $each: [85, 90], $slice: -3 } } });
If Alice currently has a scores array like [70, 75, 80], this query will push 85 and 90, making it [70, 75, 80, 85, 90]. The $slice: -3 will then trim it to the last 3 elements, resulting in [80, 85, 90].
Removing Elements from an Array
To remove elements from an array in a document, you use the $pull operator.
db.collection('users').updateOne({ name: 'Alice' }, { $pull: { hobbies: 'Reading' } });
Remove the first or last element - $pop
The $pop operator in MongoDB is used to remove the first or last element from an array: 1 removes the last element, -1 removes the first.
db.collection('users').updateOne({ name: 'Alice' }, { $pop: { hobbies: 1 } }); // remove the last hobby
db.collection('users').updateOne({ name: 'Alice' }, { $pop: { hobbies: -1 } }); // remove the first hobby
Remove Multiple - $pullAll
The $pullAll operator in MongoDB is used to remove all occurrences of specified values from an array.
db.collection('users').updateOne({ name: 'Alice' }, { $pullAll: { hobbies: ['Reading', 'Swimming'] } });
Remove Multiple - $pull and $in
The $pull operator in MongoDB is used to remove elements from an array. The $in modifier allows you to specify multiple values to remove.
db.collection('users').updateOne({ name: 'Alice' }, { $pull: { hobbies: { $in: ['Reading', 'Swimming'] } } });
Remove Condition - $pull and $gt
The $pull operator in MongoDB is used to remove elements from an array. The $gt modifier allows you to specify a condition for removing elements.
db.collection('users').updateOne({ name: 'Alice' }, { $pull: { scores: { $gt: 85 } } });
Remove Not Equal - $pull and $ne
The $pull operator in MongoDB is used to remove elements from an array. The $ne modifier allows you to remove elements that are not equal to a given value.
db.collection('users').updateOne({ name: 'Alice' }, { $pull: { scores: { $ne: 85 } } });
Update Condition - $set and $
The $set operator in MongoDB is used to update fields in a document. The $ positional operator allows you to update the first array element that matches the query condition.
db.collection('users').updateOne({ name: 'Alice', 'scores.subject': 'Math' }, { $set: { 'scores.$.score': 90 } });
Update All - $[]
The $[] all-positional operator in MongoDB is used to update every element in an array.
db.collection('users').updateOne({ name: 'Alice' }, { $set: { 'scores.$[].score': 90 } });
To increment a field in all elements of an array, you can use the $[] operator with the $inc operator.
db.collection('users').updateOne({ name: 'Alice' }, { $inc: { 'scores.$[].score': 5 } });
Update Filtered - $[<identifier>]
The $[<identifier>] filtered positional operator in MongoDB updates the array elements that match the conditions specified in arrayFilters.
db.collection('users').updateOne({ name: 'Alice' }, { $set: { 'scores.$[elem].score': 90 } }, { arrayFilters: [{ 'elem.subject': 'Math' }] });
Aggregation
Aggregation operations process data records and return computed results. Aggregation allows you to perform complex data processing and transformation.
Aggregation Pipeline
The aggregation framework in MongoDB uses a pipeline approach, where multiple stages transform the documents.
db.collection('orders').aggregate([
{ $match: { status: 'A' } },
{ $group: { _id: '$cust_id', total: { $sum: '$amount' } } },
{ $sort: { total: -1 } }
]);
Join Collections
MongoDB does not support joins like relational databases. Instead, you can use the $lookup operator to perform a left outer join between two collections.
db.collection('orders').aggregate([
{
$lookup: {
from: 'customers',
localField: 'cust_id',
foreignField: '_id',
as: 'customer'
}
}
]);
Unwind Arrays
The $unwind operator in MongoDB is used to deconstruct an array field into multiple documents, one per array element.
db.collection('orders').aggregate([
{ $unwind: '$items' }
]);
Unwind and Group
db.collection('orders').aggregate([
{ $unwind: '$items' },
{
$group: {
_id: '$items.productId',
totalQuantity: { $sum: '$items.quantity' },
totalRevenue: { $sum: { $multiply: ['$items.price', '$items.quantity'] } },
ordersCount: { $sum: 1 }
}
}
]);
Unwind Multiple Arrays
db.collection('restaurants').aggregate([
{ $unwind: '$categories' },
{ $unwind: '$reviews' },
{
$group: {
_id: '$categories',
averageRating: { $avg: '$reviews.rating' },
reviewCount: { $sum: 1 }
}
}
]);
Group Documents
The $group operator in MongoDB is used to group documents by a specified key.
db.collection('orders').aggregate([
{ $group: { _id: '$cust_id', total: { $sum: '$amount' } } }
]);
Group and Count
db.collection('orders').aggregate([
{ $group: { _id: '$status', count: { $sum: 1 } } }
]);
Group and Sum
db.collection('orders').aggregate([
{ $group: { _id: '$status', total: { $sum: '$amount' } } }
]);
Group and Average
db.collection('orders').aggregate([
{ $group: { _id: '$status', average: { $avg: '$amount' } } }
]);
Group and Push
db.collection('orders').aggregate([
{ $group: { _id: '$cust_id', items: { $push: '$item' } } }
]);
Group with Multiple Fields
db.collection('orders').aggregate([
{
$group: {
_id: {
status: '$status',
category: '$category'
},
count: { $sum: 1 },
totalAmount: { $sum: '$amount' }
}
}
]);
Group with Date Operations
db.collection('orders').aggregate([
{
$group: {
_id: {
year: { $year: '$orderDate' },
month: { $month: '$orderDate' }
},
totalOrders: { $sum: 1 },
revenue: { $sum: '$amount' }
}
}
]);
Project Fields
The $project operator in MongoDB is used to include, exclude, or rename fields in the output documents.
db.collection('orders').aggregate([
{ $project: { _id: 0, cust_id: 1, amount: 1 } }
]);
Project with Computed Fields
db.collection('orders').aggregate([
{
$project: {
_id: 0,
cust_id: 1,
amount: 1,
discount: { $subtract: ['$total', '$amount'] }
}
}
]);
Project With Array Operations
db.collection('orders').aggregate([
{
$project: {
orderId: 1,
itemCount: { $size: '$items' },
firstItem: { $arrayElemAt: ['$items', 0] },
lastItem: { $arrayElemAt: ['$items', -1] },
items: {
$map: {
input: '$items',
as: 'item',
in: {
name: '$$item.name',
subtotal: {
$multiply: ['$$item.price', '$$item.quantity']
}
}
}
}
}
}
]);
Project With String Operations
db.collection('users').aggregate([
{
$project: {
fullName: { $concat: ['$firstName', ' ', '$lastName'] },
email: { $toLower: '$email' },
age: { $toString: '$age' }
}
}
]);
Project With Conditional Fields
db.collection('users').aggregate([
{
$project: {
name: 1,
status: {
$cond: {
if: { $gte: ['$age', 18] },
then: 'Adult',
else: 'Minor'
}
}
}
}
]);
Run Multiple Aggregations
You can run multiple aggregation pipelines in a single query using the $facet operator.
db.collection('orders').aggregate([
{
$facet: {
totalAmount: [
{ $group: { _id: null, total: { $sum: '$amount' } } }
],
averageAmount: [
{ $group: { _id: null, average: { $avg: '$amount' } } }
]
}
}
]);
Transactions
Transactions in MongoDB allow you to perform multiple operations as a single, all-or-nothing unit of work. They ensure data integrity and consistency across multiple documents and collections.
Transaction Properties (ACID)
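Atomicity: all operations in the transaction are applied, or none are.
Consistency: the database moves from one valid state to another.
Isolation: concurrent transactions do not see each other's intermediate changes.
Durability: once committed, changes survive failures (subject to the write concern used).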
Using Transactions
To use transactions in MongoDB, you typically follow these steps:
Start a session
Start a transaction
Perform operations
Commit or abort the transaction
// Define a client
const { MongoClient } = require('mongodb');
const client = new MongoClient('mongodb://localhost:27017');
//...
// Start a session
const session = client.startSession();
try {
session.startTransaction();
// Perform multiple operations
await collection1.updateOne({ _id: 1 }, { $set: { status: 'processing' } }, { session });
await collection2.insertOne({ orderId: 1, items: ['item1', 'item2'] }, { session });
// Commit the transaction
await session.commitTransaction();
} catch (error) {
// If an error occurred, abort the transaction
await session.abortTransaction();
console.error('Transaction aborted:', error);
} finally {
// End the session
session.endSession();
}
Considerations for Transactions
Performance: Transactions may impact performance, especially for write-heavy workloads.
Timeout: Transactions have a default timeout of 60 seconds.
Replica Sets: Transactions require a replica set configuration.
Sharded Clusters: Transactions on sharded clusters have additional considerations and limitations.
By using transactions, you can ensure data consistency and integrity across multiple operations in MongoDB, especially when dealing with complex data models or critical business logic.
Replica Sets
A replica set is a group of MongoDB instances that maintain the same data set. Replica sets provide redundancy and high availability.
Components of a Replica Set
Primary: Receives all write operations.
Secondary: Replicates data from the primary. Can be used for read operations.
Arbiter: Participates in elections for primary but does not hold data.
Replica Set Configuration
To configure a replica set, you use the rs.initiate() method.
rs.initiate({
_id: 'rs0',
members: [
{ _id: 0, host: 'mongo1:27017' },
{ _id: 1, host: 'mongo2:27017' },
{ _id: 2, host: 'mongo3:27017', arbiterOnly: true }
]
});
Read Preference
Read preference in MongoDB determines how read operations are distributed across the replica set.
Primary: Reads from the primary.
Secondary: Reads from the secondary.
PrimaryPreferred: Reads from the primary if available, otherwise from the secondary.
SecondaryPreferred: Reads from the secondary if available, otherwise from the primary.
Nearest: Reads from the nearest member of the replica set.
db.collection('users').find().readPref('secondary');
Write Concern
Write concern in MongoDB determines the level of acknowledgment for write operations.
w: 0: No acknowledgment.
w: 1: Acknowledgment from the primary.
w: majority: Acknowledgment from the majority of the replica set.
db.collection('users').insertOne({ name: 'Alice' }, { writeConcern: { w: 'majority' } });
Automatic Failover
MongoDB uses a heartbeat mechanism to detect the availability of replica set members. If the primary becomes unavailable, a new primary is elected.
The Primary is elected based on the votes of the replica set members.
A Secondary can be promoted to primary if the primary becomes unavailable.
An Arbiter is used to break ties in elections.
Manual Failover
You can initiate a manual failover in MongoDB by instructing the current primary to step down, which triggers an election for a new primary.
rs.stepDown();
Sharding
Sharding is a method for distributing data across multiple machines. It allows you to scale horizontally by adding more machines to your system.
A collection is divided into chunks, and the chunks are distributed across the shards.
Each shard holds a subset of the data in the sharded cluster.
Components of Sharding
Shard: A subset of the data in a sharded cluster.
Config Server: Stores metadata and configuration settings for the cluster.
Query Router: Routes queries to the appropriate shard.
Sharding Key
The sharding key is the field used to distribute data across the shards. It should be chosen carefully to ensure a balanced distribution of data.
db.collection.createIndex({ _id: 'hashed' });
When selecting a shard key, consider the following factors:
Cardinality: The number of unique values in the shard key.
Write Scaling: The ability to distribute write operations across shards.
Query Isolation: The ability to target specific shards for read operations.
Shard Key Strategies
Hashed Sharding: Distributes data evenly across the shards using a hash function.
Range Sharding: Distributes data based on a range of values in the shard key.
Compound Sharding: Distributes data based on multiple fields in the shard key.
db.collection.createIndex({ _id: 'hashed' });
db.collection.createIndex({ date: 1 });
db.collection.createIndex({ country: 1, city: 1 });
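Creating the index alone does not shard the collection; in mongosh you would typically enable sharding for the database and then shard the collection on the chosen key, roughly like this (the database and collection names are illustrative):
sh.enableSharding('mydb');
sh.shardCollection('mydb.users', { _id: 'hashed' });        // hashed sharding
sh.shardCollection('mydb.events', { date: 1 });             // range sharding
sh.shardCollection('mydb.orders', { country: 1, city: 1 }); // compound shard key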
Mongoose
Mongoose is an Object Data Modeling (ODM) library for MongoDB and Node.js. It provides a schema-based solution to model your application data.
Connecting to MongoDB
To connect to MongoDB using Mongoose, you use the connect() method.
const mongoose = require('mongoose');
mongoose.connect('mongodb://localhost:27017/myapp', { useNewUrlParser: true, useUnifiedTopology: true });
Defining a Schema
A Mongoose schema defines the structure of the documents in a collection.
const userSchema = new mongoose.Schema({
name: String,
age: Number
});
Creating a Model
A Mongoose model is a class that represents a collection in MongoDB.
const User = mongoose.model('User', userSchema);
Inserting Documents
To insert a document into a collection, you create an instance of the model and call the save() method.
const user = new User({ name: 'Alice', age: 25 });
user.save();
Querying Documents
To query documents from a collection, you use the find() method.
User.find({ name: 'Alice' });
With Projection:
User.find({ name: 'Alice' }, { name: 1, age: 1 });
Updating Documents
To update documents in a collection, you use the updateOne() method.
User.updateOne({ name: 'Alice' }, { age: 26 });
Deleting Documents
To delete documents from a collection, you use the deleteOne() method.
User.deleteOne({ name: 'Alice' });
Middleware
Mongoose middleware are functions that are executed before or after certain operations.
userSchema.pre('save', function(next) {
console.log('Saving user...');
next();
});
Virtuals
Mongoose virtuals are document properties that you can get and set but that do not get persisted to MongoDB.
userSchema.virtual('fullName').get(function() {
return this.name + ' ' + this.age;
});
Plugins
Mongoose plugins are reusable pieces of schema middleware that can be added to any schema.
const timestampPlugin = require('./plugins/timestamp');
userSchema.plugin(timestampPlugin);
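The ./plugins/timestamp module above is not shown; a minimal sketch of what such a plugin might look like is a function that receives the schema and augments it:
// plugins/timestamp.js (sketch)
module.exports = function timestampPlugin(schema) {
  // Add the fields the plugin manages
  schema.add({ createdAt: Date, updatedAt: Date });

  // Keep the timestamps up to date on every save
  schema.pre('save', function (next) {
    const now = new Date();
    this.updatedAt = now;
    if (!this.createdAt) {
      this.createdAt = now;
    }
    next();
  });
};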
Transactions
Mongoose transactions allow you to perform multiple operations on multiple documents in a single transaction.
const session = await mongoose.startSession();
session.startTransaction();
try {
  await User.create([{ name: 'Alice' }], { session }); // pass docs as an array when using options
  await User.create([{ name: 'Bob' }], { session });
await session.commitTransaction();
} catch (error) {
await session.abortTransaction();
} finally {
session.endSession();
}
Aggregation
Mongoose provides a fluent API for building aggregation pipelines.
const result = await User.aggregate([
{ $match: { name: 'Alice' } },
  { $group: { _id: '$name', total: { $sum: '$age' } } }
]);
Indexes
Mongoose allows you to define indexes on your schemas.
userSchema.index({ name: 1 });
Population
Mongoose population allows you to reference documents in other collections.
const userSchema = new mongoose.Schema({
name: String,
posts: [{ type: mongoose.Schema.Types.ObjectId, ref: 'Post' }]
});
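To replace the stored ObjectIds with the referenced documents at query time, call populate() (assuming a Post model is registered):
const user = await User.findOne({ name: 'Alice' }).populate('posts');
console.log(user.posts); // array of Post documents instead of ObjectIds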
Validation
Mongoose provides built-in validation for schema fields.
const userSchema = new mongoose.Schema({
name: { type: String, required: true }
});
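Beyond required, Mongoose offers built-in validators such as min/max, minlength, enum, match, and custom validators; a brief example:
const userSchema = new mongoose.Schema({
  name: { type: String, required: true, minlength: 3 },
  age: { type: Number, min: 0, max: 120 },
  role: { type: String, enum: ['user', 'admin'], default: 'user' },
  email: {
    type: String,
    validate: {
      validator: (value) => /.+@.+/.test(value), // custom validator
      message: 'Invalid email address'
    }
  }
});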
Patterns
| Pattern | Description | Use Case | Advantages | Disadvantages |
| --- | --- | --- | --- | --- |
| Bucket Pattern | Groups related documents into fixed-size "buckets" or arrays | Time-series data, IoT sensor readings | Reduces the number of documents; improves query performance for range scans | Complex to update individual items; may lead to document growth |
| Attribute Pattern | Stores a set of fields with similar access patterns as an embedded document | Products with varying attributes | Flexible schema; efficient querying of common attributes | More complex queries for specific attributes; potential for unused fields |
| Outlier Pattern | Stores common data in one collection and rare, oversized data in another | Social media posts with varying engagement levels | Optimizes for common-case performance; prevents document size issues | Requires two queries for outliers; more complex application logic |
| Subset Pattern | Stores a subset of fields from a document in a separate collection | User profiles with frequently accessed fields | Improves read performance for common queries; reduces working set size | Data duplication; requires keeping subsets in sync |
Q&A
What is .exec() in Mongoose?
The exec() function in Mongoose is used to execute a query and return a promise. It allows you to chain query methods and then execute the query at the end.
User.find({ name: 'Alice' }).exec();
You can also run the query without exec(), either with a callback or by using async/await.
User.find({ name: 'Alice' }, (error, users) => {
console.log(users);
});
const users = await User.find({ name: 'Alice' });
What is the difference between findOne() and find() in Mongoose?
find(): Returns an array of all documents that match the query criteria.
findOne(): Returns the first document that matches the query criteria.
What is the difference between Model.create() and new Model().save() in Mongoose?
Model.create(): Creates a new document and saves it to the database in a single step.
User.create({ name: 'Alice' });
new Model().save(): Creates a new instance of the model but doesn't save it to the database immediately. You can modify the instance, perform validations, or run other operations before calling .save() to persist the changes.
const doc = new Model({ name: 'John', age: 30 });
doc.age = 31; // Modify the document
await doc.save(); // Save the document after modification
What is the purpose of the lean() method in Mongoose queries, and when should it be used?
The lean() method in Mongoose queries returns plain JavaScript objects instead of Mongoose documents, which carry additional features such as getters, setters, and instance methods. Use lean() when you don't need those features and want to improve query performance.
User.find({ name: 'Alice' }).lean();
How to implement soft deletes in Mongoose?
Soft deletes in Mongoose involve marking documents as deleted instead of physically removing them from the database. You can achieve this by adding a deleted field to your schema and setting it to true when a document is deleted.
const userSchema = new mongoose.Schema({
name: String,
deleted: { type: Boolean, default: false }
});
Use pre middleware to exclude deleted documents from query results.
userSchema.pre(/^find/, function(next) {
this.where({ deleted: false });
next();
});
Add a method to "soft delete" a document.
userSchema.methods.softDelete = function() {
this.deleted = true;
return this.save();
};