# Database & Query Optimization Guide

Techniques used in this project and recommendations for loading data efficiently.

---

## 1. Avoid N+1 queries (batch lookups)

**Problem:** Looping over items and querying the DB per item causes N+1 round-trips.

**Solution:** Collect IDs, then run one or two bulk queries and map in memory.

**Example (Documents list):**
- Before: For each document, run `Person.findById` + `Participant.findOne` → 2N queries.
- After: Get all unique `personId`s → `Person.find({ _id: { $in: personIds } })` + `Participant.find({ person_id: { $in: personIds } })` → 2 queries total, then build a Map and resolve each doc in memory.
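
The batched version above can be sketched as follows. The model names (`Person`, `Participant`) match this project, but the document shape and field names are assumptions; the models are passed in as parameters only to keep the sketch self-contained.

```js
// Resolve Person + Participant for a list of documents in two bulk queries
// instead of 2N round-trips.
async function resolveDocuments(docs, PersonModel, ParticipantModel) {
  // 1) Collect unique person IDs from the documents.
  const personIds = [...new Set(docs.map((d) => String(d.personId)))];

  // 2) Two bulk queries instead of one pair per document.
  const [persons, participants] = await Promise.all([
    PersonModel.find({ _id: { $in: personIds } }).lean(),
    ParticipantModel.find({ person_id: { $in: personIds } }).lean(),
  ]);

  // 3) Build Maps once, then resolve each doc in memory (O(1) per doc).
  const personById = new Map(persons.map((p) => [String(p._id), p]));
  const participantByPerson = new Map(
    participants.map((p) => [String(p.person_id), p])
  );

  return docs.map((d) => ({
    ...d,
    person: personById.get(String(d.personId)) || null,
    participant: participantByPerson.get(String(d.personId)) || null,
  }));
}
```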

---

## 2. Use `.lean()` for read-only lists

**When:** You only need plain objects (no Mongoose methods, no change tracking).

**Effect:** Skips hydrating full Mongoose documents; queries return plain objects, which is faster and uses less memory.

```js
const list = await Model.find(filter).lean();
```

---

## 3. Use `.select()` to limit fields

**When:** You don’t need all fields (e.g. list views).

**Effect:** Less data over the wire and less work for MongoDB.

```js
const persons = await Person.find({ _id: { $in: ids } })
  .select("name lname email")
  .lean();
```

---

## 4. Indexes on frequently queried fields

Indexes are defined in the models. Ensure they actually exist in MongoDB: Mongoose builds them on connection when `autoIndex` is enabled, or you can create them explicitly with `Model.syncIndexes()` or a migration.

**Document:** `personId`, `lastUpdated`, `status`  
**Participant:** `person_id`, `competition_id`, `group_id`, `is_deleted`, compound `(person_id, competition_id)`  
**Person:** `email` (unique), `(name, lname)` for search
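
As a sketch, the Participant indexes above might be declared in the Mongoose schema like this (field names follow the list above; the exact schemas in the models may differ):

```js
const mongoose = require("mongoose");

const participantSchema = new mongoose.Schema({
  person_id: { type: mongoose.Schema.Types.ObjectId, ref: "Person", index: true },
  competition_id: { type: mongoose.Schema.Types.ObjectId, index: true },
  group_id: { type: mongoose.Schema.Types.ObjectId, index: true },
  is_deleted: { type: Boolean, default: false, index: true },
});

// Compound index for queries filtering by both person and competition.
participantSchema.index({ person_id: 1, competition_id: 1 });
```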

**Check indexes in Mongo shell:**
```js
db.documents.getIndexes();
db.participants.getIndexes();
db.people.getIndexes();
```

---

## 5. Pagination for large lists

**When:** Lists can grow large (e.g. participants, persons).

**Pattern:**
```js
const page = Math.max(1, parseInt(req.query.page, 10) || 1);
const limit = Math.min(100, Math.max(1, parseInt(req.query.limit, 10) || 20));
const skip = (page - 1) * limit;
const [items, total] = await Promise.all([
  Model.find(filter).sort({ createdAt: -1 }).skip(skip).limit(limit).lean(),
  Model.countDocuments(filter),
]);
res.json({ data: items, total, page, limit });
```

---

## 6. Use `limit` on list endpoints

**When:** Frontend doesn’t need the entire set at once.

**Example:** Participants route already supports `?limit=500`. Prefer a reasonable default (e.g. 100–500) and optional pagination.
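
A minimal sketch of the clamping logic (the helper name and defaults are illustrative, not from the existing routes):

```js
// Clamp ?limit to a sane range, falling back to a default when absent/invalid.
function clampLimit(raw, { def = 100, max = 500 } = {}) {
  const n = parseInt(raw, 10);
  return Number.isNaN(n) ? def : Math.min(max, Math.max(1, n));
}

// Usage in a route handler:
// const items = await Participant.find(filter).limit(clampLimit(req.query.limit)).lean();
```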

---

## 7. Projection in aggregation

**When:** You need computed fields or heavy joins.

**Example:** Use `$lookup` once and `$project` to shape the result instead of multiple `populate()` calls or N+1.
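
A hedged sketch of such a pipeline for participants joined with their person (collection and field names are assumptions based on the index lists above):

```js
// One $lookup + $project instead of per-item populate() calls.
const pipeline = [
  { $match: { is_deleted: false } },
  {
    $lookup: {
      from: "people",            // Person collection
      localField: "person_id",
      foreignField: "_id",
      as: "person",
    },
  },
  { $unwind: "$person" },        // one person per participant
  {
    $project: {
      competition_id: 1,
      "person.name": 1,
      "person.lname": 1,
      fullName: { $concat: ["$person.name", " ", "$person.lname"] }, // computed field
    },
  },
];

// const rows = await Participant.aggregate(pipeline);
```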

---

## 8. Caching (optional)

**When:** Data changes infrequently (e.g. countries, org types).

**Options:** In-memory cache with TTL, or Redis. Cache list responses and invalidate on create/update/delete.
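
A minimal in-memory TTL cache sketch (single-process only; use Redis when running multiple instances — the function names here are illustrative):

```js
const cache = new Map();

function cacheGet(key) {
  const hit = cache.get(key);
  if (!hit) return undefined;
  if (Date.now() > hit.expires) {
    cache.delete(key);           // expired: evict lazily on read
    return undefined;
  }
  return hit.value;
}

function cacheSet(key, value, ttlMs = 60_000) {
  cache.set(key, { value, expires: Date.now() + ttlMs });
}

function cacheInvalidate(key) {
  cache.delete(key);             // call on create/update/delete
}

// Usage in a route: serve cached countries, else query and cache the result.
// const countries = cacheGet("countries") ??
//   cacheSet("countries", await Country.find().lean()) ?? cacheGet("countries");
```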

---

## Summary checklist

- [x] Documents list: batch Person + Participant (no N+1)
- [x] Indexes on Document, Participant, Person
- [ ] Add pagination where lists can be large (participants, persons, changelogs)
- [ ] Use `.lean()` and `.select()` on list routes that don’t already
- [ ] Consider caching for rarely changing reference data
