
Rate limits ⏱️

Mappa uses rate limiting to ensure fair usage and maintain service quality for everyone. Here's how to work within those limits.

Quick reference

| Endpoint | Limit | Notes |
| --- | --- | --- |
| File uploads (`POST /v1/files`) | 10/min | Large files count as 1 request |
| Report jobs (`POST /v1/reports/jobs`) | 20/min | Per API key |
| Job status (`GET /v1/jobs/:id`) | 60/min | Use webhooks instead! |
| File operations | 30/min | Get, list, delete |
| Entity operations | 30/min | Get, list, tag |
| Credit checks | 100/min | Balance and transactions |
| Everything else | 100/min | General limit |
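If you throttle client-side, the limits above can be mirrored as a small constant map. This is a sketch for your own code; the key names are illustrative and not part of the API:

```typescript
// Requests allowed per minute for each endpoint group (mirrors the table above).
const RATE_LIMITS = {
  fileUploads: 10,      // POST /v1/files
  reportJobs: 20,       // POST /v1/reports/jobs
  jobStatus: 60,        // GET /v1/jobs/:id
  fileOperations: 30,
  entityOperations: 30,
  creditChecks: 100,
  default: 100,
} as const;

// Minimum spacing between requests needed to stay under a given limit.
function minIntervalMs(limitPerMinute: number): number {
  return 60_000 / limitPerMinute;
}
```

For example, `minIntervalMs(RATE_LIMITS.reportJobs)` gives the 3-second gap that keeps report-job creation under 20/min.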
Pro Tip

Use webhooks instead of polling job status; it saves 90% of your rate limit budget!


Reading rate limit headers

Every response includes rate limit information:

X-RateLimit-Limit: 100          # Total requests allowed per window
X-RateLimit-Remaining: 95       # Requests remaining in this window
X-RateLimit-Reset: 1704067200   # Unix timestamp when limit resets

Example:

const response = await fetch("https://api.mappa.ai/v1/credits", {
headers: { "Mappa-Api-Key": apiKey },
});

const limit = response.headers.get("X-RateLimit-Limit");
const remaining = response.headers.get("X-RateLimit-Remaining");
const reset = response.headers.get("X-RateLimit-Reset");

console.info(`${remaining}/${limit} requests remaining`);
console.info(`Resets at ${new Date(Number.parseInt(reset!) * 1000)}`);

Handling 429 responses

When you exceed the limit, you'll get a 429 Too Many Requests response:

{
  "error": {
    "code": "RATE_LIMITED",
    "message": "Rate limit exceeded for this endpoint",
    "retryAfter": 60
  }
}

SDK handling (automatic)

The SDK retries automatically with exponential backoff:

import { Mappa } from "@mappa-ai/mappa-node";

const mappa = new Mappa({
  apiKey: process.env.MAPPA_API_KEY!,
  maxRetries: 3, // Will retry on 429
});

// SDK handles rate limits automatically
const job = await mappa.reports.createJob({
  media: { mediaId: "media_123" },
  output: { template: "general_report" },
  target: { strategy: "dominant" },
});

Manual handling (REST API)

If using the REST API directly:

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function withRetry<T>(fn: () => Promise<T>, maxRetries = 3): Promise<T> {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await fn();
    } catch (err: any) {
      if (err.status === 429) {
        // retryAfter is reported in seconds; fall back to exponential backoff
        const retryAfterMs = err.retryAfter ? err.retryAfter * 1000 : 2 ** i * 1000;
        console.warn(`Rate limited. Retrying after ${retryAfterMs}ms`);
        await sleep(retryAfterMs);
        continue;
      }
      throw err;
    }
  }
  throw new Error("Max retries exceeded");
}

// Usage: fetch() resolves (not rejects) on 429, so surface it as an error
const job = await withRetry(async () => {
  const response = await fetch("https://api.mappa.ai/v1/reports/jobs", {
    method: "POST",
    headers: {
      "Mappa-Api-Key": apiKey,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({...}),
  });
  if (response.status === 429) {
    const body = await response.json();
    throw Object.assign(new Error("Rate limited"), {
      status: 429,
      retryAfter: body.error.retryAfter,
    });
  }
  return response.json();
});

Best practices

1. Use webhooks, not polling

❌ Bad: Polling wastes your rate limit

// Polls 120 times for a 4-minute job (1,800 requests/hour at this rate!)
while (true) {
  const job = await mappa.jobs.get(jobId); // Uses rate limit
  if (job.status === "COMPLETED") break;
  await sleep(2000);
}

✅ Good: Webhooks save your rate limit

// Single request, webhook notifies when done
const job = await mappa.reports.createJob({
  ...params,
  webhook: { url: "https://yourapp.com/webhooks/mappa" },
  target: { strategy: "dominant" },
});
// No polling needed!

Set up webhooks →

2. Monitor rate limit headers

async function makeRequest(url: string) {
  const response = await fetch(url, {
    headers: { "Mappa-Api-Key": process.env.MAPPA_API_KEY! },
  });

  const remaining = Number.parseInt(
    response.headers.get("X-RateLimit-Remaining") || "0"
  );

  if (remaining < 10) {
    console.warn(`⚠️ Only ${remaining} requests remaining!`);
    // Alert your team or throttle requests
  }

  return response.json();
}

3. Implement request queuing

For high-volume apps, queue requests to stay within limits:

class RateLimitedQueue {
  private queue: Array<() => Promise<any>> = [];
  private processing = false;
  private requestsPerMinute = 20;
  private interval = 60000 / this.requestsPerMinute; // ms between requests

  async add<T>(fn: () => Promise<T>): Promise<T> {
    return new Promise((resolve, reject) => {
      this.queue.push(async () => {
        try {
          const result = await fn();
          resolve(result);
        } catch (err) {
          reject(err);
        }
      });

      if (!this.processing) {
        this.process();
      }
    });
  }

  private async process() {
    this.processing = true;

    while (this.queue.length > 0) {
      const fn = this.queue.shift()!;
      await fn();
      await sleep(this.interval);
    }

    this.processing = false;
  }
}

// Usage
const queue = new RateLimitedQueue();

// Queues requests to respect rate limit
const job1 = await queue.add(() => mappa.reports.createJob({...}));
const job2 = await queue.add(() => mappa.reports.createJob({...}));

4. Cache responses

Don't re-fetch data that hasn't changed:

const cache = new Map<string, { data: any; expiry: number }>();

async function getCreditBalanceWithCache() {
  const key = "credit-balance";
  const cached = cache.get(key);

  if (cached && Date.now() < cached.expiry) {
    return cached.data; // Return cached data
  }

  // Fetch fresh data
  const balance = await mappa.credits.getBalance();

  // Cache for 5 minutes
  cache.set(key, {
    data: balance,
    expiry: Date.now() + 300000,
  });

  return balance;
}

5. Batch operations

Process multiple items concurrently (but within limits):

async function batchUploadFiles(files: File[]) {
  const BATCH_SIZE = 5; // Stay under the 10/min upload limit

  for (let i = 0; i < files.length; i += BATCH_SIZE) {
    const batch = files.slice(i, i + BATCH_SIZE);

    // Upload batch concurrently
    await Promise.all(batch.map((file) => mappa.files.upload({ file })));

    console.info(`Uploaded batch ${i / BATCH_SIZE + 1}`);

    // Wait 1 minute before next batch (if more batches remain)
    if (i + BATCH_SIZE < files.length) {
      await sleep(60000);
    }
  }
}

Increasing limits

Need higher limits for production? Contact us with:

  1. Use case - What you're building
  2. Volume - Expected requests per minute/hour
  3. Timeline - When you need increased limits

We're happy to work with production apps that need higher throughput!


Debugging rate limit issues

Check your current usage

// Track requests in your app
let requestCount = 0;
let windowStart = Date.now();

async function trackRequest<T>(fn: () => Promise<T>): Promise<T> {
  requestCount++;

  const elapsed = Date.now() - windowStart;

  if (elapsed >= 60000) {
    console.info(`Made ${requestCount} requests in the last minute`);
    requestCount = 0;
    windowStart = Date.now();
  }

  return await fn();
}

// Usage
const job = await trackRequest(() => mappa.reports.createJob({...}));

Common causes of rate limiting

| Issue | Solution |
| --- | --- |
| Polling job status | Use webhooks instead |
| Uploading many files | Batch uploads with delays |
| Multiple API keys | Consolidate to one key per environment |
| Retry loops without backoff | Implement exponential backoff |
| Not checking headers | Monitor `X-RateLimit-Remaining` |
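The `withRetry` helper above backs off exponentially, but if many clients hit the limit at the same moment they will all retry in lockstep. Adding random jitter spreads those retries out. A minimal sketch (the function name and defaults are illustrative):

```typescript
// Full-jitter backoff: wait a random duration between 0 and
// min(cap, base * 2^attempt), so concurrent clients don't retry together.
function backoffDelayMs(attempt: number, baseMs = 1000, capMs = 60_000): number {
  const ceiling = Math.min(capMs, baseMs * 2 ** attempt);
  return Math.random() * ceiling;
}
```

To use it, substitute `backoffDelayMs(i)` for the fixed `2 ** i * 1000` fallback in a retry loop.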

What's next? 🚀

Optimize your integration with webhooks, caching, and request queuing.
