The Hidden 100-300ms Tax: How CORS Preflight Requests Are Slowing Your App

Your web application is probably 100-300ms slower than it needs to be, and you might not even know it. The culprit? CORS preflight requests. Every time your frontend makes a cross-origin API call, the browser might be adding an invisible round trip that doubles your perceived latency. Even requests to your own API on a subdomain (api.yoursite.com from yoursite.com) can trigger this penalty. As documented in this FreeCodeCamp analysis, preflight requests can add up to 500ms of delay even when your API responds in under 10ms. In this post, we'll dive deep into why this happens and explore every approach to eliminate this hidden performance tax.
Understanding the Problem: What Are Preflight Requests?
Before we fix the problem, let's understand exactly what's happening under the hood.
The Same-Origin Policy
Browsers enforce the Same-Origin Policy as a security measure. An origin consists of three parts:
- Protocol: https://
- Domain: example.com
- Port: :443
If any of these differ between your frontend and your API, you're making a cross-origin request. This includes:
- app.example.com → api.example.com (different subdomain)
- example.com → api.example.com (apex domain to subdomain)
- localhost:3000 → localhost:8080 (different port)
- http:// → https:// (different protocol)
When Preflight Requests Trigger
CORS preflight is an OPTIONS request that the browser automatically sends before your actual request to ask the server: "Is this cross-origin request allowed?" This behavior is defined in the WHATWG Fetch Standard.
A preflight request is triggered when your request is NOT a "simple request." According to the MDN CORS documentation, a request is considered simple only if ALL of these conditions are met:
- Method is one of: GET, HEAD, or POST
- Headers are limited to: Accept, Accept-Language, Content-Language, Content-Type
- Content-Type (if present) is one of: application/x-www-form-urlencoded, multipart/form-data, or text/plain
Any deviation triggers a preflight. Common triggers include:
// These ALL trigger preflight requests:

// Custom headers
fetch('https://api.example.com/data', {
  headers: {
    'Authorization': 'Bearer token', // Custom header!
    'X-Custom-Header': 'value'       // Custom header!
  }
});

// JSON content type
fetch('https://api.example.com/data', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json' // Not a "simple" content type!
  },
  body: JSON.stringify({ key: 'value' })
});

// Non-simple methods
fetch('https://api.example.com/data', {
  method: 'PUT' // Not GET, HEAD, or POST!
});

fetch('https://api.example.com/data', {
  method: 'DELETE'
});
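For contrast, here is what stays within the simple-request rules (the endpoints are illustrative) and goes out without any preflight:

// No preflight: GET with no custom headers
fetch('https://api.example.com/public-data');

// No preflight: POST with a safelisted content type
fetch('https://api.example.com/track', {
  method: 'POST',
  headers: { 'Content-Type': 'text/plain' },
  body: 'page=home'
});

The response still needs a valid Access-Control-Allow-Origin header for your script to read it, but the extra OPTIONS round trip never happens.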
The Real-World Impact
Here's what a typical API call looks like with preflight:
Timeline (Cross-Origin with Preflight):
├── 0ms - Browser sends OPTIONS request
├── 100ms - Server responds to OPTIONS
├── 100ms - Browser sends actual GET/POST request
├── 200ms - Server responds with data
└── Total: 200ms
Timeline (Same-Origin, No Preflight):
├── 0ms - Browser sends GET/POST request
├── 100ms - Server responds with data
└── Total: 100ms
That's 2x the latency for every single API call. On mobile networks or with distant servers, this can easily be 200-400ms of added latency per request.
Diagnosing Preflight Requests in Your App
Before optimizing, you need to identify which requests are triggering preflights.
Using Chrome DevTools
- Open DevTools → Network tab
- Filter by "Fetch/XHR"
- Look for paired requests: an OPTIONS request immediately followed by your actual request
- The "Timing" tab shows the exact breakdown
Using the Performance API
// Add this to your app to log timing for fetch/XHR requests
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.initiatorType === 'fetch' || entry.initiatorType === 'xmlhttprequest') {
      // An unusually high TTFB on cross-origin calls can indicate a hidden preflight.
      // Note: responseStart may be 0 for cross-origin resources unless the server
      // sends a Timing-Allow-Origin header.
      const timeToFirstByte = entry.responseStart - entry.requestStart;
      console.log(`${entry.name}: ${timeToFirstByte.toFixed(0)}ms TTFB`);
    }
  }
});
observer.observe({ entryTypes: ['resource'] });
Quick Audit Script
Run this in your browser console to find cross-origin requests:
// Intercept and log all fetch requests
const originalFetch = window.fetch;
window.fetch = async (...args) => {
  // args[0] can be a URL string or a Request object
  const input = args[0] instanceof Request ? args[0].url : args[0];
  const url = new URL(input, window.location.origin);
  const isCrossOrigin = url.origin !== window.location.origin;
  if (isCrossOrigin) {
    console.warn(`Cross-origin request to: ${url.origin}`);
    console.log('Request details:', args[1]);
  }
  return originalFetch.apply(window, args);
};
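The script above only covers fetch. If parts of your app (or its dependencies) still use XMLHttpRequest, a similar hook catches those calls — a quick sketch:

// Companion hook for XMLHttpRequest-based requests
const originalOpen = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function (method, url, ...rest) {
  const resolved = new URL(url, window.location.origin);
  if (resolved.origin !== window.location.origin) {
    console.warn(`Cross-origin XHR (${method}) to: ${resolved.origin}`);
  }
  return originalOpen.call(this, method, url, ...rest);
};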
Solution 1: Same-Origin Proxy (Recommended)
The most effective solution is to eliminate cross-origin requests entirely by proxying your API through the same origin as your frontend.
How It Works
Instead of:

Frontend (app.com) → API (api.app.com)
                     (cross-origin: preflight required)

You have:

Frontend (app.com) → Same-origin proxy (app.com/api) → API (api.app.com)
                     (same origin: no preflight)        (server-to-server: no CORS)
Next.js Implementation
Next.js makes this incredibly easy with rewrites in next.config.js:
// next.config.js
/** @type {import('next').NextConfig} */
const nextConfig = {
  async rewrites() {
    return [
      {
        source: '/api/:path*',
        destination: 'https://api.yourbackend.com/:path*',
      },
    ];
  },
};

module.exports = nextConfig;
Now your frontend calls /api/users instead of https://api.yourbackend.com/users. The rewrite happens server-side, completely bypassing CORS.
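For example (a sketch, assuming a /users endpoint and a token variable), the call site changes from the absolute backend URL to the rewritten path:

// Before: cross-origin call — the Authorization header forces a preflight
// fetch('https://api.yourbackend.com/users', { headers: { Authorization: `Bearer ${token}` } });

// After: same-origin call through the rewrite — no CORS, no preflight
const res = await fetch('/api/users', {
  headers: { Authorization: `Bearer ${token}` },
});
const users = await res.json();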
Vite Implementation
// vite.config.js
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    proxy: {
      '/api': {
        target: 'https://api.yourbackend.com',
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/api/, ''),
      },
    },
  },
});

Note that server.proxy only applies to the Vite dev server; for production builds you'll still want one of the other same-origin proxies below (Nginx, a Node server, or an edge proxy).
Nginx Reverse Proxy
For production environments, Nginx is extremely efficient:
# nginx.conf
server {
    listen 80;
    server_name app.example.com;

    # Serve your frontend
    location / {
        root /var/www/frontend;
        try_files $uri $uri/ /index.html;
    }

    # Proxy API requests
    location /api/ {
        proxy_pass https://api.example.com/;
        proxy_http_version 1.1;
        proxy_set_header Host api.example.com;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Disable buffering for streaming responses
        proxy_buffering off;

        # Timeouts
        proxy_connect_timeout 60s;
        proxy_send_timeout 60s;
        proxy_read_timeout 60s;
    }
}
Cloudflare Workers (Edge Proxy)
For globally distributed low-latency proxying:
// worker.js
export default {
  async fetch(request) {
    const url = new URL(request.url);

    // Only proxy /api/* requests
    if (url.pathname.startsWith('/api/')) {
      const apiUrl = new URL(url.pathname.replace('/api', ''), 'https://api.yourbackend.com');
      apiUrl.search = url.search;

      // Forward the request to your API
      const apiRequest = new Request(apiUrl, {
        method: request.method,
        headers: request.headers,
        body: request.body,
      });

      return fetch(apiRequest);
    }

    // Pass through other requests
    return fetch(request);
  },
};
Express.js Middleware
// server.js
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Serve static frontend files
app.use(express.static('public'));

// Proxy API requests
app.use('/api', createProxyMiddleware({
  target: 'https://api.yourbackend.com',
  changeOrigin: true,
  pathRewrite: {
    '^/api': '', // Remove /api prefix
  },
  onProxyReq: (proxyReq, req) => {
    // Forward authentication headers
    if (req.headers.authorization) {
      proxyReq.setHeader('Authorization', req.headers.authorization);
    }
  },
}));

app.listen(3000);
Solution 2: Optimize Preflight Caching
If you can't eliminate cross-origin requests, you can at least cache the preflight responses so browsers don't repeat them.
Access-Control-Max-Age Header
Configure your API to return the Access-Control-Max-Age header:
// Express.js CORS configuration
const cors = require('cors');

app.use(cors({
  origin: 'https://app.example.com',
  methods: ['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'],
  allowedHeaders: ['Content-Type', 'Authorization', 'X-Requested-With'],
  credentials: true,
  maxAge: 86400, // Cache preflight for 24 hours (in seconds)
}));

# FastAPI
from fastapi.middleware.cors import CORSMiddleware

app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://app.example.com"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
    max_age=86400,  # 24 hours
)

// Go with gorilla/handlers
handlers.CORS(
    handlers.AllowedOrigins([]string{"https://app.example.com"}),
    handlers.AllowedMethods([]string{"GET", "POST", "PUT", "DELETE", "OPTIONS"}),
    handlers.AllowedHeaders([]string{"Content-Type", "Authorization"}),
    handlers.MaxAge(86400),
)(router)
Browser Cache Limits
Be aware that browsers impose their own limits on Access-Control-Max-Age, as documented by MDN:
| Browser | Maximum Cache Time |
|---|---|
| Chrome | 2 hours (7200s) |
| Firefox | 24 hours (86400s) |
| Safari | 5 minutes (300s) |
The Chromium source code indicates this limit exists to "minimize the risk of using a poisoned cache after switching to a secure network." So even setting Access-Control-Max-Age: 86400 only caches the preflight for 2 hours in Chrome.
Nginx CORS Headers
location /api/ {
    # Handle preflight requests
    if ($request_method = 'OPTIONS') {
        add_header 'Access-Control-Allow-Origin' 'https://app.example.com' always;
        # Needed because the actual requests are credentialed
        add_header 'Access-Control-Allow-Credentials' 'true' always;
        add_header 'Access-Control-Allow-Methods' 'GET, POST, PUT, DELETE, OPTIONS' always;
        add_header 'Access-Control-Allow-Headers' 'Authorization, Content-Type, X-Requested-With' always;
        add_header 'Access-Control-Max-Age' 86400 always;
        add_header 'Content-Type' 'text/plain; charset=utf-8';
        add_header 'Content-Length' 0;
        return 204;
    }

    # Headers for the actual request
    add_header 'Access-Control-Allow-Origin' 'https://app.example.com' always;
    add_header 'Access-Control-Allow-Credentials' 'true' always;

    proxy_pass https://api-backend;
}
Solution 3: Avoid Preflight Triggers
Sometimes you can restructure requests to avoid triggering preflights entirely.
Use Simple Content Types
Instead of sending JSON:
// Triggers preflight (cross-origin + JSON content type)
fetch('https://api.example.com/data', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ key: 'value' })
});
Use URL-encoded form data:
// No preflight (simple request)
fetch('https://api.example.com/data', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: new URLSearchParams({ key: 'value' })
});
Trade-off: Your backend needs to handle form data parsing, and nested objects become awkward.
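For example, on an Express backend (a sketch; the /data route and port are illustrative), parsing the form body is one middleware call:

const express = require('express');
const app = express();

// Parse application/x-www-form-urlencoded bodies
app.use(express.urlencoded({ extended: true }));

app.post('/data', (req, res) => {
  // For the request above, req.body is { key: 'value' }
  res.json({ received: req.body });
});

app.listen(4000);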
Move Auth to Cookies
Custom Authorization headers trigger preflights. Cookies don't:
// Triggers preflight
fetch('https://api.example.com/data', {
  headers: { 'Authorization': 'Bearer token' }
});

// No preflight (if other conditions met)
fetch('https://api.example.com/data', {
  credentials: 'include' // Sends cookies automatically
});
Configure your API to set HTTP-only cookies:
// Backend sets cookie on login
res.cookie('session', token, {
  httpOnly: true,
  secure: true,
  sameSite: 'none', // Required for cross-origin
  domain: '.example.com', // Share across subdomains
  maxAge: 7 * 24 * 60 * 60 * 1000 // 7 days
});
Security Note: SameSite=None requires Secure=true, and cookies sent automatically cross-site expose you to CSRF, so pair this approach with CSRF protection.
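One way to keep CSRF protection without reintroducing a preflight (a custom X-CSRF-Token header would trigger one) is the double-submit pattern with the token carried in a simple form-encoded body. A rough sketch, with illustrative names, assuming cookie-parser and express.urlencoded are in place on the backend:

// Backend: on login, also set a JS-readable CSRF cookie next to the session cookie
const { randomUUID } = require('crypto');
res.cookie('csrf', randomUUID(), { secure: true, sameSite: 'none', domain: '.example.com' });

// Frontend: echo the token in the body — still a "simple" request, so no preflight
const csrf = document.cookie.match(/(?:^|; )csrf=([^;]+)/)?.[1];
fetch('https://api.example.com/transfer', {
  method: 'POST',
  credentials: 'include',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: new URLSearchParams({ amount: '10', csrf }),
});

// Backend: reject state-changing requests whose body token doesn't match the cookie
if (!req.body.csrf || req.body.csrf !== req.cookies.csrf) {
  return res.status(403).json({ error: 'CSRF token mismatch' });
}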
Query String Authentication
For read-only endpoints, pass tokens via query strings:
// No custom headers, no preflight
fetch(`https://api.example.com/data?token=${encodeURIComponent(token)}`);
Security Warning: Tokens in URLs appear in server logs, browser history, and referrer headers. Only use for non-sensitive endpoints.
Solution 4: Edge-Side Solutions
Cloudflare Transform Rules
If your frontend and API are both behind Cloudflare, you can use Transform Rules to proxy requests at the edge:
- Add both app.example.com and api.example.com to Cloudflare
- Create a Transform Rule to rewrite /api/* requests
- Requests appear same-origin to the browser
AWS CloudFront
Configure CloudFront to serve both frontend and API from the same distribution:
{
  "Origins": [
    {
      "Id": "frontend",
      "DomainName": "frontend-bucket.s3.amazonaws.com"
    },
    {
      "Id": "api",
      "DomainName": "api.example.com",
      "CustomOriginConfig": {
        "HTTPSPort": 443,
        "OriginProtocolPolicy": "https-only"
      }
    }
  ],
  "CacheBehaviors": [
    {
      "PathPattern": "/api/*",
      "TargetOriginId": "api",
      "ViewerProtocolPolicy": "https-only",
      "AllowedMethods": ["GET", "HEAD", "OPTIONS", "PUT", "POST", "PATCH", "DELETE"],
      "CachedMethods": ["GET", "HEAD", "OPTIONS"],
      "ForwardedValues": {
        "Headers": ["Authorization", "Content-Type"],
        "QueryString": true
      }
    }
  ],
  "DefaultCacheBehavior": {
    "TargetOriginId": "frontend"
  }
}
Solution 5: Service Worker Interception
For complex scenarios, a Service Worker can intercept and modify requests:
// sw.js
self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);

  // Intercept API calls and proxy them through the same origin
  if (url.hostname === 'api.example.com') {
    const proxyUrl = new URL(event.request.url);
    proxyUrl.hostname = self.location.hostname;
    proxyUrl.pathname = '/api' + proxyUrl.pathname;

    event.respondWith((async () => {
      const hasBody = !['GET', 'HEAD'].includes(event.request.method);
      const proxied = new Request(proxyUrl, {
        method: event.request.method,
        headers: event.request.headers,
        // Buffer the body so we don't rely on streaming-request support
        body: hasBody ? await event.request.clone().arrayBuffer() : undefined,
        mode: 'same-origin', // No CORS needed
        credentials: event.request.credentials,
      });
      return fetch(proxied);
    })());
  }
});
Limitation: Service Workers don't help on the first page load before they're installed.
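Registering the worker is a single call in your page's main script (a sketch; the /sw.js path and log messages are illustrative):

// Register the proxying service worker
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(() => console.log('Proxying service worker registered'))
    .catch((err) => console.error('Service worker registration failed:', err));
}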
Comparing the Solutions
| Solution | Latency Improvement | Complexity | Production Ready |
|---|---|---|---|
| Same-origin proxy | Eliminates 100% | Low | Yes |
| Preflight caching | Reduces ~80% | Low | Yes |
| Simple requests | Eliminates 100% | Medium | Maybe |
| Cookie auth | Eliminates 100% | Medium | Yes |
| Edge proxy | Eliminates 100% | Medium | Yes |
| Service Worker | Eliminates ~90% | High | Partial |
Quick Decision Guide
- Can you deploy a reverse proxy?
  - Yes → Use Nginx/Next.js rewrites (Solution 1)
- Using a CDN like Cloudflare or CloudFront?
  - Yes → Configure edge routing (Solution 4)
- Third-party API you don't control?
  - Use a backend proxy or serverless function
- Can't change infrastructure?
  - Maximize Access-Control-Max-Age (Solution 2)
  - Consider cookie-based auth (Solution 3)
Measuring Your Improvement
After implementing a solution, measure the impact:
// Before/after timing comparison
async function measureApiLatency(url, iterations = 10) {
  const times = [];

  for (let i = 0; i < iterations; i++) {
    // Brief pause between runs (note: this does NOT clear a cached preflight)
    await new Promise(r => setTimeout(r, 100));

    const start = performance.now();
    await fetch(url, {
      headers: { 'Authorization': 'Bearer test' }
    });
    times.push(performance.now() - start);
  }

  const avg = times.reduce((a, b) => a + b) / times.length;
  const min = Math.min(...times);
  const max = Math.max(...times);

  console.log(`Average: ${avg.toFixed(0)}ms, Min: ${min.toFixed(0)}ms, Max: ${max.toFixed(0)}ms`);
  return { avg, min, max, times };
}

// Compare cross-origin vs same-origin
await measureApiLatency('https://api.example.com/test'); // Before
await measureApiLatency('/api/test'); // After proxy
Conclusion
CORS preflight requests are one of those silent performance killers that most developers never think about. A single 100-300ms penalty might seem small, but multiply that across every API call in your application, and you're looking at seconds of accumulated delay per user session.
The good news is that eliminating this latency is usually straightforward:
- First choice: Set up a same-origin proxy through your web server or framework
- Second choice: Use edge routing if you're already on a CDN
- Fallback: Maximize preflight caching with Access-Control-Max-Age
The extra 30 minutes spent configuring a proxy will pay dividends every time a user interacts with your application. Your users will thank you with better engagement, and your metrics will thank you with improved Core Web Vitals scores.
References & Further Reading:
- MDN: Cross-Origin Resource Sharing (CORS) - Comprehensive guide to CORS concepts
- MDN: Preflight Request - What triggers preflight and how it works
- MDN: Access-Control-Max-Age - Browser cache limits documentation
- WHATWG Fetch Standard: CORS Protocol - The official specification
- The Performance Cost of CORS on SPAs - Real-world latency measurements