Configuring Nginx for Reverse Proxy, Load Balancing, and Caching
Nginx functions as a versatile web server, capable of acting as a reverse proxy, load balancer, and cache. This guide covers configurations for these three primary applications.
The setup utilizes Windows 10 with nginx-1.18.0, and an ASP.NET Core Web API developed in VS2019 serves as the test backend.
Reverse Proxy Configuration
Nginx can forward client requests to different backend servers, making it an effective reverse proxy. To configure this, modify the http -> server block in nginx.conf. Define the listening port and server name, then use a location block to specify the proxy forwarding.
server {
    listen      8080;
    server_name localhost;

    location / {
        proxy_pass http://backend_api;
    }
}
Next, define the upstream server list within the http block:
upstream backend_api {
    ip_hash;
    server localhost:6001;
    server localhost:6002 weight=5;
    server localhost:6003 weight=10;
}
With this configuration, requests arriving on port 8080 of localhost are proxied to the servers running on ports 6001, 6002, and 6003. Note that because ip_hash is enabled, each client IP is consistently routed to the same backend server.
Load Balancing Strategies
Nginx supports several load balancing algorithms:
- Round Robin: The default strategy, distributing requests sequentially across available servers.
- Weighted Round Robin: Assigns different weights to servers, influencing their proportion of incoming requests. In the backend_api example, the server with weight=10 receives proportionally more traffic than the one with weight=5 or the default weight=1.
- IP Hash: Routes requests from the same client IP address to the same backend server, which is useful for maintaining session state across requests.
- Least Connections: Directs new requests to the backend server with the fewest active connections.
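The selection logic behind two of these strategies can be sketched in Python. This is an illustrative model only, not Nginx's actual implementation: Nginx uses a smoother weighted round-robin interleaving, and its ip_hash hashes only part of an IPv4 address, whereas the sketch hashes the whole string.

```python
import hashlib

# Simplified weighted round robin: each server appears in the rotation
# in proportion to its weight, then the rotation repeats.
def weighted_round_robin(servers):
    rotation = [name for name, weight in servers for _ in range(weight)]
    i = 0
    while True:
        yield rotation[i % len(rotation)]
        i += 1

# Simplified IP hash: the same client IP always maps to the same server.
def ip_hash(client_ip, servers):
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    return servers[int(digest, 16) % len(servers)]

servers = [("localhost:6001", 1), ("localhost:6002", 5), ("localhost:6003", 10)]
rr = weighted_round_robin(servers)
# In any window of 16 picks: 6003 appears 10 times, 6002 5 times, 6001 once.
picks = [next(rr) for _ in range(16)]

names = [name for name, _ in servers]
# Repeated calls with the same IP always select the same backend.
assert ip_hash("192.168.1.7", names) == ip_hash("192.168.1.7", names)
```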
Third-party modules offer additional strategies like fair (response time-based) and url_hash (URL-based).
The reverse proxy directs traffic to a specific backend server based on the chosen load balancing policy.
Caching Implementation
Caching in Nginx involves storing responses from backend servers, keyed by a generated identifier, and serving them directly to clients for subsequent requests. This significantly improves response times, especially when used in conjunction with reverse proxy and load balancing.
- Define Cache Storage Path: Specify where cached data will be stored within the http block.

  http {
      proxy_cache_path /path/to/nginx/cache levels=1:2 keys_zone=api_cache:50m inactive=10m max_size=1g;
  }

  - /path/to/nginx/cache: The directory for storing cache files. Ensure this directory exists.
  - levels=1:2: Defines the directory structure for cache files.
  - keys_zone=api_cache:50m: Creates a shared memory zone named api_cache with a size of 50MB to store cache keys.
  - inactive=10m: Specifies that cached items not accessed for 10 minutes will be removed.
  - max_size=1g: Sets the maximum size of the cache to 1GB.
- Enable Caching for Specific Locations: Apply caching to relevant location blocks.

  location /api/data/ {
      proxy_store off;
      proxy_redirect off;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-Real-IP $remote_addr;
      proxy_set_header Host $http_host;
      proxy_pass http://backend_api/api/data/;
      proxy_cache api_cache;
      proxy_cache_valid 200 304 5m;
      proxy_cache_key "$scheme$proxy_host$request_uri";
  }

  - proxy_cache api_cache;: Enables caching using the api_cache zone defined earlier.
  - proxy_cache_valid 200 304 5m;: Caches responses with 200 and 304 status codes for 5 minutes.
  - proxy_cache_key "$scheme$proxy_host$request_uri";: Defines the cache key based on the request's scheme, host, and URI.
- Bypass Cache for Fresh Data: To retrieve the latest data, append a unique query parameter to the URL. This variation creates a new cache key, ensuring a fresh response is fetched from the backend. Example:
http://localhost:8080/api/data/get?v=123456
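How these directives fit together can be sketched in Python (an illustrative model, not Nginx's implementation). The cache key is the literal concatenation of $scheme, $proxy_host, and $request_uri, and $request_uri includes the query string, which is why a new ?v= value forces a cache miss. Nginx names each cache file after the MD5 hex digest of the key; with levels=1:2, the last hex character becomes the first directory level and the two characters before it the second. The "backend_api" host below reflects the proxy_pass target used in the example above.

```python
import hashlib

# Cache key per proxy_cache_key "$scheme$proxy_host$request_uri":
# a plain concatenation; $request_uri includes the query string.
def cache_key(scheme, proxy_host, request_uri):
    return scheme + proxy_host + request_uri

# With levels=1:2, the cache file name is the MD5 hex digest of the key;
# the last hex character forms the first directory level and the two
# preceding characters the second (e.g. .../c/29/...65029c).
def cache_file_path(cache_dir, key):
    name = hashlib.md5(key.encode()).hexdigest()
    return f"{cache_dir}/{name[-1]}/{name[-3:-1]}/{name}"

# Two URLs differing only in ?v= produce distinct keys, hence distinct
# cache files, so the second request goes through to the backend.
cached = cache_key("http", "backend_api", "/api/data/get?v=123456")
fresh  = cache_key("http", "backend_api", "/api/data/get?v=123457")
path_a = cache_file_path("/path/to/nginx/cache", cached)
path_b = cache_file_path("/path/to/nginx/cache", fresh)
```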
Backend API Example
A simple ASP.NET Core Web API controller for testing:
using Microsoft.AspNetCore.Mvc;
using System;

namespace WebApi.Controllers
{
    [Route("api/[controller]")]
    [ApiController]
    public class DataController : ControllerBase
    {
        [HttpGet]
        [Route("Get")]
        public string Get()
        {
            return $"Data fetched at: {DateTime.Now}";
        }
    }
}
To run multiple instances of the API for load balancing testing:
dotnet WebApi.dll --urls="http://*:6001"
dotnet WebApi.dll --urls="http://*:6002"
dotnet WebApi.dll --urls="http://*:6003"
Complete Nginx Configuration Example
#user  nobody;
worker_processes  1;

#error_log  logs/error.log;
#error_log  logs/error.log  notice;
#error_log  logs/error.log  info;

#pid  logs/nginx.pid;

events {
    worker_connections  1024;
}

http {
    proxy_cache_path /Work/Data/Nginx levels=1:2 keys_zone=web_cache:50m inactive=10m max_size=1g;

    include       mime.types;
    default_type  application/octet-stream;

    #log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
    #                  '$status $body_bytes_sent "$http_referer" '
    #                  '"$http_user_agent" "$http_x_forwarded_for"';

    #access_log  logs/access.log  main;

    sendfile  on;
    #tcp_nopush  on;

    #keepalive_timeout  0;
    keepalive_timeout  65;

    #gzip  on;

    upstream WebApi {
        ip_hash;
        server localhost:6001 weight=1;
        server localhost:6002 weight=5;
        server localhost:6003 weight=10;
    }

    server {
        listen       8080;
        server_name  localhost;

        location / {
            proxy_pass http://WebApi;
        }

        # Caches responses from the DataController shown above (api/data).
        location /api/data/ {
            proxy_store off;
            proxy_redirect off;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header Host $http_host;
            proxy_pass http://WebApi/api/data/;
            proxy_cache web_cache;
            proxy_cache_valid 200 304 2m;
            proxy_cache_key "$scheme$proxy_host$request_uri";
        }

        error_page  500 502 503 504  /50x.html;
        location = /50x.html {
            root  html;
        }
    }
}