
Preventing multiple HTTP requests to the same endpoint
Prevent duplicate HTTP requests in Angular and TypeScript by using caching and request deduplication techniques to enhance performance.
To prevent multiple HTTP requests to the same endpoint, we can use a caching technique that stores and reuses previously fetched data.
This approach ensures that only the first request is sent to the server, while subsequent requests are held in a pending state and resolved with the same response as the first request.
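The idea can be sketched in a few lines of framework-free TypeScript. This is a minimal illustration of the pending-request technique, not the full implementation shown later; `loadData` and the URL are hypothetical stand-ins for a real HTTP call.

```typescript
// Map of in-flight requests, keyed by URL.
const pending = new Map<string, Promise<string>>();

let networkCalls = 0;

// Hypothetical stand-in for a real HTTP call.
async function loadData(url: string): Promise<string> {
  networkCalls++;
  return `response for ${url}`;
}

function dedupedFetch(url: string): Promise<string> {
  const existing = pending.get(url);
  if (existing) {
    // A request for this URL is already in flight: share it.
    return existing;
  }

  // Register the request and remove it once it settles,
  // whether it succeeds or fails.
  const request = loadData(url).finally(() => pending.delete(url));
  pending.set(url, request);
  return request;
}
```

Two callers that invoke `dedupedFetch('/api/data')` before the first promise settles receive the same promise, so only one underlying call is made.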
Why preventing multiple requests is important
In modern web applications, it’s common for multiple components (and other consumers) to make HTTP requests to the same API endpoint, often simultaneously.
This can lead to a number of problems, including:
- Increased server load: multiple requests to the same endpoint can put a significant strain on the server, leading to slower response times, increased latency, and even crashes.
- Wasted bandwidth: multiple requests can result in unnecessary data transfer, wasting bandwidth and increasing costs.
- Inconsistent data: when multiple requests are made to the same endpoint, it can lead to inconsistent data being returned, causing errors and inconsistencies in the application.
- Poor user experience: excessive requests can result in a slow and unresponsive user interface, leading to a poor user experience.
To mitigate these issues, it’s essential to implement a mechanism that prevents multiple HTTP requests to the same endpoint. This is where the Cache API comes in: it provides a way to store and retrieve network requests and responses. The Cache API is defined in the Service Worker specification, which enables background scripts to manage caching and network requests, but it is also exposed on the `window` object, so it can be used outside of a service worker.
The primary purpose of caching is to enhance the performance of web applications by reducing load times and enabling offline functionality. When a user visits a site, the Service Worker can cache resources (like HTML, CSS, JavaScript, images) so they can be served quickly on subsequent visits.
The TypeScript `Network` class below demonstrates how this could be implemented and used.
By caching responses and preventing duplicate requests, the `Network` class helps to:
- Reduce server load and improve response times.
- Minimize bandwidth usage and costs.
- Ensure consistent data is returned.
- Improve the overall user experience.
Preventing multiple requests with caching, TypeScript version
The `Network` class provides a mechanism for handling HTTP requests with caching capabilities. It prevents multiple HTTP requests to the same endpoint by queuing subsequent requests: only the first request is sent to the server, and the queued requests are resolved with the same response as the first one, enhancing performance and reducing unnecessary network traffic.
We have to define an additional `IResponse` interface because the standard `Response` type in TypeScript does not have `timestamp` and `data` properties. The `timestamp` property is used to track when the response was cached, so stale entries can be expired.
```typescript
/**
 * Custom response interface that includes a timestamp property.
 * It extends the standard response data with a timestamp,
 * which is used for cache expiry purposes.
 */
interface IResponse {
  data: any;
  timestamp: number;
}

class Network {
  private pendingRequests: Map<string, Promise<IResponse>>;
  // Assigned in init(), which must be called before fetch().
  private cache!: Cache;

  constructor() {
    this.pendingRequests = new Map();
  }

  public async makeRequest(url: string, options: RequestInit): Promise<IResponse> {
    try {
      const cachedResponse: Response | undefined = await this.cache.match(url);

      if (cachedResponse) {
        const cacheEntry: IResponse = await cachedResponse.json();
        const timestamp: number = cacheEntry.timestamp;
        const now: number = Date.now();
        const cacheDuration: number = 14 * 24 * 60 * 60 * 1000; // 14 days

        // Serve the cached entry while it is still fresh.
        if (now - timestamp < cacheDuration) {
          return cacheEntry;
        }

        // The entry has expired; drop it and fetch again.
        await this.cache.delete(url);
      }

      const response: Response = await window.fetch(url, options);

      if (!response.ok) {
        throw new Error(`API Endpoint not available: ${url}`);
      }

      const data: unknown = await response.json();
      const cacheEntry: IResponse = {
        data,
        timestamp: Date.now()
      };

      try {
        await this.cache.put(url, new window.Response(JSON.stringify(cacheEntry)));
      } catch (err) {
        // A failed cache write should not break the request itself.
        console.error('[Network.makeRequest]', err);
      }

      return cacheEntry;
    } catch (error) {
      console.error((error as Error).message);
      throw error;
    }
  }

  public async fetch(url: string, options: RequestInit): Promise<IResponse> {
    if (this.pendingRequests.has(url)) {
      return this.pendingRequests.get(url) as Promise<IResponse>;
    }

    const promise: Promise<IResponse> = this.makeRequest(url, options);
    this.pendingRequests.set(url, promise);

    // Remove the entry on success *and* on failure, so a failed
    // request does not keep rejecting every later caller.
    const cleanup = (): void => {
      this.pendingRequests.delete(url);
    };
    promise.then(cleanup, cleanup);

    return promise;
  }

  public async init(): Promise<void> {
    this.cache = await window.caches.open('fetch-cache');
  }
}
```
Example use case for TypeScript
```typescript
// Create a new instance of the Network class
const network = new Network();

// Initialize the cache
await network.init();

// Make a request to an API endpoint
const response = await network.fetch('https://api.example.com/data', {
  method: 'GET',
  headers: {
    'Content-Type': 'application/json'
  }
});

// Log the response data
console.log(response.data);

// Make another request to the same API endpoint
const cachedResponse = await network.fetch('https://api.example.com/data', {
  method: 'GET',
  headers: {
    'Content-Type': 'application/json'
  }
});

// The second request resolves with the cached response
console.log(cachedResponse.data);
```
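The 14-day freshness check inside `makeRequest` can also be isolated into a small pure helper, which makes it easy to unit-test. This is an illustrative refactoring: `isFresh` and `CACHE_DURATION_MS` are hypothetical names, not part of the `Network` class above.

```typescript
const CACHE_DURATION_MS = 14 * 24 * 60 * 60 * 1000; // 14 days

// Returns true while a cache entry's timestamp is within the allowed age.
// `now` is injectable so the check can be tested deterministically.
function isFresh(timestamp: number, now: number = Date.now()): boolean {
  return now - timestamp < CACHE_DURATION_MS;
}
```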
Preventing multiple requests with caching, Angular version
We are going to define `RequestShareInterceptor`, an Angular HTTP interceptor that shares the same pending request between multiple callers.
The interceptor prevents multiple requests to the same endpoint from being sent to the API simultaneously: it stores the first request and shares its response with subsequent requests.
```typescript
import { HttpEvent, HttpHandler, HttpInterceptor, HttpRequest } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable, ReplaySubject } from 'rxjs';
import { finalize } from 'rxjs/operators';

/*
 * Share the same pending request between multiple callers.
 *
 * Scenario:
 * Components A and B request the same endpoint (almost) at the same time.
 * To avoid two requests to the API we store the first one until it resolves
 * and share its response.
 */
@Injectable()
export class RequestShareInterceptor implements HttpInterceptor {
  private requests: Map<string, ReplaySubject<HttpEvent<unknown>>>;

  constructor() {
    this.requests = new Map();
  }

  public intercept(request: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {
    // Only GET requests are safe to share.
    if (request.method !== 'GET') {
      return next.handle(request);
    }

    const cacheKey: string = request.urlWithParams;
    const pending = this.requests.get(cacheKey);

    if (pending) {
      return pending.asObservable();
    }

    const subject: ReplaySubject<HttpEvent<unknown>> = new ReplaySubject<HttpEvent<unknown>>(1);
    this.requests.set(cacheKey, subject);

    const deleteFromRequests = (): void => {
      this.requests.delete(cacheKey);
    };

    next.handle(request).pipe(finalize(deleteFromRequests)).subscribe(subject);

    return subject.asObservable();
  }
}
```
Example of how to include interceptor in Angular
To use the interceptor, register it as a provider in `app.module.ts` (note that interceptors belong in `providers`, not `declarations`):
```typescript
import { NgModule } from '@angular/core';
import { HTTP_INTERCEPTORS } from '@angular/common/http';

import { RequestShareInterceptor } from './request-share.interceptor';

@NgModule({
  providers: [
    {
      multi: true,
      provide: HTTP_INTERCEPTORS,
      useClass: RequestShareInterceptor
    }
  ]
})
export class AppModule {}
```
Execution flow
- If the request is not a `GET` request, the interceptor simply passes it to the next handler in the chain.
- If a pending request with the same cache key already exists, the interceptor returns its `ReplaySubject` as an observable.
- If no pending request exists, the interceptor creates a new `ReplaySubject` to store the response and adds it to the `requests` map.
- The interceptor then handles the request using the `next` handler and pipes the response through the `finalize` operator, which deletes the request from the `requests` map when the response is received.
- The interceptor subscribes to the response and stores it in the `ReplaySubject`.
- Finally, the interceptor returns the shared response as an observable.
Benefits
- Reduces the number of requests sent to the API, which can improve performance and reduce server load.
- Ensures that multiple components requesting the same data receive the same response, which can help maintain data consistency.
Potential issues
- If the API returns an error response, it will be cached and returned to subsequent requests, which may not be desirable.
- If the API returns a response that is not cacheable (e.g., a response with a `Cache-Control` header set to `no-cache`), the interceptor may still cache the response, which could lead to stale data being returned to subsequent requests.
To address these issues, you may want to consider adding additional logic to handle error responses and non-cacheable responses.