Generally, UI race conditions in Salesforce are rare. They are not something you usually face during an implementation, but they can become relevant in advanced cases.
For instance, I worked with a Salesforce ISV partner in the travel space to develop their managed package.
I was responsible for developing the front-end promotion engine for their multi-step checkout process. The application is LWC-based and has a very complex UI state.
The promotions, which I needed to fetch from an API, depended on that state: the passengers selected, the room configurations, and so on.
In practice, that means every time the user makes a selection in the UI, we need to fetch the currently valid promotions.
The most straightforward approach is to initiate the fetch, show the loader, process the response, and update the UI state.
The challenge is that this results in a poor user experience: there are many options the user can toggle and select, and if we block the entire UI with a loader, the application becomes unusable in some cases.
The alternative is to avoid blocking the entire UI and allow users to modify the state without interruption, while handling race conditions to ensure they only receive promotions for the latest selected state.
This is one example where you might want to allow race conditions and choose a strategy to manage them.
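To make the failure mode concrete, here is a minimal sketch of the naive fetch-then-update flow (the API, delays, and data are hypothetical, not the actual promotion engine). A stale response can overwrite a newer one:

```typescript
// Hypothetical promotions API: the slower request happens to be issued first.
function fetchPromotions(selection: string, delayMs: number): Promise<string> {
  return new Promise((resolve) =>
    setTimeout(() => resolve(`promotions for ${selection}`), delayMs)
  );
}

let uiState = "";

// Naive handler: no staleness check, so the last *response* wins,
// not the last *request*.
async function onSelectionChanged(selection: string, delayMs: number): Promise<void> {
  uiState = await fetchPromotions(selection, delayMs);
}

async function demo(): Promise<string> {
  // The user picks 2 passengers (slow request), then quickly
  // switches to 3 passengers (fast request).
  await Promise.all([
    onSelectionChanged("2 passengers", 50),
    onSelectionChanged("3 passengers", 10)
  ]);
  return uiState; // stale: "promotions for 2 passengers"
}

demo().then((state) => console.log(state)); // prints "promotions for 2 passengers"
```

The slow request resolves last and silently overwrites the result the user actually asked for; that is exactly the class of bug the strategies below manage.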
In this post, I would like to discuss frontend race conditions, especially in the LWC context, although I believe the topic is useful for any software engineer.
Existing Libraries
If you have experience working with Observables and particularly with Angular and the RxJS library, you might be familiar with their map operators to handle race conditions.
Those are mergeMap, concatMap, switchMap, and exhaustMap.
Each represents different strategies for handling asynchronous tasks based on concurrency requirements, ordering, and cancellation.
- mergeMap: processes all inner observables concurrently. Useful when order does not matter and you want to run operations in parallel.
- concatMap: processes inner observables one by one, in order, queueing them if necessary. Useful for dependent operations where order matters, like database updates.
- switchMap: cancels the previous inner observable when a new value arrives. Ideal for search type-ahead.
- exhaustMap: ignores newly incoming values while the current inner observable is running. Useful for preventing multiple form submissions or login clicks.
Having such operators in the arsenal is very powerful, and that’s what makes RxJS so useful.
However, in the LWC context and in other web component frameworks, we do not use observables and cannot benefit from such libraries. Each Apex method and HTTP callout is a Promise.
Nevertheless, I'd like to share ways to replicate these behaviors for Promises.
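All the snippets below rely on a few shared helper types and utilities that live inside the package. Their exact definitions may differ from the published ones, but a minimal sketch looks like this:

```typescript
// Context passed to every task: an abort signal plus a monotonically
// increasing call ID for staleness checks.
export type TaskContext = {
  signal: AbortSignal;
  callId: number;
};

// A task wraps a single async operation (Apex call, fetch, etc.).
export type PromiseTask<I, O> = (input: I, context: TaskContext) => Promise<O>;

// Every runner exposes the same surface: execute a call or cancel everything.
export type TaskRunner<I, O> = {
  execute(input: I): Promise<O | undefined>;
  cancel(reason?: string): void;
};

export type MergeMapOptions = {
  concurrency?: number;
};

export function createController(): AbortController {
  return new AbortController();
}

// fetch rejects with a DOMException named "AbortError"; some environments
// throw plain Errors instead, so check the name rather than the class.
export function isAbortError(error: unknown): boolean {
  return (
    typeof error === "object" &&
    error !== null &&
    (error as { name?: string }).name === "AbortError"
  );
}
```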
Merge Map
Let's start with the merge map. This is essentially the default Promise behavior: you can use Promise.all to run a set of promises in parallel.
However, this approach has a few downsides:
- It only works if you already have a list of promises.
- It starts everything immediately.
- There is no built-in concurrency limit.
- There is no lifecycle control for later calls.
- One rejection rejects the whole combined promise.
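The last point is easy to demonstrate. One rejection fails the combined promise even though other results were available; Promise.allSettled reports each outcome instead, but offers none of the queueing or cancellation control either:

```typescript
async function demoAllRejection(): Promise<string[]> {
  // One rejection rejects the whole Promise.all, losing the "ok" result.
  try {
    await Promise.all([Promise.resolve("ok"), Promise.reject(new Error("boom"))]);
  } catch (error) {
    console.log((error as Error).message); // prints "boom"
  }

  // Promise.allSettled reports every outcome instead of failing fast.
  const settled = await Promise.allSettled([
    Promise.resolve("ok"),
    Promise.reject(new Error("boom"))
  ]);
  return settled.map((result) => result.status); // ["fulfilled", "rejected"]
}

demoAllRejection();
```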
If requests keep arriving from user interactions, we can instead adopt a controlled approach with parallelism, queueing, and cancellation.
export function createMergeMap<I, O>(
  task: PromiseTask<I, O>,
  options: MergeMapOptions = {}
): TaskRunner<I, O> {
  const concurrency = options.concurrency ?? Number.POSITIVE_INFINITY;
  if (!Number.isFinite(concurrency) && concurrency !== Number.POSITIVE_INFINITY) {
    throw new Error("mergeMap concurrency must be a finite positive number or Infinity");
  }
  if (concurrency <= 0) {
    throw new Error("mergeMap concurrency must be greater than 0");
  }

  let callId = 0;
  let activeCount = 0;
  let generation = 0;

  type QueueItem = {
    input: I;
    resolve: (value: O | undefined) => void;
    reject: (reason?: unknown) => void;
    callId: number;
    generation: number;
  };

  const queue: QueueItem[] = [];
  const activeControllers = new Map<number, AbortController>();

  // Start queued calls while there is spare concurrency.
  const pumpQueue = (): void => {
    while (activeCount < concurrency && queue.length > 0) {
      const next = queue.shift();
      if (!next) {
        break;
      }
      // Items enqueued before the last cancel() are stale.
      if (next.generation !== generation) {
        next.resolve(undefined);
        continue;
      }
      activeCount += 1;
      const controller = createController();
      activeControllers.set(next.callId, controller);
      void (async () => {
        try {
          const result = await task(next.input, {
            signal: controller.signal,
            callId: next.callId
          });
          next.resolve(result);
        } catch (error) {
          // Aborted or stale tasks resolve as undefined instead of rejecting.
          if (isAbortError(error) || next.generation !== generation) {
            next.resolve(undefined);
          } else {
            next.reject(error);
          }
        } finally {
          activeCount -= 1;
          activeControllers.delete(next.callId);
          pumpQueue();
        }
      })();
    }
  };

  return {
    execute(input: I): Promise<O | undefined> {
      callId += 1;
      const thisCallId = callId;
      const thisGeneration = generation;
      return new Promise<O | undefined>((resolve, reject) => {
        queue.push({
          input,
          resolve,
          reject,
          callId: thisCallId,
          generation: thisGeneration
        });
        pumpQueue();
      });
    },
    cancel(reason?: string): void {
      // Bump the generation so in-flight and queued calls are treated as stale.
      generation += 1;
      for (const controller of activeControllers.values()) {
        controller.abort(reason);
      }
      while (queue.length > 0) {
        const queued = queue.shift();
        queued?.resolve(undefined);
      }
    }
  };
}
This implementation lets you wrap an async task in a merge map and define its concurrency.
All calls share the same behavior by reusing a single runner instance and its internal queue.
Concurrency defaults to infinity unless you set a limit, which caps the number of promises processed simultaneously. Extra calls stay queued until a slot frees up. When a task completes, the active count is decremented, and the queue is pumped again.
An important note about cancellation: Promises cannot be canceled by default, so the only viable option is an AbortController and its signal.
Aborting only truly stops the underlying work if your task honors the signal, as fetch does.
If the async task ignores the signal, the operation may still run, but the runner treats it as stale/canceled and suppresses its result.
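For custom async work such as timers or long computations, a task can honor the signal by wiring the abort event into its own rejection path. Here is a minimal sketch (the names are illustrative; fetch already does this wiring internally when you pass { signal }):

```typescript
// A cancellable delay: resolves after `ms`, or rejects with an
// AbortError-shaped error as soon as the signal fires.
function abortableDelay(ms: number, signal: AbortSignal): Promise<void> {
  return new Promise((resolve, reject) => {
    if (signal.aborted) {
      reject(makeAbortError());
      return;
    }
    const onAbort = (): void => {
      clearTimeout(timer); // actually stop the underlying work
      reject(makeAbortError());
    };
    const timer = setTimeout(() => {
      signal.removeEventListener("abort", onAbort);
      resolve();
    }, ms);
    signal.addEventListener("abort", onAbort, { once: true });
  });
}

function makeAbortError(): Error {
  const error = new Error("The operation was aborted");
  error.name = "AbortError";
  return error;
}
```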
Here is an example of how you can use the created runner instance.
import { createMergeMap } from "@frelseren/promise-task-maps";
const fetchProfile = createMergeMap(
  async (userId: string, { signal }) => {
    const response = await fetch(`/api/users/${userId}`, { signal });
    return response.json();
  },
  { concurrency: 3 }
);

// Up to 3 requests run in parallel; remaining calls wait in queue.
const profiles = await Promise.all([
  fetchProfile.execute("u1"),
  fetchProfile.execute("u2"),
  fetchProfile.execute("u3"),
  fetchProfile.execute("u4")
]);
Concat Map
The concat map implements a sequential queue, where all tasks execute one at a time in FIFO order.
export function createConcatMap<I, O>(task: PromiseTask<I, O>): TaskRunner<I, O> {
  let active = false;
  let currentCallId = 0;
  let currentController = createController();

  type QueueItem = {
    input: I;
    resolve: (value: O | undefined) => void;
    reject: (reason?: unknown) => void;
    callId: number;
  };

  const queue: QueueItem[] = [];

  // Process one queued item at a time, in FIFO order.
  const processQueue = async (): Promise<void> => {
    if (active) {
      return;
    }
    const next = queue.shift();
    if (!next) {
      return;
    }
    active = true;
    currentCallId = next.callId;
    currentController = createController();
    try {
      const result = await task(next.input, {
        signal: currentController.signal,
        callId: next.callId
      });
      next.resolve(result);
    } catch (error) {
      if (isAbortError(error)) {
        next.resolve(undefined);
      } else {
        next.reject(error);
      }
    } finally {
      active = false;
      // Continue with the next queued item, if any.
      void processQueue();
    }
  };

  return {
    execute(input: I): Promise<O | undefined> {
      const callId = currentCallId + queue.length + 1;
      return new Promise<O | undefined>((resolve, reject) => {
        queue.push({ input, resolve, reject, callId });
        void processQueue();
      });
    },
    cancel(reason?: string): void {
      currentController.abort(reason);
      // Flush the queue, resolving pending calls as undefined.
      while (queue.length > 0) {
        const queued = queue.shift();
        queued?.resolve(undefined);
      }
    }
  };
}
The execute method enqueues the call immediately and invokes the queue processor.
The processor runs only if nothing is already active: it takes the first queued item, waits for it to complete, sets active back to false, and calls itself again to pick up the next item.
If you call cancel, it aborts the current task and resolves all queued items as undefined.
It’s perfect for preventing race conditions in mutations. Here's a usage example.
import { createConcatMap } from "@frelseren/promise-task-maps";
const saveDraft = createConcatMap(async (payload: { id: string; body: string }, { signal }) => {
  const response = await fetch(`/api/drafts/${payload.id}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ body: payload.body }),
    signal
  });
  return response.ok;
});

// Calls are queued and executed in order to avoid race conditions.
await Promise.all([
  saveDraft.execute({ id: "a1", body: "v1" }),
  saveDraft.execute({ id: "a1", body: "v2" }),
  saveDraft.execute({ id: "a1", body: "v3" })
]);
Switch Map
The switch map implements “latest wins”: the most recent task result is accepted, and prior in-flight tasks are canceled immediately.
export function createSwitchMap<I, O>(task: PromiseTask<I, O>): TaskRunner<I, O> {
  let latestCallId = 0;
  let latestController = createController();

  return {
    async execute(input: I): Promise<O | undefined> {
      latestCallId += 1;
      const callId = latestCallId;
      // Cancel the previous in-flight task: the latest call wins.
      latestController.abort();
      latestController = createController();
      try {
        const result = await task(input, { signal: latestController.signal, callId });
        return callId === latestCallId ? result : undefined;
      } catch (error) {
        if (callId !== latestCallId || isAbortError(error)) {
          return undefined;
        }
        throw error;
      }
    },
    cancel(reason?: string): void {
      latestCallId += 1;
      latestController.abort(reason);
      latestController = createController();
    }
  };
}
The execute method increments the latest call ID and immediately aborts the previous controller. Then it creates a new AbortController and starts the task.
When the task completes, it checks whether it is still the latest call and returns the result if so, or undefined otherwise.
This is ideal for search-as-you-type or autocomplete, where you only want results from the most recent query.
This is exactly what I needed in the real-world scenario I described at the beginning of the post, because I only care about the latest application state when fetching promotions.
import { LightningElement } from "lwc";
import { createSwitchMap } from "@frelseren/promise-task-maps";

export default class UserLookup extends LightningElement {
  users = [];

  lookup = createSwitchMap(async (query: string, { signal }) => {
    const res = await fetch(`/services/apexrest/users?q=${encodeURIComponent(query)}`, { signal });
    return res.json();
  });

  async handleInput(event: Event) {
    const target = event.target as HTMLInputElement;
    const data = await this.lookup.execute(target.value);
    if (data) {
      this.users = data;
    }
  }

  disconnectedCallback() {
    this.lookup.cancel("component disconnected");
  }
}
Exhaust Map
The exhaust map is probably the most straightforward one. It implements “ignore while busy”: new execute calls are ignored (resolved as undefined) while a task is already running, so no queue is required.
export function createExhaustMap<I, O>(task: PromiseTask<I, O>): TaskRunner<I, O> {
  let active = false;
  let callId = 0;
  let generation = 0;
  let controller = createController();

  return {
    async execute(input: I): Promise<O | undefined> {
      // Ignore new calls while a task is already running.
      if (active) {
        return undefined;
      }
      callId += 1;
      const thisCallId = callId;
      const thisGeneration = generation;
      active = true;
      controller = createController();
      try {
        const result = await task(input, { signal: controller.signal, callId: thisCallId });
        return thisGeneration === generation ? result : undefined;
      } catch (error) {
        if (isAbortError(error) || thisGeneration !== generation) {
          return undefined;
        }
        throw error;
      } finally {
        active = false;
      }
    },
    cancel(reason?: string): void {
      generation += 1;
      controller.abort(reason);
    }
  };
}
The execute method first checks whether anything is already running and returns undefined immediately if so. Otherwise, it sets active to true and starts the task.
Any new calls while one is running are instantly resolved as undefined, which is perfect for double-click protection or form submission.
import { createExhaustMap } from "@frelseren/promise-task-maps";

const submitOrder = createExhaustMap(async (order: { items: string[] }, { signal }) => {
  const response = await fetch("/api/orders", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(order),
    signal
  });
  return response.json();
});

// Double-click protection: the second call returns undefined while the first is in flight.
const firstAttempt = submitOrder.execute({ items: ["sku-1"] });
const secondAttempt = submitOrder.execute({ items: ["sku-1"] });
await firstAttempt; // order accepted
await secondAttempt; // undefined
Conclusion
UI race conditions are an interesting topic that is often overlooked until issues arise.
Having a strategy for handling asynchronous operations in mind can help you develop better user experiences, significantly improve app performance, and reliably handle complex state scenarios.
In this post, I explored different approaches to async task management.
My main goal was to replicate RxJS-like task-control semantics for plain Promises in a lightweight, LWC-compatible way.
The code snippets are also available as an npm package for use in LWC and other vanilla JS implementations:
https://www.npmjs.com/package/@frelseren/promise-task-maps
In summary, while this post focused on Salesforce LWC, the async patterns explored here are valuable tools for any frontend developer dealing with race conditions and UI concurrency.
By understanding when to apply each strategy, you can write more reliable and predictable user interfaces.

Nikita Verkhoshintcev
Senior Salesforce Technical Architect & Developer
I'm a senior Salesforce technical architect and developer, specializing in Experience Cloud, managed packages, and custom implementations with AWS and Heroku. I have extensive front-end engineering experience and have worked as an independent contractor since 2016. My goal is to build highly interactive, efficient, and reliable systems within the Salesforce platform. Typically, companies contact me when a complex implementation is required. I'm always open to collaboration, so please don't hesitate to reach out!
