
Conversation

Contributor
@sbarrio sbarrio commented Jan 30, 2026

What does this PR do?

This PR introduces two improvements to the rumAuto scenario in the benchmark app, both aimed at improving the reliability and stability of the synthetics runs that execute this scenario:

  • Addition of a request queue that enforces a single request at a time and a configurable delay between requests, to avoid getting 429'd by the test API.
  • Addition of a fetchByIds function that performs a single request with a payload of multiple ids, instead of one request per id, when fetching characters, episodes and locations (see the sketch after this list).
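
For illustration, a minimal sketch of the batched fetch, assuming the test API accepts a comma-separated list of ids on its resource endpoints. The BASE_URL and endpoint shape here are assumptions, not the exact code in this PR:

```typescript
// Hypothetical sketch: fetch several resources in one request instead of one request per id.
// BASE_URL and the comma-separated id path are assumptions about the test API.
const BASE_URL = 'https://example-test-api.com/api';

async function fetchByIds<T>(
  resource: 'character' | 'episode' | 'location',
  ids: number[]
): Promise<T[]> {
  if (ids.length === 0) {
    return [];
  }

  // A single request with all ids in the path, e.g. /character/1,2,3
  const response = await fetch(`${BASE_URL}/${resource}/${ids.join(',')}`);

  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${response.statusText}`);
  }

  const data = await response.json();
  // Some APIs return a single object rather than an array when only one id is requested.
  return Array.isArray(data) ? data : [data];
}
```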

Motivation

The RUM Auto scenario test runs are flaky and unreliable because of the request rate limits imposed by the test API.

Review checklist (to be filled by reviewers)

  • Feature or bugfix MUST have appropriate tests
  • Make sure you discussed the feature or bugfix with the maintaining team in an Issue
  • Make sure each commit and the PR mention the Issue number (cf the CONTRIBUTING doc)
  • If this PR is auto-generated, please make sure also to manually update the code related to the change

@sbarrio sbarrio requested a review from a team as a code owner January 30, 2026 15:01
@sbarrio sbarrio self-assigned this Feb 2, 2026
…sbarrio/chore/rumauto-benchmark-network-improvements
Comment on lines +30 to +66
  private async processQueue(): Promise<void> {
    // Only one drain loop runs at a time.
    if (this.isProcessing) {
      return;
    }

    this.isProcessing = true;

    while (this.requestQueue.length > 0 && this.activeRequests < MAX_CONCURRENT_REQUESTS) {
      const request = this.requestQueue.shift();
      if (!request) break;

      this.activeRequests++;

      try {
        // Space requests out to stay under the test API's rate limit.
        await this.delay(REQUEST_DELAY_MS);

        const response = await fetch(request.url);

        if (!response.ok) {
          throw new Error(`HTTP ${response.status}: ${response.statusText}`);
        }

        const data = await response.json();
        request.resolve(data);
      } catch (error) {
        request.reject(error);
      } finally {
        this.activeRequests--;
      }
    }

    this.isProcessing = false;

    // Drain anything that was queued while the loop was running.
    if (this.requestQueue.length > 0) {
      this.processQueue();
    }
  }
Contributor
I don't care too much since this is for the benchmarks app, but I believe this function should be refactored a bit. There is no concurrency happening here, mostly because of the await inside the while loop, which makes the requests process sequentially, and isProcessing prevents any requests from re-entering processQueue.

So this is simply deferring the execution of requests, not parallelizing them.
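
For reference, a genuinely concurrent variant could look roughly like the sketch below. MAX_CONCURRENT_REQUESTS, REQUEST_DELAY_MS, and the queued request fields are taken from the diff above; the rest is an assumption, not the PR's code:

```typescript
// Sketch of a variant that actually keeps up to MAX_CONCURRENT_REQUESTS in flight.
// Assumes queued requests have the shape { url, resolve, reject } as used in the diff.
private processQueue(): void {
  while (this.requestQueue.length > 0 && this.activeRequests < MAX_CONCURRENT_REQUESTS) {
    const request = this.requestQueue.shift();
    if (!request) break;

    this.activeRequests++;

    // Fire the request without awaiting it here, so several can be in flight at once.
    this.delay(REQUEST_DELAY_MS)
      .then(() => fetch(request.url))
      .then((response) => {
        if (!response.ok) {
          throw new Error(`HTTP ${response.status}: ${response.statusText}`);
        }
        return response.json();
      })
      .then((data) => request.resolve(data))
      .catch((error) => request.reject(error))
      .finally(() => {
        this.activeRequests--;
        // Pick up the next queued request once a slot frees up.
        this.processQueue();
      });
  }
}
```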

Contributor Author

Yeah, that's the whole point of it: to act as a gatekeeper so we don't overwhelm the API and get cut off. Parallelization was not the aim here. I do agree, however, that it could be refactored a bit.
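
For context, a queue like this is typically fed by an enqueue-style method along these lines. The method name and return type are assumptions; only processQueue appears in the diff:

```typescript
// Sketch: callers hand requests to the gatekeeper and get back a promise
// that processQueue settles via the stored resolve/reject callbacks.
private queueRequest(url: string): Promise<any> {
  return new Promise((resolve, reject) => {
    this.requestQueue.push({ url, resolve, reject });
    // Fire-and-forget: kick off the drain loop if it is not already running.
    void this.processQueue();
  });
}
```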

@sbarrio sbarrio merged commit 00e5abf into develop Feb 6, 2026
11 checks passed
@sbarrio sbarrio deleted the sbarrio/chore/rumauto-benchmark-network-improvements branch February 6, 2026 10:42
@sbarrio sbarrio mentioned this pull request Feb 9, 2026
4 tasks