[FIX][Benchmarks] RumAuto scenario networking improvements #1121
Conversation
…sbarrio/chore/rumauto-benchmark-network-improvements
```ts
private async processQueue(): Promise<void> {
  if (this.isProcessing) {
    return;
  }

  this.isProcessing = true;

  while (this.requestQueue.length > 0 && this.activeRequests < MAX_CONCURRENT_REQUESTS) {
    const request = this.requestQueue.shift();
    if (!request) break;

    this.activeRequests++;

    try {
      await this.delay(REQUEST_DELAY_MS);

      const response = await fetch(request.url);

      if (!response.ok) {
        throw new Error(`HTTP ${response.status}: ${response.statusText}`);
      }

      const data = await response.json();
      request.resolve(data);
    } catch (error) {
      request.reject(error);
    } finally {
      this.activeRequests--;
    }
  }

  this.isProcessing = false;

  if (this.requestQueue.length > 0) {
    this.processQueue();
  }
}
```
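For context, the diff only shows the drain side of the queue; the entries being shifted presumably come from a companion enqueue method along these lines (the `QueuedRequest` shape and the `enqueue` name are assumptions for illustration, not part of the diff):

```ts
// Assumed shape of the queued entries (not shown in the diff):
interface QueuedRequest {
  url: string;
  resolve: (data: unknown) => void;
  reject: (reason?: unknown) => void;
}

// Inside the same class as processQueue(); the name `enqueue` is hypothetical.
private enqueue(url: string): Promise<unknown> {
  return new Promise((resolve, reject) => {
    // Park the request; processQueue() drains it at a throttled pace.
    this.requestQueue.push({ url, resolve, reject });
    void this.processQueue();
  });
}
```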
I don't care too much since this is for the benchmarks app, but I believe this function should be refactored a bit: there is no concurrency happening here, mostly due to the `await` inside the `while` loop, which makes the requests process sequentially, and `isProcessing` prevents any requests from re-entering `processQueue`.
So this is simply deferring the execution of requests, not parallelizing them.
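For illustration, a variant that actually exercises `MAX_CONCURRENT_REQUESTS` would start requests without awaiting them inside the loop; a rough sketch reusing the `QueuedRequest` shape above, not code from this PR:

```ts
// Rough sketch of a concurrent variant: keep up to MAX_CONCURRENT_REQUESTS
// in flight at once instead of awaiting each request inside the loop.
private processQueue(): void {
  while (this.requestQueue.length > 0 && this.activeRequests < MAX_CONCURRENT_REQUESTS) {
    const request = this.requestQueue.shift();
    if (!request) break;

    this.activeRequests++;
    this.handleRequest(request) // not awaited: requests run in parallel
      .finally(() => {
        this.activeRequests--;
        this.processQueue(); // pull the next queued request, if any
      });
  }
}

private async handleRequest(request: QueuedRequest): Promise<void> {
  try {
    await this.delay(REQUEST_DELAY_MS);
    const response = await fetch(request.url);
    if (!response.ok) {
      throw new Error(`HTTP ${response.status}: ${response.statusText}`);
    }
    request.resolve(await response.json());
  } catch (error) {
    request.reject(error);
  }
}
```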
Yeah, that's the whole point of it: to act as a gatekeeper so we don't overwhelm the API and get cut off. Parallelization was not the aim here. I do agree, however, that it could be refactored a bit.
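If pacing requests is the only goal, the same idea could be expressed without the unused concurrency bookkeeping; a minimal sketch under that assumption, not part of this PR:

```ts
// Possible simplification if throttling, not concurrency, is the goal:
// drain the queue strictly one request at a time with a fixed delay between them.
private async processQueue(): Promise<void> {
  if (this.isProcessing) return;
  this.isProcessing = true;

  while (this.requestQueue.length > 0) {
    const request = this.requestQueue.shift()!;
    try {
      await this.delay(REQUEST_DELAY_MS); // pace requests to stay under the API rate limit
      const response = await fetch(request.url);
      if (!response.ok) {
        throw new Error(`HTTP ${response.status}: ${response.statusText}`);
      }
      request.resolve(await response.json());
    } catch (error) {
      request.reject(error);
    }
  }

  this.isProcessing = false;
}
```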
What does this PR do?
This PR introduces 2 improvements to the rumAuto scenario on the benchmark app, both aimed at improving the reliability and stability of the synthetics runs executing this scenario; in particular, the scenario's network requests are now queued and throttled so they no longer get 429'd by the test API.
Motivation
The RUM Auto scenario test runs are flaky and unreliable due to the request rate limits imposed by the test API.
Review checklist (to be filled by reviewers)