
Rate limit transparency #942

Open
silvertech-daniel opened this issue Dec 17, 2024 · 1 comment

Comments

@silvertech-daniel

Describe the bug

The client needs to handle rate limiting transparently, and separately from the server-error retry logic.

To Reproduce

Fire enough sufficiently large requests quickly enough. I hit this during Next.js's site-wide build-time SSR, and I have to work around it by wrapping every server-side client fetch() and loadQuery() call with my own rate-limit logic, which has to be more conservative than necessary because it has no access to the response.

Expected behavior

While #199 addresses automatic retries, which is appropriate for server errors, API rate limits should be handled using the headers in the response. For example, a 429 response includes a retry-after header giving the actual number of seconds the client should wait before retrying. However, the client should rarely see a 429 at all, because it should also be using the ratelimit-limit, ratelimit-remaining, and ratelimit-reset headers from previous successful responses to throttle proactively. Rate limiting should be handled by a globally shared fetch queue.
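
For illustration, a minimal sketch of the proactive half of this, assuming the ratelimit-remaining and ratelimit-reset headers are present and that ratelimit-reset is the number of seconds until the window resets. throttledFetch is a hypothetical wrapper, not an existing @sanity/client API, and the bookkeeping is deliberately simplified (a real version would serialize through a shared queue as described above):

// Sketch only: throttle proactively based on ratelimit-* headers from
// previous responses, so a 429 is rarely hit. `throttledFetch` is an
// illustrative name, not part of @sanity/client.
let remaining = Infinity
let resetAt = 0 // epoch ms when the current rate-limit window resets

const throttledFetch = async (input: RequestInfo, init?: RequestInit) => {
	// Budget exhausted: wait for the window to reset before sending.
	if (remaining <= 0 && Date.now() < resetAt) {
		await new Promise(resolve => setTimeout(resolve, resetAt - Date.now()))
	}

	const response = await fetch(input, init)

	// Remember the budget the server reports for the next call.
	const remainingHeader = response.headers.get('ratelimit-remaining')
	const resetHeader = response.headers.get('ratelimit-reset')
	if (remainingHeader !== null) remaining = Number(remainingHeader)
	if (resetHeader !== null) resetAt = Date.now() + Number(resetHeader) * 1000

	return response
}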

Screenshots

(screenshot attached)

Which versions of Sanity are you using?

sanity 3.64.0 (latest: 3.67.1)

What operating system are you using?

N/A

Which versions of Node.js / npm are you running?

N/A

Additional context

N/A

Security issue?

No

@silvertech-daniel
Author

This is my workaround (I use a promise pool to limit concurrency heuristically, because I have no access to the actual ratelimit-* headers):

import { newQueue } from '@henrygd/queue'
import type { ClientError } from 'next-sanity'

// Wait out a retry-after value, which may be a number of seconds or an
// HTTP date. Falls back and clamps so a bad header can't stall the build.
const waitToRetryAfter = (
	retryAfter?: string | null,
	{
		fallback = 10,
		max = 15
	}: {
		fallback?: number
		max?: number
	} = {}
) => {
	let seconds: number | undefined
	if (retryAfter) {
		seconds = Number(retryAfter)

		// Not a plain number of seconds, so try it as an HTTP date and
		// convert the remaining milliseconds to seconds.
		if (!isFinite(seconds)) {
			seconds = Math.ceil((new Date(retryAfter).getTime() - Date.now()) / 1000)
		}

		if (!isFinite(seconds)) {
			seconds = undefined
		}
	}
	// Fallback, add an extra little safety second, and clamp the wait time
	seconds = Math.min(Math.max(0, seconds ?? fallback) + 1, max)

	return new Promise<void>(resolve => setTimeout(resolve, seconds * 1000))
}

type RateLimitQueueItem<R = unknown> = {
	f: () => Promise<R>
	resolve: (value: R) => void
	reject: (reason?: unknown) => void
}

// Pending work items, plus a pool that caps concurrent in-flight requests.
const rateLimitQueue = new Set<RateLimitQueueItem>()
const maxConcurrencyQueue = newQueue(10)

// Single processor loop; the flag ensures only one runs at a time.
let isProcessing = false
const processQueue = async () => {
	if (isProcessing) return
	isProcessing = true

	while (rateLimitQueue.size) {
		// Dispatch every pending item into the concurrency pool.
		for (const item of rateLimitQueue) {
			maxConcurrencyQueue
				.add(async () => {
					try {
						return await item.f()
					} catch (e) {
						// Only retry errors that carry a retry-after header;
						// anything else propagates to the caller.
						const clientError =
							e && typeof e === 'object' && 'response' in e
								? (e as ClientError)
								: undefined
						if (!clientError) throw e

						const retryAfter = clientError.response.headers['retry-after'] as
							| string
							| undefined
						if (!retryAfter) throw e

						console.error(`Retrying after "${retryAfter}"`, e)

						// Back off: stop dispatching queued work, wait out the
						// rate limit, then retry this item. Items that never
						// started stay in rateLimitQueue and are re-dispatched
						// by the outer while loop.
						maxConcurrencyQueue.clear()
						await waitToRetryAfter(retryAfter)
						return await item.f()
					}
				})
				.then(item.resolve)
				.catch(item.reject)
				.finally(() => {
					rateLimitQueue.delete(item)
				})
		}

		await maxConcurrencyQueue.done()
	}

	isProcessing = false
}

// Public entry point: queue a request factory and kick off processing.
export const enqueue = async <R>(f: RateLimitQueueItem<R>['f']) =>
	new Promise<R>((resolve, reject) => {
		rateLimitQueue.add({
			f,
			resolve: resolve as any,
			reject
		})

		processQueue()
	})
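
A usage sketch (the import paths and query are illustrative, not from my actual codebase):

// Hypothetical call site: route every server-side request through enqueue()
// so all fetches share the same queue. Import paths are illustrative.
import { client } from '@/sanity/client'
import { enqueue } from '@/lib/rate-limit-queue'

const posts = await enqueue(() =>
	client.fetch<{ title: string }[]>(`*[_type == "post"]{ title }`)
)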
