Learn about Alkemy's API rate limits and how to work with them.
The Alkemy API employs a number of safeguards against bursts of incoming traffic to help maximize its stability. Users who send many requests in quick succession may see error responses with status code 429. The API has several limiters, including:
- A rate limiter that limits the number of requests received by the API within any given second. For most APIs, Alkemy allows up to 100 API calls per second and 50,000 calls per day in live mode, and 25 calls per second and 15,000 calls per day in Sandbox mode.
- A concurrency limiter that limits the number of requests that are active at any given time. Problems with this limiter are less common than with the request rate limiter, but they are more likely to be caused by resource-intensive, long-lived requests.
Treat these limits as maximums and don’t generate unnecessary load. See Handling limiting gracefully below for advice on handling 429s. If you suddenly see a rising number of rate-limited requests, please contact support.
We may reduce limits to prevent abuse, or increase limits to enable high-traffic applications. To request an increased rate limit, please contact support. If you’re requesting a large increase, contact us 6 weeks in advance of when you’ll need the increased rate limit.
Handling Limiting Gracefully
A basic technique for integrations to gracefully handle limiting is to watch for 429 status codes and build in a retry mechanism. The retry mechanism should follow an exponential backoff schedule to reduce request volume when necessary. We’d also recommend building some randomness into the backoff schedule to avoid a thundering herd effect.
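As a rough sketch of this approach, the snippet below retries a request when it receives a 429 response, backing off exponentially with jitter between attempts. The endpoint URL, API key, and retry parameters are illustrative assumptions, not part of the Alkemy API.

```python
import random
import time

import requests

API_URL = "https://api.alkemy.example/v1/charges"  # hypothetical endpoint
API_KEY = "sk_test_example"                        # hypothetical key


def request_with_backoff(max_retries=5, base_delay=0.5, max_delay=30.0):
    """Retry on 429 responses using exponential backoff with jitter."""
    for attempt in range(max_retries + 1):
        response = requests.get(
            API_URL, headers={"Authorization": f"Bearer {API_KEY}"}
        )
        if response.status_code != 429:
            return response  # success or a non-rate-limit error; let the caller handle it

        if attempt == max_retries:
            break

        # Exponential backoff: 0.5s, 1s, 2s, ... capped at max_delay,
        # with full jitter so clients don't all retry in lockstep.
        delay = min(max_delay, base_delay * (2 ** attempt))
        time.sleep(random.uniform(0, delay))

    return response  # still rate limited after exhausting retries
```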
You can only optimize individual requests to a limited degree, so an even more sophisticated approach is to control traffic to Alkemy at a global level and throttle it back if you detect substantial rate limiting. A common technique for controlling rate is to implement something like a token bucket rate-limiting algorithm on the client side. Ready-made, mature token bucket implementations are available in almost every programming language.
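A minimal sketch of that idea is a small client-side token bucket: each outgoing request consumes a token, and tokens refill at a fixed rate, so sustained throughput stays under the limit you choose. The capacity and refill rate below are illustrative, and in practice a mature library is usually preferable to rolling your own.

```python
import threading
import time


class TokenBucket:
    """Minimal client-side token bucket limiter (illustrative, not production-hardened)."""

    def __init__(self, rate_per_second: float, capacity: float):
        self.rate = rate_per_second   # tokens added per second
        self.capacity = capacity      # maximum burst size
        self.tokens = capacity
        self.last_refill = time.monotonic()
        self.lock = threading.Lock()

    def _refill(self) -> None:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last_refill) * self.rate)
        self.last_refill = now

    def acquire(self) -> None:
        """Block until a token is available, then consume it."""
        while True:
            with self.lock:
                self._refill()
                if self.tokens >= 1:
                    self.tokens -= 1
                    return
            time.sleep(0.01)  # wait briefly before checking again


# Example: keep client traffic comfortably under a 100-requests-per-second limit.
bucket = TokenBucket(rate_per_second=80, capacity=80)
# bucket.acquire()  # call before each outgoing Alkemy API request
```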