Caching strategies
Caching is difficult to get right and often requires pulling additional frameworks into your code. Fine-tuning the balance between performance and data freshness takes time and experience. In the case of User-Agent integrations (for example, an application UI running in your user's browser), it is even more crucial, as the User-Agent is rarely under your control and yet demands fast response times. This is why, in many cases, we've opted to provide cache recommendations on the Authress side.
That doesn't mean you can't cache returned values for longer. If you're making many of the same low-variability permission checks, you may want to build a cache on top of Authress to limit your costs, though it is not strictly necessary.
General caching strategies
Frequently the goal is to cache Authress Authorization Requests as much as is useful. The following strategies review the available possibilities. Recommended cache times are always returned in the Cache-Control header in the response from Authress Authorization User Permission Requests.
A. Cloud API Gateways
If you run an API Gateway through one of the hyperscalers or through a third party, there is usually a built-in caching strategy that supports caching data for a short period of time. If data can be cached on a per-request basis, then adding details about the user's permissions and authorization into the cache is an option.
Depending on the API Gateway, this can work better for serverless solutions; with some providers it might not work at all. One example is AWS API Gateway: its caching uses the Access Token as the cache key and not the Resource URI path. Therefore, it cannot be used to store specific path-based URI authorizations.
This means first-level authorizations can be stored here fairly effectively. An example would be a list of all the tenants or customer accounts a user has access to. Since that list rarely changes, storing this information in the AWS API Gateway cache works well.
```javascript
import { AuthressClient, CollectionConfiguration } from '@authress/sdk';

const authressClient = new AuthressClient({ authressApiUrl: 'https://auth.yourdomain.com' });

// Fetch only the user's top-level tenant resources, suitable for caching in the API Gateway
const userResources = await authressClient.userPermissions.getUserResources(userId, `tenants/*`, 10, null, CollectionConfiguration.TOP_LEVEL_ONLY);
return {
  context: {
    // Stringify is here because some API Gateways do not support lists and only support primitives like strings
    userResources: userResources.data.resources.join(',')
  }
};
```
Caveat: Nested authorizations cannot be stored, such as does this user have the Action POST for Resource R. While that information could be stored, it would be stored and reused for all routes to your API, irrespective of Action or Resource. This would create a vulnerability in your application. If you are not sure what this means, please consult your API Gateway documentation.
B. Content Delivery Networks and Edge-based caching
A CDN can often work to proxy all requests to Authress. Instead of integrating directly with Authress, you can proxy the requests through another solution that sits in front of Authress. Some CDNs work well for this, others might not. One example is AWS CloudFront. In the experience of the Authress development team, AWS CloudFront can be a bit finicky when placed in front of services that you don't own. Some of our customers say that it has worked; others have run into limitations from CloudFront, especially regarding cache times and configuration. Usually in these cases, you need to use a Lambda@Edge function attached to your CloudFront distribution to interact with Authress.
Due to this, there might be limited value in the caching that CloudFront could provide. A common corner case we've found is that this approach is sometimes considered as a way to reduce costs. Costs are of course relevant at scale, but at that same scale we tend to offer partial volume discounts, so rather than paying for the CDN in addition to Authress, you would get the benefit directly from Authress Billing without having to write or maintain anything yourself or pay for a second technology on top (price or total cost of ownership). If you are investigating a caching solution primarily to handle costs at scale, please contact our team.
Once a request is passed to Lambda@Edge, you have full capabilities for storing and retrieving data through different data stores, such as DynamoDB. The implementation details would be up to you.
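As a minimal sketch of the cache-aside logic such an edge function could apply, assuming hypothetical function names and with a plain in-memory Map standing in for DynamoDB:

```javascript
// Cache-aside authorization check. The store and the authorize function are injected,
// so a DynamoDB wrapper and the Authress SDK can be swapped in. All names here are illustrative.
async function checkAccessWithCache(store, authorize, userId, resourceUri, permission, ttlMs = 10000) {
  const cacheKey = `${userId}|${resourceUri}|${permission}`;
  const cached = store.get(cacheKey);
  if (cached && cached.expiry > Date.now()) {
    // Cache hit: reuse the previous authorization result
    return cached.hasAccess;
  }

  // Cache miss: ask the authorization service and store the result with a TTL
  const hasAccess = await authorize(userId, resourceUri, permission);
  store.set(cacheKey, { hasAccess, expiry: Date.now() + ttlMs });
  return hasAccess;
}
```

In a real Lambda@Edge function, `store` would wrap DynamoDB reads and writes, and `authorize` would call `authressClient.userPermissions.authorizeUser`.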
Troubleshoot AWS CloudFront
One possible error you might see is related to a CloudFront stacking issue. Since Authress itself uses CloudFront, depending on your setup you might run into a stacking problem. Currently, if you are seeing this issue, there isn't a way for CloudFront to be used in your scenario, so we recommend switching to a Lambda@Edge function with CloudFront and interacting with Authress from there. This is explored further in the next sections.
C. Self-hosted internal proxy
When you are at the point of wanting a proxy to cache authorization requests, a small microservice can be created to proxy all the requests to Authress. This could run as a standalone service. The proxy would pass requests along to Authress after interacting with your cache datastore.
The Authress SDKs support an authressApiUrl configuration property. Instead of setting it to your custom domain, such as https://auth.yourdomain.com, you would set the authressApiUrl to your microservice's URL.
```javascript
import { AuthressClient } from '@authress/sdk';

// Switch this to be your cache's URL:
const authressClient = new AuthressClient({ authressApiUrl: 'https://cache.yourdomain.com' });

const userId = 'User';
// resourceId comes from the incoming request
const resourceUri = `resources/${resourceId}`;
const permission = 'READ';

try {
  await authressClient.userPermissions.authorizeUser(userId, resourceUri, permission);
} catch (error) {
  if (error.code === 'UnauthorizedError') {
    return { statusCode: 403 };
  }
  throw error;
}
```
For assistance with creating a proxy, please reach out with any questions to Authress Support.
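For illustration, a minimal sketch of the proxy's request handling might look like the following, assuming a simple TTL map and an injected forward function that relays the request to Authress (all names here are hypothetical):

```javascript
// Minimal cache-then-forward handler for an internal authorization proxy.
// `forward` relays the request to Authress and resolves to { statusCode, body }.
function createCachingProxy(forward, ttlMs = 10000) {
  const cache = new Map();
  return async function handle(method, path, accessToken) {
    // Include the access token in the key so users never share cache entries
    const cacheKey = `${method} ${path} ${accessToken}`;
    const entry = cache.get(cacheKey);
    if (entry && entry.expiry > Date.now()) {
      return entry.response;
    }

    const response = await forward(method, path, accessToken);
    // Only cache idempotent reads such as permission checks
    if (method === 'GET') {
      cache.set(cacheKey, { response, expiry: Date.now() + ttlMs });
    }
    return response;
  };
}
```

A production proxy would also honor the Cache-Control header returned by Authress rather than using a fixed TTL.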
D. Authress SDK configured caching
Recently we've been investing further resources into improving built-in caching for our SDKs; each of the SDKs has a varying level of support for caching.
Caching in the SDK works well for longer-lived containers. For sustained requests to your API, even with a serverless solution, your function will have this data cached for the lifetime of the container. This works great for balanced, predictable usage, but is less valuable for bursts. For non-serverless solutions, when caching is provided by the Authress SDK in your language, it can work out of the box.
Some SDKs support caching and caching configuration and others do not. This depends on the tools available in the language as well as on libraries supporting memoization.
In-memory caching
Depending on the sort of caching you are looking for and what your requests look like, an in-memory cache can often provide the biggest impact. It gives you full control over how caching is done. There are many options available, and which levers you pull will depend on your core needs.
Long term, if the SDK you are using doesn't support the caching configuration you need and you have a solution you have been using effectively, please let us know and we can convert your in-memory caching configuration into a first-class option in our SDK for that language.
This is an example of how a cache could work:
```javascript
import { AuthressClient } from '@authress/sdk';

const authressClient = new AuthressClient({ authressApiUrl: 'https://auth.yourdomain.com' });

// Create a cache that stores the results for 10 seconds
const cache = new Cache(10 * 1000);

const userId = 'User';
const resourceUri = `resources/${resourceId}`;
const permission = 'READ';

let hasAccess = await cache.getValue(userId, resourceUri, permission);
// No value is cached
if (hasAccess === null) {
  try {
    await authressClient.userPermissions.authorizeUser(userId, resourceUri, permission);
    await cache.storeValue(userId, resourceUri, permission, true);
    hasAccess = true;
  } catch (error) {
    if (error.code === 'UnauthorizedError') {
      // Cache the denial as well, so repeated unauthorized requests avoid extra lookups
      await cache.storeValue(userId, resourceUri, permission, false);
      hasAccess = false;
    } else {
      // Unexpected errors should not be cached; rethrow them
      throw error;
    }
  }
}

if (!hasAccess) {
  return { statusCode: 403 };
}
```
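The Cache used above is not provided by the SDK. A minimal TTL-based sketch of the interface it assumes (getValue and storeValue, with a per-entry expiry) could look like:

```javascript
// Minimal TTL cache matching the getValue/storeValue interface used above.
// getValue returns null on a miss or when an entry has expired.
class Cache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.entries = new Map();
  }

  async getValue(userId, resourceUri, permission) {
    const key = `${userId}|${resourceUri}|${permission}`;
    const entry = this.entries.get(key);
    if (!entry || entry.expiry <= Date.now()) {
      // Expired entries are evicted lazily on read
      this.entries.delete(key);
      return null;
    }
    return entry.hasAccess;
  }

  async storeValue(userId, resourceUri, permission, hasAccess) {
    const key = `${userId}|${resourceUri}|${permission}`;
    this.entries.set(key, { hasAccess, expiry: Date.now() + this.ttlMs });
  }
}
```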
Shared internal cache
One strategy that works well with multiple services, whether or not you are using serverless, is a server optimized for providing fast-lookup caches. After the Authress SDK returns a success for an authorization request, you can store the result in a cache-optimized solution. Our recommendation for this strategy is Valkey. Most cloud providers either offer a managed Valkey solution or support deploying the open-source container to your infrastructure:
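As a sketch of what that shared cache layer could look like, assuming any Valkey- or Redis-compatible client exposing get and set (the key format, helper names, and TTL below are illustrative, not part of the Authress SDK):

```javascript
// Shared authorization cache backed by a Valkey-compatible client.
// Results are stored as strings, since the cache stores strings rather than booleans.
function createSharedAuthorizationCache(client, ttlSeconds = 10) {
  const keyFor = (userId, resourceUri, permission) => `authz:${userId}:${resourceUri}:${permission}`;

  return {
    async getCachedAccess(userId, resourceUri, permission) {
      const value = await client.get(keyFor(userId, resourceUri, permission));
      // null means the shared cache has no entry, so Authress should be consulted
      return value === null ? null : value === 'true';
    },
    async storeAccess(userId, resourceUri, permission, hasAccess) {
      // 'EX' sets the expiry in seconds (ioredis-style signature)
      await client.set(keyFor(userId, resourceUri, permission), String(hasAccess), 'EX', ttlSeconds);
    }
  };
}
```

Because the cache is shared, every service instance benefits from a result cached by any other instance, which is the main advantage over per-container in-memory caching.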
Further Caching Support
Have some ideas that aren't listed here and want to try them out? Go for it, and please let us know so we can extend our recommended caching strategies.