Api.cache in M2M flow action question

I am using the api.cache functionality in an M2M (client credentials) API to set key/value pairs. As a test, I am keying the cache entry on the application name, and I have two applications. On each call to request a bearer token for the API, I simply write the key, increment the number of times it has been called, and then issue a deny. I am not setting an explicit expiration time; per the documentation, the value should be cached for 15 minutes.
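
For context, the test action is essentially the following (a rough sketch; the deny code and message are just there so I can observe the count):

```javascript
// Rough sketch of the test action (M2M / client credentials exchange trigger).
// Counts how many times each application has requested a token, then denies.
exports.onExecuteCredentialsExchange = async (event, api) => {
  const key = event.client.name;      // key the cache entry on the application name
  const record = api.cache.get(key);  // returns { value, expires_at } or undefined
  const count = record ? parseInt(record.value, 10) + 1 : 1;

  // No explicit expiration, so the default (up to 15 minutes) should apply.
  api.cache.set(key, String(count));

  // Deny so the running count is easy to observe in the error description.
  api.access.deny('test_denied', `call count for ${key}: ${count}`);
};
```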

What I have found is that values ARE being cached; however, there seem to be multiple copies of the cache, because I get seemingly random values back as I count up per key. It is as if the endpoint is load balanced and each node is unaware of the other nodes.

Is this expected behavior?

Thanks for any insights.

Hi @podle,

Welcome to the Auth0 Community!

I reached out to the team to confirm. I’ll update here with the response.


Hi again,

Our engineers responded; this behavior is expected.

The main use case the cache solves is storing an API access token, where the tokens are relatively uniform but requesting additional tokens is costly. It is not meant to replace a database.

Dan,

Is there an article outlining a valid use case for the API cache? Or can you explain an example of how the main use case you are describing might work?

Without sticky sessions (you can't depend on hitting the same cache each time you call the service), I am struggling to understand a valid use case.

Thanks,
Paul

This article describes a few sample use cases: Actions Caching Is Now Available

A simple example would be caching tokens for an external API (see the sketch after this list):

  • Without caching, you would have to request a new token during every signup.
  • With caching, you only have to request a new token when one isn’t already stored in the cache.
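
In code, that check-then-fetch pattern might look roughly like this (a minimal sketch, assuming a post-login action, a runtime with a global fetch, and placeholder endpoint, secret names, and cache key):

```javascript
// Sketch of the "cache a token for an external API" pattern.
exports.onExecutePostLogin = async (event, api) => {
  let accessToken;

  const cached = api.cache.get('external-api-token'); // { value, expires_at } or undefined
  if (cached) {
    // Cache hit on this node: reuse the stored token, no extra token request.
    accessToken = cached.value;
  } else {
    // Cache miss: request a new token from the external authorization server.
    const response = await fetch('https://external.example.com/oauth/token', {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({
        grant_type: 'client_credentials',
        client_id: event.secrets.EXTERNAL_CLIENT_ID,
        client_secret: event.secrets.EXTERNAL_CLIENT_SECRET,
      }),
    });
    const { access_token, expires_in } = await response.json();

    // Store it so later executions on this node can skip the token request.
    // ttl is in milliseconds and is capped by the platform maximum (~15 minutes).
    api.cache.set('external-api-token', access_token, { ttl: (expires_in - 60) * 1000 });
    accessToken = access_token;
  }

  // ...call the external API with accessToken here.
};
```

Because each node keeps its own cache, you will still see occasional misses, so this reduces rather than eliminates the extra token requests.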

Does that make sense?


Dan,

Thanks for the link; this was helpful. Even with the limitations (not necessarily returning to the same node, and therefore the same cache, plus the 15-minute cap on cache lifetimes), the feature is good enough to prevent overt authentication spamming, which I define as follows:

  1. The client requests a bearer token using its client_id and client_secret.
  2. The client makes a request with the bearer token.
  3. The client makes no effort to cache the bearer token and simply repeats steps 1 and 2 for every subsequent request.

This is easier to program (albeit slower) but would quickly blow through authentication request limits. Rather than use an external caching mechanism to front these requests, which would have its own issues with uptime, reliability, failure, and so on, I was hoping to use something in the platform to accomplish the same thing.

The API cache feature, as outlined in the link, handles this; it may not be optimal in terms of the actual client calls, but it is good enough to get the job done.
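
Concretely, I am picturing something along these lines in the credentials-exchange action (just a sketch; the threshold and error code are arbitrary, and since each node keeps its own cache the count is only approximate):

```javascript
// Rough per-client throttle on M2M token requests using api.cache.
// The count is approximate (each node has its own cache), but it is
// enough to catch a client that never caches its bearer token.
const MAX_TOKENS_PER_WINDOW = 10; // arbitrary threshold for illustration

exports.onExecuteCredentialsExchange = async (event, api) => {
  const key = `token-count:${event.client.client_id}`;
  const record = api.cache.get(key);
  const count = record ? parseInt(record.value, 10) + 1 : 1;

  // The default expiration (up to ~15 minutes) acts as the rolling window.
  api.cache.set(key, String(count));

  if (count > MAX_TOKENS_PER_WINDOW) {
    api.access.deny('too_many_token_requests', 'Cache and reuse your access token.');
  }
};
```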

Thanks for the link and clarifications.

Regards,
Paul Odle

