Include the details of executed actions in Log Streams

When streaming log entries to our own custom logging system, it should be possible to capture all the same information that is available in the log viewer in the Auth0 management portal. Currently there is a functionality gap with log streaming (at least for a custom webhook stream, which is what I am using): the details of any custom actions that were executed as part of an event are missing. These details should be included in the data sent to the log stream.

What we get in the log stream data for executed actions looks like this: "actions": { "executions": ["random_unique_identifier"] } - in other words, it is a pointer to the action execution log, not the data itself. This should be replaced with all the data available for the action execution, which looks like this:

{
  "action_name": "My action name",
  "response": {
    "logs": "Custom data written to console.log by our code...",
    "stats": {
      "total_request_duration_ms": 414,
      "total_runtime_execution_duration_ms": 411,
      "runtime_processing_duration_ms": 4,
      "action_duration_ms": 345,
      "runtime_external_call_duration_ms": 62,
      "boot_duration_ms": 66,
      "network_duration_ms": 3
    }
  },
  "started_at": "2022-04-28T11:07:08.986801291Z",
  "ended_at": "2022-04-28T11:07:09.402169426Z"
}
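For anyone hitting this today, one workaround is to have whatever receives the webhook resolve those execution IDs via the Management API's "get an execution" endpoint (GET /api/v2/actions/executions/{id}). A rough Node/TypeScript sketch of that lookup - the domain and token values are placeholders, and built-in fetch (Node 18+) is assumed:

// Resolve the "actions.executions" IDs from a streamed log event into the
// full execution records shown above, via the Auth0 Management API.
// AUTH0_DOMAIN and MGMT_API_TOKEN are placeholders; the token needs
// permission to read actions.
interface LogEvent {
  actions?: { executions?: string[] };
}

const AUTH0_DOMAIN = process.env.AUTH0_DOMAIN!;     // e.g. "your-tenant.auth0.com"
const MGMT_API_TOKEN = process.env.MGMT_API_TOKEN!;

async function resolveActionExecutions(event: LogEvent): Promise<unknown[]> {
  const ids = event.actions?.executions ?? [];
  return Promise.all(
    ids.map(async (id) => {
      const res = await fetch(
        `https://${AUTH0_DOMAIN}/api/v2/actions/executions/${id}`,
        { headers: { Authorization: `Bearer ${MGMT_API_TOKEN}` } }
      );
      if (!res.ok) throw new Error(`Execution lookup failed for ${id}: ${res.status}`);
      return res.json(); // shape matches the execution record above
    })
  );
}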

Actions are extremely useful, but diagnosing unexpected behaviour in them in production is currently painful: simply finding the relevant log entry in your user interface is quite a challenge, and because you can only look at one entry at a time, you can't find patterns in the data. Letting us stream our own custom log data from console.log writes in an action to a custom logging environment (Elasticsearch in my case) would make it much easier to work with actions in production.
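In the meantime, one way to make those console.log writes easier to work with (today they only surface in the portal's execution details, as the response.logs field above) is to emit structured JSON from the action itself. A sketch, assuming a post-login trigger - the field names are just illustrative:

// Sketch of an Auth0 Action (post-login trigger) that logs structured JSON,
// so the lines can be parsed into fields if they ever reach an external
// log store. The trigger signature is Auth0's; the log shape is illustrative.
exports.onExecutePostLogin = async (event: any, api: any) => {
  console.log(JSON.stringify({
    msg: "post-login action ran",
    user_id: event.user?.user_id,
    connection: event.connection?.name,
    ts: new Date().toISOString(),
  }));
};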

Hey there! Thank you for creating this feedback card! Let’s see if it gets some additional advocates upvoting it!

Hi @wwarby - I am trying to send the logs from Auth0 to Elasticsearch through the "Auth0 Log Streams Integration". What payload URL should we configure in the custom webhook settings? Kindly advise.

Below are the two docs that I am following, but I am not clear on what the payload URL should be.

Auth0 doc: Create Custom Log Streams Using Webhooks
Elasticsearch Auth0 Log Streams Integration: Auth0 | Elastic docs

Any input will be helpful for our initial integration.

Thanks,
Parani.


Hi @parani.kumar, I can't help you there I'm afraid. My Elasticsearch environment isn't web-facing, so I send my logs to a custom API written in C#, which then writes them to Elasticsearch using the Elasticsearch client for .NET.
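For anyone who wants to copy that pattern, a minimal sketch of the receiving API in Node/TypeScript (my actual service is C#, but the shape is the same) - assuming Express and the official @elastic/elasticsearch client, with the index name and credentials as placeholders:

// Minimal webhook receiver that accepts Auth0 log stream POSTs and writes
// the events to Elasticsearch. ES_URL, ES_API_KEY and the "auth0-logs"
// index name are placeholders.
import express from "express";
import { Client } from "@elastic/elasticsearch";

const es = new Client({
  node: process.env.ES_URL!,
  auth: { apiKey: process.env.ES_API_KEY! },
});

const app = express();
app.use(express.json({ limit: "5mb" })); // Auth0 may batch many events per POST

app.post("/auth0-logs", async (req, res) => {
  // The webhook can deliver a single object or an array of log events.
  const events = Array.isArray(req.body) ? req.body : [req.body];
  const operations = events.flatMap((doc) => [{ index: { _index: "auth0-logs" } }, doc]);
  await es.bulk({ operations });
  res.sendStatus(200);
});

app.listen(3000);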

Hi Parani,

I am having the exact same problem. We are using an elastic cloud deployment. Did you manage to figure that out?

Thanks a lot in advance.

Cheers,
Alei

Hi @a.salem - Not yet, it is still an open item to be addressed. As a matter of fact, I am going to resume this work item - I will keep you posted if I figure out a solution. Thanks!

Hey there!

As this topic is related to Actions, and Rules & Hooks are being deprecated soon in favor of Actions, I'm excited to let you know about our next Ask Me Anything session in the Forum on Thursday, January 18, with the Rules, Hooks and Actions team on Rules & Hooks and why Actions matter! Submit your questions in the thread above and our esteemed product experts will provide written answers on January 18. Can't wait to see you there!

Learn more here!

@konrad.sopala I’ve upvoted this but am also replying for extra visibility. Log streams are not useful today for debugging actions. We’re adding API calls to our actions to send events into Datadog despite already having log stream integrations configured for Datadog - it feels like more work than should be necessary just to see useful information from our action executions.
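Concretely, that workaround looks something like the sketch below: calling Datadog's v2 logs intake directly from inside the action. The API key is assumed to be stored as an action secret, and the ddsource/service values are illustrative:

// Sketch of the workaround: POST a log line straight to Datadog's v2 logs
// intake from inside an action, because the log stream doesn't carry
// execution details. DD_API_KEY is assumed to be an action secret.
exports.onExecutePostLogin = async (event: any, api: any) => {
  await fetch("https://http-intake.logs.datadoghq.com/api/v2/logs", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "DD-API-KEY": event.secrets.DD_API_KEY,
    },
    body: JSON.stringify([{
      ddsource: "auth0-action",
      service: "auth0-post-login",
      message: `post-login action ran for ${event.user?.user_id}`,
    }]),
  });
};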


Would also love to have this feature. We want to be able to set up SLOs/monitors based on how quickly or slowly our actions pipeline executes different parts of the overall process, and we need this level of logging to do that!
