Get user accounts on a daily basis

I want to pull details of users that were created each day, and I also need to extract data from their user metadata and other flag values. Are there any APIs or services available for this?

If you want to get your new users from the past 24 hours, and you have <= 1,000 new users, you can query the /users endpoint with a query like:

created_at:[2018-12-30T00:00:00.000Z TO 2018-12-30T23:59:59.999Z]
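As a rough sketch, that query could be run against the Auth0 Management API's /api/v2/users search endpoint like this (the tenant domain and token below are placeholders, and error handling is omitted):

```python
import json
import urllib.parse
import urllib.request
from datetime import datetime

# Placeholders -- substitute your own tenant domain and Management API token.
AUTH0_DOMAIN = "YOUR_TENANT.auth0.com"
MGMT_API_TOKEN = "YOUR_MGMT_API_TOKEN"

def daily_window_query(day: datetime) -> str:
    """Build the created_at range query for one UTC day."""
    d = day.strftime("%Y-%m-%d")
    return f"created_at:[{d}T00:00:00.000Z TO {d}T23:59:59.999Z]"

def fetch_new_users(day: datetime) -> list:
    """Page through /api/v2/users; note the search API caps out at 1,000 results."""
    users, page = [], 0
    while True:
        params = urllib.parse.urlencode({
            "q": daily_window_query(day),
            "search_engine": "v3",
            "per_page": 100,
            "page": page,
        })
        req = urllib.request.Request(
            f"https://{AUTH0_DOMAIN}/api/v2/users?{params}",
            headers={"Authorization": f"Bearer {MGMT_API_TOKEN}"},
        )
        with urllib.request.urlopen(req) as resp:
            batch = json.load(resp)
        if not batch:
            return users
        users.extend(batch)
        page += 1
```

Once you have the user objects, the user_metadata and app_metadata attributes are just fields on each returned dictionary.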

If you have > 1,000 new users, but <= 10,000 new users per 24 hours, you can use the Import/Export Extension. Same query syntax as above, but you need to manually specify which attributes you want returned (or accept the default set). Queries cannot be saved, which makes this method painful if you have a lot of attributes to specify and need to run the query regularly.

For more than 10,000 results you need to use the users-exports/ endpoint. This option always returns your entire user database, with either a default set of attributes or a set of attributes specified by you. You then need to filter the results yourself.

Hi Mark, thanks, that helped a lot.
But I have one concern: in a scenario where you have > 1,000 new users every day and have to fetch them on a daily basis, do I have to use the Import/Export Extension only? It is too manual for me. Or is there another way to automate the process?

Edit: This is a “backup” script I wrote that implements steps 1 - 2.1 from below and should help you get started. You need to change the export_job dictionary to export the data fields you want, and you still need to implement the code that will filter and massage the results. Refer to my comments here regarding this script and other solutions.

Unfortunately, at the moment your only option is to use the users-exports/ endpoint to export your entire user database, and then filter the results. This can be automated with a script that implements the following logic:

  1. Submit a job file to users-exports/ endpoint,
    1.1 Retrieve the job ID from the result,
  2. Start a loop, checking the jobs/ endpoint for "status": "completed",
    2.1 Once completed, retrieve the results file from the "location" specified in the data returned by the jobs/ endpoint,
  3. Manually process the file to filter and manipulate the data as needed.

It's important to note that the results file is not (I think) a proper JSON file (assuming you specify json in your job file). Instead, it is a text file where each line is a JSON string containing a single user's profile, one user per line.
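Assuming that line-delimited format, the filtering step could look something like this (the created_at field name and timestamp format match the query examples above; this is only a sketch):

```python
import json
from datetime import datetime, timedelta, timezone

def users_created_since(ndjson_text: str, cutoff: datetime) -> list:
    """Parse one JSON object per line and keep users created at or after `cutoff`."""
    recent = []
    for line in ndjson_text.splitlines():
        if not line.strip():
            continue                      # skip blank lines
        user = json.loads(line)
        created = datetime.strptime(user["created_at"],
                                    "%Y-%m-%dT%H:%M:%S.%fZ")
        if created.replace(tzinfo=timezone.utc) >= cutoff:
            recent.append(user)
    return recent

# Example usage: keep only users created in the last 24 hours.
# cutoff = datetime.now(timezone.utc) - timedelta(hours=24)
# daily_users = users_created_since(open("export.json").read(), cutoff)
```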

Yes Mark, I have implemented the created_at:[2018-12-30T00:00:00.000Z TO 2018-12-30T23:59:59.999Z] query to fetch records on a daily basis for now. But if the record count sometimes grows beyond 1,000, then I have to use the users-exports endpoint, and after exporting the file I have to apply logic to filter out the records for the past 24 hours. Is that what you are saying?

That’s correct. The only way to automate retrieval of >1,000 results is via the jobs/ endpoint.


Thanks Mark, That’s all I need from you. Much appreciated. :slight_smile:


This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.