My use case is a script I'm writing to migrate some fields from user_metadata to app_metadata for all users. Given that some tenants I need to run this against have over 1000 users, I have to use the /api/v2/jobs APIs to export all users, fetch the resulting file upon completion, and then import the updated user objects (with upsert: true).
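For context, I'm creating the export job roughly like this (a sketch; buildExportJobRequest is just a helper I've factored out for clarity, and the endpoint, format and fields names are taken from the /api/v2/jobs/users-exports docs I'm following):

```javascript
// Builds the request for POST /api/v2/jobs/users-exports.
// domain, token and connectionId are placeholders for my tenant's values.
function buildExportJobRequest(domain, token, connectionId) {
  return {
    url: `https://${domain}/api/v2/jobs/users-exports`,
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        connection_id: connectionId,
        format: 'json', // the export arrives as a gzipped file
        fields: [
          { name: 'user_id' },
          { name: 'user_metadata' },
          { name: 'app_metadata' },
        ],
      }),
    },
  };
}

// Usage (job.id is what I then poll on):
// const { url, options } = buildExportJobRequest(process.env.AUTH0_DOMAIN, token, 'con_xxx');
// const job = await fetch(url, options).then(res => res.json());
```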
I have tried both the axios and fetch libraries to get the exported users file (.../my-tenant-name.json.gz...) from within my Node.js script, in each case trying to decompress the response.data contents as gzip, compress, and br. I've tried setting responseType to 'arrayBuffer' as well as using response.data.toString('utf-8')...
I always seem to get errors during the decompression step. For example, using the node zlib module:
Error: incorrect header check
I am sure the file exists: I have code polling the jobs endpoint, waiting for either state === 'completed' or an error state/condition. If I console.log() the location and paste it into the browser, it downloads fine, as other similar questions have noted. My problem is that this all needs to be non-interactive and handled in code.
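My polling logic looks roughly like this (a simplified sketch; waitForJob and checkJob are my own names, checkJob being the function that GETs /api/v2/jobs/{id}, and I'm assuming the status field on the job object):

```javascript
// Polls until the job completes or fails. checkJob is injected (it does the
// actual GET /api/v2/jobs/{id} call) so the loop itself is easy to test.
async function waitForJob(checkJob, { intervalMs = 2000, maxAttempts = 60 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const job = await checkJob();
    if (job.status === 'completed') return job; // job.location is the .json.gz URL
    if (job.status === 'failed') {
      throw new Error(`Export job failed: ${JSON.stringify(job)}`);
    }
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for export job');
}
```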
Here are two of the several ways I have tried to fetch the export output file:
const response = await axios.get(exportResult.location, { responseType: 'arrayBuffer' });
and…
const response = await fetch(exportResult.location).then(res => zlib.unzipSync(res.body.toString()));
Any suggestions greatly appreciated!