My current understanding is that Netlify Functions does not support any binary data format, so the request body just arrives at the function as a string.
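For example, a minimal function that only inspects its input would see the body as a string (a hypothetical function written just to illustrate this; the name echo-body and the response shape are my own placeholders):

// functions/echo-body.js — hypothetical, only to illustrate the point
exports.handler = async (event) => {
  console.log(typeof event.body); // logs "string" for a POSTed body
  return {
    statusCode: 200,
    body: JSON.stringify({ receivedLength: event.body.length })
  };
};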
I came up with a workaround. I have no idea whether it is a good way to solve the problem, nor am I aware of the trade-offs, but it is working for me. There may be unnecessary steps in it, too.
I first transform the Blob (with a MIME type of audio/mpeg) generated by the MediaRecorder API into an ArrayBuffer object:
const arrayBuffer = await new Response(audioBlob).arrayBuffer();
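For context, the audioBlob above comes out of the MediaRecorder API roughly like this (a sketch; the recordAudio/onBlob names, the chunk handling, and the audio/mpeg type are assumptions based on my setup):

async function recordAudio(onBlob) {
  const chunks = [];
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  recorder.ondataavailable = e => chunks.push(e.data);
  // Hand back a single Blob once recording stops
  recorder.onstop = () => onBlob(new Blob(chunks, { type: "audio/mpeg" }));
  recorder.start();
  return recorder; // the caller invokes recorder.stop() when done
}

As an aside, audioBlob.arrayBuffer() should be an equivalent, more direct alternative to the new Response(audioBlob) wrapper.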
The conversion to an ArrayBuffer is necessary because later, inside the Netlify Function, I have to decode the information back into binary data to save it to S3. I am using Node.js, and Node cannot handle a Blob, but it can handle a Buffer.
Then I transform the arrayBuffer into a regular string:
// Build the string byte by byte; spreading the whole array into
// String.fromCharCode(...) can exceed the engine's argument limit
// for large recordings.
let audioBufferStringified = "";
for (const byte of new Uint8Array(arrayBuffer)) {
  audioBufferStringified += String.fromCharCode(byte);
}
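To make the encoding concrete, each byte becomes the character with the same code, so a tiny two-byte buffer round-trips like this (a toy example, separate from the upload flow):

const demoBytes = new Uint8Array([72, 105]);
let encoded = "";
for (const byte of demoBytes) {
  encoded += String.fromCharCode(byte); // encoded ends up as "Hi"
}
// Mapping each character back recovers the original bytes [72, 105]
const decoded = Uint8Array.from([...encoded].map(ch => ch.charCodeAt(0)));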
This string is what I send as the request body to the Netlify Function:
const response = await fetch(
  `/.netlify/functions/upload-audio?user=${user}&filename=${filename}`,
  {
    headers: { Accept: "application/json" },
    method: "POST",
    body: audioBufferStringified
  }
);
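On the server side, that string arrives as event.body in the handler, which is where the decoding below takes place (a sketch of the surrounding function; the handler shape is Netlify's standard signature, the rest is assumed):

// functions/upload-audio.js
exports.handler = async (event) => {
  const { user, filename } = event.queryStringParameters;
  const audioBufferStringified = event.body; // the stringified audio bytes
  // ... the decoding and S3 upload shown below go here ...
  return { statusCode: 200, body: JSON.stringify({ uploaded: true }) };
};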
Inside the Netlify Function, I map the string back into an ArrayBuffer object:
// Note: despite the name, this holds an ArrayBuffer (because of the
// trailing .buffer), not a Uint8Array.
const audioUint8Array = Uint8Array.from(
  [...audioBufferStringified].map(ch => ch.charCodeAt(0))
).buffer;
and then transform the ArrayBuffer into a Buffer whose bytes still represent valid audio/mpeg data (the Buffer itself carries no MIME type).
const audioBuffer = Buffer.from(audioUint8Array);
This Buffer I can store directly in AWS S3, and the stored object can be played back later as a valid MP3 file.
const resourceKey = `uploads/${user}/${filename}.mp3`;
const params = {
  Bucket: "bucketname",
  Key: resourceKey,
  Body: audioBuffer,
  ACL: "public-read",
  ContentType: "audio/mpeg",
  ContentDisposition: "inline"
};
const uploadResult = await s3.upload(params).promise();
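For completeness, the s3 client used above is created with aws-sdk v2 along these lines (the environment variable names are my own placeholders; the standard AWS_ names are reserved inside the function's environment):

const AWS = require("aws-sdk");
const s3 = new AWS.S3({
  accessKeyId: process.env.MY_AWS_ACCESS_KEY_ID, // placeholder names
  secretAccessKey: process.env.MY_AWS_SECRET_ACCESS_KEY
});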
The whole process is:
Blob → ArrayBuffer → string → (request to the Netlify Function) → string → ArrayBuffer → Buffer
I would appreciate comments on any issues with this code that I am missing.