Checking MDN, I see there used to be a BlobBuilder, and that I could call blobBuilder.append to keep adding data to a blob, but according to MDN, BlobBuilder is deprecated in favor of the Blob constructor. Unfortunately the Blob constructor requires all the data to be in memory at construction time, and my data is too large for that. Looking at the File API, I see nothing there either.
Is there a way to generate large data client side and put it in a blob? For example, say I wanted to render a 16k by 16k image. Uncompressed, that's a 1 gig image.
I have an algorithm that can generate it one or a few scan lines at a time, but I need a way to write those scan lines into a file/blob. When finished, I could use the standard technique to let the user download that blob, but I can't seem to find an API that lets me stream data into a blob.
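To make the shape of the problem concrete, here's roughly what my generator looks like (`scanLines` and the pixel math are stand-ins, not my real code):

```javascript
// Rough shape of my generator: it yields one RGBA scan line at a time,
// so only ~64k is live at once -- but I have nowhere to stream each
// line to as it's produced.
function* scanLines(width = 16384, height = 16384) {
  for (let y = 0; y < height; ++y) {
    const line = new Uint8Array(width * 4); // one RGBA scan line (64k)
    // ... fill `line` with pixel data for row `y` ...
    yield line;
  }
}
```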
The only thing I can think of is that, apparently, I can make a Blob from Blobs, so I suppose I could write each part of the image to a separate blob and then combine all those blobs into one big blob.
Is that the only solution? It seems kind of, um... strange. Though if it works, then ¯\_(ツ)_/¯
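Here's a sketch of that blob-of-blobs idea, untested at this scale (`buildBlobFromChunks` and `makeChunk` are names I made up for illustration):

```javascript
// Sketch of the blob-of-blobs idea: wrap each chunk of generated data in
// its own small Blob as soon as it's produced, then combine them all into
// one big Blob at the end. `makeChunk` stands in for my scan-line generator.
function buildBlobFromChunks(numChunks, chunkSize, makeChunk) {
  const parts = [];
  for (let i = 0; i < numChunks; ++i) {
    // Each chunk becomes its own Blob immediately, in the hope that the
    // browser can page these small blobs out to disk instead of keeping
    // all the raw data in RAM.
    parts.push(new Blob([makeChunk(i, chunkSize)]));
  }
  return new Blob(parts); // a Blob made from Blobs
}

// For the 4 gig case: 4096 chunks of 1 meg each.
// const big = buildBlobFromChunks(4096, 1024 * 1024, (i, n) => new Uint8Array(n));
```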
Someone voted to close because they didn't understand the question. Here's another explanation.
Write 4 gig to a blob
const arrays = [];
for (let i = 0; i < 4096; ++i) {
  arrays.push(new Uint8Array(1024 * 1024)); // 1 meg
}
// arrays now holds 4 gig of data
const blob = new Blob(arrays);
The code above will crash because the browser will kill the page for using too much memory. With BlobBuilder, I could have done something like this:
const builder = new BlobBuilder();
for (let i = 0; i < 4096; ++i) {
  const data = new Uint8Array(1024 * 1024); // 1 meg
  builder.append(data);
}
const blob = builder.getBlob(...);
That would not have run out of memory, because there is never more than 1 meg of data around at once; the browser can flush the data appended to the BlobBuilder out to disk.
What's the new way to write 4 gig to a blob? Is it only writing lots of small blobs and then using those to generate a larger one, or is there some more traditional way, where "traditional" means streaming into some object/file/blob/storage?