How do I compute a digest for a large blob, e.g. 5 GB?


I know crypto.subtle.digest can be used to generate a digest of a given ArrayBuffer.

However, when the file is large, e.g. 5 GB, I always get this error:

Uncaught (in promise) DOMException: The requested file could not be read, typically due to permission problems that have occurred after a reference to a file was acquired.


How do I solve this?


I believe the right approach is to stream the file content instead of reading the whole file into memory at once. A Blob can be read as a stream via its stream() method.

Now the problem is that the Web Cryptography API you're using doesn't support streams or incremental hashing. There is a long (and quite old) discussion about that with no clear outcome.

I would suggest using a third-party library that supports incremental hashing, e.g. jsSHA.

The resulting code could look like this:

async function calcDigest() {
    // finput is an <input type="file"> element
    const reader = finput.files[0].stream().getReader()
    const shaObj = new jsSHA("SHA-256", "ARRAYBUFFER")

    while (true) {
        const {done, value} = await reader.read()
        if (done) break
        // value is a Uint8Array chunk; feed its underlying buffer
        // into the incremental hasher
        shaObj.update(value.buffer)
    }

    console.log("digest:", shaObj.getHash("HEX"))
}

Source: stackoverflow