
Streaming into an <audio> element

I would like to play audio from a WebSocket that sends packages of sound data of unknown total length. Playback should start as soon as the first package arrives, and it should not be interrupted by new packages.

What I have done so far:

ws.onmessage = e => {
  // Decode the base64-encoded payload into a byte buffer
  const soundDataBase64 = JSON.parse(e.data);
  const bytes = window.atob(soundDataBase64);
  const arrayBuffer = new window.ArrayBuffer(bytes.length);
  const bufferView = new window.Uint8Array(arrayBuffer);
  for (let i = 0; i < bytes.length; i++) {
    bufferView[i] = bytes.charCodeAt(i);
  }
  // Wrap the bytes in a Blob and play it through a newly created <audio> element
  const blob = new Blob([arrayBuffer], {"type": "audio/mp3"});
  const objectURL = window.URL.createObjectURL(blob);
  const audio = document.createElement("audio");
  audio.src = objectURL;
  audio.controls = "controls";
  document.body.appendChild(audio);
};

However, to my knowledge, it is not possible to extend the size of an ArrayBuffer or a Uint8Array. I would have to create a new Blob and object URL for every package and assign it to the audio element, but I guess this would interrupt the audio playback.

On the MDN page for <audio>, there is a hint about MediaStream, which looks promising. However, I am not quite sure how to write data to a MediaStream or how to connect it to an audio element.

Is it currently possible in JS to build something like a pipe, where I feed data in on one end and it is streamed to a consumer? How would seamless streaming be achieved in JS (preferably without a lot of micro-management code)?


Answer

As @Kaiido pointed out in the comments, I can use a MediaSource object. After connecting the MediaSource to an <audio> element in the DOM, I can add a SourceBuffer to the opened MediaSource and then append ArrayBuffers to that SourceBuffer.

Example:

const ws = new window.WebSocket(url);
ws.onmessage = _ => {
  // Packages that arrive before the MediaSource is opened are simply discarded
  console.log("Media source not ready yet... discard this package");
};

const mediaSource = new window.MediaSource();
const audio = document.createElement("audio");
audio.src = window.URL.createObjectURL(mediaSource);
audio.controls = true;
document.body.appendChild(audio);

mediaSource.onsourceopen = _ => {
  const sourceBuffer = mediaSource.addSourceBuffer("audio/mpeg"); // mpeg appears to not work in Firefox, unfortunately :(
  // Replace the placeholder handler: decode each package and append it to the SourceBuffer
  ws.onmessage = e => {
    const soundDataBase64 = JSON.parse(e.data);
    const bytes = window.atob(soundDataBase64);
    const arrayBuffer = new window.ArrayBuffer(bytes.length);
    const bufferView = new window.Uint8Array(arrayBuffer);
    for (let i = 0; i < bytes.length; i++) {
      bufferView[i] = bytes.charCodeAt(i);
    }
    sourceBuffer.appendBuffer(arrayBuffer);
  };
};
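
One caveat: SourceBuffer.appendBuffer throws an InvalidStateError if it is called while the buffer is still processing a previous append (i.e. while sourceBuffer.updating is true). If packages can arrive faster than they are appended, a small queue avoids this. The following is only a minimal sketch under that assumption, reusing ws and sourceBuffer from the example above:

const queue = [];

const appendNext = () => {
  // Only append when the SourceBuffer is idle and there is data waiting
  if (queue.length > 0 && !sourceBuffer.updating) {
    sourceBuffer.appendBuffer(queue.shift());
  }
};

// When the previous append has finished, try the next queued package
sourceBuffer.onupdateend = appendNext;

ws.onmessage = e => {
  const bytes = window.atob(JSON.parse(e.data));
  const bufferView = new window.Uint8Array(bytes.length);
  for (let i = 0; i < bytes.length; i++) {
    bufferView[i] = bytes.charCodeAt(i);
  }
  queue.push(bufferView.buffer); // enqueue instead of appending directly
  appendNext();
};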

I tested the example above successfully in Google Chrome 94. Unfortunately, in Firefox 92 the MIME type audio/mpeg does not seem to work: I get the error Uncaught DOMException: MediaSource.addSourceBuffer: Type not supported in MediaSource and the warning Cannot play media. No decoders for requested formats: audio/mpeg.
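
To avoid that hard failure, one can check MediaSource.isTypeSupported() before calling addSourceBuffer and pick a MIME type the browser can actually decode. This is only a minimal sketch; the candidate list (and the assumption that the server can deliver audio in a matching format) is purely illustrative:

const candidates = ["audio/mpeg", 'audio/webm; codecs="opus"'];
const mimeType = candidates.find(t => window.MediaSource.isTypeSupported(t));

if (mimeType === undefined) {
  console.error("MediaSource supports none of the candidate MIME types");
} else {
  const sourceBuffer = mediaSource.addSourceBuffer(mimeType);
  // ...append the decoded packages as in the example above
}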
