HLS Streaming

HLS Streaming allows for scaling to millions of viewers in near real time. You give a link to your web-app, which our server converts into an HLS feed that can be played across devices. Behind the scenes, this is achieved by having a bot join your room and stream what it sees and hears. Once the feed is ready, the server returns a URL which can be played using any HLS player.

Note that the media server serving the content in this case is owned by 100ms. If you're looking for a way to stream on YouTube, Twitch etc., please have a look at our RTMP streaming docs here.

Starting HLS

To start HLS, you'll need to either pass in a meeting URL or configure the HLS settings in your template's destinations tab on the dashboard. The 100ms bot will open this URL to join your room, so it must allow access without any user-level interaction. In the future, it'll be possible to start HLS for multiple such URLs for the same room. For this purpose, the API accepts an array, although currently only the first element of the array is respected. To distinguish between multiple URLs, an additional metadata field can optionally be passed. The meetingURL and metadata together form what we'll call a variant.

If everything is configured in the dashboard, you can directly call hmsActions.startHLSStreaming once you join the room.

Otherwise, you can call hmsActions.startHLSStreaming with an hlsConfig containing an array of such variants.

Additionally, you can start recording by passing in a recording config with the below fields:

  1. singleFilePerLayer: if the desired end result is an mp4 file per HLS layer, false by default
  2. hlsVod: if the desired end result is a zip of m3u8 and all the chunks, false by default
async startHLSWithoutParams() {
  try {
    // for this to work, destinations in your template need to be configured in the dashboard
    await hmsActions.startHLSStreaming();
  } catch (err) {
    console.error("failed to start hls", err);
  }
}

async startHLS() {
  const params = { variants: [{ meetingURL: "", metadata: "landscape" }] };
  params.recording = { singleFilePerLayer: true, hlsVod: false }; // to enable recording
  try {
    await hmsActions.startHLSStreaming(params);
  } catch (err) {
    console.error("failed to start hls", err);
  }
}

Stopping HLS

You can call hmsActions.stopHLSStreaming to stop HLS Streaming which will stop all the variants.

async stopHLS() {
  try {
    await hmsActions.stopHLSStreaming();
  } catch (err) {
    console.error("failed to stop hls", err);
  }
}

HLS State configuration

The below explains what data you can expect in the HLS state when you start or stop streaming.

enum HMSStreamingState {
  NONE = "none",
  INITIALISED = "initialised",
  STARTED = "started",
  STOPPED = "stopped",
  FAILED = "failed"
}

export interface HMSHLS {
  running: boolean;
  variants: Array<HLSVariant>;
  error?: HMSException;
}

export interface HLSVariant {
  url: string;
  meetingURL?: string;
  metadata?: string;
  startedAt?: Date;
  initialisedAt?: Date;
  state?: HMSStreamingState;
}
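
For instance, you can read the per-variant state at any point from the store. The snippet below is a minimal sketch; it assumes selectHLSState is exported by the SDK package and that hmsStore comes from your own setup file, as in the later examples.

// a minimal sketch of reacting to the per-variant streaming state described above;
// assumes selectHLSState is exported by the SDK package and hmsStore comes from your own setup
import { selectHLSState } from '@100mslive/hms-video-store';
import { hmsStore } from './hms';

function logVariantStates() {
  const hls = hmsStore.getState(selectHLSState);
  hls.variants.forEach((variant, index) => {
    // state values come from the HMSStreamingState enum above
    if (variant.state === 'started') {
      console.log(`variant ${index} is live at ${variant.url}, started at ${variant.startedAt}`);
    } else {
      console.log(`variant ${index} state - ${variant.state}`);
    }
  });
}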

Recording State configuration

The below explains what data you can expect in the recording state when you start or stop recording.

enum HMSRecordingState {
  NONE = "none",
  INITIALISED = "initialised",
  STARTED = "started",
  PAUSED = "paused",
  RESUMED = "resumed",
  STOPPED = "stopped",
  FAILED = "failed"
}

// recording state
export interface HMSHLSRecording {
  running: boolean;
  initialisedAt?: Date;
  startedAt?: Date;
  state?: HMSRecordingState;
  error?: ServerError;
  /**
   * if the final output is one file or one file per hls layer
   */
  singleFilePerLayer?: boolean;
  /**
   * if video on demand needs to be turned on, false by default
   */
  hlsVod?: boolean;
}
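
To observe this recording state from your app, you can subscribe to it in the same way as the HLS state. The sketch below is an assumption-laden example: it assumes the SDK exposes a selectRecordingState selector whose hls field carries the HMSHLSRecording data described above.

// a minimal sketch, assuming the SDK exposes a selectRecordingState selector
// whose hls field matches the HMSHLSRecording interface above
import { selectRecordingState } from '@100mslive/hms-video-store';
import { hmsStore } from './hms';

function updateHLSRecordingState(recording) {
  const hlsRecording = recording.hls; // HMSHLSRecording
  console.log('is hls recording running - ', hlsRecording.running);
  if (hlsRecording.error) {
    console.error('error in hls recording - ', hlsRecording.error);
  }
}

hmsStore.subscribe(updateHLSRecordingState, selectRecordingState);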

Current room status

The selectHLSState selector can be used to know if HLS is currently running, and the URL where it can be viewed.

function updateHLSState(hlsState) {
  console.log('is hls streaming going on - ', hlsState.running);
  if (hlsState.running) {
    console.log('hls url - ', hlsState.variants[0]?.url);
  } else if (hlsState.error) {
    console.error('error in hls streaming - ', hlsState.error);
  }
}

hmsStore.subscribe(updateHLSState, selectHLSState);

Displaying HLS Stream

Not all browsers support HLS natively; however, you can use players like hls.js or Shaka Player. In case you need more UI-side customisation, you can go with something like videojs, which internally uses hls.js for the HLS piece.

Here is a simple example of using hls.js to play the HLS URL given by the SDK (fetched via an hmsStore selector). Selectors let you fetch information from the state at any point in time. Check here for more examples.

Make sure you've installed the library (npm i hls.js) before using this code.

import Hls from 'hls.js';
import { selectHLSState } from '@100mslive/hms-video-store';
import { hmsStore } from './hms';

function renderHLS() {
  const hlsState = hmsStore.getState(selectHLSState);
  const hlsUrl = hlsState.variants[0]?.url;
  const video = document.getElementById('video');
  const browserHasNativeHLSSupport = video.canPlayType('application/vnd.apple.mpegurl');
  if (Hls.isSupported()) {
    let hls = new Hls();
    hls.loadSource(hlsUrl);
    hls.attachMedia(video);
  }
  // hls.js is not supported on iOS Safari, but as the browser has native support for playing HLS,
  // we can use the video element directly.
  else if (browserHasNativeHLSSupport) {
    video.src = hlsUrl;
  }
}
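
When the viewer leaves or the stream ends, it's a good idea to tear the player down so resources are freed. Below is a small sketch, assuming the hls instance and video element from the snippet above are kept in scope.

// tear down the player when the stream ends or the viewer navigates away;
// `hls` and `video` are the instance and element from the snippet above
function cleanupHLSPlayer(hls, video) {
  if (hls) {
    hls.destroy(); // hls.js API to detach media and free internal resources
  }
  video.removeAttribute('src'); // covers the native-playback (Safari) path
  video.load();
}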

Note: If you wish to customize the stream, please check this guide for more information.

Changing the quality of the stream (hls.js)

You may want to change the quality layer of the stream based on user input or some other criteria. Each library handles this differently. Below is an example snippet showing how to do it with hls.js. Please refer to the documentation of the library you use.

import Hls from 'hls.js';

let hls = null;
const video = document.getElementById('video');
const browserHasNativeHLSSupport = video.canPlayType('application/vnd.apple.mpegurl');

if (Hls.isSupported()) {
  hls = new Hls();
  hls.loadSource(hlsUrl);
  hls.attachMedia(video);
  console.log('Available quality levels in current stream', hls.levels);
  setCurrentLevel(hls.levels[0]); // set the first level
}

function setCurrentLevel(currentLevel) {
  // make sure we are choosing the right level by checking the
  // height and width.
  const newLevel = hls.levels.findIndex(
    (level) => level.height === currentLevel.height && level.width === currentLevel.width
  );
  // set it.
  hls.currentLevel = newLevel;
}

Testing with the Dashboard web-app

You may want to try out the end-to-end flow on our dashboard web-app before building it into your own app. To do this, create an additional role named hls-viewer, which will be presented with the HLS stream when they join the room.

Go live

To start the HLS stream:

  • Click on the Go Live button on the top right of the page.
  • Choose the Live Stream with HLS option and click Go Live to start streaming.
  • Once HLS has started, you can join from the hls-viewer role to see the HLS stream.

Tips

  • If you're using the dashboard web-app from 100ms, please make sure to use a role which doesn't have publish permissions so that the beam tile doesn't show up. The meeting URL can be edited in the streaming/recording popup to join with a recording-specific role configured as such.
  • If using your own web-app, do put retries in place for API calls (tokens etc.) in case any call fails; see the sketch after this list. As human users we're used to reloading the page in these scenarios, which is difficult to achieve in the automated case.
  • Make sure not to disable the logs for the passed-in meeting URL. This gives us more visibility into the room, allowing the page to be refreshed if the join doesn't happen within a set time interval.
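
For example, a minimal retry wrapper for such calls could look like the sketch below; '/api/get-100ms-token' is a hypothetical placeholder for your own token service.

// a minimal retry sketch for calls like token fetches, so a transient failure
// doesn't leave the beam bot stuck on a broken page
async function fetchWithRetry(url, options, retries = 3, delayMs = 1000) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      const response = await fetch(url, options);
      if (!response.ok) {
        throw new Error(`request failed with status ${response.status}`);
      }
      return await response.json();
    } catch (err) {
      if (attempt === retries) throw err;
      console.warn(`attempt ${attempt} failed, retrying - `, err);
      await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
    }
  }
}

// usage - '/api/get-100ms-token' stands in for your own token endpoint
// const { token } = await fetchWithRetry('/api/get-100ms-token', { method: 'POST' });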
