
Building a Live Streaming App with Flutter and 100ms

October 27, 2022 · 35 min read


Flutter Streaming with 100ms - Cover Image

There is huge demand for live content right now, and with good reason: it opens up endless possibilities for entertainment and business. Popular use cases include audiences connecting directly with their favorite content creators, shopping for products, and attending global events.

I’ve been playing around with incorporating a live video element in a Flutter app, and I am pretty happy with the results.

Live Streaming Interface For Flutter using 100ms

What is the 100ms Live Streaming SDK, and why use it?

100ms Live Streaming SDK lets you add two-way interactive live streams to your product.

Video calls are a popular way many companies solve for interactivity, but they aren’t built to scale, especially when you hit millions of users. To meet the demands of these emerging use cases, 100ms provides its Interactive Live Streaming SDK, which combines the interactivity of video calls with the ability to scale to millions of viewers, all in a single SDK.

In our example, we’ll build a Flutter application that streams over HLS, and use the web app to view the HLS feed generated by the 100ms servers and play it across devices.

Set up the project on the 100ms dashboard

Let’s get started by first setting up the project on the 100ms dashboard!

Note: Please use the Live Streaming Starter Kit. It comes with Live streaming with HLS enabled.

You can enable ‘Live streaming with HLS’ in your custom template by selecting it from the left sidebar and going to ‘Destinations’. Next, click the toggle button under Live Streaming, as shown in the GIF below:

How to enable Live Streaming

In the 'Live Streaming' option, set the 'Tile aspect ratio' under 'Customise video tile layout' and the 'Customise stream video output' to Mobile (9:16) to make the stream better suited for mobile devices.

Create a new Flutter project

To create a new Flutter project, run the following command at your chosen location:

flutter create flutterlive

Navigate to the project directory in the terminal and run:

flutter pub add hmssdk_flutter

The 100ms Flutter SDK will now be added to the pubspec.yaml. If you want to add it manually, find the package on pub.dev.

You’ll need to add a few other packages:

  • http - http: ^0.13.5
  • provider - provider: ^6.0.3
  • permission_handler - permission_handler: ^10.0.0

You might also need to add permission configurations to the platform-specific Android and iOS files.

Add permissions to your AndroidManifest.xml file. Find an example AndroidManifest.xml with a complete list of all possible permissions.

Add permissions to your Info.plist file. Here's an example Info.plist with a complete list of all possible permissions.

Open the project in a code editor of your choice. I am using Visual Studio Code.

  • Clear out the default code from lib/main.dart and replace it with the following:
import 'package:flutter/material.dart';
import '../screens/home_screen.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: '100ms Live Demo',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: const HomeScreen(),
    );
  }
}
  • Inside the lib folder, create 3 new folders called models, screens, and services.
  • Start with the services folder. This will hold the code that interacts with the 100ms SDK. Create a new file called sdk_initializer.dart and add this:
import 'package:hmssdk_flutter/hmssdk_flutter.dart';

class SdkInitializer {
  static HMSSDK hmssdk = HMSSDK();
}
  • Create another file, join_service.dart, to call the join method on HMSSDK with the config settings. This needs an authentication token and a room ID.
import 'package:hmssdk_flutter/hmssdk_flutter.dart';
import 'package:http/http.dart' as http;
import 'dart:convert';

class JoinService {
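  /// Fetches an auth token for the given room and role from the token
  /// endpoint, then joins the room with that token.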
  static Future<bool> join(HMSSDK hmssdk, {String role = "broadcaster"}) async {
    String roomId = "<YOUR ROOM ID>";
    Uri endPoint = Uri.parse(
        "<YOUR TOKEN ENDPOINT>/api/token");
    http.Response response = await http.post(endPoint,
        body: {'user_id': "user", 'room_id': roomId, 'role': role});
    var body = json.decode(response.body);
    if (body == null || body['token'] == null) {
      return false;
    }
    HMSConfig config = HMSConfig(authToken: body['token'], userName: "user");
    await hmssdk.join(config: config);
    return true;
  }
}

  • To get an auth token, send an HTTP POST request to the Token endpoint, which can be obtained from the dashboard: go to Developer -> Copy Token endpoint (under Access Credentials).

For example, my Token endpoint is: https://prod-in.100ms.live/hmsapi/adityathakur.app.100ms.live/

  • Append api/token to this Token endpoint and use it in the join_service.dart code above. Also, add the Room ID of your room (you can copy it from the Rooms section of the dashboard). It should look something like this:
String roomId = "<Your Room ID>";
Uri endPoint = Uri.parse(
        "https://prod-in.100ms.live/hmsapi/adityathakur.app.100ms.live/api/token");
  • Now, inside the models folder create the data_store.dart and add this:
import 'dart:developer';

import 'package:flutter/material.dart';
import 'package:hmssdk_flutter/hmssdk_flutter.dart';

import '../services/sdk_initializer.dart';

class UserDataStore extends ChangeNotifier
    implements HMSUpdateListener, HMSActionResultListener {
  HMSTrack? remoteVideoTrack;
  HMSPeer? remotePeer;
  HMSTrack? remoteAudioTrack;
  HMSVideoTrack? localTrack;
  bool _disposed = false;
  late HMSPeer localPeer;
  String? streamURL;
  bool isRoomEnded = false;
  bool isLive = false;

  @override
  void dispose() {
    _disposed = true;
    super.dispose();
  }

  @override
  void notifyListeners() {
    if (!_disposed) {
      super.notifyListeners();
    }
  }

  @override
  void onChangeTrackStateRequest(
      {required HMSTrackChangeRequest hmsTrackChangeRequest}) {}

  void onError({required HMSException error}) {}

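  // Stops the HLS stream (if running) and leaves the room; the result is
  // reported back through onSuccess/onException below.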
  void leaveRoom() async {
    SdkInitializer.hmssdk.stopHlsStreaming();
    SdkInitializer.hmssdk.leave(hmsActionResultListener: this);
  }

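  // Called once the local peer has joined the room. We cache the local peer
  // and, if their role is broadcaster, start HLS streaming right away.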
  @override
  void onJoin({required HMSRoom room}) {
    for (HMSPeer each in room.peers!) {
      if (each.isLocal) {
        localPeer = each;
        break;
      }
    }
    if (localPeer.role.name == "broadcaster") {
      SdkInitializer.hmssdk.startHlsStreaming(hmsActionResultListener: this);
    }

    isLive = room.hmshlsStreamingState?.running ?? false;
    if (isLive) {
      String? hlsm3u8Url = room.hmshlsStreamingState?.variants[0]?.hlsStreamUrl;
      streamURL = hlsm3u8Url;
      notifyListeners();
    }
  }

  @override
  void onMessage({required HMSMessage message}) {}

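  // Keeps track of the remote peer and their tracks as peers join and leave.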
  @override
  void onPeerUpdate({required HMSPeer peer, required HMSPeerUpdate update}) {
    switch (update) {
      case HMSPeerUpdate.peerJoined:
        remotePeer = peer;
        remoteAudioTrack = peer.audioTrack;
        remoteVideoTrack = peer.videoTrack;
        break;
      case HMSPeerUpdate.peerLeft:
        remotePeer = null;
        break;
      case HMSPeerUpdate.roleUpdated:
        break;
      case HMSPeerUpdate.metadataChanged:
        break;
      case HMSPeerUpdate.nameChanged:
        break;
      case HMSPeerUpdate.defaultUpdate:
        break;
      case HMSPeerUpdate.networkQualityUpdated:
        break;
    }
    notifyListeners();
  }

  @override
  void onReconnected() {}

  @override
  void onReconnecting() {}

  @override
  void onRemovedFromRoom(
      {required HMSPeerRemovedFromPeer hmsPeerRemovedFromPeer}) {}

  @override
  void onRoleChangeRequest({required HMSRoleChangeRequest roleChangeRequest}) {}

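  // Fired on room-level updates; used here to pick up the HLS stream URL once
  // the stream goes live.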
  @override
  void onRoomUpdate({required HMSRoom room, required HMSRoomUpdate update}) {
    isLive = room.hmshlsStreamingState?.running ?? false;
    switch (update) {
      case HMSRoomUpdate.hlsStreamingStateUpdated:
        if (isLive) {
          String? hlsm3u8Url =
              room.hmshlsStreamingState?.variants[0]?.hlsStreamUrl;
          streamURL = hlsm3u8Url;
          notifyListeners();
        }
        break;
    }
  }

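  // Keeps the cached track references in sync as tracks are added, removed,
  // muted, or unmuted.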
  @override
  void onTrackUpdate(
      {required HMSTrack track,
      required HMSTrackUpdate trackUpdate,
      required HMSPeer peer}) {
    switch (trackUpdate) {
      case HMSTrackUpdate.trackAdded:
        if (track.kind == HMSTrackKind.kHMSTrackKindAudio) {
          if (peer.isLocal) remoteAudioTrack = track;
        } else if (track.kind == HMSTrackKind.kHMSTrackKindVideo) {
          if (peer.isLocal) {
            remoteVideoTrack = track;
          } else {
            localTrack = track as HMSVideoTrack;
          }
        }
        break;
      case HMSTrackUpdate.trackRemoved:
        if (track.kind == HMSTrackKind.kHMSTrackKindAudio) {
          if (peer.isLocal) remoteAudioTrack = null;
        } else if (track.kind == HMSTrackKind.kHMSTrackKindVideo) {
          if (peer.isLocal) {
            remoteVideoTrack = null;
          } else {
            localTrack = null;
          }
        }
        break;
      case HMSTrackUpdate.trackMuted:
        if (track.kind == HMSTrackKind.kHMSTrackKindAudio) {
          if (peer.isLocal) remoteAudioTrack = track;
        } else if (track.kind == HMSTrackKind.kHMSTrackKindVideo) {
          if (peer.isLocal) {
            remoteVideoTrack = track;
          } else {
            localTrack = null;
          }
        }
        break;
      case HMSTrackUpdate.trackUnMuted:
        if (track.kind == HMSTrackKind.kHMSTrackKindAudio) {
          if (peer.isLocal) remoteAudioTrack = track;
        } else if (track.kind == HMSTrackKind.kHMSTrackKindVideo) {
          if (peer.isLocal) {
            remoteVideoTrack = track;
          } else {
            localTrack = track as HMSVideoTrack;
          }
        }
        break;
      case HMSTrackUpdate.trackDescriptionChanged:
        break;
      case HMSTrackUpdate.trackDegraded:
        break;
      case HMSTrackUpdate.trackRestored:
        break;
      case HMSTrackUpdate.defaultUpdate:
        break;
    }
    notifyListeners();
  }

  @override
  void onUpdateSpeakers({required List<HMSSpeaker> updateSpeakers}) {}

  void startListen() {
    SdkInitializer.hmssdk.addUpdateListener(listener: this);
  }

  @override
  void onAudioDeviceChanged(
      {HMSAudioDevice? currentAudioDevice,
      List<HMSAudioDevice>? availableAudioDevice}) {
    // TODO: implement onAudioDeviceChanged
  }

  @override
  void onHMSError({required HMSException error}) {
    // TODO: implement onHMSError
  }

  @override
  void onException(
      {required HMSActionResultListenerMethod methodType,
      Map<String, dynamic>? arguments,
      required HMSException hmsException}) {
    switch (methodType) {
      case HMSActionResultListenerMethod.leave:
        print("Leave room error ${hmsException.message}");
        break;
      case HMSActionResultListenerMethod.hlsStreamingStarted:
        print("HLS Stream start error ${hmsException.message}");
        break;

      case HMSActionResultListenerMethod.hlsStreamingStopped:
        print("HLS Stream stop error ${hmsException.message}");
        break;

      case HMSActionResultListenerMethod.startAudioShare:
        print("Audio share error ${hmsException.message}");
        break;
      case HMSActionResultListenerMethod.switchCamera:
        print("Switch camera error ${hmsException.message}");
        break;
    }
  }

  @override
  void onSuccess(
      {required HMSActionResultListenerMethod methodType,
      Map<String, dynamic>? arguments}) {
    switch (methodType) {
      case HMSActionResultListenerMethod.hlsStreamingStarted:
        //If start HLS streaming call is successful, you'll get an update in onRoomUpdate
        //Documentation: https://www.100ms.live/docs/flutter/v2/how--to-guides/record-and-live-stream/hls#how-to-display-hls-stream-and-get-hls-state-in-room
        break;

      case HMSActionResultListenerMethod.hlsStreamingStopped:
        isLive = false;
        break;

      case HMSActionResultListenerMethod.leave:
        isRoomEnded = true;
        notifyListeners();
        break;
    }
  }
}

The UserDataStore class implements HMSUpdateListener, which notifies you whenever an event happens, like a new peer joining the room. It also extends ChangeNotifier so the UI can update accordingly.
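As a quick illustration, here is a minimal sketch (the LiveBadge widget is hypothetical, not part of the tutorial files, and assumes it sits next to the other screens and that UserDataStore has been exposed via provider, as we do later with ListenableProvider.value) of how a widget can react to updates from the data store:

import 'package:flutter/material.dart';
import 'package:provider/provider.dart';
import '../models/data_store.dart';

class LiveBadge extends StatelessWidget {
  const LiveBadge({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    // Rebuilds only when isLive changes, because context.select listens to
    // just that field of the UserDataStore.
    final isLive = context.select<UserDataStore, bool>((store) => store.isLive);
    return Text(isLive ? 'LIVE' : 'Offline');
  }
}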

Now, let us talk UI.

Going Live using the App

  • Inside the screens folder, create a file called home_screen.dart. Here, you will now create a button to launch the live stream.
import 'package:flutter/material.dart';

class HomeScreen extends StatefulWidget {
  const HomeScreen({Key? key}) : super(key: key);

  @override
  State<HomeScreen> createState() => _HomeScreenState();
}

class _HomeScreenState extends State<HomeScreen> {
  @override
  Widget build(BuildContext context) {
    return SafeArea(
      child: Scaffold(
        body: Center(
          child: OutlinedButton(
            style: ButtonStyle(
                backgroundColor: MaterialStateProperty.all(Colors.white),
                shape: MaterialStateProperty.all(
                  RoundedRectangleBorder(
                    borderRadius: BorderRadius.circular(40),
                  ),
                )),
            onPressed: () async {},
            child: const Padding(
              padding: EdgeInsets.symmetric(horizontal: 80),
              child: Text('Go Live!'),
            ),
          ),
        ),
      ),
    );
  }
}
  • To this StatefulWidget, add an initState() function that builds the 100ms SDK instance and requests the necessary permissions.
@override
  void initState() {
    SdkInitializer.hmssdk.build();
    getPermissions();
    super.initState();
  }

  void getPermissions() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    while ((await Permission.camera.isDenied)) {
      await Permission.camera.request();
    }
    while ((await Permission.microphone.isDenied)) {
      await Permission.microphone.request();
    }
		while ((await Permission.bluetoothConnect.isDenied)) {
			await Permission.bluetoothConnect.request();
		}
  }
  • Room joining will be handled by calling the JoinService.join() created previously.
  //Handles room joining functionality
  Future<bool> joinRoom({String role = "broadcaster"}) async {
    setState(() {
      _isLoading = true;
    });
    //The join method fetches the auth token, creates an HMSConfig, and joins the room
    bool isJoinSuccessful =
        await JoinService.join(SdkInitializer.hmssdk, role: role);
    if (!isJoinSuccessful) {
      return false;
    }
    _dataStore = UserDataStore();
    //Here we are attaching a listener to our DataStoreClass
    _dataStore.startListen();

    setState(() {
      _isLoading = false;
    });
    return true;
  }
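Note that joinRoom() (and the buttons below) reference two pieces of state, _isLoading and _dataStore, which also need to be declared in _HomeScreenState, along these lines:

  // State referenced by joinRoom() and the buttons on this screen.
  bool _isLoading = false;
  late UserDataStore _dataStore;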
  • With that done, update the onPressed of the button to join the room and navigate to a new screen as follows:
onPressed: () async {
              bool isJoined = await joinRoom();
              if (isJoined) {
                Navigator.of(context).push(MaterialPageRoute(
                    builder: (_) => ListenableProvider.value(
                        value: _dataStore, child: const LiveScreen())));
              } else {
                const snackBar =
                    SnackBar(content: Text('Error in joining the room.'));
                ScaffoldMessenger.of(context).showSnackBar(snackBar);
              }
            },

The complete code for home_screen.dart can be found on my GitHub.

  • Next, create a file called live_screen.dart under the screens folder and add the following code:
import 'package:flutter/material.dart';
import 'package:hmssdk_flutter/hmssdk_flutter.dart';
import 'package:provider/provider.dart';
import '../models/data_store.dart';
import '../services/sdk_initializer.dart';

class LiveScreen extends StatefulWidget {
  const LiveScreen({Key? key}) : super(key: key);

  @override
  _LiveScreenState createState() => _LiveScreenState();
}

class _LiveScreenState extends State<LiveScreen> {
  bool isLocalAudioOn = true;
  bool isLocalVideoOn = true;
  Offset position = const Offset(10, 10);

  @override
  Widget build(BuildContext context) {
    final _isVideoOff = context.select<UserDataStore, bool>(
        (user) => user.remoteVideoTrack?.isMute ?? true);
    final _peer =
        context.select<UserDataStore, HMSPeer?>((user) => user.remotePeer);
    final remoteTrack = context
        .select<UserDataStore, HMSTrack?>((user) => user.remoteVideoTrack);
    final localTrack = context
        .select<UserDataStore, HMSVideoTrack?>((user) => user.localTrack);

    final isLive =
        context.select<UserDataStore, bool?>((user) => user.isLive) ?? false;

    return WillPopScope(
      onWillPop: () async {
        context.read<UserDataStore>().leaveRoom();
        if (context.read<UserDataStore>().isLive == false) {
          Navigator.pop(context);
        }
        return true;
      },
      child: SafeArea(
        child: Scaffold(
          body: SizedBox(
            height: MediaQuery.of(context).size.height,
            width: MediaQuery.of(context).size.width,
            child: isLive
                ? Stack(
                    children: [
                      Container(
                          color: Colors.black.withOpacity(0.9),
                          child: _isVideoOff
                              ? const Align(
                                  alignment: Alignment.center,
                                  child: Icon(
                                    Icons.videocam_off,
                                    color: Colors.white,
                                    size: 30,
                                  ),
                                )
                              : (remoteTrack != null)
                                  ? HMSVideoView(
                                      track: remoteTrack as HMSVideoTrack,
                                      matchParent: false)
                                  : const Center(child: Text("No Video"))),
                      Align(
                        alignment: Alignment.bottomCenter,
                        child: Padding(
                          padding: const EdgeInsets.only(bottom: 15),
                          child: Row(
                            mainAxisAlignment: MainAxisAlignment.spaceEvenly,
                            children: [
                              GestureDetector(
                                onTap: () async {
                                  context.read<UserDataStore>().leaveRoom();
                                  Navigator.pop(context);
                                },
                                child: Container(
                                  decoration: BoxDecoration(
                                      shape: BoxShape.circle,
                                      boxShadow: [
                                        BoxShadow(
                                          color: Colors.red.withAlpha(60),
                                          blurRadius: 3.0,
                                          spreadRadius: 5.0,
                                        ),
                                      ]),
                                  child: const CircleAvatar(
                                    radius: 25,
                                    backgroundColor: Colors.red,
                                    child: Icon(Icons.call_end,
                                        color: Colors.white),
                                  ),
                                ),
                              ),
                              GestureDetector(
                                onTap: () => {
                                  SdkInitializer.hmssdk.toggleCameraMuteState(),
                                  setState(() {
                                    isLocalVideoOn = !isLocalVideoOn;
                                  })
                                },
                                child: CircleAvatar(
                                  radius: 25,
                                  backgroundColor:
                                      Colors.transparent.withOpacity(0.2),
                                  child: Icon(
                                    isLocalVideoOn
                                        ? Icons.videocam
                                        : Icons.videocam_off_rounded,
                                    color: Colors.white,
                                  ),
                                ),
                              ),
                              GestureDetector(
                                onTap: () => {
                                  SdkInitializer.hmssdk.toggleMicMuteState(),
                                  setState(() {
                                    isLocalAudioOn = !isLocalAudioOn;
                                  })
                                },
                                child: CircleAvatar(
                                  radius: 25,
                                  backgroundColor:
                                      Colors.transparent.withOpacity(0.2),
                                  child: Icon(
                                    isLocalAudioOn ? Icons.mic : Icons.mic_off,
                                    color: Colors.white,
                                  ),
                                ),
                              ),
                            ],
                          ),
                        ),
                      ),
                      Positioned(
                        top: 10,
                        left: 10,
                        child: GestureDetector(
                          onTap: () {
                            context.read<UserDataStore>().leaveRoom();
                            Navigator.pop(context);
                          },
                          child: const Icon(
                            Icons.arrow_back,
                            color: Colors.white,
                          ),
                        ),
                      ),
                      Positioned(
                        top: 10,
                        right: 10,
                        child: GestureDetector(
                          onTap: () {
                            if (isLocalVideoOn) {
                              SdkInitializer.hmssdk.switchCamera();
                            }
                          },
                          child: CircleAvatar(
                            radius: 25,
                            backgroundColor:
                                Colors.transparent.withOpacity(0.2),
                            child: const Icon(
                              Icons.switch_camera_outlined,
                              color: Colors.white,
                            ),
                          ),
                        ),
                      ),
                    ],
                  )
                : Center(child: CircularProgressIndicator()),
          ),
        ),
      ),
    );
  }
}

The live_screen.dart screen displays the user's video as it will appear on the live stream, with buttons to leave the call, turn video on/off, and mute/unmute audio.

Viewing the Live Stream

We will now update the home_screen.dart file to add another button on the HomeScreen to view the live stream using the application.

  • Wrap the OutlinedButton with a Column widget (a sketch of the resulting layout follows the snippet below).
  • Add another OutlinedButton as follows:

        OutlinedButton(
              style: ButtonStyle(
                  backgroundColor: MaterialStateProperty.all(Colors.white),
                  shape: MaterialStateProperty.all(
                    RoundedRectangleBorder(
                      borderRadius: BorderRadius.circular(40),
                    ),
                  )),
              onPressed: () async {
                bool isJoined = await joinRoom(role: "hls-viewer");

                if (isJoined) {
                  Navigator.of(context).push(MaterialPageRoute(
                      builder: (_) => ListenableProvider.value(
                          value: _dataStore, child: const StreamViewScreen())));
                } else {
                  const snackBar = SnackBar(
                    content: Text('Error in joining room and viewing.'),
                  );
                  ScaffoldMessenger.of(context).showSnackBar(snackBar);
                }
              },
              child: Padding(
                padding: EdgeInsets.symmetric(horizontal: 55),
                child: _isLoading
                    ? CircularProgressIndicator()
                    : Text('View Live Stream'),
              ),
            ),
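For reference, here is a rough, simplified sketch (button styling and handler bodies omitted; the full versions are in the snippets above) of how the body of the HomeScreen ends up structured once both buttons sit inside the Column:

Center(
  child: Column(
    mainAxisAlignment: MainAxisAlignment.center,
    children: [
      OutlinedButton(
        // 'Go Live!' button from earlier: joins as "broadcaster" and
        // navigates to LiveScreen.
        onPressed: () async {},
        child: const Text('Go Live!'),
      ),
      const SizedBox(height: 20),
      OutlinedButton(
        // 'View Live Stream' button from above: joins as "hls-viewer" and
        // navigates to StreamViewScreen.
        onPressed: () async {},
        child: const Text('View Live Stream'),
      ),
    ],
  ),
),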

Next, create a new file called view_live.dart and add the following code:

import 'package:flutter/material.dart';
import 'package:flutterlive/screens/video_player.dart';
import 'package:hmssdk_flutter/hmssdk_flutter.dart';
import 'package:provider/provider.dart';
import '../models/data_store.dart';
import '../services/sdk_initializer.dart';

class StreamViewScreen extends StatefulWidget {
  const StreamViewScreen({Key? key}) : super(key: key);

  @override
  _StreamViewScreenState createState() => _StreamViewScreenState();
}

class _StreamViewScreenState extends State<StreamViewScreen> {
  bool isLocalAudioOn = true;
  bool isLocalVideoOn = true;
  Offset position = const Offset(10, 10);

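  // Leaves the room (and stops HLS streaming) when the viewer backs out of
  // this screen.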
  Future<bool> leaveRoom() async {
    SdkInitializer.hmssdk.stopHlsStreaming();
    SdkInitializer.hmssdk.leave();
    Navigator.pop(context);
    return false;
  }

  @override
  Widget build(BuildContext context) {
    final streamURL =
        context.select<UserDataStore, String?>((user) => user.streamURL);

    return WillPopScope(
      onWillPop: () async {
        return leaveRoom();
      },
      child: SafeArea(
        child: Scaffold(
          body: SizedBox(
            height: MediaQuery.of(context).size.height,
            width: MediaQuery.of(context).size.width,
            child: Stack(
              children: [
                Container(
                  color: Colors.black.withOpacity(0.9),
                  child: (streamURL != null)
                      ? VideoPlayerScreen(streamURL: streamURL)
                      : const Center(
                          child: Text(
                            "No Live Video",
                            style: TextStyle(color: Colors.white),
                          ),
                        ),
                ),
              ],
            ),
          ),
        ),
      ),
    );
  }
}

We will now build a VideoPlayerScreen to play back the stream. For this, we will use video_player, a Flutter plugin for iOS, Android, and the web that plays back video on a Widget surface.

  • Add the plugin by running the following command on the terminal:
flutter pub add video_player
  • Create a new file video_player.dart and add the VideoPlayerScreen as follows:
import 'dart:async';

import 'package:flutter/material.dart';
import 'package:video_player/video_player.dart';

class VideoPlayerScreen extends StatefulWidget {
  final String streamURL;
  VideoPlayerScreen({
    Key? key,
    required this.streamURL,
  }) : super(key: key);

  @override
  State<VideoPlayerScreen> createState() => _VideoPlayerScreenState();
}

class _VideoPlayerScreenState extends State<VideoPlayerScreen> {
  late VideoPlayerController _controller;
  late Future<void> _initializeVideoPlayerFuture;

  @override
  void initState() {
    super.initState();
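    // Create a controller for the HLS stream URL, kick off initialization,
    // and start looping playback.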
    _controller = VideoPlayerController.network(
      widget.streamURL,
    );
    _initializeVideoPlayerFuture = _controller.initialize();
    _controller.play();
    _controller.setLooping(true);
  }

  @override
  void dispose() {
    _controller.dispose();

    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      backgroundColor: Colors.black,
      body: FutureBuilder(
        future: _initializeVideoPlayerFuture,
        builder: (context, snapshot) {
          if (snapshot.connectionState == ConnectionState.done) {
            return Center(
              child: AspectRatio(
                aspectRatio: _controller.value.aspectRatio,
                child: VideoPlayer(_controller),
              ),
            );
          } else {
            return const Center(
              child: CircularProgressIndicator(),
            );
          }
        },
      ),
    );
  }
}
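Note: newer releases of the video_player plugin deprecate VideoPlayerController.network. If your version flags it (this assumes video_player 2.6 or later), the equivalent call is:

_controller = VideoPlayerController.networkUrl(Uri.parse(widget.streamURL));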

Project Demo

With this done, you’re all set to test your app and go live using your mobile device.

Going Live using the mobile app

Run the app on an emulator or your own device with USB Debugging enabled and click on the ‘Go Live!’ button.

To view this live stream:

  • Return to the 100ms dashboard.
  • Using the side navigation, go to Rooms and click on the room you used (in our case, the room was named ‘flutter’).
  • Click on ‘Join Room’ and copy the link next to the hls-viewer role.
  • Paste the link in the browser and wait for the stream to start!

Viewing the Live Stream using the mobile app

  • Open the 100ms dashboard.
  • Using the side navigation, go to Rooms and click on the room you used (in our case, the room was named ‘flutter’).
  • Click on ‘Join Room’ and copy the link next to the Broadcaster role.
  • Paste the link in the browser and Join Studio.
  • Click on the 'Go Live' button (at the top right), then 'Live Stream with HLS', and then 'Go Live' again.

Run the app on an emulator or your own device with USB Debugging enabled and click on the ‘View Live Stream’ button to view it using the application.

Woohoo! You’ve now successfully live streamed from your Flutter app. You’re a streamer now!

What’s next?

Start by exploring the Interactive Live Streaming docs.

Find the GitHub repository with complete code.
