Live JavaScript v1

Camera filter

Camera filter is a feature that lets you apply face filters to the host's face. With various face filters, the host can offer an engaging and entertaining live event experience for the participants. Follow this guide to integrate the Banuba Face Filters SDK with the Sendbird Live SDK.


Make sure your environment meets the minimum requirements to implement the camera filter for the Live SDK for JavaScript.

Before you start

Before installing the Live SDK, create a Sendbird account to acquire an application ID, which you will need to initialize the Live SDK. Go to the Sendbird Dashboard and create an application by selecting Calls+Live as the product type. Once you have created an application, go to Overview to find the Application ID.

How it works

In this guide, you will install the Sendbird Live SDK and the Banuba Face Filters SDK. After installing and initializing the two SDKs, you will set up a live event using the essential UI components needed to integrate face filters into a live event as quickly as possible.

Once you create a user to act as a host, you can start a live event and apply face filters to the host's camera feed. You can also enter the live event to watch as a participant using our UIKit sample. Follow along with this guide using the sample app for camera filter, which you can download here.

If you would like to learn more about how to implement the full-fledged Live UIKit, go to the start your first live page.

Install the Live SDK and Face Filters SDK

First, install the Live SDK and the Face Filters SDK in a new client app by entering the following commands on the command line with npm.

npm install @sendbird/live
npm install @banuba/webar

Or, use the <script> tag to load both SDKs from a CDN.

<script src=""></script>
<script src=""></script>

Initialize the Live SDK

To integrate the Live SDK into a client app, initialize it with your Sendbird application ID by adding the code below.


If you already have an application ID, log in to the Sendbird Dashboard and go to Overview to find the Application ID. Otherwise, create an application by selecting Calls+Live as the product type.
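For reference, a hypothetical tokens.js could look like the following. The values are placeholders to replace with your own credentials; the Banuba client token is used later when initializing the Face Filters SDK.

```javascript
// tokens.js -- placeholder credentials; replace with your own values.
window.SENDBIRD_APP_ID = 'YOUR_SENDBIRD_APPLICATION_ID';
window.BANUBA_CLIENT_TOKEN = 'YOUR_BANUBA_CLIENT_TOKEN';
```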

First, load tokens.js into index.html by adding the <script> tag shown below.

<script src="./tokens.js"></script>

Then, add the following code in a <script> tag.

var globalLive = sendbirdLive.SendbirdLive;
// ...
globalLive.init({ appId: window.SENDBIRD_APP_ID });

Authenticate a user

To use the interfaces of the Live SDK, you need to authenticate a user. You can authenticate a user by providing their user ID and access token in SendbirdLive.authenticate. This will establish a connection between the Sendbird server and the user. Once you have authenticated and connected the user, the user can act as a host to create, enter, and start a live event.

var globalUserId = '';
// ...
function setLoginView() {
  // Remove existing DOM elements.
  const main = getMain();
  globalLive.init({ appId: window.SENDBIRD_APP_ID });

  const userId = document.createElement('input');
  userId.placeholder = 'User ID';
  const loginButton = document.createElement('button');
  loginButton.textContent = 'Sign in';
  loginButton.onclick = async () => {
    await globalLive.authenticate(userId.value);
    globalUserId = userId.value;
    // Move on to the next view once the user is authenticated.
    setGoLiveView();
  };

  main.append(userId, loginButton);
}


Set up a live event with UI components

To set up a live event, provide UI views to create, enter, and start a live event. This allows you to start a live event using the most essential and minimal components.

First, provide credentials to log in to the client app. After logging in, select the Go Live button to create and enter the live event. When you enter, you become the host of the live event, which allows you to control the live event, such as starting or stopping your media stream.

When you start the live event, participants can enter your live event, view the host's media stream, and chat.

function setGoLiveView() {
  const main = getMain();

  const goLiveButton = document.createElement('button');
  goLiveButton.textContent = 'Go Live';
  goLiveButton.onclick = async () => {
    globalLiveEvent = await globalLive.createLiveEvent({ userIdsForHost: [globalUserId] });
    await globalLiveEvent.enterAsHost({
      turnAudioOn: true,
      turnVideoOn: true,
      streamProcessor: async (stream) => replaceStream(stream), // Or, use LiveEvent.startUsingExternalVideo() instead.
    });
    await globalLiveEvent.startEvent({ turnAudioOn: true, turnVideoOn: true });
  };

  main.append(goLiveButton);
}


If you would like to learn more about using the full-fledged UI features for Sendbird Live, go to Live UIKit Overview page.

Stream with Face Filters SDK

To stream a live event with filters applied, you need to let the Live SDK know that an external video stream will be used instead of the default camera feed. To do so, implement streamProcessor, a function that receives the video stream from the Live SDK and passes it to the Face Filters SDK. Once the processed video stream is returned, the Live SDK can stream it to the participants. Take the steps below to apply the filters and stream the modified video.
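Conceptually, a streamProcessor is just an async function that takes a MediaStream and returns the MediaStream to broadcast. A minimal pass-through sketch, with a hypothetical function name, looks like this:

```javascript
// A pass-through streamProcessor: it receives the host's local camera
// stream from the Live SDK and returns it unchanged. Replace the body
// with calls to a filter SDK to broadcast a processed stream instead.
async function passThroughProcessor(localStream) {
  return localStream;
}
```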

Step 1 Configure and apply the Face Filters SDK

You need to initialize the Face Filters SDK that you downloaded previously with Banuba's SDK token. Visit their website to get a valid token, then follow the code below to initialize the SDK.

First, set up the Face Filters SDK with the correct configuration, then choose the desired face filter. You can learn more about the Banuba Face Filters SDK from here.

const { Effect, MediaStreamCapture, Player, Module } = BanubaSDK;
const BanubaMediaStream = BanubaSDK.MediaStream;

async function replaceStream(localStream) {
  try {
    const player = await Player.create({ clientToken: window.BANUBA_CLIENT_TOKEN });
    await player.addModule(new Module(""));
    await player.addModule(new Module(""));

    const webar = new MediaStreamCapture(player);
    player.use(new BanubaMediaStream(localStream));
    player.applyEffect(new Effect("./"));

    // original audio
    const audio = localStream.getAudioTracks()[0];
    // webar processed video
    const video = webar.getVideoTracks()[0];

    return new MediaStream([audio, video]);
  } catch (e) {
    // Fall back to the unprocessed stream if the Face Filters SDK fails.
    console.error(e);
    return localStream;
  }
}

Step 2 Use the external stream

By implementing the code below, you let the live event know that the host will be using an external video stream processed by the Face Filters SDK instead of the default camera.

await globalLiveEvent.startUsingExternalVideo(async (stream) => replaceStream(stream));

Or, you can set the MediaOption.streamProcessor property to apply the external video stream from the start of the live event. MediaOption can be set in the following methods of LiveEvent.

  • enterAsHost(mediaOption: MediaOption): Promise<void>;
  • startStreaming(mediaOption: MediaOption): Promise<void>;
  • startEvent(mediaOption: MediaOption): Promise<void>;
// You can set streamProcessor in startStreaming or startEvent instead.
await globalLiveEvent.enterAsHost({
  turnAudioOn: true,
  turnVideoOn: true,
  streamProcessor: async (stream) => replaceStream(stream),
});

Step 3 Stop using external stream

If you want to stop streaming with face filters, call stopUsingExternalVideo(). After calling this method, the host's media stream returns to the default camera. If you want the host to stream from another camera, specify the device using the selectVideoInput() method.
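Put together, the step above might look like the sketch below. The helper name is hypothetical, and the exact argument shape for selectVideoInput() should be checked against the Live SDK reference; here it is assumed to accept a media device info object from enumerateDevices().

```javascript
// Stop broadcasting the Banuba-processed stream. The host's media
// stream reverts to the default camera afterwards.
async function stopFaceFilters(liveEvent) {
  await liveEvent.stopUsingExternalVideo();

  // Optionally, stream from another camera instead of the default one.
  const devices = await navigator.mediaDevices.enumerateDevices();
  const camera = devices.find((d) => d.kind === 'videoinput');
  if (camera) {
    await liveEvent.selectVideoInput(camera);
  }
}
```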

Watch the live event as a participant

To watch the ongoing live event as a participant, download the Sendbird Live sample app for iOS from the App Store, for Android from Google Play, or from GitHub. You can also use Live studio in the Sendbird Dashboard to test Sendbird Live.