Live iOS v1

Camera filter

Camera filter is a feature that applies face filters to the host's face. By using various face filters, the host can offer an engaging and entertaining live event experience to participants. Follow this guide to integrate the Banuba Face Filters SDK with the Sendbird Live SDK.


Requirements

The minimum requirements to implement the camera filter for the Live SDK for iOS are:


Before you start

Before installing the Live SDK, create a Sendbird account to acquire an application ID which you will need to initialize the Live SDK. Go to Sendbird Dashboard and create an application by selecting Calls+Live in product type. Once you have created an application, go to Overview and you will see the Application ID.


How it works

In this guide, you will install the Sendbird Live SDK and the Banuba Face Filters SDK. After installing and initializing the two SDKs, you will set up a live event with a few essential UI components to integrate face filters in the quickest way.

Once you create a user to act as a host, you can start a live event and apply face filters to the host's camera feed. You can also enter the live event to watch as a participant using our UIKit sample. Follow along with this guide using the camera filter sample app, which you can download here.

If you would like to learn more about how to implement the full-fledged Live UIKit, go to the start your first live page.


Install the Live SDK and Face Filters SDK

First, install the Live SDK for iOS and the Face Filters SDK to a new client app. To download the two SDKs, add the following lines to your Podfile.

# Podfile
# platform :ios, '11.0'

source 'https://github.com/CocoaPods/Specs.git'
source 'https://github.com/sdk-banuba/banuba-sdk-podspecs.git'

target 'SendbirdLiveBanubaSample' do
  # Comment the next line if you don't want to use dynamic frameworks.
  use_frameworks!

  pod 'SendbirdLiveSDK', '> 1.0.0-beta.5'
  pod 'BanubaSdk', '> 1'

end
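
After updating the Podfile, install the dependencies and open the generated workspace. The workspace name below assumes the `SendbirdLiveBanubaSample` target from the Podfile above.

```shell
# Install the pods declared in the Podfile.
pod install

# Open the workspace (not the .xcodeproj) so the installed pods are linked.
open SendbirdLiveBanubaSample.xcworkspace
```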

Grant media access permissions

To use the microphone and camera on a mobile device, you need to ask users to grant media access permissions on their devices. If a permission is denied, the audio and video won't work during the live event. Add the following keys to your app's Info.plist file.

<key>NSPhotoLibraryUsageDescription</key>
<string>$(PRODUCT_NAME) would like access to your photo library.</string>
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) would like to access your camera.</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) would like to access your microphone.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>$(PRODUCT_NAME) would like to save photos to your photo library.</string>

Initialize the Live SDK

To integrate the Live SDK in a client app, initialize the Live SDK with a Sendbird application ID by adding the code below.

// Credentials.swift
var sendbirdApplicationId = YOUR_SENDBIRD_APPLICATION_ID
var banubaClientToken = YOUR_BANUBA_TOKEN

If you already have a Sendbird application, log in to Sendbird Dashboard and find the Application ID under Overview. Otherwise, get one by creating an application and selecting Calls+Live as the product type.

// AppDelegate.swift

import UIKit
import SendbirdLiveSDK

@main
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Initializes the Sendbird Live SDK using the application ID from the dashboard.
        SendbirdLive.initialize(applicationId: sendbirdApplicationId)
        // Tells the Live SDK to run completion handlers on the main queue.
        SendbirdLive.executeOn(.main)

        return true
    }
}

Authenticate a user

To use the interfaces of the Live SDK, you need to authenticate a user. You can authenticate a user by providing their user ID and access token in SendbirdLive.authenticate. This will establish a connection between the Sendbird server and the user. Once you have authenticated and connected the user, the user can act as a host to create, enter, and start a live event.

// SignInViewController.swift
class SignInViewController: UIViewController {
    @IBAction func signIn(_ sender: Any) {
        guard let userId = userIdTextField.text else { return }

        SendbirdLive.authenticate(userId: userId) { result in
            switch result {
            case .success:
                self.performSegue(withIdentifier: "SignIn", sender: nil)
            case .failure:
                // Handle authentication failure.
                break
            }
        }
    }
}

Set up a live event with UI components

To set up a live event, provide UI views to create, enter, and start a live event. This allows you to start a live event using only the most essential components, which would look like the views shown in the image below.


Step 1 Create and enter a live event

First, provide credentials to sign in to the client app. After signing in, select the Go live button to create and enter a live event. When you enter, you become the host of the live event, which allows you to control it, such as starting or stopping your media stream.

// CreateLiveEventViewController.swift

import UIKit
import SendbirdLiveSDK

class CreateLiveEventViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
    }

    @IBAction func goLive(_ sender: Any) {
        // Create a live event.
        SendbirdLive.createLiveEvent(config: .init(userIdsForHost: [])) { createResult in
            switch createResult {
            case .success(let liveEvent):
                // Enter the live event as a host.
                liveEvent.enterAsHost(options: MediaOptions(turnVideoOn: true, turnAudioOn: true, useCustomCapturer: true)) { enterResult in
                    // Start the live event.
                }
            case .failure:
                // Failed to create the live event.
                break
            }
        }
    }
}

Step 2 Start the live event

When you start the live event, participants can enter your live event, view the host's media stream, and chat. To start the live event, follow the code below.

// CreateLiveEventViewController.swift

class CreateLiveEventViewController: UIViewController {
    @IBAction func goLive(_ sender: Any) {
        ...
        // Start the live event, then move to the Live event view.
        liveEvent.startEvent(mediaOptions: nil) { _ in
            self.performSegue(withIdentifier: "EnterLiveEvent", sender: liveEvent)
        }
        ...
    }
}

If you would like to learn more about using the full-fledged UI features for Sendbird Live, go to Live UIKit Overview page.


Initialize the Face Filters SDK

You need to initialize the Face Filters SDK that you installed earlier by using Banuba's SDK token. Visit Banuba's website to get a valid token, then initialize the SDK as shown below.

// LiveEventViewController.swift

import UIKit
import BNBSdkApi
import SendbirdLiveSDK

class LiveEventViewController: UIViewController {
    private var sdkManager = BanubaSdkManager()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Initialize the Banuba SDK by specifying the resource path and client token.
        BanubaSdkManager.initialize(
            resourcePath: [Bundle.main.bundlePath + "/bnb-resources",
                           Bundle.main.bundlePath + "/effects"],
            clientTokenString: banubaClientToken
        )
    }
}

Stream with Face Filters SDK

Once the Face Filters SDK is initialized, you can select filters to apply to the host's face. To stream a live event that has filters applied, you need to let the Live SDK know that such external video will be streaming instead of the default camera. Then, the Face Filters SDK delivers modified video frames through its handler which the Live SDK can use to stream to the participants. Take the steps below to apply the filters and stream the modified video.

Step 1 Configure the Face Filters SDK

First, set up the Face Filters SDK with the correct configuration, then choose the desired face filter. You can learn more about the Banuba Face Filters SDK here.

// LiveEventViewController.swift

import UIKit
import BNBSdkApi
import SendbirdLiveSDK

class LiveEventViewController: UIViewController {
    @IBOutlet weak var localVideo: EffectPlayerView!

    private var sdkManager = BanubaSdkManager()

    override func viewDidLoad() {
        super.viewDidLoad()

        BanubaSdkManager.initialize(
            resourcePath: [Bundle.main.bundlePath + "/bnb-resources",
                           Bundle.main.bundlePath + "/effects"],
            clientTokenString: banubaClientToken
        )

        // Create a configuration for the effect player, and setup the Banuba SDK with the created configuration.
        let config = EffectPlayerConfiguration()
        config.fpsLimit = 30
        self.sdkManager.setup(configuration: config)
        self.sdkManager.setRenderTarget(view: self.localVideo, playerConfiguration: config)
    }

    // When the view appears, prepare the face AR view by starting the camera, loading the face filter effect and starting the effect player.
    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        sdkManager.effectManager()?.setEffectVolume(0)
        sdkManager.input.startCamera()
        _ = sdkManager.loadEffect("TrollGrandma")
        sdkManager.startEffectPlayer()
    }

    // When the view disappears, temporarily stop the camera and the effect player.
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        sdkManager.input.stopCamera()
        sdkManager.stopEffectPlayer()
    }
}

Step 2 Stream with external video frames

First, let the live event know that the host will stream from an external video source processed by the Face Filters SDK rather than from the default camera.

liveEvent.startUsingExternalVideo()
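
For context, here is a minimal sketch of where this call fits, assuming `liveEvent` is the instance created in Step 1 and that entering as a host reports its outcome through a `Result`-style completion handler:

```swift
// Sketch: enable external video right after entering as a host, so that
// frames forwarded from the Face Filters SDK are what gets streamed.
liveEvent.enterAsHost(options: MediaOptions(turnVideoOn: true, turnAudioOn: true, useCustomCapturer: true)) { result in
    switch result {
    case .success:
        // From this point, the Live SDK expects frames via
        // didCaptureCustomFrame instead of capturing from the default camera.
        liveEvent.startUsingExternalVideo()
    case .failure:
        // Failed to enter the live event as a host.
        break
    }
}
```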

Step 3 Apply face filters to external video frames

You can select face filters to apply to the host's face. Once you select filters, the Face Filters SDK modifies the camera feed to apply the filters and returns the processed frames through its handler. Pass the frames to the live event to stream to the participants.

// LiveEventViewController.swift

class LiveEventViewController: UIViewController {
    private var sdkManager = BanubaSdkManager()
    // The live event entered earlier as a host.
    var liveEvent: LiveEvent!

    override func viewDidLoad() {
        super.viewDidLoad()
        ...

        // Tells the Banuba SDK to forward the output frames to the handler.
        // Using the handler, send the video frames to the live event with the timestamp of the current time.
        self.sdkManager.output?.startForwardingFrames(handler: { pixelBuffer in
            let currentTime = CMClockGetTime(CMClockGetHostTimeClock())
            self.liveEvent.didCaptureCustomFrame(pixelBuffer, timestamp: currentTime)
        })
    }
}

Step 4 Stop streaming with external video frames

To stop streaming with face filters, call stopUsingExternalVideo(). The host's media stream then returns to the default camera. If you would like the host to stream from another camera, specify the device using the selectVideoDevice() method.
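
As a sketch, the teardown might look like the following. Note that `stopForwardingFrames()` is an assumption based on the `startForwardingFrames(handler:)` call used above; check the Banuba and Live SDK references for the exact names.

```swift
// Sketch: stop forwarding filtered frames and return to the default camera.
// `liveEvent` and `sdkManager` are the instances from the previous steps.
sdkManager.output?.stopForwardingFrames() // assumed counterpart of startForwardingFrames
liveEvent.stopUsingExternalVideo()

// To stream from a different camera afterwards, pass the desired device
// to selectVideoDevice(); see the Live SDK reference for the device list API.
```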


Watch the live event as a participant

To watch the ongoing live event as a participant, download the Sendbird Live sample app for iOS from the App Store, for Android from Google Play, or from GitHub. You can also use Live studio in Sendbird Dashboard to test Sendbird Live.