Tencent RTC Blog
Tutorial

Set Up Your Own OBS-web with WebRTC Streaming SDK

Tencent RTC - Dev Team

We all know that Flash has become a thing of the past, with mainstream browsers like Chrome waving goodbye to it in the version 88 update. So, what is the go-to solution now for live streaming on the web? Enter WebRTC.

WebRTC is your ticket to hassle-free live streaming, with the power to transmit video and audio streams seamlessly. It is a technology that simplifies the way we share live content online, making the complex look effortless. Through WebRTC, a website can establish point-to-point connections between two browsers, or between a browser and a server, without the help of any intermediaries, and transmit video streams, audio streams, or other arbitrary data.

Put simply, this means users no longer need streaming software such as OBS (Open Broadcaster Software); they can start a live stream just by opening a web page.

What is Streaming with WebRTC?

The underlying implementation of WebRTC is undeniably complex, yet its usage on the web is remarkably straightforward. With just a few lines of code, you can establish peer-to-peer connections and facilitate data transmission. Browsers have abstracted the intricate WebRTC functionality into three key APIs:

  1. MediaStream: This API allows you to acquire audio and video streams.
  2. RTCPeerConnection: It's responsible for establishing peer-to-peer connections and facilitating the transmission of audio and video data.
  3. RTCDataChannel: This API comes into play for the transmission of arbitrary application data.

To start live streaming, you only need the first two WebRTC APIs. Begin by obtaining a MediaStream object representing your audio and video feed. Following that, establish a peer-to-peer connection using RTCPeerConnection, and through this connection, upload your MediaStream to the live server. It's this simple interaction that powers real-time broadcasting on the web.
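
As a rough sketch of that flow, the snippet below captures a camera/microphone stream and pushes it over an RTCPeerConnection. The signaling step (sendOfferToServer) is a hypothetical placeholder; a real page would exchange the SDP with its live server, for example over WebSocket or HTTP.

```javascript
// Minimal sketch: capture a local stream and push it over an RTCPeerConnection.
async function startBroadcast() {
  // 1. Acquire a MediaStream from the camera and microphone.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  // 2. Create a peer connection and attach every track of the stream.
  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // 3. Create an SDP offer, send it to the live server, and apply the answer.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const answer = await sendOfferToServer(offer); // hypothetical signaling call
  await pc.setRemoteDescription(answer);

  return pc;
}
```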

WebRTC streaming principles

Collection of a Live Stream

The collection of live streaming content depends on how you acquire the MediaStream object. WebRTC provides specific interfaces for this purpose.

The most commonly used interface is navigator.mediaDevices.getUserMedia, which opens the microphone and camera for capturing audio and video streams. Another option is navigator.mediaDevices.getDisplayMedia, which captures audio and video from shared screen windows (like desktop, applications, or browser tabs).
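
For reference, the two capture calls might look like the following sketch (the constraints are illustrative and can be adjusted to your needs):

```javascript
async function captureSources() {
  // Capture from the microphone and camera.
  const cameraStream = await navigator.mediaDevices.getUserMedia({
    video: { width: 1280, height: 720 }, // illustrative constraints
    audio: true,
  });

  // Capture a shared screen, window, or browser tab.
  const screenStream = await navigator.mediaDevices.getDisplayMedia({
    video: true,
  });

  return { cameraStream, screenStream };
}
```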

Both of these interfaces have their limitations:

  1. They can only be used in secure contexts, such as over HTTPS or on a localhost for local development.
  2. Only iOS 14.3 and above support getUserMedia in WKWebView, and the getDisplayMedia interface is not supported on mobile devices.
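
One pragmatic way to handle these limitations is to probe the environment before attempting a capture. The sketch below only checks whether the APIs are exposed; it cannot work around the restrictions themselves:

```javascript
function checkCaptureSupport() {
  // getUserMedia / getDisplayMedia are only exposed in secure contexts
  // (HTTPS or localhost), so navigator.mediaDevices may be undefined.
  if (!window.isSecureContext || !navigator.mediaDevices) {
    return { camera: false, screen: false };
  }
  return {
    camera: typeof navigator.mediaDevices.getUserMedia === 'function',
    // getDisplayMedia is generally missing on mobile browsers and in WKWebView.
    screen: typeof navigator.mediaDevices.getDisplayMedia === 'function',
  };
}
```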

Fortunately, WebRTC offers the captureStream interface, greatly expanding the definition of MediaStreams. This versatility allows for diverse and dynamic streaming content, breaking away from a single, fixed source.

By invoking the captureStream method on HTMLMediaElement and HTMLCanvasElement, you can capture the actively rendered content of the current element and create a real-time MediaStream object. In simple terms, anything being displayed through Video or Canvas, whether it's video, audio, images, or custom drawings, can be transformed into a live stream for broadcasting.
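
As an illustration (the element IDs here are assumptions), a playing video element or a canvas can be turned into a MediaStream like this:

```javascript
// Assumes a <video id="source-video"> that is already playing
// and a <canvas id="overlay-canvas"> that is being drawn to.
const video = document.getElementById('source-video');
const canvas = document.getElementById('overlay-canvas');

// Capture whatever the video element is currently rendering.
const videoStream = video.captureStream();

// Capture the canvas at 30 frames per second (video only; audio would
// need to be mixed in separately, e.g. via the Web Audio API).
const canvasStream = canvas.captureStream(30);

// Either stream can then be attached to an RTCPeerConnection with addTrack().
```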

However, several hiccups came up while implementing video.captureStream in different browsers:

  1. Chrome: starting from version 88, video streams obtained via video.captureStream cannot be played properly by the receiver over WebRTC. Chrome has not fully resolved this issue yet. The only workaround is to disable the hardware-accelerated encoding option in the browser settings, which is not very user-friendly.