Custom Capturing and Rendering

This tutorial introduces the advanced usage of custom capturing and custom rendering.

Custom Capturing

By default, trtc.startLocalVideo() and trtc.startLocalAudio() start the camera and microphone, and trtc.startScreenShare() starts screen sharing.

You can pass the audioTrack or videoTrack parameter to trtc.startLocalVideo(), trtc.startLocalAudio(), or trtc.startScreenShare() to publish a custom-captured MediaStreamTrack.
// Pass an audioTrack parameter to publish a custom audio track.
await trtc.startLocalAudio({ option: { audioTrack } });
// If microphoneId and audioTrack are set at the same time, the capture priority is
// microphoneId > audioTrack, but it is not recommended to mix them.

// Pass a videoTrack parameter to publish a custom video track on the main stream.
await trtc.startLocalVideo({ option: { videoTrack } });
// If cameraId, useFrontCamera, and videoTrack are set at the same time, the capture priority is
// cameraId > useFrontCamera > videoTrack, but it is not recommended to mix them.

// Pass a videoTrack parameter to publish a custom video track on the sub stream.
await trtc.startScreenShare({ option: { videoTrack } });
There are usually several ways to capture an audioTrack or videoTrack (the first two are sketched below):
- Use getUserMedia to capture the camera and microphone.
- Use getDisplayMedia to capture screen sharing.
- Use videoElement.captureStream to capture the audio and video being played in a video element.
- Use canvas.captureStream to capture the content of a canvas.
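
For example, here is a minimal sketch of the first two approaches using the standard getUserMedia and getDisplayMedia APIs (the constraints shown are illustrative, and getDisplayMedia must be called from a user gesture):

// Capture audio and video tracks from the microphone and camera
const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
const audioTrack = stream.getAudioTracks()[0];
const videoTrack = stream.getVideoTracks()[0];
await trtc.startLocalAudio({ option: { audioTrack } });
await trtc.startLocalVideo({ option: { videoTrack } });

// Capture a video track from screen sharing (call this from a user gesture, e.g. a button click)
const screenStream = await navigator.mediaDevices.getDisplayMedia({ video: true });
const screenVideoTrack = screenStream.getVideoTracks()[0];
await trtc.startScreenShare({ option: { videoTrack: screenVideoTrack } });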

Capture the Video Being Played in the Video Element

// Check whether the current browser supports capturing a stream from a video element
if (!HTMLVideoElement.prototype.captureStream) {
  console.log('your browser does not support capturing stream from video element');
  return;
}
// Get the video element that is playing video on your page
const video = document.getElementById('your-video-element-ID');
// Capture the media stream from the playing video
const stream = video.captureStream();
const audioTrack = stream.getAudioTracks()[0];
const videoTrack = stream.getVideoTracks()[0];

await trtc.startLocalVideo({ option: { videoTrack } });
await trtc.startLocalAudio({ option: { audioTrack } });

Capture the Content in the Canvas

// Check whether the current browser supports capturing a stream from a canvas element
if (!HTMLCanvasElement.prototype.captureStream) {
  console.log('your browser does not support capturing stream from canvas element');
  return;
}
// Get your canvas element
const canvas = document.getElementById('your-canvas-element-ID');

// Capture a 15 fps video stream from the canvas
const fps = 15;
const stream = canvas.captureStream(fps);
const videoTrack = stream.getVideoTracks()[0];

await trtc.startLocalVideo({ option: { videoTrack } });
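
Note that canvas.captureStream(fps) produces a new frame only when the canvas is repainted, at most fps times per second, so something must keep drawing to the canvas while capturing. A minimal sketch of such a drawing loop (the solid-color fill is just a placeholder for your own drawing code):

const ctx = canvas.getContext('2d');
let hue = 0;
// Repaint the canvas periodically so the captured track keeps producing frames
setInterval(() => {
  hue = (hue + 1) % 360;
  ctx.fillStyle = `hsl(${hue}, 100%, 50%)`;
  ctx.fillRect(0, 0, canvas.width, canvas.height);
}, 1000 / fps);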

Custom Rendering

By default, when calling trtc.startLocalVideo() or trtc.startRemoteVideo(), you need to pass in the view parameter. The SDK will create a video element under the specified view element to play the video.
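
For contrast, the default rendering path looks roughly like this. This is a minimal sketch assuming the SDK's REMOTE_VIDEO_AVAILABLE event and a container element with the hypothetical id 'remote-video-container':

// Default rendering: the SDK creates a video element inside the given container
trtc.on(TRTC.EVENT.REMOTE_VIDEO_AVAILABLE, ({ userId, streamType }) => {
  trtc.startRemoteVideo({ userId, streamType, view: 'remote-video-container' });
});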

If you want to render the video yourself and do not need the SDK to play it, follow these steps:
1. Omit the view parameter, or pass in null, when calling trtc.startLocalVideo() or trtc.startRemoteVideo().
2. Listen for the TRTC.EVENT.TRACK event. The SDK fires this event when a new MediaStreamTrack becomes available, and you can get the track from the event for custom rendering.
3. Use your own player to render the video (see the sketch after the code below).
4. With custom rendering, the VIDEO_PLAY_STATE_CHANGED event will not be triggered. Instead, listen for the mute/unmute/ended events on the video MediaStreamTrack to detect the state of the video track.
trtc.on(TRTC.EVENT.TRACK, event => {
  // userId === '' means event.track is a local track; otherwise it is a remote track
  const isLocal = event.userId === '';
  // Usually the sub stream is a screen-sharing video stream.
  const isSubStream = event.streamType === TRTC.TYPE.STREAM_TYPE_SUB;
  const mediaStreamTrack = event.track;
  const kind = event.track.kind; // 'audio' or 'video'
});
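
Building on the event above, here is a minimal sketch of steps 3 and 4: playing the track in a video element you manage and watching the track's state. The 'custom-player' element id is a hypothetical placeholder:

trtc.on(TRTC.EVENT.TRACK, event => {
  if (event.track.kind === 'video') {
    // Step 3: render the track with your own player (a video element you control)
    const videoElement = document.getElementById('custom-player'); // hypothetical element id
    videoElement.srcObject = new MediaStream([event.track]);
    videoElement.play();

    // Step 4: detect the track state via the MediaStreamTrack events
    event.track.addEventListener('mute', () => console.log('video track has no data flowing'));
    event.track.addEventListener('unmute', () => console.log('video track data resumed'));
    event.track.addEventListener('ended', () => console.log('video track ended'));
  }
});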