Beauty AR

FAQs

This document answers common questions about using the Beauty AR Web SDK.

What should I do if the image is upside down and lags when I run the demo in Chrome?

Because the SDK uses the GPU for acceleration, you need to toggle on Use hardware acceleration when available in the browser's settings.

Can I use the Beauty AR Web SDK to beautify live streams published in a web live streaming application?

Yes. The SDK can work as an intermediate rendering processor for live streaming and supports multiple input and output sources. To extend your web application with beauty filters and effects, see Publishing over WebRTC and Publishing Using TRTC.

Will my signature service be frequently called?

No. The SDK caches signatures internally. You can also customize the return logic in your own getSignature method, as long as the signature algorithm is compliant.
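As an illustration of such custom return logic, the sketch below wraps an arbitrary signature fetcher in a client-side TTL cache, so repeated SDK requests do not hit your signature service every time. The helper name, TTL value, and fetcher are assumptions, not part of the SDK.

```javascript
// Hypothetical helper: wraps any async signature fetcher with a TTL cache.
// The SDK already caches internally; this only reduces calls to your backend.
function createCachedSignatureProvider(fetchSignature, ttlMs) {
  let cached = null;
  let cachedAt = 0;
  return async function getSignature() {
    const now = Date.now();
    // Reuse the cached signature while it is still fresh
    if (cached !== null && now - cachedAt < ttlMs) return cached;
    cached = await fetchSignature();
    cachedAt = now;
    return cached;
  };
}
```

You could pass the returned function to the SDK wherever a getSignature callback is expected, with a TTL shorter than the signature's five-minute validity (for example, four minutes) so a stale signature is never returned.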

Will the effect displayed after the call to the SDK differ from the effect previewed in the customization tool in the console?

No. The effect rendered by the SDK is identical to the effect previewed in the customization tool: what you see in the preview is what you get in the production environment.

How do I use the localhost for local development?

You can create a trial license and specify localhost plus a port number as the domain (the port is restricted).
Alternatively, purchase an official license; during its validity period, you can use localhost for local development with no port restriction.

Why is "streamurl authentication failed" displayed in the console after LEB publishing fails?

This is usually because the signature has expired. Generate a new signature for publishing. For the format of publishing URLs, see Splicing Live Streaming URLs.

Why is an error reported when I call getEffectList to pull material resources?

This is usually because the API is called too early. Be sure to call getEffectList in or after the SDK's created callback; at that point, the material data is available, and your business interaction logic can be implemented based on it. For specific use cases, see Best Practices.
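One way to enforce this timing is to gate material requests behind a promise that resolves when the created event fires. The small helper below is illustrative, not part of the SDK; it only assumes the SDK exposes an on('created', callback) listener, as described above.

```javascript
// Illustrative helper: resolves once the SDK's `created` event has fired,
// so getEffectList is never called before the SDK is initialized.
function whenCreated(sdk) {
  return new Promise((resolve) => sdk.on('created', resolve));
}

// Usage sketch (SDK construction omitted):
//   await whenCreated(sdk);
//   const effects = await sdk.getEffectList();
```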

Why does echo occur when I preview the video after connecting my web page to the SDK's built-in camera?

When audio is turned on for local preview and played back, the audio will be captured by the mic and used as the audio source of the built-in camera, which causes echo. To solve this problem, mute the preview video.
// Get the SDK's output stream and render it in a <video> element
const output = await sdk.getOutput()
const video = document.createElement('video')
document.body.appendChild(video)

// Mute the preview so the playback is not re-captured by the mic
video.setAttribute('muted', '')
video.volume = 0
video.srcObject = output

Why does the SDK report an authentication failure and the API returns 401?

The SDK internally requests a signature through the getSignature method you pass in as a parameter and authenticates with the backend. It automatically retries once after the signature's timestamp expires, and if the retry fails, it reports an error and blocks all subsequent processing. Check your getSignature logic to see whether the timestamp (valid for five minutes) has expired or the signature generation logic is incorrect.
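When debugging, a quick sanity check is to verify that the timestamp your getSignature logic uses is still within the five-minute window. The helper below is a hypothetical debugging aid, assuming the timestamp is in seconds since the Unix epoch.

```javascript
// Hypothetical debugging aid: checks whether a signature timestamp is still
// within the five-minute validity window stated in the FAQ above.
const SIGNATURE_VALIDITY_S = 5 * 60;

function isSignatureFresh(timestampS, nowMs = Date.now()) {
  const ageS = Math.floor(nowMs / 1000) - timestampS;
  // Reject timestamps from the future as well as expired ones
  return ageS >= 0 && ageS < SIGNATURE_VALIDITY_S;
}
```

If this returns false for the timestamp you are signing, regenerate the timestamp at request time rather than reusing an old one.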