
Gesture Recognition

Overview

The gesture recognition capability takes the camera's OpenGL texture as input and outputs real-time gesture detection data, which you can use for further development.

iOS Integration Guide

Integrate the Beauty AR SDK into your iOS project. For details, see Integrating SDK (iOS).

iOS Interface Invocation

1. Turn on the gesture detection feature switch (in Xmagic.h)
- (void)setFeatureEnableDisable:(NSString *_Nonnull)featureName enable:(BOOL)enable;
Pass HAND_DETECT (imported from TEDefine.h) as featureName, and set enable to YES (see the combined example after step 2).
2. Set the data callback (in Xmagic.h)
- (void)registerSDKEventListener:(id<YTSDKEventListener> _Nullable)listener;

- (void)onAIEvent:(id)event
{
    NSDictionary *eventDict = (NSDictionary *)event;
    if (eventDict[@"ai_info"] != nil) {
        NSLog(@"ai_info %@", eventDict[@"ai_info"]);
    }
}
eventDict[@"ai_info"] is the returned JSON structured string data.

Callback JSON Data Description

In the callback JSON data, the gesture-related data is in "hand_info", and the format is as follows:
"hand_info": {
"gesture": "PAPER",
"hand_point_2d": [180.71888732910156, 569.2958984375, ... , 353.8714294433594, 836.246826171875]
}
The explanations of each field in hand_info are as follows:
gesture: the gesture type name.
hand_point_2d: the captured 2D key-point coordinates of the detected hand.
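
For illustration, the gesture data can be pulled out of the ai_info string inside the onAIEvent: callback roughly as follows. This is a sketch that assumes ai_info arrives as a JSON string in the format shown above; the variable names are illustrative.

- (void)onAIEvent:(id)event
{
    NSDictionary *eventDict = (NSDictionary *)event;
    NSString *aiInfo = eventDict[@"ai_info"];
    if (aiInfo == nil) {
        return;
    }
    // Parse the JSON string returned by the SDK.
    NSData *jsonData = [aiInfo dataUsingEncoding:NSUTF8StringEncoding];
    NSError *error = nil;
    NSDictionary *aiDict = [NSJSONSerialization JSONObjectWithData:jsonData options:0 error:&error];
    if (error != nil || ![aiDict isKindOfClass:[NSDictionary class]]) {
        return;
    }
    // hand_info carries the gesture type name and the 2D key points.
    NSDictionary *handInfo = aiDict[@"hand_info"];
    NSString *gesture = handInfo[@"gesture"];       // e.g. @"PAPER"
    NSArray *points = handInfo[@"hand_point_2d"];   // 2D key-point coordinates
    NSLog(@"gesture: %@, key point values: %lu", gesture, (unsigned long)points.count);
}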
The following gestures are currently supported:
1. Heart: HEART
2. Gesture with number 5 (open): PAPER
3. Gesture with number 2: SCISSOR
4. Fist: FIST
5. Gesture with number 1: ONE
6. I love you: LOVE
7. Thumb up: LIKE
8. OK: OK
9. Rock: ROCK
10. Gesture with number 6: SIX
11. Gesture with number 8: EIGHT
12. Lift: LIFT
13. Gesture with number 3: THREE
14. Gesture with number 4: FOUR
If a gesture is not recognized, the returned gesture type name is OTHER.
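
For example, an app could branch on the returned type name; this is an illustrative sketch, not part of the SDK, and handInfo is the dictionary parsed from hand_info as shown earlier.

NSString *gesture = handInfo[@"gesture"];
if ([gesture isEqualToString:@"HEART"]) {
    // e.g. trigger a heart effect in the app
} else if ([gesture isEqualToString:@"LIKE"]) {
    // e.g. show a thumbs-up reaction
} else if ([gesture isEqualToString:@"OTHER"]) {
    // the gesture was not recognized; ignore it
}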