FaceAttribute - Flutter
This Flutter demo app demonstrates face recognition, face liveness detection, automatic face capture, age/gender detection, face quality assessment, facial occlusion detection, and eye closure detection.
Overview
This demo project integrates several facial recognition technologies, including 3D passive face liveness detection, face recognition, automatic face capture, and analysis of various face attributes such as age, gender, face quality, facial occlusion, eye closure, and mouth opening.
The system utilizes face liveness detection technology to generate a real-time liveness score based on a single image captured by the camera.
Additionally, the demo offers face recognition capabilities, enabling enrollment from a gallery and real-time identification of faces captured by the camera.
The demo also features an automatic face capture function that verifies various facial attributes, such as face quality, facial orientation (yaw, roll, pitch), facial occlusion (e.g., mask, sunglasses, hand over face), eye closure, mouth opening, and the position of the face within the region of interest (ROI).
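To make the attribute checks concrete, here is a purely illustrative Dart sketch of such a capture-readiness decision; the `FaceAttributes` record, its field names, and the thresholds below are hypothetical placeholders, not the FaceSDK plugin's actual API.

```dart
// Illustrative only: a hypothetical attribute record and capture-readiness
// check. The real demo reads these values from the FaceSDK detection results.
class FaceAttributes {
  final double quality;          // face quality score in 0..1
  final double yaw, roll, pitch; // head pose in degrees
  final bool occluded;           // mask, sunglasses, or hand over face
  final bool eyesClosed;
  final bool mouthOpened;
  final bool insideRoi;          // face lies within the region of interest

  const FaceAttributes({
    required this.quality,
    required this.yaw,
    required this.roll,
    required this.pitch,
    required this.occluded,
    required this.eyesClosed,
    required this.mouthOpened,
    required this.insideRoi,
  });
}

/// Returns true when every attribute check passes, i.e. the frame is
/// acceptable for automatic capture.
bool readyToCapture(FaceAttributes f) {
  const maxAngle = 10.0;  // assumed pose tolerance, not an SDK constant
  const minQuality = 0.7; // assumed quality threshold, not an SDK constant
  return f.quality >= minQuality &&
      f.yaw.abs() <= maxAngle &&
      f.roll.abs() <= maxAngle &&
      f.pitch.abs() <= maxAngle &&
      !f.occluded &&
      !f.eyesClosed &&
      !f.mouthOpened &&
      f.insideRoi;
}
```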
GitHub
Google Play
YouTube
Screenshots
How to Run
1. Flutter Setup
Make sure you have Flutter installed. We have tested the project with Flutter version 3.22.3.

If you don't have Flutter installed, please follow the instructions provided in the official Flutter documentation.
2. Running the App
Run the following commands:
```bash
flutter clean
flutter pub get
flutter run
```
If you plan to run the iOS app, please refer to the following link for detailed instructions:
FaceSDK Plugin
1.1 `Face SDK` Setup
Android
Copy the SDK (`libfacesdk` folder) to the `android` folder in your project.

Add the SDK to the project in `settings.gradle`:

```gradle
include ':libfacesdk'
```
1.2 `FaceSDK Plugin` Setup
Copy the `facesdk_plugin` folder to the root folder of your project.

Add the dependency in your `pubspec.yaml` file:

```yaml
facesdk_plugin:
  path: ./facesdk_plugin
```
Import the `facesdk_plugin` package:

```dart
import 'package:facesdk_plugin/facesdk_plugin.dart';
import 'package:facesdk_plugin/facedetection_interface.dart';
```
2. API Usage
2.1 FacesdkPlugin
Activate the `FacesdkPlugin` by calling the `setActivation` method:

```dart
final _facesdkPlugin = FacesdkPlugin();
...
await _facesdkPlugin
    .setActivation(
        "Os8QQO1k4+7MpzJ00bVHLv3UENK8YEB04ohoJsU29wwW1u4fBzrpF6MYoqxpxXw9m5LGd0fKsuiK"
        "fETuwulmSR/gzdSndn8M/XrEMXnOtUs1W+XmB1SfKlNUkjUApax82KztTASiMsRyJ635xj8C6oE1"
        "gzCe9fN0CT1ysqCQuD3fA66HPZ/Dhpae2GdKIZtZVOK8mXzuWvhnNOPb1lRLg4K1IL95djy0PKTh"
        "BNPKNpI6nfDMnzcbpw0612xwHO3YKKvR7B9iqRbalL0jLblDsmnOqV7u1glLvAfSCL7F5G1grwxL"
        "Yo1VrNPVGDWA/Qj6Z2tPC0ENQaB4u/vXAS0ipg==")
    .then((value) => facepluginState = value ?? -1);
```
Initialize the `FacesdkPlugin`:

```dart
await _facesdkPlugin
    .init()
    .then((value) => facepluginState = value ?? -1);
```
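For context, here is a minimal sketch of how the activation and initialization calls above might be wired into a widget's `initState`. The widget scaffolding and the `YOUR_LICENSE_KEY` placeholder are assumptions for illustration; only the plugin calls and the `facepluginState` convention come from the snippets above.

```dart
import 'package:flutter/material.dart';
import 'package:facesdk_plugin/facesdk_plugin.dart';

class FaceHomePage extends StatefulWidget {
  const FaceHomePage({super.key});

  @override
  State<FaceHomePage> createState() => _FaceHomePageState();
}

class _FaceHomePageState extends State<FaceHomePage> {
  final _facesdkPlugin = FacesdkPlugin();
  int facepluginState = -1;

  @override
  void initState() {
    super.initState();
    _initSdk();
  }

  Future<void> _initSdk() async {
    // Activate first, then initialize; -1 is used as a fallback when the
    // plugin returns null, mirroring the snippets above.
    await _facesdkPlugin
        .setActivation("YOUR_LICENSE_KEY") // replace with your license key
        .then((value) => facepluginState = value ?? -1);
    await _facesdkPlugin
        .init()
        .then((value) => facepluginState = value ?? -1);
    if (!mounted) return;
    setState(() {});
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Center(child: Text('FaceSDK state: $facepluginState')),
    );
  }
}
```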
Set parameters using the `setParam` method:

```dart
await _facesdkPlugin.setParam({
  'check_liveness_level': livenessLevel ?? 0,
  'check_eye_closeness': true,
  'check_face_occlusion': true,
  'check_mouth_opened': true,
  'estimate_age_gender': true
});
```
Extract face features using the `extractFaces` method:

```dart
final faces = await _facesdkPlugin.extractFaces(image.path);
```
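As a sketch of the gallery-enrollment flow mentioned in the overview, the snippet below picks an image and stores the extracted templates. The `image_picker` dependency and the `Person` model are assumptions for illustration; only the `templates` field comes from the snippets in this section.

```dart
import 'package:image_picker/image_picker.dart';
import 'package:facesdk_plugin/facesdk_plugin.dart';

// Hypothetical enrollment model; the real demo may store different fields.
class Person {
  final String name;
  final dynamic templates; // feature template returned by extractFaces
  Person({required this.name, required this.templates});
}

/// Picks an image from the gallery and enrolls every detected face.
Future<List<Person>> enrollFromGallery(FacesdkPlugin facesdkPlugin) async {
  final image = await ImagePicker().pickImage(source: ImageSource.gallery);
  if (image == null) return [];

  // Each detected face is returned as a map; 'templates' holds the feature
  // data used later for similarity comparison.
  final faces = await facesdkPlugin.extractFaces(image.path);
  return [
    for (final face in faces ?? [])
      Person(name: 'New person', templates: face['templates']),
  ];
}
```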
Calculate the similarity between faces using the `similarityCalculation` method:

```dart
double similarity = await _facesdkPlugin.similarityCalculation(
        face['templates'], person.templates) ??
    -1;
```
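Putting the two calls together, a simple 1:N identification loop might look like the following sketch, reusing the hypothetical `Person` model from the enrollment sketch above; the 0.8 threshold is an assumed value, not one taken from the demo.

```dart
/// Returns the enrolled person whose template best matches the given face,
/// or null when no similarity exceeds the assumed threshold.
Future<Person?> identify(
    FacesdkPlugin facesdkPlugin, dynamic face, List<Person> persons) async {
  const threshold = 0.8; // assumed threshold; tune for your use case
  Person? bestMatch;
  double bestSimilarity = threshold;

  for (final person in persons) {
    final double similarity = await facesdkPlugin.similarityCalculation(
            face['templates'], person.templates) ??
        -1;
    if (similarity > bestSimilarity) {
      bestSimilarity = similarity;
      bestMatch = person;
    }
  }
  return bestMatch;
}
```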
2.2 FaceDetectionInterface
To build the native camera screen and process face detection, please refer to the following file in the GitHub repository.
This file contains the necessary code for implementing the camera screen and performing face detection.