
VFace Android Integration

This guide covers the VFace biometric and its use in the Veridium SDK. It demonstrates the necessary steps for enrolment and authentication, and covers template storage options and UI customisation.


VFace requires a minimum Android API level of 16 and a front-facing camera with a minimum resolution of approximately 1 megapixel (e.g. 720x1280).


The VFace biometric is delivered in two modules:

  • veridium-vface.aar - Provides the core logic for the biometric and the Activity from which the biometric is launched.

  • veridium-vface-ui.aar - A UI for portrait mode, self capture, as used in the Veridium Authenticator app.

In addition to the VFace modules, also include the following dependencies from the Veridium SDK:

  • veridium-analytics-release

  • veridium-core-release

  • veridium-sdk-release

  • veridium-secure-data-release

  • veridium-support-release

Base Activity

The VFace base activity, VFaceBiometricsActivity, is contained in the veridium-vface module. Create an Android activity within your own application that extends this base activity. Here you can customise some aspects of the biometric, such as the UI fragment to use, or the biometric storage location.


import com.veridiumid.sdk.vface.VFaceBiometricsActivity;
import com.veridiumid.sdk.vface.VFaceFragment;

public class MyVFaceActivity extends VFaceBiometricsActivity {
    // Specify the UI fragment to use
    @Override
    protected VFaceFragment fragmentToShow() {
        return new DefaultVFaceFragment();
    }
}

Meta-data configuration

To configure the SDK, use the built-in AndroidManifest.xml meta-data mechanism. Add a manifest entry for your VFace activity extending VFaceBiometricsActivity.

AndroidManifest.xml allows the insertion of <meta-data> tags that can be parsed/extracted at runtime. The SDK extracts each Activity's meta-data with the property android:name="com.veridiumid.sdk.component.config".

Below is a sample configuration for the VFace biometric, identified by the UID VFACE:

<meta-data
    android:name="com.veridiumid.sdk.component.config"
    android:value="uid=VFACE, optional=false, validator=com.veridiumid.sdk.vface.VFaceValidator" />

The default configuration string format is comma-separated key=value pairs.

android:value="parameterName=parameterValue, anotherParameterName=anotherParameterValue"

The validator value is an IBiometricValidator that detects if the biometric component has all its dependencies met to function properly on the target device.

In full, the supported parameters are:

  • uid=stringValue - unique biometric identifier, used to resolve the desired biometric components. Use uid=VFACE for VFace.

  • optional=booleanStringValue - true|false indicates that enrolment of this component can be skipped.

  • validator=stringClassName - the full class name of an IBiometricValidator implementation. Use the provided class com.veridiumid.sdk.vface.VFaceValidator.
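As an illustration only, the comma-separated key=value format above can be parsed with a few lines of plain Java. The SDK performs its own parsing; ComponentConfigParser is a hypothetical name used here for the sketch.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical helper: parses the "key=value, key=value" configuration
// string format. Illustrative only; the SDK performs its own parsing.
public class ComponentConfigParser {

    public static Map<String, String> parse(String value) {
        Map<String, String> config = new LinkedHashMap<>();
        for (String pair : value.split(",")) {
            // split on the first '=' only, so class names with dots are safe
            String[] kv = pair.trim().split("=", 2);
            if (kv.length == 2) {
                config.put(kv[0].trim(), kv[1].trim());
            }
        }
        return config;
    }

    public static void main(String[] args) {
        Map<String, String> config = parse(
                "uid=VFACE, optional=false, validator=com.veridiumid.sdk.vface.VFaceValidator");
        System.out.println(config.get("uid")); // VFACE
    }
}
```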


  1. Required camera permissions:

<uses-permission android:name="android.permission.VIBRATE" />
<uses-permission android:name="android.permission.CAMERA" />

<uses-feature
    android:name="android.hardware.camera.front"
    android:required="true" />
  2. Required connectivity permissions:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />

Initialisation and Licensing

Each biometric requires its own licence. Please contact your Veridium sales representative to purchase licence keys.

Provide your licence to the VeridiumSDKVFaceInitializer during SDK initialisation, for example:

try {
    String vfaceLicense = "my_licence_key";
    // Initialise the SDK, passing the VFace initializer along with the
    // standard factories (the initialisation call itself was elided here;
    // use the entry point provided by your SDK version)
    VeridiumSDK.init(this,
            new DefaultVeridiumSDKModelFactory(this),
            new VeridiumSDKDataInitializer(),
            new VeridiumSDKVFaceInitializer(vfaceLicense));
} catch (SDKInitializationException e) {
    Log.w("App", "Failed to initialize Veridium SDK", e);
}


Directed Liveness

Directed Liveness is an active liveness system which instructs the user, with text instructions and visual prompts, to turn their head in a series of randomly chosen directions until the system is satisfied that it is seeing a live user.

The user has a limited amount of time to complete each motion. Failure to complete in time will result in a liveness failure.

Failure to pass liveness halts the capture process and the SDK returns an IVeridiumSDK#LIVENESS_FAILED result.

Liveness is compulsory during enrolment; liveness during authentication is optional. To turn liveness on or off, override the VFaceBiometricsActivity method isUsingVFaceLiveness(), for example:

@Override
protected boolean isUsingVFaceLiveness() {
    return true;
}

Liveness Factor

The robustness of VFace to a presentation attack is controlled by the liveness factor, an integer value between 0 and 99. Higher values trade ease of use for higher security. The default value is 50.

The factor controls the number of motions the user must perform and how strictly they must be followed, as follows:




Factor    Behaviour                  Motions Required
0         No liveness is applied     None
1 - 24    Lowest friction use        -
25 - 74   Prioritises ease of use    Between 2 and 3
75 - 99   Maximum security           Between 3 and 4
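As a sketch of the bands above (the class and method names here are illustrative, not part of the Veridium API; the SDK applies the factor internally):

```java
// Illustrative only: maps a liveness factor (0-99) to the behaviour bands
// described in the table above.
public class LivenessBands {

    public static String band(int factor) {
        if (factor < 0 || factor > 99) {
            throw new IllegalArgumentException("liveness factor must be between 0 and 99");
        }
        if (factor == 0)  return "none";     // no liveness applied
        if (factor <= 24) return "lowest";   // lowest friction use
        if (factor <= 74) return "balanced"; // prioritises ease of use (default 50)
        return "maximum";                    // maximum security
    }

    public static void main(String[] args) {
        System.out.println(band(50)); // balanced
    }
}
```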

To set the liveness factor, override the livenessFactor() method of VFaceBiometricsActivity:

@Override
protected int livenessFactor() {
    return 75;
}

Enrolment and Authentication

Invoking an Operation

Start an operation by invoking the corresponding IVeridiumSDK interface methods:

  1. IVeridiumSDK#enroll(String[] biometricUids)

    • Generate enrolment templates

    • String[] biometricUids - biometric unique identifiers, as registered by the supplied IConfiguration#detectComponents() implementation.

  2. IVeridiumSDK#authenticate(String[] biometricUids)

    • Authenticate against stored enrolment templates

    • String[] biometricUids - biometric unique identifiers, as registered by the supplied IConfiguration#detectComponents() implementation.

These methods return an Intent object which can be registered for launching using an ActivityResultLauncher, with the result caught in an ActivityResultCallback. This can include requesting the required permissions for the Intent. For example:

ActivityResultLauncher<Intent> VFaceLauncher = registerForActivityResult(
    new ActivityResultContracts.StartActivityForResult(),
    result -> {
        // ActivityResultCallback, handle the result here
    });

// Example launching an Enrol intent with permissions
private final ActivityResultLauncher<String> enrolWithPermission =
    registerForActivityResult(new ActivityResultContracts.RequestPermission(), isGranted -> {
        if (isGranted && mBiometricSDK != null) {
            Intent enrollIntent = mBiometricSDK.enroll(VFaceInterface.UID);
            VFaceLauncher.launch(enrollIntent);
        }
    });

where VFaceInterface.UID provides the biometric UID string ("VFACE").

[Note: the SDK can enrol multiple biometrics simultaneously, e.g. Intent VIDIntent = VeridiumSDK.getSingleton().enroll(FourFInterface.UID, VFaceInterface.UID);]

[Note: previously on Android, starting an operation intent required the parent activity to call startActivityForResult(). This method is now deprecated in favour of the ActivityResultLauncher API.]

Handling An Operational Result

An operation result is received in an ActivityResultCallback as described in Invoking an Operation.

The result object contains:

  • resultCode contains the biometric operation result.

  • data Intent containing biometric data and additional information, such as error messages.

The resultCode may be any of the following:

  • RESULT_OK Operation success

  • RESULT_CANCELED User canceled

  • RESULT_FAILED Operation failed

  • RESULT_ERROR An error was encountered
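A sketch of dispatching on these codes follows. RESULT_OK and RESULT_CANCELED use the standard Android Activity values; the numeric values given below for RESULT_FAILED and RESULT_ERROR are placeholders, so in a real app use the constants exposed by the SDK.

```java
// Sketch of dispatching on the operation result code. RESULT_FAILED and
// RESULT_ERROR values below are placeholders, not the SDK's definitions.
public class ResultDispatch {

    static final int RESULT_OK = -1;      // Activity.RESULT_OK
    static final int RESULT_CANCELED = 0; // Activity.RESULT_CANCELED
    static final int RESULT_FAILED = 2;   // placeholder value
    static final int RESULT_ERROR = 3;    // placeholder value

    public static String describe(int resultCode) {
        switch (resultCode) {
            case RESULT_OK:       return "operation success";
            case RESULT_CANCELED: return "user canceled";
            case RESULT_FAILED:   return "operation failed";
            case RESULT_ERROR:    return "an error was encountered";
            default:              return "unknown result";
        }
    }
}
```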

To process the result, and access biometric data, use a BiometricResultsParser. Implement the IBiometricResultsHandler interface to customise actions and pass it to the BiometricResultsParser, along with the resultCode and intent data. Results are received for all biometrics that were run:

IBiometricResultsHandler customResultHandler = new IBiometricResultsHandler() {
    @Override
    public void handleSuccess(Map<String, byte[][]> results) {
        // handle template data here
    }

    @Override
    public void handleFailure() {
    }

    @Override
    public void handleCancellation() {
    }

    @Override
    public void handleError(String message) {
    }
};

ActivityResultLauncher<Intent> VFaceLauncher = registerForActivityResult(
    new ActivityResultContracts.StartActivityForResult(),
    result -> {
        // ActivityResultCallback, handle the result here
        Intent data = result.getData();
        int resultCode = result.getResultCode();
        BiometricResultsParser.parse(resultCode, data, customResultHandler);
    });

Typically, the SDK handles template storage internally without the need to access biometric data. (However, this can be customised, see Data Storage Customization).

Access the template data for VFace using its UID, for example a handleSuccess() implementation:

public void handleSuccess(Map<String, byte[][]> results) {
    // handle template data here
    byte[] template = null;
    for (Map.Entry<String, byte[][]> entry : results.entrySet()) {
        String bio_key = entry.getKey();
        byte[][] data = entry.getValue();
        if (VFaceInterface.UID.equals(bio_key)) {
            // template data is contained in the first element
            template = data[0];
        }
    }
}

Match Customization and Auth Template Access

VFace accommodates full customization of the match process by overriding the protected method boolean matchAuthentication(BiometricsResult<?> result) of VFaceBiometricsActivity. This is called once a template is successfully extracted and provides access to the authentication template. It should return the result of a match (or the result of any processing you wish to perform on the auth template).

Use this override to capture the authentication template (stored in element 0) for later processing. For example,

public class MyVFaceActivity extends VFaceBiometricsActivity {
    @Override
    protected boolean matchAuthentication(BiometricsResult<?> result) throws IOException {
        // Access the auth template and convert to Base64, for example,
        // to send to a remote service
        String auth_template_base64 = Base64.encodeToString(result.getOutput(0), Base64.NO_WRAP);
        return true;
    }
}

Returning true propagates to a success result for the capture; returning false, to a failure. If no template could be extracted, for example if the user cancels, this method is not called.
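For illustration, template bytes can be carried as Base64 using plain java.util.Base64; the in-activity example uses android.util.Base64 with NO_WRAP, and java.util.Base64's basic encoder likewise produces unwrapped output. The class name here is hypothetical.

```java
import java.util.Base64;

// Illustrative sketch: converting a template byte[] to and from Base64 for
// transport to a remote service.
public class TemplateEncoding {

    public static String toBase64(byte[] template) {
        // basic encoder: no line wrapping, equivalent to android NO_WRAP
        return Base64.getEncoder().encodeToString(template);
    }

    public static byte[] fromBase64(String encoded) {
        return Base64.getDecoder().decode(encoded);
    }

    public static void main(String[] args) {
        byte[] template = {1, 2, 3}; // stand-in for a real biometric template
        System.out.println(toBase64(template)); // AQID
    }
}
```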

Data Storage Customization

The biometric default activity VFaceBiometricsActivity contains the following overridable method:

   protected IKVStore openStorage() {

This returns a Key-Value Storage object, IKVStore. The SDK provides two storage types:

  • InMemoryKVStore Data stored in memory only (data is lost on application exit and never written to disk)

  • SecureKVStore Data is written to secure preferences. This type should be used via the provided com.veridiumid.sdk.defaultdata.DataStorage class.
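The key-value semantics the SDK relies on can be sketched with a plain HashMap-backed store. This is not the SDK's IKVStore interface; the class and method names here are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative in-memory key-value store showing the storage semantics.
// NOT the SDK's IKVStore interface; names here are hypothetical.
public class SimpleKVStore {

    private final Map<String, byte[]> data = new HashMap<>();

    public void put(String key, byte[] value) {
        data.put(key, value);
    }

    public byte[] get(String key) {
        return data.get(key); // null when absent
    }

    public void clear() {
        data.clear(); // everything is lost, as with InMemoryKVStore on app exit
    }
}
```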

For example, to have data written only to memory, and accessed externally, override the openStorage() method:

private static InMemoryKVStore myMemoryKVStore = new InMemoryKVStore();

@Override
protected IKVStore openStorage() {
    return myMemoryKVStore;
}

To create a secure preferences storage object, for example to create multiple enrolments for multiple users, use the DataStorage class with your own keys.

Image Import (Template From Image)

VFace templates can be generated from an image containing a face, and used for either enrolment or authentication. Note that either the enrol or the authentication template must come from a live capture to perform a match. The image is assessed for quality against the following requirements:

  • Frontal facing

  • Fully within the image bounds

  • Adequate lighting

  • Eyes horizontal and level

  • Minimum eye separation of 38 pixels
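The eye-separation requirement, for example, amounts to a simple distance check. The SDK performs these checks internally; this class is an illustrative sketch only.

```java
// Sketch of the minimum eye-separation quality check from the list above.
public class EyeSeparationCheck {

    static final double MIN_EYE_SEPARATION_PX = 38.0;

    public static boolean separationOk(double leftEyeX, double leftEyeY,
                                       double rightEyeX, double rightEyeY) {
        // Euclidean distance between the detected eye centres, in pixels
        double distance = Math.hypot(rightEyeX - leftEyeX, rightEyeY - leftEyeY);
        return distance >= MIN_EYE_SEPARATION_PX;
    }
}
```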

Image to Template

A template can be created from an Android Bitmap image using the following public static method:

Pair<Integer, byte[]> VFaceBiometricsActivity.templateFromBitmap(Bitmap image, int role_type)

where image is a Bitmap object and role_type is an integer flag selecting an enrol or authentication template (Enrol = 1, Auth = 2). Returns a Pair containing a success code (first) and the template as byte[] (second), where a code of 0 indicates success and other codes map to the following conditions:






  • Library not initialised

  • Failed to get info from Bitmap image

  • Bitmap was not RGBA8888

  • Locking address of Bitmap pixels failed

  • No matching purpose specified

  • Can't be interactive

  • Can't use liveness

  • No face found in image

  • Failed quality check

  • Function unsupported on platform

  • Not initialised

  • Image extract threw exception

For example, to create an enrolment template:

private byte[] templateFromImage() {
    // R.drawable.enrol_face is a placeholder resource name
    Bitmap enrol_image = BitmapFactory.decodeResource(getResources(), R.drawable.enrol_face);
    Pair<Integer, byte[]> result = CustomVFaceActivity.templateFromBitmap(enrol_image, 1); // 1 = Enrol
    final int result_code = result.first;

    if (result_code == 0) {
        // Success, run auth using this template
        return result.second;
    } else {
        ToastHelper.showMessage(MainActivity.this, "Failed to make template: " + result_code);
        return null;
    }
}

Authentication With an Image

A template can be used in preference to other stored templates by overriding VFaceBiometricsActivity.retrieveEnrollment(). For example, return the template generated by templateFromBitmap():

public class MyVFaceActivity extends VFaceBiometricsActivity {
    static byte[] template = null; // set as output from templateFromImage()

    @Override
    protected byte[] retrieveEnrollment() throws IOException {
        if (template == null) {
            // Use the stored enrolment template
            return super.retrieveEnrollment();
        }
        return template;
    }
}

Then invoke authentication using IVeridiumSDK#authenticate(String[] biometricUids) as previously described.

User Interface

Veridium provides a single VFace UI in the module veridium-vface-ui.

The UI is implemented as a fragment which is displayed full screen. It is composed of a main capture screen and an optional help screen (shown prior to biometric capture).

Design your own fragment for display by extending VFaceFragment, overriding its methods, and providing your own layout.

Specifying a UI Fragment

To use the desired UI fragment, override the fragmentToShow() method of your VFace activity. For example, to use a custom VFace UI:


import com.veridiumid.sdk.vface.VFaceBiometricsActivity;
import com.myapp.CustomVFaceFragment;

public class CustomVFaceActivity extends VFaceBiometricsActivity {
    // Specify the UI fragment to use
    @Override
    protected VFaceFragment fragmentToShow() {
        return new CustomVFaceFragment();
    }
}

Designing a Custom UI Fragment

  1. Create a new module and add a class that extends VFaceFragment.

  2. Import OnVFragmentReadyListener and override onReady() to receive a listener.

  3. Call listener.onVFragmentReady() to indicate readiness once the fragment has completed initialisation and to advance the capture process.

  4. Inflate your custom layout in onCreateView().

import com.veridiumid.sdk.vface.OnVFragmentReadyListener;
import com.veridiumid.sdk.vface.VFaceFragment;

public class CustomVFaceFragment extends VFaceFragment {

    private OnVFragmentReadyListener listener;

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        return inflater.inflate(R.layout.custom_layout_vface, container, false);
    }

    @Override
    protected void initView(View view) {
        // Set up fragment
    }

    @Override
    public void onReady(OnVFragmentReadyListener onVFragmentReadyListener) {
        listener = onVFragmentReadyListener;
    }
}

Instruction Screen

VFace provides a boolean via setShowInstructionScreen() indicating that the fragment should show instructions before calling listener.onVFragmentReady(). For example:

private boolean shouldShowInstructionScreen;

public void setShowInstructionScreen(boolean showInstructionScreen) {
    shouldShowInstructionScreen = showInstructionScreen;
}

@Override
protected void initView(View view) {
    // Set up fragment ...
    if (shouldShowInstructionScreen) {
        // show the help/instruction screen before signalling readiness
    }
}

Requesting to Cancel

Allow the user to interrupt the capture process by calling VFaceFragment.requestCancel(). For example, add a cancel button:

// R.id.button_cancel is a placeholder view id
Button cancelButton = view.findViewById(R.id.button_cancel);

cancelButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        requestCancel();
    }
});

Camera Preview

VFace controls the camera device internally, so integrators need not worry about the camera API and the complexities of ensuring device compatibility.

However, preview frames generated by the camera can be shown in your UI by providing an AspectRatioSafeFrameLayout in your layout and returning it from getPreviewHolder(). Preview frames from the camera are posted to this frame layout.

private AspectRatioSafeFrameLayout previewHolder;

@Override
protected void initView(View view) {
    // R.id.preview_holder is a placeholder view id
    previewHolder = view.findViewById(R.id.preview_holder);
}

@Override
public AspectRatioSafeFrameLayout getPreviewHolder() {
    return previewHolder;
}

Interacting with VFace

The following additional methods are available for override:

  • void handleUIInstruction(uiInstruction instruction) - Realtime user instructions. See Handle UI Instructions.

  • void dismiss(VFaceFinishReason reason) - Called at operation completion with a reason. See Dismissing VFace.

  • void processingStarted() - VFace has begun processing a capture.

  • void processingFinished() - VFace has finished processing a capture.

Handle UI Instructions

VFace requests user actions via handleUIInstruction(uiInstruction instruction), where uiInstruction values are:




  • The user is already positioned as required.

  • Cannot find a face.

  • Move away from the camera.

  • Move toward the camera.

  • Move up in the camera view.

  • Move down in the camera view.

  • Move left in the camera view.

  • Move right in the camera view.

  • Face the camera straight on.

  • Stay as still as possible.

  • Images from the camera are too bright.

  • Images from the camera are too dark.

  • Face the camera straight on.

  • Initiate movement for liveness, a gentle side-to-side head shake.

  • Hold the device upright (based on accelerometer data).

  • Hold the device at an angle (based on accelerometer data).

  • VFace is processing. Do not request any specific actions.

Dismissing VFace

On operation completion VFaceFragment receives a call to dismiss(VFaceFinishReason reason), where VFaceFinishReason values are:




  • Enrolment was successful.

  • Authentication was successful.

  • Enrolment failed.

  • Authentication failed.

  • Liveness failed.

  • The user requested to cancel.

  • Operation timed out before it could be completed.

  • An error occurred.

Display any relevant information to the user, then call finish() on the fragment's activity: CustomVFaceFragment.this.getActivity().finish();

Note: this is intended for informative UI within the biometric capture process only, not for deciding how to proceed. Proceed according to the result of the biometric operation (see Handling An Operational Result).
