
# Face Tracking

🔙

## Demo app

Check out the Glasses demo app.

## Checking face tracking support

To check whether the device is capable of tracking faces, use the static `AR.isFaceTrackingSupported()` function.

Note that on iOS this feature requires a "TrueDepth" camera (iPhone X and newer).

```typescript
import { AR } from "nativescript-ar";
import { Observable } from "tns-core-modules/data/observable";

export class HelloWorldModel extends Observable {
  constructor() {
    super();
    console.log(`Face tracking supported? ${AR.isFaceTrackingSupported()}`);
  }
}
```

## Declaring the `<AR>` view

I'm going to assume you're working on a vanilla NativeScript app with XML; if you're using Angular or Vue, you'll have to register the `AR` element as explained in the respective docs (see the sketch below).
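For orientation, element registration typically looks something like the following sketch. The import path is an assumption based on the NativeScript 6-era packages this guide uses (`tns-core-modules`), so double-check it against the Angular/Vue docs for your version:

```typescript
// Angular: register the element once, e.g. in app.module.ts.
// Assumes the NativeScript 6-era "nativescript-angular" package.
import { registerElement } from "nativescript-angular/element-registry";
registerElement("AR", () => require("nativescript-ar").AR);

// Vue: register the element in your entry file (e.g. app.js):
// Vue.registerElement("AR", () => require("nativescript-ar").AR);
```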

So for XML, add the AR namespace to the view, then use it like any other UI component:

```xml
<Page xmlns="http://schemas.nativescript.org/tns.xsd" xmlns:AR="nativescript-ar">
  <GridLayout rows="*" columns="*">
    <AR:AR
        trackingMode="FACE"
        trackingFaceDetected="{{ trackingFaceDetected }}"/>
  </GridLayout>
</Page>
```

Open the page's component (or view model) and add, for instance:

```typescript
import { ARTrackingFaceEventData } from "nativescript-ar";
import { Observable } from "tns-core-modules/data/observable";
import { isIOS } from "tns-core-modules/platform";

export class HelloWorldModel extends Observable {

  public trackingFaceDetected(args: ARTrackingFaceEventData): void {
    if (args.faceTrackingActions) {
      args.faceTrackingActions.addModel({}) // see the "demo-glasses" folder for an example
          .then(model => console.log("model added to the face"))
          .catch(err => console.log(`Error adding model: ${err}`));
    }

    // on iOS there are a few properties you can read and act upon, for instance, when the user sticks out their tongue:
    if (isIOS && args.properties && args.properties.tongueOut > 0.8) { // 0.8 means we're 80% sure the tongue is out
      // do something interesting - see the "demo-glasses" folder for an example
    }
  }
}
```

## The `trackingFaceDetected` event

See the example above. The properties of `ARTrackingFaceEventData` are:

| Property | Description |
| --- | --- |
| `object` | The `AR` object for you to call any API function upon. |
| `eventType` | One of `FOUND`, `UPDATED`, `LOST`. |
| `faceTrackingActions` | Available when `eventType` is `FOUND`. See below for details. |
| `properties` | iOS only. An object whose properties each represent a probability from 0 to 1: `eyeBlinkLeft`, `eyeBlinkRight`, `jawOpen`, `mouthFunnel`, `mouthSmileLeft`, `mouthSmileRight`, `tongueOut`. It also has `lookAtPoint`, an `ARPosition` object with `x`, `y`, and `z` properties. |
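As a quick illustration of how these event properties could be consumed, here's a minimal sketch. It assumes `eventType` compares as a plain string; the `0.9` threshold and the console output are arbitrary choices for the example, not values prescribed by the plugin.

```typescript
import { ARTrackingFaceEventData } from "nativescript-ar";

export function onTrackingFaceDetected(args: ARTrackingFaceEventData): void {
  // Stop reacting once the face is no longer tracked.
  if (args.eventType === "LOST") {
    console.log("Face lost");
    return;
  }

  // Blend-shape values are probabilities between 0 and 1 (iOS only).
  if (args.properties && args.properties.eyeBlinkLeft > 0.9) {
    console.log("Left eye is (almost certainly) closed");
  }

  // lookAtPoint is an ARPosition ({ x, y, z }), not a probability.
  if (args.properties && args.properties.lookAtPoint) {
    const { x, y, z } = args.properties.lookAtPoint;
    console.log(`Looking at: ${x}, ${y}, ${z}`);
  }
}
```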

The `faceTrackingActions` object is of type `ARFaceTrackingActions` and has these functions:

| Function | Description |
| --- | --- |
| `addModel` | See api/addModel. |
| `addText` | iOS only. See api/addText. |
| `addUIView` | See api/addUIView. |
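To make the `FOUND` flow concrete, here's a hedged sketch that attaches a model and, on iOS, some text when the face is first found. The empty option objects are placeholders (mirroring the `addModel({})` call above); see api/addModel and api/addText for the properties they actually accept.

```typescript
import { ARTrackingFaceEventData } from "nativescript-ar";
import { isIOS } from "tns-core-modules/platform";

export function trackingFaceDetected(args: ARTrackingFaceEventData): void {
  // faceTrackingActions is only set when eventType is FOUND.
  if (!args.faceTrackingActions) {
    return;
  }

  // Placeholder options - see api/addModel for the supported properties.
  args.faceTrackingActions.addModel({})
      .then(() => console.log("Model attached to the face"))
      .catch(err => console.log(`addModel failed: ${err}`));

  if (isIOS) {
    // addText is iOS only; placeholder options - see api/addText.
    args.faceTrackingActions.addText({})
        .then(() => console.log("Text attached to the face"))
        .catch(err => console.log(`addText failed: ${err}`));
  }
}
```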

Continue reading